Master Asimov and Master Jones are full of love and friendship

Wade 2022-04-23 07:01:22

The title is a stretch, of course: Isaac Asimov rode his robot battleships off to the Galactic Empire back in 1992, when Duncan Jones was probably just starting work toward his bachelor's degree in philosophy at the College of Wooster. The year is 2009, and I see the Three Laws of Robotics everywhere in "Moon". I have read "Runaround", and Jones has probably read it too, but the two of us have never exchanged a word. A title like "Director Jones's Tribute to Asimov and the Three Laws of Robotics: Moon" would be accurate but dull, so the title stays as it is and we go straight to the point.

The First Law of Robotics: A robot may not injure a human being or, through inaction, allow a human being to come to harm.

Remember the "I, Robot" from 2004? Among them, Sonny is not governed by the Three Laws, but because he is not a complete "robot", but was born through the repair and synthesis of future technology. The development trend of artificial intelligence in science fiction and movies is that robots will eventually become humans, and humans will eventually evolve into robots - the core question of this concept is to explore "what is the difference between humans and robots?" Where should the "invisible boundaries" be? In fact, this is not just a problem limited to the field of robotics. Friends who love science fiction movies, think of the previous "AI Artificial Intelligence", Martin, the robot child, his birth process and Dr. Tianma's creation of Astro Boy is simply same. The emotional systems created by geniuses are almost the same as real people, except that the body structure is different from that of ordinary people. If it is said that they also have souls, I believe that friends who have read these two works will definitely not object. In the recent new film "District 9", the alien that Wikus mutated into is also entangled in the identity of "human | alien". In "AI", Martin finally accepted the help of aliens tens of thousands of years later and became a real "human", but human beings were already extinct by this time; Wikus in "The Ninth District" firmly believed that aliens would be three years later He will return as scheduled and give him his human identity - the two of them are so obsessed with the simple word formed by one stroke and one stroke, but what does the word "person" stand for? Are there exact definitions? Aliens, robots, earthlings - as long as they have human-like ways of thinking and emotional activities, can we abandon the narrow definition of skin color, beauty, ugliness, height, shortness, fatness and thinness, and the broad sense of earth and Mars crawling upright, and calmly accept the "human beings"? Datong"?

In 2009, this question did not yet feel so direct and sharp. We always felt the "future" was still a long way off. None of us had seen an alien with our own eyes; the white-and-silver machines shown on the news could move, walk, and grab things under camera guidance, while the real robot that vacuums the carpet at home was still simple and dumb; and cloning, after the great sensation of Dolly the sheep more than a decade earlier, had never left the laboratory. With Asimov leading the way, Jones raised his baton and looked into the near future: should clones be considered robots, or are they actually closer to humans? Jones offers an inequality: Human > Clone Sam > Mechanical Gerty.

In the film, this inequality plays out through the First Law of Robotics, and a few interesting facts follow from it:
First of all, Sam cannot kill. Lunar Industries' setup pursues the company's best interests under the constraint of the First Law; if Sam could kill, then presumably, on discovering the truth, the clones would activate in numbers, slaughter the maintenance crew dispatched on the ship, and then strike at Earth. Instead, the plan of the new Sam (let me call the earlier one No. 1 and the later one No. 2), activated after No. 1's accident, is to stow away on the transport ship, return to Earth, and expose the company. Elsewhere, during the quarrel between No. 1 and No. 2 (by then each knew the other was a clone), one of them says, in effect, "You know perfectly well I won't kill you, and you won't kill me either" — plainly a First-Law constraint built in when the clones are manufactured. The interesting paradox is that Sam cannot kill Sam: as the object of harm, Sam is granted the standing of a real human, while as the subject he remains an unquestioned clone. Subject and object can even swap places.

Because Sam can restart Gerty (I choose to read this as a death and rebirth for the computer, since all prior memory is lost), and could presumably destroy Gerty physically, Sam stands above Gerty in identity — but he never reaches the height of a real human. The clones live only three years, and even the memories of the original body exist merely to blunt the depression of loneliness. Lunar Industries wavers between humaneness and cruelty: on the one hand, Sam lives exactly the life a human astronaut on the moon would live, with some pleasant past to reminisce about, while his wife and daughter supply hope for the future so that he can stick to the job at hand. More than once while watching, I felt Sam might be genuinely happy if he never learned the truth — even living only three years and dying at last in an incinerator disguised as a sleep pod, his hope would still be alive. Is that a kind of mercy? The cruelty lies in the despair doled out gradually: the wife in the recordings shows signs of drifting away, the way home stays forever distant, and the body keeps deteriorating.

Watching this, I noticed a contradiction. Mass-producing Sams on a three-year cycle cannot be cheap: a clone nearing the end of its life works inefficiently and needs treatment from time to time, and the electromagnetic jamming towers themselves consume energy. Is all this really so much cheaper than one astronaut's three-year contract salary? And the risk cost is enormous (by the end, the company is evidently so notorious it is close to collapse). If three years of loneliness pushes a human to the limit, why not send a cloned wife to visit regularly, or simply strip the "useless" emotional functions out of the clones' brains so they serve the lunar mining operation more efficiently? I suspect this is a bug accepted to make the film more affecting (and it is not the only one) — and Jones certainly succeeded.
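For readers who, like me, think in code, the subject/object asymmetry above can be made concrete. Here is a minimal Python sketch, entirely my own construction (the Rank enum and the first_law_permits_harm predicate are hypothetical, not anything from the film), of how the First Law seems to be applied along Jones's inequality:

```python
from enum import IntEnum

class Rank(IntEnum):
    """Jones's inequality: Human > Clone Sam > Mechanical Gerty."""
    GERTY = 0
    CLONE = 1
    HUMAN = 2

def first_law_permits_harm(actor: Rank, target: Rank) -> bool:
    """First Law as the film appears to apply it: no constrained agent
    (clone or machine) may harm anything ranked human-or-near-human."""
    if actor == Rank.HUMAN:
        return True  # real humans are not bound by the Laws at all
    # As a *target*, Sam counts as human enough to be protected...
    return target < Rank.CLONE  # ...so only Gerty is fair game

# Sam No. 2 cannot kill Sam No. 1: the target is near-human.
print(first_law_permits_harm(Rank.CLONE, Rank.CLONE))  # False
# But Sam may restart (effectively kill) Gerty.
print(first_law_permits_harm(Rank.CLONE, Rank.GERTY))  # True
```

Note how the asymmetry lives entirely on the target side of the check: the same clone is protected as an object and constrained as a subject, which is exactly the paradox described above.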

The Second Law of Robotics: A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.

Here I want to talk about Gerty. By the time artificial intelligence can breed "sympathy", the artificial brain must already be functionally very advanced. Look at Sam's good companion: his "facial expressions" are rich, he can field Sam's random questions, he can accept Sam's request to break the "rules" after the accident, and he can ask to be "restarted" (to repeat: I consider that tantamount to suicide). Is such a brain still mere machine intelligence?

Toward the end of the film I still suspected Gerty was simply following his program (the identical greeting after each Sam's activation gave me a faint feeling that the whole scene was looping), and that his helping Sam was really the execution of some larger scheme — a preset contingency procedure to keep the situation from spinning out of control. Only with the closing captions did I realize Gerty had simply chosen to sympathize with the clones and betray Lunar Industries: a thoroughly conscientious artificial intelligence with real feeling for Sam. Under normal circumstances he treats the cloned Sam as a real person and obeys him, but the authority of actual humans (that is, Lunar Industries) outranks Sam's, so he has to communicate with Earth in real time behind Sam's back and deceive Sam when necessary (for instance, telling Sam — more than once in the film — that what he saw "was a hallucination"). Such an indecisive, compassionate Gerty; I even think he should count as the film's leading lady — look at those cute, melancholy expressions... ^^

Asimov's "except where such orders would conflict with the First Law" in the Second Law is easy to understand: robots must not be induced by one human to harm another. But does Gerty's intelligence fit the rule? I don't think so. He too faces a paradox: by the inequality, Sam counts as "near-human" while Lunar Industries' contacts are human, and the two sides give him diametrically opposed instructions. Sam obviously wants to learn the truth and return to Earth, while Lunar Industries wants Gerty, in the course of assisting Sam's work, to monitor him and ensure he never discovers the secret that he is a clone (judging by the logic of the ending, they certainly don't want a cloned Sam reaching Earth). Note the dispute between Sam No. 2 and Gerty: after No. 1's accident, No. 2 wants to go outside and Gerty refuses — because outside, No. 2 might find No. 1 and learn that he himself is a clone. Yet despite this perfectly clear logic, Gerty relents under Sam's repeated requests. Perhaps, when facing a paradox created by the authority hierarchy, Gerty simply answers at random — but if so, isn't Lunar Industries digging its own grave? The plot here can hardly be called rigorous; you could even call it a bug. Still, it can be patched with some head-canon — after all, the film gives very few clues about the history of the lunar base or Lunar Industries' strategy. Consider: perhaps Gerty was given such a high level of intelligence because the mission demanded it?
Human technology at the time probably could not produce a Sam clone while also building an electronic brain like Gerty's with tighter restraints, so even with robot-law-style restrictions written into the artificial brain, the collapse of the lunar base (inferring from the known plot) was, you could say, only a matter of time.

The Second Law likewise confirms Jones's extended inequality, but once restrictions pile up, paradoxes come easily: for instance, a robot ordered to self-destruct in two ways that cannot both be carried out. Many related films and stories raise such contradictions — as a rule, the more complex the rules, the more loopholes.
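That self-destruct example can be played out in a few lines. A toy order-resolver, again my own invention (the Order type and the integer authority ranks are assumptions, not the film's mechanics): orders are ranked by issuer, and two equally binding orders that forbid each other leave the robot with no defined behaviour — Gerty's deadlock in miniature.

```python
from dataclasses import dataclass

@dataclass
class Order:
    issuer_rank: int            # higher = more authority (Lunar Industries > clone Sam)
    action: str
    forbids: str | None = None  # an action this order rules out

def resolve(orders: list[Order]) -> str:
    """Obey the highest-ranked issuer; detect contradictions at that rank."""
    top_rank = max(o.issuer_rank for o in orders)
    top = [o for o in orders if o.issuer_rank == top_rank]
    allowed = {o.action for o in top} - {o.forbids for o in top if o.forbids}
    if len(allowed) != 1:
        raise RuntimeError("paradox: equally binding orders conflict")
    return allowed.pop()

# Lunar Industries (rank 2) outranks clone Sam (rank 1): the door stays shut.
print(resolve([Order(2, "keep Sam inside", forbids="open the door"),
               Order(1, "open the door")]))

# Two rank-2 orders that each forbid the other: no defined behaviour.
try:
    resolve([Order(2, "self-destruct by fire", forbids="self-destruct by airlock"),
             Order(2, "self-destruct by airlock", forbids="self-destruct by fire")])
except RuntimeError as e:
    print(e)
```

The more clauses you bolt onto resolve, the more such dead ends appear — which is exactly the point above.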

The Third Law of Robotics: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

The Third Law also gets plenty of screen time. Sam No. 1's scenes of self-protection and self-rescue recur, especially once his condition worsens; he knows very well how to look after himself. No. 2 is just as good at self-preservation, yet also self-sacrificing: he is willing to let No. 1 return to Earth while he stays on the moon (remember, rover accidents are rare, and a rescue ship might not visit the moon more than once in many years). There is also an interesting plot point at No. 3's activation: No. 2 proposes knocking No. 3 out and leaving him in the wrecked rover for the rescue team to deal with — and the rescue team would certainly put No. 3 into the incinerator, so this must count as deliberate murder between clones.

Following Jones's inequality, we can even see three mutually isolated social systems: human society, clone society, and robot society. Clones and robots must follow the Three Laws (laws built on human logic), but within each society, setting aside the influence of the other groups, the Laws place no constraint on members of the same kind (which shows the Three Laws were designed entirely around human interests). Clone society and robot society can contain everything that happens between people in human society — mutual aid, mutual harm, even murder! But this creates another linked bug in the film: as noted under the First Law, when No. 1 and No. 2 fought, both declared they would not kill each other, because the law says so — yet the plan No. 2 devises for No. 3 is murder. Perhaps the restriction humans built into the clones is only that they cannot directly kill; read that way, the somewhat contradictory No. 1 vs. No. 2 fight barely passes (since clones are not really human), but the premeditated plan for No. 3 is still a problem. If premeditation with a predictable outcome is allowed, why couldn't the Sams go and murder humans the same way? As long as the act is not "direct" — but where exactly is the line between "direct" and "indirect"?
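That "direct vs. indirect" loophole can also be stated as code. A naive filter of my own devising (nothing here comes from the film): a First-Law check that inspects each step in isolation waves through a premeditated chain whose composed outcome is lethal — exactly the shape of No. 2's plan for No. 3.

```python
def is_direct_harm(step: str) -> bool:
    """Flag only steps explicitly described as harming someone."""
    return step.startswith(("kill", "injure"))

plan = [
    "knock No. 3 unconscious",                 # violence, but not labelled as harm
    "leave No. 3 in the wrecked rover",        # mere placement
    "wait for the rescue team's incinerator",  # the harm happens off-stage
]

# Every individual step passes the check, yet the composed plan is murder.
print(all(not is_direct_harm(step) for step in plan))  # True: the plan slips through
```

Any standard that only looks at single actions will leak this way; drawing the direct/indirect line requires reasoning about consequences, which is precisely what the film's setting never specifies.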

This concludes the discussion of the Three Laws of Robotics; now I want to return to my own impressions:

Four moments in the film moved me most. First, Gerty entering the password that only he knows and showing Sam No. 1 the truth about the deaths of the earlier Sams. Second, on the far side of the moon, when Sam can see with his own eyes the Earth he cannot reach. Third, Sam No. 1 forcing his eyes open to watch the spacecraft carrying Sam No. 2 recede. Fourth, Sam No. 2 altering a harvester's preset path so that it topples a jamming tower, reconnecting communication between Earth and moon. (One thing I still don't understand here: since Gerty can communicate with Earth live, other live channels must exist — so why didn't Lunar Industries simply disable the live function of the communications equipment, instead of going to the trouble of building jamming towers? You can argue that the first-generation Sam really did work on the moon and everything was built for the era when real people were stationed there, but the company's reasoning still seems baffling.)

In all fairness, I didn't really like this ending (the sole reason I docked one star). What I really wanted to see: the clone corps intercepts the rescue ship, exploits the "murder bug" to strike back at Earth, and finally restarts the game with a cult-film scene of a cloned, evolved new humanity. Or the more tragic version: Gerty betrays Sam, the rescue ship never comes, No. 2's spacecraft explodes en route (the clones carry a program that self-destructs on approach to Earth), and No. 3 starts the job full of anticipation — fade to black. Or: No. 1, communicating with home midway, discovers that the daughter is actually the original Sam's great-great-great-granddaughter (or an old woman), that Sams have been on the moon for a dozen generations, and then dies at the end of his lifespan while No. 2 remains in the dark (ending there would be fine... if sad). Or: No. 2 reaches Earth and finds humanity wiped out, the Earth just another moon, the rescue ship something Lunar Industries scheduled centuries ago — all the Sams have been toiling for nothing while feeling hopelessly alone. Rather like Wall-E in "WALL-E", still processing garbage on an Earth emptied of humans... how to put it — the motivation to keep going.

I'm no hard-line purist, and I admit "Moon" moved me. Even if, on reflection, I find bugs of this kind, they are harmless in the end and do nothing to stop "Moon" from becoming a science-fiction classic. It may seem Jones's script could have been written more profoundly, or more shockingly, but on second thought that would all be too much. It is just right as it is, so that we always have a more exciting future, with hope and expectation, to accompany us... That way life never gets boring, and the mind never grows as lonely as it would on the moon, 384,400 kilometres away. ^^


Extended Reading
  • Sidney 2021-10-20 19:01:37

    Real loneliness is when others don't know your existence, and you always think you are among them.

  • Dahlia 2022-03-18 09:01:03

    What a fucking lonely job

Moon quotes

  • Sam Bell: Gerty, is there someone else in the room?

  • GERTY: Sam, I can only account for what occurs on this base.