"Machine Public Enemy": Questioning the limits of human logic

Emil 2021-10-13 13:05:37

The key to understanding the plot of "Machine Public Enemy" is the "Three Safety Laws of Robots". In the future world, highly simulated artificial intelligence robots will enter thousands of households as fully automatic housework assistants, and then become a part of everyone's daily life. The safety performance of robots used on such a large scale must be absolutely guaranteed, and in addition, they must always be loyal to humans. For this reason, Dr. Langing formulated the "Three Safety Laws of Robots". Every robot is designed to strictly abide by these "Three Principles". The specific contents are:
1. Robots cannot harm humans, or sit and watch humans being harmed. Stand by.
2. Unless the first rule is violated, robots must obey human orders.
3. Under the premise of not violating the first and second laws, the robot must protect itself.
The "three major safety rules" are intertwined, logically rigorous, and all aspects are considered: the first premise is that "robots cannot harm humans", under this premise (otherwise it will be difficult to prevent robots from being used by some people to attack others. Human), which stipulates the subordinate status of robots relative to humans. As long as the above regulations are met, the robot can protect itself (in fact, it is also protecting human property). The "Safety Laws" seem to be perfect, so when the "Problem Robot" Sony appeared in the plot of "Machine Public Enemy", everyone's reaction was that the "three major safety rules" were violated! This was quickly confirmed, and the audience's suspicions immediately shifted to the president of the company who behaved abnormally and had a ghost in his heart. However, the plot continued to turn around, the president was killed, and the robot rioted-all the problems in the previous year were on the host computer "Vickie".
The most interesting thing is that when Vicky commanded all NS5 robots to fully occupy the human world, his behavior did not "violate" the three major safety rules. The reason NS5 restricts people’s freedom to go out is very "justified": to ensure your safety. Logically, this is in line with the primary premise that "robots cannot harm humans". If the robot knows that people may encounter injuries when going out (such as a traffic accident), but still let them go out, is this considered as "harming to humans"? So the natural result of Vicky's self-evolution is from the negative "cannot harm humans" to the active "protecting humans". And this kind of "protection" is undoubtedly "enslavement" in the eyes of mankind. Paradoxically, this right of enslavement is granted to robots by human beings through the "three laws" in every possible way to ensure their own safety!
Also paradoxical is the judgment of Sony's behavior. Robot killing should undoubtedly be regarded as the most serious act of "harming humans". However, from the macro perspective of the plot of "Machine Public Enemy", Sony's killing of Dr. Ronning is the first part of the entire plan to help humans fight the NS5 robot riot. Where is "harming mankind" is clearly "saving mankind"! In this regard, Detective Spooner's (Will Smith) intuition is particularly accurate. His distrust of robots is that they are too dogmatic and logical, and they are "logical slaves", so he described them contemptuously. Only "light bulb and clockwork".
In fact, the film’s suspicion of this overly mechanical logic has projected a lot of the creators’ understanding of the reality of human society. For example, Dr. Langing authorized Sony to kill himself. On the surface, it is difficult to judge whether it really violates the three major safety rules. In fact, it is a question of whether people can kill themselves but have the right to kill themselves with the help of others. "Ethanasia" is still controversial and reflects the powerlessness of existing human logic on this issue. Will Smith once said: "The central concept of "Machine Public Enemy" is that there is no problem with robots, and technology is not the problem itself. The limit of human logic is the real problem." Cheng Zai said.

View more about I, Robot reviews

Extended Reading

I, Robot quotes

  • Detective Del Spooner: So, Dr. Calvin, what exactly do you do around here?

    Susan Calvin: My general fields are advanced robotics and psychiatry. Although, I specialize in hardware-to-wetware interfaces in an effort to advance U.S.R.'s robotic ahthropomorphization program.

    Detective Del Spooner: So, what exactly do you do around here?

    Susan Calvin: I make the robots seem more human.

    Detective Del Spooner: Now wasn't that easier to say?

    Susan Calvin: Not really. No.

  • Susan Calvin: A robot could not commit a murder, no more than a human could walk on water.

    Detective Del Spooner: Well, you know, there was this one guy... a long time ago.