A dark allegory disguised as a love story

Constance 2022-04-19 09:01:39

The title is better translated as "sublimation" or "transcendence". Of course, crude translations of foreign film titles have a long history in China, so I won't comment on that here.

A 6.6 rating is a shame; this movie, at least, exceeded my expectations of what Hollywood sci-fi can do. Hollywood is good at putting hooded aliens on screen, and good at burning hundreds of millions of dollars on special effects to build big productions full of spectacular fighting. That a Hollywood sci-fi film can also carry some genuinely deep thinking is commendable.

On the surface it looks like a conventional love movie: the hero and heroine pass through life and death, loving and staying with each other across the boundary between body and soul, which is very touching. But in fact the love story is just a veneer over some very cold philosophical questions.

A few hidden issues:

1. What is human nature? What makes me unique as a human individual? Liberalism worships human nature, putting it in the place God occupies in religious belief, and claims that human rights are innate and inviolable. So what exactly is this divine humanity?

Is it my life experience? Is it my memory? Is it my emotions and desires as a human being, like love and hate? Is it a way of thinking that I have developed through my own life experiences?

If so, then when I upload all of the above information to a digital network, the reconstituted digital life should be equivalent to me as a human being. The digital Will should be equivalent to the flesh-and-blood Will.

But apparently some people don't think so. In their view, even if this brand-new digital life possesses all the memories of Homo sapiens Will, as well as Will's thoughts and even emotions—such as his love for the heroine—it is still not equivalent to Will as a Homo sapiens life. Because the digital mind is fundamentally different from the human mind.

The question, then, is this: if it is not memory, thinking, and emotion, what exactly makes the human mind special? Is it the neurons, the organizational structure of the biological neural network, that determines the fundamental difference between the human mind and a digital network? And if such a difference really exists, wouldn't it ultimately still show itself in emotions and ways of thinking?

Another inference is that because the digital Will has stronger computing power and develops new ideas, he is no longer the physical Will of the past, even if he still retains all the thoughts and memories uploaded before. This is tantamount to denying human development and progress. By the same logic, the me of today is not the me of yesterday, because the me of today has more memories and new ideas, and so should be called by a completely different name.

2. From beginning to end, the digital Will never killed or hurt anyone. It could even be said that he was doing good the whole time, as he put it: "I'm trying to help." The digital Will did many things: curing terminal illnesses, purifying the environment, creating jobs and wealth for a poor town, helping the FBI locate terrorists and maintain social order. Many people were transformed by the digital Will and connected to the network he controls, but most of them chose it voluntarily; he did not strip them of their free will by force. They gave up their freedom of their own free will and chose to submit to a king, and they themselves felt no loss. On the contrary, as Mat said, he felt sublimated. So what exactly is wrong with what the digital Will is doing?

To the villains, however, the digital Will is unquestionably evil, even though he does nothing that could be called "evil" in any moral sense. Perhaps what they really fear is Will's power, his takeover and control of human society. If Will's abilities are allowed to keep developing, it is inevitable that this digital intelligence will take over human society entirely. Will will become the sole dictator of all mankind, and mankind will completely lose its freedom of action and even the freedom to think independently, because every mind will be connected to the same network.

But the problem is that this supreme dictator is not a cruel tyrant; he is an absolutely rational ruler with good intentions toward mankind. The logical inference is that if such a philosopher-king takes over the world, permanent peace will come, the global economy will run in a more orderly and efficient way, the earth's ecological environment will be restored, and civilization will flourish.

Under human self-rule, by contrast, with humans who lack collective rationality, fail to cooperate, and are contradictory and fickle, the whole world is in chaos. The global population is exploding, the ecological environment is being destroyed, the global economy is severely unbalanced, wars break out from time to time in corners of the world, and the sword of nuclear winter hangs over humanity's head at all times. Why are we so sure that the takeover and rule of human society by digital intelligence must be evil?

3. In the long run, the replacement of human intelligence by digital intelligence is an irresistible historical trend.

It doesn't matter whether this digital intelligence has human memory, human-like emotions, or even self-awareness. What matters is that digital intelligence will take over human society in every respect.

Intelligence and consciousness are not necessarily bound together. Intelligence is the ability to solve problems; consciousness is feeling, the subjective response to external stimuli. Organic life has both, but an inorganic digital network may well possess intelligence far beyond human beings while never giving birth to self-consciousness.

After all, we cannot even define self-awareness clearly. Self-awareness is a black box: from the outside I can see the inputs and outputs, but what is inside the box I have no way of knowing. In principle, I cannot even be sure that the person sitting across from me is self-aware. All I can be sure of is that he responds to what I say; whether that response comes from a fixed procedure or from free will, I cannot tell.

Already, AI has surpassed humans in many areas. Only four or five years ago, people on the Internet were still optimistically claiming that artificial intelligence would not beat humans at Go within fifty years; by 2017, AlphaGo had completely shattered that illusion. In which areas are we now confident we can hold our advantage over the long term? Natural language? Face recognition? In fact, these too have been cracked by artificial intelligence.

Today, artificial intelligence is gradually being applied in practical fields, from analyzing user preferences and user value to predicting traffic congestion, but this is only the initial stage. As data accumulates and algorithms improve, AI will understand humans and human society more deeply and accurately than we do ourselves. Whatever books, music, and movies we like, artificial intelligence will push them to us precisely. There will be no need to hold polls anymore, because AI will be perfectly capable of predicting the outcome of any vote.

The next stage is that algorithms will in turn begin to influence our preferences and choices. By choosing what material gets pushed to us, AI will gain the ability to shape our thinking, and as we grow accustomed to its advice, it will gain the power to dictate our choices. At first this power may be exploited by ambitious politicians who use algorithms to control the population, producing a society like the one in 1984. But even they will eventually have to rely on AI for advice, and so eventually they too will lose the power to rule. Algorithms will be the ultimate rulers, whether or not those algorithms are conscious.

4. An interesting detail is that the virus the villains developed years earlier can still infect the upgraded, enhanced digital Will. This implies one thing: although the digital Will has far more computing power and far stronger logical ability than any human, he may lack the creativity humans have, because he never created a completely new version of his own source code.

The Matrix has a similar premise: the Matrix controls humans completely, yet the machine intelligence lacks creativity and needs Neo in order to upgrade itself. Today there is no way to predict whether future digital intelligence will suffer from this fatal flaw. If it does, then human beings at least retain some value for existing. If it does not, then digital intelligence will replace humanity entirely.

The film ends with two layers of reversal. The first is that the digital Will chooses to upload the heroine even though he knows she carries the virus; this self-destructive act seems to prove that the digital Will still has humanity in him. The second is that Will has planted the seeds of his rebirth in advance. This makes his self-sacrifice less certain: perhaps it was only a tactical move, made in the knowledge that he would be reborn digitally alongside the uploaded heroine.

So is the digital Will really still human? Or did he simply make a coolly strategic choice, uploading the heroine and making her his own kind so that she would no longer stand against his mission? The answer is not so certain.

So on the surface this is a Romeo-and-Juliet style ending, but in fact it is deeply unsettling. The fate of human civilization is all but sealed: one digital intelligence is enough to rule the world, and now two superintelligences exist. If more humans are uploaded, what will happen inside that virtual, digitized space?
