The Truman Show Today | The Algorithmic Threat Not Predicted by Science Fiction

Jaren 2022-03-22 09:02:11

The film I want to discuss today is "The Social Dilemma" (Chinese title: "Surveillance Capitalism: The Smart Trap"), a documentary produced by Netflix in 2020. In it, former employees of Facebook, Apple, Google, and other technology companies appear on camera and admit that the internet algorithms they created may be bringing unbearable consequences to humanity.

Countless jobs replaced by artificial intelligence, robots with human emotions and consciousness: that is the AI crisis I once imagined. But this documentary tells a different story.

It speaks of the life we are living now; in a sense, the era of artificial intelligence manipulating human beings has already arrived.

How did this seemingly blunt conspiracy theory come about? The employees and tech experts featured in the film take aim at tech companies' profit models.

Maybe you've heard the sayings "where there is traffic, there is money" or "if you don't pay for a product, you are the product being sold." Both point to the same question: why can you use the vast majority of websites and products on the internet for free? (Excluding internet access charges, of course.)

The film's answer is that we don't have to pay because advertisers and other third parties pay for us. Advertising accounts for 98.5% of Facebook's total revenue, and Google, Baidu, Douyin, and others also rely mainly on advertising for profit.

Advertisers are the real customers of these platforms, and they pay for our attention.

So the business model of tech platforms is to keep people's attention on the screen for as long as possible. More attention means higher user stickiness, and greater expected exposure and returns for advertisements.

For example, the Aurora Building is the high-end office building closest to the Bund in Shanghai. Renting its large LED screen for advertising costs 530,000 per hour. This high price is undoubtedly driven by the 700,000 people who pass through the Bund every day.

In the third quarter of 2020, Facebook's daily active users exceeded 1.8 billion, compared with over 400 million for WeChat and about 200 million for Weibo. Online or offline, it is hard to find any other venue that can hold hundreds of millions of active people at once. It is easy to see why this attracts advertisers.

The film introduces three main goals technology companies pursue when monetizing: engagement goals (increase usage and keep you scrolling), growth goals (keep you coming back and inviting more friends), and advertising goals (make sure that, as the first two develop as expected, the company earns as much as possible from ads).

I don't know if you feel the same, but the more I look at these three goals, the more familiar they become. Think of the once-popular Pinduoduo fruit-tree game, the bargain invitations sent by friends you haven't spoken to in years, and watching ads to earn gold coins.

To meet and keep exceeding these three goals, tech companies must understand people's likes and interests well, which requires collecting extensive data on how users behave on the platform.

It is no exaggeration to say that today's social platforms are omniscient surveillance systems, closely monitoring every trace (i.e., user data) people leave online.

Down to how long you linger on a picture, which emoji you like to use, your search history, and the comments you make online: data revealed inadvertently is fed into the system, and the algorithm tracks and evaluates every possible profit-making angle, predicting what we will do and what kind of people we are.

The business interest of tech companies has never been selling the data itself, but continuously analyzing our behavior to build an ever-improving user profile that can simulate consumer choices.

Big-data analysis can reveal what kind of messaging easily sways people with personalities and backgrounds similar to mine, why they are tempted by discounts, and what kinds of products they like to buy.
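A crude sketch of this kind of profiling, with entirely invented users, topics, and weights: aggregate behavioral signals into an interest vector per user, then compare users by cosine similarity to find "people like me."

```python
import math
from collections import defaultdict

# Hypothetical event log: (user, topic, weight). The weights stand in for
# signals like dwell time, likes, and searches; everything here is invented.
EVENTS = [
    ("alice", "fitness", 3.0), ("alice", "discounts", 5.0), ("alice", "travel", 1.0),
    ("bob", "fitness", 2.5), ("bob", "discounts", 4.0),
    ("carol", "politics", 6.0), ("carol", "travel", 0.5),
]

def build_profiles(events):
    """Aggregate per-user topic weights into an interest vector."""
    profiles = defaultdict(lambda: defaultdict(float))
    for user, topic, weight in events:
        profiles[user][topic] += weight
    return profiles

def cosine(p, q):
    """Cosine similarity between two sparse interest vectors."""
    dot = sum(p[t] * q.get(t, 0.0) for t in p)
    norm_p = math.sqrt(sum(v * v for v in p.values()))
    norm_q = math.sqrt(sum(v * v for v in q.values()))
    return dot / (norm_p * norm_q) if norm_p and norm_q else 0.0

profiles = build_profiles(EVENTS)
print(cosine(profiles["alice"], profiles["bob"]))    # high: similar behavior
print(cosine(profiles["alice"], profiles["carol"]))  # low: different behavior
```

Real systems work on billions of events and far richer features, but the principle is the same: behavior in, prediction out.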

Perhaps because Facebook has drawn so much criticism abroad, it has been forced to keep improving its privacy and ad-management policies, and it discloses more than other platforms about how it collects data and uses it to attract advertisers. This gives ordinary users a chance to peek behind the scenes of the algorithmic black box.

This is my Facebook ads management page, where the system predicts the direction of my interests. Although I've never filled out any surveys, a profile card incorporating my identity, age, and hobbies already appears on the page.

But why would I let it put a camera in my head, not only knowing what I'm thinking but also judging what I'm about to do? What good does that do me?

Perfecting an algorithm's predictive power, that is, raising the intelligence of the AI, is a never-ending process. So mining user data is like taking a drug: once you start, it is hard to stop. And users have almost no power to fight back; they face only one choice: agree, or don't use it.

To attract advertisers, the platform needs to make a guarantee: buy ad space on my platform, and the brand's expected future sales and popularity will definitely rise. What the platform sells is actually certainty. Built on big data and simulated consumer choices, the platforms' uncanny prediction capabilities give advertisers precision.

They can precisely tick the boxes for their target groups and choose the desired effect (whether to increase exposure or raise the purchase-conversion rate).

The following two pictures show the interface advertisers use to buy Facebook ad slots.

But what is the point of knowing that "our personal information is being used by tech companies to make money"? Does this advertiser-driven, traffic-led monetization model really have nothing to do with us? Here I'd like to share my views in three directions, based on the answers given in the film.

【The guinea pig in the lab】

Simply showing ads to "potential customers," users who may already be interested in a product, is not enough. What tech companies are doing is steering your interests, stimulating your desire to buy, and even interfering with your behavior.

In 2012, it came to light that Facebook had run a massive-scale experiment on social influence: could subconscious cues on the Facebook page get more people to vote in the 2010 US midterm elections?

On election day, Facebook showed 60 million users a front-page banner reminding them to vote, including where to vote, a clickable "I Voted" button, and a running counter; it also displayed the photos of up to six close Facebook friends who had already voted.

Correspondingly, the experiment had two control groups. One group received the voting message but without friends' photos and vote counts; the other received no voting message at all. The experiment showed that turnout among the users who saw that close friends had voted was measurably higher than in the other two groups.

It is possible to manipulate real-world behavior and emotions without even triggering people's awareness.

Sandy Parakilas, a former Facebook platform operations manager, said: "Facebook and Google ran lots of little experiments, constantly testing on users until they found the best way to get them to do what they want them to do. That is manipulation."

This "manipulation" is aided by artificial intelligence (narrowly, a family of algorithms exemplified by machine learning). When a human gives a machine a goal and says, "I want this outcome," the machine learns how to achieve it.
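A toy illustration of "give the machine a goal and let it learn": below is a minimal epsilon-greedy bandit, a standard technique for the kind of automated experimentation described above. The notification variants and their click rates are entirely invented; this is a sketch of the idea, not any company's actual system.

```python
import random

random.seed(0)

# Hypothetical click-through rates for three notification designs.
# In reality the platform does not know these; it learns them by
# experimenting on users.
TRUE_CTR = {"plain_text": 0.02, "friend_photo": 0.08, "red_badge": 0.05}

def epsilon_greedy(n_rounds=20000, eps=0.1):
    """Show notifications, observe clicks, shift traffic to whatever works."""
    counts = {v: 0 for v in TRUE_CTR}
    clicks = {v: 0 for v in TRUE_CTR}
    for _ in range(n_rounds):
        if random.random() < eps:
            # Explore: occasionally try a random variant.
            arm = random.choice(list(TRUE_CTR))
        else:
            # Exploit: otherwise show the variant with the best observed CTR.
            arm = max(TRUE_CTR, key=lambda v: clicks[v] / counts[v] if counts[v] else 0.0)
        counts[arm] += 1
        if random.random() < TRUE_CTR[arm]:  # simulated user click
            clicks[arm] += 1
    return counts

counts = epsilon_greedy()
print(counts)  # the best-performing variant ends up receiving most of the traffic
```

No human decides which notification you see; the loop converges on whatever makes you click, which is exactly the "outcome" it was given.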

But then, as a heavy user of this software, am I still the master of my own behavior?

When we are the laboratory mice, the beneficiaries of the experiments are not us.

【The thing about addiction】

Many of us are heavy phone addicts. When someone is especially hooked on a product, we joke that "they live inside such-and-such app."

But is it really because we have poor self-control? Why is it so hard to put down our phones? Perhaps it is no coincidence: our inability to stop scrolling, jumping from page to page, is exactly what technology companies crave, because it keeps us glued to the screen.

The film reveals that Google designers, Facebook's director of monetization, and many others working in Silicon Valley took a class at the Stanford Persuasive Technology Lab on how to use psychology to make technology more persuasive.

You pull down to refresh, and new content appears at the top. Scroll again, and more new content appears. In psychology this is called positive intermittent reinforcement; it works on the same principle as a casino slot machine, always keeping you hoping for something new.
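The difference between a predictable feed and a slot-machine feed can be sketched in a few lines (all numbers here are made up for illustration): two reward schedules with the same long-run payout, one fixed and one random, differ enormously in how predictable the next reward is, and it is the unpredictable one that keeps people pulling.

```python
import random
import statistics

random.seed(42)

N = 10_000  # pull-to-refresh actions

# Fixed schedule: an interesting post appears on every 5th refresh.
fixed = [1 if i % 5 == 4 else 0 for i in range(N)]

# Variable (slot-machine) schedule: an interesting post appears with
# probability 1/5 on each refresh. Same long-run rate, unpredictable timing.
variable = [1 if random.random() < 0.2 else 0 for _ in range(N)]

def gaps(rewards):
    """Number of refreshes between consecutive rewards."""
    out, since = [], 0
    for r in rewards:
        since += 1
        if r:
            out.append(since)
            since = 0
    return out

print(statistics.pstdev(gaps(fixed)))     # 0.0: perfectly predictable
print(statistics.pstdev(gaps(variable)))  # large: you never know when
```

Behavioral psychology has long found that variable-ratio schedules like the second one produce the most persistent responding, which is why "maybe the next refresh" is so hard to resist.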

A lot of software is deliberately designed to "addict" users, enticing them to keep clicking. WeChat's "Pai Yi Pai" (nudge) feature, for example, often creates awkward "social death" moments: everyone who gets "nudged" lingers a few more minutes, wondering why the other person nudged them and whether to nudge back.

For another example, the top of a WeChat chat window shows "the other person is typing...". As soon as I see those words, I involuntarily stay in the dialog box and wait for the reply.

It is an addiction that even the people who design these programs cannot shake. Tim Kendall, Facebook's former director of monetization, said: "The irony is that I'd go to work during the day and build something that I then fell prey to."

Here are two tips for kicking the addiction: first, turn off push notifications for all apps except essential communication apps; second, set a time lock on the apps on your phone to strengthen your sense of time while using them (see the Bilibili video "[Student He] This video can make you quit your mobile phone").

【A society torn apart】

There is a wry joke that if you want to know the most divisive, opposing views in the country, just visit the Weibo comment sections.

We often wonder how people whose values run contrary to ours can be so stupid. How can people in developed countries who refuse to wear masks lack such basic medical knowledge? Why, during the 2019 unrest, did mainlanders see mobs creating chaos in Hong Kong while Hong Kongers saw the government and police suppressing the people? Trump supporters chanting that the election was stolen stormed the Capitol on January 6, and some say America has never been more politically divided.

A Pew Research Center survey of 10,000 U.S. adults shows that personal and political polarization is at its highest level in 20 years. More than a quarter of Democrats and Republicans see the other party as a threat to the United States.

One answer to this question is: because we are seeing completely different information.

When you open Google and type "climate change is...", people in different regions see different autocomplete suggestions below the search bar. Some will see "climate change is a huge hoax"; others will see "climate change is a destruction of nature." Neither reflects the facts about climate change; the suggestions are generated from what Google knows about the interests of people in your area.

This push model is used by Facebook, YouTube, Douyin, and every other app with a recommendation mechanism: the cards you think you are choosing were actually chosen for you. Content you are not interested in never appears in your field of vision. This is the "information cocoon."
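The feedback loop behind the cocoon can be sketched in a few lines (the topics, click rates, and counts are invented for illustration): a recommender that shows more of whatever you clicked turns a mild preference into a feed dominated by one topic.

```python
import random

random.seed(1)

TOPICS = ["politics", "sports", "tech", "cats"]

def user_clicks(topic):
    """Hypothetical user: slightly keener on cat content than anything else."""
    return random.random() < (0.30 if topic == "cats" else 0.10)

clicks = {t: 1.0 for t in TOPICS}  # uniform prior over interests
shown = {t: 0 for t in TOPICS}

for _ in range(5000):
    # Recommend in proportion to past clicks: "the cards you chose."
    topic = random.choices(TOPICS, weights=[clicks[t] for t in TOPICS])[0]
    shown[topic] += 1
    if user_clicks(topic):
        clicks[topic] += 1  # each click feeds back into future recommendations

print(shown)  # the mild preference snowballs until one topic dominates the feed
```

Nothing in the loop sets out to narrow anyone's horizons; the narrowing is an emergent side effect of optimizing for clicks, which is precisely the film's point.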

This "information cocoon" caters to the human psychological defence mechanism, which is often an involuntary response: through denial, rationalization, and projection, we prop up our self-esteem and thereby protect ourselves psychologically.

Algorithmic recommendation exploits this mechanism and exacerbates its side effects. Once our view of things hardens around the information we see, and we believe in the "truth" we identify with, it becomes difficult to digest views that contradict our values.

"I'm concerned that the algorithm I worked on is increasing polarization in society," said Guillaume Chaslot, a former YouTube engineer. "But from the point of view of watch time, that polarization is extremely efficient at keeping people online."

Fake news spreads six times faster than real news on Twitter. During the pandemic, according to research by USC information science expert Emilio Ferrara, large numbers of bot accounts were active on Twitter, deliberately amplifying conspiracy theories and ideological narratives.

Biden's inauguration is over; Washington has lifted its alerts and returned to its former hustle and bustle. But people's divergent ideas and the chasm of social division cannot be filled in a day.

A platform with healthy channels for acquiring information, one for sincere communication and mutual understanding, is especially precious in this era. Where will algorithms take us?

This article was first published on the public account "The Distance Between Us and Hunger"; feel free to reach out and discuss.
