Thoughts: The Social Dilemma

I recently had a conversation with Emily Moses, and she recommended I watch The Social Dilemma so that we could later have a conversation about it. Although the running time is about ninety minutes, it took me nearly three hours to watch, rewind, and take notes.

There are too many important points in this film to distill into a single thesis. One takeaway: if you work in technology — creator, leader, shareholder, etc. — this film should be required viewing.

The remainder of this post is a transcription of the notes that I took.


Pull Quotes

  • “Nothing vast enters the life of mortals without a curse.” — Sophocles
  • “If you’re not paying for the product, then you are the product.”
  • “Any sufficiently advanced technology is indistinguishable from magic.” — Arthur C. Clarke
  • “There are only two industries that refer to their customers as ‘users’: illegal drugs and software.” — Edward Tufte
  • “Whether it is to be utopia or oblivion will be a touch-and-go relay race right up to the final moment…” — Buckminster Fuller

Notes

  • How the technologies (e.g., liking, sharing, the platforms themselves) are used is different from what you’d expect.
  • These platforms amplify the human beast.
  • What an interesting road we’ve gone down… Facebook was born from toxic masculinity (see FaceMash) and grew into a tool for influencing political elections with global consequences.
  • We live in the disinformation age.
  • Given the implications shared in this film’s interviews by some of the early pioneers at companies like Google and Facebook, I’m shocked that this film saw the light of day, much less got traction on Netflix. If I were one of these individuals, I’d want my identity hidden before appearing on camera, for fear of retribution or worse.
  • No one is working to make these platforms less addictive; the model is engagement.
  • From Jaron Lanier… You don’t sell products (e.g., servers); you sell users. Not even that, you sell the opportunity to create a change in perception and behavior (i.e., what people do and what they think). You’re selling the certainty of ad placements via great predictions.
  • From Shoshana Zuboff… see surveillance capitalism
  • There’s almost no human supervision for what happens on these platforms.
  • It’s not the data that’s important — it’s the models that predict our actions.
  • I was reminded of this July 2017 TED talk where a Black man created an alternate profile for himself to see how social media tailored content for him: A Black Man Goes Undercover in the Alt-Right
  • Three focuses: engagement, growth, advertising
  • A thought I had… I wonder what kind of precedent we’ve set with the freemium business model. So many people expect things for free, even if they know their data is what makes the company money. I doubt many businesses would survive if they replaced ad revenue with customer-driven revenue.
  • The meaning of culture is manipulation.
  • I learned that Stanford has (had?) a Persuasive Technology Lab.
  • Emails and notifications are click-bait. Instead of telling you which friend tagged you in what photo, you get notified of the event and have to log on to the platform (i.e., engage) to get the real answer. LinkedIn does this, too, for direct messages. I’m of two minds on this. I don’t care for having to return to the platform, but there are other services like Gmail that mine information from your emails. For example, you wouldn’t want your patient portal emailing your test results, as who knows what Google would do with that information.
  • “Growth hacking” — what we can do from a psychological perspective to increase engagement
  • Facebook uses A/B testing and similar experiments to optimize for engagement, growth, and ad revenue. This technique is considered a best practice in technology companies.
  • An unsettling comment about using social platforms for influencing elections… it’s about affecting behavior and emotions without people being aware that it’s happening.
  • We’ve shifted from tools-based technology to addiction- and manipulation-based technology.
  • Social media is a drug; we have a basic biological imperative to connect with other people. We need to ask, “What does our tribe think of us?” This reminds me of Yuval Harari’s Sapiens. Our brains are optimized for interacting and forming meaningful relationships with somewhere between 150 and 300 people. How can we process having tens of thousands of connections on social media? (See also: Dunbar’s number)
  • We conflate likes and hearts with value and truth.
  • The film shared some troubling statistics about the increase in attempted and completed suicides among teens, which seems to be correlated with the rise in social media use.
  • YouTube Kids was mentioned; it received criticism over whether children could distinguish content from advertising, and the response to that criticism put many content creators’ income at risk.
  • At several points during the film where the interviewees shared how the platforms were benefiting from certain techniques, I wondered how deep the rabbit hole goes. How many lawsuits about data misuse were settled out of court so that no one can fully understand what happened? Who knows what would still be happening if Cambridge Analytica hadn’t been caught interfering with the 2016 US election? (Fun fact: This was the trigger for me to delete my Facebook account.)
  • Social media has become a digital pacifier.
  • Computer processing power (measured in FLOPS) has outpaced our brains.
  • The goals of social platforms and the goals of the people who use those platforms are not aligned. Their algorithms are optimized to the platform’s definition of success (e.g., engagement, growth, and ad revenue).
  • If you type the query “climate change is” into Google, the search suggestions you receive will vary based on where you live and what Google knows about you. This is “not a matter of truth, but about you.”
  • Summarizing a comment from Roger McNamee… You have 2.7 billion Truman Shows.
  • Other people are not stupid; they’re just not seeing the same information you are. Traditional media has the same increased left/right polarization that social media does, because they are optimizing for the same thing: selling attention to advertisers.
  • From Aza Raskin… “You can imagine these things are sort of like… They tilt the floor of human behavior. They make some behavior harder and some easier. And you’re always free to walk up the hill, but fewer people do. So at scale, at society-scale, you really are just tilting the floor and changing what billions of people think and do.”
  • False information makes companies more money than the truth. At least one study has shown that lies spread faster than truth.
  • Facebook is a tool of persuasion. Imagine what would happen if it were being controlled by a dictator or an authoritarian. Some have considered Facebook to be an “assault on democracy.”
  • McNamee: “The manipulations by third parties is not a hack. Right? The Russians didn’t hack Facebook. What they did was they used the tools that Facebook created for legitimate advertisers and legitimate users and they applied it to a nefarious purpose.”
  • Cathy O’Neil: “We are allowing the technologists to frame this as a problem that they are equipped to solve. That’s a lie. People will talk about AI as if it will know truth. AI’s not going to solve these problems. AI cannot solve the problem of fake news. Google doesn’t have the option of saying, ‘Oh, is this conspiracy? Is this truth?’, because they don’t know what truth is. They don’t have a proxy for truth that’s better than a click.”
  • I was reminded of the concept of “imagined realities” and “legal fictions” from Harari’s Sapiens. Humans have invented concepts — nationalism, laws, fiat currency — to deal with one another at scale.
  • If we can’t agree on what is true, we can’t navigate our way out of these problems. This made something click for me: the importance of philosophers. Many of the societal issues we face are human ones, and many of them have been with us for millennia. Having re-learned some tropes from Greek mythology, it amazed me how many of the issues we face are not new. This is why I’m a firm advocate for liberal arts education in lieu of STEM-only. I believe it’s myopic to only learn hard skills without learning how to understand and communicate with other people, and to understand the context of our past successes and failures as a species.
  • Technology brings out the worst in society. This reminded me of another theme in the US (and many other Western capitalist economies): Find something that’s required to exist in society (e.g., connection, healthcare) and determine how to make a profit from it. Another connection… “We are the virus.”
  • When the interviewees were asked about upcoming concerns… Civil war, climate change, civil destruction through willful ignorance, autocratic dysfunction of democracies.
  • Technology is neither good nor evil. We are simultaneously living in utopia and dystopia. It’s amazing that we can get food delivered to our homes within minutes, and discomforting that information about the kinds of food we buy could be sold to advertisers to influence other products we should buy.
  • Summarizing Cathy O’Neil… These companies act as de facto governments, without competition, and claim they can regulate themselves; that claim is false.
  • We worship at the altar of short-term thinking and profit at all costs.
  • It’s flawed to think of these platforms as selfless individual actors doing what’s best for society.
  • AI outsmarts us in order to capture our attention, rather than serving our real goals and values.
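
The A/B testing mentioned in the notes above can be sketched minimally. Everything here is invented for illustration — the click rates, sample sizes, and the `two_proportion_z` helper are assumptions, not anything shown in the film:

```python
import random
from math import sqrt

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Two-proportion z-statistic comparing the click rates of two variants."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: variant B's notification copy vs. control A.
random.seed(0)
n = 10_000
clicks_a = sum(random.random() < 0.10 for _ in range(n))  # control: ~10% CTR
clicks_b = sum(random.random() < 0.11 for _ in range(n))  # variant: ~11% CTR

z = two_proportion_z(clicks_a, n, clicks_b, n)
print(f"A: {clicks_a/n:.3f}  B: {clicks_b/n:.3f}  z = {z:.2f}")
# |z| > 1.96 -> the difference is unlikely to be chance at the 5% level,
# so the platform ships whichever variant drives more engagement.
```

The unsettling part, per the film, isn’t the statistics — it’s that the metric being optimized is engagement itself, run continuously across billions of users.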