The Social Dilemma documentary explores the question on many people’s minds: What is the problem with social media? The exploration begins by asking various tech experts, “What is the problem?” When none of them can offer a simple answer, the expectation is set that a deeper explanation will unfold throughout the film.
Within the first 5 minutes, big questions are thrown to the viewer:
Is the tech industry getting too big?
Are these platforms eroding the fabric of how society works?
Have we gone from the Information age to the Disinformation age?
Are social media platforms negatively impacting the mental health of their users?
We’re introduced to key players like Tristan Harris, Tim Kendall, Justin Rosenstein, and others. These former employees of the major social media platforms are interviewed to express their concerns about the adverse effects social media is having on society at large.
We begin with Tristan Harris, a former Design Ethicist at Google, rehearsing a presentation addressing other leaders within the tech industry. He states, “I want to talk about a new agenda for technology. And why we want to do that is because if you ask people: what’s wrong with the tech industry right now? — there’s a cacophony of grievances and scandals… they stole our data, and there’s tech addiction, and there’s fake news, and there’s polarization, and there’s elections that are getting hacked… but is there something that is beneath all these problems that’s causing all these things to happen at once?”
Tristan is also the co-founder of the Center for Humane Technology. He believes that there’s a problem within the tech industry that doesn’t have a name, and he’s trying to get at what the root cause could be. What is the name of the problem that’s happening in the tech industry? Does it need to be named? If we can put our finger on it, then we can change it and positively influence all of the aforementioned issues we’re now seeing within society.
There’s a lot to unpack with this documentary. I consider myself somewhat “in the know” regarding each of these issues, yet it seems overwhelming even for me, let alone a viewer who’s new to this information. Where does one begin to get a grasp on this? Sure, it sounds like uncovering the root cause should be the place to start, but we’re already headlong into the trajectory of watching these issues unfold in real time.
The concerns are profound and call for our attention. How does one quell the sense of urgency to discuss tech addiction, or fake news, or polarization? Can we be patient and hopeful that this in-depth look will provide some resolution for these prevailing problems? These initial questions come up for me as the issues are framed.
- -
Tristan goes on to reflect: “Does it seem like the world is going crazy when you look around you? Is this normal? Or have we all fallen under some kind of spell?” I know I’ve certainly reflected on these questions, and I imagine others have too. I’ve had numerous discussions with friends about some of the craziness we’ve seen on social media platforms. From articles shared to polarizing discussions to people just losing it. There have been many ‘shaking my head’ moments. So does the allure of talking about such ‘craziness’ transfer to the onlookers? Does it link us to the ‘crazy’ matrix? Is this the ‘spell’ Tristan is alluding to?
I usually don’t throw around the word ‘spell’ lightly. Is Tristan correct in using this word? I’m not ready to declare anything the new ‘normal,’ although there’s clear evidence that social interactions have changed. We’ve entered a new phase in which our technologies wield a huge influence over our communication with one another. This documentary really gets into this premise. I appreciate that it does, because I believe people need to have this part laid out for them. It’s something one may comprehend on one level of thinking, but how do we get it to really resonate? To have people ‘know’ what’s going on beyond the cognitive level.
Tristan believes this should be something everyone’s aware of, not just the tech industry. I’m in agreement with this. I’ve been doing my own research for several years now. I’ve listened to a dozen of Tristan’s talks and interviews. I support his mission to shed light on the consequences of these technological advancements. In fact, I recommend people listen to those talks and interviews in conjunction with this documentary.
- -
The film shows a news clip dubbing Tristan Harris “the closest thing Silicon Valley has to a conscience.” I can get behind this assertion. When hearing his thoughts on these matters, they’ve always seemed genuine and sensible. What he shares resonates with what I believe needs to be part of the broader discussion around these platforms and tech developments.
When he worked at Google, he noticed that he was getting addicted to Gmail. He also questioned why no one was talking about the addictive nature of these things. He felt a growing frustration with the tech industry and thought, “we have lost our way.” He decided to make a presentation about how they could change things from the inside. He felt it would be “a call to arms.”
The gist of the presentation centered around the claim: “Never before in history have 50 tech designers made decisions that would affect 2 billion people.” He states, “2 billion people will have thoughts that they didn’t intend to have, because the designers at Google said this is how notifications work on that screen you wake up to in the morning.”
This is profound when you ponder it. Amidst our current online climate, it appears we must explore this further. Yet I wonder, how do you get all individuals on the same page with this? We’re all intertwined with this, but our levels of awareness vary. This is why Tristan was sounding the alarm within the tech industry. His presentation highlights the concern: Is there a moral responsibility for these tech designers? Is it up to Google to do something about it?
Tristan brought this to the attention of a select group of his colleagues, which grew to hundreds viewing it and sending him messages. People seemed interested and agreed with him. They saw how this was affecting other people around them. He mentions how it was brought to the desk of Larry Page, who was the CEO of Google at the time. Although Tristan felt like he was launching a revolution, nothing happened afterwards. There was no follow-up regarding the concerns he addressed.
- -
The film shifts to hearing from other voices within the tech industry. Tim Kendall, a former executive at Facebook and former president of Pinterest, recalls as far back as 2006, when he was brought in as director of monetization. He was in discussions figuring out the answer to the question, “How is this going to get monetized?” He concluded that an advertising model seemed to be the most elegant way.
With this bit placed here in the documentary, am I, as a viewer, supposed to wonder if the advertising model for monetizing Facebook was the beginning of the crisis we’re now facing? Are we supposed to presume that this truly grew, in part, out of a well-intended business model? Maybe. It’s one piece of the puzzle. A contributing part of the perfect storm that was coming together, at the core of what would bring about the underbelly of social media.
We are then introduced to Jaron Lanier via a clip of his appearance on the TV daytime talk show The View. Jaron, who is identified as the Founding Father of Virtual Reality and a Computer Scientist, is the author of the book “Ten Arguments For Deleting Your Social Media Accounts Right Now.” Well, if that isn’t a direct enough title for a book about social media.
I remember seeing this interview when it first aired. I agreed with what Jaron shared, but for some reason I didn’t feel any urgency about reading his book. I don’t know if it was how Jaron described the perceived dangers, or if I had simply resigned myself to the fact that this was happening and I was okay with it. There wasn’t a strong enough feeling in the air that something really needed to be done about this. Not sure what I was waiting for. He appeared on The View in June of 2018, so perhaps there was a level of him simply stating what I, and others, already knew about social media. Two years later, he’s back on my radar as a contributor here.
In the documentary, Jaron reiterates that companies like Facebook and Google are some of the wealthiest and most successful of all time. He mentions how they have small staffs and large computers, and manage to rake in a lot of money. According to Jaron, the question becomes, “What are they getting paid for?” We know we as users certainly aren’t paying them to use their services.
I think this is where the plot gets lost for people trying to understand this arrangement. We are usually the consumer, consuming a product or service. From the user’s vantage point, Facebook and Google appear to be the things consumed. But in this business-model equation, they are not the product or service. We can think they are, but they aren’t. We don’t purchase them. This is where the model gets turned around. They are merely platforms through which we become the product via our participation.
We willingly give our attention and our information. The platform then brings in a third party, like advertisers, who pay them. They consume us (our data). The more attention we give, the more data there is to collect and the more ad sales can be made. Does this make sense? It’s rather simple from their vantage point. But it gets lost on people, to see themselves this way. It’s not the usual way the product/payment/consumption cycle works for most people with most things.
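To make the arrangement concrete, here’s a back-of-the-envelope sketch of how attention converts into ad revenue. The user counts, the one-ad-per-minute rate, and the $5 CPM are all numbers I made up for illustration:

```python
# Toy model (hypothetical numbers throughout) of attention-as-revenue:
# the platform sells ad impressions, and impressions scale with minutes of attention.

def daily_ad_revenue(users: int, minutes_per_user: float,
                     ads_per_minute: float, cpm_dollars: float) -> float:
    """Revenue = impressions shown x price per impression."""
    impressions = users * minutes_per_user * ads_per_minute
    return impressions / 1000 * cpm_dollars  # CPM = cost per 1,000 impressions

# Hypothetical: 1M daily users, 30 minutes each, 1 ad per minute, $5 CPM.
print(daily_ad_revenue(1_000_000, 30, 1.0, 5.0))   # 150000.0 -> $150,000/day
# Doubling attention (30 -> 60 minutes) doubles revenue, with zero new users.
print(daily_ad_revenue(1_000_000, 60, 1.0, 5.0))   # 300000.0 -> $300,000/day
```

The toy math makes the incentive plain: the revenue line goes up when our attention goes up, not when we buy anything.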
- -
Next up is Roger McNamee, a venture capitalist and early investor in Facebook. He confirms the business model of these platforms selling their users. He’s been an investor in technology for 35 years. He states that the tech companies used to be in the business of selling products. For the past 10 years, the biggest companies of Silicon Valley have been in the business of selling their users.
We also hear from Aza Raskin, formerly of Firefox and Mozilla Labs, who states that the advertisers are the customers and we’re the thing being sold. This leads to the quote, “If you’re not paying for the product, you are the product,” reiterated by Tristan Harris. For those newer to taking a deep dive into this, the quote will likely raise an eyebrow. A little thought bubble will go up saying, “Hey, when did I agree to getting sold?” Um, well, you didn’t really. Or did you? Have you checked the terms of service for the platforms you’re on?
All of these platforms are competing for your attention. Their business model is to keep people engaged. Tim Kendall opines, “Let’s figure out how to get as much of this person’s attention as we can” and “How much of your life can we get you to give to us?” Are we fully aware of how valuable our attention is? Are we giving it away too freely?
At this point in the documentary I can hear echoes of Gary Vaynerchuk (who’s not in the film) and other motivational entrepreneurs stressing how necessary it is for businesses to gain the attention of their audience. In some instances we get attention, but usually we’re the audience. For the most part, we’re giving away our attention, and there are a number of entities wanting it from us. There’s an Attention Economy, and there’s a lot at stake. We’re part of the Attention Economy whether we’re aware of it or not.
Justin Rosenstein, a former engineer at Facebook and Google, chimes in: “When you think about how some of these companies work, it starts to make sense. All of these services on the internet that we think of as free, are not free. They are paid for by the advertisers.” Advertisers pay to show their ads to us, which in turn makes us the product. Our attention is the product that the advertisers are buying. I think over the last several years more and more people have realized this. It’s become accepted to a degree.
Is this an accurate depiction of the exchange that’s going on? Jaron Lanier says “it’s not that simplistic.” Jaron poses the exchange this way: “It’s the gradual, slight, imperceptible change in your own behavior and perception that is the product.” He’s adamant that this is the only possible product; there’s nothing else on the table that could actually be called a product. Hmm, so let that sink in a moment.
He goes on to say, “That’s the only thing there is for them to make money from. Changing what you do, how you think, who you are. It’s a gradual change, it’s slight. If you could go to somebody and say, give me 10 million dollars and I will change the world by 1% in the direction you want it to change… that’s worth a lot of money.” Hold up! Did he just say “changing what you do, how you think, and who you are?”
When things are framed this way, you might pause to consider what’s happening. With regard to being manipulated, are you willing to have your behavior modified to such an extent? Behavior modification to the point that it changes who you are? These are some hefty consequences for what seems like the simple act of interacting on social media.
- -
Shoshana Zuboff, a professor emerita at Harvard Business School, says, “this is what businesses have always dreamt of: to have a guarantee that if it places an ad, it will be successful. That’s their business.” She’s also the author of The Age of Surveillance Capitalism. “In order to be successful in that business, you have to have great predictions. Great predictions begin with one imperative: You need a lot of data.” This allows tech industry platforms to sell certainty.
Tristan Harris then explains the concept of surveillance capitalism. “Capitalism profiting off of the infinite tracking of everywhere everyone goes. Large technology companies whose business model is to make sure advertisers are as successful as possible.” Shoshana then puts a fine point on it: “It’s a marketplace that trades in human futures.”
If it were only about data collection, would most users be okay with this? Is the problem that these companies keep taking it a step further? That they keep building more manipulative design into the platforms in order to shape how the user engages? I’m leaning toward believing that this is the case. The problem isn’t so much that data harvesting is being done, but how the platforms are coercing users into giving more attention and providing more data.
Jeff Seibert, a former executive at Twitter, explains how much AI learns about you from what you view. “Everything you’re doing online is being watched, is being tracked, is being measured. Every single action you take is carefully monitored and recorded. Exactly what image you stop and look at, how long you look at it. Oh yeah, seriously, for how long you look at it.” This assertion by Jeff would likely be a reality check for most people.
He goes on to say that this yields information that can determine when someone is lonely, or depressed, or experiencing any other emotion, and that this will be used to manipulate you. Are you looking at photos of an ex romantic partner? Are you an introvert or an extrovert? They know a lot of intimate details about you. Perhaps in some situations, more than you even know about yourself at that moment. This is pretty profound stuff.
“They have more information about us than has ever been imagined in human history.” Shoshana reminds us this is unprecedented. Sandy Parakilas, former Operations Manager at Facebook and former Product Manager at Uber, adds that these systems have minimal human supervision. “They are making better and better predictions about what we do and who we are.”
- -
“People have misconceptions that it’s our data that’s being sold,” says Aza; the real issue is what they do with that data. “They build models that predict our actions, and who has the best model, wins.” Tristan confirms this, noting that all the clicks we’ve ever made and all the things we’ve ever watched come back to building a more accurate model. “The model, once you have it, you can predict the kinds of things that person does.” It can predict what kinds of videos you’ll want to watch and what kinds of emotions tend to trigger you.
During our engagement, these are things that we believe we are doing, to some extent, of our own free will. But are we? Sure, we are having the emotion, but we are being manipulated into having it triggered. I can be triggered from multiple fronts. Is this what we want in our social media experience? For someone to have the power to send us the things that will get the most reaction out of us? If the system notices that arguing and negative emotion lead to more engagement and more time spent on the platform, then it will calculate that those are the things that should take precedence in your feed.
Tristan informs us of the main goals of these tech companies: engagement, growth, and advertising. The engagement goal is to keep you scrolling and drive up your usage. The growth goal is to keep you coming back and inviting as many friends as possible, who then invite their friends. The advertising goal is to make sure that, as those other things happen, the company makes as much money as possible from ads. Tristan claims, “Each of these goals are powered by algorithms whose job is to figure out what to show you to keep those numbers going up.”
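As a thought experiment, here’s a minimal sketch of what such an engagement-driven feed could look like. To be clear, this is my illustration, not any platform’s actual code; the `predicted_engagement` function is a crude stand-in for the elaborate behavioral models described above:

```python
import random

def predicted_engagement(user_history: list, post_topic: str) -> float:
    """Stand-in for a learned model: score a candidate post by how often
    this user engaged with that topic before."""
    if not user_history:
        return random.random() * 0.1  # explore a little when nothing is known
    return user_history.count(post_topic) / len(user_history)

def rank_feed(user_history: list, candidates: list, k: int = 3) -> list:
    """Show whichever posts the model predicts will keep the numbers going up."""
    return sorted(candidates,
                  key=lambda topic: predicted_engagement(user_history, topic),
                  reverse=True)[:k]

history = ["politics", "politics", "sports", "politics", "outrage", "outrage"]
print(rank_feed(history, ["cooking", "politics", "outrage", "travel", "sports"]))
# -> ['politics', 'outrage', 'sports']: whatever held your attention before wins again.
```

Notice that nothing in this loop asks whether the content is true, healthy, or good for you; the only optimization target is predicted engagement.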
Tim says, “We often talked about at Facebook, this idea of being able to dial that as needed. We talked about Mark having those dials. There’s the opportunity to dial up monetization at any given time.” The platform can do this gradually over time, whenever they choose. But how much is too much? Now that we’re seeing the negative repercussions of this, will they ever turn those dials back?
“We’ve created a world in which online connection has become primary, especially for young people” says Jaron Lanier. “Yet, in that world, anytime two people connect, the only way it’s financed is through some sneaky third person who’s paid to manipulate those two people.” He warns, “So we’ve created an entire global generation of people who are raised within a context where the very meaning of communication, the very meaning of culture is manipulation. We’ve put deceit at the center of everything we do.”
This is followed by a quote from Arthur C. Clarke: “Any sufficiently advanced technology is indistinguishable from magic.”
Which brings us to a pivotal place in the discussion, as Jaron alludes to the sneakiness and deception that is happening here. Even if the intent is proven not to be sinister, it is indeed sneaky and deceitful. The consequences of these actions are rearing their ugly heads. Who is to be held responsible for this? Why do the ethics around what’s happening seem so unclear?
- -
Tristan demonstrates a few simple magic tricks for the camera and segues into a discussion of magicians being the first to understand how people’s minds work. “The magician understands some part of your mind, that we’re not aware of. That’s what makes the illusions work.” He brings up this perspective to get us to start thinking from another vantage point about what technology is doing.
He harkens back to his days at Stanford University’s Persuasive Technology Lab. “How can you use everything you know about the psychology of what persuades people and build that into technology?” Students there, many headed for the tech industry, were encouraged to become behavior-change geniuses. “There were many prominent Silicon Valley figures that went through that class,” Sandy tells us. “They learned how to make technology more persuasive.” Tristan explains, “Persuasive technology is just sort of design intentionally applied to the extreme. Where we really want to modify someone’s behavior. We want someone to take this action.” An example would be getting you to scroll through the platform for longer.
Joe Toscano, a former experience design consultant at Google, mentions another example: when you’re on the platform and you pull down to refresh, you’ll get new entries. He mentions how in psychology this is referred to as positive intermittent reinforcement, which means you get a reward, but you don’t know when you’re going to get it, or whether you’re going to get it at all.
Tristan draws the comparison, “This operates just like the slot machines in Vegas.” These tactics are attempting to “go deep down to the brainstem and plant inside of you an unconscious habit.” When you feel the urge to reach over to your phone, that’s not an accident, it’s a design technique according to Tristan.
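Here’s a minimal simulation of that slot-machine dynamic, known in psychology as a variable-ratio reward schedule. The 30% payout probability is an assumption I picked purely for illustration:

```python
import random

REWARD_PROBABILITY = 0.3  # assumed: roughly 3 in 10 refreshes surface something new

def pull_to_refresh() -> bool:
    """One pull of the 'slot machine lever'; True means new content appeared."""
    return random.random() < REWARD_PROBABILITY

pulls = 0
rewards = 0
while rewards < 5:        # the user keeps pulling until they've been 'paid out' 5 times
    pulls += 1
    rewards += pull_to_refresh()

print(f"{pulls} refreshes to collect 5 rewards")  # ~17 pulls on average
```

Because the payout is unpredictable, the behavior is remarkably resistant to extinction: you keep pulling even through long dry streaks, which is exactly what makes the pattern so habit-forming.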
So yes, ladies and gentlemen, these newly minted behavior geniuses chose to imitate the tactics used in Vegas gambling casinos rather than developing tactics beyond them. Rather than taking caution from what gambling has taught us about addiction, they figured they’d harness that same psychological effect for new tech purposes. Genius, right?!
This was all done without considering the numerous adverse consequences. It seems these decisions were made without any real thought of the negative repercussions. Is it possible that all these intelligent people forgot how to troubleshoot? Shall we still give these noble designers and creators a pass for not taking precautions against what they’ve unleashed? For now, it seems they are somewhat forgiven as they come clean about what brought us to the current crisis. Of course they are, right?!
- -
Another example of this persuasive technology was the design feature of photo-tagging. Tristan explains how Facebook saw it as an opportunity to increase activity. Yet another feature built to keep your attention is the ellipsis: the dot-dot-dot that appears to show you someone is typing a response to you. The ingrained human reaction is to wait and see how the person is going to respond, creating another opportunity for the platform to garner more of your attention.
“There’s an entire discipline and field called Growth Hacking. Teams of engineers whose job is to hack people’s psychology so they can get more growth, more user sign-ups, more engagement, and get you to invite more people.” Sandy Parakilas discusses Chamath Palihapitiya, former VP of Growth at Facebook, and how “he’s very well known in the tech industry for pioneering a lot of the growth tactics that were used to grow Facebook at incredible speed. And those growth tactics had then become the standard playbook within Silicon Valley.”
“One of the things he pioneered was the use of scientific A/B testing of small feature changes,” Sandy continues. “Over time, by running these constant experiments, you develop the most optimal way to get users to do what you want them to do. It’s manipulation.” Does this mean you are a lab rat? “You are a lab rat. We all are lab rats.”
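For the curious, a toy version of such an A/B test might look like the sketch below. The feature being tested, the underlying signup rates, and the sample size are all hypothetical:

```python
import random

random.seed(0)  # reproducible toy experiment

# Assumed 'true' signup rates for two variants of a small feature change;
# in reality these are unknown, which is the whole point of the test.
TRUE_SIGNUP_RATE = {"A": 0.10, "B": 0.12}

def run_variant(variant: str, n_users: int = 10_000) -> float:
    """Expose n_users to one variant and measure the observed signup rate."""
    signups = sum(random.random() < TRUE_SIGNUP_RATE[variant] for _ in range(n_users))
    return signups / n_users

rate_a, rate_b = run_variant("A"), run_variant("B")
print(f"A: {rate_a:.3f}  B: {rate_b:.3f}")
print("Ship variant", "B" if rate_b > rate_a else "A")
```

Run one of these and it looks like harmless product tuning. Run hundreds of them continuously, as Sandy describes, and the product steadily converges on whatever most effectively steers user behavior.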
“Facebook conducted what they called massive-scale contagion experiments.” Shoshana mentions the example: “How do we use subliminal cues on Facebook pages to get more people to go vote in the midterm election? And they discovered that they were able to do that.” They could affect real-world behavior and emotions.
- -
Sean Parker, former President of Facebook, confirms that as a hacker “You’re exploiting a vulnerability in human psychology. I think the inventors, creators… all of these people understood this consciously, and we did it anyway.”
“No one got upset when bicycles showed up,” Tristan says, as he makes a comparison to the introduction of bicycles into society. “If something is a tool, it genuinely is just sitting there waiting patiently. If something is not a tool, it’s demanding things from you. It’s seducing you, it’s manipulating you, it wants things from you. We’ve moved away from having a tools based technology environment to an addiction and manipulation based technology environment. That’s what’s changed. Social media isn’t a tool that’s just waiting to be used. It has its own goals and it has its own means of pursuing them by using your psychology against you.”
If we are certain of this, if these platforms have goals of their own, can this fundamental aspect change? I think it likely won’t change for the current platforms we’re using. Now, at least, we know the arrangement we are subjecting ourselves to when we choose to be a “user” on such platforms. Will this change people’s minds about how innocuous their participation is? Or will there be a mass exodus away from these platforms? Time will tell.