Addictive Influencing: the A.I. behind A.I.
A Continued Look at the Revelations of The Social Dilemma Documentary
“There are only two industries that call their customers ‘users’: illegal drugs and software.” — Edward Tufte.
What do you think of when you read this statement? What correlation can we make between illegal drugs and software?
‘Users’ are reliant upon ‘products,’ usually to the point of getting addicted to them. The discussion around addiction is not new to society, but there can be disagreement on where to place responsibility. Is it the responsibility of the user? What about the responsibility of the provider? What about the responsibility of the tech companies who increasingly incorporate addictive design to elicit more usage?
In the documentary “The Social Dilemma,” Tim Kendall speaks of when he was the president of Pinterest a few years ago, “I was coming home, and I couldn’t get off my phone once I got home, despite having two young kids who needed my love and attention.” He admits, “This is classic irony. I am going to work during the day building something that then I am falling prey to… and some of those moments, I just couldn’t help myself.”
The tech designers who put these manipulative designs into the software aren't even successful at curbing their own behavior, behavior reaching levels akin to substance addiction. Isn't this alarming? If the tech designers, who are already aware, still do this, can you imagine the impact on those who are less aware?
Aza Raskin says the platform he's most prone to is Twitter, and before that it was Reddit. He states, "I actually had to write myself software to break my addiction to reading Reddit." Tristan Harris says he's most addicted to his e-mail. Once again, this is commentary from experts in the tech field who are aware of the nature of these design features.
“It’s interesting that knowing what was going on behind the curtain, I still wasn’t able to control my usage.” Tim confesses, “That’s a little scary.” Eric adds, “Even knowing how these tricks work, I’ll still be susceptible to them. I’ll still pick up the phone and 20 minutes will disappear.”
Tim goes on to mention how he would try to use willpower to put down his phone. He tried, thousands of times, different tactics for separating himself from it. "Willpower was kind of attempt one. And attempt two was, you know, brute force."
This isn’t being said jokingly. There may be partial humor for the designers, as they’re part of the tech industry, but their tone clearly comes across as cautionary. How seriously are we meant to take these admissions? Are we supposed to think, “Yes, this is some serious stuff! Thanks for fessing up.”
Yet the tech designers who were once complicit aren’t giving us any assurance that this is going to stop. If anything, they seem to imply that it will continue to move in the direction it’s been going. Perhaps I’m missing something here, but shouldn’t it be a top priority to reverse the aspects of tech design that are proving to be detrimental to society?
Dr. Anna Lembke, Medical Director of Addiction Medicine at Stanford University, proclaims, "So here's the thing: social media is a drug." I imagine many people who've been using social media throughout the years can attest to this, especially when viewing addiction through the lens of consumption.
She continues, "We have a basic biological imperative to connect with other people that directly affects the release of dopamine and the reward pathway. Millions of years of evolution are behind that system to get us to come together and to live in communities, to find mates, to propagate our species."
From this physiological standpoint, since social media optimizes this drive to connect, it carries real potential for addiction. When you think about it, connection is the primary reason to be on these platforms. The designers keep layering on more of these drives, increasing the pull to engage. Individuals have far less control over this than most people realize.
Dr. Lembke continues, "I'm worried about my kids. And if you have kids, I'm worried about your kids. Armed with all the knowledge that I have, and the experience, I am fighting my kids about the time they spend on phones and on the computer." When someone is asked how many hours they spend online, it's very likely the amount will be downplayed. We hear from Dr. Lembke's kids about what they think their usage is, and it's quickly revealed that they indeed spend more time online than they estimate.
"There's not a day that goes by that I don't remind my kids about pleasure/pain balance, about dopamine deficit states, about the risk of addiction," emphasizes Dr. Lembke. Hold up! So, are parents supposed to be talking to their kids about pleasure/pain balance and dopamine deficit states? Will kids even understand what this is, and that their screen time plays a role in it? At what age are kids supposed to be learning about and questioning tech addiction?
Not just your kids, but you too. How often do you monitor your own dopamine deficit states? I'm gathering most people do this minimally, if at all. Sure, I think it's great to check up on these things, but are medical experts recommending how frequently it's needed? If there are physical and neurochemical responses happening, how do we elevate the importance of recovery and revitalization?
“These technology products were not designed by child psychologists who were trying to protect and nurture children,” says Tristan. “They were just designing to make these algorithms that were really good at recommending the next video to you, or really good at getting you to take a photo with a filter on it.”
Tristan continues to reveal how severe the influence these platforms have on the user really is. "It's not just that it's controlling where they're spending their attention. Especially social media, it starts digging deeper and deeper down into the brainstem and taking over kids' sense of self-worth and identity." Continually posting pictures of oneself and waiting for comments can have negative side effects, whether from using filters to enhance the way one looks or from seeking external validation so frequently. These behaviors are a recipe for lower self-worth.
Tristan warns us that we've moved far beyond just taking into consideration what other people in our tribe think of us. "But are we evolved to be aware of what 10,000 people think of us? We were not evolved to have social approval being dosed to us every 5 minutes. That was not at all what we were built to experience."
I think this is another point that escapes people: you have to take scale into account. It's not only the modified behaviors or the types of interaction that must be considered; you also have to factor in the number, frequency, and intensity.
Chamath Palihapitiya, former VP of Growth at Facebook, states, "We curate our lives around this perceived sense of perfection. Because we get rewarded in these short-term signals of hearts, likes and thumbs up. We conflate that with value and we conflate it with truth." He claims, "It leaves you even more vacant and empty than before you did it. Because then it forces you into this vicious cycle of: What's the next thing I need to do now? I need it back."
Do you think it’s as bad as Chamath claims it to be? Any one of us can determine this for ourselves if we pay attention to it. Each person is their own experiment, right?! Are you noticing if this behavior leaves you feeling vacant and empty like he says it does? Do you feel yourself caught in a vicious cycle of continually thinking of what you need to do next on social media? Continually thinking about what you need to do in order to get more likes and approval?
As if the challenge were only about attaching value where there is none. This is only part of the overall picture in which one's perception gets distorted. As someone who's studied psychology and health coaching, I can attest that there are negative repercussions to the continual seeking of external validation. Whether the attention and admiration are genuine or not, users are being drawn away from their own internal recognition. This is reshaping an individual's construction of self-esteem.
Jonathan Haidt, PhD, a social psychologist from the NYU Stern School of Business, brings up some startling statistics about teen and pre-teen girls and the predominance of social media use. "There has been a gigantic increase in depression and anxiety for American teenagers which began right around between 2011 and 2013." His findings show a marked increase in the number of children in the US being admitted to the hospital due to self-harm.
Among girls aged 15 to 19, there's been a 62% increase since 2009. Among pre-teens aged 10 to 14, there's been a 189% increase. "That's nearly triple," says Prof Haidt. "Even more horrifying, we're seeing the same pattern with suicide." Suicide among pre-teen girls, practically non-existent before this past decade, has risen by 151%. In the US, deaths by suicide among teenage girls are up 70% this past decade compared with the one before.
What is going on with these huge increases? Fortunately Jonathan and others are taking note and sharing their research. Jonathan is also the author of The Righteous Mind: Why Good People Are Divided by Politics and Religion. I’ve seen interviews with him before and I’m glad his perspective was included in this documentary. It’s necessary to hear some voices outside of the tech industry to assess what’s happening. He notes how “that pattern points to social media” and how the pattern coincides with growing use of mobile devices.
While affecting all generations, it’s particularly disconcerting for those in Generation Z. He says, “Gen Z, the kids born after 1996 or so, those kids are the first generation in history that got on social media in middle school. How do they spend their time? They come home from school, and they’re on their devices. A whole generation is more anxious, more fragile, more depressed.”
A host of problems comes along with this. Society is learning the hard way that desperation for approval and cyberbullying are ingrained within these social media platforms. It's leaving youngsters afraid to take risks and socialize in person the way previous generations have. "They are much less comfortable taking risks. The rates at which they get driver's licenses have been dropping. The number who have ever gone out on a date or had any kind of romantic interaction is dropping rapidly. This is a real change in a generation."
Once again, looking at the broader picture: "Remember, for every one of these, for every hospital admission, there's a family that is traumatized and horrified: 'My God, what is happening to our kids?'" As if this weren't damaging enough for the kids, the negative ripple effect reaches everyone connected. Parents are feeling guilt and anxiety that they are playing a part in this, but it is not solely their responsibility.
Tim Kendall comments, "It's plain as day to me. These services are killing people, causing people to kill themselves." These statements sound pretty dire to me. Tristan adds, "I don't know any parent who says, 'I really want my kids to be growing up being manipulated by tech designers, manipulating their attention, making it impossible to do their homework, making them compare themselves to unrealistic standards of beauty.' No one wants that." Yet this is what's happening.
Is there a way around this, other than abstaining from social media platforms altogether? That seems impossible in modern countries like the US. Is it too late for these platforms to reform the design strategies they use to boost growth and user engagement? What could those changes be? I can't say I have faith that they'll shift away from the tactics they've been using in their advertisement and growth models. Nor do I have faith that much will be done in the form of regulation to make these designers act more ethically.
"We used to have these protections," Tristan states. "When children watched Saturday morning cartoons, we cared about protecting children. You can't advertise to kids in these ways." Yet this has changed; all kids are exposed to this now. "All those protections and all those regulations are gone." Parents and society never made a conscious decision to get rid of these protections, so why are people accepting this?
So we've gotten to the part of the documentary I find most disconcerting. With the revelation that suicide rates are increasing for children and the correlation with social media use, I feel there needs to be a call to action. What are these tech designers doing about this? Are they merely warning us and leaving it in parents' hands to manage their kids' behavior? It's not just kids, though; it's every person who interacts regularly with these platforms. We must do something about these root factors.
We continue to be warned about the psychological and mental health consequences of the growth of social media in society. "We're training and conditioning a whole new generation of people, that when we are uncomfortable, or lonely, or uncertain, or afraid, we have a digital pacifier for ourselves," Tristan points out. "This is kind of atrophying our own ability to deal with that." These kinds of negative effects won't be resolved overnight, either. This will indirectly affect other ways that we relate and communicate with one another.
At the Chicago Antitrust Tech Conference, Tristan is a panel speaker sharing his perspective: "Photoshop didn't have a thousand engineers on the other side of the screen using notifications, using your friends, using AI to predict what's going to perfectly addict you or manipulate you."
This part of the discussion reminds me of how there has always been some new level of media manipulation involved with innovative technology entering our lives. Yet the technologies we're currently experiencing are not equivalent when it comes to addictive influencing, or so Tristan and other designers are trying to warn us. "This is a totally new species of power."
"There's this narrative that, you know, we'll just adapt to it, we'll learn how to live with these devices. Just like we've learned how to live with everything else," Tristan continues to explain. "But what this misses is that there's something distinctly new here." This is where people must realize that throughout history, there have been similar responses to societal advancement. But this is not the same. The differences must be acknowledged. It's startling that some people can't see this distinction.
Randy Fernando, a former Project Manager at Nvidia and former Executive Director at Mindful Schools, says, "Perhaps the most dangerous piece of all this is that it's driven by technology that's advancing exponentially." He draws the comparison that computer processing power has improved over a trillion times, whereas the human brain has not changed at all. As co-founder of the Center for Humane Technology, Randy sees this as central to how we move forward.
“Human beings, at a mind and body and sort of physical level, are not going to fundamentally change,” adds Tristan. “You’re living inside of hardware — the brain — that’s millions of years old. Then there’s this screen, and then on the opposite side of the screen there’s these thousands of engineers and supercomputers that have goals that are different than your goals. And so who’s going to win in that game?”
Do you have confidence that your brain is suitable hardware for going up against all that tech engineering? Who do you think is going to win?
AI is already running so much of the world. Justin Rosenstein, a former Engineer at Google and Facebook, talks about AI as "massive, massive rooms." He goes on to describe that "some of them are underground, some of them are underwater. Just computers, tons and tons of computers as far as the eye can see. They are deeply interconnected with each other and running extremely complicated programs, sending information back and forth between each other all the time."
Cathy O'Neil, PhD, a Data Scientist and author of "Weapons of Math Destruction," explains, "Algorithms are opinions embedded in code. Algorithms are not objective. Algorithms are optimized to some definition of success." More specifically, she states, "if a commercial enterprise builds an algorithm to their definition of success, it's a commercial interest. It's usually profit."
Jeff Seibert, a former Executive at Twitter, says, "You're giving the computer the goal state: I want this outcome. And then the computer itself is learning how to do it. That's where the term machine learning comes from." He tells us what that goal state looks like for social media platforms: "And so every day, it gets slightly better at picking the right posts, in the right order, so that you spend longer and longer on that product."
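To make this loop concrete, here is a minimal, hypothetical sketch of the idea being described: a learner is given only a goal ("maximize time spent"), observes how long people linger on each kind of post, and gradually learns which content serves that goal. The post categories and watch-time numbers are invented for illustration; this is not any platform's actual code, just a toy epsilon-greedy bandit.

```python
import random

# Hypothetical post categories with hidden average watch times (seconds).
# The learner never sees these numbers directly; it only observes outcomes.
TRUE_WATCH_TIME = {"outrage": 45.0, "cute_animals": 30.0, "news": 12.0}

def pick_post(estimates, epsilon=0.1):
    """Epsilon-greedy choice: usually exploit the best-known category,
    occasionally explore another one."""
    if random.random() < epsilon:
        return random.choice(list(estimates))
    return max(estimates, key=estimates.get)

def run_feed(rounds=5000, seed=0):
    random.seed(seed)
    estimates = {cat: 0.0 for cat in TRUE_WATCH_TIME}  # learned value per category
    counts = {cat: 0 for cat in TRUE_WATCH_TIME}
    for _ in range(rounds):
        cat = pick_post(estimates)
        # Observed engagement: true mean plus noise.
        observed = random.gauss(TRUE_WATCH_TIME[cat], 5.0)
        counts[cat] += 1
        # Incremental mean update: every round it "gets slightly better."
        estimates[cat] += (observed - estimates[cat]) / counts[cat]
    return estimates, counts

estimates, counts = run_feed()
# The learner converges on whatever keeps people watching longest,
# with no notion of whether that content is good for the viewer.
print(max(estimates, key=estimates.get))
```

Note what the sketch shows: nothing in the objective mentions the viewer's wellbeing. The system simply discovers that the category with the longest watch time wins, which is exactly the concern raised above.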
"The algorithm has a mind of its own," says Bailey Richardson, of the Early Team at Instagram. "Even though a person writes it, it's written in a way that you kind of build the machine, and the machine changes itself." Sandy Parakilas, a former Operations Manager at Facebook, reminds us, "there's only a few people who understand how those systems work. And even they don't fully understand what's going to happen with a particular piece of content."
Sandy has a rather dystopian take on the matter, “So, as humans we’ve almost lost control over these systems. Because they’re controlling the information that we see. They’re controlling us, more than we’re controlling them.”
At this point, the film asks us to consider whether someone's feed is good for them. More particularly, do the AI and algorithms work on the basis of providing a feed that is good for the user? It appears we're coming to the realization that they do not. They do not discern for us, not the way we can discern for ourselves.
After an interlude of Screamin' Jay Hawkins's "I Put a Spell On You," there's hardly any debate about how deep a hold social media has on us.
Roger McNamee, a Venture Capitalist and early investor in Facebook, says, "So imagine you're on Facebook and you're effectively playing against this Artificial Intelligence that knows everything about you, can anticipate your next move, and you know literally nothing about it." A clear point is made: "That's not a fair fight."
The documentary brings us to the Center for Humane Technology as we see Tristan Harris speaking to a large group of tech industry colleagues. "We're all looking out for the moment when technology will overwhelm human strengths and intelligence. When is it going to cross the singularity, replace our jobs, and be smarter than humans? But there's this much earlier moment when technology exceeds and overwhelms human weaknesses."
"This point being crossed is at the root of addiction, polarization, radicalization… this is overpowering human nature." Tristan grimly states, "This is checkmate on humanity." This is a rather serious statement to make. Is it hyperbole? If this is as serious as Tristan claims, then why is no one reacting? How come people aren't taking issue with the severity of these assertions?
"One of the ways I try to get people to understand just how wrong feeds from places like Facebook are is to think about Wikipedia." Jaron Lanier, a Computer Scientist and Founding Father of Virtual Reality, draws a striking comparison to illustrate a major concern about how we perceive our Facebook feed. With Wikipedia, "when you go to a page, you're seeing the same thing as other people. So, it's one of the few things online that we at least hold in common."
“Just imagine for a second that Wikipedia said we’re gonna give each person a different customized definition. And we’re gonna be paid by people to do that.” To further this extreme hypothetical, he says, “Wikipedia will be spying on you. Wikipedia will calculate what’s the thing I’m going to do to get this person to change a little bit on behalf of some commercial interest. And then it would change the entry.”
“Can you imagine that?” Jaron comically reminds us that “well, yes you should be able to, cause that’s exactly what’s happening on Facebook, that’s exactly what’s happening in your YouTube feed.”
Justin talks about the varying Google results that will come up for different people. These results are not a function of the truth, but a function of the particular things Google knows about your interests. It’s also influenced by where you’re Googling from.
Roger reminds us, "Facebook is in charge of your newsfeed." I think most people know this, but do they know the extent of the problems that come with it? Will we ever stop ourselves from willfully participating in these delusions? Yes, even while knowing your newsfeed isn't fact, part of you will form a false sense of what's going on in the world around you when you continuously view it. It's a collection of curated slices of information, every day.
Rashida Richardson, professor at NYU School of Law and Director of Policy Research at A.I. Now Institute, mentions, “We all simply are operating on a different set of facts. When that happens at scale, you’re no longer able to reckon with or even consume information that contradicts with that world view you’ve created. That means we aren’t actually being objective or constructive individuals.”
Justin reminds us of how this setup shapes the polarization that occurs. The average person lacks an understanding of why the polarization is happening. He says, "Then you look over at the other side, and you start to think, 'How can those people be so stupid? Look at all of this information that I'm constantly seeing. How are they not seeing that same information?' The answer is: they're not seeing that same information."
These tech experts are raising awareness about what is really happening. Will society listen? Will each person get access to the information these experts are sharing? Can they truly assess and process what’s happening? Once the problems are clearly stated and the record is straight, then we have a better chance to create solutions.