
Transcript:
DARIAN WOODS: Hey, Darian Woods here. It is a big day in the Planet Money world today. Our book is finally out, and you can buy it in stores. It’s called Planet Money– A Guide to the Economic Forces That Shape Your Life. And if you do go and buy it, let us know what you think and take a pic of you reading it and tag us. Let’s see if we can make this a bestseller.
ANNOUNCER: NPR.
[MUSIC PLAYING]
ADRIAN MA: If you spend even a little bit of time on social media, you know how addictive it can be. You wake up in the morning, and you scroll through your feed. You check it again on your way to work and during work. You scroll when you’re eating lunch, when you’re lying down in bed, even when you’re on the toilet.
WOODS: Well, speak for yourself. But no, I imagine a lot of our listeners are even scrolling right now.
MA: Yeah. And if this is you, don’t feel too embarrassed because this is what these apps were for– to keep users engaged. And they do it so well that it’s become a liability for them.
WOODS: That’s right. A couple of weeks ago, a jury in Los Angeles found Meta and Google negligent for designing apps like Instagram and YouTube to be addictive and harmful to children’s mental health. The same week, a jury in Santa Fe found Meta is harming children’s mental health and safety.
MA: Now, Google and Meta have said they disagree with the verdicts and will appeal. But there’s already a lot of talk about how these verdicts could reshape social media as we know it. This is The Indicator from Planet Money. I’m Adrian Ma.
WOODS: And I’m Darian Woods. Today on the show, we speak with Aza Raskin. He’s an entrepreneur credited with inventing the infinite scroll feature common in so many social media apps.
MA: He also testified in that New Mexico trial of Meta. After the break, Aza tells us about the changes he thinks platforms should make to help people take their attention back.
WOODS: It was 2006 when Aza Raskin says he came up with infinite scroll. You know, the thing where you get to the bottom of a feed and more content loads. And you never really get to the end.
MA: Yeah, Aza says he’s sorry for that, by the way.
AZA RASKIN: So I invented the technology before social media got going, really as a technology to help people. And then I went around, and I explained to Twitter and Google and other companies that this is just a more efficient interface. And what I was blind to was that despite my good intentions, in technology, incentives eat intentions.
MA: In other words, even though Aza didn’t intend it, social media companies had a powerful incentive to use infinite scroll to keep users engaged.
WOODS: Aza talked about this when he testified in the New Mexico trial of Meta.
RASKIN: And what I was explaining to the jury is that even though I know perfectly how infinite scroll works to remove a stopping cue, so you keep scrolling– it’s sort of like if a wine glass filled up without you looking at it. You would drink much more because you don’t– your brain doesn’t wake up when you reach the bottom of the wine glass. I was finding myself, like, disappearing to the bathroom in the middle of a dinner to, like, scroll. I actually had to write software to break my own addiction. And it was really important for the jury to understand that this is not a fair fight, that when you open up Instagram or Facebook or YouTube, it’s not just your mind trying to, like, have willpower or control. On the other side of that screen are thousands of engineers who have done hundreds of millions of tests using your own psychology to keep you there.
MA: What’s the evidence that social media companies are knowingly making products that are designed to be addictive or even harmful?
RASKIN: Well, in this case, it’s very clear from internal memos and emails that everyone from executives down knew exactly what they were doing and chose to do it anyway because these are engagement-based companies, and doing anything that dropped engagement or the number of users lowers their stock price, lowers bonuses for people, and causes their competitors to be able to outcompete them.
WOODS: Based on this type of evidence, jurors in the California case awarded the plaintiffs $6 million in damages. In the New Mexico case, jurors awarded $375 million. That might not be much for companies with billions and billions of dollars in profits each year, but there are thousands more similar lawsuits that are still pending.
MA: For Aza, the money is significant, but it’s not the most important thing about these cases.
RASKIN: This is really the first time that these companies are being held accountable for the design decisions that they make. Normally, these companies hide behind something called Section 230, which says that they are not responsible for the content that users post.
MA: But the plaintiffs in these cases didn’t base their claims on the content of the platforms, so the companies couldn’t use Section 230 as a legal shield. Instead, the plaintiffs focused on the design of the apps, arguing the companies knew what they were doing was likely to cause harm, and they did it anyway. Now, the tech companies have argued that they do a lot to keep kids safe online. They also say there’s no clinical diagnosis for social media addiction, and that mental health is complex and can’t be traced to a single app.
WOODS: And what does the research say? Well, the American Psychological Association says social media use comes with both potential benefits and potential risks. But children and adolescents may be particularly vulnerable to the risks because their brains are still developing.
MA: As we said, the companies are appealing the verdicts, but if they eventually stick, these cases and the thousands like them have the potential to reshape our experience with social media. For Aza, it’s an opportunity to force companies to change their apps to make them less addictive. And what might that look like exactly?
WOODS: We might see the end of our social media infinite feeds. I don’t know. But you know, Aza says, for starters, companies could just add a little bit of friction to the user experience.
RASKIN: So you know when you sit on an airplane, and it has sort of bad Wi-Fi? You, like, see that Instagram isn’t loading fast, so you go do something else. It’s literally like adding speed bumps to a road. It doesn’t remove any freedom. It just says maybe go a little bit slower, gives you a little more time to think. Other tweaks could be to just remove infinite scroll, so you have to click to go to the next page, and remove autoplaying videos. Even that would already do a lot.
MA: I guess I’m hearing the voice in my head of somebody who sees these sorts of recommendations that you’re making and feels like this is a nanny state telling private companies to make their products worse for people. What would you say to that person?
RASKIN: Well, this is very similar to the kinds of arguments that people made about seat belts– I should just be able to have whatever car with whatever lack of safety. And that’s not the society we live in. We make rules that keep us all safe.
WOODS: Listening to Aza, we couldn’t help but see some similarities between the incentives driving social media companies and those driving AI companies. And he says he’s concerned about AI too.
RASKIN: Well, now the race to attention becomes the race to intimacy. That is, there is massive market incentive to have your company’s AI occupy the chief intimate relational spot in someone’s life, especially kids. Because any moment you spend talking with friends or spending time in the outside world is a moment you are not talking to the AI.
MA: So whether it’s AI or social media, these companies have the incentives and the tools to hack our brains. And for the most part, as a society, we’ve accepted that. But Aza thinks that might be changing.
RASKIN: You know, what’s so exciting is that as of a couple of weeks ago, India and Indonesia both announced that they are contemplating or will ban social media for kids.
WOODS: Indonesia actually made the social media ban for kids official on March 28.
RASKIN: And when you put it all together, they’re following Australia and Denmark and Spain and France. If you went back two years, you’d say that’s impossible. There’s this growing, I think, human movement where we are recognizing that technology is encroaching on our humanity.
WOODS: You could argue about whether these bans are the right way to deal with the issue. But one thing is for sure, if social media was like a natural global experiment on humanity, it does seem like that experiment is entering a new phase. This episode was produced by Angel Carreras, with engineering by Kwesi Li. It was fact-checked by Cooper Katz McKim. Kate Concannon is our editor. And The Indicator is a production of NPR.