Meta and YouTube were just ordered to pay a woman a combined six million dollars for creating addictive products that harmed her psychologically. TikTok and Snapchat were also sued but settled out of court.
The 20-year-old plaintiff joined YouTube at age 6 and Instagram at age 9. She explained that, over the course of her childhood and adolescence, her experience on these platforms gave her body dysmorphia and suicidal thoughts, and led her to self-harm.
The lawsuit accused Meta and YouTube of intentionally creating addictive platforms that enabled harm. Importantly, this strategy circumvented Section 230, which protects social media companies from liability for content posted on their platforms, by accusing the platforms themselves of being harmful, not just what users uploaded or posted.
Competing for Our Attention
Meta and companies like it are engaged in what one design ethicist calls a “race to the bottom of the brainstem.” That is, each social media company has to compete for our attention. When one company designs a feature like infinite scrolling, all similar companies must either add it too or lose market share. That’s why when TikTok became an overnight success, Instagram and Facebook added Reels, and YouTube added YouTube Shorts. Short-form videos are powerful ways to keep our attention. If they hadn’t added them, they would have lost users and ad revenue.
Facebook used to simply display everything your friends had posted in reverse-chronological order. This was great for users, but bad for Facebook’s bottom line. Most people would open Facebook, read the most recent posts, and then leave the platform. Instagram functioned the same way. The trouble for the platforms was that people were leaving satisfied. The longer you stay on their platforms, the more money they make. So they hired user experience researchers to find ways to maximize your attention.
Since that early version, Facebook has randomized the order of posts, added suggested content, added short-form videos, added infinite scroll, and collected all the data it can. It notes whether you scroll past or linger on every post in your feed, compiles everything you’ve ever “liked” or commented on, compares that with the activity of everyone you talk to in order to guess what content is most likely to keep you engaged, and pushes notifications whenever anyone posts anything in a group you follow. It even tracks what websites you visit when you’re not using Facebook. It doesn’t care whether you actually enjoyed the content, just that the content kept you on the platform for a few more precious seconds.
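The shift described above, from showing posts by recency to showing posts by predicted engagement, can be sketched in a few lines of code. This is purely illustrative: the field names, signals, and weights below are hypothetical stand-ins, not anything from Meta's actual ranking system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int          # seconds since epoch
    predicted_dwell: float  # guessed seconds you'll linger on the post
    predicted_like: float   # guessed chance you'll like/comment (0-1)

def chronological_feed(posts):
    """The early-Facebook model: newest posts first, then you leave."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_feed(posts):
    """Rank by predicted engagement rather than recency or enjoyment."""
    def score(p):
        # Hypothetical weights: dwell time and interaction likelihood
        # both translate into more time spent on the platform.
        return 0.7 * p.predicted_dwell + 0.3 * p.predicted_like
    return sorted(posts, key=score, reverse=True)

posts = [
    Post("friend", timestamp=100, predicted_dwell=2.0, predicted_like=0.9),
    Post("stranger", timestamp=50, predicted_dwell=30.0, predicted_like=0.1),
]

# Chronological ordering puts your friend's recent post first...
assert chronological_feed(posts)[0].author == "friend"
# ...but the engagement ranker surfaces whatever holds your attention.
assert engagement_feed(posts)[0].author == "stranger"
```

The point of the toy example: once ranking optimizes for predicted attention instead of recency, content you'd enjoy less but watch longer wins the top slot.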
Platforms like Meta and YouTube claim that they are not trying to make their products “addictive.” I believe them, but the distinction is only semantic. I don’t think they want to be “addictive,” but they do need to keep you on their platforms as long as possible to stay competitive. A product that uses psychological tricks to make itself hard to stop using and to keep you coming back is not meaningfully different from one that is addictive.
Failing to Implement Safety Features
The lawsuit also accuses companies like Meta of removing, or failing to implement, safety features so as not to lose users. For example, Meta’s “growth team” looked into making teen accounts private by default, which would prevent strangers from contacting children under 17. When they found that doing so would cost them teen users, they decided to continue allowing adults to contact minors.
Meta says that they have since created a number of safety features in their Teen Accounts program, but most of the promised features either were never implemented or do not work as advertised.
I hope this ruling pushes social media companies to remove infinite scroll, algorithmic recommendations, and other habit-forming features, but I do not think it will solve the problem long term. It will set the companies back, but it will not stop them from searching for even more engaging features to implement. As I mentioned above, the line between “engaging” and “addictive” is very blurry.
The only solution I can see is a dedicated, safe social media space for minors. No randomized post order. No infinite scrolling. Hidden like counts. No tracking of user data. No suggested accounts. No ability to communicate with strangers. A robust, effective system for reporting other users. However, I remain pessimistic that any tech company big enough to build it can be trusted to do so, which is why we also need legislation requiring social media companies to protect their users from anticipated harm.

