Youth online safety is in the spotlight, with Australia’s social media ban for users under 16 taking effect on 10 December.
In the U.S., Meta (Facebook and Instagram), ByteDance (TikTok), Alphabet (YouTube) and Snap (Snapchat) will have to defend themselves before juries against claims that their products were intentionally engineered to hook young users.
Los Angeles Superior Court Judge Carolyn B. Kuhl issued the ruling earlier this week, rejecting the companies’ last major procedural attempt to avoid trial. One negligence claim was narrowed, but most of the allegations survive.
The decision sets up the first wave of trials in a massive litigation campaign that began roughly three years ago and now includes thousands of lawsuits from teens, parents, school districts and state attorneys general.
The first three jury trials are scheduled to begin in January. These will be the “bellwether” cases: early tests that help shape how the rest of the litigation may go, and which often drive settlement negotiations.
What the suits claim
Plaintiffs say the companies intentionally built features such as algorithmic content recommendations, frictionless endless scrolling and recurring personalised alerts to maximise time spent in their apps, despite knowing that heavy use could harm young people’s mental health.
They cite alleged outcomes including:
- depression and anxiety
- sleep disruption
- eating disorders
- self-harm
- suicide
What the companies say
The tech platforms all deny wrongdoing and say the suits misunderstand how their products operate.
Meta said it has worked with experts and parents for years to increase safety features and support young users.
Google spokesperson José Castaneda said the suits are based on incorrect premises about YouTube’s actual usage patterns, arguing YouTube is used primarily as a streaming service rather than an interpersonal social network.
Lawyers for Snap said Snapchat is designed fundamentally differently and is built to prioritise privacy and safety.
TikTok did not comment before the ruling was issued.
High-stakes test of Section 230
The upcoming trials will also be a major test of Section 230, the federal law that has historically shielded platforms from liability for content posted by users.
The judge said a jury should decide whether product design choices (like infinite scroll) are distinct from moderation of user content and therefore potentially fall outside Section 230’s protection.
The first trial will centre on a 19-year-old California plaintiff who says she developed addiction, anxiety, depression and body dysmorphia from using the platforms. Jury selection starts in January. Meta CEO Mark Zuckerberg, Instagram lead Adam Mosseri, and Snap CEO Evan Spiegel are expected to testify.
