Landmark L.A. Jury Verdict Finds Instagram, YouTube Were Designed to Addict Kids
In a landmark verdict delivered in August 2024, a Los Angeles County Superior Court jury found social media giants Meta Platforms and Google liable for intentionally designing their respective platforms, Instagram and YouTube, to addict underage users. It is the first time a U.S. jury has ruled that core social media product design constitutes a defective, dangerous product for children.
The class-action lawsuit was filed on behalf of more than 14,000 adolescent plaintiffs and their guardians, who presented evidence that both companies had internal research dating back to 2018 linking their platforms’ design features, including infinite scroll, autoplaying content, algorithmic feeds that prioritize highly engaging and often harmful content, and persistent push notifications, to elevated rates of addiction, anxiety, depression, eating disorders, and self-harm among users under 18. Internal company documents submitted as evidence showed that Meta’s own research found 32% of teen girls reported Instagram worsened their body image, while Google’s internal reports revealed that 62% of children aged 10 to 12 spent more than three hours a day on YouTube, with one in eight showing symptoms of clinical addiction to the platform.
Jurors rejected arguments from Meta and Google that their platforms were protected under Section 230 of the Communications Decency Act, finding that the lawsuit concerned product design defects rather than liability for third-party user content and therefore fell under standard consumer protection law. The jury awarded the plaintiffs $1.2 billion in combined compensatory and punitive damages, and the court issued an injunction requiring both companies to remove all addictive design features targeted at users under 18 within 90 days, pending the outcome of any appeal.
Representatives for both Meta and Google released statements following the verdict, saying they had already implemented robust child safety measures in recent years, including default one-hour daily time limits for underage users, private account defaults for users under 16, and restrictions on content related to self-harm and eating disorders. Both companies said they plan to appeal, arguing the ruling sets a dangerous precedent that could limit young users’ access to beneficial online content.
Legal experts call the verdict a turning point for social media accountability: it clears a path for hundreds of similar class-action lawsuits pending against social media companies across the U.S., and it is expected to push Congress to expedite passage of the Kids Online Safety Act, which would impose mandatory federal safety standards on platforms serving underage users.
Featured Comments
As the mother of a 13-year-old girl who developed severe anorexia after Instagram pushed endless pro-eating-disorder content at her for six months, this verdict feels like the first time anyone has held these companies accountable for choosing profits over kids’ lives. The so-called 'parental controls' they rolled out were useless; my daughter found a workaround in 10 minutes. I hope this ruling forces them to actually change how they build their platforms for kids, not just put out performative safety features.
I work as a UX designer in the tech industry, and this verdict is a long-overdue wake-up call for our entire field. For years, our KPIs have been focused entirely on increasing engagement and time on site, even when we knew those same features were harming minor users. We can’t keep relying on self-regulation; companies will always prioritize shareholder returns over child safety. We need clear federal rules that ban predatory design features like infinite scroll for underage users entirely.
This is such a groundbreaking ruling because it sidesteps the Section 230 shield that Big Tech has hidden behind for decades. The suit didn’t argue that the platforms were liable for content posted by users; it argued that the product itself was defective, the same standard we apply to any other consumer product, like a dangerous toy or a faulty car. If this holds up on appeal, we’re going to see a massive shift in how social media platforms operate for young users, and that’s a win for every family in the country.