A Los Angeles jury has found Meta and Google liable for intentionally building addictive social media platforms that harmed a young woman's mental health during her childhood. The 20-year-old plaintiff, identified as Kaley, was awarded $6 million (approximately £4.5 million) in damages.
The verdict marks the first time a US jury has held major social media companies directly responsible for addiction-related harm to an individual user. Jurors concluded that both Meta, which owns Instagram, Facebook and WhatsApp, and Google, owner of YouTube, designed their platforms in ways that fostered compulsive use.
Hundreds of similar cases in the pipeline
The ruling is expected to influence hundreds of comparable lawsuits currently progressing through US courts. Both companies have indicated they will appeal. Meta responded by arguing that teen mental health is a complex issue that cannot be attributed to a single application.
Google issued a separate statement disagreeing with the verdict but did not elaborate on its grounds for appeal.
UK implications: pressure on Online Safety Act enforcement
"This case is a watershed moment," said Silverman. "The public, governments and now the US legal system have made clear that social media companies must bear responsibility for the products they create."
Silverman noted that platforms had historically sheltered behind the hosting defence in both the EU and UK, which treated them as passive hosts of user-generated content. That position, she argued, is no longer tenable.
"While I don't expect to see a similar class action in the UK, this most recent decision will put increased pressure on the government to ensure the Online Safety Act is enforced, and to implement further measures to ensure the safety of young people online," she added.
The Online Safety Act received Royal Assent in October 2023, but Ofcom is still phasing in its enforcement regime. The regulator published its first codes of practice in late 2025, with full compliance deadlines extending into 2026 and beyond. Critics have argued the rollout has been too slow given the pace at which platform harms continue to emerge.
The $6 million award itself is modest by US class-action standards, but the precedent it sets for individual liability may prove more consequential than the sum. If appeals fail, platforms face the prospect of being held to account on a case-by-case basis for design decisions that prioritise engagement over user welfare.