This week, a Los Angeles County Superior Court jury found Meta and Google liable in negligence for deliberately designing platforms that trapped the Plaintiff, a young woman who began using Meta’s Instagram app and Google’s YouTube app as a child, in a cycle of addiction that caused her mental health harms.
During the trial of the action, filed in 2023, the Plaintiff, K.G.M., testified that she began using Google’s YouTube at the age of 6 and Meta’s Instagram at the age of 9, and that she had become addicted to these platforms. She testified that, because of her addiction, she spent hours on end on the platforms, which negatively affected her relationships with her family and her performance at school.
By age 10 she had become depressed, was engaging in self-harm, and was diagnosed with body dysmorphic disorder and social phobia attributed to her use of the platforms. The jury found that Meta’s and Google’s apps played a substantial role in the mental health harms suffered by the Plaintiff and awarded her $6 million in damages.
This decision is likely to shape ongoing global debates about the extraordinary power and influence wielded by social media companies over the infrastructure, behaviour and norms that shape the lives of social media users, and about how to ensure that these companies are held accountable. K.G.M.’s case is the first test case, known as a bellwether, tied to more than 1,600 lawsuits brought by parents and school districts across the United States against social media companies.
Meta and Google have stated that they intend to appeal the jury’s decision. If the verdict is upheld on appeal, it is likely to influence the outcome of other lawsuits against social media companies and may enable groups who have suffered similar harms through social media platforms to seek damages. The decision may also put pressure on social media companies to change their product design.
Litigation is not a magic bullet
Although the jury’s decision highlights the harms caused by addictive design features embedded in social media platforms, such as algorithmic amplification, infinite scroll, constant notifications and beauty filters, the decision does not compel companies to change the design of their platforms.
Additionally, while there have been efforts to regulate social media companies in some jurisdictions through regulations such as the UK’s Online Safety Act, the EU’s Digital Services Act and Digital Markets Act and the EU’s and UK’s General Data Protection Regulations, regulatory challenges remain.
The question is whether legislators and regulators will now take steps to regulate the design of social media platforms in ways that protect rights, and that address structural conditions that drive harm through these platforms.
Specifically, whether legislators and regulators will: (1) enact regulations that fully address the problematic surveillance business model of social media companies, which is based on the extraction and exploitation of data and ‘profit-oriented algorithmic curation of content’ to maximise user engagement on the platforms; and (2) take steps to prevent continued regulatory capture by social media companies, which contributes to watered-down regulations that do not fully address the harms caused by these platforms.
ENDS