Over just two days in the last week of March, juries in two states sent a message to social media companies: we will protect our children.

In verdicts that have been years in the making, jurors found social media companies liable for harms suffered by young people—underscoring a growing recognition that these platforms must be held accountable for conduct that endangers a vulnerable population. The significance of these wins is monumental, but they are just the beginning of a much longer road ahead.

Two verdicts, two states, a clear message

On March 24, 2026, a jury in New Mexico delivered a verdict holding Meta Platforms Inc. accountable for endangering children. Just one day later, on March 25, 2026, a California state court jury found Meta and Google LLC liable for causing harm to an individual plaintiff known by her first name, “Kaley.”

Together, these back‑to‑back verdicts reinforced a growing legal consensus: social media companies can be held responsible when their conduct causes harm to young users.

Evidence the jury couldn’t ignore

The California jury heard from Google and Meta executives—including Mark Zuckerberg—as well as former Meta whistleblowers, the plaintiff herself, and multiple experts in the fields of technology and mental health. Ultimately, the jury concluded that Meta and Google were responsible for Kaley’s injuries. Jurors awarded Kaley $3 million in compensatory damages, apportioned 70% to Meta and 30% to Google, and an additional $3 million in punitive damages, doubling the total award. Punitive damages are awarded when a jury finds a defendant acted with “malice, oppression, or fraud.”

Kaley alleged that her mental health struggles, including social media addiction, severe anxiety and social anxiety, depression, and body dysmorphia, were significantly caused or made worse by her use of Instagram and YouTube. Her case was the first personal injury case against social media companies to go to trial on allegations of harm caused by defective design of the products.

Over multiple years of legal argument and resulting rulings in California, the plaintiff, represented at trial by a team led by Mark Lanier with support from Motley Rice attorneys Sara Couch, Nelson Drake, Jessica Colombo, Florence Simon, and me, successfully argued that Section 230 of the Communications Decency Act of 1996 does not shield defectively designed platform features that encourage addiction and cause other mental health issues. Section 230 offers significant protections to social media companies where the alleged harms are caused by third-party content, but the plaintiff’s theory is that the features of the platforms themselves, not just the content they host, cause harm. Common examples of such features include endless scrolling, autoplay, and beauty filters.

Just the beginning

While Kaley’s case is historic, she is far from alone. Thousands of children and young people across the country have alleged harms stemming from social media use, including claims similar to Kaley’s as well as other mental health harms such as eating disorders, self-harm, suicide, and attempted suicide. This verdict and the New Mexico verdict are not singular victories; they are the tip of the spear of a much broader push for accountability.

Multiple litigations continue to work through courts around the country via mechanisms that allow individual plaintiffs, school districts, and entire states to bring claims against social media companies, particularly Meta (Instagram and Facebook), Google (YouTube), TikTok, and Snapchat. You can learn more about the landmark verdict in the New Mexico Department of Justice’s trial, in which the jury held Meta liable for misrepresenting the safety of its platforms and taking unfair advantage of children, and which resulted in $375 million in civil penalties. Read a timeline of the litigation.
