Social Media Harm Lawsuit

Social media use has negatively affected some children and teens. People are now filing lawsuits against social media companies to hold them accountable.


Case Overview

States, school districts and individuals are filing social media mental health lawsuits alleging platforms like Instagram, TikTok, Snapchat and YouTube are addictive and harmful to youth. Children and teens are especially vulnerable. If you or your child has suffered harm from social media use, you may be eligible to file a social media harm lawsuit.

Important social media harm lawsuit updates

  • December 2024: New lawsuits have been filed in the Adolescent Social Media Addiction MDL, bringing the total number of claims to 815.
  • October 24, 2024: District Judge Yvonne Gonzalez Rogers released a ruling that allows many lawsuits against Meta (Instagram and Facebook), ByteDance (TikTok), Alphabet (YouTube) and Snap (Snapchat) to move forward. Although her ruling included some limitations, it may still encourage more plaintiffs to file mental health lawsuits against these social platforms.
  • October 15, 2024: A recent ruling from Judge Rogers allows social media lawsuits to proceed in court. The lawsuits in question were filed by dozens of school districts across 19 states. The school districts aim to hold social media companies responsible for related mental health problems.
  • October 1, 2024: New lawsuits have been filed in the Adolescent Social Media Addiction MDL, bringing the total number of claims to 594.

More social media harm lawsuit updates appear in the news section below.

Key social media harm lawsuit takeaways

  • In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation is a multidistrict litigation case (MDL 3047) centralized in the Northern District of California.
  • MDL 3047 involves lawsuits against Meta (Instagram and Facebook), Snap, Inc. (Snapchat), ByteDance (TikTok) and Google (YouTube).
  • Social media harm lawsuits address the impacts of social media on adolescent mental health. These lawsuits claim social media addiction leads to other harms like eating disorders, anxiety, depression, self-harm and suicidal thoughts and actions.

Social media lawsuit overview

Individuals harmed by social media companies are filing lawsuits for compensation. This includes litigation against Facebook, Instagram, Snapchat, TikTok and YouTube. These social media lawsuits claim harm to adolescents. Plaintiffs include parents on behalf of minors, younger adults who were harmed as minors, states, local governments and school districts.

Motley Rice represents clients in multidistrict litigation (MDL) 3047 in the Northern District of California against several social media companies. MDLs are a common federal litigation tool that consolidates related individual cases filed by people across the country into one district court. This helps resolve pretrial and discovery matters cost-effectively and efficiently.

Parties in social media harm lawsuits

In legal terms, the “parties” to the lawsuit include the individuals or organizations filing the lawsuit (plaintiffs) and the people or organizations the lawsuit is filed against (defendants).

The plaintiffs in MDL 3047 include:

  • Young adults, as well as parents and guardians representing their children (individual plaintiffs)
  • Local governments and school districts
  • States

The defendants named in MDL 3047 include:

  • Meta Platforms, Inc. (Facebook and Instagram)
  • Snap, Inc. (Snapchat)
  • ByteDance Inc. (TikTok)
  • Google (YouTube)

Claims in social media harm lawsuits

The lawsuits from these categories of plaintiffs are combined in the MDL because their claims are broadly similar. However, there are key differences among the plaintiff groups:

Purpose

  • Young adults, parents and guardians want personal restitution and compensation for their or their children’s suffering.
  • School district and local government claims are being filed as public nuisance liability claims.
  • States are filing Unfair and Deceptive Acts and Practices (UDAP) cases. UDAP cases are designed to protect the public from deceptive business practices. 

Damages

  • Individual plaintiffs request compensation for the physical and mental harm caused by social media’s defective products. Individual plaintiffs may also claim compensation for their emotional distress (non-economic damages).
  • School districts and local governments request compensation for the economic damages of social media addiction within their communities. Examples of the damages sought by school districts and governments include:
    • Costs of repairing property damage caused by social media challenges or by students' social media-related behavior
    • Increased costs associated with monitoring for and confiscating phones used to access social media
    • Increased need for school resource officers, psychologists, and counselors at school, etc.
    • Hiring additional personnel to address mental, emotional, and social health issues stemming from use of Defendants’ platforms
    • Educating teachers, staff, and members of the community about the harms caused by Defendants’ conduct
  • States want to protect the public from further harm due to social media companies’ alleged actions.

Arguments in social media lawsuit cases about defective features and addictive designs include:

  1. Age verification and parental controls: YouTube, TikTok, Snapchat, Instagram and Facebook lack effective age verification methods and parental controls.
  2. Time tracking: Social media companies do not have restrictions on the length, time of day and frequency of user sessions.
  3. Addictive features: Social media features are intentionally addictive.
  4. Photo filters: Social media companies create and promote photo filters even though they know filters can negatively impact children.

These issues affect the most popular apps, but the design tactics differ slightly for each.

Lawsuits in the MDL also outline how companies are allegedly weaponizing collected data to build technology that compels young users to engage compulsively. The type of information companies are tracking about users to fine-tune their platforms includes:

  • Connections to other users
  • Demographics
  • Physical location where they access the product and the WiFi network that connects them
  • Post engagement metrics
  • Times of day they use the technology

Motley Rice attorney Previn Warren is co-lead counsel for the MDL. Our lawyers assert that social media companies design their products to maximize screen time and encourage addictive behaviors.

As evidence of the dangers of social media increases, our social media lawyers continue to review allegations that social media companies:

  • Created platforms designed to addict children and young adults
  • Marketed their products as safe for children and young adults in the face of contradictory evidence
  • Failed to warn parents and users of the negative health effects of their platforms
  • Failed to verify the ages of users

Our clients’ allegations detail heartbreaking stories of mental illness, self-harm and even death. If you or your family are struggling with the dangerous effects of social media, you are not alone. As awareness of these issues grows, so does a desire to hold these companies accountable for the harm they have caused.

Learn more about the lawsuits against Instagram, Facebook, Snapchat, TikTok and YouTube below. 

Meta (Facebook and Instagram) lawsuits

Meta Platforms, Inc. owns Facebook and Instagram. Facebook allows users to connect with friends and family, share content and engage with various communities. Instagram is a photo- and video-centric social media application where users can upload their photos and videos and discover content through curated feeds and connections.

Evidence is building that Meta knew about the harm its platforms caused children. Concealed evidence, such as leaked documents from former employees, is being brought to light through social media litigation. Facebook and Instagram lawsuits allege Meta intentionally makes its platforms addictive for children but fails to warn about the physical and mental health risks.

Plaintiffs argue Meta’s platforms share issues similar to those of YouTube, TikTok and Snapchat regarding defective parental controls and age verification. Meta lawsuits claim the company used an excessive volume of notifications to increase engagement and did not warn users about overuse. Another significant argument against Meta is the alleged exploitation of minors’ developing brains through behavior-driven design features that encourage addictive behaviors and compulsive use and drive negative social comparisons among young users.

Some of Meta’s behavior-driven design tactics include:

  • Meaningful Social Interaction (MSI) algorithm
  • Notification delivery
  • Sense of urgency
  • Social comparison

The content presented to users on Meta platforms is dictated by an algorithm called Meaningful Social Interaction (MSI). The MSI algorithm collects on- and off-platform user behavior data to prioritize the information populating a user’s feed, favoring items likely to yield high engagement. The goal is to keep users on the platform for as long as possible, which can foster addictive behaviors.

Another tactic Meta deploys is intermittent variable rewards (IVR). This reward system, often used in gambling, unpredictably and inconsistently delivers rewards. Engagement increases when a reward is offered at an unexpected time rather than at a fixed time.

One reward example on social media is a “like.” A “like” is a form of engagement where users can express approval of a post, photo or video. For kids, tweens and teens, “likes” act as social validation. The “like” engagement metric is considered an IVR because Meta does not deliver notifications about them in real time. Instead, Meta makes users aware of their “likes” at times when the user may not otherwise be engaged with the application, like when they’re sleeping or in class. The notifications could arrive when a user has not visited the application recently or when Meta suspects a user is about to end a session.

Notifications create a sense of urgency for users to spend time on Facebook and Instagram. Meta further instills urgency with other features, including:

  • Instagram Live: The Live feature allows Instagram users to live stream video. Users can only interact with the Live video during the broadcast, so the feature pressures users to join immediately and be present.
  • Stories: Stories are short-lived posts by users, designed to compete with Snapchat. They are visible on Facebook and Instagram for only 24 hours; once they disappear, they cannot be viewed again. The Stories feature pressures young users to actively engage with social media or risk missing out, which can result in negative emotions for the user.

Facebook and Instagram addiction can affect mental health. The social media addiction may manifest as anxiety, depression, eating disorders, suicidal thoughts or another condition.

Snapchat lawsuits

Snapchat allows users to share videos and photos that expire within 10 seconds of being viewed. Like other companies in social media lawsuits, Snap, Inc. faces allegations that its age verification process is defective, that the platform contains harmful features and that it lacks adequate safety measures.

Plaintiffs allege Snapchat preys on the adolescent need for social acceptance and the fear of missing out (FOMO). Snapchat’s alleged harm originates from its ephemeral (fleeting) design. This design extends to several features that drive social metrics.

An ephemeral design refers to the length of time Snaps (photos or videos sent on Snapchat) are viewable. The maximum viewable time for Snaps is 10 seconds and Snap Stories are viewable for only 24 hours. This concept pressures kids to check the application frequently to ensure they don’t miss a Story before it disappears.

Other Snapchat features are designed as social metrics to keep kids interested and engaged. These features include:

  • Beauty filters: Snapchat offers custom lenses and filters that allow users to edit their photos into an unnatural appearance. Filters can dramatically change a person’s appearance by altering key attributes of the face and body. Plaintiffs argue these filters set unrealistic beauty standards and contribute to body dysmorphia in children, a condition known as Snapchat dysmorphia.
  • Snapscore: Snapscore is a visible metric on a user’s profile. The more a user engages with Snapchat, the higher their Snapscore. This metric is important to young people because it provides public social validation, reflecting how many Snaps a user has sent and received. A high Snapscore can indicate excessive use of the application.
  • Trophies: Users received Trophies to celebrate engagement milestones. These Trophies were visible to the user’s connections, so kids were encouraged to collect them through compulsive use. The Trophies feature was retired in 2020, but not before allegedly fostering unhealthy habits among many young people.
  • Charms: Users can receive Charms for reaching relationship milestones with other users. Charms rewards teens for using the application and engaging with their peers.
  • Snapstreak: Like the other features, Snapstreak rewards users for using Snapchat. Snapstreak is particularly dangerous because it rewards users based on the consistency of their engagement with another user. To keep a Snapstreak going, users must send at least one Snap to each other every day. The Snapstreak score grows the longer the streak goes. Kids feel pressure to keep their Snapstreaks going, and the pressure can lead to anxiety, feelings of betrayal and other negative emotions.
  • Snap Map: The Snap Map shows the location of its users. Not only is this dangerous information, but it is also a social metric. Kids use the Snap Map feature to see where their friends are and who they are with to see if they’re missing out. As a social metric, the more connections someone has on Snapchat, the more populated their Snap Map will appear, making the user feel popular.
  • Spotlight: The Spotlight feature displays Snapchat community-posted short-form videos, similar to TikTok, YouTube Shorts and Instagram Discover. Videos in Spotlight are not from the user’s direct connections. This feature reportedly led to a 200% increase in time spent on Snapchat.

Snapchat also deploys disruptive push notifications to increase engagement with its application. Push notifications are sent throughout the day to redirect attention to Snapchat. This constant interruption disrupts school and sleep.

Snapchat lawsuits also claim that parental controls are inadequate. Snapchat parental control limitations include:

  • Parents are required to have their own Snapchat account to link the two accounts.
  • Guardians cannot view the photos or videos their children send and receive, even with parental controls enabled.
  • Snapchat does not allow guardians to control direct messaging, their child’s connections or use of specific features, such as disabling the geolocation.
  • The My Eyes Only feature allows minors to hide Snaps from their parents in a folder that requires a password. Items hidden in the My Eyes Only folder self-destruct if an incorrect password is entered.

Addictive features that leverage the psychological need for acceptance among teens and the lack of effective parental controls make Snapchat dangerous, plaintiffs allege. People are filing lawsuits against Snapchat to hold it accountable for allegedly contributing to the mental health crisis among American youth. Plaintiffs are advocating for better warnings for the public about the harm Snapchat can cause teenagers.

State-specific Snapchat lawsuits

Individual states have also filed lawsuits against Snapchat. On September 4, 2024, New Mexico filed a lawsuit against the social media company alleging that Snapchat ignored repeated reports of grooming and other forms of sextortion targeting minors.

The lawsuit alleges that Snapchat knew its platform created an environment that fostered sextortion and inappropriate conduct between adults and minors and failed to take action to protect youth. This lawsuit is one of many brought against the company for sextortion and is not part of the federal MDL.

TikTok lawsuits

TikTok is a short-form video-sharing application created by ByteDance Inc. Social media harm lawsuits against TikTok focus on inadequate age verification and parental control features that allow minors to use the app without supervision. Social media lawyers also argue that TikTok uses manipulative design techniques to boost engagement among youth. Time spent on TikTok is allegedly harming children.
 
Many parents claim TikTok has weak age verification methods that allow children to engage with the platform without parental supervision. In 2020, TikTok reported that more than a third of its American users were 14 or younger. 

TikTok’s alleged defects related to age verification and parental controls include:

  • Kids can use TikTok without an account.
  • Parental consent is not required for minors to sign up.
  • TikTok does not conduct age verification to validate the birthday a user enters when signing up.

TikTok released a flawed “Family Pairing” feature in 2020. The feature was intended to give parents control over things like screen time, direct messaging and restricted content. The issues with Family Pairing include:

  • Inability to control the frequency or time of day when kids receive notifications
  • Pairing is only available on the mobile application
  • Parents must be aware of their child’s account
  • Parents are required to sign up for their own TikTok account

TikTok is also designed to maximize engagement. Children are especially susceptible to addictive features, and social media attorneys argue TikTok knew or should have known the harm its features cause. Some addictive features include:

  • Auto-play: The auto-play feature cannot be disabled on TikTok’s For You Page, so videos automatically start playing.
  • Continuous scroll: TikTok users can swipe up on their screen to move from one video to the next. The continuous scroll can warp time and push users into a flow state. Plus, videos on TikTok are selected by the platform, not the user. Each swipe offers a potential dopamine rush.
  • Limited buttons: The TikTok interface is very basic. By keeping the application simple, there are no barriers to prevent users from entering the flow state.
  • Hidden clock: TikTok covers the clock on some smartphones. This design feature makes it easy for users to lose track of time and can lead to excessive use. 
  • Stories: TikTok Stories are time-sensitive videos that expire after 24 hours on the application. Stories play on the social psychological need for kids to know what their peers are posting. The fear of missing out (FOMO) propels the need to constantly check the application for new Stories. 
  • TikTok Now: The TikTok Now feature sends a push notification to users, encouraging them to share within three minutes. The notification is also sent to their friends, and the goal is to get everyone to post a video or image simultaneously. Users cannot see their friends’ TikTok Now posts if they do not participate. If they submit late, their posts will be branded “late.” TikTok Now puts unnecessary pressure on kids to engage with the application at the exact time the application requests.

Some believe TikTok has the technology to break a user’s flow state and disable addictive features but has not effectively deployed it. Social media harm lawsuits argue TikTok is not doing enough to keep young users safe.

In August 2024, the Third Circuit Court of Appeals ruled that TikTok must face a lawsuit filed by the family of 10-year-old Nylah Anderson. The lawsuit alleges that TikTok’s algorithm exposed her to videos and challenges that encouraged dangerous behavior. Nylah passed away as a result of participating in one of the challenges she saw on TikTok. The family is not represented by Motley Rice.

TikTok and other platforms have attempted to dismiss this kind of lawsuit by claiming protection under Section 230, a law that shields online platforms like TikTok from liability for content posted by users. The August ruling allowed the lawsuit to continue, reasoning that TikTok’s algorithmic recommendations to Nylah were the platform’s own conduct rather than merely third-party content.

States file TikTok lawsuits

In October 2024, new lawsuits were filed against TikTok in several states. The Texas Attorney General alleges that the company violated the Securing Children Online through Parental Empowerment Act (SCOPE). According to the lawsuit, TikTok failed to protect children’s information and did not give parents increased control over their children’s accounts.

Lawsuits from 13 other states have been filed against TikTok. The lawsuits allege that the platform creates an environment that fosters social media addiction. The states believe the company is not doing enough to keep children safe. Additionally, the lawsuits argue that the platform needs to do more to reduce the negative mental health impacts associated with app use.

YouTube lawsuits

YouTube is a video-sharing social media platform where users can upload, view and engage with any conceivable category of information. The harm YouTube allegedly causes youth primarily comes from defective age verification, ineffective parental control features and a dangerous algorithm.

Children have easy access to YouTube. The application comes pre-installed on many kid-friendly devices, such as Nintendo’s Wii and Switch, Sony’s PlayStation and Microsoft’s Xbox gaming consoles. YouTube doesn’t require a login or age verification to watch videos. Without age controls, kids can access every video on YouTube.

Kids are also exposed to YouTube’s recommended and autoplay features. The recommended videos and autoplay features are designed to increase user engagement by providing a never-ending stream of videos. With these features available, children can become fully absorbed by YouTube and lose track of time. This absorption is called a “flow state” and can lead to:

  • Addiction
  • Compulsive use
  • Sleep deprivation 

Short videos called YouTube Shorts can also distort a user’s sense of time and cause excessive use. Children do not have the same impulse control as adults, and their developing brains are more susceptible to YouTube’s addictive features. With the continuous stream of video, there is no natural stopping point, making it difficult for viewers to stop watching.

YouTube’s features increase the time kids spend watching videos and dictate the videos they watch. A YouTube executive revealed that as of 2018, users spend 70% of their time watching videos recommended by the platform — not videos they actively sought out. 

Plaintiffs argue YouTube failed to warn children and their parents about the risks associated with using its product, including:

  • Addiction
  • Anxiety
  • Compulsive use
  • Depression
  • Eating disorders
  • Self-harm
  • Sexual exploitation from adult users
  • Sleep deprivation
  • Suicidal ideation

YouTube lawsuits claim that Google knew, or should have known, that its features can harm children.

In September 2024, Arkansas joined other states in filing independent lawsuits alleging that YouTube harmed the state’s youth. The lawsuit alleges the addictive content on the site compelled the state to pay millions to improve mental health care services for its youth.

Social media use among American youth

Social media use is high among American youth. In 2023, the Pew Research Center surveyed 1,453 U.S. teens aged 13 – 17 to uncover how often they engage with social media. The survey found that 92% of teens use the internet at least daily, and 24% use the internet “almost constantly.” Social media engagement has increased over time, doubling the “almost constantly” engagement score from the research center’s 2014 – 2015 survey. 

Platform usage among U.S. teens, per the 2023 Pew Research Center survey:

  • YouTube: 93% of teens use the app; 38% use it several times a day
  • TikTok: 63% of teens use the app; 32% use it several times a day
  • Snapchat: 60% of teens use the app; 29% use it several times a day
  • Instagram: 59% of teens use the app; 27% use it several times a day
  • Facebook: 59% of teens use the app; 8% use it several times a day

The social media addiction statistics are especially concerning when also considering the nation’s mental health crisis. A growing body of research suggests a link between social media and mental health problems. Some allege social media companies put profit over people by designing their platforms with harmful features to boost engagement and fuel revenue.

The dangers of social media platforms and features

A lawsuit against social media companies focuses on defective and harmful features. Social media lawyers argue that platforms like YouTube, TikTok, Snapchat, Instagram and Facebook were knowingly created and promoted in ways that could harm children.
 
Social media can affect the brain the way gambling and certain drugs do. Likes, comments and other notifications may cause the brain to release dopamine. This chemical – the same one released while playing slot machines or drinking alcohol – creates feelings of pleasure.

The brain can begin associating that pleasurable response with the social media application and encourage people to continuously chase that feeling by revisiting the digital source. Unfortunately, people can become accustomed to those feelings and require more exposure over time to receive a dopamine response. Young people are at particular risk. The impulse-control part of the brain in teens and children is less developed than in adults.
 
The increased engagement with social media applications can cause:

  • Aggressive online behavior
  • Compulsive use
  • Decreased interest in offline activities
  • Excessive time spent on social media
  • Withdrawal symptoms when logged off, including mood swings

Teens and children manifesting these symptoms may be at risk for serious mental health issues. If you're concerned that your child may be experiencing these symptoms, learn more about how to stop social media addiction.

Social media health effects

Social media is contributing to an ongoing youth mental health crisis in the United States and abroad. Victims of this crisis have begun filing social media lawsuits. These lawsuits cite studies on adolescents documenting several health issues allegedly caused or worsened by social media use, including anxiety, depression, eating disorders, self-harm and sleep deprivation.

Young users may also experience cyberbullying and are at risk of algorithmic direction, resulting in social media addiction. Defective features can cause harm to young users unable to disengage with social media because of addictive behaviors. In some cases, social media may aggravate pre-existing mental health problems like ADHD and body image issues.

Researchers have also found differences in how social media affects people of different genders. A 2021 study found that for girls aged 12 to 15, “a high level of social media or television use in early adolescence followed by a marked increase over time was most predictive of suicide risk in emerging adulthood.” Teen girls have also self-reported worsening disordered eating from frequent social media use.

Frances Haugen, a former Facebook data analyst, provided internal documents and testified before the Senate that Facebook’s own research linked Instagram use to suicidal thoughts, eating disorders and body image issues. The risks of social media are better understood than before. Still, companies continue to deflect responsibility and fail to make impactful changes.

In need of help?

If you or a loved one is struggling with suicidal ideation, know that free, confidential support is available 24/7 through the National Suicide Prevention Lifeline by dialing 988. Visit the Lifeline online at 988lifeline.org.

The National Eating Disorder Association is also available online at nationaleatingdisorders.org or by phone toll-free at 1-800-931-2237.

Contact a social media attorney

Motley Rice is reviewing allegations that multiple social media platforms, including Instagram, TikTok, YouTube and Snapchat, are intentionally and deliberately designed without regard for the safety of children.

Our thoughts go out to those affected by suicide, self-harm and eating disorders worsened by social media.

Call Attorney Jodi Westbrook Flowers at 1.800.768.4026 or complete this form to explore your options. 

Social media harm lawsuit news and updates

12.01.24

Legal actions grow

December 2024: With nearly 200 additional cases, the social media harm MDL has grown to 815 cases at the start of the month.
November 2024: More people suffering from social media related mental health issues filed cases, bringing the total number of actions in the MDL to 620 at the start of November.
October 2024: 594 actions were pending in MDL 3047.
September 2024: 584 actions were pending in MDL 3047.
August 2024: 557 actions were pending in MDL 3047.
July 2024: 499 actions were pending in MDL 3047.
June 2024: 475 actions were pending in MDL 3047. 
May 2024: 455 actions were pending in MDL 3047.

10.24.24

Social media companies must face state lawsuits

On October 24, 2024, U.S. District Judge Yvonne Gonzalez Rogers ruled that Meta (Facebook and Instagram), Alphabet (YouTube), Snap (Snapchat), and ByteDance (TikTok) must face lawsuits filed by state attorneys general. Her ruling dismissed some aspects of the states’ lawsuits concerning Section 230 of the Communications Decency Act. However, the lawsuits are still able to proceed.

10.15.24

Social media companies required to face school district lawsuits

On October 15, 2024, U.S. District Judge Yvonne Gonzalez Rogers ruled social media lawsuits filed by multiple school districts may proceed. These lawsuits were filed by school districts spanning 19 different states.

Judge Rogers excluded some claims against the tech companies based on Section 230 of the Communications Decency Act. 

10.03.24

Texas files a lawsuit against TikTok

On October 3, 2024, Texas Attorney General Ken Paxton filed a lawsuit against the social media company TikTok. The lawsuit alleges that the platform isn’t doing enough to protect children. Additionally, the lawsuit claims that the company has violated the Securing Children Online through Parental Empowerment Act (SCOPE).

Under TikTok’s current model, parents can use the “family pairing” system to oversee their children’s accounts. 

However, the lawsuit alleges that the system doesn’t provide enough secure ways for parents to verify their identities. The system also requires minors to approve the family pairing system on their account. TikTok plans to fight these allegations.

10.01.24

States are suing TikTok for social media addiction

In October 2024, 13 states filed lawsuits against TikTok, claiming the app negatively impacts children’s mental health. They argue that the platform creates an environment that fosters social media addiction by exposing children to short-form videos at a rapid pace.

09.09.24

Judge denies motion to dismiss

On September 9, 2024, Judge Neal Kravitz denied Meta’s motion to dismiss an ongoing lawsuit in the Superior Court for the District of Columbia. Meta filed a motion to dismiss the cases, claiming protection under Section 230 of the Communications Decency Act.  In his decision, Judge Kravitz explained, “Section 230 provides no refuge to Meta because none of the omissions-based deceptive trade practice claims seek to treat Meta as a publisher of any particular third-party content.” Meta has until September 23, 2024, to file an answer to the complaint.

09.09.24

Attorneys general send letter to Congress

On September 9, 2024, the attorneys general of 42 states and U.S. territories sent a letter to Congress urging lawmakers to pass legislation for the protection of American youth. The letter called for legislation requiring an official warning for all social media platforms. This proposed warning was originally devised by the U.S. Surgeon General Vivek Murthy.

08.01.24

Judge ruled on scope of case

As of August, the scope of cases against social media platforms has changed. The 3rd U.S. Circuit Court of Appeals determined TikTok must face a lawsuit after a child died from participating in a challenge recommended by the platform’s algorithm.

06.01.24

Four school districts dismissed

Judge Yvonne Gonzalez Rogers ruled that four public school districts cannot proceed with their claims against Facebook, Instagram, Snapchat, YouTube and TikTok. The federal judge found that the districts could not prove a direct link between the social media apps and the alleged mental health crisis among their students.

11.01.23

Rejected defendants’ motion to dismiss

U.S. District Judge Yvonne Gonzalez Rogers partially rejected the Defendants’ motion to dismiss. The Defendants sought immunity under the U.S. Constitution’s First Amendment and Section 230 of the 1996 Communications Decency Act. Claims about harmful platform features, like filtered images without transparent warnings, remain a part of the lawsuit and will be treated as product defects. Other design defects that remain in the case include disruptive notifications and the failure to offer usage limits, an adequate age verification process and effective parental controls.

10.01.23

Attorneys general file suit

The attorneys general of 41 states filed lawsuits against Meta, the parent company of Instagram and Facebook, for its contributions to the youth mental health crisis in the United States.

07.01.23

Transfer order

A Conditional Transfer Order added more cases to the MDL. As of July 2023, the total number of cases was over 100. 

04.01.23

Amended complaint

The plaintiffs submitted an amended master complaint. 

02.01.23

Direct filing order

A Direct Filing Order for the master complaint and short form complaint was agreed on. This order allows the plaintiffs to skip filing in their home jurisdictions and instead file directly in the MDL court to avoid delays.

  • Master complaint: February 14, 2023, marked the deadline for the plaintiffs to file their master complaint. The goal of the master complaint is to identify the strongest claims for the first phase of motions to dismiss.
  • Short form complaint: February 28, 2023, was the deadline for the proposed short form complaint. A short form complaint is a simplified version of the master complaint that outlines the essential allegations and claims in the case.

12.01.22

Timeline set

A second case management conference occurred in December 2022. As a result of the conference, the plaintiffs and defendants agreed on a timeline for filing documents in the case. 

11.01.22

Previn Warren appointed lead

The Court held its inaugural case management conference. Following the conference, Motley Rice media and tech attorney Previn Warren was appointed by the Court as co-lead counsel. 

10.01.22

Consolidation order

The Judicial Panel on Multidistrict Litigation (JPML) entered a Transfer Order to consolidate a growing number of social media harm lawsuits. In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation MDL No. 3047 was assigned to the Northern District of California. 
 


Frequently asked questions about social media harm lawsuits

Is there a social media class action lawsuit?

There are class action lawsuits against social media companies. In early August, a lawsuit seeking class action status was filed in California on behalf of a 13-year-old girl who experienced mental health effects after using Instagram. Her case is seeking $5 billion in damages, which would be split among people with similar complaints if the case is certified as a class action.

Most other social media mental health cases are individual lawsuits filed in the multidistrict litigation. So far, the mental health lawsuits against social media companies are in the discovery stage of litigation.

Multidistrict litigation differs from a class action lawsuit in several ways. Both are ways of managing a mass tort, which is a harm experienced by a broad group of people. In a class action lawsuit, a single case is filed by a plaintiff. This plaintiff (the class representative) represents a large group (the class) that was harmed in the same way by the same person or entity. When a class action lawsuit is decided in favor of the plaintiff, the award is split among class members.

Multidistrict litigation collects several individual lawsuits with similar harms and defendants into one court system. This can help expedite proceedings. Each plaintiff still has an individual case though, and any award or resolution they receive is theirs alone.

What social media sites are being sued?

The parent companies of Instagram, Facebook, TikTok, Snapchat and YouTube are the primary defendants in social media harm lawsuits. 

Defendants include:

  • ByteDance Ltd. (TikTok)
  • Google LLC (YouTube)
  • Meta Platforms, Inc. (Instagram and Facebook)
  • Snap, Inc. (Snapchat)

Who can file a social media harm lawsuit?

Parents or guardians can file a lawsuit against social media companies on behalf of their children. Adults can also file a lawsuit if they were minors when their trauma was diagnosed or treated. To file a lawsuit, people must demonstrate that social media caused or worsened the victim’s mental or physical health problems. You can contact a Motley Rice social media attorney to learn more and see if you qualify for a lawsuit. Our attorneys have been taking these calls for over two years and are trained to handle this type of sensitive and painful information.

What can social media be sued for?

Social media companies can be sued for causing mental health problems in kids. Many lawsuits claim that excessive time on social media can cause or worsen mental health issues, including anxiety, body dysmorphia, depression, eating disorders and low self-esteem, and can contribute to self-harm and suicide.

Additionally, social media companies may face lawsuits for civil liability depending on the type of injury kids and their families suffer. The 3rd U.S. Circuit Court of Appeals recently decided that TikTok and its parent company, ByteDance, could be sued by the family of a child who died after participating in a challenge recommended to her by the platform’s algorithm.

How does Section 230 affect social media?

Section 230 of the 1996 Communications Decency Act protects social media companies from being held responsible for content published on their platforms. It states, “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Section 230 ensures that social media companies cannot be held liable for inappropriate or harmful content kids see on social media or the cyberbullying they may experience. However, the social media harm lawsuits in which Motley Rice is representing clients are not centered on harmful content. Rather, they target the allegedly harmful platform designs created by social media companies.

The decision of the 3rd U.S. Circuit Court of Appeals in Anderson v. TikTok calls into question the validity and enforceability of Section 230. The court determined that TikTok may face a liability lawsuit for the death of a 10-year-old girl who died as a result of a challenge the platform’s algorithm recommended to her.

Meta has also attempted to leverage Section 230 as a defense in multiple lawsuits across the country. But this tactic hasn’t succeeded in every courtroom. Judge Neal Kravitz of the Superior Court of the District of Columbia denied Meta’s motion to dismiss a lawsuit that cited Section 230. Judge Kravitz’s decision on September 9, 2024, follows several other recent court decisions against Meta in Utah and New Mexico.

However, in October 2024, U.S. District Judge Yvonne Gonzalez Rogers excluded some claims against social media companies based on Section 230.


Our social media lawsuit experience

Motley Rice attorneys have spent decades fighting for individuals and families. Our experience includes representing people harmed by tech companies. Our firm can assist if you or your child:

  • Attempted or died by suicide
  • Was treated for self-harm
  • Was diagnosed by a healthcare professional with an eating disorder

If you believe these conditions were created or worsened by social media, our firm’s attorneys can help you file a lawsuit for social media harm. Your well-being is important to our team.

Help for suicide ideation and eating disorders

If you or a loved one need help, national resources are available.

  • Call 988 or visit 988lifeline.org to connect with the 988 Suicide & Crisis Lifeline. This resource is free, confidential and available 24/7.
  • Call 1-800-931-2237 or visit NationalEatingDisorders.org to connect with the National Eating Disorders Association (NEDA).



Your Legal Options

Three simple steps with Motley Rice

Connect with an attorney 1.800.768.4026

1. Submit information: Call us or fill out our online form with your case details.

2. Case review: Our team meticulously reviews your information to assess your case’s potential.

3. Case consultation: Discuss next steps with our lawyers for tailored guidance.