Meta and Other Tech Giants Face Setback in Youth Impact Lawsuits
A Los Angeles judge has approved expert testimony about social media's impact on youth in upcoming trials against major platforms like Meta, Snap, Google, and TikTok. The ruling allows 10 out of 11 proposed expert witnesses to testify, potentially strengthening hundreds of lawsuits alleging harm to young users. The judge rejected the companies' Section 230 defense, stating it doesn't apply when claims focus on platform design. Experts are prohibited from discussing the companies' intent, focusing instead on the effects of the platforms.

Meta Platforms Inc., along with other major social media companies, has encountered a significant legal challenge in ongoing lawsuits concerning the alleged harm their platforms cause to young users. A Los Angeles judge has ruled to allow expert testimony about social media's impact on youth in upcoming trials, potentially strengthening the cases against these tech giants.
Expert Testimony Approved
In an 87-page ruling, Judge Carolyn B. Kuhl approved testimony from 10 of the 11 proposed expert witnesses. The decision is a blow to Meta Platforms, Snap, Google, and TikTok, which are facing hundreds of lawsuits from individuals, school districts, and state attorneys general alleging that their platforms have harmed young users.
Scope of Expert Testimony
While the ruling permits a wide range of expert opinions, it comes with limitations. Notably, experts are prohibited from discussing the companies' intent. This restriction keeps the focus on the effects of the platforms rather than on speculation about the motivations behind their design and operation.
Section 230 Defense Rejected
A key aspect of the ruling is Judge Kuhl's rejection of the companies' arguments to exclude testimony under Section 230 of the Communications Decency Act. That provision has historically shielded internet companies from liability for user-generated content. The judge held, however, that the protection does not apply when claims target platform design or operation rather than third-party content.
Implications for Social Media Companies
The ruling presents a significant challenge for Meta Platforms and the other social media giants, which must now defend against claims that their platform designs cause addiction or other harm to young users. The decision could set a precedent for how similar cases are handled, exposing these companies to increased legal scrutiny and potential liability.
Looking Ahead
As these trials move forward, the tech industry will be closely watching their outcomes. The cases could have far-reaching implications for how social media platforms are designed and operated, particularly concerning their younger user base. For Meta Platforms, this development adds to the ongoing debates and challenges surrounding the impact of its products on society, especially on vulnerable populations like youth.
The inclusion of expert testimony may provide courts and the public with a more comprehensive understanding of the complex relationship between social media use and youth well-being. As these legal proceedings unfold, they may shape the future landscape of social media regulation and corporate responsibility in the tech sector.
