Meta Ordered to Pay $375 Million in New Mexico Child Exploitation Trial
Major Legal Defeat for Meta
A New Mexico jury has ordered Meta to pay $375 million in a lawsuit over child sexual exploitation on its platforms, marking one of the largest verdicts against a social media company in cases related to child safety.
The Case
The lawsuit alleged that Meta's platforms — primarily Instagram and Facebook — failed to adequately protect children from sexual exploitation and grooming. The New Mexico Attorney General's office argued that Meta's recommendation algorithms actively connected predators with minors, and that the company knew about these risks but prioritized growth over safety.
Key Allegations
- Algorithmic harm: Recommendation systems that connected predators with children
- Inadequate age verification: Insufficient age checks, allowing minors onto the platforms and exposing them to harmful content
- Known risks: Internal documents showing Meta was aware of exploitation on its platforms
- Failure to act: Allegedly insufficient remedial measures despite the company's knowledge of the problem
Broader Context
This verdict is part of a growing wave of litigation against social media companies over child safety:
- Multiple states have filed similar lawsuits against Meta, TikTok, and other platforms
- The US Surgeon General has warned about social media's impact on youth mental health
- Congressional hearings have repeatedly pressed tech executives on child safety measures
- The EU's Digital Services Act imposes stricter obligations on platforms regarding child safety
Meta's Response
Meta is expected to appeal the verdict. The company has previously argued that it invests significant resources in safety features and that Section 230 of the Communications Decency Act protects platforms from liability for user-generated content.
Implications
If upheld on appeal, the $375 million verdict could have far-reaching implications for how social media companies design their platforms. It may accelerate the shift toward stronger age verification, algorithmic transparency, and safety-by-design principles in social media product development.