
Jury Rules Meta Harmed Kids, Awards $375M in First US Case


A New Mexico jury found Meta knowingly harmed children's mental health and made misleading safety claims, setting a legal precedent that could change how tech platforms protect young users. The verdict challenges 30 years of tech industry legal immunity.

After seven weeks of testimony, a New Mexico jury ruled that Meta knowingly damaged children's mental health through Instagram and Facebook. The verdict marks the first time a jury in the United States has held a major social media company accountable for harm to young users.

The $375 million penalty may seem small for a company worth $1.5 trillion, but the legal precedent could reshape the entire tech industry. Prosecutors built their case on Meta's own internal documents, executive testimony, and an undercover investigation in which state agents posed as children online.

What they found was alarming. Meta failed to enforce its ban on users under 13, and its algorithms actively pushed harmful content to teens, including material about suicide and self-harm.

"We know the output is meant to be engagement and time spent for kids," prosecution attorney Linda Singer told jurors. "That choice that Meta made has profound negative impacts on kids."

The jury reviewed specific statements from CEO Mark Zuckerberg, Instagram head Adam Mosseri, and Meta's global safety chief Antigone Davis. They found thousands of individual violations and chose the maximum $5,000 penalty for each one.


Juror Linda Payton said the panel believed each affected child was worth the maximum amount.

The Ripple Effect

This verdict does something tech companies have avoided for three decades. It challenges Section 230 of the Communications Decency Act, which has protected platforms from liability for user content.

New Mexico prosecutors argued successfully that this protection doesn't cover Meta's algorithmic choices, the systems the company built to decide what content to show users and when. The jury agreed those choices caused direct harm.

More than 40 state attorneys general have filed similar suits against Meta over child mental health. This verdict gives them all a roadmap.

ParentsSOS, a coalition of families who lost children to social media-related harm, called it "a watershed moment." For parents who have fought for years to hold tech platforms accountable, the verdict confirms what they have long argued: these companies chose profit over protecting kids.

Meta says it will appeal and maintains it invests heavily in safety. A company spokesperson said they work hard to remove bad actors and harmful content from their platforms.

A second trial phase in May will determine whether Meta must fund public programs to address the documented harms. That's where abstract accountability could transform into real change for millions of young users.

Based on reporting by Optimist Daily

This story was written by BrightWire based on verified news reports.
