Major Social Platforms Join Teen Mental Health Ratings
Meta, TikTok, YouTube, Snap, and other major platforms are voluntarily submitting to independent safety ratings focused on protecting teen mental health. The new system will grade how well each platform shields users ages 13 to 19 from harmful content.
Major social media companies are stepping up to be graded on how well they protect teenagers online.
Meta, TikTok, YouTube, Snap, Roblox, and Discord have all agreed to participate in a new rating system that evaluates their teen safety features. The program, led by the Mental Health Coalition's Safe Online Standards initiative, brings together independent experts to assess how platforms handle adolescent mental health.
Dr. Dan Reidenberg, Managing Director of the National Council for Suicide Prevention, leads the initiative. The evaluation covers roughly two dozen standards, including platform policies, content moderation, transparency, and how companies address exposure to suicide and self-harm content.
Platforms will voluntarily submit documentation about their safety tools and product features. An independent panel of global experts will review the submissions and assign one of three ratings: "use carefully," "partial protection," or "does not meet standards."
The highest rating, "use carefully," comes with a blue badge that qualifying platforms can display. To earn it, a platform must offer easy-to-use reporting tools, clear privacy settings for parents, and effective filters that reduce exposure to harmful content.
Companies receiving "partial protection" offer some safety tools, but those tools are hard to find or use. The lowest rating goes to platforms whose filters and moderation don't reliably block unsafe content.
The participants also include platforms that have faced scrutiny over child safety: Roblox recently dealt with concerns about children's wellbeing on its platform, while Discord has strengthened its age verification processes.
Why This Inspires
For years, parents have struggled to know which platforms genuinely prioritize teen safety versus those that just talk about it. This independent rating system finally gives families clear, expert-backed guidance.
The voluntary participation from major platforms signals a willingness to be held accountable. While the standards may seem basic, establishing baseline expectations creates pressure for continuous improvement across the industry.
Most importantly, the focus on adolescent mental health acknowledges what parents have known: social media affects developing minds differently. Having experts specifically evaluate how platforms protect the 13-19 age group addresses a critical gap in online safety standards.
The ratings put power back in parents' hands, giving them concrete information to make informed decisions about their teens' digital lives.
Based on reporting by Engadget
This story was written by BrightWire based on verified news reports.

