
UK Regulators Push Social Media to Protect Under-13s
British regulators are asking major social media platforms to strengthen age checks after research showed 86% of children aged 10-12 have social media profiles despite minimum age limits. Seven companies including TikTok, YouTube, and Instagram must now prove they're keeping young children safer online.
Getting kids off platforms designed for teens and adults just became a top priority for UK regulators.
Ofcom and the Information Commissioner's Office sent letters to seven major tech companies asking them to stop relying on self-reported ages, which children can easily fake. The seven are Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox, and X.
Research reveals the current system isn't working. Even though most platforms require users to be at least 13 years old, 86% of children aged 10-12 already have their own social media profiles.
The regulators want companies to use stronger age verification methods similar to those required for adult content sites. These tools could include AI-based age detection and facial age estimation technology.
Several companies are already making changes. Meta says it uses AI to detect users' ages based on their activity and has tested facial age estimation tools. TikTok reported removing over 90 million suspected under-13 accounts between October 2024 and September 2025.

Roblox announced it has added 140 new safety features in the past year, including mandatory age checks before children can access chat features. Snapchat is testing verification tools of its own.
Why This Inspires
Beyond just blocking young users, this push represents a fundamental shift in how tech companies approach child safety. Instead of treating protection as an afterthought, platforms are being asked to build safety into their core design.
Professor Amy Orben from Cambridge University calls this movement essential but notes it's just the beginning. The next challenge involves making sure recommendation algorithms don't exploit children's attention, even when their age is verified.
Technology Secretary Liz Kendall made the government's position clear: no platform gets a free pass when it comes to protecting children. Companies shouldn't need court orders to act responsibly.
The message from regulators is simple but powerful. Children deserve digital spaces designed for their developmental needs, not platforms that put them at risk by default.
Based on reporting by BBC Technology
This story was written by BrightWire based on verified news reports.