
States Cap Kids' Screen Time at Just One Hour a Day
Virginia just launched a bold experiment that limits teens to one hour of social media per day. While courts are blocking some measures, a wave of new protections for young users is sweeping the nation in 2025.
After years of watching kids glued to screens, governments are finally stepping in with actual rules that put guardrails on how much time young people spend scrolling.
Virginia became the first state to impose a default one-hour daily limit on social media for anyone under 16. The law requires companies to use age verification to enforce the cap, marking a major shift in how kids are protected online.
The move comes as lawmakers across the country worry about the mental health impacts of infinite-scroll feeds and AI chatbots that can answer anything without filters. Texas recently passed a law requiring Apple and Google to verify users' ages before allowing minors to download apps, though a court blocked it just before Christmas.
The legal battles are heating up fast. NetChoice, a tech industry group that includes Amazon, Google, and Meta, sued Virginia over its time limits. The group argues that capping social media use is like limiting kids' ability to read books or watch educational documentaries.

Not everyone is convinced the crackdown is quite what it appears. Adam Kovacevich of the Chamber of Progress points out that some app store verification bills were actually written and pushed by Meta itself. The social media giant may be using these laws to shift blame away from its own platforms.
Legal experts say the approach misses important nuance. Catalina Goanta, who studies technology law at Utrecht University, notes that lawmakers are treating all internet services as net negatives for kids, when the reality is far more complicated than simply cutting off access.
The Ripple Effect
Despite the court challenges, the message is clear: protecting kids online has become a top priority across the political spectrum. Whether through time limits, age gates, or parental consent requirements, states are refusing to wait for tech companies to regulate themselves.
Multiple states are now pushing forward with similar measures, even as early laws face legal hurdles. The momentum suggests we're entering a new era where companies will need to prove who's using their platforms and show they're protecting younger users.
This shift represents something parents have been demanding for years: real accountability for how social media affects their children. Even if courts strike down individual laws, the pressure on tech companies to create safer spaces for kids is only growing stronger.
Based on reporting by Fast Company
This story was written by BrightWire based on verified news reports.

