
Olympics Team Shields Athletes From Online Abuse With AI
After Olympic skier Kokone Kondo faced cruel comments following an injury withdrawal, Japan's Olympic Committee deployed AI monitoring in Milan to protect athletes from harassment. The system flags abusive posts 24/7, offering a blueprint for athlete mental health protection worldwide.
When 22-year-old Japanese freestyle skier Kokone Kondo had to withdraw from the Milan Olympics due to a knee injury, online trolls told her she shouldn't qualify next time. But this heartbreaking moment sparked something powerful: a full-scale protection system now shielding athletes from digital cruelty.
Kondo's injury came at the worst possible time. A fall during practice on February 5 damaged her left knee, forcing her out of both the slopestyle and big air competitions. It was her second consecutive Olympics cut short by injury, following a similar setback in Beijing in 2022.
The online attacks came swiftly. Strangers who knew nothing about her life or sport sent messages like "Your potential only exists if you can manage your body" and "Missing two Games in a row is unacceptable." Kondo shared the abuse on Instagram, refusing to stay silent about the cruelty athletes face during their most vulnerable moments.
Her courage revealed a problem bigger than one athlete. Research shows female competitors face disproportionate online harassment, including sexualized attacks that male athletes rarely encounter. These aren't just mean comments. They're human rights violations that threaten psychological safety, damage mental health, and can force athletes into early retirement.
The Japanese Olympic Committee responded with action. Their Milan team now uses artificial intelligence combined with human reviewers to monitor social media around the clock. When abusive posts appear, staff flag them for removal and provide legal support to targeted athletes.

The system builds on lessons from October, when Japanese athletes at the World Athletics Championships faced roughly 500 abusive messages. About 230 were serious enough to warrant removal requests. That experience taught the JOC how to scale protection for the Winter Games.
Attorney Shoichi Sugiyama, who leads the Japan Safe Sport Project, explains that victims often face dead ends when seeking justice. Social media companies don't always respond to removal requests, and technical limitations make identifying anonymous abusers nearly impossible. Athletes end up bearing the financial and emotional costs of legal action alone.
The Ripple Effect
Japan isn't alone in this fight. U.S. figure skater Amber Glenn received threats after making political comments at a press conference, describing the volume as "scary." Athletes worldwide now face a reality where every post and interview can trigger waves of digital harassment.
But the protective measures emerging from these challenges offer hope for systemic change. The AI monitoring system proves that organizations can actively shield competitors instead of leaving them to face abuse alone. Other national Olympic committees are watching Japan's approach as a potential model.
Sugiyama offers simple guidance that could prevent most harassment: don't post anything online you wouldn't say to someone's face. That basic empathy could protect athletes already carrying the weight of competition, injury, and impossible expectations.
Athletes like Kondo are transforming their pain into progress by speaking up, ensuring future competitors won't face the same digital cruelty alone.
Based on reporting by Japan Times
This story was written by BrightWire based on verified news reports.


