
3 Million Dating Photos Deleted After Privacy Victory
Following an FTC investigation, AI company Clarifai deleted 3 million dating profile photos and the facial recognition models trained on them. The move marks a rare accountability win for user privacy in the tech industry.
When 3 million OkCupid users uploaded their photos to find love in 2014, they never imagined their faces would train artificial intelligence. But that's exactly what happened, and now those images are finally gone.
AI platform Clarifai just deleted all 3 million photos it received from OkCupid over a decade ago, along with any facial recognition models built using that data. The deletion came after a Federal Trade Commission investigation revealed the dating app shared user photos, demographic details, and location data without permission.
The story began when Clarifai's founder emailed OkCupid's co-founder in 2014, noting that the dating platform "must have a HUGE amount of awesome data" for training AI. OkCupid executives, who had invested in Clarifai, shared the data despite the company's own privacy policy clearly prohibiting such sharing.
The images were used to build AI tools that could guess a person's age, sex, and race from their face. The practice only came to light in 2019, when a New York Times article mentioned it, prompting the FTC to investigate.

Last month, the FTC and OkCupid reached a settlement. While OkCupid and its parent company Match Group didn't admit wrongdoing, Clarifai's deletion of the data confirms the photos were indeed shared.
The Bright Side
This case represents a meaningful shift in how regulators hold tech companies accountable for misusing personal data. While the FTC can't fine companies for first-time offenses like this, it has permanently banned OkCupid and Match from misrepresenting their data collection practices or helping others do the same.
The deletion of both the photos and the AI models trained on them goes beyond a simple slap on the wrist. It means the technology built on improperly obtained data no longer exists, setting a precedent that could influence how other companies handle similar situations.
For the millions of people whose faces were used without consent, the outcome offers a rare form of digital justice. Their images are gone, and the AI trained on their faces has been erased.
Sometimes holding tech companies accountable takes years, but this case proves it can actually happen.
Based on reporting by TechCrunch
This story was written by BrightWire based on verified news reports.