
UK Bans AI Apps That Create Fake Nudes Without Consent
The UK just made it illegal to create or distribute AI apps that digitally remove clothing from photos, closing a dangerous loophole that has allowed the technology to be weaponized against women and girls. Developers and distributors now face criminal charges for tools that have caused real psychological harm.
The UK just declared war on a disturbing corner of the internet that's been hiding in plain sight. Starting now, creating or distributing AI-powered apps that strip clothing from photos without consent is officially a crime.
These so-called "nudification" apps use artificial intelligence to turn ordinary photos into realistic fake nudes. They've been marketed as harmless fun, but the damage they cause is anything but entertaining.
"Women and girls deserve to be safe online as well as offline," said Technology Secretary Liz Kendall when announcing the ban. The new law targets the people making money from these tools, not just the users abusing them.
The problem had grown more urgent than most people realized. According to the Internet Watch Foundation, nearly one in five young people who reported explicit images in 2024 said their photos had been digitally manipulated without permission.
Even more alarming, these fake images sometimes ended up in the darkest corners of the internet, including sites sharing child exploitation material. The psychological impact on victims can be devastating, especially when the images spread beyond their control.

The new ban builds on the Online Safety Act, which already made it illegal to create non-consensual explicit deepfakes. But until now, the companies building and selling the apps themselves operated in a legal gray area.
The Ripple Effect
This legislation sets a powerful precedent for how governments can respond to AI-enabled abuse. The UK isn't stopping here. Officials are partnering with tech companies to develop tools that can detect and block intimate image abuse in real time.
SafeToNet, a UK company, is already working on software that can disable a phone's camera if it detects someone is about to capture sexual content. Similar protections already exist on Meta's platforms to help prevent minors from sharing intimate photos.
Child protection organizations applaud the move but say there's more work to do. The NSPCC wants mandatory device-level protections that block harmful content creation on the device itself, not just within individual apps.
The government promises this is just the beginning. Next up: legislation to ban AI tools designed to create child exploitation material and make it impossible for children to take, view, or share nude images on their devices.
As AI technology races forward, laws protecting people from digital exploitation are finally starting to catch up.
Based on reporting by Optimist Daily
This story was written by BrightWire based on verified news reports.