UK Law Now Criminalises AI Deepfake Porn Images Amid Victim Calls for Stronger Protections
February 7, 2026
Victims of deepfake abuse in the UK now have stronger legal protection. From Friday, creating sexually explicit AI-generated images of a person without their consent is a criminal offence. Campaigners from Stop Image-Based Abuse handed a petition with more than 73,000 signatures to Downing Street, calling for additional civil remedies such as takedown orders to remove abusive images from websites and devices.
Jodie, a victim who uses a pseudonym, said: “Today’s a really momentous day.” She praised the government for extending protection to more women and girls. Jodie discovered AI-generated pornographic images of herself in 2021 and joined 15 other women in testifying against Alex Woolf, 26, who had posted stolen images of them online. He was jailed for 20 weeks.
“But I had a really difficult route to getting justice because there simply wasn’t a law that really covered what I felt had been done to me,” Jodie said.
The offence was added to the Data (Use and Access) Act 2025. Although the act passed last July, the offence was not brought into force until this April, frustrating campaigners. Jodie said: “They should have brought them in immediately. The delay has caused millions more women to become victims.”
In January, Leicestershire police began investigating deepfake images generated by Grok AI. Madelaine Thomas, a sex worker and the founder of Image Angel, called it a “very emotional day” but said the law does not fully protect sex workers. Abuse of commercial sexual images, she explained, is treated only as a copyright violation rather than as harm serious enough to warrant stronger support.
Thomas shared, “When I first found out my intimate images were shared, I felt suicidal.”
One in three women in the UK has faced online abuse, according to the domestic abuse charity Refuge. Stop Image-Based Abuse’s members include major organisations such as the End Violence Against Women Coalition and experts from Durham University.
A Ministry of Justice spokesperson said, “Weaponising technology to target and exploit people is completely abhorrent. It’s already illegal to share intimate deepfakes – and as of yesterday, creating them is a criminal offence too.”
The government also plans to ban ‘nudification’ apps and to make creating such deepfakes a priority offence under the Online Safety Act, requiring platforms to proactively block these images.
The UK is stepping up to curb AI image abuse, but victims say more action is needed to bring justice and support to all affected.
Read more at The Guardian →
Tags:
Deepfake Abuse
AI-Generated Images
Non-Consensual Images
Image-Based Abuse
UK Law
Intimate Image Abuse