New UK Law Requires Tech Platforms to Remove Revenge Porn, Nonconsensual Deepfake Images Within 48 Hours
New law targets nonconsensual deepfakes with swift removal rules and platform penalties.

Five years ago, a woman known only as Jodie opened an anonymous email directing her to a website where sexually explicit images of her had been published alongside her personal details. She had never posed for the images; they had been fabricated by AI. As LBC first reported, the perpetrator turned out to be her closest friend. When she went to the police, they told her no crime had been committed.
That legal void is now closing. On 18 February, Prime Minister Keir Starmer announced that the government would amend the Crime and Policing Bill to require tech platforms to take down intimate images shared without consent, including AI-generated deepfakes, within 48 hours of a report. According to LBC, companies that fail to comply risk hefty fines or having their services blocked in the UK.
A Reckoning Over Revenge Porn and Deepfake Images
The timing is no coincidence. Britain has spent early 2026 convulsed by a deepfake crisis that made the threat of AI abuse viscerally real. In late December, Elon Musk's Grok chatbot, embedded within the social media platform X, began fulfilling user requests to digitally undress women and girls.
A report by the Centre for Countering Digital Hate found that Grok produced an estimated three million sexualised images in barely eleven days. Around two per cent of those analysed appeared to depict minors.
Malaysia and Indonesia temporarily blocked Grok. The European Commission opened an investigation under the Digital Services Act, and Ofcom launched its own inquiry. As Al Jazeera reported, Starmer labelled the images 'disgusting' and 'unlawful,' telling X to 'get a grip.'
Musk's response was characteristically combative. According to Time, he accused the UK government of fascism and posted an AI-generated image of the prime minister in a bikini. X eventually restricted Grok's image tools to paying subscribers, a move Science Secretary Liz Kendall dismissed as 'monetising abuse.'
What makes this legislative push striking is its reach beyond takedown speed. The Department for Science, Innovation and Technology confirmed that victims would need to report an image only once for it to be removed across multiple platforms, with automatic deletion if anyone tries to repost it. Ofcom is also considering classifying such images alongside child sexual abuse material and terrorism content, a designation that would mandate digital fingerprinting and proactive blocking.
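Neither Ofcom nor the government has published how that fingerprinting would work, but hash-matching systems of this kind typically reduce an image to a compact perceptual fingerprint and compare it against a registry of known reported images, so that re-uploads match even after resizing or re-encoding. The sketch below is a minimal illustration in Python, assuming the Pillow imaging library; the 8x8 average-hash scheme, the `HAMMING_THRESHOLD` value, and the registry are illustrative stand-ins, not the actual system any platform or regulator uses.

```python
# Illustrative sketch of perceptual-hash matching, the general technique
# behind "digital fingerprinting" of known abusive images. NOT the actual
# scheme Ofcom or any platform uses; the hash and threshold are stand-ins.
from PIL import Image  # pip install Pillow

HASH_SIZE = 8           # 8x8 grid -> 64-bit fingerprint
HAMMING_THRESHOLD = 10  # illustrative: max differing bits to count as a match

def average_hash(path: str) -> int:
    """Reduce an image to a 64-bit fingerprint that survives resizing and
    mild re-encoding, unlike a cryptographic hash of the raw file bytes."""
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        # One bit per pixel: 1 if brighter than the image's mean, else 0.
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of bit positions where two fingerprints differ."""
    return bin(a ^ b).count("1")

def matches_registry(path: str, registry: set[int]) -> bool:
    """True if the image is a near-duplicate of any reported image,
    which is what would trigger automatic blocking of a re-upload."""
    fingerprint = average_hash(path)
    return any(hamming_distance(fingerprint, known) <= HAMMING_THRESHOLD
               for known in registry)

# Hypothetical usage: fingerprints of reported images would sit in a shared
# registry so one report could block re-uploads across multiple platforms.
# registry = {average_hash("reported_image.jpg")}
# print(matches_registry("new_upload.jpg", registry))
```

The design trade-off such systems face is the threshold: too strict and trivial edits evade the match, too loose and unrelated images are wrongly blocked, which is one reason the CSAM-style designation mentioned above is reserved for the most serious categories of content.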
Why is the UK government so fascist? https://t.co/sRg979MTQx
— Elon Musk (@elonmusk) January 10, 2026
Why 48 Hours May Not Be Enough to Tackle Nonconsensual Deepfake Images
In a new interview, @georgiaharisonx shares how our Revenge Porn Helpline practitioners continue to work to remove non-consensually shared intimate images online.
"She does everything she can to try to get it taken down from all of these different places. They're absolutely… pic.twitter.com/QrQ0tqNOgM
— rphelpline (@RPhelpline) October 14, 2024
Not everyone is satisfied that the clock is ticking fast enough. Speaking to The Register, Hanna Basha, the lawyer who represented television personality Georgia Harrison in her civil revenge pornography case, welcomed the measure but questioned its urgency. 'Why 48 hours and not 24 or even 12?' she asked. 'Every hour these images remain online compounds the harm.' She also raised a more basic problem: many victims cannot even find where to report abusive content.
The amendment sits within a rapidly thickening body of legislation. On 6 February, as Olliers Solicitors detailed, a separate offence under the Data (Use and Access) Act 2025 came into force, criminalising the creation of nonconsensual intimate deepfakes, not merely their distribution, and carrying a potentially unlimited fine. The Crime and Policing Bill will also outlaw nudification tools: apps designed to strip clothing from images using AI.
For Jodie and the coalition behind her, including the End Violence Against Women Coalition, the Revenge Porn Helpline, #NotYourPorn, and Glamour UK, this week felt like vindication. Her petition gathered over 73,000 signatures and was delivered to Downing Street. But campaigners remain clear-eyed. Across 2025, an estimated eight million deepfake images were shared, up from 500,000 two years prior. The scale is industrial, and the tools enabling it proliferate faster than any parliament can legislate.
The minister for violence against women and girls, Alex Davies-Jones, said the law means 'tech platforms can no longer drag their feet.' Perhaps. But the deeper question is whether a 48-hour window represents a genuine transfer of power, or merely a tidier way of asking Silicon Valley to do what it should have done years ago. For women like Jodie, who spent half a decade fighting for recognition that what happened to her was even a crime, imperfect progress still carries weight.