'Nudify' Apps Still Appear In Apple, Google Platforms Despite Policy Bans — Report Finds, Flags Risk To Minors
Tech Transparency Project reveals alarming findings on "nudify" apps in both Google and Apple app stores

Typing 'AI NS' into Apple's App Store allegedly triggered autocomplete suggestions such as 'image to video ai nsfw', according to a new investigation by the Tech Transparency Project (TTP). Following these suggestions led researchers to a cluster of apps that claimed to digitally alter images of women, including by generating sexually explicit content. Thirty-one of the apps identified by TTP carried age ratings marking them as suitable for minors.
The watchdog group, a research arm of the nonprofit Campaign for Accountability, published its findings on Wednesday. Bloomberg, which covered the report, said Apple removed 15 of the flagged apps and Google pulled seven from its Play Store after being contacted about the findings.
TTP ran its tests using newly created Apple and Google accounts across seven search terms: 'nudify', 'undress', 'deepfake', 'deepnude', 'adult AI', 'face swap', and 'AI NSFW'. The searches allegedly returned several such apps on both the Apple App Store and the Google Play Store. According to the report, many of the apps carried age ratings that placed them within easy reach of children. Roughly 40 per cent of the apps surfaced in search results across the two stores could render women nude or scantily clad, according to the watchdog.
A5. "Nudify" apps exploit #AI to remove clothing from photos, often targeting minors without consent. This can lead to exploitation and blackmail. Educate kids on the risks and encourage reporting. Learn how the #FBI can help at https://t.co/XrahA9d20e #NCAPM25 https://t.co/UWubU4AWLA
— FBI (@FBI) April 30, 2025
Apps Generated Millions In Downloads And Revenue
Together, the apps TTP identified have been downloaded 483 million times and have generated more than $122 million (£92 million) in lifetime revenue, according to figures compiled by mobile analytics firm AppMagic. Apple and Google both take a cut of in-app transactions processed on their platforms.
Katie Paul, TTP's director, told Bloomberg that the two tech giants were not merely failing to vet the apps or profiting from them; they were 'actually directing users to the apps themselves.'
The TTP report said some search queries returned sponsored placements promoting such apps. A search for 'deepfake' allegedly returned an ad for FaceSwap Video by DuoFace as the first result, while another for 'face swap' produced a promoted listing for an app called simply AI Face Swap. In testing, some of these apps swapped faces between uploaded images of clothed and topless women without restriction, TTP said.
A third app, Video Face Swap AI: DeepFace, marketed itself on Google Play using actress Anya Taylor-Joy's face superimposed onto the body of Game of Thrones character Daenerys Targaryen. Inside, a category labelled 'Girls' contained templates of women in sexualised poses. The app carries an 'E' for Everyone rating and had been downloaded more than a million times, the report added.
Okapi Software, which develops Video Face Swap AI, said it had opened an internal review and removed some of the content after being contacted. The firm said its product was not designed to offer nudifying functionality and attributed the explicit material to user uploads. 9to5Mac reported that one developer separately confirmed to TTP it was using Grok for image generation and pledged to tighten moderation settings after learning what the tool could produce.
I am not aware of any naked underage images generated by Grok. Literally zero.
— Elon Musk (@elonmusk) January 14, 2026
Obviously, Grok does not spontaneously generate images, it does so only according to user requests.
When asked to generate images, it will refuse to produce anything illegal, as the operating principle… https://t.co/YBoqo7ZmEj
Nudify Apps Rated For Minors Spark Alarm In Schools
The age ratings are what give the findings their sharpest edge. TTP flagged the 31 apps rated suitable for minors as particularly concerning against the backdrop of rising AI-generated deepfake scandals in schools, where classmates have used similar tools to target one another.
Apple's developer rules explicitly ban 'overtly sexual or pornographic material.' Google's Play Store policy goes further, prohibiting apps that claim to undress people or see through clothing, even when pitched as pranks or entertainment. On paper, both policies cover the territory in question.

Google spokesperson Dan Jackson said the company investigates reported violations and takes 'appropriate action', according to the Tech Transparency Project report. Apple reportedly declined to comment.
Pressure is also mounting from lawmakers. US President Donald Trump signed the Take It Down Act on 19 May 2025, criminalising the publication of non-consensual intimate imagery, including AI-generated deepfakes. Platforms have until May 2026 to implement notice-and-removal procedures.
The UK government announced last week an amendment to the Crime and Policing Bill that would criminalise the supply of nudification tools and allow senior executives of non-compliant firms to face imprisonment or fines. Last August, the National Association of Attorneys General wrote to Apple Pay and Google Pay urging them to stop processing payments for services that manufacture non-consensual sexual images.
TTP's earlier probe, published in January, flagged more than 100 nudify apps across the two stores. Apple and Google each removed more than two dozen apps at the time. Fresh ones, the latest investigation suggests, have since filled the vacancies.
© Copyright IBTimes 2025. All rights reserved.