Apple, Google app stores criticized for hosting AI 'undressing' apps
What's the story
A recent investigation by the Tech Transparency Project (TTP) has revealed a disturbing trend in both Apple's and Google's app stores. The study found dozens of "nudify" apps that use artificial intelligence (AI) to digitally strip clothes off people in photos or create fake nude images. The report, based on an analysis of US app stores, raises serious questions about how the two tech giants enforce their safety guidelines.
International reach
Global availability of AI 'nudify' apps
The TTP's investigation uncovered 55 nudify apps on the Play Store and 47 on the App Store. The apps were found using search terms like "nudify" and "undress," and many were tested against AI-generated photos of fully clothed women to check whether they could digitally remove clothing. The results were alarming: the tools produced realistic nude or sexualized versions of the images, enabling non-consensual deepfake pornography.
Company actions
Apple and Google respond to TTP's findings
After being contacted by CNBC and the TTP with details from the report, Apple removed 28 of the identified apps from its App Store. The company also warned developers of other similar apps that their apps could be removed if they failed to comply with its guidelines. However, two of the removed apps were later reinstated after their developers resubmitted revised versions that Apple deemed compliant.
App suspension
Google suspends some nudify apps
Google has suspended some of the nudify apps flagged in the report for violating its policies, though it did not specify how many were affected. A company spokesperson said Google investigates such cases thoroughly and takes appropriate action when violations are confirmed. According to Google, its investigation into the apps flagged by the TTP is still ongoing.
Criticism
TTP criticizes Apple and Google for hosting nudify apps
The TTP has strongly criticized both Apple and Google, saying they host a collection of apps that can turn an innocuous photo of a woman into an abusive, sexualized image. The rise of AI-powered nudify and deepfake tools has fueled a surge in digital sexual abuse, enabling anyone with a smartphone to create non-consensual explicit images. Experts warn that these technologies violate victims' privacy and inflict lasting emotional trauma.
Call for action
Tech giants urged to strengthen content moderation
Despite the growing awareness of AI misuse, regulatory action against such abuse remains limited. Many apps disguise their true purpose under seemingly harmless names or descriptions, making it easier for them to evade detection. Digital rights advocates have called on tech companies to take more proactive measures against these apps, including stronger AI content moderation, human-led review systems, and transparency around algorithmic misuse.