Meta sues AI company over 'Nudify' ads
Meta is taking legal action against Joy Timeline HK Limited, the developer of AI-powered "nudify" apps that turn ordinary photos into explicit images without consent. The apps were advertised on Facebook and Instagram, raising serious privacy concerns, and the lawsuit is part of Meta's broader push to protect users from non-consensual intimate imagery.
How the developer kept dodging bans
Meta has already removed more than 87,000 ads and blocked accounts linked to these nudify tools, but the developer repeatedly evaded bans by creating new accounts and websites. To stay ahead, Meta has upgraded its AI systems to detect these ads faster and is sharing signals with other platforms through the Tech Coalition's Lantern Program so similar content can be taken down elsewhere.
Laws supported by this lawsuit
The lawsuit reinforces laws like the Take It Down Act, which bans sharing explicit images without consent, including AI-generated ones. Meta also backs stricter age verification for app downloads. Moves like this show how tech companies are under growing pressure to do more for user privacy and digital safety.