
AI-powered 'nudify' apps drive digital blackmail surge


Sextortion scams targeting kids—especially boys aged 14-17—are on the rise, with criminals using AI-powered "nudify" apps to create fake nude images for blackmail.
The tragedy of 16-year-old Elijah Heacock, who died by suicide after being blackmailed for $3,000 over a fake image, shows just how serious this has become.

How scammers use AI to create fake images

AI-powered sextortion no longer requires a real explicit photo; scammers take ordinary pictures and turn them into convincing fakes.
These nudify apps advertise openly online, and the market is substantial: studies estimate up to $36 million in yearly revenue.
Even with Meta's lawsuit against a company behind a nudify app and new laws in the US and UK targeting deepfakes, these services keep popping up and are tough to shut down.
The FBI says this new technology makes it easier than ever for scammers to target teens, which makes awareness all the more important.