
US victims of explicit deepfakes to gain federal protections
What's the story
President Donald Trump is expected to sign the "Take It Down Act" into law on Monday.
The legislation makes it illegal to share nonconsensual explicit images, whether real or computer-generated. It also requires tech platforms to remove such content within 48 hours of being notified.
The law strengthens protections for victims of revenge porn and AI-generated sexual imagery, holding tech platforms accountable and giving law enforcement a clearer basis for prosecution.
Legislative backing
Take It Down Act receives bipartisan support
The Take It Down Act has received overwhelming support in Congress, passing with only two dissenting votes. Over 100 organizations, including Meta, TikTok, and Google, have backed the legislation.
First Lady Melania Trump also supported the bill by lobbying House lawmakers and inviting teenage victim Elliston Berry to a joint session of Congress.
Texas Senator Ted Cruz and Minnesota Senator Amy Klobuchar introduced the bill last summer.
AI impact
Deepfake legislation addresses AI-generated content concerns
The Act is one of the first federal laws in the United States to tackle the dangers posed by AI-generated content.
Ilana Beller from Public Citizen said, "AI is new to a lot of us... but (non-consensual) intimate deepfakes are such a clear harm with no benefit."
The law seeks to protect victims like Berry, who was targeted by an explicit deepfake image shared on Snapchat.
Industry response
Tech platforms respond to deepfake concerns
Even ahead of the new legislation, major tech platforms had already taken steps to remove nonconsensual sexual images.
Google, Meta, and Snapchat offer forms that let users request the removal of explicit images. Apple and Google have also pulled apps that turn clothed photos into nude images from their app stores.
However, some sites still don't cooperate with StopNCII.org and Take It Down, services that help victims get images removed across multiple platforms.