Google will now automatically remove NCII content from Search
Google is teaming up with StopNCII.org to help stop the spread of non-consensual intimate images (NCII) on the internet.
Soon, Google Search will use hash-matching technology to detect and remove these images without storing the original files, relying on PDQ perceptual hashes for photos and MD5 hashes for videos.
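To illustrate the general idea of hash matching, here is a simplified sketch (not Google's or StopNCII's actual implementation): a service keeps only the hash of a reported file and flags later uploads whose hash matches, so the original content is never retained. MD5 gives an exact-match fingerprint suited to videos; PDQ (not shown here) is a perceptual hash that also tolerates small edits to an image.

```python
import hashlib

def md5_fingerprint(data: bytes) -> str:
    """Return the MD5 hex digest of a file's bytes (an exact-match fingerprint)."""
    return hashlib.md5(data).hexdigest()

# Hypothetical flow: a victim submits a video once; only its hash is stored.
reported_hashes = {md5_fingerprint(b"reported-video-bytes")}

def is_known_ncii(upload: bytes) -> bool:
    """Flag an upload if its fingerprint matches a previously reported hash."""
    return md5_fingerprint(upload) in reported_hashes

print(is_known_ncii(b"reported-video-bytes"))   # identical bytes match
print(is_known_ncii(b"different-video-bytes"))  # any change breaks an MD5 match
```

Note that because MD5 matches only byte-identical files, re-encoding a video defeats it; perceptual hashes like PDQ exist precisely to catch near-duplicates of images.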
The move follows criticism that Google has been slower than platforms like Instagram and TikTok to tackle this issue.
Google is taking some pressure off victims
Until now, victims often had to report abusive content themselves, a burden many felt was unfair.
Critics also want Google to do more about AI-generated NCII, without making users jump through extra hoops.
With this partnership, Google aims to catch harmful images faster and take some of the pressure off those affected, signaling a stronger commitment to online safety.