Instagram trials new auto image-blurring feature to combat sextortion
There are increasing calls for stricter regulation of social media platforms

Apr 12, 2024
11:43 am

What's the story

Instagram is set to test a new feature aimed at tackling sextortion on its platform. The "nudity protection" feature will automatically blur images identified as containing nudity for users under 18, accompanied by a warning that reads, "Photo may contain nudity." With this feature, the social media giant aims to protect users from unsolicited sexual images and encourage them to think twice before sharing explicit photos.

User education

Instagram to guide users on risks of sharing sensitive photos

In addition to the image-blurring function, Instagram will issue reminders about the risks of sending sensitive photos and inform users that they can retract these photos if they change their minds. The platform will also point users toward Meta's Safety Center and various support helplines. According to the company's blog post, these enhancements are part of Instagram's ongoing commitment to protecting young people from unwanted or potentially harmful interactions.

Online safety

Growing push for stricter regulation of social media

This move comes amid growing calls in the US for stricter regulation of social media platforms. Recently, Florida Governor Ron DeSantis signed a law barring children under 14 from holding social media accounts; the legislation may face legal challenges over potential First Amendment infringement. In December, over 200 organizations urged Senate Majority Leader Chuck Schumer to bring the Kids Online Safety Act (KOSA) to a vote. The bill aims to hold digital platforms accountable for content that could harm minors' mental health.

Expert opinion

Experts commend Instagram's initiative

Instagram has been working with experts for several years to understand the tactics scammers use to find and extort victims online, with the goal of developing effective prevention strategies. Sameer Hinduja, co-director of the Cyberbullying Research Center and a faculty associate at Harvard University's Berkman Klein Center, praised Instagram's initiative. He said the new feature would reduce "unwanted exposure to potentially traumatic images" and raise awareness of the dangers of sharing explicit photos.

Support services

Instagram endorses services for removing explicit images

Instagram supports the National Center for Missing and Exploited Children's Take It Down service, which helps remove explicit or semi-explicit images of minors from the internet. Users under 18 will be directed to this service, while those over 18 will be guided to the Stop Non-Consensual Intimate Image Abuse website, operated by the Revenge Porn Helpline. These endorsements further demonstrate Instagram's commitment to combating sextortion and protecting its users from potentially harmful content.