25 Apr 2019
Dating app Bumble's AI will save you from unwanted nudes
Priyanka Chopra-backed Bumble dating app will soon use artificial intelligence to save people from unwanted nudes.
Yes, the service will have an AI-assisted 'private detector' to flag lewd photos and warn the user before they actually open those photos.
The feature adds to the wide-ranging capabilities the app already offers to prevent harassment in the name of dating.
Here's all about it.
Bumble has been blurring, watermarking images
Unlike Tinder, Bumble has long offered an option to share photos with matches.
However, as a safety precaution, the service automatically blurs all sent photos, giving the receiver the option to hold down on the image to open it.
In addition, the company also watermarks sent photos with the profile picture of the sender so that the image-sending feature isn't abused for harassment.
But these precautions haven't proved truly effective
While these features have helped Bumble gain an edge over its competitors, they haven't fully tackled the problem of harassment.
Users have reported that people create fake profiles to bombard their accounts with unwanted explicit images.
In fact, a 2016 study even claimed that 57% of women and 21% of men feel harassed while using dating services.
Now, AI will tackle the issue
Now, in a bid to bolster its safety toolkit, Bumble has announced the new 'private detector'.
The feature will automatically detect inappropriate content and flag it to the user, giving them the option to block it, open it, or report it to Bumble moderators.
Notably, Bumble claims that its AI can detect and warn users about lewd content with 98% accuracy.
Other dating apps will also get the same feature
To note, Bumble's sibling apps - Badoo, Chappy, and Lumen, all owned by its parent company - will also get the same feature.
Plus, a spokesperson from the company tells Engadget that the service will also get an AI-assisted feature to detect inappropriate messages and foul language.
Interestingly, Tinder already offers an AI engine to scan profiles for harmful language and images.
Statement from Andrey Andreev, CEO of Bumble's parent company
"The sharing of lewd images is a global issue of critical importance and it falls upon all of us in the social media and social networking worlds to lead by example and to refuse to tolerate inappropriate behavior on our platforms."