Facebook removed 583 million fake accounts during Jan-March 2018

16 May 2018 | By Bhavika Bhuwalka
Facebook releases its enforcement report for first time

According to Facebook's first-ever Community Standards Enforcement Report, the company disabled 583 million fake accounts during the first quarter of 2018.

Further, the majority of these fake accounts were blocked just minutes after registration.

On average, that works out to 6.5 million fake accounts disabled daily; the quarterly total of 583 million amounts to over a quarter of Facebook's 2.2 billion monthly active users.

Context: Facebook blocks millions of fake account attempts every day

However, Facebook's rate of blocking fake accounts is actually decreasing: it removed about 100 million more fake accounts in the last quarter of 2017 than in this quarter.

This decline was attributed to the "variability of our detection technology's ability to find and flag" fake accounts.

About 3-4% of accounts on Facebook are estimated to be fake. That translates to at least 66 million fake accounts on the platform.

Details: For every 10,000 views on Facebook posts, 8 were removed

Facebook also removed 837 million spam posts from the platform in the first quarter of 2018. The vast majority of these posts were deleted before anyone reported them.

They included 21 million pieces of content featuring sex or nudity, 2.5 million pieces of hate speech, and almost 2 million pieces of terrorism-related content from al-Qaeda and ISIS.

Humans vs AI: AI moderates objectionable content on Facebook

A key tool in the fight against fake accounts and abusive posts was AI: over 96% of the posts Facebook removed for inappropriate content were flagged by AI systems deployed to monitor the platform.

However, humans remain key in catching hate speech, with about 62% of such posts being manually reported by users rather than detected automatically.

Facebook plans to update its enforcement report twice a year

Only a month ago, Facebook published the internal rules its content moderators use to decide what should be removed from the platform. Facebook currently employs 10,000 reviewers to remove objectionable content, and the company plans to double this number by year-end.