Facebook moderated 32mn posts in compliance with new intermediary guidelines
Shortly after being summoned by a Parliamentary Standing Committee on Information Technology chaired by Congress MP Shashi Tharoor, Facebook and its subsidiary Instagram revealed they took down as many as 32 million posts in compliance with the IT Ministry's new guidelines for social media intermediaries. However, Facebook has not yet released any data on content removed in response to government orders or political pressure.
30 million posts moderated on Facebook, 2 million on Instagram
Between May 15 and June 15, Facebook and its subsidiary Instagram took down 32 million posts containing spam, nudity, bullying, harassment, and violent content. The lion's share, 30 million posts, was removed from Facebook while two million posts were taken down from Instagram. The Silicon Valley giant said the content was identified by algorithms that scour its platforms for unlawful content.
Facebook acted on content flagged from 10 different categories
Facebook said it had "actioned" content from ten categories, including spam (25 million), violent and graphic content (2.5 million), adult nudity and sexual activity (1.8 million), and hate speech (311,000). Additionally, the social media giant acted on content flagged as bullying and harassment (118,000), suicide and self-harm (589,000), dangerous organizations and individuals: terrorist propaganda (106,000), and dangerous organizations and individuals: organized hate (75,000).
Instagram identified objectionable content across nine categories
On Instagram, content was addressed across nine categories, including suicide and self-harm (699,000), violent and graphic content (668,000), adult nudity and sexual activity (490,000), and bullying and harassment (108,000). Facebook explained that flagged content could be posts, photos, videos, or comments, while corrective action could include deleting the content or covering it with an appropriate warning for audiences.
Social media platforms would have to submit monthly compliance reports
The IT Ministry's guidelines state that all social media platforms with over five million users must submit compliance reports on a monthly basis, in addition to appointing officers responsible for a structured grievance redressal system. The reports must include details of the complaints received and the subsequent action taken by the platform. Facebook said it will publish its next report on July 15.
We will continue to bring more transparency, information: Facebook
Earlier this week, Facebook said an interim report would be published on July 2 and the final version on July 15. The latter will include user complaints received and the corresponding action that was taken. Speaking to PTI, a Facebook spokesperson said, "We will continue to bring more transparency to our work and include more information about our efforts in future reports."
Facebook claims it proactively moderated almost all the flagged content
Facebook's final reports are expected to contain data on actioned posts from its subsidiaries, including WhatsApp and Instagram. The parent company claims to have acted on 96.4% to 99.9% of the content proactively, meaning the content was moderated before users reported it. However, the proactivity rate was a dismal 36.7% for bullying-related content, probably owing to its contextual nature.
Google, YouTube, Aprameya Radhakrishna's Koo also file similar compliance reports
Separately, tech giants Google and YouTube also submitted their reports in compliance with the guidelines for intermediaries. Homegrown Twitter rival Koo also submitted its reports. Google and YouTube reportedly received 27,762 complaints from Indian users in April, resulting in the removal of 59,350 pieces of content. Meanwhile, in June, 5,502 posts on Koo were reported by its users while the platform moderated 54,235 posts.