Tech giant Facebook has today released its Community Standards Enforcement Report (CSER) for the first quarter of 2021. The new report covers the January–March 2021 period and shows how Facebook is performing in enforcing 12 of its policy areas on Facebook and 10 on Instagram.
Additionally, it reveals key metrics attained by Facebook in its quest to prevent and take action on content that goes against its Community Standards while protecting the community’s safety, privacy, and dignity.
The report shows positive strides on prevalence metrics, providing greater transparency and accountability around content moderation operations across different Facebook products.
Facebook has also appointed EY to audit the latest report, as part of its commitment to independent oversight and verification of its performance.
In a statement, Facebook revealed that it has expanded its reporting by adding more prevalence metrics for Instagram, which will now include prevalence for adult nudity and sexual activity, and for violent and graphic content.
The most notable finding in the report is that the prevalence of hate speech on Facebook has continued to decline, owing largely to changes Facebook has made to reduce problematic content in News Feed.
Commenting on the report, Kojo Boakye, Facebook's Director of Public Policy for Africa, said:
“The report highlights how we are getting better and more efficient at enforcing our Community Standards through a multipronged approach that includes Artificial Intelligence (AI), human moderation, user reporting tools and collaboration with external experts.”
“It’s pleasing that hate speech prevalence on Facebook continues to decrease for the third quarter in a row. Across the board, we’re seeing improvements in our numbers, and we will continue to share our progress each quarter,” he concluded.
Facebook is also launching a Transparency Centre to provide a single destination for information about its integrity efforts. The site will make it easier to view Community Standards Enforcement Report trends and find data on specific policy areas.
Facebook said it is committed to “improving technology and enforcement efforts to remove harmful content from our platforms and keep people safe while using the Facebook family of apps.”
Facebook has consistently come under fire over how it enforces community standards on its platform.
WhatsApp, a Facebook subsidiary, recently sued the Indian government over new digital rules that it says would force the messaging service to violate privacy protections.