Facebook Inc will invite external experts to independently audit its quarterly report on metrics used for removing content, the social network said on Tuesday, as it unveiled the sixth such report.
Introduced in 2018, Facebook’s Community Standards Enforcement Report provides details on content it removed across its apps for policy violations, including violence, suicide and hate speech.
The company said it relied more heavily on automation technology for reviewing content during the months of April, May and June, with fewer reviewers at offices due to the COVID-19 pandemic.
That resulted in the company taking action on fewer pieces of content related to suicide and self-injury, child nudity and sexual exploitation on its platforms, Facebook said in a blog post (https://about.fb.com/news/2020/08/community-standards-enforcement-report-aug-2020).
(Reporting by Munsif Vengattil in Bengaluru; Editing by Shinjini Ganguli)