Facebook doubles size of content reviewing team to keep check on offensive or sensitive posts
In a blog post, social media giant Facebook revealed that it has added more human moderators to its team to evaluate sensitive content.
Facebook is reportedly doubling its team to 7,500 content reviewers to scan the offensive or sensitive posts shared on the platform each minute. The larger team is also intended to allow posts to be reviewed in their native language.
When offensive posts are reported by users or detected by filters on the site, they are sent for appraisal to thousands of content reviewers around the world.
The social networking giant, which also uses artificial intelligence to weed out unwanted content on its platform, requires its content reviewers to undergo intensive training in the review process.
In 2017, Facebook said it would expand its content review team to 7,500.
(Featured Image – alliance/dpa/S. Stache)