Facebook adds more human moderators to keep your posts in check

Last Updated: Fri, Jul 27, 2018 13:00 hrs
The Facebook logo is pictured on a window at the new Facebook Innovation Hub in Berlin

In a recent blog post, Facebook revealed that it has added more human moderators to review content for hate speech.

Facebook has doubled its review team to 7,500 content reviewers, who scan the thousands of posts shared on the platform every minute, Engadget reported.

These reviewers undergo weeks of intensive training in the company's content-review policies and processes. Facebook also uses artificial intelligence to tackle unwanted content on its platform.