Facebook Trains 7,500 Content Reviewers over Offensive Posts

July 28, 2018 15:14

(Image source: Independent.ie)

Social networking company Facebook said it is continuously training its more than 7,500 content reviewers to handle posts involving hate speech, terrorism, and child sexual exploitation on its platform, after facing criticism over reports that its moderators protected far-right activists and under-age accounts.

The content reviewers comprise regular employees, contractors, and partner companies, covering more than 50 languages and every time zone in the world.

"Content review at this size has never been done before. After all, there has never been a platform where so many people communicate in so many different languages across so many different countries and cultures. We recognize the enormity of this challenge and the responsibility we have to get it right," Ellen Silver, Vice President of Operations at Facebook, wrote in a blog post on Friday.

"Language proficiency is key and it lets us review content around the clock. If something is reported in a language that we don't support 24/7, we can work with translation companies and other experts who can help us understand local context and language to assist in reviewing it," Silver added.

The company came under heavy criticism after the Channel 4 investigative documentary series Dispatches sent an undercover reporter to work as a content moderator at a Dublin-based Facebook contractor.

The documentary showed that Facebook moderators were preventing Pages run by far-right activists from being deleted even after they violated the rules.

Silver said the company is training its team of content reviewers in three areas: pre-training, which covers what to expect on the job; hands-on learning, which includes a minimum of 80 hours with a live instructor followed by hands-on practice; and ongoing coaching.

"We want to keep personal perspectives and biases out of the equation entirely - so, in theory, two people reviewing the same posts would always make the same decision. Of course, judgments can vary if policies aren't sufficiently prescriptive," Silver said.

Facebook said it audits a sample of reviewer decisions each week to determine whether wrong calls were made.

"Our auditors are even audited on a regular basis. In addition, we have leadership at each office to provide guidance, as well as weekly check-ins with policy experts to answer any questions," said the social media giant.

Facebook said it has a team of four clinical psychologists across three regions who are tasked with designing, delivering, and evaluating resiliency programs for everyone who works with graphic and objectionable content.

"This group also works with our vendor partners and their dedicated resiliency teams to help build industry standards," said Silver.

By Sowmya Sangam
