Facebook content moderators make 3L blunders a day
At present, about 15,000 workers police Facebook's main platform and its Instagram subsidiary.
Tasked with reviewing about three million posts a day, Facebook moderators make about three lakh (300,000) mistakes every 24 hours in deciding what should stay online and what should be taken down, said a new report.
The report comes from New York University's Stern Center for Business and Human Rights.
The number of blunders was derived from a statement Facebook CEO Mark Zuckerberg made in a white paper in November 2018, in which he admitted that moderators "make the wrong call in more than one out of every 10 cases" — an error rate that, applied to three million posts a day, works out to roughly three lakh mistakes.
Most of these workers are employed by third-party vendors, said the report, adding that the frequently chaotic outsourced environments in which moderators work impinge on their decision making.
To sanitise the platform effectively, the report recommended, Facebook needs to end the outsourcing of content moderation and double the number of people who moderate content on a daily basis.
"The peripheral status of moderators undercuts their receiving adequate counseling and medical care for the psychological side effects of repeated exposure to toxic online content," said the report, titled "Who Moderates the Social Media Giants? A Call to End Outsourcing".
"Watching the worst social media has to offer leaves many moderators emotionally debilitated. Too often, they don't get the support or benefits they need and deserve," said the report authored by Paul M. Barrett, Deputy Director of the NYU Stern Center for Business and Human Rights.
The author also recommended that Facebook significantly expand fact-checking to debunk misinformation.