Facebook to add 3,000 workers to filter out violent content, says Zuckerberg | HT Tech

The hiring spree is an acknowledgement by Facebook that, at least for now, it needs more than automated software to improve monitoring of posts.

By: REUTERS, SAN FRANCISCO
| Updated on: May 03 2017, 21:11 IST
Facebook Live, a service that allows any user to broadcast live, has been marred since its launch last year by instances of people streaming violence. (Reuters Photo)

Facebook will hire 3,000 more people over the next year to respond to reports of inappropriate material on the social media network and speed up the removal of videos showing murder, suicide and other violent acts, chief executive Mark Zuckerberg said on Wednesday.

The hiring spree is an acknowledgement by Facebook that, at least for now, it needs more than automated software to improve monitoring of posts. Facebook Live, a service that allows any user to broadcast live, has been marred since its launch last year by instances of people streaming violence.

Zuckerberg, the company's co-founder, said in a Facebook post the workers will be in addition to the 4,500 people who already review posts that may violate its terms of service.

Last week, a father in Thailand broadcast himself killing his daughter on Facebook Live, police said. After more than a day, and 370,000 views, Facebook removed the video. Other videos from places such as Chicago and Cleveland have also shocked viewers with their violence.

Zuckerberg said: "We're working to make these videos easier to report so we can take the right action sooner - whether that's responding quickly when someone needs help or taking a post down."

The 3,000 positions will be new hires who will monitor all Facebook content, not just live videos, the company said. The company did not say where the jobs would be located.

Facebook is due to report quarterly revenue and earnings later on Wednesday after markets close in New York.

The world's largest social network, with 1.9 billion monthly users, has been turning to artificial intelligence to try to automate the process of finding pornography, violence and other potentially offensive material. In March, the company said it planned to use such technology to help spot users with suicidal tendencies and get them assistance.

However, Facebook still relies largely on its users to report problematic material. It receives millions of reports from users each week, and like other large Silicon Valley companies, it relies on thousands of human monitors to review the reports.

"Despite industry claims to the contrary, I don't know of any computational mechanism that can adequately, accurately, 100 percent do this work in lieu of humans. We're just not there yet technologically," said Sarah Roberts, a professor of information studies at UCLA who looks at content monitoring.

The workers who monitor material generally work on contract in places such as India and the Philippines, and they face difficult working conditions because of the hours they spend making quick decisions while sifting through traumatic material, Roberts said in an interview.


First Published Date: 03 May, 21:10 IST