Facebook to add 3000 moderators after video killings

by Joseph Anthony

Facebook will hire 3,000 more people over the next year to respond to reports of inappropriate material on the social media network and speed up the removal of videos showing murder, suicide and other violent acts, Chief Executive Mark Zuckerberg said on Wednesday.

The hiring spree is an acknowledgement by Facebook that, at least for now, it needs more than automated software to improve monitoring of posts. Facebook Live, a service that allows any user to broadcast live, has been marred since its launch last year by instances of people streaming violence.

Zuckerberg, the company's co-founder, said in a Facebook post the workers will be in addition to the 4,500 people who already review posts that may violate its terms of service.

Last week, a father in Thailand broadcast himself killing his daughter on Facebook Live, police said. After more than a day, and 370,000 views, Facebook removed the video. Other videos from places such as Chicago and Cleveland have also shocked viewers with their violence.

Zuckerberg said: "We're working to make these videos easier to report so we can take the right action sooner – whether that's responding quickly when someone needs help or taking a post down."

The 3,000 workers will be new positions and will monitor all Facebook content, not just live videos, the company said. The company did not say where the jobs would be located.

Facebook is due to report quarterly revenue and earnings later on Wednesday after markets close in New York.

The worldโ€™s largest social network, with 1.9 billion monthly users, has been turning to artificial intelligence to try to automate the process of finding pornography, violence and other potentially offensive material. In March, the company said it planned to use such technology to help spot users with suicidal tendencies and get them assistance.

However, Facebook still relies largely on its users to report problematic material. It receives millions of reports from users each week, and like other large Silicon Valley companies, it relies on thousands of human monitors to review the reports.

"Despite industry claims to the contrary, I don't know of any computational mechanism that can adequately, accurately, 100 percent do this work in lieu of humans. We're just not there yet technologically," said Sarah Roberts, a professor of information studies at UCLA who looks at content monitoring.

The workers who monitor material generally work on contract in places such as India and the Philippines, and they face difficult working conditions because of the hours they spend making quick decisions while sifting through traumatic material, Roberts said in an interview.

Chijos News is an independent online publication that provides readers with the latest breaking Nigerian news, world news, entertainment, sports, business, and many more.
