Miscellaneous Social Media

Facebook Hiring 3,000 Additional People to Screen Violent Content

Facebook is hiring 3,000 new people in a bid to more quickly review and remove violent and offensive content.

CEO Mark Zuckerberg said the additional 3,000 people will join the company's global community operations team, growing the department from 4,500 to 7,500.

The announcement comes as Facebook faces criticism over live video of murders and suicides airing on its platform, including last week’s tragedy in which a man in Thailand killed his baby daughter before taking his own life.

“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook — either live or in video posted later. It’s heartbreaking, and I’ve been reflecting on how we can do better for our community,” Zuckerberg said in a post to his wall.

Photo by Brian Solis: Facebook CEO Mark Zuckerberg giving his F8 keynote.

“If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”

Zuckerberg, answering critics who have suggested Facebook Live be shut down entirely, said the feature is largely a force for good and, in some cases, can even help people in need.

“Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate,” he said. “No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need.”

While Zuckerberg’s plan to hire more moderators is a step in the right direction, many are pointing out the toll such a job can take on an individual’s mental health.

UCLA assistant professor Sarah Roberts told The Daily Beast that moderating violent and offensive content is a psychologically “tough” job.

People who remain in such jobs long-term either suffer from “burnout or becoming desensitized, neither of which is a good outcome,” Roberts told the publication. “It can manifest in a number of ways, going all the way to the case of the two Microsoft employees who are suing Microsoft for what they say are debilitating cases of PTSD that have rendered them completely disabled.”

Microsoft has disputed the lawsuit’s claims, saying it provides the “resources and support” its moderators need, “including an individual wellness plan.”

“We view it as a process, always learning and applying the newest research about what we can do to help support our employees even more,” a Microsoft spokesperson said in an e-mailed statement.

Facebook told The Daily Beast it too has a psychological support program for its content moderators.

The social media site has also promised to make it easier for:

  • Users to report offensive content.
  • Reviewers to determine which posts violate Facebook’s standards.
  • Reviewers to contact law enforcement if someone needs help.

About the author

Jennifer Cowan

Jennifer Cowan is the Managing Editor for SiteProNews.
