The Verge spotlights the hidden cost of being a Facebook content moderator, a role Facebook outsources to third parties to keep the platform safe for users

  • 4 min read
  • 26 Feb 2019

Facebook has been in the news in recent years over data leaks and data privacy concerns. This time the company is under scrutiny for the deplorable working conditions of its content moderators. The reviewers are so deeply affected by the content they screen that some cope with their PTSD by having sex and using drugs at work, reports The Verge in a compelling and horrifying insight into the lives of content moderators who work as contract workers at Facebook’s Arizona office.

There was a similar report against Facebook last year: in September, an ex-employee filed a lawsuit against the company for not providing enough protection to the content moderators responsible for reviewing disturbing content on the platform.

The platform hosts millions of videos and images depicting child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder. Facebook relies on machine learning augmented by human content moderators to keep the platform safe for users, which means any image that violates the company’s terms of use is removed from the platform.

In a statement to CNBC, a Facebook spokesperson said, "We value the hard work of content reviewers and have certain standards around their well-being and support. We work with only highly reputable global partners that have standards for their workforce, and we jointly enforce these standards with regular touch points to ensure the work environment is safe and supportive, and that the most appropriate resources are in place."

The company has also published a blog post about its work with partners like Cognizant and the steps it is taking to ensure a healthy working environment for content reviewers.

As reported by The Verge, the contracted moderators get one 30-minute lunch, two 15-minute breaks, and nine minutes of "wellness time" per day. Much of this time is spent queuing for the bathroom, where three stalls per restroom serve hundreds of employees.

The working environment is such that moderators cope with stress by telling dark jokes about committing suicide and smoking weed during breaks to numb their emotions. According to the report, it is a place where employees can be fired for making just a few errors a week, and team leaders make matters worse by micromanaging moderators’ bathroom and prayer breaks.

The moderators are paid $15 per hour to review content ranging from offensive jokes to potential threats to videos depicting murder.

A Cognizant spokesperson said, “The company has investigated the issues raised by The Verge and previously taken action where necessary and have steps in place to continue to address these concerns and any others raised by our employees. In addition to offering a comprehensive wellness program at Cognizant, including a safe and supportive work culture, 24x7 phone support and onsite counselor support to employees, Cognizant has partnered with leading HR and Wellness consultants to develop the next generation of wellness practices."

Public reaction to the news has been largely negative, with users condemning how the company is being run.

https://twitter.com/waltmossberg/status/1100245569451237376

https://twitter.com/HawksNest/status/1100068105336774656

https://twitter.com/likalaruku/status/1100194103902523393

https://twitter.com/blakereid/status/1100094391241170944

People are angry that content moderators at Facebook endure such trauma in their roles. Some believe compensation should be offered to those suffering from PTSD as a result of working in high-stress roles in companies across industries.

https://twitter.com/hypatiadotca/status/1100206605356851200

According to Kevin Collier, a cyber reporter, Facebook is underpaying and overworking content moderators in a desperate attempt to rein in abuse on the platform it created.

https://twitter.com/kevincollier/status/1100077425357176834

One user tweeted, “And I've concluded that FB is run by sociopaths.” YouTube has rolled out a feature in the US that displays notices below videos uploaded by news broadcasters that receive government or public money. Alex Stamos, former Chief Security Officer at Facebook, highlighted something similar with reference to Facebook: according to him, Facebook needs a state-sponsored label, and people should know the human cost of policing online humanity.

https://twitter.com/alexstamos/status/1100157296527589376

For more details, check out the full report by The Verge.

Ex-employee on contract sues Facebook for not protecting content moderators from mental trauma

NIPS 2017 Special: Decoding the Human Brain for Artificial Intelligence to make smarter decisions

Facebook and Google pressurized to work against ‘Anti-Vaccine’ trends after Pinterest blocks anti-vaccination content from its pinboards