A content moderator at Facebook filed a class action lawsuit against the company on Friday, claiming it does not protect employees from the mental trauma caused by the graphic images they see online every day.
“Every day, Facebook users post millions of videos, images and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder,” according to the complaint.
Facebook, which currently employs at least 7,500 content moderators, has put workplace safety standards in place to protect them, including counseling and mental health support, changes to how traumatic images are displayed and training to help moderators recognize the symptoms of PTSD. But the lawsuit claims Facebook ignores its own workplace safety standards and violates California law by requiring its moderators to work in “dangerous conditions that cause debilitating physical and psychological harm.”
“Facebook does not provide its content moderators with sufficient training or implement the safety standards it helped develop,” the complaint said. “Facebook content moderators review thousands of trauma-inducing images each day, with little training on how to handle the resulting distress.”
The lawsuit details that the plaintiff, Scola, can have her PTSD symptoms triggered when she touches a computer mouse, enters a cold building, sees violence on TV or hears loud noises. Remembering or discussing the graphic imagery she saw on Facebook is also a trigger.
Facebook said in July that all content reviewers have access to mental health resources, including onsite counselors, and that all reviewers have full health care benefits. The company did not immediately respond to a request for comment.