Facebook, which has faced criticism from all corners for its content moderation mistakes and for the massive rulebook that guides its moderators, had more than 30,000 employees working on safety and security by the end of last year. In the face of a never-ending firehose of content, moderators are expected to maintain a 95 percent accuracy rate while reviewing more than 1,000 posts per week to determine whether they violate Facebook's community standards.
The Verge's report, which is based on interviews with a dozen former and current Cognizant employees, depicts a soul-crushing, morbid environment where workers joke about self-harm, do drugs on the job, develop severe anxiety or have panic attacks because of the horrifying content they're forced to view. Most of the moderators interviewed quit after one year.
In addition, moderators told the tech news site that some colleagues have even embraced the fringe, conspiracy-laden views of the memes and posts they're forced to view each day.
Both Cognizant and Facebook, which is led by CEO Mark Zuckerberg, pushed back on some aspects of The Verge's reporting.
At a later stage of the reporting, Facebook allowed The Verge's reporter to visit the Phoenix site after telling her that the moderators' experiences don't reflect those of most contractors, either in Phoenix or worldwide. New posters with positive messages had been put up, and several content moderators who spoke to The Verge expressed satisfaction with their jobs and how they're treated, saying that truly disturbing, violent content makes up only a small fraction of what they review.
A former contract content moderator sued Facebook in September, claiming that her work for the tech giant left her with PTSD.