5 Takeaways From Facebook’s Leaked Moderation Documents

With only a few seconds to spare, Facebook moderators have to make the call even when the text accompanying a laughing emoji is in a language they don't understand.

To help with those decisions, Facebook has created a list of guidelines for what its two billion users should be allowed to say.

But for the thousands of moderators across the world, faced with navigating this byzantine maze of rules as they monitor billions of posts per day in over 100 languages, clarity is hard to come by.

The New York Times obtained 1,400 pages of these guidelines and found problems not just in how the rules are drafted but in how the moderation itself is carried out.

Though the company does consult outside groups, the rules are set largely by young lawyers and engineers, most of whom have no experience in the regions of the world they are making decisions about.

Facebook employees say they have not yet figured out, definitively, what sorts of posts can lead to violence or political turmoil.

One guideline document appears to contain errors about Indian law, advising moderators that almost any criticism of religion should be flagged as probably illegal.

In Myanmar, a paperwork error allowed a prominent extremist group, accused of fomenting genocide, to stay on the platform for months.

Facebook is growing more assertive about barring groups and people, as well as types of speech, that it believes could lead to violence.

And by relying on outsourced workers to do most of the moderation, Facebook can keep costs down even as it sets rules for over two billion users.
