Facebook: Pandemic hurt enforcement on suicide, child nudity
The COVID-19 pandemic affected Facebook's ability to remove harmful and forbidden material from its platforms, the company said Tuesday.
Sending its content moderators to work from home in March amid the pandemic meant the company removed less material from Facebook and Instagram that violated its rules around suicide, self-injury, child nudity and sexual exploitation.
With its human reviewers at home, Facebook relied more on technology, rather than people, to find posts, photos and other content that violates its rules.
The company said Tuesday that it has since brought many reviewers back to working online from home and, where it is safe, a smaller number into offices.
But Facebook also said its systems have gotten better at proactively detecting hate speech, meaning it is found and removed before anyone sees it. The social network said that's because it has expanded its automation technology into Spanish, Arabic and Indonesian and made improvements to its English detection technology.
In the Netherlands and Belgium, images of Black Pete, or Zwarte Piet, that use blackface features and stereotyping characteristics will also be removed, the company said.
White people often don blackface makeup, red lipstick and curly black wigs to play Black Pete during street parties honoring Sinterklaas.
"It is a happy day: From today, Black Pete is officially no longer welcome worldwide on Facebook and Instagram," the group said.
Populist lawmaker Geert Wilders tweeted a photo of a Black Pete shortly after the Facebook announcement, accompanied by the text: "Facebook and Instagram ban images of Zwarte Piet."