Facebook blames COVID-19 for reduced action on suicide, self-injury, and child exploitation content
Facebook says that COVID-19 has hindered its ability to remove posts about suicide, self-injury, and child nudity and sexual exploitation.
The social media giant said the decision to send content reviewers home in March had forced it to rely more heavily on tech to remove violating content.
Meanwhile, action on Instagram content that sexually exploits or endangers children decreased from 1 million to 479,400.
In addition, the firm claimed that its focus on removing harmful content meant it couldn't calculate the prevalence of violent and graphic content in its latest community standards report.
Instagram's hate speech detection rate climbed even further, from 45% to 84%, while actioned content rose from 808,900 to 3.3 million.
In other Facebook news, the company today announced new measures to stop publishers backed by political organizations from running ads disguised as news.