Facebook says it removed over 7M pieces of harmful COVID-19 misinformation last quarter
Sending its content moderators to work from home in March amid the pandemic led the company to remove less harmful material from Facebook and Instagram around suicide, self-injury, child nudity and sexual exploitation.
The COVID-19 pandemic affected Facebook's ability to remove harmful and forbidden material from its platforms, the company said, in part because reviewers working from home were uncomfortable looking at disturbing images with their families around.
The company said Tuesday that it has since brought many reviewers back to reviewing content from home and, where it is safe, a smaller number back into offices.
Guy Rosen, Facebook's vice president of integrity, added that improvements to its technology enabled it to take action on more content in some areas and increase its proactive detection rate in others.
Just last week, Facebook took down a post from President Trump's personal page containing a Fox News interview in which he said that children are "almost immune" to COVID-19.
It was the first time Facebook had removed a post by the president for violating its policies on COVID-19 misinformation.
In late June, Facebook removed a network of accounts, groups and pages on Facebook and Instagram connected to the "boogaloo" anti-government movement, which encourages violence in the United States.
The social media giant also designated boogaloo as a dangerous organization, giving it the same classification as terrorist and hate groups.