Facebook moves to target misinformation before election

The company said Thursday it will restrict new political ads in the week before the election and remove posts that convey misinformation about COVID-19 and voting.

The goal, the company said, is to help people register and vote, clear up confusion about how this election will work, and take steps to reduce the chances of violence and unrest.

Facebook and other social media companies are being scrutinized over how they handle misinformation, given issues with President Donald Trump and other candidates posting false information and Russia's ongoing attempts to interfere in U.S. elections.

Facebook has long been criticized for not fact-checking political ads or limiting how they can be targeted at small groups of people.

With the nation divided, and election results potentially taking days or weeks to be finalized, there could be an increased risk of civil unrest across the country, Zuckerberg said.

In July, Trump refused to publicly commit to accepting the results of the upcoming election, as he scoffed at polls that showed him lagging behind Democratic rival Joe Biden.

Trump also has made false claims that the increased use of mail-in voting because of the coronavirus pandemic allows for voter fraud.

Under the new measures, Facebook said it will prohibit politicians and campaigns from running new election ads in the week before the election.

The company also will work with Reuters to provide official election results, making the information available both on its platform and through push notifications.

Facebook had previously drawn criticism for its ads policy, which cited freedom of expression as the reason for letting politicians like Trump post false information about voting.
