Facebook makes changes in its ongoing attempt to limit misinformation

For years, Facebook has grappled with the spread of controversial content on its platform, such as misinformation about elections, anti-vaccination stories, violence and hate speech.

Under one of the changes, when users in a group frequently share content that has been deemed false by Facebook's third-party fact checkers, that group's content will be pushed lower in News Feed so fewer people see it.

With another new feature, Facebook hopes to reduce the spread of websites that are disproportionately popular on Facebook compared with other parts of the web.

According to Facebook, some of the posts that appeared to be anti-vaccination involved people asking questions, seeking information and having conversations around the topic.

Renee Murphy, a principal analyst at research firm Forrester who covers security and risk, said that while Facebook's steps are positive, they don't do nearly enough to address some of the company's larger problems.

Meanwhile, Facebook-owned Instagram is trying to curb the spread of inappropriate posts that don't violate its policies.

For example, a sexually suggestive photo would still appear in the feeds of users who follow that account, but it may no longer be recommended on the Explore page or on hashtag pages.

WhatsApp, another Facebook-owned app, has a similar function as part of its effort to stop the spread of misinformation. WhatsApp has had major problems with viral hoax messages spreading on the platform, which have resulted in more than a dozen lynchings in India.

Forrester's Murphy believes the company should do more to address major issues such as violence being livestreamed and going viral on the platform.
