Facebook says it will no longer show health groups in recommendations

Facebook Inc will no longer show health groups in its recommendations, the social media giant announced on Thursday, saying it was crucial that people get health information from authoritative sources.

Misleading health content has racked up an estimated 3.8 billion views on Facebook over the past year, peaking during the coronavirus pandemic, advocacy group Avaaz said in a report last month.

Facebook, under pressure to curb such misinformation on its platform, has made amplifying credible health information a key element of its response.

The world's largest social network also said it would bar administrators and moderators of groups that have been taken down for policy violations from creating any new groups for a period of time.

Last month, it removed nearly 800 QAnon conspiracy groups for posts celebrating violence, showing intent to use weapons, or attracting followers with patterns of violent behavior.

Twitter also said in a tweet on Thursday that the platform had reduced impressions on QAnon-related tweets by more than 50% through its work to deamplify content and accounts associated with the conspiracy theory. In July, the social media company said it would stop recommending QAnon content and accounts in a crackdown it expected would affect about 150,000 accounts.

Twitter said such coordination could be technical - for example, an individual operating multiple accounts to tweet the same message - or social, such as using a messaging app to organize many people to tweet at the same time.

Twitter said it prohibits all forms of technical coordination, but for social coordination to break its rules, there must be evidence of physical or psychological harm, or informational harm caused by false or misleading content.
