Facebook partially documents its content recommendation system (TechCrunch)

Algorithmic recommendation systems on social media sites like YouTube, Facebook and Twitter have shouldered much of the blame for the spread of misinformation, propaganda, hate speech, conspiracy theories and other harmful content.

Facebook, in particular, has come under fire in recent days for allowing QAnon conspiracy groups to thrive on its platform and for helping militia groups to scale membership.
Today, Facebook is attempting to combat claims that its recommendation systems are in any way at fault for how people are exposed to troubling, objectionable, dangerous, misleading and untruthful content.

In new documentation available in Facebook's Help Center and Instagram's Help Center, the company details how Facebook's and Instagram's algorithms work to filter out content, accounts, Pages, Groups and Events from its recommendations.

The Recommendation Guidelines typically fall under Facebook's efforts in the "reduce" area, and are designed to maintain a higher standard than Facebook's Community Standards, because they push users to follow new accounts, groups, Pages and the like.

One obvious category of content that may not be eligible for recommendation includes content that would impede Facebook's ability to foster a safe community, such as content focused on self-harm, suicide, eating disorders, violence, sexually explicit content, regulated content like tobacco or drugs, or content shared by non-recommendable accounts or entities.

Facebook also claims not to recommend sensitive or low-quality content, content users frequently say they dislike and content associated with low-quality publishing.

In addition, Facebook claims it won't recommend fake or misleading content, such as content making claims found false by independent fact-checkers, vaccine-related misinformation and content promoting the use of fraudulent documents.

It says it will also try not to recommend accounts or entities that recently violated Community Standards, shared content Facebook tries not to recommend, posted vaccine-related misinformation, engaged in purchasing Likes, have been banned from running ads, posted false information or are associated with movements tied to violence.

Facebook's search engine favors engagement and activity, like how many members a group has or how often users post, not how closely its content aligns with accepted truths or medical guidelines.
