These changes are new, small inconveniences piled atop frustrating user-experience decisions that Facebook has been making for more than a decade. But they are the latest example of how Facebook tries to shape every user's experience through black-box algorithms, and how this approach harms not only individuals but the world at large. At this point, Facebook is working so hard to ignore expert advice on how to reduce toxicity that it looks like Facebook doesn't want to improve in any meaningful way.
When kept to a reasonable size and managed properly, groups can be incredibly beneficial, especially when their members might not have the time, resources, or knowledge to put together independently hosted forum solutions.
Having that kind of content also appear in your personal newsfeed is apparently even worse.
At the same time, Facebook has introduced a slew of tweaks to the user interface on both Web and mobile that make it significantly harder to promote high-quality engagement on the platform, particularly in groups.
Meaningful, thoughtful conversation, even in small, serious, well-moderated groups, has become almost impossible to maintain.
Even worse, the WSJ found that Facebook was fully aware that the algorithms used for group recommendations were a serious problem.
Violent, far-right extremists in the United States rely on Facebook groups as a way to communicate, and Facebook seems to be doing very little to stop them.
The consequences of Facebook's failures to take content seriously just keep piling up, and yet the change to promote groups will create even more fertile ground for the spread of extremism and misinformation.