It has been a difficult week for Facebook. The Wall Street Journal has been publishing revelations about Mark Zuckerberg’s company after obtaining a series of damning internal reports.

The latest investigation concerns a change made in 2018 to the News Feed recommendation algorithm, which selects the content shown to each user. At the time, the tech giant’s executives announced these changes publicly.

The idea was to limit interactions with content produced by “professionals” in order to show more posts from friends and family. All of this was justified on the grounds of protecting users’ mental health. According to an internal memo that the Journal’s reporters were able to consult, it was in fact a strategy to counter a decline in use of the platform.

“Violent content is abnormally prevalent”

The consequences proved deeply problematic: according to the company’s own researchers, the change had the opposite effect, giving greater prominence to inflammatory posts:

Misinformation, toxicity, and violent content were abnormally prevalent in the reposted content. (…) Our approach has had unhealthy collateral effects on important parts of the content, especially in politics and news. Our responsibility is growing. Many of the people we spoke to told us that they feared, in the long term, the negative effects that this algorithm could have on democracy.

The consequences were significant for certain political organisations and media outlets, which revised their strategies and turned to sensationalist, outrage-driven communication to boost engagement. Unfortunately, we know the rest of the story, including the proliferation of online disinformation campaigns.


Facebook did not fail to respond to the American newspaper’s criticism. Quoted by Business Insider, a spokesperson defended the company:

Is a change in rankings the source of the world’s divisions? No. The research shows that some partisan divisions in our society have been growing for decades, long before platforms like Facebook existed. It also shows that meaningful engagement with friends and family on our platform is better for people’s well-being than the alternative.

Recall that this week, the Wall Street Journal released two other investigations into Mark Zuckerberg’s firm. The first concerns the existence of differentiated moderation rules that would be applied to 5.8 million VIP users. The second concerns Instagram and its very negative impact on the mental health of teenage girls.
