
Facebook knew its algorithms promoted extremist groups, but did nothing: Report

According to a WSJ report, Facebook and Mark Zuckerberg knew about the polarising behaviour on the site and did nothing about it. 

By: HT TECH
| Updated on: Aug 20 2022, 21:12 IST
Facebook has been struggling for a while dealing with extremist content on its platform and that's not new. (REUTERS)

Facebook has been struggling for a while with extremist content on its platform, and that's not new. From the 2016 US elections, when Russian operatives used polarising ads to manipulate American voters, to the violence in Myanmar, the social networking platform has neither cracked down hard on these instances nor taken a clear stand.

A report in the Wall Street Journal by Jeff Horwitz and Deepa Seetharaman suggests that Facebook "knew that its algorithm was dividing people, but did very little to address the problem". The report notes that internal presentations from 2018 illustrated how Facebook's algorithm "aggravated polarising behaviour in some cases". One slide warned that if the algorithms were left unchecked, they would feed users increasingly divisive content.

“Our algorithms exploit the human brain's attraction to divisiveness. If left unchecked, Facebook would feed users more and more divisive content in an effort to gain user attention & increase time on the platform,” the slide read.

The WSJ report says that Zuckerberg and his team shelved the presentation and "decided not to apply its observations" to any of their products. Facebook's policy chief, Joel Kaplan, was of the opinion that "these changes might have affected conservative users and publications".

Facebook has said that it has "learned a lot" since 2016 and has built a "robust" integrity team to deal with issues like these:

“We've learned a lot since 2016 and are not the same company today. We've built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform's impact on society so we continue to improve.”

The WSJ report notes that even before Facebook formed this 'integrity team', a researcher at the social media platform, Monica Lee, found in 2016 that "64% of all extremist group joins are due to our recommendation tools".

Facebook tried to tackle the issue by tweaking the algorithm and creating temporary sub-groups to host discussions; however, these tweaks were considered 'antigrowth' and were shot down. In the end, Facebook did very little, almost nothing in fact, and chose instead to uphold 'free speech', a principle Zuckerberg has strongly rallied for lately.

However, Facebook has since put its Oversight Board in place, a team of people who can overrule the social network's decisions with respect to content moderation. With that in place, maybe the company will be able to root out errors before they are pointed out by others.


First Published Date: 27 May, 17:08 IST