Twitter draws flak over bias in photo preview algorithm
Several Twitter users posted images featuring photos of a Black person and a white person to demonstrate a bias in the algorithm.
Twitter is drawing flak over the algorithm that decides how photos are cropped and displayed on users' timelines. Users allege that the algorithm is biased, favouring the faces of white people over those of people with darker skin in photo previews. Twitter has acknowledged the issue and promised to rework the algorithm.
The apparent bias was discovered over the weekend when several Twitter users posted photos featuring a Black person and a white person. Twitter's photo preview in users' timelines mostly showed the white person.
Twitter user Tony Arcieri ran the experiment in different permutations and combinations to conclude that the algorithm was biased. In one of the tweets, he compared photos featuring Barack Obama and Mitch McConnell. Twitter's preview consistently picked McConnell over Obama. The algorithm did not behave the same way when he inverted the colours in the images.
Trying a horrible experiment... Which will the Twitter algorithm pick: Mitch McConnell or Barack Obama? pic.twitter.com/bR1GRyCkia — Tony “Abolish (Pol)ICE” Arcieri 🦀 (@bascule) September 19, 2020
Researcher Matt Blaze pointed out that the bias appears to depend on which official Twitter app a user is on. TweetDeck, for instance, produced more neutral results.
thanks to everyone who raised this. we tested for bias before shipping the model and didn't find evidence of racial or gender bias in our testing, but it's clear that we've got more analysis to do. we'll open source our work so others can review and replicate. https://t.co/E6sZV3xboH — liz kelley (@lizkelley) September 20, 2020
Twitter's Liz Kelley was quick to respond to the reports of bias. She promised that the company will open source its re-evaluated work so that everyone can review and replicate it.
“thanks to everyone who raised this. we tested for bias before shipping the model and didn't find evidence of racial or gender bias in our testing, but it's clear that we've got more analysis to do. we'll open source our work so others can review and replicate,” she said in a tweet.
Twitter engineer Zehan Wang said that the company had conducted bias studies before release in 2017.
“We'll look into this. The algorithm does not do face detection at all (it actually replaced a previous algorithm which did). We conducted some bias studies before release back in 2017. At the time we found that there was no significant bias between ethnicities (or genders),” he said.