Twitter draws flak over bias in photo preview algorithm

    Several Twitter users posted images featuring photos of a Black person and a white person to demonstrate a bias in the algorithm.
    By: HT TECH
    | Updated on: Sep 21 2020, 10:10 IST
    Racial bias found in Twitter photo algorithm (REUTERS)

    Twitter is drawing flak over the algorithm that decides how photos are cropped and displayed on users' timelines. The algorithm is said to be biased, preferring the faces of white people over those of people with darker skin in preview crops. Twitter acknowledged the issue and promised to rework the algorithm.

    The apparent bias was discovered over the weekend when several Twitter users posted photos featuring a Black person and a white person. Twitter's preview of the photos in users' timelines mostly favoured the white person.

    Twitter user Kim Sherrell tested different permutations and combinations to show that the algorithm was biased. One of the tweets compared photos featuring Barack Obama and Mitch McConnell; Twitter consistently preferred McConnell over Obama in the preview. The algorithm did not behave the same way when the colours in the image were inverted.

    Researcher Matt Blaze pointed out that the bias depends on which official Twitter app is used. Tweetdeck, for instance, produced more neutral results.

    Twitter's Liz Kelly was quick to respond to the reports of bias. She promised that the company would open-source its re-evaluated work so that everyone can review and replicate it.

    “thanks to everyone who raised this. we tested for bias before shipping the model and didn't find evidence of racial or gender bias in our testing, but it's clear that we've got more analysis to do. we'll open source our work so others can review and replicate,” she said in a tweet.

    ALSO READ: Twitter will make key US political accounts adopt tighter account security

    Twitter engineer Zehan Wang said that the company had conducted bias studies before release in 2017.

    “We'll look into this. The algorithm does not do face detection at all (it actually replaced a previous algorithm which did). We conducted some bias studies before release back in 2017. At the time we found that there was no significant bias between ethnicities (or genders),” he said.

