US study finds racial bias in facial recognition tools

Many facial recognition algorithms wrongly identified African-American and Asian faces 10 to 100 times more often than Caucasian faces

By: HT CORRESPONDENT
| Updated on: Dec 20 2019, 20:21 IST
A recent US government study found that many facial recognition systems misidentify people of colour more often than white people. REUTERS/Damir Sagolj/Files

According to a recent US government study, many facial recognition systems misidentify people of colour more often than white people. The findings are likely to deepen skepticism of a technology that is widely used by law enforcement agencies across the world.

The study by the National Institute of Standards and Technology (NIST) found that, in a type of database search known as 'one-to-one' matching, "many facial recognition algorithms falsely identified African-American and Asian faces 10 to 100 times more than Caucasian faces".

The study also found that "African-American females are more likely to be misidentified in 'one-to-many' matching, which can be used for identification of a person of interest in a criminal investigation".

The NIST study offers clear evidence that face matching struggles across demographics, even as some companies have downplayed earlier findings of bias in related technology that can guess an individual's gender.

Joy Buolamwini, the founder of the Algorithmic Justice League, called the report "a comprehensive rebuttal" to those who say that artificial intelligence (AI) bias is no longer an issue. The study also comes at a time of growing discontent over the technology in the United States, with "critics warning it can lead to unjust harassment or arrests".

For the report, NIST tested 189 algorithms from 99 developers, excluding companies such as Amazon.com that did not submit one for review. NIST studied the algorithms detached from the cloud and proprietary training data, meaning what it tested could differ from what companies sell.

China's SenseTime, an AI startup valued at more than $7.5 billion, had "high false match rates for all comparisons" in one of the tests, the NIST report said.

SenseTime's algorithm "produced a false positive more than 10% of the time when looking at photos of Somali men, which, if deployed at an airport, would mean a Somali man could pass a customs check one in every 10 times he used passports of other Somali men".

Yitu, another AI startup from China, was more accurate than SenseTime and showed little racial skew.

According to the study, Microsoft Corp had "almost 10 times more false positives for women of color than men of color in some instances during a one-to-many test. Its algorithm showed little discrepancy in a one-to-many test with photos just of black and white males".

Congressman Bennie Thompson, chairman of the US House Committee on Homeland Security, said that these findings of bias were "worse than feared, at a time when customs officials are adding facial recognition to travel checkpoints".

"The administration must reassess its plans for facial recognition technology in light of these shocking results," Thompson said.


First Published Date: 20 Dec, 18:17 IST