Amazon to pause use of facial recognition software by cops
Amazon said it will implement a one-year moratorium on police use of its facial recognition software, a major course change for a company that has been one of the most strident defenders of the controversial technology.
The company will pause law enforcement use of the software to give lawmakers time to regulate a technology that has stirred debate for years and shined an uncomfortable spotlight on Amazon’s cloud-computing division. The move comes in the midst of protests about police brutality and bias after an officer killed an unarmed black man, George Floyd. Facial recognition technology has been shown in experiments to sometimes have difficulty identifying people with darker skin, recalling for activist groups past government overreach that infringed on civil liberties.
Amazon Web Services, the company’s cloud-computing group, in 2016 released Rekognition, a software service designed to identify objects in still images and video, including the ability to match a face with images in a database without taking the time to manually compare images.
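The matching step described above can be illustrated conceptually: face-recognition systems typically reduce each image to a numeric embedding and report the database identity whose embedding is most similar to the probe's, subject to a confidence threshold. The sketch below is purely illustrative, not Amazon's actual implementation; the function names, the toy embeddings, and the 0.9 threshold are all invented for the example.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, gallery, threshold=0.9):
    """Return the gallery identity most similar to the probe embedding,
    or None if no candidate clears the confidence threshold."""
    best_id, best_score = None, -1.0
    for identity, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id if best_score >= threshold else None

# Toy embeddings; real systems derive these from images with a neural network.
gallery = {
    "person_a": [0.9, 0.1, 0.2],
    "person_b": [0.1, 0.8, 0.5],
}
probe = [0.88, 0.12, 0.22]
print(best_match(probe, gallery))  # → person_a
```

The threshold is where accuracy disputes like the ones described below tend to surface: set it low and the system returns more false matches, which fall unevenly across demographic groups when the underlying embeddings are less reliable for some faces than others.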
Rekognition isn’t the only such software. Amazon rivals such as Microsoft Corp. and Google have similar capabilities. But Amazon’s software became the focus of an intense debate about the potential for powerful new software to undermine human rights after the American Civil Liberties Union called out the risks that such software would misidentify people. The group highlighted Amazon’s relationships with a sheriff’s office in Oregon and the city of Orlando, two engagements Amazon had touted in marketing materials.
Nina Lindsey, an Amazon spokeswoman, declined to comment beyond a two-paragraph blog post announcing the move.
“We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge,” the company said. “We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested.”
In a sweeping police-reform bill introduced Monday, House and Senate Democrats included a provision that would block real-time facial recognition analysis of federal police body camera footage.
Amazon said other organizations, including those using facial recognition to combat human trafficking, will be able to keep using the software. Rekognition runs on Amazon servers and is delivered to customers as an internet service, making it, in theory, relatively simple for Amazon to suspend access for police users. It’s unclear how many law enforcement agencies were using Rekognition. In an interview for a PBS Frontline investigation that aired earlier this year, AWS chief Andy Jassy said he didn’t know the total number of police departments using Rekognition.
“It’s sort of the first, real, meaningful concession we’ve seen from Amazon allowing that use of facial recognition by police might not be good for communities” harmed by biased policing, said Shankar Narayan, who expressed concerns about Rekognition to Amazon officials while at the ACLU of Washington, which he left earlier this year. “The move shows that Amazon is vulnerable to public pressure and optics,” said Narayan, a co-founder of MIRA, an organization working to give civil society groups a greater say in how new technologies are used.
The pressure on Amazon intensified after a January 2019 study by two AI researchers showed the software made more mistakes when used on people with darker skin, particularly women. Amazon disputed the conclusions and methodology of the paper, authored by Inioluwa Deborah Raji and Joy Buolamwini, leading some of the field’s top AI scientists, including Turing Award winner Yoshua Bengio, to criticize both Amazon’s sale of the product to police and its treatment of Raji and Buolamwini. Separately, the ACLU tested the software on members of Congress and found it falsely matched 28 of them with mugshots, disproportionately selecting minority lawmakers.
Amazon, which has long been reluctant to bow to outside pressure on public policy issues, claimed that those studies didn’t accurately reflect the capabilities of its software. The company has also said there have been no reported cases of law enforcement abuse of Rekognition, though Amazon’s ability to audit the software’s use is limited by AWS’s encryption and policies against examining customer data.
“We believe it is the wrong approach to impose a ban on promising new technologies because they might be used by bad actors for nefarious purposes in the future,” Matt Wood, an executive in Amazon’s machine learning group, said in a 2018 blog post. “The world would be a very different place if we had restricted people from buying computers because it was possible to use that computer to do harm.”
Even as Amazon continued to defend Rekognition, it began trying to fix its deficiencies, conceding at its own trade shows that there was room for improvement. Amazon eventually offered suggestions on potential facial recognition regulation, but resisted calls for a moratorium on its use as cities like San Francisco began to implement restrictions or wholesale bans on the technology.
Other technology builders have proceeded more cautiously. Google Cloud Platform stopped short of selling facial recognition as an off-the-shelf service, saying in late 2018 that the company wanted to allow more time to work through “important technology and policy questions.” International Business Machines Corp. earlier this week said it would also no longer sell general-purpose facial recognition and analysis software. Chief Executive Officer Arvind Krishna said in a letter to Congress that the company opposed uses of technology, including facial recognition, for mass surveillance, racial profiling or violations of basic human rights.
Amazon’s blog post didn’t mention Ring, the video doorbell subsidiary that in recent years has established ties to police departments around the U.S. and allows investigators to request footage from users. Activists have held up those arrangements, which cover more than 1,300 law enforcement agencies, as evidence that the company’s public statements expressing support for the Black Lives Matter movement are less than authentic. The Electronic Frontier Foundation earlier Wednesday introduced a petition demanding Ring end its police partnerships.