Bringing facial recognition to Ukraine war is a bad idea

A startup’s donation of controversial tech to Ukraine could also help distract from its wider problems.

By: Bloomberg
| Updated on: Aug 22 2022, 10:55 IST
[Photo: A startup's donation of controversial tech to Ukraine could also help distract from its wider problems. (Carlos Barria/REUTERS)]

Perhaps the saying, “There is no such thing as bad publicity,” holds true for controversial technology companies. New York-based Clearview AI has been criticized by privacy advocates for years because of the way it has scraped billions of images from social-media networks to build a search engine for faces used by police departments. It was the subject of a New York Times investigation, and several countries including France and Canada have banned the company.

Still, at least 600 law-enforcement agencies have used its technology, and this week Clearview revealed it had offered the government of Ukraine free access to its “facial network” to help stave off the Russian invasion. 

Ukraine's Ministry of Defense has not said how it will use the technology, according to Reuters, which first reported the news citing Clearview Chief Executive Officer Hoan Ton-That as its main source. Ukraine's government has also not confirmed that it is using Clearview, but Reuters reported that its soldiers could potentially use the technology to weed out Russian operatives at checkpoints. Of Clearview's database of 10 billion faces, more than 2 billion come from Russia's most popular social-media network, VKontakte, theoretically allowing the company to match many Russian faces to their social profiles.

Ukraine has received several offers of help from the tech world, including from Elon Musk and satellite operator MDA Ltd. But Clearview's offer to Ukraine has, rightly, caused outrage among privacy campaigners. Chief among the concerns is that facial recognition makes mistakes. It is bad enough when that leads police to make a wrongful arrest. In a war zone, the consequences are even more starkly a matter of life and death.

There is evidence that frontline users of facial recognition often don't operate it properly. A British study of how London police used the tech to spot suspected criminals on the streets found that officers suffered from “deference to the algorithm.” They tended to agree, in other words, with whatever the software suggested. Even if police weren't sure if a face caught on camera matched a mugshot, they would assume the match must be accurate if the software said so. And while other officers sometimes challenged their colleagues if they disagreed with the face-matching software, they never challenged those who agreed with it, according to the 2019 study.

It is hard to imagine soldiers taking a more nuanced approach in the midst of pressure to protect cities under siege, and with little or no training on how to use such software.

Clearview's offer ultimately has the feel of a publicity stunt. The company denies this, saying, “Hoan Ton-That saw the suffering in Ukraine, and like people and companies from across the U.S. and the world, wanted to do what he could to help.”

In something of a Streisand effect, Clearview's controversial reputation seems, if anything, to have kept it in business. Even as its data-collection tactics have led to take-down notices from Meta Platforms Inc.'s Facebook, Alphabet Inc.'s Google and Twitter Inc., the company has forged ahead as if nothing were the matter. It recently told investors that its facial database would swell tenfold to 100 billion images, and that it was expanding into the private sector, for instance by helping to verify gig-economy workers, according to a report in the Washington Post.

But Clearview's legal issues aren't going away. The company has been plagued by lawsuits in federal courts and in several states, including New York, California, Illinois and Virginia, and it faces a wave of regulatory probes in the U.K., France, Italy and several other European countries.

Clearview said legal problems were normal in the startup world. “Just like Airbnb, Uber, PayPal and other iconic innovative startups, there is a major legal component to [our] operations early on,” a Clearview spokeswoman said. She added that privacy laws across the world tended to support exemptions for law enforcement and national security.

War has a tendency to bring out both generosity and opportunism. Clearview's offer to Ukraine may well fall into the latter camp. More broadly, though, it is dangerous to bring facial-recognition technology into a war zone, all the more so if doing so becomes the norm.

After all, Clearview similarly got its police business off the ground by offering free trials to law-enforcement employees.

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

Parmy Olson is a Bloomberg Opinion columnist covering technology. She previously reported for the Wall Street Journal and Forbes and is the author of 'We Are Anonymous.'

First Published Date: 17 Mar, 23:50 IST