Amazon facial AI matched politicians with criminals in test | Fin24

Jul 27 2018 07:40
Spencer Soper, Bloomberg

Amazon's facial recognition software falsely matched 28 members of the US Congress to criminal mugshots of different people, according to a test conducted by the American Civil Liberties Union, which argues the technology's use by law enforcement may violate civil rights.

The ACLU cited the results of an experiment in which it used Amazon’s Rekognition tool to compare pictures of all members of the House and Senate against a database of 25 000 arrest photos. The subsequent false matches disproportionately affected members of the Congressional Black Caucus, it said, reinforcing concerns that the technology is less effective on people with darker skin. Some artificial intelligence software that’s used in facial recognition has been shown to be racially biased because it was trained using images that included relatively few minorities.

“The members of Congress who were falsely matched with the mugshot database we used in the test include Republicans and Democrats, men and women, and legislators of all ages, from all across the country,” the ACLU said in a blog post. “These results demonstrate why Congress should join the ACLU in calling for a moratorium on law enforcement use of face surveillance.”

The test results, released Thursday, provide further ammunition to critics of the nascent technology. Government use of facial-recognition software has raised concerns among civil rights groups that maintain it can be used to quiet dissent and target groups such as undocumented immigrants and black rights activists.

In an emailed statement, a spokesperson for Amazon Web Services questioned the ACLU's methodology. The ACLU's test counted as a match any result where the system reported at least 80% confidence, which the organisation said was the default setting for the facial recognition system Amazon offers to the public.

But Amazon says it guides law enforcement customers to register matches only at levels of 95% confidence or above. The lower standard is intended for applications with lower stakes, and "wouldn’t be appropriate for identifying individuals with a reasonable level of certainty," according to the company. It said it’s still confident that image and video analysis will prove useful to law enforcement.
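The dispute comes down to where the caller sets the similarity cutoff: the service returns a confidence score for each candidate face, and a lower cutoff admits more (and more dubious) matches. The mechanics can be sketched with a toy example using mock scores; the figures below are invented for illustration, not real Rekognition output (the actual service exposes the cutoff as the `SimilarityThreshold` parameter of `CompareFaces` and the `FaceMatchThreshold` parameter of `SearchFacesByImage`):

```python
# Illustrative sketch only: mock similarity scores, not real Rekognition
# results. The service scores each candidate face; the caller picks the
# cutoff (80% was the default the ACLU used; 95% is what Amazon says it
# recommends for law enforcement).

def matches_at_threshold(scores, threshold):
    """Return candidate matches whose similarity meets the threshold."""
    return [(name, score) for name, score in scores if score >= threshold]

# Hypothetical scores for one probe photo against an arrest-photo database.
candidates = [("mugshot_0412", 97.1), ("mugshot_2290", 88.4),
              ("mugshot_0077", 81.9), ("mugshot_1503", 62.0)]

print(matches_at_threshold(candidates, 80))  # three "matches" at the default
print(matches_at_threshold(candidates, 95))  # only one at the stricter cutoff
```

In this sketch, dropping the cutoff from 95 to 80 triples the match count for the same probe photo, which is the crux of the methodological disagreement between Amazon and the ACLU.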

Mug shots

Law enforcement in the US has made wide use of facial recognition for a range of tasks, from running mug shots against databases of driver's license photos to scanning people walking by surveillance cameras. In 2016, researchers at Georgetown University found that at least five major police departments claimed to run real-time face recognition on footage from street cameras, or expressed interest in doing so. Almost without exception, the agencies buying facial-recognition technology from private companies didn't require the vendors to provide evidence that it was accurate.

Amazon itself has defended the technology, saying law enforcement is just one application and that the software can also be used to find abducted people and track down lost children. In a June blog post, Amazon’s AI general manager, Matt Wood, wrote: “There have always been and will always be risks with new technology capabilities. Each organisation choosing to employ technology must act responsibly or risk legal penalties and public condemnation.”

Amazon, Microsoft and Alphabet have all faced blowback over AI tools they sell to the government. Microsoft in July urged lawmakers to regulate such tools to prevent abuse.

At Alphabet's Google, employees revolted over the company's work on Project Maven, a Defense Department initiative to apply AI to drone footage. Chief Executive Officer Sundar Pichai in June released a set of AI principles pledging that the company would not apply the technology to weapons, illegal surveillance or technologies that cause harm.

The ACLU, however, has made Amazon a primary target. In May, it released a report based on public records requests that revealed Amazon marketing materials promoting the technology as a law enforcement tool. Law enforcement agencies in Florida and Oregon have used the service for surveillance.

A presentation by Amazon’s cloud computing business indicated its Rekognition system can slash the time it takes to identify individuals in photos and video. The company’s technology does it in minutes rather than the days it takes when images are sent to different law enforcement agencies for manual review, according to marketing documents obtained by the ACLU.



