A facial recognition tool that Amazon.com sells to web developers wrongly identified 28 members of Congress as police suspects, in a test conducted by the American Civil Liberties Union (ACLU), the organization said on Thursday.
Amazon, in response, said it took issue with the confidence settings used for its face-identification tool during the test. The findings nonetheless highlight the risks that individuals could face if police use the technology in certain ways to catch criminals.
Since May, the ACLU and other civil rights groups have pressured Amazon to stop selling governments access to Rekognition, a powerful image ID software unveiled in 2016 by the company's cloud-computing division.
The groups cited use of Rekognition by law enforcement in Oregon and Florida and warned that the tool could be used to target immigrants and people of color unfairly.
Their activism has kicked off a public debate. The president of Microsoft Corp, an Amazon rival that also offers facial recognition technology, called on Congress earlier this month to study possible regulations.
The ACLU said it wants Congress to enact a moratorium on use of the technology by law enforcement.
Facial recognition is already widely used in China for police purposes, and a number of start-up companies there - some valued at billions of dollars - are aggressively pursuing the technology.
Amazon has touted a range of uses for Rekognition, from detecting offensive content online to identifying celebrities.
"We remain excited about how image and video analysis can be a driver for good in the world," a spokeswoman for Amazon Web Services said in a statement, citing its help finding lost children and preventing crimes. She said Rekognition was normally used to narrow the field for human review, not to make final decisions.
The ACLU said it paid just $12.33 (roughly Rs. 850) to have Amazon Rekognition compare official photos of every member of the U.S. House and Senate against a database of 25,000 public arrest photos.
The technology identified "matches" for 28 members of Congress at a confidence threshold of at least 80 percent, Amazon's default setting, the ACLU said.
These matches were disproportionately people of color, according to the ACLU: some 39 percent were African-American and Latino lawmakers, versus the 20 percent of Congress who identify as people of color. It added that Rekognition could exacerbate harm because police already target people of color at an above-average rate.
"Face surveillance is flawed, and it's biased, and it's dangerous," Jacob Snow, an attorney at the ACLU of Northern California, told Reuters.
Amazon's spokeswoman said the 80 percent confidence setting was appropriate for identifying objects, not individuals. The company guides customers to set a threshold of 95 percent or higher for law enforcement activities, she said.
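The dispute between the ACLU and Amazon comes down to a similarity cutoff: the software returns a confidence score for each candidate pair of faces, and the caller decides what score counts as a "match." A minimal sketch of how that filtering works, using hypothetical scores and a generic filter rather than Rekognition's actual API:

```python
# Hypothetical similarity scores (0-100) for one probe photo compared
# against a mugshot database; values are illustrative, not real output.
candidate_matches = [
    {"subject_id": "A", "similarity": 81.2},
    {"subject_id": "B", "similarity": 96.4},
    {"subject_id": "C", "similarity": 88.0},
]

def matches_above(candidates, threshold):
    """Keep only candidates whose similarity meets the threshold."""
    return [c for c in candidates if c["similarity"] >= threshold]

# At an 80 percent cutoff, all three candidates count as "matches";
# at a 95 percent cutoff, only one does.
print(len(matches_above(candidate_matches, 80)))  # 3
print(len(matches_above(candidate_matches, 95)))  # 1
```

The same probe photo thus yields very different results depending on the threshold, which is why the choice of setting, 80 percent by default versus the 95 percent Amazon recommends for law enforcement, is central to both sides' arguments.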
© Thomson Reuters 2018