Amazon's facial recognition software, called Rekognition, is being tested by the Orlando Police Department as Amazon courts police departments around the country as potential customers. A new test by the ACLU of Northern California, however, suggests the technology is far from ready for a public roll-out.
The software, which is meant to match faces in new photos against pictures in its database to determine a person's identity, was given pictures of all 535 members of Congress and asked to match them against 25,000 publicly available mugshots. Ideally, the program would find zero matches. Instead, it misidentified 28 members of Congress as criminals.
The American Civil Liberties Union tested Amazon’s new facial recognition system Rekognition API by scanning pics of all 538 mems of Congress against mugshots of criminals. 28 matched. Hmmm.— John Simpson (@JohnSimpsonNews) July 27, 2018
The software also disproportionately misidentified black and female members of Congress. Of the 28 people misidentified, 11 were people of color. Studies have shown that facial recognition software is "routinely" less accurate on women and people with dark skin. Though the overall false-match rate in the ACLU's test was about 5% (28 of 535 members), people of color accounted for 39% of the false matches (11 of 28), far out of proportion to their share of Congress. This disparity could become incredibly dangerous if law enforcement begins using the technology to identify potential suspects, especially since minorities in the U.S. are already arrested at disproportionate rates for committing the same crimes as white people.
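The disparity is easy to check from the figures the ACLU reported (28 false matches out of 535 members, 11 of them people of color):

```python
# Rates implied by the ACLU's reported numbers.
total_members = 535
false_matches = 28
false_matches_people_of_color = 11

overall_rate = false_matches / total_members * 100
share_of_color = false_matches_people_of_color / false_matches * 100

print(f"Overall false-match rate: {overall_rate:.1f}%")        # ~5.2%
print(f"Share of false matches that were people of color: {share_of_color:.1f}%")  # ~39.3%
```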
Amazon’s facial recognition misidentified 28 lawmakers, including me, as people arrested for a crime. It’s clearly inaccurate tech, but law enforcement may want to use it for surveillance. We’re demanding answers on the privacy and civil rights implications of “Rekognition”. pic.twitter.com/PVI9nUwfcu— Ed Markey (@SenMarkey) July 26, 2018
The ACLU of Northern California commented:
If law enforcement is using Amazon Rekognition, it’s not hard to imagine a police officer getting a ‘match’ indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins.
Amazon shareholders are also urging Amazon CEO Jeff Bezos to discontinue selling Rekognition to law enforcement agencies.
In an open letter, they wrote:
We are concerned the technology would be used to unfairly and disproportionately target and surveil people of color, immigrants, and civil society organizations. We are concerned sales may be expanded to foreign governments, including authoritarian regimes.
BREAKING: Amazon Rekognition falsely identified 28 members of Congress as arrestees in a test conducted by @ACLU_NorCal. The @ACLU is calling on Congress to pass a moratorium on law-enforcement use of face recognition. https://t.co/vPJ171Rkn9 pic.twitter.com/L2Ag0xRXl1— Jake Snow (@snowjake) July 26, 2018
The Congressional Black Caucus also wrote to Bezos in an open letter expressing its concern, pointing out that "communities of color are more heavily and aggressively policed than white communities," and warning that "wrong decisions will be made due to the skewed data set produced by [...] unconstitutional policing practices."
Amazon released a lengthy statement in response:
We have seen customers use the image and video analysis capabilities of Amazon Rekognition in ways that materially benefit both society (e.g. preventing human trafficking, inhibiting child exploitation, reuniting missing children with their families, and building educational apps for children), and organizations (enhancing security through multi-factor authentication, finding images more easily, or preventing package theft). We remain excited about how image and video analysis can be a driver for good in the world, including in the public sector and law enforcement.
With regard to this recent test of Amazon Rekognition by the ACLU, we think that the results could probably be improved by following best practices around setting the confidence thresholds (this is the percentage likelihood that Rekognition found a match) used in the test. While 80% confidence is an acceptable threshold for photos of hot dogs, chairs, animals, or other social media use cases, it wouldn’t be appropriate for identifying individuals with a reasonable level of certainty. When using facial recognition for law enforcement activities, we guide customers to set a higher threshold of at least 95% or higher.
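The effect Amazon is describing can be sketched in a few lines. The names and confidence scores below are invented for illustration, not drawn from the ACLU's test; the point is only that raising the threshold discards weaker candidate matches:

```python
# Hypothetical match results (name, confidence %) as a face-matching
# service might return them. These figures are made up.
matches = [
    ("Person A", 99.1),
    ("Person B", 86.4),
    ("Person C", 81.2),
]

def filter_matches(matches, threshold):
    """Keep only candidate matches at or above the confidence threshold."""
    return [(name, conf) for name, conf in matches if conf >= threshold]

# At an 80% threshold (the default the ACLU says it used), all three
# candidates count as "matches"; at 95%, only the strongest survives.
print(filter_matches(matches, 80.0))
print(filter_matches(matches, 95.0))
```

In the real Rekognition API, this knob corresponds to parameters such as `SimilarityThreshold` on the `CompareFaces` operation, so the same underlying scores yield more or fewer reported matches depending on the caller's setting.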
I think your barking up the wrong tree with the “deeply flawed” tech angle, eventually they will figure out how to make these systems work better than our wildest imaginations. The “flaw” is that the system itself is immoral.— A.Fraser (@dAnconiaMining) July 26, 2018
It seems Amazon has no plans to discontinue selling Rekognition to law enforcement:
Finally, it is worth noting that in real world scenarios, Amazon Rekognition is almost exclusively used to help narrow the field and allow humans to expeditiously review and consider options using their judgement (and not to make fully autonomous decisions), where it can help find lost children, restrict human trafficking, or prevent crimes.
Though technological assistance in identifying suspects could one day be a huge benefit to society, this software seems far from ready.