Amazon's Facial Recognition Software Shows Its Flaws By Misidentifying 28 Members Of Congress As Criminals 😬

Amazon's facial recognition software, called Rekognition, is being tested by the Orlando Police Department as Amazon markets it to police departments around the country. A new test by the ACLU of Northern California, however, revealed the technology is far from ready for a public roll-out. 

The software, which is meant to match real people's faces against pictures in its database to determine their identity, was given photos of all 535 members of Congress and asked to match them against 25,000 publicly available mugshots. Ideally, the program would find zero matches. Instead, it misidentified 28 members of Congress as people who had been arrested.

The software also disproportionately misidentified black and female members of Congress. Of the 28 people misidentified, 11 were people of color. Studies have shown that facial recognition software is "routinely" less accurate on women and people with dark skin. While the overall false-match rate in the ACLU's test was about 5% (28 of 535), people of color accounted for roughly 39% of the false matches (11 of 28), far out of proportion to their share of Congress. This disparity could become incredibly dangerous if law enforcement begins using the technology to identify potential suspects, especially since minorities are already disproportionately arrested in the U.S. for committing the same crimes as white people.
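The disparity in those numbers is easy to verify with a little arithmetic. The figures below come straight from the test results above; the script is just a sanity check:

```python
# Sanity-check the misidentification rates from the ACLU's Rekognition test.
TOTAL_MEMBERS = 535        # all members of Congress tested
FALSE_MATCHES = 28         # members wrongly matched to mugshots
FALSE_MATCHES_POC = 11     # of those false matches, people of color

overall_rate = FALSE_MATCHES / TOTAL_MEMBERS             # ~0.052, i.e. about 5%
poc_share_of_errors = FALSE_MATCHES_POC / FALSE_MATCHES  # ~0.393, i.e. about 39%

print(f"Overall false-match rate: {overall_rate:.1%}")
print(f"Share of false matches that were people of color: {poc_share_of_errors:.1%}")
```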

The ACLU of Northern California commented:

If law enforcement is using Amazon Rekognition, it’s not hard to imagine a police officer getting a ‘match’ indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins.

Amazon shareholders are also urging Amazon CEO Jeff Bezos to discontinue selling Rekognition to law enforcement agencies. 

In an open letter, a group of shareholders wrote:

We are concerned the technology would be used to unfairly and disproportionately target and surveil people of color, immigrants, and civil society organizations. We are concerned sales may be expanded to foreign governments, including authoritarian regimes.

The Congressional Black Caucus also wrote to Bezos in an open letter expressing its concern, pointing out that "communities of color are more heavily and aggressively policed than white communities" and warning that "wrong decisions will be made due to the skewed data set produced by [...] unconstitutional policing practices." 

Amazon released a lengthy statement in response:

We have seen customers use the image and video analysis capabilities of Amazon Rekognition in ways that materially benefit both society (e.g. preventing human trafficking, inhibiting child exploitation, reuniting missing children with their families, and building educational apps for children), and organizations (enhancing security through multi-factor authentication, finding images more easily, or preventing package theft). We remain excited about how image and video analysis can be a driver for good in the world, including in the public sector and law enforcement.

They continued:

With regard to this recent test of Amazon Rekognition by the ACLU, we think that the results could probably be improved by following best practices around setting the confidence thresholds (this is the percentage likelihood that Rekognition found a match) used in the test. While 80% confidence is an acceptable threshold for photos of hot dogs, chairs, animals, or other social media use cases, it wouldn’t be appropriate for identifying individuals with a reasonable level of certainty. When using facial recognition for law enforcement activities, we guide customers to set a higher threshold of at least 95% or higher.
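Amazon's point about confidence thresholds can be illustrated with a small sketch. In the real Rekognition API, the threshold is a parameter on the face-matching call; the snippet below simply filters a list of hypothetical similarity scores in plain Python (the names and scores are invented for illustration, not actual Rekognition output) to show how raising the threshold from 80% to 95% prunes weak matches:

```python
# Hypothetical candidate matches: (identity, similarity score from the model).
# These values are made up for illustration only.
candidates = [
    ("person_a", 0.97),
    ("person_b", 0.88),
    ("person_c", 0.83),
    ("person_d", 0.61),
]

def matches_above(candidates, threshold):
    """Keep only candidates whose similarity meets the confidence threshold."""
    return [name for name, score in candidates if score >= threshold]

print(matches_above(candidates, 0.80))  # looser threshold: three "matches" survive
print(matches_above(candidates, 0.95))  # stricter threshold: only the strongest match
```

The trade-off is the usual one: a higher threshold means fewer false matches like the ones the ACLU found, at the cost of missing some genuine matches.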

It seems Amazon has no plans to discontinue selling Rekognition to law enforcement:

Finally, it is worth noting that in real world scenarios, Amazon Rekognition is almost exclusively used to help narrow the field and allow humans to expeditiously review and consider options using their judgement (and not to make fully autonomous decisions), where it can help find lost children, restrict human trafficking, or prevent crimes.

Though technological assistance in identifying suspects could one day be a huge benefit to society, it seems the software is far from ready.