Amazon's facial recognition software, Rekognition, is being tested by the Orlando Police Department and marketed for potential purchase by police departments around the country. A new test by the ACLU of Northern California, however, suggests the technology is far from ready for a public roll-out.
The software, which is meant to match real people's faces against pictures in its database to determine their identity, was given photos of all 535 members of Congress and asked to match them against 25,000 publicly available mugshots. Ideally, the program would find zero matches. Instead, it misidentified 28 members of Congress as criminals.
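Rekognition's internals are proprietary, but the task it was given is a standard 1:N face search: encode each face as a numeric embedding, then flag any gallery entry whose similarity to the probe image clears a threshold. The sketch below is a toy illustration of that idea, not Amazon's implementation; the embeddings and threshold are invented for the example.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search_matches(probe, gallery, threshold=0.8):
    """1:N search: return every gallery entry whose similarity to the
    probe meets the threshold -- like matching one portrait against
    25,000 mugshots. Lowering the threshold produces more hits, and
    with them more false matches."""
    return [name for name, emb in gallery.items()
            if cosine_similarity(probe, emb) >= threshold]

# Toy 3-dimensional embeddings (real systems use hundreds of dimensions).
gallery = {
    "mugshot_001": [0.9, 0.1, 0.4],
    "mugshot_002": [0.1, 0.95, 0.2],
}
probe = [0.88, 0.15, 0.38]
print(search_matches(probe, gallery))  # → ['mugshot_001']
```

The threshold is the crux: the ACLU ran its test at Amazon's default confidence setting, and a system tuned to return more candidate matches will necessarily return more wrong ones.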
The software also disproportionately misidentified Black and female members of Congress. Of the 28 people misidentified, 11 were people of color. Studies have shown that facial recognition software is "routinely" less accurate on women and people with dark skin. While the overall false-match rate in the ACLU's test was about 5% (28 of 535), people of color accounted for 39% of the false matches (11 of 28) despite making up a far smaller share of Congress. This disparity could become incredibly dangerous if law enforcement begins using the technology to identify potential suspects, especially since minorities are already disproportionately arrested in the U.S. for committing the same crimes as white people.
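The two percentages above can be reproduced directly from the counts reported in the test; the short calculation below uses only the figures already cited (535 members scanned, 28 false matches, 11 of them people of color).

```python
# Figures from the ACLU test described above.
total_members = 535      # members of Congress scanned
false_matches = 28       # members wrongly matched to a mugshot
poc_false_matches = 11   # of those, people of color

overall_rate = false_matches / total_members        # ≈ 0.052
poc_share = poc_false_matches / false_matches       # ≈ 0.393

print(f"Overall false-match rate: {overall_rate:.0%}")              # → 5%
print(f"People of color among false matches: {poc_share:.0%}")      # → 39%
```

Note that the 39% figure is the share of *false matches* who were people of color, not an error rate computed over non-white members alone; the two statistics answer different questions, and conflating them understates or overstates the bias depending on the makeup of the test set.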