
Facial-recognition technology has a racial-bias problem, according to a new landmark federal study

  • Facial-recognition algorithms are more likely to misidentify people of color than white people, according to a federal study published on Thursday.
  • The study found that black people and Asian people were up to 100 times as likely as white men to produce a false positive, and women were more likely to be misidentified than men across the board.
  • Law enforcement across the US is embracing facial-recognition technology as a tool for identifying suspects, a trend that the study calls into question.
  • Some prominent facial-recognition software, like Amazon's Rekognition, was not made available for the study.

A sweeping federal study of facial-recognition technology found that the systems were worse at identifying women and people of color than at identifying men and white people, the National Institute of Standards and Technology announced on Thursday.

Researchers found that facial-recognition software produced higher rates of false positives for black people and Asian people than for white people. Depending on the algorithm, false-positive rates for those groups were 10 to 100 times as high.

Women were also misidentified more frequently than men across the board, the study found, and Native Americans had the highest false-positive rates of any demographic group.

"While it is usually incorrect to make statements across algorithms, we found empirical evidence for the existence of demographic differentials in the majority of the face recognition algorithms we studied," the NIST researcher Patrick Grother, the report's primary author, said in a statement. "This data will be valuable to policymakers, developers and end users in thinking about the limitations and appropriate use of these algorithms."

In the past year, law-enforcement agencies across the US have begun to embrace facial-recognition technology as a tool for tracking people's movements and identifying suspects. Civil-liberties advocates have pushed back against that trend, and some cities have banned the use of facial recognition by law enforcement.

Jay Stanley, a senior policy analyst at the ACLU, said the NIST study was evidence that facial recognition is a "dystopian technology" and called on government agencies to stop using it. The ACLU has consistently opposed facial recognition and is suing the federal government to release information about its use of facial-recognition software made by Amazon and Microsoft.

"Even government scientists are now confirming that this surveillance technology is flawed and biased," he said. "One false match can lead to missed flights, lengthy interrogations, watchlist placements, tense police encounters, false arrests, or worse."

The NIST study confirms existing research showing racial and gender bias in facial-recognition technology. Researchers at MIT found that Amazon's facial-recognition software, Rekognition, misidentified people of color at a higher rate than white people. (Amazon criticized that study, arguing that the researchers had used the software incorrectly.)

Amazon was not among the 99 facial-recognition vendors NIST tested, however; the company did not make its software available for the study, according to The Washington Post. Vendors that participated in the study included Intel, Microsoft, Panasonic, SenseTime, and Vigilant Solutions.

An Amazon spokesperson declined to comment. In the past, Amazon has said its software is a cloud-based service and would need to be altered in order to undergo NIST's test.

The NIST study also found that facial-recognition software made by Asian companies was less likely to misidentify Asian faces.

"These results are an encouraging sign that more diverse training data may produce more equitable outcomes, should it be possible for developers to use such data," Grother said in a statement.
