Federal study confirms racial bias of many facial-recognition systems

Asian and African American people were up to 100 times more likely to be misidentified than white men, depending on the particular algorithm and type of search. Native Americans had the highest false-positive rate of all ethnicities, according to the study, which found that systems varied widely in their accuracy.

The National Institute of Standards and Technology, the federal laboratory known as NIST that develops standards for new technology, found “empirical evidence” that most facial-recognition algorithms exhibit “demographic differentials” that can worsen their accuracy based on a person’s age, gender or race.