Late last year, the federal government released its own damning report on bias in face recognition algorithms, finding that the systems generally work best on middle-aged white men’s faces and far less well for people of color, women, children, and the elderly. The study concluded that error rates tended to be highest for Black women, just as Buolamwini, Gebru, and Raji had found. These error-prone, racially biased algorithms can have devastating consequences for people of color. Many police departments, for example, use face recognition technology to identify suspects and make arrests, and a single false match can lead to a wrongful arrest, a lengthy detention, and even deadly police violence.

https://www.aclu.org/news/privacy-te...-6195970821995