
Biased Face Recognition Technology Used by Government: A Problem for Liberal Democracy


Bibliographic Details

Main Author: Gentzel, Michael
Format: Online Article Text
Language: English
Published: Springer Netherlands, 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8475322/
https://www.ncbi.nlm.nih.gov/pubmed/34603941
http://dx.doi.org/10.1007/s13347-021-00478-z
Description

Summary: This paper presents a novel philosophical analysis of the problem of law enforcement’s use of biased face recognition technology (FRT) in liberal democracies. FRT programs used by law enforcement to identify crime suspects are substantially more error-prone on facial images depicting darker skin tones and females than on facial images depicting Caucasian males. This bias can lead to citizens being wrongfully investigated by police along racial and gender lines. The author develops and defends “A Liberal Argument Against Biased FRT,” which concludes that law enforcement use of biased FRT is inconsistent with the classical liberal requirement that government treat all citizens equally before the law. Two objections to this argument are considered and shown to be unsound. The author concludes by suggesting that equality before the law should be preserved, and that the problem of machine bias ought to be resolved before FRT and other types of artificial intelligence (AI) are deployed by governments in liberal democracies.