Publication · Preprint · 2020

Understanding bias in facial recognition technologies

Leslie, David
Open Access · English · Published: 05 Oct 2020
Abstract
Over the past couple of years, the growing debate around automated facial recognition has reached a boiling point. As developers have continued to swiftly expand the scope of these kinds of technologies into an almost unbounded range of applications, an increasingly strident chorus of critical voices has sounded concerns about the injurious effects of the proliferation of such systems. Opponents argue that the irresponsible design and use of facial detection and recognition technologies (FDRTs) threatens to violate civil liberties, infringe on basic human rights and further entrench structural racism and systemic marginalisation. They also caution that the gradu...
Subjects
free text keywords: Computer Science - Computers and Society, Computer Science - Computer Vision and Pattern Recognition, Computer Science - Databases, facial recognition technologies, algorithmic bias, digital ethics, responsible innovation, biometric technologies

Alvi, M., Zisserman, A., & Nellåker, C. (2018). Turning a blind eye: Explicit removal of biases and variation from deep neural network embeddings. In Proceedings of the European Conference on Computer Vision (ECCV). [OpenAIRE]

Amini, A., Soleimany, A. P., Schwarting, W., Bhatia, S. N., & Rus, D. (2019, January). Uncovering and mitigating algorithmic bias through learned latent structure. In Proceedings of the 2019 AAAI/ACM Conference on AI, Ethics, and Society (pp. 289-295). [OpenAIRE]

Big Brother Watch. (2018, May). Face Off: The lawless growth of facial recognition in UK policing. Retrieved from https://bigbrotherwatch.org.uk/wp-content/uploads/2018/05/Face-Off-final-digital-1.pdf

Biometrics and Forensics Ethics Group (BFEG) Facial Recognition Working Group. (2019, February). Ethical issues arising from the police use of live facial recognition technology. Retrieved from https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/781745/Facial_Recognition_Briefing_BFEG_February_2019.pdf

Buolamwini, J., & Gebru, T. (2018). Gender Shades: Intersectional Accuracy Disparities in ... Retrieved September 22, 2020, from http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf

Buolamwini, J., Ordóñez, V., Morgenstern, J., & Learned-Miller, E. (2020, May 29). Facial Recognition Technologies: A Primer. Algorithmic Justice League. Retrieved from https://globaluploads.webflow.com/5e027ca188c99e3515b404b7/5ed1002058516c11edc66a14_FRTsPrimerMay2020.pdf

Davies, B., Innes, M., & Dawson, A. (2018, September). An Evaluation of South Wales Police's Use of Automated Facial Recognition. Retrieved from https://static1.squarespace.com/static/51b06364e4b02de2f57fd72e/t/5bfd4fbc21c67c2cdd692fa8/1543327693640/AFR+Report+%5BDigital%5D.pdf

