
“I don’t Think These Devices are Very Culturally Sensitive.”—Impact of Automated Speech Recognition Errors on African Americans


Bibliographic Details
Main Authors: Mengesha, Zion; Heldreth, Courtney; Lahav, Michal; Sublewski, Juliana; Tuennerman, Elyse
Format: Online Article (Text)
Language: English
Published: Frontiers Media S.A., 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8664002/
https://www.ncbi.nlm.nih.gov/pubmed/34901836
http://dx.doi.org/10.3389/frai.2021.725911
collection PubMed
description Automated speech recognition (ASR) converts language into text and is used across a variety of applications to assist us in everyday life, from powering virtual assistants and natural-language conversations to enabling dictation services. While recent work suggests that there are racial disparities in the performance of ASR systems for speakers of African American Vernacular English, little is known about the psychological and experiential effects of these failures. This paper provides a detailed examination of the behavioral and psychological consequences of ASR voice errors and the difficulty African American users have in getting their intents recognized. The results demonstrate that ASR failures have a detrimental impact on African American users. Specifically, African Americans feel othered when using technology powered by ASR: errors surface thoughts about identity, namely about race and geographic location, leaving them feeling that the technology was not made for them. As a result, African Americans accommodate their speech to have better success with the technology. We incorporate insights and lessons from sociolinguistics in our suggestions for linguistically responsive ways to build more inclusive voice systems that consider African American users’ needs, attitudes, and speech patterns. Our findings suggest that a diary study can enable researchers to best understand the experiences and needs of communities who are often misunderstood by ASR. We argue this methodological framework could enable researchers concerned with fairness in AI to better capture the needs of all speakers who are traditionally misheard by voice-activated, artificially intelligent (voice-AI) digital systems.
id pubmed-8664002
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
spelling pubmed-8664002 2021-12-11 Front Artif Intell (Artificial Intelligence). Frontiers Media S.A. 2021-11-26 /pmc/articles/PMC8664002/ /pubmed/34901836 http://dx.doi.org/10.3389/frai.2021.725911 Text en Copyright © 2021 Mengesha, Heldreth, Lahav, Sublewski and Tuennerman. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
title “I don’t Think These Devices are Very Culturally Sensitive.”—Impact of Automated Speech Recognition Errors on African Americans
topic Artificial Intelligence