Opportunities and Pitfalls in Applying Emotion Recognition Software for Persons With a Visual Impairment: Simulated Real Life Conversations
BACKGROUND: A large part of the communication cues exchanged between persons is nonverbal. Persons with a visual impairment are often unable to perceive these cues, such as gestures or facial expression of emotions. In a previous study, we have determined that visually impaired persons can increase...
Main Authors: | Buimer, Hendrik; Schellens, Renske; Kostelijk, Tjerk; Nemri, Abdellatif; Zhao, Yan; Van der Geest, Thea; Van Wezel, Richard |
Format: | Online Article Text |
Language: | English |
Published: | JMIR Publications, 2019 |
Subjects: | Original Paper |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6895890/ https://www.ncbi.nlm.nih.gov/pubmed/31750838 http://dx.doi.org/10.2196/13722 |
_version_ | 1783476655912648704 |
author | Buimer, Hendrik; Schellens, Renske; Kostelijk, Tjerk; Nemri, Abdellatif; Zhao, Yan; Van der Geest, Thea; Van Wezel, Richard |
author_facet | Buimer, Hendrik; Schellens, Renske; Kostelijk, Tjerk; Nemri, Abdellatif; Zhao, Yan; Van der Geest, Thea; Van Wezel, Richard |
author_sort | Buimer, Hendrik |
collection | PubMed |
description | BACKGROUND: A large part of the communication cues exchanged between persons is nonverbal. Persons with a visual impairment are often unable to perceive these cues, such as gestures or facial expression of emotions. In a previous study, we have determined that visually impaired persons can increase their ability to recognize facial expressions of emotions from validated pictures and videos by using an emotion recognition system that signals vibrotactile cues associated with one of the six basic emotions. OBJECTIVE: The aim of this study was to determine whether the previously tested emotion recognition system worked equally well in realistic situations and under controlled laboratory conditions. METHODS: The emotion recognition system consists of a camera mounted on spectacles, a tablet running facial emotion recognition software, and a waist belt with vibrotactile stimulators to provide haptic feedback representing Ekman’s six universal emotions. A total of 8 visually impaired persons (4 females and 4 males; mean age 46.75 years, age range 28-66 years) participated in two training sessions followed by one experimental session. During the experiment, participants engaged in two 15 minute conversations, in one of which they wore the emotion recognition system. To conclude the study, exit interviews were conducted to assess the experiences of the participants. Due to technical issues with the registration of the emotion recognition software, only 6 participants were included in the video analysis. RESULTS: We found that participants were quickly able to learn, distinguish, and remember vibrotactile signals associated with the six emotions. A total of 4 participants felt that they were able to use the vibrotactile signals in the conversation. Moreover, 5 out of the 6 participants had no difficulties in keeping the camera focused on the conversation partner. The emotion recognition was very accurate in detecting happiness but performed unsatisfactorily in recognizing the other five universal emotions. CONCLUSIONS: The system requires some essential improvements in performance and wearability before it is ready to support visually impaired persons in their daily life interactions. Nevertheless, the participants saw potential in the system as an assistive technology, assuming their user requirements can be met. |
format | Online Article Text |
id | pubmed-6895890 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | JMIR Publications |
record_format | MEDLINE/PubMed |
spelling | pubmed-68958902019-12-23 Opportunities and Pitfalls in Applying Emotion Recognition Software for Persons With a Visual Impairment: Simulated Real Life Conversations Buimer, Hendrik Schellens, Renske Kostelijk, Tjerk Nemri, Abdellatif Zhao, Yan Van der Geest, Thea Van Wezel, Richard JMIR Mhealth Uhealth Original Paper BACKGROUND: A large part of the communication cues exchanged between persons is nonverbal. Persons with a visual impairment are often unable to perceive these cues, such as gestures or facial expression of emotions. In a previous study, we have determined that visually impaired persons can increase their ability to recognize facial expressions of emotions from validated pictures and videos by using an emotion recognition system that signals vibrotactile cues associated with one of the six basic emotions. OBJECTIVE: The aim of this study was to determine whether the previously tested emotion recognition system worked equally well in realistic situations and under controlled laboratory conditions. METHODS: The emotion recognition system consists of a camera mounted on spectacles, a tablet running facial emotion recognition software, and a waist belt with vibrotactile stimulators to provide haptic feedback representing Ekman’s six universal emotions. A total of 8 visually impaired persons (4 females and 4 males; mean age 46.75 years, age range 28-66 years) participated in two training sessions followed by one experimental session. During the experiment, participants engaged in two 15 minute conversations, in one of which they wore the emotion recognition system. To conclude the study, exit interviews were conducted to assess the experiences of the participants. Due to technical issues with the registration of the emotion recognition software, only 6 participants were included in the video analysis. RESULTS: We found that participants were quickly able to learn, distinguish, and remember vibrotactile signals associated with the six emotions. A total of 4 participants felt that they were able to use the vibrotactile signals in the conversation. Moreover, 5 out of the 6 participants had no difficulties in keeping the camera focused on the conversation partner. The emotion recognition was very accurate in detecting happiness but performed unsatisfactorily in recognizing the other five universal emotions. CONCLUSIONS: The system requires some essential improvements in performance and wearability before it is ready to support visually impaired persons in their daily life interactions. Nevertheless, the participants saw potential in the system as an assistive technology, assuming their user requirements can be met. JMIR Publications 2019-11-21 /pmc/articles/PMC6895890/ /pubmed/31750838 http://dx.doi.org/10.2196/13722 Text en ©Hendrik Buimer, Renske Schellens, Tjerk Kostelijk, Abdellatif Nemri, Yan Zhao, Thea Van der Geest, Richard Van Wezel. Originally published in JMIR mHealth and uHealth (http://mhealth.jmir.org), 21.11.2019. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR mhealth and uhealth, is properly cited. The complete bibliographic information, a link to the original publication on http://mhealth.jmir.org/, as well as this copyright and license information must be included. |
spellingShingle | Original Paper Buimer, Hendrik Schellens, Renske Kostelijk, Tjerk Nemri, Abdellatif Zhao, Yan Van der Geest, Thea Van Wezel, Richard Opportunities and Pitfalls in Applying Emotion Recognition Software for Persons With a Visual Impairment: Simulated Real Life Conversations |
title | Opportunities and Pitfalls in Applying Emotion Recognition Software for Persons With a Visual Impairment: Simulated Real Life Conversations |
title_full | Opportunities and Pitfalls in Applying Emotion Recognition Software for Persons With a Visual Impairment: Simulated Real Life Conversations |
title_fullStr | Opportunities and Pitfalls in Applying Emotion Recognition Software for Persons With a Visual Impairment: Simulated Real Life Conversations |
title_full_unstemmed | Opportunities and Pitfalls in Applying Emotion Recognition Software for Persons With a Visual Impairment: Simulated Real Life Conversations |
title_short | Opportunities and Pitfalls in Applying Emotion Recognition Software for Persons With a Visual Impairment: Simulated Real Life Conversations |
title_sort | opportunities and pitfalls in applying emotion recognition software for persons with a visual impairment: simulated real life conversations |
topic | Original Paper |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6895890/ https://www.ncbi.nlm.nih.gov/pubmed/31750838 http://dx.doi.org/10.2196/13722 |
work_keys_str_mv | AT buimerhendrik opportunitiesandpitfallsinapplyingemotionrecognitionsoftwareforpersonswithavisualimpairmentsimulatedreallifeconversations AT schellensrenske opportunitiesandpitfallsinapplyingemotionrecognitionsoftwareforpersonswithavisualimpairmentsimulatedreallifeconversations AT kostelijktjerk opportunitiesandpitfallsinapplyingemotionrecognitionsoftwareforpersonswithavisualimpairmentsimulatedreallifeconversations AT nemriabdellatif opportunitiesandpitfallsinapplyingemotionrecognitionsoftwareforpersonswithavisualimpairmentsimulatedreallifeconversations AT zhaoyan opportunitiesandpitfallsinapplyingemotionrecognitionsoftwareforpersonswithavisualimpairmentsimulatedreallifeconversations AT vandergeestthea opportunitiesandpitfallsinapplyingemotionrecognitionsoftwareforpersonswithavisualimpairmentsimulatedreallifeconversations AT vanwezelrichard opportunitiesandpitfallsinapplyingemotionrecognitionsoftwareforpersonswithavisualimpairmentsimulatedreallifeconversations |
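The METHODS section of the abstract above outlines the system architecture: a spectacle-mounted camera feeds a tablet running facial emotion recognition software, which drives one of six vibrotactile stimulators on a waist belt, each representing one of Ekman's six universal emotions. The Python sketch below illustrates that pipeline under stated assumptions only: the record does not name the recognition software, the emotion-to-motor mapping, any confidence threshold, or the belt interface, so `classify_frame`, `EMOTION_TO_MOTOR`, the 0.6 threshold, and `BeltDriver` are hypothetical placeholders rather than the authors' implementation.

```python
# Minimal, hypothetical sketch of the pipeline described in the abstract:
# camera frame -> facial emotion classifier -> vibrotactile cue on a waist belt.
# All names and values here are illustrative; none are taken from the paper.

from dataclasses import dataclass
from typing import Optional

# Ekman's six universal emotions, as used in the study.
EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise", "disgust"]

# Hypothetical mapping of each emotion to one of six stimulators on the belt.
EMOTION_TO_MOTOR = {emotion: index for index, emotion in enumerate(EMOTIONS)}


@dataclass
class Detection:
    emotion: str        # one of EMOTIONS
    confidence: float   # classifier confidence in [0, 1]


class BeltDriver:
    """Stand-in for the waist-belt hardware interface, which the abstract does not specify."""

    def vibrate(self, motor_index: int, duration_ms: int = 300) -> None:
        # A real driver would pulse the selected stimulator; here we just log the cue.
        print(f"vibrating motor {motor_index} for {duration_ms} ms")


def classify_frame(frame) -> Optional[Detection]:
    """Placeholder for the tablet's facial emotion recognition software."""
    raise NotImplementedError("replace with the actual recognition back end")


def signal_emotion(detection: Optional[Detection], belt: BeltDriver,
                   threshold: float = 0.6) -> None:
    """Convert a classifier result into a vibrotactile cue, ignoring low-confidence detections."""
    if detection is None or detection.confidence < threshold:
        return
    motor = EMOTION_TO_MOTOR.get(detection.emotion)
    if motor is not None:
        belt.vibrate(motor)


if __name__ == "__main__":
    # Example with a fabricated detection: happiness maps to motor 0 in this hypothetical layout.
    belt = BeltDriver()
    signal_emotion(Detection(emotion="happiness", confidence=0.92), belt)
```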