
Augmented-Reality Presentation of Household Sounds for Deaf and Hard-of-Hearing People


Bibliographic Details
Main Author: Asakura, Takumi
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10490607/
https://www.ncbi.nlm.nih.gov/pubmed/37688079
http://dx.doi.org/10.3390/s23177616
_version_ 1785103878968049664
author Asakura, Takumi
author_facet Asakura, Takumi
author_sort Asakura, Takumi
collection PubMed
description Normal-hearing people use sound as a cue to recognize various events that occur in their surrounding environment; however, this is not possible for deaf and hard-of-hearing (DHH) people, who in such a context may not be able to freely perceive their surrounding environment. Therefore, there is an opportunity to create a convenient device that can detect sounds occurring in daily life and present them visually instead of auditorily. Additionally, it is of great importance to appropriately evaluate how such a supporting device would change the lives of DHH people. The current study proposes an augmented-reality-based system for presenting household sounds to DHH people as visual information. We examined the effect of displaying both icons indicating sounds classified by machine learning and a dynamic spectrogram indicating the real-time time–frequency characteristics of the environmental sounds. First, the issues that DHH people perceive as problems in their daily lives were investigated through a survey, which suggested that DHH people need to visualize their surrounding sound environment. Then, after the accuracy of the machine-learning-based classifier installed in the proposed system was validated, the subjective impression of how the proposed system increased the comfort of daily life was obtained through a field experiment in a real residence. The results confirmed that the comfort of daily life in household spaces can be improved by presenting not only the machine-learning classification results but also a real-time display of the spectrogram.
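The description outlines a pipeline in which incoming audio is rendered as a dynamic spectrogram for display and also fed to a machine-learning classifier whose output selects an icon. The snippet below is a minimal illustrative sketch of that idea, not the paper's implementation: it assumes Python with NumPy/SciPy, and the classify() stub, the icon labels, and the 16 kHz sample rate are hypothetical placeholders.

```python
# Minimal sketch (not the paper's implementation): compute a dynamic
# spectrogram from a short audio buffer and pass it to a placeholder
# classifier that maps the sound to an icon label.
import numpy as np
from scipy.signal import spectrogram

FS = 16_000  # sample rate in Hz (assumed)

def dynamic_spectrogram(audio: np.ndarray, fs: int = FS):
    """Return time-frequency magnitudes (dB) for real-time display."""
    f, t, sxx = spectrogram(audio, fs=fs, nperseg=512, noverlap=384)
    return f, t, 10.0 * np.log10(sxx + 1e-12)

def classify(sxx_db: np.ndarray) -> str:
    """Placeholder for the machine-learning classifier: here a simple
    energy threshold picks an icon label (labels are hypothetical)."""
    labels = ["silence", "doorbell", "running water", "alarm"]
    return labels[0] if sxx_db.mean() < -60.0 else labels[3]

# Example with a synthetic 1 s buffer standing in for microphone input.
buffer = 0.1 * np.random.randn(FS)
f, t, sxx_db = dynamic_spectrogram(buffer)
print(sxx_db.shape, classify(sxx_db))
```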
format Online
Article
Text
id pubmed-10490607
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-10490607 2023-09-09 Augmented-Reality Presentation of Household Sounds for Deaf and Hard-of-Hearing People Asakura, Takumi Sensors (Basel) Article Normal-hearing people use sound as a cue to recognize various events that occur in their surrounding environment; however, this is not possible for deaf and hard-of-hearing (DHH) people, who in such a context may not be able to freely perceive their surrounding environment. Therefore, there is an opportunity to create a convenient device that can detect sounds occurring in daily life and present them visually instead of auditorily. Additionally, it is of great importance to appropriately evaluate how such a supporting device would change the lives of DHH people. The current study proposes an augmented-reality-based system for presenting household sounds to DHH people as visual information. We examined the effect of displaying both icons indicating sounds classified by machine learning and a dynamic spectrogram indicating the real-time time–frequency characteristics of the environmental sounds. First, the issues that DHH people perceive as problems in their daily lives were investigated through a survey, which suggested that DHH people need to visualize their surrounding sound environment. Then, after the accuracy of the machine-learning-based classifier installed in the proposed system was validated, the subjective impression of how the proposed system increased the comfort of daily life was obtained through a field experiment in a real residence. The results confirmed that the comfort of daily life in household spaces can be improved by presenting not only the machine-learning classification results but also a real-time display of the spectrogram. MDPI 2023-09-02 /pmc/articles/PMC10490607/ /pubmed/37688079 http://dx.doi.org/10.3390/s23177616 Text en © 2023 by the author. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Asakura, Takumi
Augmented-Reality Presentation of Household Sounds for Deaf and Hard-of-Hearing People
title Augmented-Reality Presentation of Household Sounds for Deaf and Hard-of-Hearing People
title_full Augmented-Reality Presentation of Household Sounds for Deaf and Hard-of-Hearing People
title_fullStr Augmented-Reality Presentation of Household Sounds for Deaf and Hard-of-Hearing People
title_full_unstemmed Augmented-Reality Presentation of Household Sounds for Deaf and Hard-of-Hearing People
title_short Augmented-Reality Presentation of Household Sounds for Deaf and Hard-of-Hearing People
title_sort augmented-reality presentation of household sounds for deaf and hard-of-hearing people
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10490607/
https://www.ncbi.nlm.nih.gov/pubmed/37688079
http://dx.doi.org/10.3390/s23177616
work_keys_str_mv AT asakuratakumi augmentedrealitypresentationofhouseholdsoundsfordeafandhardofhearingpeople