Absolute Eye Gaze Estimation With Biosensors in Hearing Aids


Bibliographic Details
Main Authors: Favre-Félix, Antoine; Graversen, Carina; Bhuiyan, Tanveer A.; Skoglund, Martin A.; Rotger-Griful, Sergi; Rank, Mike Lind; Dau, Torsten; Lunner, Thomas
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6915090/
https://www.ncbi.nlm.nih.gov/pubmed/31920477
http://dx.doi.org/10.3389/fnins.2019.01294
_version_ 1783479950568849408
author Favre-Félix, Antoine
Graversen, Carina
Bhuiyan, Tanveer A.
Skoglund, Martin A.
Rotger-Griful, Sergi
Rank, Mike Lind
Dau, Torsten
Lunner, Thomas
author_facet Favre-Félix, Antoine
Graversen, Carina
Bhuiyan, Tanveer A.
Skoglund, Martin A.
Rotger-Griful, Sergi
Rank, Mike Lind
Dau, Torsten
Lunner, Thomas
author_sort Favre-Félix, Antoine
collection PubMed
description People with hearing impairment typically have difficulties following conversations in multi-talker situations. Previous studies have shown that utilizing eye gaze to steer audio through beamformers could be a solution for those situations. Recent studies have shown that in-ear electrodes that capture electrooculography in the ear (EarEOG) can estimate the eye gaze relative to the head when the head is fixed. Head movement can be estimated using motion sensors around the ear to create an estimate of the absolute eye gaze in the room. In this study, an experiment was designed to mimic a multi-talker situation in order to study and model the EarEOG signal when participants attempted to follow a conversation. Eleven hearing-impaired participants were presented with speech from the DAT speech corpus (Bo Nielsen et al., 2014), with three targets positioned at −30°, 0° and +30° azimuth. The experiment was run in two setups: one where the participants had their head fixed in a chinrest, and the other where they were free to move their head. The participants’ task was to focus their visual attention on an LED-indicated target that changed regularly. A model was developed for the relative eye-gaze estimation, taking saccades, fixations, head movement and drift from the electrode-skin half-cell into account. This model explained 90.5% of the variance of the EarEOG when the head was fixed, and 82.6% when the head was free. The absolute eye gaze was also estimated utilizing that model. When the head was fixed, the estimation of the absolute eye gaze was reliable. However, due to hardware issues, the estimation of the absolute eye gaze when the head was free had a variance that was too large to reliably estimate the attended target. Overall, this study demonstrated the potential of estimating absolute eye gaze using EarEOG and motion sensors around the ear.
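The abstract describes combining eye-in-head gaze (from EarEOG) with head orientation (from ear-worn motion sensors) to obtain absolute eye gaze in the room. A minimal sketch of that combination follows; it is not the authors' model — `gain_uv_per_deg` and `baseline_uv` stand in for a hypothetical per-subject calibration, and the slow electrode-skin half-cell drift is handled only by subtracting the assumed-known baseline.

```python
# Minimal sketch: absolute gaze = head yaw (IMU) + eye-in-head gaze (EarEOG).
# NOT the published model; gain and baseline are hypothetical calibration values.
import numpy as np

def estimate_absolute_gaze(ear_eog_uv, head_yaw_deg, gain_uv_per_deg, baseline_uv=0.0):
    """Return the absolute gaze angle in degrees, per sample.

    ear_eog_uv      -- EarEOG samples in microvolts (horizontal channel)
    head_yaw_deg    -- head yaw per sample in degrees, from motion sensors
    gain_uv_per_deg -- hypothetical calibration gain (microvolts per degree)
    baseline_uv     -- slowly drifting electrode-skin offset to subtract
    """
    eog = np.asarray(ear_eog_uv, dtype=float)
    yaw = np.asarray(head_yaw_deg, dtype=float)
    eye_in_head_deg = (eog - baseline_uv) / gain_uv_per_deg
    return yaw + eye_in_head_deg

# Example: with a 10 uV/deg gain, +300 uV is a +30 deg eye-in-head gaze;
# with 0 deg head yaw the absolute gaze is +30 deg, and -300 uV with
# +10 deg yaw gives -20 deg.
print(estimate_absolute_gaze([300.0, -300.0], [0.0, 10.0], 10.0))
```

Since the study's targets sat at −30°, 0° and +30° azimuth, a decision on the attended target could, under these assumptions, be made by assigning each absolute-gaze estimate to the nearest of those three angles.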
format Online
Article
Text
id pubmed-6915090
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-69150902020-01-09 Absolute Eye Gaze Estimation With Biosensors in Hearing Aids Favre-Félix, Antoine Graversen, Carina Bhuiyan, Tanveer A. Skoglund, Martin A. Rotger-Griful, Sergi Rank, Mike Lind Dau, Torsten Lunner, Thomas Front Neurosci Neuroscience People with hearing impairment typically have difficulties following conversations in multi-talker situations. Previous studies have shown that utilizing eye gaze to steer audio through beamformers could be a solution for those situations. Recent studies have shown that in-ear electrodes that capture electrooculography in the ear (EarEOG) can estimate the eye-gaze relative to the head, when the head was fixed. The head movement can be estimated using motion sensors around the ear to create an estimate of the absolute eye-gaze in the room. In this study, an experiment was designed to mimic a multi-talker situation in order to study and model the EarEOG signal when participants attempted to follow a conversation. Eleven hearing impaired participants were presented speech from the DAT speech corpus (Bo Nielsen et al., 2014), with three targets positioned at −30°, 0° and +30° azimuth. The experiment was run in two setups: one where the participants had their head fixed in a chinrest, and the other where they were free to move their head. The participants’ task was to focus their visual attention on an LED-indicated target that changed regularly. A model was developed for the relative eye-gaze estimation, taking saccades, fixations, head movement and drift from the electrode-skin half-cell into account. This model explained 90.5% of the variance of the EarEOG when the head was fixed, and 82.6% when the head was free. The absolute eye-gaze was also estimated utilizing that model. When the head was fixed, the estimation of the absolute eye-gaze was reliable. 
However, due to hardware issues, the estimation of the absolute eye-gaze when the head was free had a variance that was too large to reliably estimate the attended target. Overall, this study demonstrated the potential of estimating absolute eye-gaze using EarEOG and motion sensors around the ear. Frontiers Media S.A. 2019-12-05 /pmc/articles/PMC6915090/ /pubmed/31920477 http://dx.doi.org/10.3389/fnins.2019.01294 Text en Copyright © 2019 Favre-Félix, Graversen, Bhuiyan, Skoglund, Rotger-Griful, Rank, Dau and Lunner. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Favre-Félix, Antoine
Graversen, Carina
Bhuiyan, Tanveer A.
Skoglund, Martin A.
Rotger-Griful, Sergi
Rank, Mike Lind
Dau, Torsten
Lunner, Thomas
Absolute Eye Gaze Estimation With Biosensors in Hearing Aids
title Absolute Eye Gaze Estimation With Biosensors in Hearing Aids
title_full Absolute Eye Gaze Estimation With Biosensors in Hearing Aids
title_fullStr Absolute Eye Gaze Estimation With Biosensors in Hearing Aids
title_full_unstemmed Absolute Eye Gaze Estimation With Biosensors in Hearing Aids
title_short Absolute Eye Gaze Estimation With Biosensors in Hearing Aids
title_sort absolute eye gaze estimation with biosensors in hearing aids
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6915090/
https://www.ncbi.nlm.nih.gov/pubmed/31920477
http://dx.doi.org/10.3389/fnins.2019.01294
work_keys_str_mv AT favrefelixantoine absoluteeyegazeestimationwithbiosensorsinhearingaids
AT graversencarina absoluteeyegazeestimationwithbiosensorsinhearingaids
AT bhuiyantanveera absoluteeyegazeestimationwithbiosensorsinhearingaids
AT skoglundmartina absoluteeyegazeestimationwithbiosensorsinhearingaids
AT rotgergrifulsergi absoluteeyegazeestimationwithbiosensorsinhearingaids
AT rankmikelind absoluteeyegazeestimationwithbiosensorsinhearingaids
AT dautorsten absoluteeyegazeestimationwithbiosensorsinhearingaids
AT lunnerthomas absoluteeyegazeestimationwithbiosensorsinhearingaids