Comparing In-ear EOG for Eye-Movement Estimation With Eye-Tracking: Accuracy, Calibration, and Speech Comprehension
This presentation details and evaluates a method for estimating the attended speaker during a two-person conversation by means of in-ear electro-oculography (EOG). Twenty-five hearing-impaired participants were fitted with molds equipped with EOG electrodes (in-ear EOG) and wore eye-tracking glasses...
Main Authors: | Skoglund, Martin A.; Andersen, Martin; Shiell, Martha M.; Keidser, Gitte; Rank, Mike Lind; Rotger-Griful, Sergi |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2022 |
Subjects: | Neuroscience |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9279575/ https://www.ncbi.nlm.nih.gov/pubmed/35844213 http://dx.doi.org/10.3389/fnins.2022.873201 |
_version_ | 1784746427756314624 |
---|---|
author | Skoglund, Martin A.; Andersen, Martin; Shiell, Martha M.; Keidser, Gitte; Rank, Mike Lind; Rotger-Griful, Sergi
author_facet | Skoglund, Martin A.; Andersen, Martin; Shiell, Martha M.; Keidser, Gitte; Rank, Mike Lind; Rotger-Griful, Sergi
author_sort | Skoglund, Martin A. |
collection | PubMed |
description | This presentation details and evaluates a method for estimating the attended speaker during a two-person conversation by means of in-ear electro-oculography (EOG). Twenty-five hearing-impaired participants were fitted with molds equipped with EOG electrodes (in-ear EOG) and wore eye-tracking glasses while watching a video of two life-size people in a dialogue solving a Diapix task. The dialogue was presented directionally, together with background noise in the frontal hemisphere, at 60 dB SPL. During three steering conditions (none, in-ear EOG, conventional eye-tracking), participants' comprehension was periodically measured using multiple-choice questions. Based on eye-movement detection by in-ear EOG or conventional eye-tracking, the estimated attended speaker was amplified by 6 dB. In the in-ear EOG condition, the estimate was based on one selected channel pair of electrodes out of 36 possible electrodes. A novel calibration procedure introducing three different metrics was used to select the measurement channel. The in-ear EOG attended-speaker estimates were compared to those of the eye-tracker. Across participants, the mean accuracy of in-ear EOG estimation of the attended speaker was 68%, ranging from 50 to 89%. Based on offline simulation, it was established that higher scoring metrics obtained for a channel with the calibration procedure were significantly associated with better data quality. Results showed a statistically significant improvement in comprehension of about 10% in both steering conditions relative to the no-steering condition. Comprehension in the two steering conditions was not significantly different. Further, better comprehension obtained under the in-ear EOG condition was significantly correlated with more accurate estimation of the attended speaker. In conclusion, this study shows promising results for the use of in-ear EOG in visual attention estimation, with potential applicability in hearing assistive devices. |
format | Online Article Text |
id | pubmed-9279575 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-9279575 2022-07-15 Comparing In-ear EOG for Eye-Movement Estimation With Eye-Tracking: Accuracy, Calibration, and Speech Comprehension Skoglund, Martin A.; Andersen, Martin; Shiell, Martha M.; Keidser, Gitte; Rank, Mike Lind; Rotger-Griful, Sergi Front Neurosci Neuroscience This presentation details and evaluates a method for estimating the attended speaker during a two-person conversation by means of in-ear electro-oculography (EOG). Twenty-five hearing-impaired participants were fitted with molds equipped with EOG electrodes (in-ear EOG) and wore eye-tracking glasses while watching a video of two life-size people in a dialogue solving a Diapix task. The dialogue was presented directionally, together with background noise in the frontal hemisphere, at 60 dB SPL. During three steering conditions (none, in-ear EOG, conventional eye-tracking), participants' comprehension was periodically measured using multiple-choice questions. Based on eye-movement detection by in-ear EOG or conventional eye-tracking, the estimated attended speaker was amplified by 6 dB. In the in-ear EOG condition, the estimate was based on one selected channel pair of electrodes out of 36 possible electrodes. A novel calibration procedure introducing three different metrics was used to select the measurement channel. The in-ear EOG attended-speaker estimates were compared to those of the eye-tracker. Across participants, the mean accuracy of in-ear EOG estimation of the attended speaker was 68%, ranging from 50 to 89%. Based on offline simulation, it was established that higher scoring metrics obtained for a channel with the calibration procedure were significantly associated with better data quality. Results showed a statistically significant improvement in comprehension of about 10% in both steering conditions relative to the no-steering condition. Comprehension in the two steering conditions was not significantly different. Further, better comprehension obtained under the in-ear EOG condition was significantly correlated with more accurate estimation of the attended speaker. In conclusion, this study shows promising results for the use of in-ear EOG in visual attention estimation, with potential applicability in hearing assistive devices. Frontiers Media S.A. 2022-06-30 /pmc/articles/PMC9279575/ /pubmed/35844213 http://dx.doi.org/10.3389/fnins.2022.873201 Text en Copyright © 2022 Skoglund, Andersen, Shiell, Keidser, Rank and Rotger-Griful. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Neuroscience Skoglund, Martin A.; Andersen, Martin; Shiell, Martha M.; Keidser, Gitte; Rank, Mike Lind; Rotger-Griful, Sergi Comparing In-ear EOG for Eye-Movement Estimation With Eye-Tracking: Accuracy, Calibration, and Speech Comprehension |
title | Comparing In-ear EOG for Eye-Movement Estimation With Eye-Tracking: Accuracy, Calibration, and Speech Comprehension |
title_full | Comparing In-ear EOG for Eye-Movement Estimation With Eye-Tracking: Accuracy, Calibration, and Speech Comprehension |
title_fullStr | Comparing In-ear EOG for Eye-Movement Estimation With Eye-Tracking: Accuracy, Calibration, and Speech Comprehension |
title_full_unstemmed | Comparing In-ear EOG for Eye-Movement Estimation With Eye-Tracking: Accuracy, Calibration, and Speech Comprehension |
title_short | Comparing In-ear EOG for Eye-Movement Estimation With Eye-Tracking: Accuracy, Calibration, and Speech Comprehension |
title_sort | comparing in-ear eog for eye-movement estimation with eye-tracking: accuracy, calibration, and speech comprehension |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9279575/ https://www.ncbi.nlm.nih.gov/pubmed/35844213 http://dx.doi.org/10.3389/fnins.2022.873201 |
work_keys_str_mv | AT skoglundmartina comparingineareogforeyemovementestimationwitheyetrackingaccuracycalibrationandspeechcomprehension AT andersenmartin comparingineareogforeyemovementestimationwitheyetrackingaccuracycalibrationandspeechcomprehension AT shiellmartham comparingineareogforeyemovementestimationwitheyetrackingaccuracycalibrationandspeechcomprehension AT keidsergitte comparingineareogforeyemovementestimationwitheyetrackingaccuracycalibrationandspeechcomprehension AT rankmikelind comparingineareogforeyemovementestimationwitheyetrackingaccuracycalibrationandspeechcomprehension AT rotgergrifulsergi comparingineareogforeyemovementestimationwitheyetrackingaccuracycalibrationandspeechcomprehension |
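The description field above outlines the study's processing chain: a single in-ear EOG channel pair is used to detect horizontal eye movements, the speaker the listener is estimated to be attending is amplified by 6 dB, and the EOG-based estimates are scored for agreement with a conventional eye-tracker. The Python sketch below illustrates that general idea only; it is not the authors' implementation, and the sampling rate, smoothing window, threshold, function names, and synthetic signals are all illustrative assumptions.

```python
# Minimal sketch (not the authors' pipeline): estimate the attended speaker
# (left vs. right) from one horizontal in-ear EOG channel by smoothing and
# thresholding, score agreement with eye-tracker labels, and apply a 6 dB
# gain to the estimated attended speaker. All parameters are assumptions.

import numpy as np


def estimate_attended_speaker(eog_uv, fs_hz, win_s=0.5, thresh_uv=20.0):
    """Label each sample as looking left (-1) or right (+1).

    eog_uv    : 1-D array, single horizontal EOG channel in microvolts
    fs_hz     : sampling rate in Hz
    win_s     : moving-average window (seconds) to suppress blinks/noise
    thresh_uv : dead zone around zero; inside it, keep the previous label
    """
    win = max(1, int(win_s * fs_hz))
    smooth = np.convolve(eog_uv, np.ones(win) / win, mode="same")

    labels = np.zeros(smooth.size, dtype=int)
    current = 1  # arbitrary initial assumption: start looking right
    for i, v in enumerate(smooth):
        if v > thresh_uv:
            current = 1
        elif v < -thresh_uv:
            current = -1
        labels[i] = current
    return labels


def accuracy_vs_eyetracker(eog_labels, eyetracker_labels):
    """Fraction of samples where the EOG estimate matches the eye-tracker."""
    return float(np.mean(np.asarray(eog_labels) == np.asarray(eyetracker_labels)))


def apply_steering_gain(speaker_left, speaker_right, attended, gain_db=6.0):
    """Amplify the estimated attended speaker's signal by gain_db (6 dB in the study)."""
    gain = 10.0 ** (gain_db / 20.0)
    if attended == -1:  # left speaker attended
        return speaker_left * gain, speaker_right
    return speaker_left, speaker_right * gain


if __name__ == "__main__":
    # Synthetic demo: 10 s of gaze alternating every 2 s, with added noise.
    fs = 250
    t = np.arange(0, 10, 1 / fs)
    true_gaze = np.where((t // 2) % 2 == 0, 1, -1)
    eog = 100.0 * true_gaze + 15.0 * np.random.randn(t.size)

    est = estimate_attended_speaker(eog, fs)
    print(f"agreement with ground truth: {accuracy_vs_eyetracker(est, true_gaze):.2f}")

    left, right = apply_steering_gain(np.ones(8), np.ones(8), attended=int(est[-1]))
    print("steered RMS left/right:",
          float(np.sqrt(np.mean(left ** 2))), float(np.sqrt(np.mean(right ** 2))))
```

A moving-average plus dead-zone detector is used here purely for illustration; the paper's channel-selection calibration and its three scoring metrics are not reproduced.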