Measuring Driver Perception: Combining Eye-Tracking and Automated Road Scene Perception
OBJECTIVE: To investigate how well gaze behavior can indicate driver awareness of individual road users when related to the vehicle’s road scene perception. BACKGROUND: An appropriate method is required to identify how driver gaze reveals awareness of other road users. METHOD: We developed a recognition-based method for labeling of driver situation awareness (SA) in a vehicle with road-scene perception and eye tracking. Thirteen drivers performed 91 left turns on complex urban intersections and identified images of encountered road users among distractor images. RESULTS: Drivers fixated within 2° for 72.8% of relevant and 27.8% of irrelevant road users and were able to recognize 36.1% of the relevant and 19.4% of irrelevant road users one min after leaving the intersection. Gaze behavior could predict road user relevance but not the outcome of the recognition task. Unexpectedly, 18% of road users observed beyond 10° were recognized. CONCLUSIONS: Despite suboptimal psychometric properties leading to low recognition rates, our recognition task could identify awareness of individual road users during left turn maneuvers. Perception occurred at gaze angles well beyond 2°, which means that fixation locations are insufficient for awareness monitoring. APPLICATION: Findings can be used in driver attention and awareness modelling, and design of gaze-based driver support systems.
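As a reading aid for the METHOD and RESULTS above: the study relates eye-tracker gaze directions to road users reported by the vehicle's scene perception and treats a road user as fixated when gaze falls within 2° of it. The sketch below shows one way such a gaze-to-road-user angular match could be computed; the data classes, field names, and the `fixated_road_users` helper are illustrative assumptions, not the authors' implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class GazeSample:
    """One eye-tracker sample: gaze direction in the vehicle frame (degrees)."""
    yaw_deg: float    # horizontal gaze angle
    pitch_deg: float  # vertical gaze angle

@dataclass
class RoadUser:
    """One road user reported by the scene-perception system (direction in degrees, vehicle frame)."""
    track_id: int
    yaw_deg: float
    pitch_deg: float

def angular_offset_deg(gaze: GazeSample, user: RoadUser) -> float:
    """Angle between the gaze direction and the direction toward the road user."""
    g_yaw, g_pitch = math.radians(gaze.yaw_deg), math.radians(gaze.pitch_deg)
    u_yaw, u_pitch = math.radians(user.yaw_deg), math.radians(user.pitch_deg)
    # Convert both directions to unit vectors and take the angle between them.
    gx, gy, gz = (math.cos(g_pitch) * math.cos(g_yaw),
                  math.cos(g_pitch) * math.sin(g_yaw),
                  math.sin(g_pitch))
    ux, uy, uz = (math.cos(u_pitch) * math.cos(u_yaw),
                  math.cos(u_pitch) * math.sin(u_yaw),
                  math.sin(u_pitch))
    dot = max(-1.0, min(1.0, gx * ux + gy * uy + gz * uz))
    return math.degrees(math.acos(dot))

def fixated_road_users(gaze_samples, road_users, threshold_deg=2.0):
    """Return track IDs of road users that came within `threshold_deg` of any gaze sample."""
    fixated = set()
    for user in road_users:
        if any(angular_offset_deg(g, user) <= threshold_deg for g in gaze_samples):
            fixated.add(user.track_id)
    return fixated
```

For example, `fixated_road_users(samples, users, threshold_deg=2.0)` would return the set of road users fixated within 2°, and rerunning it with `threshold_deg=10.0` mirrors the wider gaze band discussed in the RESULTS.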
Main Authors: | Stapel, Jork; El Hassnaoui, Mounir; Happee, Riender |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | SAGE Publications, 2020 |
Subjects: | Sensory and Perceptual Processes |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9136390/ https://www.ncbi.nlm.nih.gov/pubmed/32993382 http://dx.doi.org/10.1177/0018720820959958 |
---|---|
author | Stapel, Jork; El Hassnaoui, Mounir; Happee, Riender |
collection | PubMed |
description | OBJECTIVE: To investigate how well gaze behavior can indicate driver awareness of individual road users when related to the vehicle’s road scene perception. BACKGROUND: An appropriate method is required to identify how driver gaze reveals awareness of other road users. METHOD: We developed a recognition-based method for labeling of driver situation awareness (SA) in a vehicle with road-scene perception and eye tracking. Thirteen drivers performed 91 left turns on complex urban intersections and identified images of encountered road users among distractor images. RESULTS: Drivers fixated within 2° for 72.8% of relevant and 27.8% of irrelevant road users and were able to recognize 36.1% of the relevant and 19.4% of irrelevant road users one min after leaving the intersection. Gaze behavior could predict road user relevance but not the outcome of the recognition task. Unexpectedly, 18% of road users observed beyond 10° were recognized. CONCLUSIONS: Despite suboptimal psychometric properties leading to low recognition rates, our recognition task could identify awareness of individual road users during left turn maneuvers. Perception occurred at gaze angles well beyond 2°, which means that fixation locations are insufficient for awareness monitoring. APPLICATION: Findings can be used in driver attention and awareness modelling, and design of gaze-based driver support systems. |
format | Online Article Text |
id | pubmed-9136390 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | SAGE Publications |
record_format | MEDLINE/PubMed |
spelling | Hum Factors, Sensory and Perceptual Processes. SAGE Publications; published online 2020-09-29, issue date 2022-06. Copyright © 2020, The Author(s). This article is distributed under the terms of the Creative Commons Attribution 4.0 License (https://creativecommons.org/licenses/by/4.0/), which permits any use, reproduction and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access pages (https://us.sagepub.com/en-us/nam/open-access-at-sage). |
title | Measuring Driver Perception: Combining Eye-Tracking and Automated Road Scene Perception |
topic | Sensory and Perceptual Processes |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9136390/ https://www.ncbi.nlm.nih.gov/pubmed/32993382 http://dx.doi.org/10.1177/0018720820959958 |