
An augmented reality sign-reading assistant for users with reduced vision

People typically rely heavily on visual information when finding their way to unfamiliar locations. For individuals with reduced vision, there are a variety of navigational tools available to assist with this task if needed. However, for wayfinding in unfamiliar indoor environments, the applicability of existing tools is limited. One potential approach to assist with this task is to enhance visual information about the location and content of existing signage in the environment. With this aim, we developed a prototype software application, which runs on a consumer head-mounted augmented reality (AR) device, to assist visually impaired users with sign-reading. The sign-reading assistant identifies real-world text (e.g., signs and room numbers) on command, highlights the text location, converts it to high-contrast AR lettering, and optionally reads the content aloud via text-to-speech. We assessed the usability of this application in a behavioral experiment. Participants with simulated visual impairment were asked to locate a particular office within a hallway, either with or without AR assistance (referred to as the AR group and control group, respectively). Subjective assessments indicated that participants in the AR group found the application helpful for this task, and an analysis of walking paths indicated that these participants took more direct routes compared to the control group. However, participants in the AR group also walked more slowly and took more time to complete the task than the control group. The results point to several specific future goals for usability and system performance in AR-based assistive tools.
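To illustrate the pipeline the abstract describes (on-command text detection, highlighting the text location, re-rendering, and optional text-to-speech), the following is a minimal Python sketch. It is not the authors' implementation: the OCR backend (pytesseract), the TTS backend (pyttsx3), and all function and variable names are assumptions for illustration; the actual prototype renders high-contrast lettering in a head-mounted AR display, which is collapsed here to console output.

# Minimal sketch of a sign-reading assistant pass (NOT the paper's code).
# Assumes pytesseract for OCR and pyttsx3 for offline text-to-speech.

from dataclasses import dataclass

import pytesseract          # OCR engine binding (assumed backend)
import pyttsx3              # offline text-to-speech (assumed backend)
from PIL import Image


@dataclass
class SignReading:
    text: str               # recognized sign content
    box: tuple              # (left, top, width, height) in image pixels


def detect_signs(frame: Image.Image, min_conf: float = 60.0) -> list[SignReading]:
    """Locate and read text regions in one camera frame."""
    data = pytesseract.image_to_data(frame, output_type=pytesseract.Output.DICT)
    readings = []
    for i, word in enumerate(data["text"]):
        # Keep only non-empty words above the confidence threshold.
        if word.strip() and float(data["conf"][i]) >= min_conf:
            box = (data["left"][i], data["top"][i],
                   data["width"][i], data["height"][i])
            readings.append(SignReading(word, box))
    return readings


def assist(frame: Image.Image, speak: bool = True) -> None:
    """One on-command pass: detect, 'highlight', optionally speak."""
    readings = detect_signs(frame)
    for r in readings:
        # The real system would draw high-contrast AR lettering anchored
        # at r.box; here we just report the location and content.
        print(f"Sign at {r.box}: {r.text}")
    if speak and readings:
        engine = pyttsx3.init()
        engine.say(" ".join(r.text for r in readings))
        engine.runAndWait()


if __name__ == "__main__":
    # Hypothetical input frame; any still image with legible text works.
    assist(Image.open("hallway_frame.png"))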

Bibliographic Details
Main Authors: Huang, Jonathan; Kinateder, Max; Dunn, Matt J.; Jarosz, Wojciech; Yang, Xing-Dong; Cooper, Emily A.
Format: Online Article Text
Language: English
Published: Public Library of Science, 2019-01-16
Journal: PLoS One
Subjects: Research Article
Collection: PubMed (National Center for Biotechnology Information)
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6334915/
https://www.ncbi.nlm.nih.gov/pubmed/30650159
http://dx.doi.org/10.1371/journal.pone.0210630
License: © 2019 Huang et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.