Rendering visual events as sounds: Spatial attention capture by auditory augmented reality

Bibliographic Details
Main Authors: Stone, Scott A., Tata, Matthew S.
Format: Online Article Text
Language: English
Published: Public Library of Science, 2017
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5549738/
https://www.ncbi.nlm.nih.gov/pubmed/28792518
http://dx.doi.org/10.1371/journal.pone.0182635
author Stone, Scott A.
Tata, Matthew S.
collection PubMed
description Many salient visual events tend to coincide with auditory events, such as seeing and hearing a car pass by. Information from the visual and auditory senses can be combined to create a stable percept of the stimulus, and having access to related, coincident visual and auditory information can help with spatial tasks such as localization. However, not all visual information has an analogous auditory percept; viewing a computer monitor, for example, produces no sound. Here, we describe a system capable of detecting salient visual events and rendering them as localizable auditory events. The system uses a neuromorphic camera (DAVIS 240B) to detect logarithmic changes of brightness intensity in the scene, which can be interpreted as salient visual events. Participants were blindfolded and asked to use the device to detect new objects in the scene, as well as to determine the direction of motion of a moving visual object. The results suggest the system is robust enough to allow for simple detection of new salient stimuli, as well as accurate encoding of the direction of visual motion. Future success is probable, as neuromorphic devices are likely to become faster and smaller, making this system much more feasible.
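The pipeline the abstract describes can be sketched concretely: a DVS-style event fires at a pixel when the log brightness there changes by more than a contrast threshold, |log I(t) - log I(t_ref)| > C, and the horizontal position of a cluster of events can be mapped to a stereo panning gain so the rendered sound appears to come from the matching direction. Below is a minimal sketch, assuming a frame-differencing approximation of the asynchronous sensor and a constant-power panning law; the threshold value, tone parameters, and pixel-to-azimuth mapping are illustrative assumptions, not the authors' implementation.

    import numpy as np

    # Assumed contrast threshold in log-intensity units; real DVS pixels have
    # a hardware-set threshold, and this value is purely illustrative.
    C = 0.3
    WIDTH = 240  # horizontal resolution of the DAVIS 240B sensor

    def events_from_frames(prev_frame, frame, threshold=C):
        """Return (xs, ys) of pixels whose log intensity changed by > threshold.

        A real neuromorphic camera emits events asynchronously per pixel;
        differencing two frames only approximates that behaviour.
        """
        eps = 1e-6  # guard against log(0)
        dlog = np.log(frame + eps) - np.log(prev_frame + eps)
        ys, xs = np.nonzero(np.abs(dlog) > threshold)
        return xs, ys

    def pan_gains(xs, width=WIDTH):
        """Map mean horizontal event position to constant-power stereo gains."""
        if len(xs) == 0:
            return 0.0, 0.0  # no events, no sound
        azimuth = np.mean(xs) / width        # 0.0 = far left, 1.0 = far right
        theta = azimuth * (np.pi / 2)
        return np.cos(theta), np.sin(theta)  # (left gain, right gain)

    def render_burst(xs, sr=44100, dur=0.05, freq=1000.0):
        """Render a short stereo tone burst panned toward the event cluster."""
        left, right = pan_gains(xs)
        t = np.arange(int(sr * dur)) / sr
        tone = np.sin(2 * np.pi * freq * t)
        return np.stack([left * tone, right * tone], axis=1)  # shape (n, 2)

Feeding successive frames through events_from_frames and playing each render_burst output would give a crude version of the attention-capture cue evaluated in the paper; the published system presumably renders richer spatial audio than simple stereo panning.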
format Online
Article
Text
id pubmed-5549738
institution National Center for Biotechnology Information
language English
publishDate 2017
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-5549738 2017-08-12 Rendering visual events as sounds: Spatial attention capture by auditory augmented reality Stone, Scott A. Tata, Matthew S. PLoS One Research Article Public Library of Science 2017-08-08 /pmc/articles/PMC5549738/ /pubmed/28792518 http://dx.doi.org/10.1371/journal.pone.0182635 Text en © 2017 Stone, Tata http://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
title Rendering visual events as sounds: Spatial attention capture by auditory augmented reality
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5549738/
https://www.ncbi.nlm.nih.gov/pubmed/28792518
http://dx.doi.org/10.1371/journal.pone.0182635