Touch engages visual spatial contextual processing
The spatial context in which we view a visual stimulus strongly determines how we perceive the stimulus. In the visual tilt illusion, the perceived orientation of a visual grating is affected by the orientation signals in its surrounding context. Conceivably, the spatial context in which a visual grating is perceived can be defined by interactive multisensory information rather than visual signals alone. Here, we tested the hypothesis that tactile signals engage the neural mechanisms supporting visual contextual modulation. Because tactile signals also convey orientation information and touch can selectively interact with visual orientation perception, we predicted that tactile signals would modulate the visual tilt illusion. We applied a bias-free method to measure the tilt illusion while testing visual-only, tactile-only or visuo-tactile contextual surrounds. We found that a tactile context can influence visual tilt perception. Moreover, combining visual and tactile orientation information in the surround results in a larger tilt illusion relative to the illusion achieved with the visual-only surround. These results demonstrate that the visual tilt illusion is subject to multisensory influences and imply that non-visual signals access the neural circuits whose computations underlie the contextual modulation of vision.
| Main Authors | Pérez-Bellido, Alexis; Pappal, Ryan D.; Yau, Jeffrey M. |
|---|---|
| Format | Online Article Text |
| Language | English |
| Published | Nature Publishing Group UK, 2018 |
| Subjects | Article |
| Online Access | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6226493/ https://www.ncbi.nlm.nih.gov/pubmed/30413736 http://dx.doi.org/10.1038/s41598-018-34810-z |
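A note on the method mentioned in the abstract: a "bias-free" tilt-illusion measurement is commonly quantified by fitting a psychometric function to the observer's orientation judgments in each surround condition and reading the illusion off as the shift of the point of subjective equality (PSE). The sketch below illustrates that generic analysis only; it is not the authors' actual procedure, and the orientations, response proportions, and condition names are hypothetical.

```python
# Illustrative sketch only (not the authors' code): quantify the tilt
# illusion as the PSE shift of a cumulative-Gaussian psychometric
# function fit per surround condition. All data below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(theta, pse, width):
    """P('tilted clockwise') as a function of test orientation (deg)."""
    return norm.cdf(theta, loc=pse, scale=width)

# Hypothetical test orientations (deg from vertical) and proportions of
# 'clockwise' responses in three surround conditions.
theta = np.array([-4.0, -2.0, -1.0, 0.0, 1.0, 2.0, 4.0])
p_clockwise = {
    "no_surround":   np.array([0.02, 0.10, 0.25, 0.50, 0.75, 0.90, 0.98]),
    "visual":        np.array([0.05, 0.20, 0.45, 0.70, 0.88, 0.96, 0.99]),
    "visuo_tactile": np.array([0.08, 0.30, 0.55, 0.80, 0.93, 0.98, 1.00]),
}

baseline = None
for condition, p in p_clockwise.items():
    (pse, width), _ = curve_fit(psychometric, theta, p, p0=[0.0, 1.0])
    if baseline is None:  # the first (no-surround) condition is the baseline
        baseline = pse
    # The illusion magnitude is the PSE shift relative to no surround.
    print(f"{condition:>13}: PSE = {pse:+.2f} deg, "
          f"illusion = {pse - baseline:+.2f} deg")
```

On this toy data the visuo-tactile surround yields the largest PSE shift, mirroring the abstract's finding that combining visual and tactile orientation information in the surround produces a larger illusion than the visual-only surround.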
| _version_ | 1783369954004828160 |
|---|---|
| author | Pérez-Bellido, Alexis; Pappal, Ryan D.; Yau, Jeffrey M. |
| collection | PubMed |
| description | The spatial context in which we view a visual stimulus strongly determines how we perceive the stimulus. In the visual tilt illusion, the perceived orientation of a visual grating is affected by the orientation signals in its surrounding context. Conceivably, the spatial context in which a visual grating is perceived can be defined by interactive multisensory information rather than visual signals alone. Here, we tested the hypothesis that tactile signals engage the neural mechanisms supporting visual contextual modulation. Because tactile signals also convey orientation information and touch can selectively interact with visual orientation perception, we predicted that tactile signals would modulate the visual tilt illusion. We applied a bias-free method to measure the tilt illusion while testing visual-only, tactile-only or visuo-tactile contextual surrounds. We found that a tactile context can influence visual tilt perception. Moreover, combining visual and tactile orientation information in the surround results in a larger tilt illusion relative to the illusion achieved with the visual-only surround. These results demonstrate that the visual tilt illusion is subject to multisensory influences and imply that non-visual signals access the neural circuits whose computations underlie the contextual modulation of vision. |
| format | Online Article Text |
| id | pubmed-6226493 |
| institution | National Center for Biotechnology Information |
| language | English |
| publishDate | 2018 |
| publisher | Nature Publishing Group UK |
| record_format | MEDLINE/PubMed |
| spelling | pubmed-6226493 (2018-11-13). Touch engages visual spatial contextual processing. Pérez-Bellido, Alexis; Pappal, Ryan D.; Yau, Jeffrey M. Sci Rep (Article). Nature Publishing Group UK, 2018-11-09. /pmc/articles/PMC6226493/ /pubmed/30413736 http://dx.doi.org/10.1038/s41598-018-34810-z. Text in English. © The Author(s) 2018. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as appropriate credit is given to the original author(s) and the source, a link to the license is provided, and any changes are indicated. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/. |
| title | Touch engages visual spatial contextual processing |
| topic | Article |
| url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6226493/ https://www.ncbi.nlm.nih.gov/pubmed/30413736 http://dx.doi.org/10.1038/s41598-018-34810-z |