Crossmodal learning of target-context associations: When would tactile context predict visual search?
It is well established that statistical learning of visual target locations in relation to constantly positioned visual distractors facilitates visual search. In the present study, we investigated whether such a contextual-cueing effect would also work crossmodally, from touch onto vision. Participa...
Main Authors: Chen, Siyi; Shi, Zhuanghua; Zang, Xuelian; Zhu, Xiuna; Assumpção, Leonardo; Müller, Hermann J.; Geyer, Thomas
Format: Online Article (Text)
Language: English
Published: Springer US, 2019
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7297845/
https://www.ncbi.nlm.nih.gov/pubmed/31845105
http://dx.doi.org/10.3758/s13414-019-01907-0
Similar Items

- Contextual cueing: implicit memory of tactile context facilitates tactile search
  by: Assumpção, Leonardo, et al.
  Published: (2015)
- Multisensory visuo-tactile context learning enhances the guidance of unisensory visual search
  by: Chen, Siyi, et al.
  Published: (2021)
- From Foreground to Background: How Task-Neutral Context Influences Contextual Cueing of Visual Search
  by: Zang, Xuelian, et al.
  Published: (2016)
- Influences of luminance contrast and ambient lighting on visual context learning and retrieval
  by: Zang, Xuelian, et al.
  Published: (2020)
- Interaction of Perceptual Grouping and Crossmodal Temporal Capture in Tactile Apparent-Motion
  by: Chen, Lihan, et al.
  Published: (2011)