Attention to colors induces surround suppression at category boundaries


Bibliographic Details

Main Authors: Fang, Ming W. H., Becker, Mark W., Liu, Taosheng
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6363742/
https://www.ncbi.nlm.nih.gov/pubmed/30723272
http://dx.doi.org/10.1038/s41598-018-37610-7
Description

Summary: We investigated how attention to a visual feature modulates representations of other features. The feature-similarity gain model predicts a graded modulation, whereas an alternative model asserts an inhibitory surround in feature space. Although evidence for both types of modulation can be found, a consensus has not emerged in the literature. Here, we aimed to reconcile these different views by systematically measuring how attention modulates color perception. Based on previous literature, we also predicted that color categories would impact attentional modulation. Our results showed that both surround suppression and feature-similarity gain modulate the perception of colors, but they operate on different similarity scales. Furthermore, the region of the suppressive surround coincided with the color category boundary, suggesting a categorical sharpening effect. We implemented a neural population coding model to explain the observed behavioral effects, which revealed a hitherto unknown connection between neural tuning shift and surround suppression.
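The two competing modulation profiles contrasted in the abstract can be sketched numerically. The following is an illustrative toy model, not the authors' implementation: feature-similarity gain is modeled as a monotonic Gaussian falloff with distance from the attended feature, while surround suppression is modeled as a difference of Gaussians that enhances near the attended feature and dips below baseline at intermediate distances. All parameter values (widths, amplitudes) are hypothetical.

```python
import numpy as np

def similarity_gain(d, sigma=30.0, amp=0.5):
    """Feature-similarity gain: graded, monotonic falloff with
    feature-space distance d (hypothetical width/amplitude)."""
    return 1.0 + amp * np.exp(-d**2 / (2 * sigma**2))

def surround_suppression(d, sigma_c=20.0, sigma_s=45.0,
                         amp=0.6, inhib=0.4):
    """Difference of Gaussians: gain > 1 near the attended feature,
    gain < 1 at intermediate distances (the inhibitory surround),
    returning to baseline (1.0) at large distances."""
    return (1.0 + amp * np.exp(-d**2 / (2 * sigma_c**2))
                - inhib * np.exp(-d**2 / (2 * sigma_s**2)))

# Distance from the attended color, e.g. in degrees of hue angle.
d = np.linspace(0, 180, 181)
g_fsg = similarity_gain(d)        # always >= baseline, graded decline
g_ss = surround_suppression(d)    # Mexican-hat: peak, dip, baseline
```

With these parameters the difference-of-Gaussians profile dips below baseline around intermediate feature distances, which is the qualitative signature of an inhibitory surround; the similarity-gain profile never falls below baseline. The abstract's finding is that both profiles are present but act on different similarity scales.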