Resolving the time course of visual and auditory object categorization
Humans can effortlessly categorize objects, both when they are conveyed through visual images and spoken words. To resolve the neural correlates of object categorization, studies have so far primarily focused on the visual modality. It is therefore still unclear how the brain extracts categorical information from auditory signals.
Main authors: Iamshchinina, Polina; Karapetian, Agnessa; Kaiser, Daniel; Cichy, Radoslaw M.
Format: Online Article Text
Language: English
Published: American Physiological Society, 2022
Subjects: Rapid Report
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9190735/ https://www.ncbi.nlm.nih.gov/pubmed/35583972 http://dx.doi.org/10.1152/jn.00515.2021
_version_ | 1784725856712654848
author | Iamshchinina, Polina Karapetian, Agnessa Kaiser, Daniel Cichy, Radoslaw M. |
author_facet | Iamshchinina, Polina Karapetian, Agnessa Kaiser, Daniel Cichy, Radoslaw M. |
author_sort | Iamshchinina, Polina |
collection | PubMed |
description | Humans can effortlessly categorize objects, both when they are conveyed through visual images and spoken words. To resolve the neural correlates of object categorization, studies have so far primarily focused on the visual modality. It is therefore still unclear how the brain extracts categorical information from auditory signals. In the current study, we used EEG (n = 48) and time-resolved multivariate pattern analysis to investigate 1) the time course with which object category information emerges in the auditory modality and 2) how the representational transition from individual object identification to category representation compares between the auditory modality and the visual modality. Our results show that 1) auditory object category representations can be reliably extracted from EEG signals and 2) a similar representational transition occurs in the visual and auditory modalities, where an initial representation at the individual-object level is followed by a subsequent representation of the objects’ category membership. Altogether, our results suggest an analogous hierarchy of information processing across sensory channels. However, there was no convergence toward conceptual modality-independent representations, thus providing no evidence for a shared supramodal code. NEW & NOTEWORTHY Object categorization operates on inputs from different sensory modalities, such as vision and audition. This process was mainly studied in vision. Here, we explore auditory object categorization. We show that auditory object category representations can be reliably extracted from EEG signals and, similar to vision, auditory representations initially carry information about individual objects, which is followed by a subsequent representation of the objects’ category membership. |
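For readers unfamiliar with time-resolved multivariate pattern analysis, the sketch below illustrates the general idea mentioned in the abstract: a classifier is trained and cross-validated separately at each time point of the EEG epoch, yielding a decoding-accuracy time course. This is not the authors' analysis code; the input names (`epochs`, `labels`), the linear SVM, and the 5-fold cross-validation are illustrative assumptions, and a real pipeline would include preprocessing and statistical testing not shown here.

```python
# Minimal sketch of time-resolved MVPA decoding (illustrative only; not the
# authors' pipeline). Assumes hypothetical inputs:
#   epochs: array of shape (n_trials, n_channels, n_timepoints)
#   labels: array of shape (n_trials,) with object-category labels
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

def decode_over_time(epochs, labels, n_folds=5):
    """Return mean cross-validated decoding accuracy at each time point."""
    n_trials, n_channels, n_timepoints = epochs.shape
    accuracies = np.zeros(n_timepoints)
    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    for t in range(n_timepoints):
        # Features at time t: one value per channel for every trial.
        X_t = epochs[:, :, t]
        scores = cross_val_score(clf, X_t, labels, cv=n_folds)
        accuracies[t] = scores.mean()
    return accuracies

# Example with random data, purely to show the expected shapes:
rng = np.random.default_rng(0)
epochs = rng.standard_normal((96, 64, 120))   # 96 trials, 64 channels, 120 time points
labels = rng.integers(0, 2, size=96)          # two hypothetical categories
accuracy_time_course = decode_over_time(epochs, labels)
```

Above-chance accuracy at a given latency would indicate that category information is linearly decodable from the EEG signal at that time, which is the general logic behind the decoding time courses described in the abstract.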
format | Online Article Text |
id | pubmed-9190735 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | American Physiological Society |
record_format | MEDLINE/PubMed |
spelling | pubmed-91907352022-07-06 Resolving the time course of visual and auditory object categorization Iamshchinina, Polina Karapetian, Agnessa Kaiser, Daniel Cichy, Radoslaw M. J Neurophysiol Rapid Report Humans can effortlessly categorize objects, both when they are conveyed through visual images and spoken words. To resolve the neural correlates of object categorization, studies have so far primarily focused on the visual modality. It is therefore still unclear how the brain extracts categorical information from auditory signals. In the current study, we used EEG (n = 48) and time-resolved multivariate pattern analysis to investigate 1) the time course with which object category information emerges in the auditory modality and 2) how the representational transition from individual object identification to category representation compares between the auditory modality and the visual modality. Our results show that 1) auditory object category representations can be reliably extracted from EEG signals and 2) a similar representational transition occurs in the visual and auditory modalities, where an initial representation at the individual-object level is followed by a subsequent representation of the objects’ category membership. Altogether, our results suggest an analogous hierarchy of information processing across sensory channels. However, there was no convergence toward conceptual modality-independent representations, thus providing no evidence for a shared supramodal code. NEW & NOTEWORTHY Object categorization operates on inputs from different sensory modalities, such as vision and audition. This process was mainly studied in vision. Here, we explore auditory object categorization. We show that auditory object category representations can be reliably extracted from EEG signals and, similar to vision, auditory representations initially carry information about individual objects, which is followed by a subsequent representation of the objects’ category membership. American Physiological Society 2022-06-01 2022-05-18 /pmc/articles/PMC9190735/ /pubmed/35583972 http://dx.doi.org/10.1152/jn.00515.2021 Text en Copyright © 2022 The Authors https://creativecommons.org/licenses/by/4.0/Licensed under Creative Commons Attribution CC-BY 4.0 (https://creativecommons.org/licenses/by/4.0/) . Published by the American Physiological Society. |
spellingShingle | Rapid Report Iamshchinina, Polina Karapetian, Agnessa Kaiser, Daniel Cichy, Radoslaw M. Resolving the time course of visual and auditory object categorization |
title | Resolving the time course of visual and auditory object categorization |
title_full | Resolving the time course of visual and auditory object categorization |
title_fullStr | Resolving the time course of visual and auditory object categorization |
title_full_unstemmed | Resolving the time course of visual and auditory object categorization |
title_short | Resolving the time course of visual and auditory object categorization |
title_sort | resolving the time course of visual and auditory object categorization |
topic | Rapid Report |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9190735/ https://www.ncbi.nlm.nih.gov/pubmed/35583972 http://dx.doi.org/10.1152/jn.00515.2021 |
work_keys_str_mv | AT iamshchininapolina resolvingthetimecourseofvisualandauditoryobjectcategorization AT karapetianagnessa resolvingthetimecourseofvisualandauditoryobjectcategorization AT kaiserdaniel resolvingthetimecourseofvisualandauditoryobjectcategorization AT cichyradoslawm resolvingthetimecourseofvisualandauditoryobjectcategorization |