Meta-Analyses Support a Taxonomic Model for Representations of Different Categories of Audio-Visual Interaction Events in the Human Brain
Our ability to perceive meaningful action events involving objects, people, and other animate agents is characterized in part by an interplay of visual and auditory sensory processing and their cross-modal interactions. However, this multisensory ability can be altered or dysfunctional in some hearing...
Main Authors: Csonka, Matt; Mardmomen, Nadia; Webster, Paula J.; Brefczynski-Lewis, Julie A.; Frum, Chris; Lewis, James W.
Format: Online Article Text
Language: English
Published: Oxford University Press, 2021
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7941256/ | https://www.ncbi.nlm.nih.gov/pubmed/33718874 | http://dx.doi.org/10.1093/texcom/tgab002
Similar Items
- Cortical Plasticity of Audio–Visual Object Representations
  by: Naumer, Marcus J., et al.
  Published: (2009)
- Divergent Human Cortical Regions for Processing Distinct Acoustic-Semantic Categories of Natural Sounds: Animal Action Sounds vs. Vocalizations
  by: Webster, Paula J., et al.
  Published: (2017)
- Breastfeeding Duration Is Associated with Regional, but Not Global, Differences in White Matter Tracts
  by: Bauer, Christopher E., et al.
  Published: (2019)
- Visual category representations in the infant brain
  by: Xie, Siying, et al.
  Published: (2022)
- Long-term memory representations for audio-visual scenes
  by: Meyerhoff, Hauke S., et al.
  Published: (2022)