Quantifying the speech-gesture relation with massive multimodal datasets: Informativity in time expressions
The development of large-scale corpora has led to a quantum leap in our understanding of speech in recent years. By contrast, the analysis of massive datasets has so far had a limited impact on the study of gesture and other visual communicative behaviors. We utilized the UCLA-Red Hen Lab multi-bill...
Main Authors: Pagán Cánovas, Cristóbal; Valenzuela, Javier; Alcaraz Carrión, Daniel; Olza, Inés; Ramscar, Michael
Format: Online Article Text
Language: English
Published: Public Library of Science, 2020
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7266323/
https://www.ncbi.nlm.nih.gov/pubmed/32484842
http://dx.doi.org/10.1371/journal.pone.0233892
Similar Items

- Editorial: Gesture-Speech Integration: Combining Gesture and Speech to Create Understanding
  By: Sweller, Naomi, et al.
  Published: (2021)
- When Gesture “Takes Over”: Speech-Embedded Nonverbal Depictions in Multimodal Interaction
  By: Hsu, Hui-Chieh, et al.
  Published: (2021)
- Attention to Speech-Accompanying Gestures: Eye Movements and Information Uptake
  By: Gullberg, Marianne, et al.
  Published: (2009)
- Supramodal neural processing of abstract information conveyed by speech and gesture
  By: Straube, Benjamin, et al.
  Published: (2013)
- UMONS-TAICHI: A multimodal motion capture dataset of expertise in Taijiquan gestures
  By: Tits, Mickaël, et al.
  Published: (2018)