
An Information Theoretic Characterisation of Auditory Encoding

Bibliographic Details
Main Authors: Overath, Tobias, Cusack, Rhodri, Kumar, Sukhbinder, von Kriegstein, Katharina, Warren, Jason D, Grube, Manon, Carlyon, Robert P, Griffiths, Timothy D
Format: Text
Language: English
Published: Public Library of Science 2007
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2039771/
https://www.ncbi.nlm.nih.gov/pubmed/17958472
http://dx.doi.org/10.1371/journal.pbio.0050288
Description
Summary: The entropy metric derived from information theory provides a means to quantify the amount of information transmitted in acoustic streams like speech or music. By systematically varying the entropy of pitch sequences, we sought brain areas where neural activity and energetic demands increase as a function of entropy. Such a relationship is predicted to occur in an efficient encoding mechanism that uses less computational resource when less information is present in the signal: we specifically tested the hypothesis that such a relationship is present in the planum temporale (PT). In two convergent functional MRI studies, we demonstrated this relationship in PT for encoding, while furthermore showing that a distributed fronto-parietal network for retrieval of acoustic information is independent of entropy. The results establish PT as an efficient neural engine that demands less computational resource to encode redundant signals than those with high information content.
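For reference (not part of the published record), the entropy measure described in the summary corresponds to the standard Shannon entropy of a distribution over discrete values. A minimal sketch, assuming pitch sequences are represented as lists of discrete pitch labels, might look like:

    import math
    from collections import Counter

    def shannon_entropy(pitch_sequence):
        """Shannon entropy (in bits) of the empirical pitch distribution."""
        counts = Counter(pitch_sequence)
        total = len(pitch_sequence)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    # A repetitive (redundant) sequence carries less information than a varied one.
    print(shannon_entropy(["A", "A", "A", "A"]))  # 0.0 bits
    print(shannon_entropy(["A", "B", "C", "D"]))  # 2.0 bits

This is only an illustrative computation of entropy over pitch labels; the specific stimulus construction and entropy manipulation used in the study are described in the full article.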