Electrical brain imaging evidences left auditory cortex involvement in speech and non-speech discrimination based on temporal features
Main Authors: | Zaehle, Tino; Jancke, Lutz; Meyer, Martin |
---|---|
Format: | Text |
Language: | English |
Published: | BioMed Central, 2007 |
Subjects: | Research |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2231369/ https://www.ncbi.nlm.nih.gov/pubmed/18070338 http://dx.doi.org/10.1186/1744-9081-3-63 |
author | Zaehle, Tino; Jancke, Lutz; Meyer, Martin |
---|---|
collection | PubMed |
description | BACKGROUND: Speech perception is based on a variety of spectral and temporal acoustic features available in the acoustic signal. Voice-onset time (VOT) is considered a cue that is cardinal for phonetic perception. METHODS: In the present study, we recorded and compared scalp auditory evoked potentials (AEP) in response to consonant-vowel (CV) syllables with varying voice-onset times (VOT) and non-speech analogues with varying noise-onset times (NOT). In particular, we aimed to investigate the spatio-temporal pattern of acoustic feature processing underlying elemental speech perception and to relate this temporal processing mechanism to specific activations of the auditory cortex. RESULTS: Results show that the characteristic AEP waveform in response to consonant-vowel syllables is on a par with that of non-speech sounds with analogous temporal characteristics. The amplitudes of the N1a and N1b components of the auditory evoked potentials correlated significantly with the duration of the VOT in CV syllables and, likewise, with the duration of the NOT in non-speech sounds. Furthermore, current density maps indicate overlapping supratemporal networks involved in the perception of both speech and non-speech sounds, with a bilateral activation pattern during the N1a time window and leftward asymmetry during the N1b time window. Regional statistical analysis of the activation over the middle and posterior portions of the supratemporal plane (STP) revealed strongly left-lateralized responses over the middle STP for both the N1a and N1b components, and a functional leftward asymmetry over the posterior STP for the N1b component. CONCLUSION: The present data demonstrate overlapping spatio-temporal brain responses during the perception of temporal acoustic cues in both speech and non-speech sounds. Source estimation evidences a preponderant role of the left middle and posterior auditory cortex in speech and non-speech discrimination based on temporal features. Therefore, in congruence with recent fMRI studies, we suggest that similar mechanisms underlie the perception of linguistically different but acoustically equivalent auditory events at the level of basic auditory analysis. |
format | Text |
id | pubmed-2231369 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2007 |
publisher | BioMed Central |
record_format | MEDLINE/PubMed |
spelling | pubmed-2231369 2008-02-06 Electrical brain imaging evidences left auditory cortex involvement in speech and non-speech discrimination based on temporal features Zaehle, Tino; Jancke, Lutz; Meyer, Martin Behav Brain Funct Research BioMed Central 2007-12-10 /pmc/articles/PMC2231369/ /pubmed/18070338 http://dx.doi.org/10.1186/1744-9081-3-63 Text en Copyright © 2007 Zaehle et al; licensee BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. |
title | Electrical brain imaging evidences left auditory cortex involvement in speech and non-speech discrimination based on temporal features |
topic | Research |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2231369/ https://www.ncbi.nlm.nih.gov/pubmed/18070338 http://dx.doi.org/10.1186/1744-9081-3-63 |