A Psychophysical Imaging Method Evidencing Auditory Cue Extraction during Speech Perception: A Group Analysis of Auditory Classification Images
Although there is a large consensus regarding the involvement of specific acoustic cues in speech perception, the precise mechanisms underlying the transformation from continuous acoustical properties into discrete perceptual units remain undetermined. This gap in knowledge is partially due to the...
Main Authors: | Varnet, Léo; Knoblauch, Kenneth; Serniclaes, Willy; Meunier, Fanny; Hoen, Michel |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Public Library of Science, 2015 |
Subjects: | Research Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4364617/ https://www.ncbi.nlm.nih.gov/pubmed/25781470 http://dx.doi.org/10.1371/journal.pone.0118009 |
_version_ | 1782362092920111104 |
---|---|
author | Varnet, Léo Knoblauch, Kenneth Serniclaes, Willy Meunier, Fanny Hoen, Michel |
author_facet | Varnet, Léo Knoblauch, Kenneth Serniclaes, Willy Meunier, Fanny Hoen, Michel |
author_sort | Varnet, Léo |
collection | PubMed |
description | Although there is a large consensus regarding the involvement of specific acoustic cues in speech perception, the precise mechanisms underlying the transformation from continuous acoustical properties into discrete perceptual units remain undetermined. This gap in knowledge is partially due to the lack of a turnkey solution for isolating critical speech cues from natural stimuli. In this paper, we describe a psychoacoustic imaging method known as the Auditory Classification Image technique that allows experimenters to estimate the relative importance of time-frequency regions in categorizing natural speech utterances in noise. Importantly, this technique enables the testing of hypotheses on the listening strategies of participants at the group level. We exemplify this approach by identifying the acoustic cues involved in da/ga categorization with two phonetic contexts, Al- or Ar-. The application of Auditory Classification Images to our group of 16 participants revealed significant critical regions on the second and third formant onsets, as predicted by the literature, as well as an unexpected temporal cue on the first formant. Finally, through a cluster-based nonparametric test, we demonstrate that this method is sufficiently sensitive to detect fine modifications of the classification strategies between different utterances of the same phoneme. |
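As a rough illustration of the general idea behind a classification-image analysis (not the authors' actual pipeline, which is described in the article itself), the sketch below shows how a weight map over time-frequency points can be estimated from per-trial noise fields and binary responses using a penalized logistic regression. All variable names, dimensions, and the simulated data are hypothetical.

```python
# Minimal illustrative sketch of a classification-image estimate.
# Assumptions (not from the article): each trial's noise is summarised as a
# time-frequency matrix, the listener's response is binary (e.g. "da" vs "ga"),
# and a plain L2-penalized GLM stands in for the paper's estimation method.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_trials, n_freq, n_time = 5000, 32, 40                  # hypothetical dimensions
noise = rng.normal(size=(n_trials, n_freq, n_time))      # per-trial noise spectrograms

# Hypothetical "true" listening template: responses are biased by the noise
# energy falling in one time-frequency region.
template = np.zeros((n_freq, n_time))
template[10:14, 5:10] = 1.0
p = 1.0 / (1.0 + np.exp(-(noise * template).sum(axis=(1, 2))))
responses = rng.binomial(1, p)                            # simulated binary categorizations

# Fit a penalized GLM: the fitted weight map plays the role of the
# classification image, indicating which regions drove the responses.
X = noise.reshape(n_trials, -1)
glm = LogisticRegression(penalty="l2", C=0.1, max_iter=1000).fit(X, responses)
aci = glm.coef_.reshape(n_freq, n_time)                   # estimated classification image
```

In practice, the weight map recovered this way should peak over the region defined by `template`; the article's method additionally addresses smoothness constraints and group-level statistics, which this sketch does not attempt to reproduce.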
format | Online Article Text |
id | pubmed-4364617 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2015 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-43646172015-03-23 A Psychophysical Imaging Method Evidencing Auditory Cue Extraction during Speech Perception: A Group Analysis of Auditory Classification Images Varnet, Léo Knoblauch, Kenneth Serniclaes, Willy Meunier, Fanny Hoen, Michel PLoS One Research Article Although there is a large consensus regarding the involvement of specific acoustic cues in speech perception, the precise mechanisms underlying the transformation from continuous acoustical properties into discrete perceptual units remain undetermined. This gap in knowledge is partially due to the lack of a turnkey solution for isolating critical speech cues from natural stimuli. In this paper, we describe a psychoacoustic imaging method known as the Auditory Classification Image technique that allows experimenters to estimate the relative importance of time-frequency regions in categorizing natural speech utterances in noise. Importantly, this technique enables the testing of hypotheses on the listening strategies of participants at the group level. We exemplify this approach by identifying the acoustic cues involved in da/ga categorization with two phonetic contexts, Al- or Ar-. The application of Auditory Classification Images to our group of 16 participants revealed significant critical regions on the second and third formant onsets, as predicted by the literature, as well as an unexpected temporal cue on the first formant. Finally, through a cluster-based nonparametric test, we demonstrate that this method is sufficiently sensitive to detect fine modifications of the classification strategies between different utterances of the same phoneme. Public Library of Science 2015-03-17 /pmc/articles/PMC4364617/ /pubmed/25781470 http://dx.doi.org/10.1371/journal.pone.0118009 Text en © 2015 Varnet et al http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited. |
spellingShingle | Research Article Varnet, Léo Knoblauch, Kenneth Serniclaes, Willy Meunier, Fanny Hoen, Michel A Psychophysical Imaging Method Evidencing Auditory Cue Extraction during Speech Perception: A Group Analysis of Auditory Classification Images |
title | A Psychophysical Imaging Method Evidencing Auditory Cue Extraction during Speech Perception: A Group Analysis of Auditory Classification Images |
title_full | A Psychophysical Imaging Method Evidencing Auditory Cue Extraction during Speech Perception: A Group Analysis of Auditory Classification Images |
title_fullStr | A Psychophysical Imaging Method Evidencing Auditory Cue Extraction during Speech Perception: A Group Analysis of Auditory Classification Images |
title_full_unstemmed | A Psychophysical Imaging Method Evidencing Auditory Cue Extraction during Speech Perception: A Group Analysis of Auditory Classification Images |
title_short | A Psychophysical Imaging Method Evidencing Auditory Cue Extraction during Speech Perception: A Group Analysis of Auditory Classification Images |
title_sort | psychophysical imaging method evidencing auditory cue extraction during speech perception: a group analysis of auditory classification images |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4364617/ https://www.ncbi.nlm.nih.gov/pubmed/25781470 http://dx.doi.org/10.1371/journal.pone.0118009 |
work_keys_str_mv | AT varnetleo apsychophysicalimagingmethodevidencingauditorycueextractionduringspeechperceptionagroupanalysisofauditoryclassificationimages AT knoblauchkenneth apsychophysicalimagingmethodevidencingauditorycueextractionduringspeechperceptionagroupanalysisofauditoryclassificationimages AT serniclaeswilly apsychophysicalimagingmethodevidencingauditorycueextractionduringspeechperceptionagroupanalysisofauditoryclassificationimages AT meunierfanny apsychophysicalimagingmethodevidencingauditorycueextractionduringspeechperceptionagroupanalysisofauditoryclassificationimages AT hoenmichel apsychophysicalimagingmethodevidencingauditorycueextractionduringspeechperceptionagroupanalysisofauditoryclassificationimages AT varnetleo psychophysicalimagingmethodevidencingauditorycueextractionduringspeechperceptionagroupanalysisofauditoryclassificationimages AT knoblauchkenneth psychophysicalimagingmethodevidencingauditorycueextractionduringspeechperceptionagroupanalysisofauditoryclassificationimages AT serniclaeswilly psychophysicalimagingmethodevidencingauditorycueextractionduringspeechperceptionagroupanalysisofauditoryclassificationimages AT meunierfanny psychophysicalimagingmethodevidencingauditorycueextractionduringspeechperceptionagroupanalysisofauditoryclassificationimages AT hoenmichel psychophysicalimagingmethodevidencingauditorycueextractionduringspeechperceptionagroupanalysisofauditoryclassificationimages |