Visual-Tactile Speech Perception and the Autism Quotient
Multisensory information is integrated asymmetrically in speech perception: An audio signal can follow video by 240ms, but can precede video by only 60ms, without disrupting the sense of synchronicity (Munhall et al., 1996). Similarly, air flow can follow either audio (Gick et al., 2010) or video (Bicevskis et al., 2016) by a much larger margin than it can precede either while remaining perceptually synchronous. These asymmetric windows of integration have been attributed to the physical properties of the signals; light travels faster than sound (Munhall et al., 1996), and sound travels faster than air flow (Gick et al., 2010). Perceptual windows of integration narrow during development (Hillock-Dunn and Wallace, 2012), but remain wider among people with autism (Wallace and Stevenson, 2014). Here we show that, even among neurotypical adult perceivers, visual-tactile windows of integration are wider and flatter the higher the participant’s Autism Quotient (AQ) (Baron-Cohen et al., 2001), a self-report measure of autistic traits. As “pa” is produced with a tiny burst of aspiration (Derrick et al., 2009), we applied light and inaudible air puffs to participants’ necks while they watched silent videos of a person saying “ba” or “pa,” with puffs presented both synchronously and at varying degrees of asynchrony relative to the recorded plosive release burst, which itself is time-aligned to visible lip opening. All syllables seen along with cutaneous air puffs were more likely to be perceived as “pa.” Syllables were perceived as “pa” most often when the air puff occurred 50–100ms after lip opening, with decaying probability as asynchrony increased. Integration was less dependent on time-alignment the higher the participant’s AQ. Perceivers integrate event-relevant tactile information in visual speech perception with greater reliance upon event-related accuracy the more they self-describe as neurotypical, supporting the Happé and Frith (2006) weak coherence account of autism spectrum disorder (ASD).
Main Authors: Derrick, Donald; Bicevskis, Katie; Gick, Bryan
Format: Online Article Text
Language: English
Published: 2019
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8802876/ https://www.ncbi.nlm.nih.gov/pubmed/35106291 http://dx.doi.org/10.3389/fcomm.2018.00061
_version_ | 1784642763665440768
author | Derrick, Donald; Bicevskis, Katie; Gick, Bryan
author_facet | Derrick, Donald; Bicevskis, Katie; Gick, Bryan
author_sort | Derrick, Donald |
collection | PubMed |
description | Multisensory information is integrated asymmetrically in speech perception: An audio signal can follow video by 240ms, but can precede video by only 60ms, without disrupting the sense of synchronicity (Munhall et al., 1996). Similarly, air flow can follow either audio (Gick et al., 2010) or video (Bicevskis et al., 2016) by a much larger margin than it can precede either while remaining perceptually synchronous. These asymmetric windows of integration have been attributed to the physical properties of the signals; light travels faster than sound (Munhall et al., 1996), and sound travels faster than air flow (Gick et al., 2010). Perceptual windows of integration narrow during development (Hillock-Dunn and Wallace, 2012), but remain wider among people with autism (Wallace and Stevenson, 2014). Here we show that, even among neurotypical adult perceivers, visual-tactile windows of integration are wider and flatter the higher the participant’s Autism Quotient (AQ) (Baron-Cohen et al., 2001), a self-report measure of autistic traits. As “pa” is produced with a tiny burst of aspiration (Derrick et al., 2009), we applied light and inaudible air puffs to participants’ necks while they watched silent videos of a person saying “ba” or “pa,” with puffs presented both synchronously and at varying degrees of asynchrony relative to the recorded plosive release burst, which itself is time-aligned to visible lip opening. All syllables seen along with cutaneous air puffs were more likely to be perceived as “pa.” Syllables were perceived as “pa” most often when the air puff occurred 50–100ms after lip opening, with decaying probability as asynchrony increased. Integration was less dependent on time-alignment the higher the participant’s AQ. 
Perceivers integrate event-relevant tactile information in visual speech perception with greater reliance upon event-related accuracy the more they self-describe as neurotypical, supporting the Happé and Frith (2006) weak coherence account of autism spectrum disorder (ASD). |
format | Online Article Text |
id | pubmed-8802876 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
record_format | MEDLINE/PubMed |
spelling | pubmed-8802876 2022-01-31 Visual-Tactile Speech Perception and the Autism Quotient Derrick, Donald; Bicevskis, Katie; Gick, Bryan Front Commun (Lausanne) Article 2019-01 2019-01-07 /pmc/articles/PMC8802876/ /pubmed/35106291 http://dx.doi.org/10.3389/fcomm.2018.00061 Text en https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Article Derrick, Donald Bicevskis, Katie Gick, Bryan Visual-Tactile Speech Perception and the Autism Quotient |
title | Visual-Tactile Speech Perception and the Autism Quotient |
title_full | Visual-Tactile Speech Perception and the Autism Quotient |
title_fullStr | Visual-Tactile Speech Perception and the Autism Quotient |
title_full_unstemmed | Visual-Tactile Speech Perception and the Autism Quotient |
title_short | Visual-Tactile Speech Perception and the Autism Quotient |
title_sort | visual-tactile speech perception and the autism quotient |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8802876/ https://www.ncbi.nlm.nih.gov/pubmed/35106291 http://dx.doi.org/10.3389/fcomm.2018.00061 |
work_keys_str_mv | AT derrickdonald visualtactilespeechperceptionandtheautismquotient AT bicevskiskatie visualtactilespeechperceptionandtheautismquotient AT gickbryan visualtactilespeechperceptionandtheautismquotient |