A supramodal and conceptual representation of subsecond time revealed with perceptual learning of temporal interval discrimination
Main Authors: | Xiong, Ying-Zi; Guan, Shu-Chen; Yu, Cong |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK, 2022 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9226181/ https://www.ncbi.nlm.nih.gov/pubmed/35739220 http://dx.doi.org/10.1038/s41598-022-14698-6 |
_version_ | 1784733801443753984 |
---|---|
author | Xiong, Ying-Zi Guan, Shu-Chen Yu, Cong |
author_facet | Xiong, Ying-Zi Guan, Shu-Chen Yu, Cong |
author_sort | Xiong, Ying-Zi |
collection | PubMed |
description | Subsecond time perception has been frequently attributed to modality-specific timing mechanisms that would predict no cross-modal transfer of temporal perceptual learning. In fact, perceptual learning of temporal interval discrimination (TID) reportedly shows either no cross-modal transfer, or asymmetric transfer from audition to vision, but not vice versa. However, here we demonstrate complete cross-modal transfer of auditory and visual TID learning using a double training paradigm. Specifically, visual TID learning transfers to and optimizes auditory TID when the participants also receive exposure to the auditory temporal interval by practicing a functionally orthogonal near-threshold tone frequency discrimination task at the same trained interval. Auditory TID learning also transfers to and optimizes visual TID with additional practice of an orthogonal near-threshold visual contrast discrimination task at the same trained interval. Practicing these functionally orthogonal tasks per se has no impact on TID thresholds. We interpret the transfer results as indications of a supramodal representation of subsecond time. Moreover, because TID learning shows complete transfer between modalities with vastly different temporal precisions, the subsecond time representation must be conceptual. Double training may refine this supramodal and conceptual subsecond time representation and connect it to a new sense to improve time perception. |
format | Online Article Text |
id | pubmed-9226181 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-9226181 2022-06-25 A supramodal and conceptual representation of subsecond time revealed with perceptual learning of temporal interval discrimination Xiong, Ying-Zi Guan, Shu-Chen Yu, Cong Sci Rep Article Subsecond time perception has been frequently attributed to modality-specific timing mechanisms that would predict no cross-modal transfer of temporal perceptual learning. In fact, perceptual learning of temporal interval discrimination (TID) reportedly shows either no cross-modal transfer, or asymmetric transfer from audition to vision, but not vice versa. However, here we demonstrate complete cross-modal transfer of auditory and visual TID learning using a double training paradigm. Specifically, visual TID learning transfers to and optimizes auditory TID when the participants also receive exposure to the auditory temporal interval by practicing a functionally orthogonal near-threshold tone frequency discrimination task at the same trained interval. Auditory TID learning also transfers to and optimizes visual TID with additional practice of an orthogonal near-threshold visual contrast discrimination task at the same trained interval. Practicing these functionally orthogonal tasks per se has no impact on TID thresholds. We interpret the transfer results as indications of a supramodal representation of subsecond time. Moreover, because TID learning shows complete transfer between modalities with vastly different temporal precisions, the subsecond time representation must be conceptual. Double training may refine this supramodal and conceptual subsecond time representation and connect it to a new sense to improve time perception. Nature Publishing Group UK 2022-06-23 /pmc/articles/PMC9226181/ /pubmed/35739220 http://dx.doi.org/10.1038/s41598-022-14698-6 Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/ Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Xiong, Ying-Zi Guan, Shu-Chen Yu, Cong A supramodal and conceptual representation of subsecond time revealed with perceptual learning of temporal interval discrimination |
title | A supramodal and conceptual representation of subsecond time revealed with perceptual learning of temporal interval discrimination |
title_full | A supramodal and conceptual representation of subsecond time revealed with perceptual learning of temporal interval discrimination |
title_fullStr | A supramodal and conceptual representation of subsecond time revealed with perceptual learning of temporal interval discrimination |
title_full_unstemmed | A supramodal and conceptual representation of subsecond time revealed with perceptual learning of temporal interval discrimination |
title_short | A supramodal and conceptual representation of subsecond time revealed with perceptual learning of temporal interval discrimination |
title_sort | supramodal and conceptual representation of subsecond time revealed with perceptual learning of temporal interval discrimination |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9226181/ https://www.ncbi.nlm.nih.gov/pubmed/35739220 http://dx.doi.org/10.1038/s41598-022-14698-6 |
work_keys_str_mv | AT xiongyingzi asupramodalandconceptualrepresentationofsubsecondtimerevealedwithperceptuallearningoftemporalintervaldiscrimination AT guanshuchen asupramodalandconceptualrepresentationofsubsecondtimerevealedwithperceptuallearningoftemporalintervaldiscrimination AT yucong asupramodalandconceptualrepresentationofsubsecondtimerevealedwithperceptuallearningoftemporalintervaldiscrimination AT xiongyingzi supramodalandconceptualrepresentationofsubsecondtimerevealedwithperceptuallearningoftemporalintervaldiscrimination AT guanshuchen supramodalandconceptualrepresentationofsubsecondtimerevealedwithperceptuallearningoftemporalintervaldiscrimination AT yucong supramodalandconceptualrepresentationofsubsecondtimerevealedwithperceptuallearningoftemporalintervaldiscrimination |
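As an illustrative aside (not part of the catalog record itself), the same bibliographic metadata listed in the url field can be retrieved programmatically from NCBI's E-utilities. The sketch below queries the documented `esummary` endpoint with the PubMed ID 35739220 that appears in this record; the specific JSON keys printed ("title", "authors", "source", "pubdate") are assumed from typical esummary output and may need adjusting.

```python
# Minimal sketch: fetch summary metadata for this record (PMID 35739220)
# from the NCBI E-utilities esummary endpoint. The db/id/retmode parameters
# are standard; the JSON keys accessed below are assumptions based on
# typical esummary JSON responses.
import json
import urllib.parse
import urllib.request

PMID = "35739220"  # PubMed ID listed in this record

params = urllib.parse.urlencode({
    "db": "pubmed",
    "id": PMID,
    "retmode": "json",
})
url = f"https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi?{params}"

with urllib.request.urlopen(url) as resp:
    summary = json.load(resp)

doc = summary["result"][PMID]
print(doc.get("title"))
print("; ".join(a.get("name", "") for a in doc.get("authors", [])))
print(doc.get("source"), doc.get("pubdate"))
```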