Cross-modal metacognition: Visual and tactile confidence share a common scale

Humans can judge the quality of their perceptual decisions—an ability known as perceptual confidence. Previous work suggested that confidence can be evaluated on an abstract scale that can be sensory modality-independent or even domain-general. However, evidence is still scarce on whether confidence judgments can be directly made across visual and tactile decisions. Here, we investigated in a sample of 56 adults whether visual and tactile confidence share a common scale by measuring visual contrast and vibrotactile discrimination thresholds in a confidence-forced choice paradigm. Confidence judgments were made about the correctness of the perceptual decision between two trials involving either the same or different modalities. To estimate confidence efficiency, we compared discrimination thresholds obtained from all trials to those from trials judged to be relatively more confident. We found evidence for metaperception because higher confidence was associated with better perceptual performance in both modalities. Importantly, participants were able to judge their confidence across modalities without any costs in metaperceptual sensitivity and only minor changes in response times compared to unimodal confidence judgments. In addition, we were able to predict cross-modal confidence well from unimodal judgments. In conclusion, our findings show that perceptual confidence is computed on an abstract scale and that it can assess the quality of our decisions across sensory modalities.
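
The confidence-forced-choice logic summarized in the abstract can be illustrated with a small simulation: if confidence tracks the quality of each decision, trials selected as relatively more confident should show better discrimination performance than the full set of trials. The Python sketch below assumes a simple Gaussian observer with added metacognitive noise; the parameter values, the d'-based summary, and the observer model are illustrative assumptions and do not reproduce the study's confidence-efficiency analysis.

    # Illustrative sketch only (not the authors' analysis code): a Gaussian observer
    # performs two-alternative discriminations; in each pair of trials the one judged
    # with higher confidence is selected, mimicking a confidence-forced choice.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    n_pairs = 20_000                      # number of trial pairs (assumed)
    sigma_sensory, sigma_meta = 1.5, 0.5  # noise levels (assumed)

    # Each pair holds two trials; the signal is +1 or -1 (e.g., which interval is stronger).
    signal = rng.choice([-1.0, 1.0], size=(n_pairs, 2))
    evidence = signal + rng.normal(0.0, sigma_sensory, size=(n_pairs, 2))
    correct = np.sign(evidence) == signal          # perceptual decision outcome

    # Confidence read-out: evidence strength corrupted by extra metacognitive noise.
    confidence = np.abs(evidence) + rng.normal(0.0, sigma_meta, size=(n_pairs, 2))
    chosen = confidence.argmax(axis=1)             # confidence-forced choice per pair

    p_all = correct.mean()
    p_chosen = correct[np.arange(n_pairs), chosen].mean()

    def dprime_2afc(p):
        # Sensitivity implied by proportion correct in a two-alternative task.
        return np.sqrt(2.0) * norm.ppf(p)

    # If confidence tracks decision quality, the confidence-chosen trials show better
    # performance than the full trial set -- the signature of metaperception.
    print(f"all trials:    {p_all:.3f} correct (d' = {dprime_2afc(p_all):.2f})")
    print(f"chosen trials: {p_chosen:.3f} correct (d' = {dprime_2afc(p_chosen):.2f})")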

Bibliographic Details
Main Authors: Klever, Lena; Beyvers, Marie Christin; Fiehler, Katja; Mamassian, Pascal; Billino, Jutta
Format: Online Article (Text)
Language: English
Published: The Association for Research in Vision and Ophthalmology, 2023
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10166118/
https://www.ncbi.nlm.nih.gov/pubmed/37140913
http://dx.doi.org/10.1167/jov.23.5.3
Collection: PubMed
Record ID: pubmed-10166118
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: J Vis (Journal of Vision)
Publication Date: 2023-05-04
Rights: Copyright 2023 The Authors. This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (https://creativecommons.org/licenses/by-nc-nd/4.0/).