Crossmodal Pattern Discrimination in Humans and Robots: A Visuo-Tactile Case Study
Main Authors: Higgen, Focko L.; Ruppel, Philipp; Görner, Michael; Kerzel, Matthias; Hendrich, Norman; Feldheim, Jan; Wermter, Stefan; Zhang, Jianwei; Gerloff, Christian
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2020
Subjects: Robotics and AI
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7805622/ https://www.ncbi.nlm.nih.gov/pubmed/33501309 http://dx.doi.org/10.3389/frobt.2020.540565
_version_ | 1783636341388476416 |
author | Higgen, Focko L. Ruppel, Philipp Görner, Michael Kerzel, Matthias Hendrich, Norman Feldheim, Jan Wermter, Stefan Zhang, Jianwei Gerloff, Christian |
author_facet | Higgen, Focko L. Ruppel, Philipp Görner, Michael Kerzel, Matthias Hendrich, Norman Feldheim, Jan Wermter, Stefan Zhang, Jianwei Gerloff, Christian |
author_sort | Higgen, Focko L. |
collection | PubMed |
description | The quality of crossmodal perception hinges on two factors: the accuracy of the independent unimodal percepts and the ability to integrate information from different sensory systems. In humans, the capacity for cognitively demanding crossmodal perception declines from young to old age. Here, we propose a new approach to investigate to what degree these factors contribute to crossmodal processing and its age-related decline, by replicating a medical study on visuo-tactile crossmodal pattern discrimination using state-of-the-art tactile sensing technology and artificial neural networks (ANNs). We implemented two ANN models to focus specifically on the relevance of early integration of sensory information in the crossmodal processing stream, a mechanism proposed to underlie efficient processing in the human brain. Applying an adaptive staircase procedure, we approached comparable unimodal classification performance for both modalities in the human participants as well as in the ANNs. This allowed us to compare crossmodal performance between and within the systems, independent of the underlying unimodal processes. Our data show that the unimodal classification accuracies of the tactile sensing technology are comparable to those of humans. For crossmodal discrimination by the ANNs, integrating high-level unimodal features at earlier stages of the crossmodal processing stream yields higher accuracies than the late integration of independent unimodal classifications. Compared to humans, the ANNs achieve higher accuracies than older participants in both the unimodal and the crossmodal condition, but lower accuracies than younger participants in the crossmodal task. Taken together, we show that state-of-the-art tactile sensing technology can perform a complex tactile recognition task at levels comparable to humans. For crossmodal processing, human-inspired early sensory integration appears to improve the performance of artificial neural networks. Still, younger participants seem to employ more efficient crossmodal integration mechanisms than those modeled in the proposed ANNs. Our work demonstrates how collaborative research in neuroscience and embodied artificial neurocognitive modeling can help derive models that inform the design of future neurocomputational architectures. |
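The early- vs late-fusion contrast described in the abstract can be sketched with toy NumPy classifiers. This is a minimal illustration of the general idea only, not the authors' architecture: all weight matrices, layer sizes, and function names below are invented assumptions. Early fusion concatenates high-level unimodal features before a joint classifier; late fusion averages the outputs of two independent unimodal classifiers.

```python
import numpy as np

rng = np.random.default_rng(0)

def unimodal_features(x, w):
    """Toy 'high-level feature' extractor: one linear layer + ReLU."""
    return np.maximum(x @ w, 0.0)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Illustrative dimensions: 16-dim raw input per modality,
# 8-dim feature vectors, 3 pattern classes. Weights are random
# placeholders; a real model would learn them from data.
w_vis = rng.normal(size=(16, 8))            # visual feature extractor
w_tac = rng.normal(size=(16, 8))            # tactile feature extractor
w_early = rng.normal(size=(16, 3))          # classifier on concatenated features
w_vis_clf = rng.normal(size=(8, 3))         # independent visual classifier
w_tac_clf = rng.normal(size=(8, 3))         # independent tactile classifier

def early_fusion(x_vis, x_tac):
    """Fuse high-level unimodal features, then classify jointly."""
    fused = np.concatenate([unimodal_features(x_vis, w_vis),
                            unimodal_features(x_tac, w_tac)], axis=-1)
    return softmax(fused @ w_early)

def late_fusion(x_vis, x_tac):
    """Run two independent unimodal classifiers, then average their outputs."""
    p_vis = softmax(unimodal_features(x_vis, w_vis) @ w_vis_clf)
    p_tac = softmax(unimodal_features(x_tac, w_tac) @ w_tac_clf)
    return (p_vis + p_tac) / 2.0
```

Both functions return a probability distribution over the three classes; the study's comparison amounts to training each pathway to matched unimodal accuracy and then measuring which fusion point yields better crossmodal discrimination.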
format | Online Article Text |
id | pubmed-7805622 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-7805622 2021-01-25 Crossmodal Pattern Discrimination in Humans and Robots: A Visuo-Tactile Case Study Higgen, Focko L. Ruppel, Philipp Görner, Michael Kerzel, Matthias Hendrich, Norman Feldheim, Jan Wermter, Stefan Zhang, Jianwei Gerloff, Christian Front Robot AI Robotics and AI Frontiers Media S.A. 2020-12-23 /pmc/articles/PMC7805622/ /pubmed/33501309 http://dx.doi.org/10.3389/frobt.2020.540565 Text en Copyright © 2020 Higgen, Ruppel, Görner, Kerzel, Hendrich, Feldheim, Wermter, Zhang and Gerloff. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Robotics and AI Higgen, Focko L. Ruppel, Philipp Görner, Michael Kerzel, Matthias Hendrich, Norman Feldheim, Jan Wermter, Stefan Zhang, Jianwei Gerloff, Christian Crossmodal Pattern Discrimination in Humans and Robots: A Visuo-Tactile Case Study |
title | Crossmodal Pattern Discrimination in Humans and Robots: A Visuo-Tactile Case Study |
title_full | Crossmodal Pattern Discrimination in Humans and Robots: A Visuo-Tactile Case Study |
title_fullStr | Crossmodal Pattern Discrimination in Humans and Robots: A Visuo-Tactile Case Study |
title_full_unstemmed | Crossmodal Pattern Discrimination in Humans and Robots: A Visuo-Tactile Case Study |
title_short | Crossmodal Pattern Discrimination in Humans and Robots: A Visuo-Tactile Case Study |
title_sort | crossmodal pattern discrimination in humans and robots: a visuo-tactile case study |
topic | Robotics and AI |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7805622/ https://www.ncbi.nlm.nih.gov/pubmed/33501309 http://dx.doi.org/10.3389/frobt.2020.540565 |
work_keys_str_mv | AT higgenfockol crossmodalpatterndiscriminationinhumansandrobotsavisuotactilecasestudy AT ruppelphilipp crossmodalpatterndiscriminationinhumansandrobotsavisuotactilecasestudy AT gornermichael crossmodalpatterndiscriminationinhumansandrobotsavisuotactilecasestudy AT kerzelmatthias crossmodalpatterndiscriminationinhumansandrobotsavisuotactilecasestudy AT hendrichnorman crossmodalpatterndiscriminationinhumansandrobotsavisuotactilecasestudy AT feldheimjan crossmodalpatterndiscriminationinhumansandrobotsavisuotactilecasestudy AT wermterstefan crossmodalpatterndiscriminationinhumansandrobotsavisuotactilecasestudy AT zhangjianwei crossmodalpatterndiscriminationinhumansandrobotsavisuotactilecasestudy AT gerloffchristian crossmodalpatterndiscriminationinhumansandrobotsavisuotactilecasestudy |