Optimal visual–haptic integration with articulated tools
When we feel and see an object, the nervous system integrates visual and haptic information optimally, exploiting the redundancy in multiple signals to estimate properties more precisely than is possible from either signal alone. We examined whether optimal integration is similarly achieved when using articulated tools.
Main authors: | Takahashi, Chie; Watt, Simon J. |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Springer Berlin Heidelberg, 2017 |
Subjects: | Research Article |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5380699/ https://www.ncbi.nlm.nih.gov/pubmed/28214998 http://dx.doi.org/10.1007/s00221-017-4896-5 |
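A note for context on the method summarized in the description below: the "optimal predictions from a maximum-likelihood integrator" refer to the standard minimum-variance cue-combination rule. The notation here is chosen for illustration and is not taken from the article itself.

```latex
% Minimum-variance (maximum-likelihood) combination of visual and haptic size
% estimates \hat{S}_V and \hat{S}_H with noise variances \sigma_V^2 and \sigma_H^2.
\begin{align*}
\hat{S}_{VH} &= w_V \hat{S}_V + w_H \hat{S}_H,
  & w_V &= \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2},
  \quad w_H = 1 - w_V,\\
\sigma_{VH}^2 &= \frac{\sigma_V^2\,\sigma_H^2}{\sigma_V^2 + \sigma_H^2}
  \;\le\; \min\left(\sigma_V^2,\ \sigma_H^2\right).
\end{align*}
```

Measured bimodal precision at or near the predicted variance is what the description calls "near optimal" integration; precision no better than the single best cue would typically indicate that the two signals were not integrated.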
_version_ | 1782519797309767680 |
---|---|
author | Takahashi, Chie Watt, Simon J. |
author_facet | Takahashi, Chie Watt, Simon J. |
author_sort | Takahashi, Chie |
collection | PubMed |
description | When we feel and see an object, the nervous system integrates visual and haptic information optimally, exploiting the redundancy in multiple signals to estimate properties more precisely than is possible from either signal alone. We examined whether optimal integration is similarly achieved when using articulated tools. Such tools (tongs, pliers, etc) are a defining characteristic of human hand function, but complicate the classical sensory ‘correspondence problem’ underlying multisensory integration. Optimal integration requires establishing the relationship between signals acquired by different sensors (hand and eye) and, therefore, in fundamentally unrelated units. The system must also determine when signals refer to the same property of the world—seeing and feeling the same thing—and only integrate those that do. This could be achieved by comparing the pattern of current visual and haptic input to known statistics of their normal relationship. Articulated tools disrupt this relationship, however, by altering the geometrical relationship between object properties and hand posture (the haptic signal). We examined whether different tool configurations are taken into account in visual–haptic integration. We indexed integration by measuring the precision of size estimates, and compared our results to optimal predictions from a maximum-likelihood integrator. Integration was near optimal, independent of tool configuration/hand posture, provided that visual and haptic signals referred to the same object in the world. Thus, sensory correspondence was determined correctly (trial-by-trial), taking tool configuration into account. This reveals highly flexible multisensory integration underlying tool use, consistent with the brain constructing internal models of tools’ properties. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1007/s00221-017-4896-5) contains supplementary material, which is available to authorized users. |
format | Online Article Text |
id | pubmed-5380699 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2017 |
publisher | Springer Berlin Heidelberg |
record_format | MEDLINE/PubMed |
spelling | pubmed-5380699 2017-04-17 Optimal visual–haptic integration with articulated tools Takahashi, Chie Watt, Simon J. Exp Brain Res Research Article When we feel and see an object, the nervous system integrates visual and haptic information optimally, exploiting the redundancy in multiple signals to estimate properties more precisely than is possible from either signal alone. We examined whether optimal integration is similarly achieved when using articulated tools. Such tools (tongs, pliers, etc) are a defining characteristic of human hand function, but complicate the classical sensory ‘correspondence problem’ underlying multisensory integration. Optimal integration requires establishing the relationship between signals acquired by different sensors (hand and eye) and, therefore, in fundamentally unrelated units. The system must also determine when signals refer to the same property of the world—seeing and feeling the same thing—and only integrate those that do. This could be achieved by comparing the pattern of current visual and haptic input to known statistics of their normal relationship. Articulated tools disrupt this relationship, however, by altering the geometrical relationship between object properties and hand posture (the haptic signal). We examined whether different tool configurations are taken into account in visual–haptic integration. We indexed integration by measuring the precision of size estimates, and compared our results to optimal predictions from a maximum-likelihood integrator. Integration was near optimal, independent of tool configuration/hand posture, provided that visual and haptic signals referred to the same object in the world. Thus, sensory correspondence was determined correctly (trial-by-trial), taking tool configuration into account. This reveals highly flexible multisensory integration underlying tool use, consistent with the brain constructing internal models of tools’ properties. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1007/s00221-017-4896-5) contains supplementary material, which is available to authorized users. Springer Berlin Heidelberg 2017-02-18 2017 /pmc/articles/PMC5380699/ /pubmed/28214998 http://dx.doi.org/10.1007/s00221-017-4896-5 Text en © The Author(s) 2017 Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. |
spellingShingle | Research Article Takahashi, Chie Watt, Simon J. Optimal visual–haptic integration with articulated tools |
title | Optimal visual–haptic integration with articulated tools |
title_full | Optimal visual–haptic integration with articulated tools |
title_fullStr | Optimal visual–haptic integration with articulated tools |
title_full_unstemmed | Optimal visual–haptic integration with articulated tools |
title_short | Optimal visual–haptic integration with articulated tools |
title_sort | optimal visual–haptic integration with articulated tools |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5380699/ https://www.ncbi.nlm.nih.gov/pubmed/28214998 http://dx.doi.org/10.1007/s00221-017-4896-5 |
work_keys_str_mv | AT takahashichie optimalvisualhapticintegrationwitharticulatedtools AT wattsimonj optimalvisualhapticintegrationwitharticulatedtools |
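For a concrete sense of the comparison described in the record's description field, here is a minimal sketch of how an optimal-integration prediction is computed from single-cue precisions. The function name and the numeric values are hypothetical illustrations, not data or code from the article.

```python
# Illustrative sketch (not from the article): predicting the optimal bimodal
# precision from measured single-cue precisions, as in standard
# maximum-likelihood cue-combination analyses.

def predicted_bimodal_sd(sigma_v: float, sigma_h: float) -> float:
    """Minimum-variance prediction for the SD of the combined visual-haptic estimate."""
    var_vh = (sigma_v**2 * sigma_h**2) / (sigma_v**2 + sigma_h**2)
    return var_vh ** 0.5

# Hypothetical example values (mm); the article's actual thresholds differ.
sigma_v, sigma_h = 4.0, 6.0
print(f"predicted visual-haptic SD: {predicted_bimodal_sd(sigma_v, sigma_h):.2f} mm")
# Integration is 'near optimal' when the measured bimodal SD approaches this
# prediction; it can never be better than the prediction.
```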