
Deep neural network model of haptic saliency


Bibliographic Details

Main Authors: Metzger, Anna; Toscani, Matteo; Akbarinia, Arash; Valsecchi, Matteo; Drewing, Knut
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2021
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7809404/
https://www.ncbi.nlm.nih.gov/pubmed/33446756
http://dx.doi.org/10.1038/s41598-020-80675-6
Collection: PubMed
Description: Haptic exploration usually involves stereotypical systematic movements that are adapted to the task. Here we tested whether exploration movements are also driven by physical stimulus features. We designed haptic stimuli whose surface relief varied locally in spatial frequency, height, orientation, and anisotropy. In Experiment 1, participants successively explored two stimuli in order to decide whether they were the same or different. We trained a variational autoencoder to predict the spatial distribution of touch duration from the surface relief of the haptic stimuli. The model successfully predicted where participants touched the stimuli. It could also predict participants’ touch distribution from the stimulus’ surface relief when tested with two new groups of participants, who performed a different task (Exp. 2) or explored different stimuli (Exp. 3). We further generated a large number of virtual surface reliefs (each uniformly expressing a certain combination of features) and correlated the model’s responses with stimulus properties to understand the model’s preferences, in order to infer which stimulus features were preferentially touched by participants. Our results indicate that haptic exploratory behavior is to some extent driven by the physical features of the stimuli, with e.g. edge-like structures, vertical and horizontal patterns, and rough regions being explored in more detail.
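The description's core idea is a variational autoencoder that maps a surface relief to a spatial touch-duration distribution. The paper's actual architecture, layer sizes, and training procedure are not given in this record, so the following is only a toy illustration of the input/output relationship: a made-up dense encoder produces latent mean and log-variance, a sample is drawn via the reparameterization trick, and a made-up decoder emits a normalized touch-probability map over the relief locations.

```python
import math
import random

random.seed(0)


def dense(x, w, b):
    # Fully connected layer: y_i = sum_j w[i][j] * x[j] + b[i]
    return [sum(wi * xi for wi, xi in zip(row, x)) + bi
            for row, bi in zip(w, b)]


def softmax(z):
    # Numerically stable softmax, so the output sums to 1
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]


class ToyVAE:
    """Illustrative (untrained) VAE-style predictor, not the paper's model."""

    def __init__(self, n_in, n_latent):
        init = lambda r, c: [[random.gauss(0.0, 0.1) for _ in range(c)]
                             for _ in range(r)]
        self.w_mu, self.b_mu = init(n_latent, n_in), [0.0] * n_latent
        self.w_lv, self.b_lv = init(n_latent, n_in), [0.0] * n_latent
        self.w_dec, self.b_dec = init(n_in, n_latent), [0.0] * n_in

    def forward(self, relief):
        # Encoder: latent mean and log-variance from the flattened relief
        mu = dense(relief, self.w_mu, self.b_mu)
        logvar = dense(relief, self.w_lv, self.b_lv)
        # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, 1)
        z = [m + math.exp(0.5 * lv) * random.gauss(0.0, 1.0)
             for m, lv in zip(mu, logvar)]
        # Decoder output, normalized into a touch-duration distribution
        return softmax(dense(z, self.w_dec, self.b_dec))


vae = ToyVAE(n_in=16, n_latent=2)
relief = [random.random() for _ in range(16)]  # toy 4x4 relief, flattened
touch_map = vae.forward(relief)               # one probability per location
```

The softmax output is a convenient stand-in for "fraction of total touch duration spent at each location"; a trained model would learn the weights from recorded exploration data rather than use random ones.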
ID: pubmed-7809404
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Published online 2021-01-14. © The Author(s) 2021. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).