
3D Visual Tracking to Quantify Physical Contact Interactions in Human-to-Human Touch


Bibliographic Details
Main Authors: Xu, Shan, Xu, Chang, McIntyre, Sarah, Olausson, Håkan, Gerling, Gregory J.
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9219726/
https://www.ncbi.nlm.nih.gov/pubmed/35755449
http://dx.doi.org/10.3389/fphys.2022.841938
author Xu, Shan
Xu, Chang
McIntyre, Sarah
Olausson, Håkan
Gerling, Gregory J.
author_facet Xu, Shan
Xu, Chang
McIntyre, Sarah
Olausson, Håkan
Gerling, Gregory J.
author_sort Xu, Shan
collection PubMed
description Across a plethora of social situations, we touch others in natural and intuitive ways to share thoughts and emotions, such as tapping to get one’s attention or caressing to soothe one’s anxiety. A deeper understanding of these human-to-human interactions will require, in part, the precise measurement of skin-to-skin physical contact. Among prior efforts, each measurement approach exhibits certain constraints, e.g., motion trackers do not capture the precise shape of skin surfaces, while pressure sensors impede skin-to-skin contact. In contrast, this work develops an interference-free 3D visual tracking system using a depth camera to measure the contact attributes between the bare hand of a toucher and the forearm of a receiver. The toucher’s hand is tracked as a posed and positioned mesh by fitting a hand model to detected 3D hand joints, whereas a receiver’s forearm is extracted as a 3D surface updated upon repeated skin contact. Based on a contact model involving point clouds, the spatiotemporal changes of hand-to-forearm contact are decomposed into six high-resolution, time-series contact attributes, i.e., contact area, indentation depth, absolute velocity, and three orthogonal velocity components, together with contact duration. To examine the system’s capabilities and limitations, two types of experiments were performed. First, to evaluate its ability to discern human touches, one person delivered cued social messages, e.g., happiness, anger, sympathy, to another person using their preferred gestures. The results indicated that messages and gestures, as well as the identities of the touchers, were readily discerned from their contact attributes. Second, the system’s spatiotemporal accuracy was validated against measurements from independent devices, including an electromagnetic motion tracker, sensorized pressure mat, and laser displacement sensor. While validated here in the context of social communication, this system is extendable to human touch interactions such as maternal care of infants and massage therapy.
format Online
Article
Text
id pubmed-9219726
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-9219726 2022-06-24 3D Visual Tracking to Quantify Physical Contact Interactions in Human-to-Human Touch Xu, Shan Xu, Chang McIntyre, Sarah Olausson, Håkan Gerling, Gregory J. Front Physiol Physiology Frontiers Media S.A. 2022-06-09 /pmc/articles/PMC9219726/ /pubmed/35755449 http://dx.doi.org/10.3389/fphys.2022.841938 Text en Copyright © 2022 Xu, Xu, McIntyre, Olausson and Gerling. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Physiology
Xu, Shan
Xu, Chang
McIntyre, Sarah
Olausson, Håkan
Gerling, Gregory J.
3D Visual Tracking to Quantify Physical Contact Interactions in Human-to-Human Touch
title 3D Visual Tracking to Quantify Physical Contact Interactions in Human-to-Human Touch
title_full 3D Visual Tracking to Quantify Physical Contact Interactions in Human-to-Human Touch
title_fullStr 3D Visual Tracking to Quantify Physical Contact Interactions in Human-to-Human Touch
title_full_unstemmed 3D Visual Tracking to Quantify Physical Contact Interactions in Human-to-Human Touch
title_short 3D Visual Tracking to Quantify Physical Contact Interactions in Human-to-Human Touch
title_sort 3d visual tracking to quantify physical contact interactions in human-to-human touch
topic Physiology
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9219726/
https://www.ncbi.nlm.nih.gov/pubmed/35755449
http://dx.doi.org/10.3389/fphys.2022.841938
work_keys_str_mv AT xushan 3dvisualtrackingtoquantifyphysicalcontactinteractionsinhumantohumantouch
AT xuchang 3dvisualtrackingtoquantifyphysicalcontactinteractionsinhumantohumantouch
AT mcintyresarah 3dvisualtrackingtoquantifyphysicalcontactinteractionsinhumantohumantouch
AT olaussonhakan 3dvisualtrackingtoquantifyphysicalcontactinteractionsinhumantohumantouch
AT gerlinggregoryj 3dvisualtrackingtoquantifyphysicalcontactinteractionsinhumantohumantouch