Integrating optical finger motion tracking with surface touch events
This paper presents a method of integrating two contrasting sensor systems for studying human interaction with a mechanical system, using piano performance as the case study. Piano technique requires both precise small-scale motion of fingers on the key surfaces and planned large-scale movement of the hands and arms. Where studies of performance often focus on one of these scales in isolation, this paper investigates the relationship between them. Two sensor systems were installed on an acoustic grand piano: a monocular high-speed camera tracking the position of painted markers on the hands, and capacitive touch sensors attached to the key surfaces which measure the location of finger-key contacts. This paper highlights a method of fusing the data from these systems, including temporal and spatial alignment, segmentation into notes, and automatic fingering annotation. Three case studies demonstrate the utility of the multi-sensor data: analysis of finger flexion or extension based on touch and camera marker location, timing analysis of finger-key contact preceding and following key presses, and characterization of individual finger movements in the transitions between successive key presses. Piano performance is the focus of this paper, but the sensor method could equally apply to other fine motor control scenarios, with applications to human-computer interaction.
Main Authors: | MacRitchie, Jennifer; McPherson, Andrew P. |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2015 |
Subjects: | Psychology |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4451251/ https://www.ncbi.nlm.nih.gov/pubmed/26082732 http://dx.doi.org/10.3389/fpsyg.2015.00702 |
_version_ | 1782374113608728576 |
author | MacRitchie, Jennifer McPherson, Andrew P. |
author_facet | MacRitchie, Jennifer McPherson, Andrew P. |
author_sort | MacRitchie, Jennifer |
collection | PubMed |
description | This paper presents a method of integrating two contrasting sensor systems for studying human interaction with a mechanical system, using piano performance as the case study. Piano technique requires both precise small-scale motion of fingers on the key surfaces and planned large-scale movement of the hands and arms. Where studies of performance often focus on one of these scales in isolation, this paper investigates the relationship between them. Two sensor systems were installed on an acoustic grand piano: a monocular high-speed camera tracking the position of painted markers on the hands, and capacitive touch sensors attached to the key surfaces which measure the location of finger-key contacts. This paper highlights a method of fusing the data from these systems, including temporal and spatial alignment, segmentation into notes, and automatic fingering annotation. Three case studies demonstrate the utility of the multi-sensor data: analysis of finger flexion or extension based on touch and camera marker location, timing analysis of finger-key contact preceding and following key presses, and characterization of individual finger movements in the transitions between successive key presses. Piano performance is the focus of this paper, but the sensor method could equally apply to other fine motor control scenarios, with applications to human-computer interaction. |
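The abstract mentions temporal alignment of the two sensor streams as one step in the data fusion. As a minimal illustrative sketch (not the paper's actual procedure, whose details are not given in this record), a constant clock offset between matched event timestamps from the camera and the touch sensors could be estimated robustly with a median; all function names and data here are hypothetical.

```python
# Hypothetical sketch: aligning two sensor clocks by estimating a constant
# offset between matched event timestamps (e.g., key-press onsets seen by
# both the camera and the capacitive touch sensors). Illustrative only;
# the paper's actual alignment method may differ.

def estimate_clock_offset(camera_times, touch_times):
    """Estimate the constant offset (touch minus camera) between two sensor
    clocks from timestamps of the same physical events.

    The median of pairwise differences is robust to a few noisy or
    slightly mismatched events."""
    if len(camera_times) != len(touch_times):
        raise ValueError("event lists must be matched one-to-one")
    diffs = sorted(t - c for c, t in zip(camera_times, touch_times))
    n = len(diffs)
    mid = n // 2
    return diffs[mid] if n % 2 else 0.5 * (diffs[mid - 1] + diffs[mid])

def align_touch_events(touch_times, offset):
    """Map touch-sensor timestamps onto the camera timebase."""
    return [t - offset for t in touch_times]

# Example: the touch-sensor clock runs 0.25 s ahead of the camera clock.
camera = [1.00, 2.50, 4.10]
touch = [1.25, 2.75, 4.35]
offset = estimate_clock_offset(camera, touch)
aligned = align_touch_events(touch, offset)
```

Once both streams share a timebase, events can be segmented into notes and compared frame by frame, as the case studies in the abstract describe.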
format | Online Article Text |
id | pubmed-4451251 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2015 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-44512512015-06-16 Integrating optical finger motion tracking with surface touch events MacRitchie, Jennifer McPherson, Andrew P. Front Psychol Psychology This paper presents a method of integrating two contrasting sensor systems for studying human interaction with a mechanical system, using piano performance as the case study. Piano technique requires both precise small-scale motion of fingers on the key surfaces and planned large-scale movement of the hands and arms. Where studies of performance often focus on one of these scales in isolation, this paper investigates the relationship between them. Two sensor systems were installed on an acoustic grand piano: a monocular high-speed camera tracking the position of painted markers on the hands, and capacitive touch sensors attached to the key surfaces which measure the location of finger-key contacts. This paper highlights a method of fusing the data from these systems, including temporal and spatial alignment, segmentation into notes, and automatic fingering annotation. Three case studies demonstrate the utility of the multi-sensor data: analysis of finger flexion or extension based on touch and camera marker location, timing analysis of finger-key contact preceding and following key presses, and characterization of individual finger movements in the transitions between successive key presses. Piano performance is the focus of this paper, but the sensor method could equally apply to other fine motor control scenarios, with applications to human-computer interaction. Frontiers Media S.A. 2015-06-02 /pmc/articles/PMC4451251/ /pubmed/26082732 http://dx.doi.org/10.3389/fpsyg.2015.00702 Text en Copyright © 2015 MacRitchie and McPherson. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). 
The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Psychology MacRitchie, Jennifer McPherson, Andrew P. Integrating optical finger motion tracking with surface touch events |
title | Integrating optical finger motion tracking with surface touch events |
title_full | Integrating optical finger motion tracking with surface touch events |
title_fullStr | Integrating optical finger motion tracking with surface touch events |
title_full_unstemmed | Integrating optical finger motion tracking with surface touch events |
title_short | Integrating optical finger motion tracking with surface touch events |
title_sort | integrating optical finger motion tracking with surface touch events |
topic | Psychology |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4451251/ https://www.ncbi.nlm.nih.gov/pubmed/26082732 http://dx.doi.org/10.3389/fpsyg.2015.00702 |
work_keys_str_mv | AT macritchiejennifer integratingopticalfingermotiontrackingwithsurfacetouchevents AT mcphersonandrewp integratingopticalfingermotiontrackingwithsurfacetouchevents |