Sensing Control Parameters of Flute from Microphone Sound Based on Machine Learning from Robotic Performer

Bibliographic Details
Main Authors: Kuroda, Jin; Koutaki, Gou
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8914778/
https://www.ncbi.nlm.nih.gov/pubmed/35271221
http://dx.doi.org/10.3390/s22052074
_version_ 1784667826391351296
author Kuroda, Jin
Koutaki, Gou
author_facet Kuroda, Jin
Koutaki, Gou
author_sort Kuroda, Jin
collection PubMed
description When learning to play a musical instrument, it is important to improve the quality of self-practice. Many systems have been developed to assist practice. Some practice assistance systems use special sensors (pressure, flow, and motion sensors) to acquire the control parameters of the musical instrument, and provide specific guidance. However, it is difficult to acquire the control parameters of wind instruments (e.g., saxophone or flute) such as flow and angle between the player and the musical instrument, since it is not possible to place sensors into the mouth. In this paper, we propose a sensorless control parameter estimation system based on the recorded sound of a wind instrument using only machine learning. In the machine learning framework, many training samples that have both sound and correct labels are required. Therefore, we generated training samples using a robotic performer. This has two advantages: (1) it is easy to obtain many training samples with exhaustive control parameters, and (2) we can use the correct labels as the given control parameters of the robot. In addition to the samples generated by the robot, some human performance data were also used for training to construct an estimation model that enhanced the feature differences between robot and human performance. Finally, a flute control parameter estimation system was developed, and its estimation accuracy for eight novice flute players was evaluated using the Spearman’s rank correlation coefficient. The experimental results showed that the proposed system was able to estimate human control parameters with high accuracy.
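
The description above states that estimation accuracy was evaluated with Spearman's rank correlation coefficient between estimated and reference control parameters. The lines below are a minimal, illustrative Python sketch of such a comparison; the parameter values and variable names are hypothetical placeholders, not the authors' data or implementation.

    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical reference values for one control parameter (e.g., the known
    # settings given to the robotic performer) and the model's estimates for
    # the same eight performances. These numbers are placeholders, not data
    # from the paper.
    reference = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
    estimated = np.array([1.2, 1.8, 3.5, 3.9, 5.4, 5.9, 7.1, 8.3])

    # Spearman's rank correlation measures how well the estimated ranking
    # agrees with the reference ranking (1.0 = perfect monotonic agreement).
    rho, p_value = spearmanr(reference, estimated)
    print(f"Spearman's rho = {rho:.3f} (p = {p_value:.3g})")

Because the coefficient is rank-based, it rewards estimates that order the performances correctly even when their absolute values are offset.
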
format Online
Article
Text
id pubmed-8914778
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-8914778 2022-03-12 Sensing Control Parameters of Flute from Microphone Sound Based on Machine Learning from Robotic Performer Kuroda, Jin Koutaki, Gou Sensors (Basel) Article When learning to play a musical instrument, it is important to improve the quality of self-practice. Many systems have been developed to assist practice. Some practice assistance systems use special sensors (pressure, flow, and motion sensors) to acquire the control parameters of the musical instrument, and provide specific guidance. However, it is difficult to acquire the control parameters of wind instruments (e.g., saxophone or flute) such as flow and angle between the player and the musical instrument, since it is not possible to place sensors into the mouth. In this paper, we propose a sensorless control parameter estimation system based on the recorded sound of a wind instrument using only machine learning. In the machine learning framework, many training samples that have both sound and correct labels are required. Therefore, we generated training samples using a robotic performer. This has two advantages: (1) it is easy to obtain many training samples with exhaustive control parameters, and (2) we can use the correct labels as the given control parameters of the robot. In addition to the samples generated by the robot, some human performance data were also used for training to construct an estimation model that enhanced the feature differences between robot and human performance. Finally, a flute control parameter estimation system was developed, and its estimation accuracy for eight novice flute players was evaluated using the Spearman’s rank correlation coefficient. The experimental results showed that the proposed system was able to estimate human control parameters with high accuracy. MDPI 2022-03-07 /pmc/articles/PMC8914778/ /pubmed/35271221 http://dx.doi.org/10.3390/s22052074 Text en © 2022 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Kuroda, Jin
Koutaki, Gou
Sensing Control Parameters of Flute from Microphone Sound Based on Machine Learning from Robotic Performer
title Sensing Control Parameters of Flute from Microphone Sound Based on Machine Learning from Robotic Performer
title_full Sensing Control Parameters of Flute from Microphone Sound Based on Machine Learning from Robotic Performer
title_fullStr Sensing Control Parameters of Flute from Microphone Sound Based on Machine Learning from Robotic Performer
title_full_unstemmed Sensing Control Parameters of Flute from Microphone Sound Based on Machine Learning from Robotic Performer
title_short Sensing Control Parameters of Flute from Microphone Sound Based on Machine Learning from Robotic Performer
title_sort sensing control parameters of flute from microphone sound based on machine learning from robotic performer
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8914778/
https://www.ncbi.nlm.nih.gov/pubmed/35271221
http://dx.doi.org/10.3390/s22052074
work_keys_str_mv AT kurodajin sensingcontrolparametersofflutefrommicrophonesoundbasedonmachinelearningfromroboticperformer
AT koutakigou sensingcontrolparametersofflutefrommicrophonesoundbasedonmachinelearningfromroboticperformer