Quantifying Auditory Temporal Stability in a Large Database of Recorded Music
Main Authors: | Ellis, Robert J.; Duan, Zhiyan; Wang, Ye
---|---
Format: | Online Article Text
Language: | English
Published: | Public Library of Science, 2014
Subjects: | Research Article
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4254286/ https://www.ncbi.nlm.nih.gov/pubmed/25469636 http://dx.doi.org/10.1371/journal.pone.0110452
_version_ | 1782347330759950336 |
---|---|
author | Ellis, Robert J.; Duan, Zhiyan; Wang, Ye
author_facet | Ellis, Robert J.; Duan, Zhiyan; Wang, Ye
author_sort | Ellis, Robert J. |
collection | PubMed |
description | “Moving to the beat” is both one of the most basic and one of the most profound means by which humans (and a few other species) interact with music. Computer algorithms that detect the precise temporal location of beats (i.e., pulses of musical “energy”) in recorded music have important practical applications, such as the creation of playlists with a particular tempo for rehabilitation (e.g., rhythmic gait training), exercise (e.g., jogging), or entertainment (e.g., continuous dance mixes). Although several such algorithms return simple point estimates of an audio file’s temporal structure (e.g., “average tempo”, “time signature”), none has sought to quantify the temporal stability of a series of detected beats. Such a method, a “Balanced Evaluation of Auditory Temporal Stability” (BEATS), is proposed here, and is illustrated using the Million Song Dataset (a collection of audio features and music metadata for nearly one million audio files). A publicly accessible web interface is also presented, which combines the thresholdable statistics of BEATS with queryable metadata terms, fostering potential avenues of research and facilitating the creation of highly personalized music playlists for clinical or recreational applications.
format | Online Article Text |
id | pubmed-4254286 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2014 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-42542862014-12-11 Quantifying Auditory Temporal Stability in a Large Database of Recorded Music Ellis, Robert J. Duan, Zhiyan Wang, Ye PLoS One Research Article “Moving to the beat” is both one of the most basic and one of the most profound means by which humans (and a few other species) interact with music. Computer algorithms that detect the precise temporal location of beats (i.e., pulses of musical “energy”) in recorded music have important practical applications, such as the creation of playlists with a particular tempo for rehabilitation (e.g., rhythmic gait training), exercise (e.g., jogging), or entertainment (e.g., continuous dance mixes). Although several such algorithms return simple point estimates of an audio file’s temporal structure (e.g., “average tempo”, “time signature”), none has sought to quantify the temporal stability of a series of detected beats. Such a method, a “Balanced Evaluation of Auditory Temporal Stability” (BEATS), is proposed here, and is illustrated using the Million Song Dataset (a collection of audio features and music metadata for nearly one million audio files). A publicly accessible web interface is also presented, which combines the thresholdable statistics of BEATS with queryable metadata terms, fostering potential avenues of research and facilitating the creation of highly personalized music playlists for clinical or recreational applications. Public Library of Science 2014-12-03 /pmc/articles/PMC4254286/ /pubmed/25469636 http://dx.doi.org/10.1371/journal.pone.0110452 Text en © 2014 Ellis et al http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.
spellingShingle | Research Article Ellis, Robert J. Duan, Zhiyan Wang, Ye Quantifying Auditory Temporal Stability in a Large Database of Recorded Music |
title | Quantifying Auditory Temporal Stability in a Large Database of Recorded Music |
title_full | Quantifying Auditory Temporal Stability in a Large Database of Recorded Music |
title_fullStr | Quantifying Auditory Temporal Stability in a Large Database of Recorded Music |
title_full_unstemmed | Quantifying Auditory Temporal Stability in a Large Database of Recorded Music |
title_short | Quantifying Auditory Temporal Stability in a Large Database of Recorded Music |
title_sort | quantifying auditory temporal stability in a large database of recorded music |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4254286/ https://www.ncbi.nlm.nih.gov/pubmed/25469636 http://dx.doi.org/10.1371/journal.pone.0110452 |
work_keys_str_mv | AT ellisrobertj quantifyingauditorytemporalstabilityinalargedatabaseofrecordedmusic AT duanzhiyan quantifyingauditorytemporalstabilityinalargedatabaseofrecordedmusic AT wangye quantifyingauditorytemporalstabilityinalargedatabaseofrecordedmusic |
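The abstract above describes quantifying the temporal stability of a series of detected beats, with thresholdable statistics used to filter tracks for tempo-matched playlists. As a minimal illustrative sketch of that general idea only (this is not the BEATS method itself, whose statistics are defined in the article; the function names and threshold below are assumptions), one could estimate tempo from the median inter-beat interval and use the coefficient of variation of those intervals as a simple stability measure:

```python
# Illustrative sketch, not the BEATS algorithm: given a list of detected
# beat times (in seconds), estimate tempo from the median inter-beat
# interval (IBI) and measure stability as the coefficient of variation
# (CV) of the IBIs. The 0.05 cutoff is a hypothetical threshold.
import statistics

def tempo_and_stability(beat_times):
    """Return (tempo_bpm, ibi_cv) for a sequence of beat times in seconds."""
    ibis = [b - a for a, b in zip(beat_times, beat_times[1:])]
    tempo_bpm = 60.0 / statistics.median(ibis)
    # CV: spread of the intervals relative to their mean; lower = steadier.
    ibi_cv = statistics.stdev(ibis) / statistics.mean(ibis)
    return tempo_bpm, ibi_cv

# Example: a steadily detected ~120-BPM track with slight timing jitter.
beats = [0.00, 0.50, 1.01, 1.50, 2.00, 2.51, 3.00]
tempo, cv = tempo_and_stability(beats)
STABILITY_THRESHOLD = 0.05  # hypothetical cutoff for "stable" tracks
print(f"tempo = {tempo:.1f} BPM, IBI CV = {cv:.3f}, "
      f"stable: {cv < STABILITY_THRESHOLD}")
```

In a playlist-building setting of the kind the abstract describes, such a per-track statistic would be computed offline and then combined with metadata queries (e.g., genre, year) to retrieve tracks whose tempo matches a target cadence and whose stability measure falls under the chosen threshold.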