
MERP: A Music Dataset with Emotion Ratings and Raters’ Profile Information

Music is capable of conveying many emotions. The level and type of emotion of the music perceived by a listener, however, is highly subjective. In this study, we present the Music Emotion Recognition with Profile information dataset (MERP). This database was collected through Amazon Mechanical Turk...

Full description

Bibliographic Details
Main Authors: Koh, En Yan, Cheuk, Kin Wai, Heung, Kwan Yee, Agres, Kat R., Herremans, Dorien
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9824842/
https://www.ncbi.nlm.nih.gov/pubmed/36616980
http://dx.doi.org/10.3390/s23010382
author Koh, En Yan
Cheuk, Kin Wai
Heung, Kwan Yee
Agres, Kat R.
Herremans, Dorien
collection PubMed
description Music is capable of conveying many emotions. The level and type of emotion of the music perceived by a listener, however, is highly subjective. In this study, we present the Music Emotion Recognition with Profile information dataset (MERP). This database was collected through Amazon Mechanical Turk (MTurk) and features dynamical valence and arousal ratings of 54 selected full-length songs. The dataset contains music features, as well as user profile information of the annotators. The songs were selected from the Free Music Archive using an innovative method (a Triple Neural Network with the OpenSmile toolkit) to identify 50 songs with the most distinctive emotions. Specifically, the songs were chosen to fully cover the four quadrants of the valence-arousal space. Four additional songs were selected from the DEAM dataset to act as a benchmark in this study and filter out low quality ratings. A total of 452 participants participated in annotating the dataset, with 277 participants remaining after thoroughly cleaning the dataset. Their demographic information, listening preferences, and musical background were recorded. We offer an extensive analysis of the resulting dataset, together with a baseline emotion prediction model based on a fully connected model and an LSTM model, for our newly proposed MERP dataset.
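The description above mentions a baseline emotion prediction model built on a fully connected model (alongside an LSTM). As a minimal hedged sketch of what such a frame-wise valence-arousal regressor looks like, the following trains a single linear layer on synthetic stand-in data; the feature dimension, data, and training setup are illustrative assumptions, not the authors' actual MERP features or pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for frame-level audio features (NOT the actual MERP
# features): 200 frames, 16 descriptors each.
n_frames, n_features = 200, 16
X = rng.standard_normal((n_frames, n_features))

# Hypothetical ground-truth mapping to dynamic (valence, arousal) targets.
true_W = 0.5 * rng.standard_normal((n_features, 2))
Y = X @ true_W + 0.05 * rng.standard_normal((n_frames, 2))

# A single linear layer trained with gradient descent on mean squared
# error -- the simplest possible "fully connected" baseline.
W = np.zeros((n_features, 2))
lr = 0.1
for _ in range(500):
    residual = X @ W - Y                    # (n_frames, 2) prediction error
    W -= lr * (X.T @ residual) / n_frames   # MSE gradient step

mse = float(np.mean((X @ W - Y) ** 2))
print(f"final MSE: {mse:.4f}")
```

A real dynamic-emotion baseline would replace the linear layer with a deeper fully connected network or an LSTM over the frame sequence, so that predictions at each time step can depend on preceding audio context.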
format Online
Article
Text
id pubmed-9824842
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9824842 2023-01-08 MERP: A Music Dataset with Emotion Ratings and Raters’ Profile Information. Sensors (Basel), Article. MDPI 2022-12-29 /pmc/articles/PMC9824842/ /pubmed/36616980 http://dx.doi.org/10.3390/s23010382 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title MERP: A Music Dataset with Emotion Ratings and Raters’ Profile Information
topic Article