(Dis)agreement on Sight-Singing Assessment of Undergraduate Musicians
Assessment criteria for sight-singing abilities are similar to those used to judge music performances across music school programs. However, little evidence of agreement among judges has been provided in the literature. Fifty out of 152 participants were randomly selected and blindly assessed by three judges, who evaluated students based on given criteria. Participants were recorded while sight-singing 19 intervals and 10 tonal melodies. Interjudge agreement on melodic sight-singing was tested on four items in a five-point Likert scale format: (1) intonation and pitch accuracy; (2) tonal sense and memory; (3) rhythmic precision, regularity of pulse and subdivisions; (4) fluency and music direction. Intervals were scored on a three-point Likert scale. Agreement was estimated using weighted kappa. For the ten tonal melodies, the average weighted kappas (κ(w)) were κ1(w) = 0.296, κ2(w) = 0.487, κ3(w) = 0.224, and κ4(w) = 0.244, ranging from fair to moderate. For intervals, the lowest agreement was kappa = 0.406 and the highest kappa = 0.792 (average kappa = 0.637). These findings shed light on the validity and reliability of models that have been taken for granted in assessing music performance in auditions and contests, and illustrate the need to better discuss evaluation criteria.
Main Authors: Bortz, Graziela; Germano, Nayana G.; Cogo-Moreira, Hugo
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2018
Subjects: Psychology
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5987045/ | https://www.ncbi.nlm.nih.gov/pubmed/29896144 | http://dx.doi.org/10.3389/fpsyg.2018.00837
_version_ | 1783329041832476672 |
author | Bortz, Graziela Germano, Nayana G. Cogo-Moreira, Hugo |
author_facet | Bortz, Graziela Germano, Nayana G. Cogo-Moreira, Hugo |
author_sort | Bortz, Graziela |
collection | PubMed |
description | Assessment criteria for sight-singing abilities are similar to those used to judge music performances across music school programs. However, little evidence of agreement among judges has been provided in the literature. Fifty out of 152 participants were randomly selected and blindly assessed by three judges, who evaluated students based on given criteria. Participants were recorded while sight-singing 19 intervals and 10 tonal melodies. Interjudge agreement on melodic sight-singing was tested considering four items in a five-point Likert scale format as follows: (1) Intonation and pitch accuracy; (2) Tonal sense and memory; (3) Rhythmic precision, regularity of pulse and subdivisions; (4) Fluency and music direction. Intervals were scored considering a 3-point Likert scale. Agreement was estimated using weighted kappa. For melodic sight-singing considering the ten tonal melodies, on average, the weighted kappa (κ(w)) were: κ1(w) = 0.296, κ2(w) = 0.487, κ3(w) = 0.224, and κ4(w) = 0.244, ranging from fair to moderate. For intervals, the lowest agreement was kappa = 0.406 and the highest was kappa = 0.792 (on average, kappa = 0.637). These findings shed light on the validity and reliability of models that have been taken for granted in assessing music performance in auditions and contests, and illustrate the need to better discuss evaluation criteria. |
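The agreement statistic the abstract reports is weighted kappa, which discounts chance agreement and penalizes rater disagreements in proportion to how far apart their ordinal ratings are (e.g. on the five-point melody scale). As an illustrative sketch only, not the authors' code, a linear-weighted Cohen's kappa for two raters can be computed from scratch like this (the function name and argument layout are hypothetical):

```python
def weighted_kappa(rater_a, rater_b, categories, weights="linear"):
    """Weighted Cohen's kappa for two raters over ordered categories.

    rater_a, rater_b: parallel lists of ratings (one entry per subject).
    categories: the ordered list of possible ratings, e.g. [1, 2, 3, 4, 5].
    weights: "linear" (|i-j|) or "quadratic" ((i-j)^2) disagreement weights.
    """
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(rater_a)

    # Observed joint proportions: obs[i][j] = fraction of subjects
    # rated category i by rater A and category j by rater B.
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater_a, rater_b):
        obs[idx[a]][idx[b]] += 1.0 / n

    # Marginal proportions for each rater.
    pa = [sum(obs[i][j] for j in range(k)) for i in range(k)]
    pb = [sum(obs[i][j] for i in range(k)) for j in range(k)]

    # Disagreement weight: 0 on the diagonal, growing with distance.
    def w(i, j):
        d = abs(i - j) / (k - 1)
        return d if weights == "linear" else d * d

    observed = sum(w(i, j) * obs[i][j] for i in range(k) for j in range(k))
    expected = sum(w(i, j) * pa[i] * pb[j] for i in range(k) for j in range(k))
    return 1.0 - observed / expected

# Perfect agreement on a five-point scale -> kappa = 1.0
print(weighted_kappa([1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5]))  # → 1.0
```

By the conventional benchmarks the abstract alludes to, values near 0.2–0.4 are "fair", 0.4–0.6 "moderate", and 0.6–0.8 "substantial", which is how the reported melody kappas (0.224–0.487) come out as fair to moderate.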
format | Online Article Text |
id | pubmed-5987045 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2018 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-59870452018-06-12 (Dis)agreement on Sight-Singing Assessment of Undergraduate Musicians Bortz, Graziela Germano, Nayana G. Cogo-Moreira, Hugo Front Psychol Psychology Assessment criteria for sight-singing abilities are similar to those used to judge music performances across music school programs. However, little evidence of agreement among judges has been provided in the literature. Fifty out of 152 participants were randomly selected and blindly assessed by three judges, who evaluated students based on given criteria. Participants were recorded while sight-singing 19 intervals and 10 tonal melodies. Interjudge agreement on melodic sight-singing was tested considering four items in a five-point Likert scale format as follows: (1) Intonation and pitch accuracy; (2) Tonal sense and memory; (3) Rhythmic precision, regularity of pulse and subdivisions; (4) Fluency and music direction. Intervals were scored considering a 3-point Likert scale. Agreement was estimated using weighted kappa. For melodic sight-singing considering the ten tonal melodies, on average, the weighted kappa (κ(w)) were: κ1(w) = 0.296, κ2(w) = 0.487, κ3(w) = 0.224, and κ4(w) = 0.244, ranging from fair to moderate. For intervals, the lowest agreement was kappa = 0.406 and the highest was kappa = 0.792 (on average, kappa = 0.637). These findings shed light on the validity and reliability of models that have been taken for granted in assessing music performance in auditions and contests, and illustrate the need to better discuss evaluation criteria. Frontiers Media S.A. 2018-05-29 /pmc/articles/PMC5987045/ /pubmed/29896144 http://dx.doi.org/10.3389/fpsyg.2018.00837 Text en Copyright © 2018 Bortz, Germano and Cogo-Moreira. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). 
The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Psychology Bortz, Graziela Germano, Nayana G. Cogo-Moreira, Hugo (Dis)agreement on Sight-Singing Assessment of Undergraduate Musicians |
title | (Dis)agreement on Sight-Singing Assessment of Undergraduate Musicians |
title_full | (Dis)agreement on Sight-Singing Assessment of Undergraduate Musicians |
title_fullStr | (Dis)agreement on Sight-Singing Assessment of Undergraduate Musicians |
title_full_unstemmed | (Dis)agreement on Sight-Singing Assessment of Undergraduate Musicians |
title_short | (Dis)agreement on Sight-Singing Assessment of Undergraduate Musicians |
title_sort | (dis)agreement on sight-singing assessment of undergraduate musicians |
topic | Psychology |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5987045/ https://www.ncbi.nlm.nih.gov/pubmed/29896144 http://dx.doi.org/10.3389/fpsyg.2018.00837 |
work_keys_str_mv | AT bortzgraziela disagreementonsightsingingassessmentofundergraduatemusicians AT germanonayanag disagreementonsightsingingassessmentofundergraduatemusicians AT cogomoreirahugo disagreementonsightsingingassessmentofundergraduatemusicians |