
Better to be in agreement than in bad company: A critical analysis of many kappa-like tests


Bibliographic Details
Main Authors: Silveira, Paulo Sergio Panse; Siqueira, Jose Oliveira
Format: Online Article Text
Language: English
Published: Springer US, 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10615996/
https://www.ncbi.nlm.nih.gov/pubmed/36114386
http://dx.doi.org/10.3758/s13428-022-01950-0
author Silveira, Paulo Sergio Panse
Siqueira, Jose Oliveira
collection PubMed
description We assessed several agreement coefficients applied to 2x2 contingency tables, which are common in research because of dichotomization. We not only studied specific estimators but also developed a general method for evaluating any candidate estimator of agreement. This method was implemented in open-source R code and is available to researchers. We tested it by verifying the performance of several traditional estimators over all possible table configurations with sizes ranging from 1 to 68 (1,028,789 tables in total). Cohen’s kappa showed handicapped behavior similar to Pearson’s r, Yule’s Q, and Yule’s Y. Scott’s pi and Shankar and Bangdiwala’s B seem to assess situations of disagreement better than agreement between raters. Krippendorff’s alpha emulates, without any advantage, Scott’s pi in cases with nominal variables and two raters. Dice’s F1 and McNemar’s chi-squared incompletely assess the information in the contingency table, showing the poorest performance of all. We concluded that Cohen’s kappa is a measure of association and that McNemar’s chi-squared assesses neither association nor agreement; the only two authentic agreement estimators are Holley and Guilford’s G and Gwet’s AC1. These two estimators also showed the best performance over the range of table sizes and should be considered the first choices for measuring agreement in 2x2 contingency tables. All procedures and data were implemented in R and are available for download from Harvard Dataverse at https://doi.org/10.7910/DVN/HMYTCK. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.3758/s13428-022-01950-0.
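For readers who want to see what these coefficients compute, the following is a minimal R sketch, not the authors' Dataverse code, of three of the estimators named in the abstract, written for two raters and two categories. The cell labels a, b, c, d and the helper name agree_2x2 are assumptions of this sketch; the formulas are the standard two-rater, two-category expressions for observed agreement, Cohen's kappa, Holley and Guilford's G, and Gwet's AC1. The last line reproduces the count of 1,028,789 tables for totals from 1 to 68, since a 2x2 table with total n can be filled in choose(n + 3, 3) ways.

  # Minimal sketch (not the published Dataverse code): agreement estimators for a
  # single 2x2 table with cells a = both raters "yes", b and c = disagreements,
  # d = both raters "no".
  agree_2x2 <- function(a, b, c, d) {
    n  <- a + b + c + d
    po <- (a + d) / n                                   # observed agreement
    # Cohen's kappa: chance agreement from the product of the raters' margins
    pe_kappa <- ((a + b) * (a + c) + (c + d) * (b + d)) / n^2
    kappa <- (po - pe_kappa) / (1 - pe_kappa)
    # Holley and Guilford's G: agreements minus disagreements, i.e., 2 * po - 1
    G <- (a + d - b - c) / n
    # Gwet's AC1: chance agreement from the mean marginal proportion
    q <- ((a + b) / n + (a + c) / n) / 2
    pe_ac1 <- 2 * q * (1 - q)
    AC1 <- (po - pe_ac1) / (1 - pe_ac1)
    c(po = po, kappa = kappa, G = G, AC1 = AC1)
  }

  round(agree_2x2(a = 40, b = 5, c = 5, d = 10), 3)  # example table, n = 60

  # Number of distinct 2x2 tables with total n is choose(n + 3, 3); summing over
  # n = 1..68 gives the 1,028,789 tables mentioned in the abstract.
  sum(choose((1:68) + 3, 3))  # 1028789

On the example table the three coefficients already diverge (kappa about 0.56, G about 0.67, AC1 about 0.73), which illustrates why the paper compares their behavior over the full space of tables.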
format Online
Article
Text
id pubmed-10615996
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Springer US
record_format MEDLINE/PubMed
spelling pubmed-10615996 2023-11-01 Better to be in agreement than in bad company: A critical analysis of many kappa-like tests. Silveira, Paulo Sergio Panse; Siqueira, Jose Oliveira. Behav Res Methods, Article. Springer US, published online 2022-09-16, issue 2023. /pmc/articles/PMC10615996/ /pubmed/36114386 http://dx.doi.org/10.3758/s13428-022-01950-0 Text en © The Author(s) 2022. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format with appropriate credit to the original authors and source.
title Better to be in agreement than in bad company: A critical analysis of many kappa-like tests
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10615996/
https://www.ncbi.nlm.nih.gov/pubmed/36114386
http://dx.doi.org/10.3758/s13428-022-01950-0