
Measuring ability to assess claims about treatment effects: a latent trait analysis of items from the ‘Claim Evaluation Tools’ database using Rasch modelling

Bibliographic Details
Main Authors: Austvoll-Dahlgren, Astrid; Guttersrud, Øystein; Nsangi, Allen; Semakula, Daniel; Oxman, Andrew D
Format: Online, Article, Text
Language: English
Published: BMJ Open, 2017
Subjects: Patient-Centred Medicine
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5777469/
https://www.ncbi.nlm.nih.gov/pubmed/28550019
http://dx.doi.org/10.1136/bmjopen-2016-013185
author Austvoll-Dahlgren, Astrid
Guttersrud, Øystein
Nsangi, Allen
Semakula, Daniel
Oxman, Andrew D
collection PubMed
description
BACKGROUND: The Claim Evaluation Tools database contains multiple-choice items for measuring people’s ability to apply the key concepts they need to know to be able to assess treatment claims. We assessed items from the database using Rasch analysis to develop an outcome measure for use in two randomised trials in Uganda. Rasch analysis is a form of psychometric testing relying on Item Response Theory. It is a dynamic way of developing outcome measures that are valid and reliable.
OBJECTIVES: To assess the validity, reliability and responsiveness of 88 items addressing 22 key concepts using Rasch analysis.
PARTICIPANTS: We administered four sets of multiple-choice items in English to 1114 people in Uganda and Norway, of whom 685 were children and 429 were adults (including 171 health professionals). We scored all items dichotomously. We explored summary and individual fit statistics using the RUMM2030 analysis package. We used SPSS to perform distractor analysis.
RESULTS: Most items conformed well to the Rasch model, but some needed revision. Overall, the four item sets had satisfactory reliability. We did not identify significant response dependence between any pairs of items and, overall, the magnitude of multidimensionality in the data was acceptable. The items had a high level of difficulty.
CONCLUSION: Most of the items conformed well to the Rasch model’s expectations. Following revision of some items, we concluded that most were suitable for use in an outcome measure for evaluating the ability of children or adults to assess treatment claims.
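For readers unfamiliar with the method named in the abstract, the sketch below illustrates the dichotomous Rasch model that underlies this kind of scoring: the probability that a person with ability θ answers an item of difficulty b correctly is exp(θ - b) / (1 + exp(θ - b)), with both parameters on a common logit scale. This is a minimal, illustrative Python sketch, not the authors’ RUMM2030 or SPSS analysis; the ability and difficulty values are made up for the example.

import numpy as np

def rasch_probability(theta, b):
    # Dichotomous Rasch model: probability of a correct response for a
    # person with ability `theta` on an item with difficulty `b`
    # (both expressed in logits on the same scale).
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Hypothetical values for illustration only.
abilities = np.array([-1.0, 0.0, 1.5])      # three example respondents (logits)
difficulties = np.array([-0.5, 0.8, 2.0])   # three example items (logits)

# Expected probability of a correct response for every person-item pair;
# when item difficulty exceeds person ability the probability falls below 0.5,
# which is what "the items had a high level of difficulty" means in practice.
expected = rasch_probability(abilities[:, None], difficulties[None, :])
print(np.round(expected, 2))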
format Online
Article
Text
id pubmed-5777469
institution National Center for Biotechnology Information
language English
publishDate 2017
publisher BMJ Open
record_format MEDLINE/PubMed
spelling pubmed-5777469 2018-01-29
published BMJ Open 2017-05-25
rights © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted. This is an Open Access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/
title Measuring ability to assess claims about treatment effects: a latent trait analysis of items from the ‘Claim Evaluation Tools’ database using Rasch modelling
topic Patient-Centred Medicine
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5777469/
https://www.ncbi.nlm.nih.gov/pubmed/28550019
http://dx.doi.org/10.1136/bmjopen-2016-013185