What do measures of agreement (κ) tell us about quality of exposure assessment? Theoretical analysis and numerical simulation
Main authors: | Burstyn, Igor; de Vocht, Frank; Gustafson, Paul |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | BMJ Publishing Group, 2013 |
Subjects: | Research Methods |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3855494/ https://www.ncbi.nlm.nih.gov/pubmed/24302507 http://dx.doi.org/10.1136/bmjopen-2013-003952 |
_version_ | 1782294926218756096 |
---|---|
author | Burstyn, Igor de Vocht, Frank Gustafson, Paul |
author_facet | Burstyn, Igor de Vocht, Frank Gustafson, Paul |
author_sort | Burstyn, Igor |
collection | PubMed |
description | BACKGROUND: The reliability of binary exposure classification methods is routinely reported in occupational health literature because it is viewed as an important component of evaluating the trustworthiness of the exposure assessment by experts. The Kappa statistics (κ) are typically employed to assess how well raters or classification systems agree in a variety of contexts, such as identifying exposed participants in a population-based epidemiological study of risks due to occupational exposures. However, the question we are really interested in is not so much the reliability of an exposure assessment method, although this holds value in itself, but the validity of the exposure estimates. The validity of binary classifiers can be expressed as a method's sensitivity (SN) and specificity (SP), estimated from its agreement with the error-free classifier. METHODS AND RESULTS: We describe a simulation-based method for deriving information on SN and SP that can be derived from κ and the prevalence of exposure, since an analytic solution is not possible without restrictive assumptions. This work is illustrated in the context of comparison of job-exposure matrices assessing occupational exposures to polycyclic aromatic hydrocarbons. DISCUSSION: Our approach allows the investigators to evaluate how good their exposure-assessment methods truly are, not just how well they agree with each other, and should lead to incorporation of information of validity of expert assessment methods into formal uncertainty analyses in epidemiology. |
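The description above outlines a simulation approach: given an observed κ between two fallible binary classifiers and the prevalence of exposure, search for (SN, SP) combinations that are compatible with that κ. A minimal sketch of this idea in Python follows; it assumes the two raters are conditionally independent given true exposure status, and the specific prior ranges, tolerance, and the `expected_kappa` helper are illustrative choices, not the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(42)

def expected_kappa(prev, sn1, sp1, sn2, sp2):
    """Cohen's kappa between two conditionally independent binary
    classifiers with the given sensitivities/specificities, at true
    exposure prevalence `prev`."""
    # Joint cell probabilities for (rater1, rater2) calls,
    # marginalized over the true exposure status.
    p11 = prev * sn1 * sn2 + (1 - prev) * (1 - sp1) * (1 - sp2)
    p10 = prev * sn1 * (1 - sn2) + (1 - prev) * (1 - sp1) * sp2
    p01 = prev * (1 - sn1) * sn2 + (1 - prev) * sp1 * (1 - sp2)
    p00 = prev * (1 - sn1) * (1 - sn2) + (1 - prev) * sp1 * sp2
    po = p11 + p00                        # observed agreement
    m1 = p11 + p10                        # rater 1 marginal P(exposed)
    m2 = p11 + p01                        # rater 2 marginal P(exposed)
    pe = m1 * m2 + (1 - m1) * (1 - m2)    # chance agreement
    return (po - pe) / (1 - pe)

# Monte Carlo search: which (SN, SP) pairs are compatible with an
# observed kappa of ~0.5 at 20% exposure prevalence? (Values here
# are hypothetical, chosen only to illustrate the mechanics.)
kappa_obs, prev, tol = 0.5, 0.2, 0.02
draws = rng.uniform(0.5, 1.0, size=(100_000, 4))  # SN1, SP1, SN2, SP2
keep = np.array([d for d in draws
                 if abs(expected_kappa(prev, *d) - kappa_obs) <= tol])
print(f"{len(keep)} compatible draws; "
      f"SN1 spans {keep[:, 0].min():.2f}-{keep[:, 0].max():.2f}")
```

The printed range makes the paper's point concrete: a single κ value is consistent with a wide band of sensitivities and specificities, so agreement alone pins down validity only loosely.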
format | Online Article Text |
id | pubmed-3855494 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2013 |
publisher | BMJ Publishing Group |
record_format | MEDLINE/PubMed |
spelling | pubmed-3855494 2013-12-09 What do measures of agreement (κ) tell us about quality of exposure assessment? Theoretical analysis and numerical simulation Burstyn, Igor de Vocht, Frank Gustafson, Paul BMJ Open Research Methods BMJ Publishing Group 2013-12-03 /pmc/articles/PMC3855494/ /pubmed/24302507 http://dx.doi.org/10.1136/bmjopen-2013-003952 Text en Published by the BMJ Publishing Group Limited.
For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions This is an Open Access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 3.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/3.0/ |
spellingShingle | Research Methods Burstyn, Igor de Vocht, Frank Gustafson, Paul What do measures of agreement (κ) tell us about quality of exposure assessment? Theoretical analysis and numerical simulation |
title | What do measures of agreement (κ) tell us about quality of exposure assessment? Theoretical analysis and numerical simulation |
title_full | What do measures of agreement (κ) tell us about quality of exposure assessment? Theoretical analysis and numerical simulation |
title_fullStr | What do measures of agreement (κ) tell us about quality of exposure assessment? Theoretical analysis and numerical simulation |
title_full_unstemmed | What do measures of agreement (κ) tell us about quality of exposure assessment? Theoretical analysis and numerical simulation |
title_short | What do measures of agreement (κ) tell us about quality of exposure assessment? Theoretical analysis and numerical simulation |
title_sort | what do measures of agreement (κ) tell us about quality of exposure assessment? theoretical analysis and numerical simulation |
topic | Research Methods |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3855494/ https://www.ncbi.nlm.nih.gov/pubmed/24302507 http://dx.doi.org/10.1136/bmjopen-2013-003952 |
work_keys_str_mv | AT burstynigor whatdomeasuresofagreementktellusaboutqualityofexposureassessmenttheoreticalanalysisandnumericalsimulation AT devochtfrank whatdomeasuresofagreementktellusaboutqualityofexposureassessmenttheoreticalanalysisandnumericalsimulation AT gustafsonpaul whatdomeasuresofagreementktellusaboutqualityofexposureassessmenttheoreticalanalysisandnumericalsimulation |