
Examining Agreement between Clinicians when Assessing Sick Children

BACKGROUND: Case management guidelines use a limited set of clinical features to guide assessment and treatment of common childhood diseases in poor countries. Using video records of clinical signs, we assessed agreement among experts and whether Kenyan health workers could identify signs defined by expert consensus. METHODOLOGY: 104 videos representing 11 clinical sign categories were presented to experts using a web questionnaire. Proportionate agreement and agreement beyond chance were calculated using kappa and the AC1 statistic. 31 videos were then selected and presented to local health workers: 20 for which experts had demonstrated clear agreement and 11 for which they had not. PRINCIPAL FINDINGS: Experts reached very high levels of chance-adjusted agreement for some videos, while for a few videos no agreement beyond chance was found. Where experts agreed, Kenyan hospital staff of all cadres recognised the signs with high mean sensitivity and specificity (sensitivity: 0.897–0.975; specificity: 0.813–0.894); years of experience, gender and hospital had no influence on mean sensitivity or specificity. Local health workers did not agree on videos where experts had shown low or no agreement. Results of different agreement statistics for multiple observers, the AC1 and Fleiss' kappa, differ across the range of proportionate agreement. CONCLUSION: Videos provide a useful means of testing agreement among geographically diverse groups of health workers. Kenyan health workers agree with experts where clinical signs are clear-cut, supporting the potential value of assessment and management guidelines. However, clinical signs are not always clear-cut, and video recordings offer one means of helping to standardise their interpretation.
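
As a rough illustration of why the abstract contrasts Fleiss' kappa with Gwet's AC1, the Python sketch below computes proportionate agreement and both chance-adjusted statistics from their standard published formulas for a toy multi-rater, binary-sign example. It is not the study's analysis code; the ratings, function names and choice of Python are assumptions made only for illustration.

def rating_counts(ratings, categories):
    """For each item, count how many raters chose each category."""
    return [[item.count(k) for k in categories] for item in ratings]

def observed_agreement(counts, n_raters):
    """Average pairwise rater agreement across items (proportionate agreement)."""
    per_item = [sum(c * (c - 1) for c in item) / (n_raters * (n_raters - 1))
                for item in counts]
    return sum(per_item) / len(per_item)

def category_proportions(counts, n_raters):
    """Overall proportion of all ratings falling in each category."""
    n_items = len(counts)
    return [sum(item[j] for item in counts) / (n_items * n_raters)
            for j in range(len(counts[0]))]

def fleiss_kappa(ratings, categories):
    n_raters = len(ratings[0])
    counts = rating_counts(ratings, categories)
    po = observed_agreement(counts, n_raters)
    p = category_proportions(counts, n_raters)
    pe = sum(pj ** 2 for pj in p)  # chance agreement from category prevalences
    return (po - pe) / (1 - pe)

def gwet_ac1(ratings, categories):
    n_raters = len(ratings[0])
    counts = rating_counts(ratings, categories)
    po = observed_agreement(counts, n_raters)
    p = category_proportions(counts, n_raters)
    pe = sum(pj * (1 - pj) for pj in p) / (len(categories) - 1)  # Gwet's chance term
    return (po - pe) / (1 - pe)

if __name__ == "__main__":
    cats = ["present", "absent"]
    # Toy data: 10 videos, 4 hypothetical raters; the sign is rare, so most
    # ratings are "absent" and the raters disagree on only two videos.
    ratings = ([["absent"] * 4] * 8
               + [["present", "absent", "absent", "absent"]]
               + [["present", "present", "present", "absent"]])
    print(round(observed_agreement(rating_counts(ratings, cats), 4), 3))  # 0.9
    print(round(fleiss_kappa(ratings, cats), 3))                          # ~0.444
    print(round(gwet_ac1(ratings, cats), 3))                              # ~0.878

On this skewed toy example the raters agree on 90% of rating pairs, yet Fleiss' kappa is only about 0.44 while AC1 is about 0.88, showing how the two statistics can diverge on the same proportionate agreement when one category dominates.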


Bibliographic Details
Main Authors: Wagai, John; Senga, John; Fegan, Greg; English, Mike
Format: Text
Language: English
Published: Public Library of Science, 2009
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2644760/
https://www.ncbi.nlm.nih.gov/pubmed/19247448
http://dx.doi.org/10.1371/journal.pone.0004626
collection PubMed
format Text
id pubmed-2644760
institution National Center for Biotechnology Information
language English
publishDate 2009-02-27
publisher Public Library of Science
record_format MEDLINE/PubMed
journal PLoS One
license Wagai et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.
topic Research Article