
Validity, Reliability and Acceptability of the Team Standardized Assessment of Clinical Encounter Report


Bibliographic Details
Main Authors: Wong, Camilla L., Norris, Mireille, Sinha, Samir S., Zorzitto, Maria L., Madala, Sushma, Hamid, Jemila S.
Format: Online Article Text
Language: English
Published: Canadian Geriatrics Society 2016
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5178860/
https://www.ncbi.nlm.nih.gov/pubmed/28050222
http://dx.doi.org/10.5770/cgj.19.234
_version_ 1782485268774780928
author Wong, Camilla L.
Norris, Mireille
Sinha, Samir S.
Zorzitto, Maria L.
Madala, Sushma
Hamid, Jemila S.
author_facet Wong, Camilla L.
Norris, Mireille
Sinha, Samir S.
Zorzitto, Maria L.
Madala, Sushma
Hamid, Jemila S.
author_sort Wong, Camilla L.
collection PubMed
description BACKGROUND: The Team Standardized Assessment of a Clinical Encounter Report (StACER) was designed for use in Geriatric Medicine residency programs to evaluate Communicator and Collaborator competencies. METHODS: The Team StACER was completed by two geriatricians and interdisciplinary team members based on observations during a geriatric medicine team meeting. Postgraduate trainees were recruited from July 2010–November 2013. Inter-rater reliability between the two geriatricians and between all team members was determined. Internal consistency of items for the Communicator and Collaborator constructs was calculated. Raters completed a survey previously administered to Canadian geriatricians to assess face validity. Trainees completed a survey to determine the usefulness of this instrument as a feedback tool. RESULTS: Thirty postgraduate trainees participated. The prevalence-adjusted bias-adjusted kappa for inter-rater reliability ranged from 0.87–1.00 for Communicator items and from 0.86–1.00 for Collaborator items. The Cronbach’s alpha coefficients for Communicator and Collaborator items were 0.997 (95% CI: 0.993–1.00) and 0.997 (95% CI: 0.997–1.00), respectively. The instrument lacked discriminatory power, as all trainees scored “meets requirements” in the overall assessment. Ninety-three per cent and 86% of trainees found the feedback useful for developing Communicator and Collaborator competencies, respectively. CONCLUSIONS: The Team StACER has adequate inter-rater reliability and internal consistency. Poor discriminatory power and face validity challenge the merit of using this evaluation tool. Trainees felt the tool provided useful feedback on Collaborator and Communicator competencies.
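The two statistics reported in the abstract can be illustrated with a minimal sketch. This is not the authors' analysis code; the ratings below are made-up binary item scores (1 = criterion met, 0 = not met) and made-up Likert item scores, used only to show how prevalence-adjusted bias-adjusted kappa (PABAK) and Cronbach's alpha are typically computed.

```python
def pabak(ratings_a, ratings_b, n_categories=2):
    """Prevalence-adjusted bias-adjusted kappa:
    PABAK = (k * p_o - 1) / (k - 1), where p_o is the observed
    proportion of agreement and k the number of rating categories."""
    agree = sum(a == b for a, b in zip(ratings_a, ratings_b))
    p_o = agree / len(ratings_a)
    return (n_categories * p_o - 1) / (n_categories - 1)

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item score vectors
    (one list per item, respondents in the same order)."""
    def pvar(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    k = len(items)
    n = len(items[0])
    item_vars = sum(pvar(item) for item in items)
    totals = [sum(item[j] for item in items) for j in range(n)]
    return (k / (k - 1)) * (1 - item_vars / pvar(totals))

# Hypothetical ratings from two raters on ten checklist items:
rater1 = [1, 1, 1, 0, 1, 1, 1, 1, 0, 1]
rater2 = [1, 1, 1, 0, 1, 1, 0, 1, 0, 1]
print(round(pabak(rater1, rater2), 2))  # → 0.8

# Hypothetical scores on three related items for five trainees:
items = [[4, 5, 3, 5, 4],
         [4, 4, 3, 5, 4],
         [5, 5, 3, 5, 4]]
print(round(cronbach_alpha(items), 2))  # → 0.93
```

PABAK is often preferred over plain Cohen's kappa when, as here, nearly all trainees receive the same rating: high prevalence of one category deflates ordinary kappa even when raters agree almost perfectly.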
format Online
Article
Text
id pubmed-5178860
institution National Center for Biotechnology Information
language English
publishDate 2016
publisher Canadian Geriatrics Society
record_format MEDLINE/PubMed
spelling pubmed-51788602017-01-03 Validity, Reliability and Acceptability of the Team Standardized Assessment of Clinical Encounter Report Wong, Camilla L. Norris, Mireille Sinha, Samir S. Zorzitto, Maria L. Madala, Sushma Hamid, Jemila S. Can Geriatr J Original Research BACKGROUND: The Team Standardized Assessment of a Clinical Encounter Report (StACER) was designed for use in Geriatric Medicine residency programs to evaluate Communicator and Collaborator competencies. METHODS: The Team StACER was completed by two geriatricians and interdisciplinary team members based on observations during a geriatric medicine team meeting. Postgraduate trainees were recruited from July 2010–November 2013. Inter-rater reliability between the two geriatricians and between all team members was determined. Internal consistency of items for the Communicator and Collaborator constructs was calculated. Raters completed a survey previously administered to Canadian geriatricians to assess face validity. Trainees completed a survey to determine the usefulness of this instrument as a feedback tool. RESULTS: Thirty postgraduate trainees participated. The prevalence-adjusted bias-adjusted kappa for inter-rater reliability ranged from 0.87–1.00 for Communicator items and from 0.86–1.00 for Collaborator items. The Cronbach’s alpha coefficients for Communicator and Collaborator items were 0.997 (95% CI: 0.993–1.00) and 0.997 (95% CI: 0.997–1.00), respectively. The instrument lacked discriminatory power, as all trainees scored “meets requirements” in the overall assessment. Ninety-three per cent and 86% of trainees found the feedback useful for developing Communicator and Collaborator competencies, respectively. CONCLUSIONS: The Team StACER has adequate inter-rater reliability and internal consistency. Poor discriminatory power and face validity challenge the merit of using this evaluation tool. Trainees felt the tool provided useful feedback on Collaborator and Communicator competencies.
Canadian Geriatrics Society 2016-12-23 /pmc/articles/PMC5178860/ /pubmed/28050222 http://dx.doi.org/10.5770/cgj.19.234 Text en © 2016 Author(s). Published by the Canadian Geriatrics Society. This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial No-Derivative license (http://creativecommons.org/licenses/by-nc-nd/2.5/ca/), which permits unrestricted non-commercial use and distribution, provided the original work is properly cited.
spellingShingle Original Research
Wong, Camilla L.
Norris, Mireille
Sinha, Samir S.
Zorzitto, Maria L.
Madala, Sushma
Hamid, Jemila S.
Validity, Reliability and Acceptability of the Team Standardized Assessment of Clinical Encounter Report
title Validity, Reliability and Acceptability of the Team Standardized Assessment of Clinical Encounter Report
title_full Validity, Reliability and Acceptability of the Team Standardized Assessment of Clinical Encounter Report
title_fullStr Validity, Reliability and Acceptability of the Team Standardized Assessment of Clinical Encounter Report
title_full_unstemmed Validity, Reliability and Acceptability of the Team Standardized Assessment of Clinical Encounter Report
title_short Validity, Reliability and Acceptability of the Team Standardized Assessment of Clinical Encounter Report
title_sort validity, reliability and acceptability of the team standardized assessment of clinical encounter report
topic Original Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5178860/
https://www.ncbi.nlm.nih.gov/pubmed/28050222
http://dx.doi.org/10.5770/cgj.19.234
work_keys_str_mv AT wongcamillal validityreliabilityandacceptabilityoftheteamstandardizedassessmentofclinicalencounterreport
AT norrismireille validityreliabilityandacceptabilityoftheteamstandardizedassessmentofclinicalencounterreport
AT sinhasamirs validityreliabilityandacceptabilityoftheteamstandardizedassessmentofclinicalencounterreport
AT zorzittomarial validityreliabilityandacceptabilityoftheteamstandardizedassessmentofclinicalencounterreport
AT madalasushma validityreliabilityandacceptabilityoftheteamstandardizedassessmentofclinicalencounterreport
AT hamidjemilas validityreliabilityandacceptabilityoftheteamstandardizedassessmentofclinicalencounterreport