
Clinical excellence: evidence on the assessment of senior doctors' applications to the UK Advisory Committee on Clinical Excellence Awards. Analysis of complete national data set


Bibliographic Details
Main Authors: Campbell, John L, Abel, Gary
Format: Online Article Text
Language: English
Published: BMJ Publishing Group 2016
Subjects: Research Methods
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4893866/
https://www.ncbi.nlm.nih.gov/pubmed/27256095
http://dx.doi.org/10.1136/bmjopen-2016-011958
author Campbell, John L
Abel, Gary
collection PubMed
description OBJECTIVES: To inform the rational deployment of assessor resource in the evaluation of applications to the UK Advisory Committee on Clinical Excellence Awards (ACCEA). SETTING: ACCEA are responsible for a scheme to financially reward senior doctors in England and Wales who are assessed to be working over and above the standard expected of their role. PARTICIPANTS: Anonymised applications of consultants and senior academic GPs for awards were considered by members of 14 regional subcommittees and 2 national assessing committees during the 2014–2015 round of applications. DESIGN: Secondary analysis of the complete anonymised national data set. PRIMARY AND SECONDARY OUTCOME MEASURES: We analysed scores for each of 1916 applications for a clinical excellence award across 4 levels of award. Scores were provided by members of 16 subcommittees. We assessed the reliability of assessments and described the variance in the assessment of scores. RESULTS: Members of regional subcommittees assessed 1529 new applications and 387 renewal applications. Average scores increased with the level of application being made. On average, applications were assessed by 9.5 assessors. The largest contributions to the variance in individual assessors' assessments of applications were attributable to assessors or to residual variance. The applicant accounted for around a quarter of the variance in scores for new bronze applications, with this proportion decreasing for higher award levels. Reliability in excess of 0.7 can be attained where 4 assessors score bronze applications, with twice as many assessors being required for higher levels of application. CONCLUSIONS: Assessment processes pertaining to the competitive allocation of public funds need to be credible and efficient. The present arrangements for assessing and scoring applications are defensible, depending on the level of reliability judged to be required in the assessment process. Some relatively minor reconfiguration in approaches to scoring might usefully be considered in future rounds of assessment.
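The RESULTS above relate the share of score variance attributable to the applicant to the number of assessors needed for a reliable mean score. As a rough illustration only, the Python sketch below applies the standard Spearman-Brown projection to a hypothetical single-facet model in which applicant_variance_share plays the role of a single-assessor intraclass correlation; the variance shares used are placeholders, not the paper's own variance-components estimates, and the paper's generalizability analysis (which models assessor and residual variance separately) may imply different panel sizes.

# Illustrative sketch, not the authors' model: Spearman-Brown projection of how
# the reliability of a mean score grows with the number of assessors, given the
# share of score variance attributable to the applicant (treated here as the
# single-assessor intraclass correlation). The variance shares tried below are
# hypothetical placeholders.

def projected_reliability(applicant_variance_share: float, n_assessors: int) -> float:
    """Reliability of the mean of n_assessors independent assessors' scores."""
    rho = applicant_variance_share
    return n_assessors * rho / (1 + (n_assessors - 1) * rho)

def assessors_needed(applicant_variance_share: float, target: float = 0.7) -> int:
    """Smallest panel size whose mean score reaches the target reliability."""
    n = 1
    while projected_reliability(applicant_variance_share, n) < target:
        n += 1
    return n

if __name__ == "__main__":
    for share in (0.15, 0.25, 0.35):
        print(f"applicant variance share {share:.2f}: "
              f"{assessors_needed(share)} assessors for reliability >= 0.7")

Under this simplified projection, a lower applicant variance share (as reported for the higher award levels) requires a larger panel to reach the same reliability, which matches the qualitative pattern described in the abstract; the exact panel sizes quoted there come from the authors' own variance-components analysis rather than from this formula.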
format Online
Article
Text
id pubmed-4893866
institution National Center for Biotechnology Information
language English
publishDate 2016
publisher BMJ Publishing Group
record_format MEDLINE/PubMed
spelling pubmed-4893866 2016-06-09 BMJ Open, Research Methods. BMJ Publishing Group 2016-06-02 Text en Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/ This is an Open Access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) license, which permits others to distribute, remix, adapt and build upon this work, for commercial use, provided the original work is properly cited. See: http://creativecommons.org/licenses/by/4.0/
title Clinical excellence: evidence on the assessment of senior doctors' applications to the UK Advisory Committee on Clinical Excellence Awards. Analysis of complete national data set
topic Research Methods