
Bringing Feedback in From the Outback via a Generic and Preference-Sensitive Instrument for Course Quality Assessment


Bibliographic Details
Main Authors: Kaltoft, Mette K, Nielsen, Jesper B, Salkeld, Glenn, Lander, Jo, Dowie, Jack
Format: Online Article Text
Language: English
Published: JMIR Publications Inc. 2015
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4376236/
https://www.ncbi.nlm.nih.gov/pubmed/25720558
http://dx.doi.org/10.2196/resprot.4012
_version_ 1782363711159140352
author Kaltoft, Mette K
Nielsen, Jesper B
Salkeld, Glenn
Lander, Jo
Dowie, Jack
author_facet Kaltoft, Mette K
Nielsen, Jesper B
Salkeld, Glenn
Lander, Jo
Dowie, Jack
author_sort Kaltoft, Mette K
collection PubMed
description BACKGROUND: Much effort and many resources have been put into developing ways of eliciting valid and informative student feedback on courses in medical, nursing, and other health professional schools. Whatever their motivation, items, and setting, the response rates have usually been disappointingly low, and there seems to be an acceptance that the results are potentially biased. OBJECTIVE: The objective of the study was to look at an innovative approach to course assessment by students in the health professions. This approach was designed to make it an integral part of their educational experience, rather than a marginal, terminal, and optional add-on as “feedback”. It becomes a weighted, but ungraded, part of the course assignment requirements. METHODS: A ten-item, two-part Internet instrument, MyCourseQuality (MCQ-10D), was developed following a purposive review of previous instruments. Shorthand labels for the criteria are: Content, Organization, Perspective, Presentations, Materials, Relevance, Workload, Support, Interactivity, and Assessment. The assessment is unique in being dually personalized. In part 1, at the beginning of the course, the student enters their importance weights for the ten criteria. In part 2, at its completion, they rate the course on the same criteria. Their ratings and weightings are combined in a simple expected-value calculation to produce their dually personalized and decomposable MCQ score. Satisfactory (technical) completion of both parts contributes 10% of the marks available in the course. Providers are required to make the relevant characteristics of the course fully transparent at enrollment, and the course is to be rated as offered. A separate item appended to the survey allows students to suggest changes to what is offered. Students also complete (anonymously) the standard feedback form in the setting concerned. 
RESULTS: Piloting in a medical school and health professional school will establish the organizational feasibility and acceptability of the approach (a version of which has been employed in one medical school previously), as well as its impact on provider behavior and intentions, and on student engagement and responsiveness. The priorities for future improvements in terms of the specified criteria are identified at both individual and group level. The group results from MCQ will be compared with those from the standard feedback questionnaire, which will also be completed anonymously by the same students (or some percentage of them). CONCLUSIONS: We present a protocol for the piloting of a student-centered, dually personalized course quality instrument that forms part of the assignment requirements and is therefore an integral part of the course. If, and how, such an essentially formative Student-Reported Outcome or Experience Measure can be used summatively, at unit or program level, remains to be determined, and is not our concern here.
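The "simple expected-value calculation" described in METHODS — each student's importance weights for the ten criteria combined with their end-of-course ratings to give a dually personalized, decomposable score — can be sketched as follows. This is an illustrative sketch only, not the authors' published scoring code; the function name, the 0–10 rating scale, and the weight normalization are assumptions.

```python
# Illustrative sketch (assumptions noted above) of a weighted
# expected-value score over the ten MCQ-10D criteria.

CRITERIA = ["Content", "Organization", "Perspective", "Presentations",
            "Materials", "Relevance", "Workload", "Support",
            "Interactivity", "Assessment"]

def mcq_score(weights, ratings):
    """Weighted average of the ten criterion ratings.

    weights -- dict: criterion -> importance weight (any positive scale;
               normalized internally so weights sum to 1)
    ratings -- dict: criterion -> rating (assumed 0-10 scale)
    Returns (overall score, per-criterion contributions); the
    contributions make the score decomposable by criterion.
    """
    total_w = sum(weights[c] for c in CRITERIA)
    contributions = {c: (weights[c] / total_w) * ratings[c]
                     for c in CRITERIA}
    return sum(contributions.values()), contributions

# Hypothetical student who weights Relevance and Workload most heavily.
w = {c: 1 for c in CRITERIA}
w["Relevance"], w["Workload"] = 3, 2
r = {c: 7 for c in CRITERIA}
r["Relevance"] = 9
score, parts = mcq_score(w, r)
```

Because the contributions are kept per criterion, the same data that yields the overall score also identifies which criteria drive it — consistent with the protocol's aim of identifying improvement priorities at individual and group level.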
format Online
Article
Text
id pubmed-4376236
institution National Center for Biotechnology Information
language English
publishDate 2015
publisher JMIR Publications Inc.
record_format MEDLINE/PubMed
spelling pubmed-4376236 2015-04-02 Bringing Feedback in From the Outback via a Generic and Preference-Sensitive Instrument for Course Quality Assessment Kaltoft, Mette K Nielsen, Jesper B Salkeld, Glenn Lander, Jo Dowie, Jack JMIR Res Protoc Original Paper JMIR Publications Inc. 2015-02-13 /pmc/articles/PMC4376236/ /pubmed/25720558 http://dx.doi.org/10.2196/resprot.4012 Text en ©Mette K Kaltoft, Jesper B Nielsen, Glenn Salkeld, Jo Lander, Jack Dowie. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 13.02.2015. http://creativecommons.org/licenses/by/2.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Research Protocols, is properly cited.
The complete bibliographic information, a link to the original publication on http://www.researchprotocols.org, as well as this copyright and license information must be included.
spellingShingle Original Paper
Kaltoft, Mette K
Nielsen, Jesper B
Salkeld, Glenn
Lander, Jo
Dowie, Jack
Bringing Feedback in From the Outback via a Generic and Preference-Sensitive Instrument for Course Quality Assessment
title Bringing Feedback in From the Outback via a Generic and Preference-Sensitive Instrument for Course Quality Assessment
title_full Bringing Feedback in From the Outback via a Generic and Preference-Sensitive Instrument for Course Quality Assessment
title_fullStr Bringing Feedback in From the Outback via a Generic and Preference-Sensitive Instrument for Course Quality Assessment
title_full_unstemmed Bringing Feedback in From the Outback via a Generic and Preference-Sensitive Instrument for Course Quality Assessment
title_short Bringing Feedback in From the Outback via a Generic and Preference-Sensitive Instrument for Course Quality Assessment
title_sort bringing feedback in from the outback via a generic and preference-sensitive instrument for course quality assessment
topic Original Paper
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4376236/
https://www.ncbi.nlm.nih.gov/pubmed/25720558
http://dx.doi.org/10.2196/resprot.4012
work_keys_str_mv AT kaltoftmettek bringingfeedbackinfromtheoutbackviaagenericandpreferencesensitiveinstrumentforcoursequalityassessment
AT nielsenjesperb bringingfeedbackinfromtheoutbackviaagenericandpreferencesensitiveinstrumentforcoursequalityassessment
AT salkeldglenn bringingfeedbackinfromtheoutbackviaagenericandpreferencesensitiveinstrumentforcoursequalityassessment
AT landerjo bringingfeedbackinfromtheoutbackviaagenericandpreferencesensitiveinstrumentforcoursequalityassessment
AT dowiejack bringingfeedbackinfromtheoutbackviaagenericandpreferencesensitiveinstrumentforcoursequalityassessment