Writing to Learn: An Evaluation of the Calibrated Peer Review™ Program in Two Neuroscience Courses
Although the majority of scientific information is communicated in written form, and peer review is the primary process by which it is validated, undergraduate students may receive little direct training in science writing or peer review. Here, I describe the use of Calibrated Peer Review™ (CPR), a free, web-based writing and peer review program designed to alleviate instructor workload, in two undergraduate neuroscience courses: an upper-level sensation and perception course (41 students, three assignments) and an introductory neuroscience course (50 students, two assignments). Using CPR online, students reviewed primary research articles on assigned ‘hot’ topics, wrote short essays in response to specific guiding questions, reviewed standard ‘calibration’ essays, and provided anonymous quantitative and qualitative peer reviews. An automated grading system calculated the final scores based on a student’s essay quality (as determined by the average of three peer reviews) and his or her accuracy in evaluating 1) three standard calibration essays, 2) three anonymous peer reviews, and 3) his or her self-review. Thus, students were assessed not only on their skill at constructing logical, evidence-based arguments, but also on their ability to accurately evaluate their peers’ writing. According to both student self-reports and instructor observation, students’ writing and peer review skills improved over the course of the semester. Student evaluation of the CPR program was mixed; while some students felt that the peer review process enhanced their understanding of the material and improved their writing, others felt as though the process was biased and required too much time. Despite student critiques of the program, I still recommend the CPR program as an excellent and free resource for incorporating more writing, peer review, and critical thinking into an undergraduate neuroscience curriculum.
Main Author: | Prichard, J. Roxanne |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Faculty for Undergraduate Neuroscience, 2005 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3592621/ https://www.ncbi.nlm.nih.gov/pubmed/23493247 |
_version_ | 1782262144665911296 |
---|---|
author | Prichard, J. Roxanne |
author_facet | Prichard, J. Roxanne |
author_sort | Prichard, J. Roxanne |
collection | PubMed |
description | Although the majority of scientific information is communicated in written form, and peer review is the primary process by which it is validated, undergraduate students may receive little direct training in science writing or peer review. Here, I describe the use of Calibrated Peer Review™ (CPR), a free, web-based writing and peer review program designed to alleviate instructor workload, in two undergraduate neuroscience courses: an upper-level sensation and perception course (41 students, three assignments) and an introductory neuroscience course (50 students, two assignments). Using CPR online, students reviewed primary research articles on assigned ‘hot’ topics, wrote short essays in response to specific guiding questions, reviewed standard ‘calibration’ essays, and provided anonymous quantitative and qualitative peer reviews. An automated grading system calculated the final scores based on a student’s essay quality (as determined by the average of three peer reviews) and his or her accuracy in evaluating 1) three standard calibration essays, 2) three anonymous peer reviews, and 3) his or her self-review. Thus, students were assessed not only on their skill at constructing logical, evidence-based arguments, but also on their ability to accurately evaluate their peers’ writing. According to both student self-reports and instructor observation, students’ writing and peer review skills improved over the course of the semester. Student evaluation of the CPR program was mixed; while some students felt that the peer review process enhanced their understanding of the material and improved their writing, others felt as though the process was biased and required too much time. Despite student critiques of the program, I still recommend the CPR program as an excellent and free resource for incorporating more writing, peer review, and critical thinking into an undergraduate neuroscience curriculum. |
format | Online Article Text |
id | pubmed-3592621 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2005 |
publisher | Faculty for Undergraduate Neuroscience |
record_format | MEDLINE/PubMed |
spelling | pubmed-3592621 2013-03-14 Writing to Learn: An Evaluation of the Calibrated Peer Review™ Program in Two Neuroscience Courses Prichard, J. Roxanne J Undergrad Neurosci Educ Article Although the majority of scientific information is communicated in written form, and peer review is the primary process by which it is validated, undergraduate students may receive little direct training in science writing or peer review. Here, I describe the use of Calibrated Peer Review™ (CPR), a free, web-based writing and peer review program designed to alleviate instructor workload, in two undergraduate neuroscience courses: an upper-level sensation and perception course (41 students, three assignments) and an introductory neuroscience course (50 students, two assignments). Using CPR online, students reviewed primary research articles on assigned ‘hot’ topics, wrote short essays in response to specific guiding questions, reviewed standard ‘calibration’ essays, and provided anonymous quantitative and qualitative peer reviews. An automated grading system calculated the final scores based on a student’s essay quality (as determined by the average of three peer reviews) and his or her accuracy in evaluating 1) three standard calibration essays, 2) three anonymous peer reviews, and 3) his or her self-review. Thus, students were assessed not only on their skill at constructing logical, evidence-based arguments, but also on their ability to accurately evaluate their peers’ writing. According to both student self-reports and instructor observation, students’ writing and peer review skills improved over the course of the semester. Student evaluation of the CPR program was mixed; while some students felt that the peer review process enhanced their understanding of the material and improved their writing, others felt as though the process was biased and required too much time. Despite student critiques of the program, I still recommend the CPR program as an excellent and free resource for incorporating more writing, peer review, and critical thinking into an undergraduate neuroscience curriculum. Faculty for Undergraduate Neuroscience 2005-10-15 /pmc/articles/PMC3592621/ /pubmed/23493247 Text en Copyright © 2005 Faculty for Undergraduate Neuroscience |
spellingShingle | Article Prichard, J. Roxanne Writing to Learn: An Evaluation of the Calibrated Peer Review™ Program in Two Neuroscience Courses |
title | Writing to Learn: An Evaluation of the Calibrated Peer Review™ Program in Two Neuroscience Courses |
title_full | Writing to Learn: An Evaluation of the Calibrated Peer Review™ Program in Two Neuroscience Courses |
title_fullStr | Writing to Learn: An Evaluation of the Calibrated Peer Review™ Program in Two Neuroscience Courses |
title_full_unstemmed | Writing to Learn: An Evaluation of the Calibrated Peer Review™ Program in Two Neuroscience Courses |
title_short | Writing to Learn: An Evaluation of the Calibrated Peer Review™ Program in Two Neuroscience Courses |
title_sort | writing to learn: an evaluation of the calibrated peer review™ program in two neuroscience courses |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3592621/ https://www.ncbi.nlm.nih.gov/pubmed/23493247 |
work_keys_str_mv | AT prichardjroxanne writingtolearnanevaluationofthecalibratedpeerreviewprogramintwoneurosciencecourses |
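
The abstract describes CPR's automated grading as a combination of essay quality (the average of three peer reviews) and the student's accuracy in evaluating three calibration essays, three anonymous peer reviews, and a self-review. The minimal Python sketch below illustrates that kind of aggregation only in outline; the record does not give CPR's actual formula, so the weights, scales, and function names here are assumptions, not the program's implementation.

```python
# Hypothetical sketch of a CPR-style score aggregation, based only on the
# components named in the abstract. All weights, scales, and function names
# are illustrative assumptions, not CPR's actual formula.

from statistics import mean


def essay_score(peer_ratings):
    """Essay quality: per the abstract, the average of three peer reviews."""
    return mean(peer_ratings)


def reviewing_accuracy(calibration_errors, peer_review_errors,
                       self_review_error, max_error=10.0):
    """Accuracy of a student's evaluations of 1) three calibration essays,
    2) three anonymous peer reviews, and 3) the self-review.
    Modeled here (assumption) as 1 minus the normalized mean absolute
    deviation from the reference scores."""
    errors = list(calibration_errors) + list(peer_review_errors) + [self_review_error]
    return max(0.0, 1.0 - mean(errors) / max_error)


def final_score(peer_ratings, calibration_errors, peer_review_errors,
                self_review_error, essay_weight=0.7, review_weight=0.3):
    """Combine essay quality and reviewing accuracy. The 70/30 split is an
    assumption for illustration; the abstract does not state the weighting."""
    quality = essay_score(peer_ratings)  # e.g., on a 0-10 scale
    accuracy = reviewing_accuracy(calibration_errors, peer_review_errors,
                                  self_review_error)
    return essay_weight * quality + review_weight * accuracy * 10.0


# Example: three peer ratings of 8, 7, 9; small deviations on the three
# calibration essays and the three peer reviews; perfect self-review agreement.
print(final_score([8, 7, 9], [1.0, 0.5, 2.0], [0.5, 1.0, 0.0], 0.0))
```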