
We Know What You Agreed To, Don't We?—Evaluating the Quality of Paper-Based Consents Forms and Their Digitalized Equivalent Using the Example of the Baltic Fracture Competence Centre Project

Introduction  Informed consent is the legal basis for research with human subjects. Therefore, the consent form (CF), as a legally binding document, must be valid, that is, completely filled in, stating the person's decision clearly, and signed by the respective person. However, paper-based CFs in particular might have quality issues, and the transformation into machine-readable information could introduce additional quality problems. This paper evaluates the quality of, and the quality issues arising in, paper-based CFs using the example of the Baltic Fracture Competence Centre (BFCC) fracture registry. It also evaluates the impact of quality assurance (QA) measures, including site-specific feedback. Finally, it answers the question of whether manual data entry of patients' decisions by clinical staff leads to a significant error rate in digitalized paper-based CFs. Methods  Based on defined quality criteria, monthly QA, including source data verification, was conducted by two independent reviewers from the start of recruitment in December 2017. The basis for the analyses is the CFs collected from December 2017 until February 2019 (the first recruitment period). Results  After QA had initially been conducted internally, a sudden increase in quality issues in May 2018 led to site-specific feedback reports and follow-up training regarding CF quality starting in June 2018. Specific criteria and descriptions of how to correct the CFs helped increase quality in a timely manner. The most common issues were missing pages, missing decisions regarding optional modules, and missing signature(s). Since patients' datasets without valid CFs must be deleted, QA helped retain 65 datasets for research, so that the final data pool consisted of 840 patients (99.29%). Conclusion  All quality issues could be assigned to one of the predefined criteria. Using the example of the BFCC fracture registry, CF-QA proved to significantly increase CF quality and helped preserve the number of datasets available for research. Consequently, the described quality indicators, criteria, and QA processes can be seen as a best-practice approach.


Bibliographic Details
Main Authors: Rau, Henriette, Stahl, Dana, Reichel, Anna-Juliana, Bialke, Martin, Bahls, Thomas, Hoffmann, Wolfgang
Format: Online Article Text
Language: English
Published: Georg Thieme Verlag KG 2023
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10306442/
https://www.ncbi.nlm.nih.gov/pubmed/36623832
http://dx.doi.org/10.1055/s-0042-1760249
_version_ 1785065935742173184
author Rau, Henriette
Stahl, Dana
Reichel, Anna-Juliana
Bialke, Martin
Bahls, Thomas
Hoffmann, Wolfgang
author_facet Rau, Henriette
Stahl, Dana
Reichel, Anna-Juliana
Bialke, Martin
Bahls, Thomas
Hoffmann, Wolfgang
author_sort Rau, Henriette
collection PubMed
description Introduction  Informed consent is the legal basis for research with human subjects. Therefore, the consent form (CF), as a legally binding document, must be valid, that is, completely filled in, stating the person's decision clearly, and signed by the respective person. However, paper-based CFs in particular might have quality issues, and the transformation into machine-readable information could introduce additional quality problems. This paper evaluates the quality of, and the quality issues arising in, paper-based CFs using the example of the Baltic Fracture Competence Centre (BFCC) fracture registry. It also evaluates the impact of quality assurance (QA) measures, including site-specific feedback. Finally, it answers the question of whether manual data entry of patients' decisions by clinical staff leads to a significant error rate in digitalized paper-based CFs. Methods  Based on defined quality criteria, monthly QA, including source data verification, was conducted by two independent reviewers from the start of recruitment in December 2017. The basis for the analyses is the CFs collected from December 2017 until February 2019 (the first recruitment period). Results  After QA had initially been conducted internally, a sudden increase in quality issues in May 2018 led to site-specific feedback reports and follow-up training regarding CF quality starting in June 2018. Specific criteria and descriptions of how to correct the CFs helped increase quality in a timely manner. The most common issues were missing pages, missing decisions regarding optional modules, and missing signature(s). Since patients' datasets without valid CFs must be deleted, QA helped retain 65 datasets for research, so that the final data pool consisted of 840 patients (99.29%). Conclusion  All quality issues could be assigned to one of the predefined criteria. Using the example of the BFCC fracture registry, CF-QA proved to significantly increase CF quality and helped preserve the number of datasets available for research. Consequently, the described quality indicators, criteria, and QA processes can be seen as a best-practice approach.
format Online
Article
Text
id pubmed-10306442
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Georg Thieme Verlag KG
record_format MEDLINE/PubMed
spelling pubmed-10306442 2023-06-29 We Know What You Agreed To, Don't We?—Evaluating the Quality of Paper-Based Consents Forms and Their Digitalized Equivalent Using the Example of the Baltic Fracture Competence Centre Project Rau, Henriette Stahl, Dana Reichel, Anna-Juliana Bialke, Martin Bahls, Thomas Hoffmann, Wolfgang Methods Inf Med Introduction  Informed consent is the legal basis for research with human subjects. Therefore, the consent form (CF), as a legally binding document, must be valid, that is, completely filled in, stating the person's decision clearly, and signed by the respective person. However, paper-based CFs in particular might have quality issues, and the transformation into machine-readable information could introduce additional quality problems. This paper evaluates the quality of, and the quality issues arising in, paper-based CFs using the example of the Baltic Fracture Competence Centre (BFCC) fracture registry. It also evaluates the impact of quality assurance (QA) measures, including site-specific feedback. Finally, it answers the question of whether manual data entry of patients' decisions by clinical staff leads to a significant error rate in digitalized paper-based CFs. Methods  Based on defined quality criteria, monthly QA, including source data verification, was conducted by two independent reviewers from the start of recruitment in December 2017. The basis for the analyses is the CFs collected from December 2017 until February 2019 (the first recruitment period). Results  After QA had initially been conducted internally, a sudden increase in quality issues in May 2018 led to site-specific feedback reports and follow-up training regarding CF quality starting in June 2018. Specific criteria and descriptions of how to correct the CFs helped increase quality in a timely manner. The most common issues were missing pages, missing decisions regarding optional modules, and missing signature(s). Since patients' datasets without valid CFs must be deleted, QA helped retain 65 datasets for research, so that the final data pool consisted of 840 patients (99.29%). Conclusion  All quality issues could be assigned to one of the predefined criteria. Using the example of the BFCC fracture registry, CF-QA proved to significantly increase CF quality and helped preserve the number of datasets available for research. Consequently, the described quality indicators, criteria, and QA processes can be seen as a best-practice approach. Georg Thieme Verlag KG 2023-01-09 /pmc/articles/PMC10306442/ /pubmed/36623832 http://dx.doi.org/10.1055/s-0042-1760249 Text en The Author(s). This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License ( https://creativecommons.org/licenses/by-nc-nd/4.0/ ), permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed, or built upon.
spellingShingle Rau, Henriette
Stahl, Dana
Reichel, Anna-Juliana
Bialke, Martin
Bahls, Thomas
Hoffmann, Wolfgang
We Know What You Agreed To, Don't We?—Evaluating the Quality of Paper-Based Consents Forms and Their Digitalized Equivalent Using the Example of the Baltic Fracture Competence Centre Project
title We Know What You Agreed To, Don't We?—Evaluating the Quality of Paper-Based Consents Forms and Their Digitalized Equivalent Using the Example of the Baltic Fracture Competence Centre Project
title_full We Know What You Agreed To, Don't We?—Evaluating the Quality of Paper-Based Consents Forms and Their Digitalized Equivalent Using the Example of the Baltic Fracture Competence Centre Project
title_fullStr We Know What You Agreed To, Don't We?—Evaluating the Quality of Paper-Based Consents Forms and Their Digitalized Equivalent Using the Example of the Baltic Fracture Competence Centre Project
title_full_unstemmed We Know What You Agreed To, Don't We?—Evaluating the Quality of Paper-Based Consents Forms and Their Digitalized Equivalent Using the Example of the Baltic Fracture Competence Centre Project
title_short We Know What You Agreed To, Don't We?—Evaluating the Quality of Paper-Based Consents Forms and Their Digitalized Equivalent Using the Example of the Baltic Fracture Competence Centre Project
title_sort we know what you agreed to, don't we?—evaluating the quality of paper-based consents forms and their digitalized equivalent using the example of the baltic fracture competence centre project
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10306442/
https://www.ncbi.nlm.nih.gov/pubmed/36623832
http://dx.doi.org/10.1055/s-0042-1760249
work_keys_str_mv AT rauhenriette weknowwhatyouagreedtodontweevaluatingthequalityofpaperbasedconsentsformsandtheirdigitalizedequivalentusingtheexampleofthebalticfracturecompetencecentreproject
AT stahldana weknowwhatyouagreedtodontweevaluatingthequalityofpaperbasedconsentsformsandtheirdigitalizedequivalentusingtheexampleofthebalticfracturecompetencecentreproject
AT reichelannajuliana weknowwhatyouagreedtodontweevaluatingthequalityofpaperbasedconsentsformsandtheirdigitalizedequivalentusingtheexampleofthebalticfracturecompetencecentreproject
AT bialkemartin weknowwhatyouagreedtodontweevaluatingthequalityofpaperbasedconsentsformsandtheirdigitalizedequivalentusingtheexampleofthebalticfracturecompetencecentreproject
AT bahlsthomas weknowwhatyouagreedtodontweevaluatingthequalityofpaperbasedconsentsformsandtheirdigitalizedequivalentusingtheexampleofthebalticfracturecompetencecentreproject
AT hoffmannwolfgang weknowwhatyouagreedtodontweevaluatingthequalityofpaperbasedconsentsformsandtheirdigitalizedequivalentusingtheexampleofthebalticfracturecompetencecentreproject