Electronic Source Data Transcription for Electronic Case Report Forms in China: Validation of the Electronic Source Record Tool in a Real-world Ophthalmology Study

Bibliographic Details
Main Authors: Wang, Bin, Lai, Junkai, Liu, Mimi, Jin, Feifei, Peng, Yifei, Yao, Chen
Format: Online Article Text
Language: English
Published: JMIR Publications 2022
Subjects: Original Paper
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9804087/
https://www.ncbi.nlm.nih.gov/pubmed/36525285
http://dx.doi.org/10.2196/43229
author Wang, Bin
Lai, Junkai
Liu, Mimi
Jin, Feifei
Peng, Yifei
Yao, Chen
author_sort Wang, Bin
collection PubMed
description BACKGROUND: As researchers are increasingly interested in real-world studies (RWSs), improving data collection efficiency and data quality has become an important challenge. An electronic source (eSource) generally includes direct capture, collection, and storage of electronic data to simplify clinical research. It can improve data quality and patient safety and reduce clinical trial costs. Although there are already large projects on eSource technology, there is a lack of experience in using eSource technology to implement RWSs. Our team designed and developed an eSource record (ESR) system in China. In a preliminary prospective study, we selected a cosmetic medical device project to evaluate ESR software’s effect on data collection and transcription. As the previous case verification was simple, we plan to choose more complicated ophthalmology projects to further evaluate the ESR.
OBJECTIVE: We aimed to evaluate the data transcription efficiency and quality of ESR software in retrospective studies to verify the feasibility of using eSource as an alternative to traditional manual transcription of data in RWS projects.
METHODS: The approved ophthalmic femtosecond laser project was used for ESR case validation. This study compared the efficiency and quality of data transcription between the eSource method using ESR software and the traditional clinical research model of manually transcribing the data. Usability refers to the quality of a user’s experience when interacting with products or systems including websites, software, devices, or applications. To evaluate the system availability of ESR, we used the System Usability Scale (SUS). The questionnaire consisted of the following 2 parts: participant information and SUS evaluation of the electronic medical record (EMR), electronic data capture (EDC), and ESR systems. By accessing log data from the EDC system previously used by the research project, all the time spent from the beginning to the end of the study could be counted.
RESULTS: In terms of transcription time cost per field, the eSource method can reduce the time cost by 81.8% (11.2/13.7). Compared with traditional manual data transcription, the eSource method has higher data transcription quality (correct entry rate of 2356/2400, 98.17% vs 47,991/51,424, 93.32%). A total of 15 questionnaires were received with a response rate of 100%. In terms of usability, the average overall SUS scores of the EMR, EDC, and ESR systems were 50.3 (SD 21.9), 51.5 (SD 14.2), and 63.0 (SD 11.3; contract research organization experts: 69.5, SD 11.5; clinicians: 59.8, SD 10.2), respectively. The Cronbach α for the SUS items of the EMR, EDC, and ESR systems were 0.591 (95% CI −0.012 to 0.903), 0.588 (95% CI −0.288 to 0.951), and 0.785 (95% CI 0.576-0.916), respectively.
CONCLUSIONS: In real-world ophthalmology studies, the eSource approach based on the ESR system can replace the traditional clinical research model that relies on the manual transcription of data.
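The RESULTS above reduce to simple arithmetic (a relative reduction of 11.2/13.7 and two correct-entry proportions) plus two standard instruments: the 10-item System Usability Scale and Cronbach α for internal consistency. As an illustrative aid only, and not the authors' analysis code, the following Python sketch reproduces that arithmetic; the questionnaire responses in it are hypothetical placeholders, and the per-field times are inferred solely from the 11.2/13.7 ratio quoted in the abstract (units are not stated there).

```python
# Illustrative sketch only (not the study's code): reproduces the headline
# RESULTS arithmetic and shows standard SUS scoring plus Cronbach's alpha.
from statistics import pvariance


def percent_reduction(baseline, new):
    """Relative reduction in time cost per field, in percent."""
    return (baseline - new) / baseline * 100


def correct_entry_rate(correct, total):
    """Share of correctly transcribed fields, in percent."""
    return correct / total * 100


def sus_item_scores(responses):
    """Score 10 raw SUS Likert responses (1-5): odd-numbered items contribute
    (r - 1), even-numbered items (5 - r), each on a 0-4 scale."""
    assert len(responses) == 10
    return [(r - 1) if i % 2 == 0 else (5 - r) for i, r in enumerate(responses)]


def sus_score(responses):
    """Overall SUS score on the usual 0-100 scale."""
    return sum(sus_item_scores(responses)) * 2.5


def cronbach_alpha(score_matrix):
    """Cronbach's alpha for a respondents-by-items matrix of scored items."""
    k = len(score_matrix[0])
    item_vars = [pvariance([row[j] for row in score_matrix]) for j in range(k)]
    total_var = pvariance([sum(row) for row in score_matrix])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)


if __name__ == "__main__":
    # 11.2/13.7 from the abstract; 2.5 is the implied remaining cost per field.
    print(f"time reduction per field : {percent_reduction(13.7, 2.5):.1f}%")    # ~81.8
    print(f"eSource correct entries  : {correct_entry_rate(2356, 2400):.2f}%")  # 98.17
    print(f"manual correct entries   : {correct_entry_rate(47991, 51424):.2f}%")  # 93.32

    # Hypothetical questionnaire: 3 respondents x 10 SUS items (1-5 Likert).
    raw = [
        [4, 2, 4, 2, 3, 2, 4, 2, 3, 3],
        [3, 3, 4, 2, 4, 2, 3, 2, 4, 3],
        [4, 2, 5, 1, 4, 2, 4, 2, 4, 2],
    ]
    scored = [sus_item_scores(r) for r in raw]
    print("SUS scores:", [sus_score(r) for r in raw])
    print(f"Cronbach alpha (illustrative): {cronbach_alpha(scored):.3f}")
```

Because SUS reverse-codes its even-numbered items, Cronbach α is computed here on the scored (0-4) item values rather than on the raw responses.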
format Online
Article
Text
id pubmed-9804087
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher JMIR Publications
record_format MEDLINE/PubMed
spelling pubmed-9804087 2023-01-01 JMIR Form Res Original Paper JMIR Publications 2022-12-16 /pmc/articles/PMC9804087/ /pubmed/36525285 http://dx.doi.org/10.2196/43229 Text en ©Bin Wang, Junkai Lai, Mimi Liu, Feifei Jin, Yifei Peng, Chen Yao.
Originally published in JMIR Formative Research (https://formative.jmir.org), 16.12.2022. This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Formative Research, is properly cited. The complete bibliographic information, a link to the original publication on https://formative.jmir.org, as well as this copyright and license information must be included.
title Electronic Source Data Transcription for Electronic Case Report Forms in China: Validation of the Electronic Source Record Tool in a Real-world Ophthalmology Study
topic Original Paper