Electronic Source Data Transcription for Electronic Case Report Forms in China: Validation of the Electronic Source Record Tool in a Real-world Ophthalmology Study

Bibliographic Details
Main Authors: Wang, Bin, Lai, Junkai, Liu, Mimi, Jin, Feifei, Peng, Yifei, Yao, Chen
Format: Online Article Text
Language: English
Published: JMIR Publications 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9804087/
https://www.ncbi.nlm.nih.gov/pubmed/36525285
http://dx.doi.org/10.2196/43229
Description
Summary: BACKGROUND: As researchers become increasingly interested in real-world studies (RWSs), improving data collection efficiency and data quality has become an important challenge. An electronic source (eSource) generally involves the direct capture, collection, and storage of electronic data to simplify clinical research. It can improve data quality and patient safety and reduce clinical trial costs. Although there are already large projects on eSource technology, there is a lack of experience in using eSource technology to implement RWSs. Our team designed and developed an eSource record (ESR) system in China. In a preliminary prospective study, we selected a cosmetic medical device project to evaluate the ESR software's effect on data collection and transcription. As the previous case verification was simple, we plan to choose more complicated ophthalmology projects to further evaluate the ESR.

OBJECTIVE: We aimed to evaluate the data transcription efficiency and quality of the ESR software in retrospective studies to verify the feasibility of using eSource as an alternative to traditional manual transcription of data in RWS projects.

METHODS: The approved ophthalmic femtosecond laser project was used for ESR case validation. This study compared the efficiency and quality of data transcription between the eSource method using the ESR software and the traditional clinical research model of manually transcribing the data. Usability refers to the quality of a user's experience when interacting with products or systems, including websites, software, devices, or applications. To evaluate the usability of the ESR system, we used the System Usability Scale (SUS). The questionnaire consisted of 2 parts: participant information and SUS evaluation of the electronic medical record (EMR), electronic data capture (EDC), and ESR systems. By accessing log data from the EDC system previously used by the research project, the total time spent from the beginning to the end of the study could be calculated.

RESULTS: In terms of transcription time cost per field, the eSource method reduced the time cost by 81.8% (11.2/13.7). Compared with traditional manual data transcription, the eSource method achieved higher data transcription quality (correct entry rate: 2356/2400, 98.17% vs 47,991/51,424, 93.32%). A total of 15 questionnaires were received, with a response rate of 100%. In terms of usability, the average overall SUS scores of the EMR, EDC, and ESR systems were 50.3 (SD 21.9), 51.5 (SD 14.2), and 63.0 (SD 11.3; contract research organization experts: 69.5, SD 11.5; clinicians: 59.8, SD 10.2), respectively. The Cronbach α values for the SUS items of the EMR, EDC, and ESR systems were 0.591 (95% CI −0.012 to 0.903), 0.588 (95% CI −0.288 to 0.951), and 0.785 (95% CI 0.576-0.916), respectively.

CONCLUSIONS: In real-world ophthalmology studies, the eSource approach based on the ESR system can replace the traditional clinical research model that relies on the manual transcription of data.
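Note: the summary statistics above (correct entry rates, time-cost reduction, and SUS scores) follow standard calculations. The sketch below is a minimal illustration of how such figures are typically derived; the raw item responses and per-field timings are not part of this record, so the example inputs are hypothetical and not taken from the study data.

# Illustrative Python sketch (hypothetical inputs); only the printed reference
# values from the abstract are reproduced by the last three calls.

def sus_score(responses):
    """Standard System Usability Scale score for one respondent.

    `responses` is a list of 10 Likert ratings (1-5), item 1 first.
    Odd-numbered items contribute (rating - 1), even-numbered items (5 - rating);
    the sum is scaled by 2.5 to give a 0-100 score.
    """
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum(r - 1 if i % 2 == 0 else 5 - r for i, r in enumerate(responses))
    return total * 2.5

def correct_entry_rate(correct, total):
    """Percentage of correctly entered fields."""
    return 100.0 * correct / total

def time_cost_reduction(saved_per_field, baseline_per_field):
    """Relative reduction in transcription time cost per field."""
    return 100.0 * saved_per_field / baseline_per_field

if __name__ == "__main__":
    # Hypothetical single respondent's SUS ratings (not from the study).
    print(round(sus_score([4, 2, 4, 2, 5, 1, 4, 2, 4, 2]), 1))  # 80.0
    print(round(correct_entry_rate(2356, 2400), 2))             # 98.17 (eSource, as reported)
    print(round(correct_entry_rate(47991, 51424), 2))           # 93.32 (manual, as reported)
    print(round(time_cost_reduction(11.2, 13.7), 1))            # 81.8 (as reported)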