Reporting of abstracts in studies that used routinely collected data for exploring drug treatment effects: a cross-sectional survey
Format: Online Article (Text)
Language: English
Published: BioMed Central, 2022
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8742367/
https://www.ncbi.nlm.nih.gov/pubmed/34996370
http://dx.doi.org/10.1186/s12874-021-01482-9
Summary:

BACKGROUND: In recent years, studies that use routinely collected data (RCD), such as electronic medical records and administrative claims, to explore drug treatment effects, including effectiveness and safety, have been published with increasing frequency. Abstracts of such studies are a heavily consulted source for busy clinicians and policy-makers, and are important for indexing by literature databases. If not clearly presented, they may mislead decisions or indexing. We therefore conducted a cross-sectional survey to systematically examine how the abstracts of such studies were reported.

METHODS: We searched PubMed to identify all observational studies published in 2018 that used RCD to assess drug treatment effects. Teams of methods-trained reviewers collected data from eligible studies using pilot-tested, standardized forms that were developed and expanded from the REporting of studies Conducted using Observational Routinely collected health Data statement for PharmacoEpidemiology (RECORD-PE). We used descriptive analyses to examine how authors reported the data source, study design, data analysis, and interpretation of findings.

RESULTS: A total of 222 studies were included, of which 118 (53.2%) reported the type of database used, 17 (7.7%) clearly reported database linkage, and 140 (63.1%) reported the coverage of the data source. Only 44 (19.8%) studies stated a predefined hypothesis, 127 (57.2%) reported the study design, 140 (63.1%) reported the statistical models used, 142 (77.6%) reported adjusted estimates, 33 (14.9%) mentioned sensitivity analyses, and 39 (17.6%) made a strong claim about the treatment effect. Studies published in the top five general medicine journals were more likely than those in other journals to report the name of the data source (94.7% vs. 67.0%) and the study design (100% vs. 53.2%).

CONCLUSIONS: Under-reporting of key methodological features in the abstracts of RCD studies was common, which would substantially compromise the indexing of this type of literature and prevent the effective use of study findings. Substantial efforts to improve the reporting of abstracts in these studies are highly warranted.

SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1186/s12874-021-01482-9.
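The RESULTS proportions above are simple descriptive counts over the 222 included studies. As a minimal illustrative sketch only (not code from the study), the snippet below recomputes those percentages from the counts quoted in the abstract; all numbers are taken from the abstract itself, and the variable names are purely hypothetical.

```python
# Illustrative sketch: reproduce the descriptive percentages quoted in the
# abstract from its raw counts. Denominator is the 222 included studies.
TOTAL_STUDIES = 222

counts = {
    "type of database reported": 118,          # abstract: 53.2%
    "database linkage clearly reported": 17,   # abstract: 7.7%
    "coverage of data source reported": 140,   # abstract: 63.1%
    "predefined hypothesis stated": 44,        # abstract: 19.8%
    "study design reported": 127,              # abstract: 57.2%
    "statistical models reported": 140,        # abstract: 63.1%
    "sensitivity analyses mentioned": 33,      # abstract: 14.9%
    "strong claim about treatment effect": 39, # abstract: 17.6%
}

for item, n in counts.items():
    print(f"{item}: {n}/{TOTAL_STUDIES} = {100 * n / TOTAL_STUDIES:.1f}%")

# Note: the abstract also reports 142 (77.6%) studies with adjusted estimates;
# 142/222 is 64.0%, so that percentage evidently uses a smaller denominator
# (142 / 0.776 is about 183 studies), presumably the subset that reported
# effect estimates -- this is an assumption, not stated in the abstract.
```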