Stress testing journals: a quasi-experimental study of rejection rates of a previously published paper
BACKGROUND: When a journal receives a duplicate publication, the ability to identify the submitted work as previously published, and reject it, is an assay to publication ethics best practices. The aim of this study was to evaluate how three different types of journals, namely open access (OA) journals, subscription-based journals, and presumed predatory journals, responded to receiving a previously published manuscript for review.
Main authors: Cobey, Kelly D.; Rice, Danielle B.; Lalu, Manoj M.; Abramowitz, Daniel; Ahmadzai, Nadera; Cunningham, Heather; Ayala, Ana Patricia; Raffoul, Hana; Khan, Faizan; Shamseer, Larissa; Moher, David
Format: Online Article Text
Language: English
Published: BioMed Central, 2020
Subjects: Research Article
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7171725/ https://www.ncbi.nlm.nih.gov/pubmed/32312313 http://dx.doi.org/10.1186/s12916-020-01550-9
_version_ | 1783524122530152448 |
author | Cobey, Kelly D.; Rice, Danielle B.; Lalu, Manoj M.; Abramowitz, Daniel; Ahmadzai, Nadera; Cunningham, Heather; Ayala, Ana Patricia; Raffoul, Hana; Khan, Faizan; Shamseer, Larissa; Moher, David
author_facet | Cobey, Kelly D.; Rice, Danielle B.; Lalu, Manoj M.; Abramowitz, Daniel; Ahmadzai, Nadera; Cunningham, Heather; Ayala, Ana Patricia; Raffoul, Hana; Khan, Faizan; Shamseer, Larissa; Moher, David
author_sort | Cobey, Kelly D. |
collection | PubMed |
description | BACKGROUND: When a journal receives a duplicate publication, the ability to identify the submitted work as previously published, and reject it, is an assay to publication ethics best practices. The aim of this study was to evaluate how three different types of journals, namely open access (OA) journals, subscription-based journals, and presumed predatory journals, responded to receiving a previously published manuscript for review. METHODS: We performed a quasi-experimental study in which we submitted a previously published article to a random sample of 602 biomedical journals, roughly 200 journals from each journal type sampled: OA journals, subscription-based journals, and presumed predatory journals. Three hundred and three journals received a Word version in manuscript format, while 299 journals received the formatted publisher’s PDF version of the published article. We then recorded responses to the submission received after approximately 1 month. Responses were reviewed, extracted, and coded in duplicate. Our primary outcome was the rate of rejection of the two types of submitted articles (PDF vs Word) within our three journal types. RESULTS: We received correspondence back from 308 (51.1%) journals within our study timeline (32 days); (N = 46 predatory journals, N = 127 OA journals, N = 135 subscription-based journals). Of the journals that responded, 153 received the Word version of the paper, while 155 received the PDF version. Four journals (1.3%) accepted our paper, 291 (94.5%) journals rejected the paper, and 13 (4.2%) requested a revision. A chi-square test looking at journal type, and submission type, was significant (χ²(4) = 23.50, p < 0.001). All four responses to accept our article came from presumed predatory journals, 3 of which received the Word format and 1 that received the PDF format. Less than half of journals that rejected our submissions did so because they identified ethical issues such as plagiarism with the manuscript (133 (45.7%)). CONCLUSION: Few journals accepted our submitted paper. However, our findings suggest that all three types of journals may not have adequate safeguards in place to recognize and act on plagiarism or duplicate submissions. ELECTRONIC SUPPLEMENTARY MATERIAL: Supplementary information accompanies this paper at 10.1186/s12916-020-01550-9. |
format | Online Article Text |
id | pubmed-7171725 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | BioMed Central |
record_format | MEDLINE/PubMed |
spelling | pubmed-7171725 2020-04-24 Stress testing journals: a quasi-experimental study of rejection rates of a previously published paper Cobey, Kelly D. Rice, Danielle B. Lalu, Manoj M. Abramowitz, Daniel Ahmadzai, Nadera Cunningham, Heather Ayala, Ana Patricia Raffoul, Hana Khan, Faizan Shamseer, Larissa Moher, David BMC Med Research Article BACKGROUND: When a journal receives a duplicate publication, the ability to identify the submitted work as previously published, and reject it, is an assay to publication ethics best practices. The aim of this study was to evaluate how three different types of journals, namely open access (OA) journals, subscription-based journals, and presumed predatory journals, responded to receiving a previously published manuscript for review. METHODS: We performed a quasi-experimental study in which we submitted a previously published article to a random sample of 602 biomedical journals, roughly 200 journals from each journal type sampled: OA journals, subscription-based journals, and presumed predatory journals. Three hundred and three journals received a Word version in manuscript format, while 299 journals received the formatted publisher’s PDF version of the published article. We then recorded responses to the submission received after approximately 1 month. Responses were reviewed, extracted, and coded in duplicate. Our primary outcome was the rate of rejection of the two types of submitted articles (PDF vs Word) within our three journal types. RESULTS: We received correspondence back from 308 (51.1%) journals within our study timeline (32 days); (N = 46 predatory journals, N = 127 OA journals, N = 135 subscription-based journals). Of the journals that responded, 153 received the Word version of the paper, while 155 received the PDF version. Four journals (1.3%) accepted our paper, 291 (94.5%) journals rejected the paper, and 13 (4.2%) requested a revision. A chi-square test looking at journal type, and submission type, was significant (χ²(4) = 23.50, p < 0.001). All four responses to accept our article came from presumed predatory journals, 3 of which received the Word format and 1 that received the PDF format. Less than half of journals that rejected our submissions did so because they identified ethical issues such as plagiarism with the manuscript (133 (45.7%)). CONCLUSION: Few journals accepted our submitted paper. However, our findings suggest that all three types of journals may not have adequate safeguards in place to recognize and act on plagiarism or duplicate submissions. ELECTRONIC SUPPLEMENTARY MATERIAL: Supplementary information accompanies this paper at 10.1186/s12916-020-01550-9. BioMed Central 2020-04-21 /pmc/articles/PMC7171725/ /pubmed/32312313 http://dx.doi.org/10.1186/s12916-020-01550-9 Text en © The Author(s) 2020 Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. 
If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data. |
spellingShingle | Research Article Cobey, Kelly D. Rice, Danielle B. Lalu, Manoj M. Abramowitz, Daniel Ahmadzai, Nadera Cunningham, Heather Ayala, Ana Patricia Raffoul, Hana Khan, Faizan Shamseer, Larissa Moher, David Stress testing journals: a quasi-experimental study of rejection rates of a previously published paper |
title | Stress testing journals: a quasi-experimental study of rejection rates of a previously published paper |
title_full | Stress testing journals: a quasi-experimental study of rejection rates of a previously published paper |
title_fullStr | Stress testing journals: a quasi-experimental study of rejection rates of a previously published paper |
title_full_unstemmed | Stress testing journals: a quasi-experimental study of rejection rates of a previously published paper |
title_short | Stress testing journals: a quasi-experimental study of rejection rates of a previously published paper |
title_sort | stress testing journals: a quasi-experimental study of rejection rates of a previously published paper |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7171725/ https://www.ncbi.nlm.nih.gov/pubmed/32312313 http://dx.doi.org/10.1186/s12916-020-01550-9 |
work_keys_str_mv | AT cobeykellyd stresstestingjournalsaquasiexperimentalstudyofrejectionratesofapreviouslypublishedpaper
AT ricedanielleb stresstestingjournalsaquasiexperimentalstudyofrejectionratesofapreviouslypublishedpaper
AT lalumanojm stresstestingjournalsaquasiexperimentalstudyofrejectionratesofapreviouslypublishedpaper
AT abramowitzdaniel stresstestingjournalsaquasiexperimentalstudyofrejectionratesofapreviouslypublishedpaper
AT ahmadzainadera stresstestingjournalsaquasiexperimentalstudyofrejectionratesofapreviouslypublishedpaper
AT cunninghamheather stresstestingjournalsaquasiexperimentalstudyofrejectionratesofapreviouslypublishedpaper
AT ayalaanapatricia stresstestingjournalsaquasiexperimentalstudyofrejectionratesofapreviouslypublishedpaper
AT raffoulhana stresstestingjournalsaquasiexperimentalstudyofrejectionratesofapreviouslypublishedpaper
AT khanfaizan stresstestingjournalsaquasiexperimentalstudyofrejectionratesofapreviouslypublishedpaper
AT shamseerlarissa stresstestingjournalsaquasiexperimentalstudyofrejectionratesofapreviouslypublishedpaper
AT moherdavid stresstestingjournalsaquasiexperimentalstudyofrejectionratesofapreviouslypublishedpaper
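The abstract in this record reports the study's primary inferential result as a chi-square test with 4 degrees of freedom (χ²(4) = 23.50, p < 0.001). As a minimal sketch of how such a test of independence is computed, the snippet below uses scipy on a 3 × 3 table of journal type by submission outcome; only the marginal totals (46/127/135 responding journals and 4/291/13 outcomes) come from the abstract, while the table layout and the individual cell counts are hypothetical illustrations, not the study's actual data.

```python
# Minimal sketch of a chi-square test of independence.
# HYPOTHETICAL cell counts: only the row totals (46, 127, 135) and the column
# totals (4 accepted, 291 rejected, 13 revisions) come from the abstract.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: presumed predatory, open access, subscription-based journals (responders only).
# Columns: accepted, rejected, revision requested.
observed = np.array([
    [4,  38,  4],
    [0, 122,  5],
    [0, 131,  4],
])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4g}")
```

With three journal types and three outcome categories, the degrees of freedom are (3 − 1) × (3 − 1) = 4, which matches the value reported in the abstract.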