Publishing computational research - a review of infrastructures for reproducible and transparent scholarly communication
BACKGROUND: The trend toward open science increases the pressure on authors to provide access to the source code and data they used to compute the results reported in their scientific papers. Since sharing materials reproducibly is challenging, several projects have developed solutions to support the release of executable analyses alongside articles.
Main Authors: | Konkol, Markus; Nüst, Daniel; Goulier, Laura |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | BioMed Central, 2020 |
Subjects: | Review |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7359270/ https://www.ncbi.nlm.nih.gov/pubmed/32685199 http://dx.doi.org/10.1186/s41073-020-00095-y |
_version_ | 1783559013464539136 |
---|---|
author | Konkol, Markus; Nüst, Daniel; Goulier, Laura
author_facet | Konkol, Markus; Nüst, Daniel; Goulier, Laura
author_sort | Konkol, Markus |
collection | PubMed |
description | BACKGROUND: The trend toward open science increases the pressure on authors to provide access to the source code and data they used to compute the results reported in their scientific papers. Since sharing materials reproducibly is challenging, several projects have developed solutions to support the release of executable analyses alongside articles. METHODS: We reviewed 11 applications that can assist researchers in adhering to reproducibility principles. The applications were found through a literature search and interactions with the reproducible research community. An application was included in our analysis if it (i) was actively maintained at the time the data for this paper was collected, (ii) supports the publication of executable code and data, and (iii) is connected to the scholarly publication process. By investigating the software documentation and published articles, we compared the applications across 19 criteria, such as deployment options and features that support authors in creating and readers in studying executable papers. RESULTS: Of the 11 applications, eight allow publishers to self-host the system for free, whereas three provide paid services. Authors can submit an executable analysis using Jupyter Notebooks or R Markdown documents (10 applications support these formats). All approaches provide features to assist readers in studying the materials, e.g., one-click reproducible results or tools for manipulating the analysis parameters. Six applications allow for modifying materials after publication. CONCLUSIONS: The applications support authors in publishing reproducible research, predominantly via literate programming. For readers, most applications provide user interfaces to inspect and manipulate the computational analysis. The next step is to investigate the gaps identified in this review, such as the costs publishers have to expect when hosting an application, the handling of sensitive data, and impacts on the review process. (See the illustrative literate-programming sketch after this record.) |
format | Online Article Text |
id | pubmed-7359270 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | BioMed Central |
record_format | MEDLINE/PubMed |
spelling | pubmed-7359270 2020-07-17 Publishing computational research - a review of infrastructures for reproducible and transparent scholarly communication Konkol, Markus; Nüst, Daniel; Goulier, Laura Res Integr Peer Rev Review BACKGROUND: The trend toward open science increases the pressure on authors to provide access to the source code and data they used to compute the results reported in their scientific papers. Since sharing materials reproducibly is challenging, several projects have developed solutions to support the release of executable analyses alongside articles. METHODS: We reviewed 11 applications that can assist researchers in adhering to reproducibility principles. The applications were found through a literature search and interactions with the reproducible research community. An application was included in our analysis if it (i) was actively maintained at the time the data for this paper was collected, (ii) supports the publication of executable code and data, and (iii) is connected to the scholarly publication process. By investigating the software documentation and published articles, we compared the applications across 19 criteria, such as deployment options and features that support authors in creating and readers in studying executable papers. RESULTS: Of the 11 applications, eight allow publishers to self-host the system for free, whereas three provide paid services. Authors can submit an executable analysis using Jupyter Notebooks or R Markdown documents (10 applications support these formats). All approaches provide features to assist readers in studying the materials, e.g., one-click reproducible results or tools for manipulating the analysis parameters. Six applications allow for modifying materials after publication. CONCLUSIONS: The applications support authors in publishing reproducible research, predominantly via literate programming. For readers, most applications provide user interfaces to inspect and manipulate the computational analysis. The next step is to investigate the gaps identified in this review, such as the costs publishers have to expect when hosting an application, the handling of sensitive data, and impacts on the review process. BioMed Central 2020-07-14 /pmc/articles/PMC7359270/ /pubmed/32685199 http://dx.doi.org/10.1186/s41073-020-00095-y Text en © The Author(s) 2020 Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data. |
spellingShingle | Review; Konkol, Markus; Nüst, Daniel; Goulier, Laura; Publishing computational research - a review of infrastructures for reproducible and transparent scholarly communication |
title | Publishing computational research - a review of infrastructures for reproducible and transparent scholarly communication |
title_full | Publishing computational research - a review of infrastructures for reproducible and transparent scholarly communication |
title_fullStr | Publishing computational research - a review of infrastructures for reproducible and transparent scholarly communication |
title_full_unstemmed | Publishing computational research - a review of infrastructures for reproducible and transparent scholarly communication |
title_short | Publishing computational research - a review of infrastructures for reproducible and transparent scholarly communication |
title_sort | publishing computational research - a review of infrastructures for reproducible and transparent scholarly communication |
topic | Review |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7359270/ https://www.ncbi.nlm.nih.gov/pubmed/32685199 http://dx.doi.org/10.1186/s41073-020-00095-y |
work_keys_str_mv | AT konkolmarkus publishingcomputationalresearchareviewofinfrastructuresforreproducibleandtransparentscholarlycommunication AT nustdaniel publishingcomputationalresearchareviewofinfrastructuresforreproducibleandtransparentscholarlycommunication AT goulierlaura publishingcomputationalresearchareviewofinfrastructuresforreproducibleandtransparentscholarlycommunication |
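Illustration: according to the abstract above, the reviewed applications predominantly accept literate-programming documents (Jupyter Notebooks or R Markdown) that interleave narrative text with executable code, and most offer readers one-click reproduction and controls for manipulating analysis parameters. The sketch below is a minimal, hypothetical example of such a document in Jupyter's percent cell format (as used by tools like Jupytext); the analysis, the toy data, and the names window_size and measurements are illustrative assumptions, not taken from the reviewed article or from any specific platform.

```python
# %% [markdown]
# # Example executable analysis (hypothetical)
# A literate-programming document interleaves narrative text (this cell)
# with executable code (the next cell). Platforms of the kind reviewed in
# the article typically accept such documents as Jupyter Notebooks or
# R Markdown files and execute them in a hosted environment.

# %%
# Hypothetical analysis cell: a summary statistic whose input parameter a
# reader could change and re-run, e.g. via a one-click "reproduce" button.
import statistics

window_size = 5  # parameter a reader might manipulate in the hosted UI
measurements = [2.3, 2.9, 3.1, 2.8, 3.4, 3.0, 2.7, 3.2]  # toy data

# Rolling mean over the chosen window, reported alongside the narrative.
rolling_means = [
    statistics.mean(measurements[i:i + window_size])
    for i in range(len(measurements) - window_size + 1)
]
print(f"Rolling means (window={window_size}): {rolling_means}")
```

In a hosted system of the kind the review describes, a reader could edit window_size in the browser and re-execute the cell to see the updated output without installing anything locally.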