Comparing perturbation models for evaluating stability of neuroimaging pipelines
With an increase in awareness regarding a troubling lack of reproducibility in analytical software tools, the degree of validity in scientific derivatives and their downstream results has become unclear. The nature of reproducibility issues may vary across domains, tools, data sets, and computational infrastructures…
Main Authors: | Kiar, Gregory, de Oliveira Castro, Pablo, Rioux, Pierre, Petit, Eric, Brown, Shawn T, Evans, Alan C, Glatard, Tristan |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | SAGE Publications 2020 |
Subjects: | Special Issue Articles |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7418878/ https://www.ncbi.nlm.nih.gov/pubmed/32831546 http://dx.doi.org/10.1177/1094342020926237 |
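The abstract above describes Monte Carlo Arithmetic as the instrumentation of floating-point operations with probabilistic noise injected at a target (virtual) precision, with repeated simulations used to characterize the result space of a tool. The following Python snippet is a rough, illustrative sketch of that idea only, under the assumption of a simple relative-noise model: it perturbs stored values at a virtual precision of `t` bits rather than instrumenting the floating-point operations themselves, and the names `mca_perturb`, `characterize`, and `pipeline` are hypothetical, not taken from the article.

```python
import numpy as np

def mca_perturb(x, t=24, rng=None):
    """Inject uniform noise at a virtual precision of t bits.

    Illustrative assumption only: the article perturbs floating-point
    operations inside the pipeline, not the stored input arrays.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=np.float64)
    _, e_x = np.frexp(x)                        # exponent such that |x| < 2**e_x
    xi = rng.uniform(-0.5, 0.5, size=x.shape)   # noise in units of the last kept bit
    return x + np.ldexp(xi, e_x - t)            # x + xi * 2**(e_x - t)

def characterize(pipeline, data, n_sim=10, t=24, seed=0):
    """Run a hypothetical pipeline on n_sim independently perturbed copies of
    the input and summarize the spread of the resulting derivatives."""
    rng = np.random.default_rng(seed)
    results = np.stack([pipeline(mca_perturb(data, t, rng)) for _ in range(n_sim)])
    return results.mean(axis=0), results.std(axis=0)
```

In the spirit of the perturbation models compared in the article, a structural connectome pipeline would take the perturbed diffusion data as input and return a connectivity matrix; the standard deviation across simulations then gives a local picture of that tool's stability for a given participant.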
_version_ | 1783569773124124672 |
---|---|
author | Kiar, Gregory de Oliveira Castro, Pablo Rioux, Pierre Petit, Eric Brown, Shawn T Evans, Alan C Glatard, Tristan |
author_facet | Kiar, Gregory de Oliveira Castro, Pablo Rioux, Pierre Petit, Eric Brown, Shawn T Evans, Alan C Glatard, Tristan |
author_sort | Kiar, Gregory |
collection | PubMed |
description | With an increase in awareness regarding a troubling lack of reproducibility in analytical software tools, the degree of validity in scientific derivatives and their downstream results has become unclear. The nature of reproducibility issues may vary across domains, tools, data sets, and computational infrastructures, but numerical instabilities are thought to be a core contributor. In neuroimaging, unexpected deviations have been observed when varying operating systems, software implementations, or adding negligible quantities of noise. In the field of numerical analysis, these issues have recently been explored through Monte Carlo Arithmetic, a method involving the instrumentation of floating-point operations with probabilistic noise injections at a target precision. Exploring multiple simulations in this context allows the characterization of the result space for a given tool or operation. In this article, we compare various perturbation models to introduce instabilities within a typical neuroimaging pipeline, including (i) targeted noise, (ii) Monte Carlo Arithmetic, and (iii) operating system variation, to identify the significance and quality of their impact on the resulting derivatives. We demonstrate that even low-order models in neuroimaging such as the structural connectome estimation pipeline evaluated here are sensitive to numerical instabilities, suggesting that stability is a relevant axis upon which tools are compared, alongside more traditional criteria such as biological feasibility, computational efficiency, or, when possible, accuracy. Heterogeneity was observed across participants which clearly illustrates a strong interaction between the tool and data set being processed, requiring that the stability of a given tool be evaluated with respect to a given cohort. We identify use cases for each perturbation method tested, including quality assurance, pipeline error detection, and local sensitivity analysis, and make recommendations for the evaluation of stability in a practical and analytically focused setting. Identifying how these relationships and recommendations scale to higher order computational tools, distinct data sets, and their implication on biological feasibility remain exciting avenues for future work. |
format | Online Article Text |
id | pubmed-7418878 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | SAGE Publications |
record_format | MEDLINE/PubMed |
spelling | pubmed-74188782020-08-19 Comparing perturbation models for evaluating stability of neuroimaging pipelines Kiar, Gregory de Oliveira Castro, Pablo Rioux, Pierre Petit, Eric Brown, Shawn T Evans, Alan C Glatard, Tristan Int J High Perform Comput Appl Special Issue Articles With an increase in awareness regarding a troubling lack of reproducibility in analytical software tools, the degree of validity in scientific derivatives and their downstream results has become unclear. The nature of reproducibility issues may vary across domains, tools, data sets, and computational infrastructures, but numerical instabilities are thought to be a core contributor. In neuroimaging, unexpected deviations have been observed when varying operating systems, software implementations, or adding negligible quantities of noise. In the field of numerical analysis, these issues have recently been explored through Monte Carlo Arithmetic, a method involving the instrumentation of floating-point operations with probabilistic noise injections at a target precision. Exploring multiple simulations in this context allows the characterization of the result space for a given tool or operation. In this article, we compare various perturbation models to introduce instabilities within a typical neuroimaging pipeline, including (i) targeted noise, (ii) Monte Carlo Arithmetic, and (iii) operating system variation, to identify the significance and quality of their impact on the resulting derivatives. We demonstrate that even low-order models in neuroimaging such as the structural connectome estimation pipeline evaluated here are sensitive to numerical instabilities, suggesting that stability is a relevant axis upon which tools are compared, alongside more traditional criteria such as biological feasibility, computational efficiency, or, when possible, accuracy. Heterogeneity was observed across participants which clearly illustrates a strong interaction between the tool and data set being processed, requiring that the stability of a given tool be evaluated with respect to a given cohort. We identify use cases for each perturbation method tested, including quality assurance, pipeline error detection, and local sensitivity analysis, and make recommendations for the evaluation of stability in a practical and analytically focused setting. Identifying how these relationships and recommendations scale to higher order computational tools, distinct data sets, and their implication on biological feasibility remain exciting avenues for future work. SAGE Publications 2020-05-21 2020-09 /pmc/articles/PMC7418878/ /pubmed/32831546 http://dx.doi.org/10.1177/1094342020926237 Text en © The Author(s) 2020 https://creativecommons.org/licenses/by-nc/4.0/ This article is distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 License (https://creativecommons.org/licenses/by-nc/4.0/) which permits non-commercial use, reproduction and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access pages (https://us.sagepub.com/en-us/nam/open-access-at-sage). |
spellingShingle | Special Issue Articles Kiar, Gregory de Oliveira Castro, Pablo Rioux, Pierre Petit, Eric Brown, Shawn T Evans, Alan C Glatard, Tristan Comparing perturbation models for evaluating stability of neuroimaging pipelines |
title | Comparing perturbation models for evaluating stability of neuroimaging pipelines
title_full | Comparing perturbation models for evaluating stability of neuroimaging pipelines
title_fullStr | Comparing perturbation models for evaluating stability of neuroimaging pipelines
title_full_unstemmed | Comparing perturbation models for evaluating stability of neuroimaging pipelines
title_short | Comparing perturbation models for evaluating stability of neuroimaging pipelines
title_sort | comparing perturbation models for evaluating stability of neuroimaging pipelines
topic | Special Issue Articles |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7418878/ https://www.ncbi.nlm.nih.gov/pubmed/32831546 http://dx.doi.org/10.1177/1094342020926237 |
work_keys_str_mv | AT kiargregory comparingperturbationmodelsforevaluatingstabilityofneuroimagingpipelines AT deoliveiracastropablo comparingperturbationmodelsforevaluatingstabilityofneuroimagingpipelines AT riouxpierre comparingperturbationmodelsforevaluatingstabilityofneuroimagingpipelines AT petiteric comparingperturbationmodelsforevaluatingstabilityofneuroimagingpipelines AT brownshawnt comparingperturbationmodelsforevaluatingstabilityofneuroimagingpipelines AT evansalanc comparingperturbationmodelsforevaluatingstabilityofneuroimagingpipelines AT glatardtristan comparingperturbationmodelsforevaluatingstabilityofneuroimagingpipelines |