Big in Japan: Regularizing Networks for Solving Inverse Problems
Deep learning and (deep) neural networks are emerging tools to address inverse problems and image reconstruction tasks. Despite outstanding performance, the mathematical analysis for solving inverse problems by neural networks is mostly missing. In this paper, we introduce and rigorously analyze fam...
Main Authors: | Schwab, Johannes; Antholzer, Stephan; Haltmeier, Markus |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Springer US, 2019 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7144407/ https://www.ncbi.nlm.nih.gov/pubmed/32308256 http://dx.doi.org/10.1007/s10851-019-00911-1 |
_version_ | 1783519834070319104 |
---|---|
author | Schwab, Johannes Antholzer, Stephan Haltmeier, Markus |
author_facet | Schwab, Johannes Antholzer, Stephan Haltmeier, Markus |
author_sort | Schwab, Johannes |
collection | PubMed |
description | Deep learning and (deep) neural networks are emerging tools to address inverse problems and image reconstruction tasks. Despite outstanding performance, the mathematical analysis for solving inverse problems by neural networks is mostly missing. In this paper, we introduce and rigorously analyze families of deep regularizing neural networks (RegNets) of the form $B_\alpha + N_{\theta(\alpha)} B_\alpha$, where $B_\alpha$ is a classical regularization and the network $N_{\theta(\alpha)} B_\alpha$ is trained to recover the missing part $\operatorname{Id}_X - B_\alpha A$ not found by the classical regularization. We show that these regularizing networks yield a convergent regularization method for solving inverse problems. Additionally, we derive convergence rates (quantitative error estimates) assuming a sufficient decay of the associated distance function. We demonstrate that our results recover existing convergence and convergence rate results for filter-based regularization methods, as well as the recently introduced null space network, as special cases. Numerical results are presented for a tomographic sparse-data problem and clearly demonstrate that the proposed RegNets improve on both classical regularization and the null space network. |
format | Online Article Text |
id | pubmed-7144407 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | Springer US |
record_format | MEDLINE/PubMed |
spelling | pubmed-7144407 2020-04-15 Big in Japan: Regularizing Networks for Solving Inverse Problems Schwab, Johannes Antholzer, Stephan Haltmeier, Markus J Math Imaging Vis Article Deep learning and (deep) neural networks are emerging tools to address inverse problems and image reconstruction tasks. Despite outstanding performance, the mathematical analysis for solving inverse problems by neural networks is mostly missing. In this paper, we introduce and rigorously analyze families of deep regularizing neural networks (RegNets) of the form $B_\alpha + N_{\theta(\alpha)} B_\alpha$, where $B_\alpha$ is a classical regularization and the network $N_{\theta(\alpha)} B_\alpha$ is trained to recover the missing part $\operatorname{Id}_X - B_\alpha A$ not found by the classical regularization. We show that these regularizing networks yield a convergent regularization method for solving inverse problems. Additionally, we derive convergence rates (quantitative error estimates) assuming a sufficient decay of the associated distance function. We demonstrate that our results recover existing convergence and convergence rate results for filter-based regularization methods, as well as the recently introduced null space network, as special cases. Numerical results are presented for a tomographic sparse-data problem and clearly demonstrate that the proposed RegNets improve on both classical regularization and the null space network. Springer US 2019-10-03 2020 /pmc/articles/PMC7144407/ /pubmed/32308256 http://dx.doi.org/10.1007/s10851-019-00911-1 Text en © The Author(s) 2019 Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. |
spellingShingle | Article Schwab, Johannes Antholzer, Stephan Haltmeier, Markus Big in Japan: Regularizing Networks for Solving Inverse Problems |
title | Big in Japan: Regularizing Networks for Solving Inverse Problems |
title_full | Big in Japan: Regularizing Networks for Solving Inverse Problems |
title_fullStr | Big in Japan: Regularizing Networks for Solving Inverse Problems |
title_full_unstemmed | Big in Japan: Regularizing Networks for Solving Inverse Problems |
title_short | Big in Japan: Regularizing Networks for Solving Inverse Problems |
title_sort | big in japan: regularizing networks for solving inverse problems |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7144407/ https://www.ncbi.nlm.nih.gov/pubmed/32308256 http://dx.doi.org/10.1007/s10851-019-00911-1 |
work_keys_str_mv | AT schwabjohannes biginjapanregularizingnetworksforsolvinginverseproblems AT antholzerstephan biginjapanregularizingnetworksforsolvinginverseproblems AT haltmeiermarkus biginjapanregularizingnetworksforsolvinginverseproblems |
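The description field above characterizes a RegNet reconstruction as a classical regularization $B_\alpha$ applied to the data, followed by an additive correction from a trained network. Below is a minimal illustrative sketch of that two-step form, assuming hypothetical placeholder operators `B_alpha` (for example, a filtered backprojection in the sparse-data tomography setting) and `N_theta` (a trained network); it is a sketch of the stated structure, not the authors' implementation.

```python
import numpy as np

def regnet_reconstruct(y, B_alpha, N_theta):
    """Two-step RegNet-style reconstruction: a classical regularized
    solution B_alpha(y) plus a network correction trained to recover
    the part of the signal that B_alpha misses.

    B_alpha and N_theta are placeholders (e.g. filtered backprojection
    and a trained CNN), not the authors' implementation.
    """
    x_reg = B_alpha(y)        # classical regularization applied to the data
    x_corr = N_theta(x_reg)   # network estimate of the missing component
    return x_reg + x_corr     # B_alpha(y) + N_theta(B_alpha(y))

# Toy usage with trivial placeholder operators (illustration only).
y = np.array([0.0, 1.0, 2.0])
x_hat = regnet_reconstruct(y, B_alpha=lambda d: d, N_theta=np.zeros_like)
```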