Operator compression with deep neural networks
This paper studies the compression of partial differential operators using neural networks. We consider a family of operators, parameterized by a potentially high-dimensional space of coefficients that may vary on a large range of scales. Based on the existing methods that compress such a multiscale operator to a finite-dimensional sparse surrogate model on a given target scale, we propose to directly approximate the coefficient-to-surrogate map with a neural network. We emulate local assembly structures of the surrogates and thus only require a moderately sized network that can be trained efficiently in an offline phase. This enables large compression ratios and the online computation of a surrogate based on simple forward passes through the network is substantially accelerated compared to classical numerical upscaling approaches. We apply the abstract framework to a family of prototypical second-order elliptic heterogeneous diffusion operators as a demonstrating example.
| Main Authors: | Kröpfl, Fabian; Maier, Roland; Peterseim, Daniel |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | Springer International Publishing, 2022 |
| Subjects: | Research |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9028012/ https://www.ncbi.nlm.nih.gov/pubmed/35531267 http://dx.doi.org/10.1186/s13662-022-03702-y |
| author | Kröpfl, Fabian; Maier, Roland; Peterseim, Daniel |
|---|---|
| collection | PubMed |
| description | This paper studies the compression of partial differential operators using neural networks. We consider a family of operators, parameterized by a potentially high-dimensional space of coefficients that may vary on a large range of scales. Based on the existing methods that compress such a multiscale operator to a finite-dimensional sparse surrogate model on a given target scale, we propose to directly approximate the coefficient-to-surrogate map with a neural network. We emulate local assembly structures of the surrogates and thus only require a moderately sized network that can be trained efficiently in an offline phase. This enables large compression ratios and the online computation of a surrogate based on simple forward passes through the network is substantially accelerated compared to classical numerical upscaling approaches. We apply the abstract framework to a family of prototypical second-order elliptic heterogeneous diffusion operators as a demonstrating example. |
| format | Online Article Text |
| id | pubmed-9028012 |
| institution | National Center for Biotechnology Information |
| language | English |
| publishDate | 2022 |
| publisher | Springer International Publishing |
| record_format | MEDLINE/PubMed |
| spelling | Adv Contin Discret Model, Research. Springer International Publishing, published online 2022-04-09. © The Author(s) 2022. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). |
| title | Operator compression with deep neural networks |
| topic | Research |
| url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9028012/ https://www.ncbi.nlm.nih.gov/pubmed/35531267 http://dx.doi.org/10.1186/s13662-022-03702-y |
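The abstract outlines a concrete computational idea: a moderately sized neural network is trained offline to approximate the coefficient-to-surrogate map, emulating the local assembly structure of classical numerical upscaling, so that online operator compression reduces to cheap forward passes. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch; the patch size, surrogate size, network architecture, and the placeholder training targets are assumptions for demonstration and do not reproduce the construction in the paper.

```python
import torch
from torch import nn

# Hypothetical local setup: a patch of p coefficient values is mapped to the
# m entries of a local surrogate (stiffness-like) contribution. Both sizes
# are illustrative assumptions, not values from the paper.
p, m = 16, 9

# Moderately sized feed-forward network approximating the
# coefficient-to-surrogate map on a single local patch.
net = nn.Sequential(
    nn.Linear(p, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, m),
)

# Offline phase: train on (coefficient patch, local surrogate entries) pairs.
# The targets here are random placeholders; in the setting described by the
# abstract they would come from a classical numerical upscaling routine.
coeff_patches = torch.rand(1024, p)
surrogate_targets = torch.rand(1024, m)
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for _ in range(200):
    optimizer.zero_grad()
    loss = loss_fn(net(coeff_patches), surrogate_targets)
    loss.backward()
    optimizer.step()

# Online phase: compressing a new coefficient amounts to one forward pass per
# local patch; the per-patch outputs would then be assembled into the global
# sparse surrogate operator (assembly not shown).
with torch.no_grad():
    local_contribution = net(torch.rand(1, p))
print(local_contribution.shape)  # torch.Size([1, 9])
```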