Bayesian uncertainty quantification for data-driven equation learning
Main Authors:
Format: Online Article Text
Language: English
Published: The Royal Society, 2021
Subjects:
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8548080/
https://www.ncbi.nlm.nih.gov/pubmed/35153587
http://dx.doi.org/10.1098/rspa.2021.0426
Summary: Equation learning aims to infer differential equation models from data. While a number of studies have shown that differential equation models can be successfully identified when the data are sufficiently detailed and corrupted with relatively small amounts of noise, the relationship between observation noise and uncertainty in the learned differential equation models remains unexplored. We demonstrate that for noisy datasets there exists great variation in both the structure of the learned differential equation models and their parameter values. We explore how to exploit multiple datasets to quantify uncertainty in the learned models, and at the same time draw mechanistic conclusions about the target differential equations. We showcase our results using simulation data from a relatively straightforward agent-based model (ABM) which has a well-characterized partial differential equation description that provides highly accurate predictions of averaged ABM behaviours in relevant regions of parameter space. Our approach combines equation learning methods with Bayesian inference approaches so that a quantification of uncertainty can be given by the posterior parameter distribution of the learned model.
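The pipeline the summary describes, learning a candidate differential equation from noisy data and then quantifying parameter uncertainty with Bayesian inference, can be sketched on a toy logistic-growth example. Everything below (the logistic model, the two-term candidate library, the Metropolis settings) is illustrative and is not taken from the paper itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic noisy data from the logistic ODE du/dt = r*u*(1 - u/K),
# standing in for the averaged ABM output described in the summary.
r_true, K_true = 1.0, 10.0
t = np.linspace(0.0, 8.0, 200)
u = K_true / (1.0 + (K_true - 1.0) * np.exp(-r_true * t))  # exact solution, u(0) = 1
u_obs = u + rng.normal(0.0, 0.01, size=u.shape)            # observation noise

# Step 1, equation learning: regress a numerical derivative onto a small
# library of candidate terms [u, u^2]; the true dynamics are
# du/dt = r*u - (r/K)*u^2, so both coefficients should be recovered.
dudt = np.gradient(u_obs, t)
library = np.column_stack([u_obs, u_obs**2])
theta_ls, *_ = np.linalg.lstsq(library, dudt, rcond=None)

# Step 2, Bayesian inference: random-walk Metropolis over the learned
# coefficients, with a Gaussian likelihood whose scale is estimated
# from the least-squares residuals (a flat prior is assumed).
sigma = np.std(dudt - library @ theta_ls)

def log_post(theta):
    resid = dudt - library @ theta
    return -0.5 * np.sum(resid**2) / sigma**2

theta, lp = theta_ls.copy(), log_post(theta_ls)
samples = []
for _ in range(5000):
    prop = theta + rng.normal(0.0, [0.02, 0.002])  # per-coordinate proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:       # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
samples = np.asarray(samples)

# The posterior mean should sit near [r, -r/K] = [1.0, -0.1]; the spread
# of `samples` is the uncertainty quantification the summary refers to.
post_mean = samples.mean(axis=0)
```

In the paper's setting the same idea is applied to ABM simulation output rather than a closed-form solution, and the learned structure itself varies across noisy datasets, which is why multiple datasets are combined before the posterior is interpreted.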