Monte Carlo samplers for efficient network inference
Accessing information on an underlying network driving a biological process often involves interrupting the process and collecting snapshot data. When snapshot data are stochastic, the data’s structure necessitates a probabilistic description to infer underlying reaction networks. As an example, we may imagine wanting to learn gene state networks from the type of data collected in single molecule RNA fluorescence in situ hybridization (RNA-FISH). In the networks we consider, nodes represent network states, and edges represent biochemical reaction rates linking states. Simultaneously estimating the number of nodes and constituent parameters from snapshot data remains a challenging task, in part on account of data uncertainty and timescale separations between kinetic parameters mediating the network. While parametric Bayesian methods learn parameters given a network structure (with known node numbers) with rigorously propagated measurement uncertainty, learning the number of nodes and parameters with potentially large timescale separations remains an open question. Here, we propose a Bayesian nonparametric framework and describe a hybrid Bayesian Markov Chain Monte Carlo (MCMC) sampler directly addressing these challenges. In particular, in our hybrid method, Hamiltonian Monte Carlo (HMC) leverages local posterior geometries in inference to explore the parameter space; Adaptive Metropolis-Hastings (AMH) learns correlations between plausible parameter sets to efficiently propose probable models; and Parallel Tempering takes into account multiple models simultaneously with tempered information content to augment sampling efficiency. We apply our method to synthetic data mimicking single molecule RNA-FISH, a popular snapshot method for probing transcriptional networks, to illustrate the identified challenges inherent to learning dynamical models from these snapshots and how our method addresses them.
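The abstract describes a hybrid MCMC scheme in which Hamiltonian Monte Carlo exploits local posterior geometry, Adaptive Metropolis-Hastings learns correlations between plausible parameter sets, and Parallel Tempering runs chains with tempered information content. As a rough illustration of the latter two ingredients only, here is a minimal Python sketch of adaptive random-walk Metropolis within parallel tempering; the abstract does not spell out the authors' implementation, and the toy Gaussian log-posterior, the `log_post` name, the temperature ladder, and the tuning constants below are assumptions for illustration.

```python
import numpy as np

def log_post(theta):
    # Toy log-posterior standing in for the network-inference target.
    # (Assumption: the real target would be the likelihood of RNA-FISH snapshot
    # counts under a reaction-network model, multiplied by the priors.)
    # Strongly correlated Gaussian in log-rate space mimics coupled kinetic rates.
    cov = np.array([[1.0, 0.95], [0.95, 1.0]])
    return -0.5 * theta @ np.linalg.solve(cov, theta)

rng = np.random.default_rng(0)
betas = np.array([1.0, 0.5, 0.25, 0.1])    # inverse temperatures; beta = 1 is the posterior
n_chains, dim, n_iter = len(betas), 2, 5000
theta = rng.normal(size=(n_chains, dim))   # current state of each tempered chain
logp = np.array([log_post(t) for t in theta])
history = [theta[0].copy()]                # samples from the cold (beta = 1) chain
prop_cov = 0.1 * np.eye(dim)               # proposal covariance, adapted below
eps = 1e-6 * np.eye(dim)                   # jitter keeps the adapted covariance positive definite

for it in range(n_iter):
    # Adaptive Metropolis: learn parameter correlations from the cold chain's history.
    # (Adapting on the fly is a simplification; rigorous schemes use diminishing adaptation.)
    if it > 200 and it % 50 == 0:
        prop_cov = 2.38**2 / dim * np.cov(np.array(history).T) + eps

    # Within-chain Metropolis-Hastings update at each temperature.
    for k in range(n_chains):
        prop = rng.multivariate_normal(theta[k], prop_cov)
        lp = log_post(prop)
        if np.log(rng.uniform()) < betas[k] * (lp - logp[k]):
            theta[k], logp[k] = prop, lp

    # Parallel tempering: propose swapping states between neighbouring temperatures.
    k = rng.integers(n_chains - 1)
    if np.log(rng.uniform()) < (betas[k] - betas[k + 1]) * (logp[k + 1] - logp[k]):
        theta[[k, k + 1]] = theta[[k + 1, k]]
        logp[[k, k + 1]] = logp[[k + 1, k]]

    history.append(theta[0].copy())

samples = np.array(history[1000:])         # discard burn-in; cold-chain samples approximate the posterior
print("posterior mean estimate:", samples.mean(axis=0))
```

In the paper's setting, the beta = 1 chain would target the full posterior over node numbers and kinetic rates inferred from snapshot counts, the hotter chains would flatten that posterior to ease movement between well-separated timescale regimes, and an HMC update using gradients of the log-posterior could replace the random-walk move sketched above.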
Main Authors: | Kilic, Zeliha; Schweiger, Max; Moyer, Camille; Pressé, Steve |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Public Library of Science, 2023 |
Subjects: | Research Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10353823/ https://www.ncbi.nlm.nih.gov/pubmed/37463156 http://dx.doi.org/10.1371/journal.pcbi.1011256 |
author | Kilic, Zeliha; Schweiger, Max; Moyer, Camille; Pressé, Steve
---|---|
collection | PubMed |
description | Accessing information on an underlying network driving a biological process often involves interrupting the process and collecting snapshot data. When snapshot data are stochastic, the data’s structure necessitates a probabilistic description to infer underlying reaction networks. As an example, we may imagine wanting to learn gene state networks from the type of data collected in single molecule RNA fluorescence in situ hybridization (RNA-FISH). In the networks we consider, nodes represent network states, and edges represent biochemical reaction rates linking states. Simultaneously estimating the number of nodes and constituent parameters from snapshot data remains a challenging task, in part on account of data uncertainty and timescale separations between kinetic parameters mediating the network. While parametric Bayesian methods learn parameters given a network structure (with known node numbers) with rigorously propagated measurement uncertainty, learning the number of nodes and parameters with potentially large timescale separations remains an open question. Here, we propose a Bayesian nonparametric framework and describe a hybrid Bayesian Markov Chain Monte Carlo (MCMC) sampler directly addressing these challenges. In particular, in our hybrid method, Hamiltonian Monte Carlo (HMC) leverages local posterior geometries in inference to explore the parameter space; Adaptive Metropolis-Hastings (AMH) learns correlations between plausible parameter sets to efficiently propose probable models; and Parallel Tempering takes into account multiple models simultaneously with tempered information content to augment sampling efficiency. We apply our method to synthetic data mimicking single molecule RNA-FISH, a popular snapshot method for probing transcriptional networks, to illustrate the identified challenges inherent to learning dynamical models from these snapshots and how our method addresses them. |
format | Online Article Text |
id | pubmed-10353823 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | PLoS Comput Biol, Research Article. Public Library of Science, published online 2023-07-18. © 2023 Kilic et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
title | Monte Carlo samplers for efficient network inference |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10353823/ https://www.ncbi.nlm.nih.gov/pubmed/37463156 http://dx.doi.org/10.1371/journal.pcbi.1011256 |