Epistatic Net allows the sparse spectral regularization of deep neural networks for inferring fitness functions
Despite recent advances in high-throughput combinatorial mutagenesis assays, the number of labeled sequences available to predict molecular functions has remained small for the vastness of the sequence space combined with the ruggedness of many fitness functions. While deep neural networks (DNNs) can capture high-order epistatic interactions among the mutational sites, they tend to overfit to the small number of labeled sequences available for training. Here, we developed Epistatic Net (EN), a method for spectral regularization of DNNs that exploits evidence that epistatic interactions in many fitness functions are sparse. We built a scalable extension of EN, usable for larger sequences, which enables spectral regularization using fast sparse recovery algorithms informed by coding theory. Results on several biological landscapes show that EN consistently improves the prediction accuracy of DNNs and enables them to outperform competing models which assume other priors. EN estimates the higher-order epistatic interactions of DNNs trained on massive sequence spaces, a computational problem that otherwise takes years to solve.
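The core idea described in the abstract, regularizing a DNN with a sparsity-promoting penalty on the Walsh-Hadamard (epistatic) spectrum of the landscape it defines, can be conveyed with a short sketch. The snippet below is an illustrative toy, not the authors' released implementation or their scalable EN-S variant: the sequence length `d`, the network architecture, the penalty weight `lam`, the synthetic sparse landscape, and the helper `spectral_l1` are assumptions made for the example, and it enumerates the full 2^d binary sequence space, which is only feasible for short sequences.

```python
# Illustrative sketch of sparse spectral (Walsh-Hadamard) regularization of a DNN,
# in the spirit of Epistatic Net. Toy example only: architecture, penalty weight,
# and the synthetic landscape are assumptions, not the paper's implementation.
import itertools

import numpy as np
import torch
import torch.nn as nn

d = 8                                              # sequence length; 2**d = 256 sequences
# Enumerate the full binary sequence space (feasible only for small d).
X_all = np.array(list(itertools.product([0.0, 1.0], repeat=d)), dtype=np.float32)

# Sylvester construction of the 2^d x 2^d Walsh-Hadamard matrix.
H = np.ones((1, 1), dtype=np.float32)
for _ in range(d):
    H = np.block([[H, H], [H, -H]])
H_t = torch.from_numpy(H) / 2 ** d                 # normalized transform
X_all_t = torch.from_numpy(X_all)

model = nn.Sequential(nn.Linear(d, 64), nn.ReLU(), nn.Linear(64, 1))

def spectral_l1(net: nn.Module) -> torch.Tensor:
    """L1 norm of the epistatic (WH) spectrum of the landscape the DNN defines."""
    f = net(X_all_t).squeeze(-1)                   # DNN evaluated on every sequence
    beta = H_t @ f                                 # its Walsh-Hadamard coefficients
    return beta.abs().sum()

# Synthetic training data: a sparse ground-truth landscape observed at a few sequences.
rng = np.random.default_rng(0)
idx = rng.choice(2 ** d, size=40, replace=False)
y_all = X_all[:, 0] * X_all[:, 3] - 0.5 * X_all[:, 5]   # only three epistatic terms
X_train = torch.from_numpy(X_all[idx])
y_train = torch.from_numpy(y_all[idx])

opt = torch.optim.Adam(model.parameters(), lr=1e-2)
lam = 1e-3                                         # strength of the sparsity prior
for step in range(500):
    opt.zero_grad()
    mse = ((model(X_train).squeeze(-1) - y_train) ** 2).mean()
    loss = mse + lam * spectral_l1(model)          # data fit + sparse-spectrum penalty
    loss.backward()
    opt.step()
```

For longer sequences the full transform above is intractable; that regime is where the paper's fast sparse recovery algorithms informed by coding theory come in. The sketch only conveys the shape of the spectral penalty.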
Main Authors: | Aghazadeh, Amirali; Nisonoff, Hunter; Ocal, Orhan; Brookes, David H.; Huang, Yijie; Koyluoglu, O. Ozan; Listgarten, Jennifer; Ramchandran, Kannan |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK, 2021 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8410946/ https://www.ncbi.nlm.nih.gov/pubmed/34471113 http://dx.doi.org/10.1038/s41467-021-25371-3 |
_version_ | 1783747200888602624 |
---|---|
author | Aghazadeh, Amirali Nisonoff, Hunter Ocal, Orhan Brookes, David H. Huang, Yijie Koyluoglu, O. Ozan Listgarten, Jennifer Ramchandran, Kannan |
author_facet | Aghazadeh, Amirali Nisonoff, Hunter Ocal, Orhan Brookes, David H. Huang, Yijie Koyluoglu, O. Ozan Listgarten, Jennifer Ramchandran, Kannan |
author_sort | Aghazadeh, Amirali |
collection | PubMed |
description | Despite recent advances in high-throughput combinatorial mutagenesis assays, the number of labeled sequences available to predict molecular functions has remained small for the vastness of the sequence space combined with the ruggedness of many fitness functions. While deep neural networks (DNNs) can capture high-order epistatic interactions among the mutational sites, they tend to overfit to the small number of labeled sequences available for training. Here, we developed Epistatic Net (EN), a method for spectral regularization of DNNs that exploits evidence that epistatic interactions in many fitness functions are sparse. We built a scalable extension of EN, usable for larger sequences, which enables spectral regularization using fast sparse recovery algorithms informed by coding theory. Results on several biological landscapes show that EN consistently improves the prediction accuracy of DNNs and enables them to outperform competing models which assume other priors. EN estimates the higher-order epistatic interactions of DNNs trained on massive sequence spaces, a computational problem that otherwise takes years to solve. |
format | Online Article Text |
id | pubmed-8410946 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-84109462021-09-22 Epistatic Net allows the sparse spectral regularization of deep neural networks for inferring fitness functions Aghazadeh, Amirali Nisonoff, Hunter Ocal, Orhan Brookes, David H. Huang, Yijie Koyluoglu, O. Ozan Listgarten, Jennifer Ramchandran, Kannan Nat Commun Article Despite recent advances in high-throughput combinatorial mutagenesis assays, the number of labeled sequences available to predict molecular functions has remained small for the vastness of the sequence space combined with the ruggedness of many fitness functions. While deep neural networks (DNNs) can capture high-order epistatic interactions among the mutational sites, they tend to overfit to the small number of labeled sequences available for training. Here, we developed Epistatic Net (EN), a method for spectral regularization of DNNs that exploits evidence that epistatic interactions in many fitness functions are sparse. We built a scalable extension of EN, usable for larger sequences, which enables spectral regularization using fast sparse recovery algorithms informed by coding theory. Results on several biological landscapes show that EN consistently improves the prediction accuracy of DNNs and enables them to outperform competing models which assume other priors. EN estimates the higher-order epistatic interactions of DNNs trained on massive sequence spaces-a computational problem that otherwise takes years to solve. Nature Publishing Group UK 2021-09-01 /pmc/articles/PMC8410946/ /pubmed/34471113 http://dx.doi.org/10.1038/s41467-021-25371-3 Text en © The Author(s) 2021 https://creativecommons.org/licenses/by/4.0/Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/) . |
spellingShingle | Article Aghazadeh, Amirali Nisonoff, Hunter Ocal, Orhan Brookes, David H. Huang, Yijie Koyluoglu, O. Ozan Listgarten, Jennifer Ramchandran, Kannan Epistatic Net allows the sparse spectral regularization of deep neural networks for inferring fitness functions |
title | Epistatic Net allows the sparse spectral regularization of deep neural networks for inferring fitness functions |
title_full | Epistatic Net allows the sparse spectral regularization of deep neural networks for inferring fitness functions |
title_fullStr | Epistatic Net allows the sparse spectral regularization of deep neural networks for inferring fitness functions |
title_full_unstemmed | Epistatic Net allows the sparse spectral regularization of deep neural networks for inferring fitness functions |
title_short | Epistatic Net allows the sparse spectral regularization of deep neural networks for inferring fitness functions |
title_sort | epistatic net allows the sparse spectral regularization of deep neural networks for inferring fitness functions |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8410946/ https://www.ncbi.nlm.nih.gov/pubmed/34471113 http://dx.doi.org/10.1038/s41467-021-25371-3 |
work_keys_str_mv | AT aghazadehamirali epistaticnetallowsthesparsespectralregularizationofdeepneuralnetworksforinferringfitnessfunctions AT nisonoffhunter epistaticnetallowsthesparsespectralregularizationofdeepneuralnetworksforinferringfitnessfunctions AT ocalorhan epistaticnetallowsthesparsespectralregularizationofdeepneuralnetworksforinferringfitnessfunctions AT brookesdavidh epistaticnetallowsthesparsespectralregularizationofdeepneuralnetworksforinferringfitnessfunctions AT huangyijie epistaticnetallowsthesparsespectralregularizationofdeepneuralnetworksforinferringfitnessfunctions AT koyluogluoozan epistaticnetallowsthesparsespectralregularizationofdeepneuralnetworksforinferringfitnessfunctions AT listgartenjennifer epistaticnetallowsthesparsespectralregularizationofdeepneuralnetworksforinferringfitnessfunctions AT ramchandrankannan epistaticnetallowsthesparsespectralregularizationofdeepneuralnetworksforinferringfitnessfunctions |