ProtGPT2 is a deep unsupervised language model for protein design
Main Authors: Ferruz, Noelia; Schmidt, Steffen; Höcker, Birte
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2022
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9329459/ ; https://www.ncbi.nlm.nih.gov/pubmed/35896542 ; http://dx.doi.org/10.1038/s41467-022-32007-7
_version_ | 1784757925612355584 |
author | Ferruz, Noelia Schmidt, Steffen Höcker, Birte |
author_facet | Ferruz, Noelia Schmidt, Steffen Höcker, Birte |
author_sort | Ferruz, Noelia |
collection | PubMed |
description | Protein design aims to build novel proteins customized for specific purposes, thereby holding the potential to tackle many environmental and biomedical problems. Recent progress in Transformer-based architectures has enabled the implementation of language models capable of generating text with human-like capabilities. Here, motivated by this success, we describe ProtGPT2, a language model trained on the protein space that generates de novo protein sequences following the principles of natural ones. The generated proteins display natural amino acid propensities, while disorder predictions indicate that 88% of ProtGPT2-generated proteins are globular, in line with natural sequences. Sensitive sequence searches in protein databases show that ProtGPT2 sequences are distantly related to natural ones, and similarity networks further demonstrate that ProtGPT2 is sampling unexplored regions of protein space. AlphaFold prediction of ProtGPT2-sequences yields well-folded non-idealized structures with embodiments and large loops and reveals topologies not captured in current structure databases. ProtGPT2 generates sequences in a matter of seconds and is freely available. |
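The description states that ProtGPT2 generates sequences in a matter of seconds and is freely available. A minimal sampling sketch, assuming the authors' published Hugging Face checkpoint `nferruz/ProtGPT2` and the `transformers` library; the generation parameters (`top_k`, `repetition_penalty`, `max_length`) are illustrative assumptions, not values taken from this record:

```python
def clean_sequence(text: str) -> str:
    """Strip the end-of-text marker and FASTA-style line breaks
    from a generated sample, leaving a bare amino acid string."""
    return text.replace("<|endoftext|>", "").replace("\n", "").strip()


if __name__ == "__main__":
    # Hypothetical usage: downloads the checkpoint on first run.
    from transformers import pipeline

    generator = pipeline("text-generation", model="nferruz/ProtGPT2")
    # "<|endoftext|>" separates sequences in the training corpus, so it
    # can serve as a prompt for unconditional generation (an assumption
    # based on the model card, not on this record).
    samples = generator(
        "<|endoftext|>",
        max_length=100,
        do_sample=True,
        top_k=950,
        repetition_penalty=1.2,
        num_return_sequences=2,
    )
    for s in samples:
        print(clean_sequence(s["generated_text"]))
```

Each printed line is one de novo sequence; downstream filtering (e.g. the disorder and globularity checks described in the abstract) would be applied to these raw samples.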
format | Online Article Text |
id | pubmed-9329459 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-93294592022-07-29 ProtGPT2 is a deep unsupervised language model for protein design Ferruz, Noelia Schmidt, Steffen Höcker, Birte Nat Commun Article Protein design aims to build novel proteins customized for specific purposes, thereby holding the potential to tackle many environmental and biomedical problems. Recent progress in Transformer-based architectures has enabled the implementation of language models capable of generating text with human-like capabilities. Here, motivated by this success, we describe ProtGPT2, a language model trained on the protein space that generates de novo protein sequences following the principles of natural ones. The generated proteins display natural amino acid propensities, while disorder predictions indicate that 88% of ProtGPT2-generated proteins are globular, in line with natural sequences. Sensitive sequence searches in protein databases show that ProtGPT2 sequences are distantly related to natural ones, and similarity networks further demonstrate that ProtGPT2 is sampling unexplored regions of protein space. AlphaFold prediction of ProtGPT2-sequences yields well-folded non-idealized structures with embodiments and large loops and reveals topologies not captured in current structure databases. ProtGPT2 generates sequences in a matter of seconds and is freely available. Nature Publishing Group UK 2022-07-27 /pmc/articles/PMC9329459/ /pubmed/35896542 http://dx.doi.org/10.1038/s41467-022-32007-7 Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. 
The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/) . |
spellingShingle | Article Ferruz, Noelia Schmidt, Steffen Höcker, Birte ProtGPT2 is a deep unsupervised language model for protein design |
title | ProtGPT2 is a deep unsupervised language model for protein design |
title_full | ProtGPT2 is a deep unsupervised language model for protein design |
title_fullStr | ProtGPT2 is a deep unsupervised language model for protein design |
title_full_unstemmed | ProtGPT2 is a deep unsupervised language model for protein design |
title_short | ProtGPT2 is a deep unsupervised language model for protein design |
title_sort | protgpt2 is a deep unsupervised language model for protein design |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9329459/ https://www.ncbi.nlm.nih.gov/pubmed/35896542 http://dx.doi.org/10.1038/s41467-022-32007-7 |
work_keys_str_mv | AT ferruznoelia protgpt2isadeepunsupervisedlanguagemodelforproteindesign AT schmidtsteffen protgpt2isadeepunsupervisedlanguagemodelforproteindesign AT hockerbirte protgpt2isadeepunsupervisedlanguagemodelforproteindesign |