PeptideBERT: A Language Model Based on Transformers for Peptide Property Prediction
Recent advances in language models have provided the protein modeling community with a powerful tool that uses transformers to represent protein sequences as text. This breakthrough enables sequence-to-property prediction for peptides without relying on explicit structural data. I...
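To make the sequence-to-property setup concrete, here is a minimal sketch, not the authors' implementation: it assumes the Hugging Face `transformers` and `torch` packages, the `Rostlab/prot_bert` checkpoint as a stand-in BERT-style protein backbone, and an illustrative binary property label; PeptideBERT's actual backbone, head, and training details are described in the article itself.

```python
# Sketch: treat a peptide sequence as text and predict a property with a
# BERT-style protein language model. The checkpoint and the binary label
# below are illustrative assumptions, not taken from the record above.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("Rostlab/prot_bert")
model = AutoModelForSequenceClassification.from_pretrained(
    "Rostlab/prot_bert", num_labels=2  # hypothetical binary property
)

# ProtBERT-style tokenizers expect amino acids separated by spaces.
peptide = "G L F D I V K K V V G A L G S L"
inputs = tokenizer(peptide, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.softmax(logits, dim=-1)
print(probs)  # class probabilities from the (here untrained) head
```

In practice the classification head would be fine-tuned on labeled peptide sequences before the probabilities are meaningful; the sketch only shows the tokenize-then-classify data flow.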
Main authors: | Guntuboina, Chakradhar; Das, Adrita; Mollaei, Parisa; Kim, Seongwon; Barati Farimani, Amir
---|---
Format: | Online Article Text
Language: | English
Published: | American Chemical Society, 2023
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10683064/ https://www.ncbi.nlm.nih.gov/pubmed/37956397 http://dx.doi.org/10.1021/acs.jpclett.3c02398
Similar items
- Activity Map and Transition Pathways of G Protein-Coupled Receptor Revealed by Machine Learning — by: Mollaei, Parisa, et al. Published: (2023)
- Unveiling Switching Function of Amino Acids in Proteins Using a Machine Learning Approach — by: Mollaei, Parisa, et al. Published: (2023)
- Prediction of GPCR activity using machine learning — by: Yadav, Prakarsh, et al. Published: (2022)
- IUP-BERT: Identification of Umami Peptides Based on BERT Features — by: Jiang, Liangzhen, et al. Published: (2022)
- MOFormer: Self-Supervised Transformer Model for Metal–Organic Framework Property Prediction — by: Cao, Zhonglin, et al. Published: (2023)