cdsBERT - Extending Protein Language Models with Codon Awareness
Recent advancements in Protein Language Models (pLMs) have enabled high-throughput analysis of proteins through primary sequence alone. At the same time, newfound evidence illustrates that codon usage bias is remarkably predictive and can even change the final structure of a protein. Here, we explor...
Main Authors: Hallee, Logan; Rafailidis, Nikolaos; Gleghorn, Jason P.
Format: Online Article Text
Language: English
Published: Cold Spring Harbor Laboratory, 2023
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10516008/
https://www.ncbi.nlm.nih.gov/pubmed/37745387
http://dx.doi.org/10.1101/2023.09.15.558027
Similar Items
- Graph-BERT and language model-based framework for protein–protein interaction identification
  by: Jha, Kanchan, et al.
  Published: (2023)
- To BERT or Not to BERT: Dealing with Possible BERT Failures in an Entailment Task
  by: Fialho, Pedro, et al.
  Published: (2020)
- BatteryBERT: A Pretrained Language Model for Battery Database Enhancement
  by: Huang, Shu, et al.
  Published: (2022)
- PeptideBERT: A Language Model Based on Transformers for Peptide Property Prediction
  by: Guntuboina, Chakradhar, et al.
  Published: (2023)
- BatteryDataExtractor: battery-aware text-mining software embedded with BERT models
  by: Huang, Shu, et al.
  Published: (2022)