TSSNote-CyaPromBERT: Development of an integrated platform for highly accurate promoter prediction and visualization of Synechococcus sp. and Synechocystis sp. through a state-of-the-art natural language processing model BERT
Since the introduction of the first transformer model with a unique self-attention mechanism, natural language processing (NLP) models have attained state-of-the-art (SOTA) performance on various tasks. As DNA is the blueprint of life, it can be viewed as an unusual language, with its characteristic...
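The abstract frames DNA as an unusual language; in BERT-style genomic models this framing is typically made concrete by splitting a sequence into overlapping k-mer "words" before feeding it to the model. The following minimal, self-contained Python sketch shows only that tokenization step, not the authors' TSSNote-CyaPromBERT pipeline; the example sequence, the choice of k = 6, and the function name kmer_tokenize are illustrative assumptions.

```python
# Minimal sketch (not the authors' pipeline): treating a DNA sequence as a
# "language" by splitting it into overlapping k-mer tokens, the usual input
# representation for BERT-style DNA models.

def kmer_tokenize(sequence: str, k: int = 6) -> list[str]:
    """Split a DNA sequence into overlapping k-mers (stride 1)."""
    sequence = sequence.upper()
    return [sequence[i:i + k] for i in range(len(sequence) - k + 1)]

if __name__ == "__main__":
    # Hypothetical 55-bp upstream fragment; promoter datasets typically use
    # fixed-length windows around a transcription start site (TSS).
    upstream = "TTGACAATTAATCATCGGCTCGTATAATGTGTGGAATTGTGAGCGGATAACAATT"
    tokens = kmer_tokenize(upstream, k=6)
    print(tokens[:5])   # ['TTGACA', 'TGACAA', 'GACAAT', 'ACAATT', 'CAATTA']
    print(len(tokens))  # 50 tokens for a 55-bp input with k = 6
```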
Main Authors: Mai, Dung Hoang Anh; Nguyen, Linh Thanh; Lee, Eun Yeol
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2022
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9745317/
https://www.ncbi.nlm.nih.gov/pubmed/36523764
http://dx.doi.org/10.3389/fgene.2022.1067562
Similar Items
- To BERT or Not to BERT Dealing with Possible BERT Failures in an Entailment Task
  by: Fialho, Pedro, et al.
  Published: (2020)
- VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification
  by: Lu, Zhibin, et al.
  Published: (2020)
- IUP-BERT: Identification of Umami Peptides Based on BERT Features
  by: Jiang, Liangzhen, et al.
  Published: (2022)
- Fine-tuning of BERT Model to Accurately Predict Drug–Target Interactions
  by: Kang, Hyeunseok, et al.
  Published: (2022)
- Fusion-ConvBERT: Parallel Convolution and BERT Fusion for Speech Emotion Recognition
  by: Lee, Sanghyun, et al.
  Published: (2020)