On the effectiveness of compact biomedical transformers

MOTIVATION: Language models pre-trained on biomedical corpora, such as BioBERT, have recently shown promising results on downstream biomedical tasks. Many existing pre-trained models, on the other hand, are resource-intensive and computationally heavy owing to factors such as embedding size, hidden...

Bibliographic Details
Main Authors: Rohanian, Omid; Nouriborji, Mohammadmahdi; Kouchaki, Samaneh; Clifton, David A
Format: Online Article Text
Language: English
Published: Oxford University Press 2023
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10027428/
https://www.ncbi.nlm.nih.gov/pubmed/36825820
http://dx.doi.org/10.1093/bioinformatics/btad103