On the effectiveness of compact biomedical transformers
MOTIVATION: Language models pre-trained on biomedical corpora, such as BioBERT, have recently shown promising results on downstream biomedical tasks. Many existing pre-trained models, on the other hand, are resource-intensive and computationally heavy owing to factors such as embedding size, hidden...
Main Authors: Rohanian, Omid; Nouriborji, Mohammadmahdi; Kouchaki, Samaneh; Clifton, David A
Format: Online Article Text
Language: English
Published: Oxford University Press, 2023
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10027428/ | https://www.ncbi.nlm.nih.gov/pubmed/36825820 | http://dx.doi.org/10.1093/bioinformatics/btad103
Similar Items
- An Unsupervised Data-Driven Anomaly Detection Approach for Adverse Health Conditions in People Living With Dementia: Cohort Study
  by: Bijlani, Nivedita, et al.
  Published: (2022)
- Application of machine learning techniques to tuberculosis drug resistance analysis
  by: Kouchaki, Samaneh, et al.
  Published: (2019)
- Compact and ultracompact spectral imagers: technology and applications in biomedical imaging
  by: Tran, Minh H., et al.
  Published: (2023)
- The Utility of Data Transformation for Alignment, De Novo Assembly and Classification of Short Read Virus Sequences
  by: Tapinos, Avraam, et al.
  Published: (2019)
- Uniform resolution of compact identifiers for biomedical data
  by: Wimalaratne, Sarala M., et al.
  Published: (2018)