
Do syntactic trees enhance Bidirectional Encoder Representations from Transformers (BERT) models for chemical–drug relation extraction?

Collecting relations between chemicals and drugs is crucial in biomedical research. Pre-trained transformer models, e.g. Bidirectional Encoder Representations from Transformers (BERT), have been shown to have limitations on biomedical texts; more specifically, the lack of annotated data makes relation e...


Bibliographic Details
Main Authors: Tang, Anfu, Deléger, Louise, Bossy, Robert, Zweigenbaum, Pierre, Nédellec, Claire
Format: Online Article Text
Language: English
Published: Oxford University Press 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9408061/
https://www.ncbi.nlm.nih.gov/pubmed/36006843
http://dx.doi.org/10.1093/database/baac070