Do syntactic trees enhance Bidirectional Encoder Representations from Transformers (BERT) models for chemical–drug relation extraction?
Collecting relations between chemicals and drugs is crucial in biomedical research. Pre-trained transformer models, e.g. Bidirectional Encoder Representations from Transformers (BERT), are shown to have limitations on biomedical texts; more specifically, the lack of annotated data makes relation e...
Main authors: Tang, Anfu; Deléger, Louise; Bossy, Robert; Zweigenbaum, Pierre; Nédellec, Claire
Format: Online Article Text
Language: English
Published: Oxford University Press, 2022
Subjects:
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9408061/ https://www.ncbi.nlm.nih.gov/pubmed/36006843 http://dx.doi.org/10.1093/database/baac070
Similar items
- C-Norm: a neural approach to few-shot entity normalization
  by: Ferré, Arnaud, et al.
  Published: (2020)
- Transfer Learning for Sentiment Classification Using Bidirectional Encoder Representations from Transformers (BERT) Model
  by: Areshey, Ali, et al.
  Published: (2023)
- BERT-Kgly: A Bidirectional Encoder Representations From Transformers (BERT)-Based Model for Predicting Lysine Glycation Site for Homo sapiens
  by: Liu, Yinbo, et al.
  Published: (2022)
- An Improved BERT and Syntactic Dependency Representation Model for Sentiment Analysis
  by: Liu, Wenfeng, et al.
  Published: (2022)
- Text mining tools for extracting information about microbial biodiversity in food
  by: Chaix, Estelle, et al.
  Published: (2019)