Relation classification via BERT with piecewise convolution and focal loss
The architectures of recent relation extraction models have evolved from shallow neural networks, such as convolutional and recurrent neural networks, to language models such as BERT. However, these methods do not consider the semantic information in the sequence or the distance dependence...
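Note: the focal loss named in the title is a standard re-weighting of cross-entropy that down-weights well-classified examples. Below is a minimal PyTorch sketch for multi-class relation classification; the function name, the gamma/alpha values, and the class count are illustrative assumptions, not the paper's settings.

```python
# Minimal multi-class focal loss sketch (illustrative, not the paper's exact implementation).
import torch
import torch.nn.functional as F

def focal_loss(logits: torch.Tensor, targets: torch.Tensor,
               gamma: float = 2.0, alpha: float = 0.25) -> torch.Tensor:
    """logits: (batch, num_classes); targets: (batch,) integer class indices."""
    log_probs = F.log_softmax(logits, dim=-1)                       # log p for every class
    log_pt = log_probs.gather(1, targets.unsqueeze(1)).squeeze(1)   # log p_t of the true class
    pt = log_pt.exp()
    # (1 - p_t)^gamma down-weights easy examples; alpha is a global balancing factor.
    loss = -alpha * (1.0 - pt) ** gamma * log_pt
    return loss.mean()

if __name__ == "__main__":
    logits = torch.randn(4, 10)            # e.g. 10 relation classes (assumed for the demo)
    targets = torch.randint(0, 10, (4,))
    print(focal_loss(logits, targets))
```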
Main Authors: Liu, Jianyi; Duan, Xi; Zhang, Ru; Sun, Youqiang; Guan, Lei; Lin, Bingjie
Format: Online Article Text
Language: English
Published: Public Library of Science, 2021
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8432804/ | https://www.ncbi.nlm.nih.gov/pubmed/34506554 | http://dx.doi.org/10.1371/journal.pone.0257092
Similar Items
- Fusion-ConvBERT: Parallel Convolution and BERT Fusion for Speech Emotion Recognition
  by: Lee, Sanghyun, et al.
  Published: (2020)
- VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification
  by: Lu, Zhibin, et al.
  Published: (2020)
- Chemical-induced disease extraction via recurrent piecewise convolutional neural networks
  by: Li, Haodi, et al.
  Published: (2018)
- BertSRC: transformer-based semantic relation classification
  by: Lee, Yeawon, et al.
  Published: (2022)
- srBERT: automatic article classification model for systematic review using BERT
  by: Aum, Sungmin, et al.
  Published: (2021)