A Relational Adaptive Neural Model for Joint Entity and Relation Extraction
Relation extraction is a popular subtask in natural language processing (NLP). In joint entity and relation extraction, overlapping entities and multi-type relation extraction within overlapping triplets remain challenging problems. Classifying all relations in a single shared probabilit...
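The record's description names two components of MA-DCGCN: a multi-head attention mechanism that scores each relation type for an entity pair in its own probability space, and a densely connected graph convolutional network that propagates deeper structural information over the text graph. The sketch below is a minimal, illustrative PyTorch rendering of those two ideas only; the class names, dimensions, per-type sigmoid scoring, and dense-connection layout are assumptions for illustration, not the authors' released implementation.

```python
# Illustrative sketch only: one plausible PyTorch shape for the two components
# described in the abstract. All names and sizes are assumptions.
import torch
import torch.nn as nn


class RelationTypeAttention(nn.Module):
    """Scores every relation type for an entity pair independently (sigmoid per
    type), so relation probabilities are not forced into one mutually exclusive
    softmax space."""

    def __init__(self, hidden_dim: int, num_relations: int):
        super().__init__()
        # one learned query vector per relation type acts as an attention "head"
        self.rel_queries = nn.Parameter(torch.randn(num_relations, hidden_dim))
        self.pair_proj = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, head_repr: torch.Tensor, tail_repr: torch.Tensor) -> torch.Tensor:
        # head_repr, tail_repr: (batch, hidden_dim) representations of an entity pair
        pair = torch.tanh(self.pair_proj(torch.cat([head_repr, tail_repr], dim=-1)))
        scores = pair @ self.rel_queries.t()      # (batch, num_relations)
        return torch.sigmoid(scores)              # independent strength per relation type


class DenselyConnectedGCN(nn.Module):
    """Densely connected GCN: each layer receives the concatenation of the input
    and all previous layers' outputs, exposing deeper structural information."""

    def __init__(self, in_dim: int, growth_dim: int, num_layers: int):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.Linear(in_dim + i * growth_dim, growth_dim) for i in range(num_layers)
        )

    def forward(self, node_feats: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # node_feats: (num_nodes, in_dim); adj: (num_nodes, num_nodes) normalized adjacency
        features = [node_feats]
        for layer in self.layers:
            x = torch.cat(features, dim=-1)
            features.append(torch.relu(adj @ layer(x)))
        return torch.cat(features, dim=-1)


if __name__ == "__main__":
    att = RelationTypeAttention(hidden_dim=64, num_relations=24)
    gcn = DenselyConnectedGCN(in_dim=64, growth_dim=32, num_layers=3)
    nodes = torch.randn(10, 64)            # 10 token nodes in the text graph
    adj = torch.eye(10)                    # placeholder adjacency matrix
    node_out = gcn(nodes, adj)             # (10, 64 + 3 * 32)
    probs = att(nodes[0:1], nodes[5:6])    # relation-type strengths for one entity pair
    print(node_out.shape, probs.shape)
```

The per-type sigmoid keeps relation scores independent of one another, which is what allows one entity pair to participate in overlapping triplets with several relation types at once.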
Main Authors: | Duan, Guiduo; Miao, Jiayu; Huang, Tianxi; Luo, Wenlong; Hu, Dekun |
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A. 2021 |
Subjects: | Neuroscience |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8008121/ https://www.ncbi.nlm.nih.gov/pubmed/33796016 http://dx.doi.org/10.3389/fnbot.2021.635492 |
_version_ | 1783672634886586368 |
author | Duan, Guiduo Miao, Jiayu Huang, Tianxi Luo, Wenlong Hu, Dekun |
author_facet | Duan, Guiduo Miao, Jiayu Huang, Tianxi Luo, Wenlong Hu, Dekun |
author_sort | Duan, Guiduo |
collection | PubMed |
description | Relation extraction is a popular subtask in natural language processing (NLP). In joint entity and relation extraction, overlapping entities and multi-type relation extraction within overlapping triplets remain challenging problems. Classifying all relations in a single shared probability space ignores the correlation information among multiple relations. This paper proposes a relational-adaptive joint entity and relation extraction model based on multi-head self-attention and a densely connected graph convolutional network, called MA-DCGCN. In the model, the multi-head attention mechanism assigns weights to the multiple relation types between entities, so that the probability spaces of different relations are not mutually exclusive, and flexibly predicts the strength of association between each relation type and entity pair. The densely connected graph convolutional network extracts deeper structural information from the text graph and captures the interaction information between entities and relations. To demonstrate the performance of our model, we conducted extensive experiments on two widely used public datasets, NYT and WebNLG. The results show that our model achieves state-of-the-art performance; in particular, the detection of overlapping triplets is significantly improved compared with several existing mainstream methods. |
format | Online Article Text |
id | pubmed-8008121 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-8008121 2021-03-31 A Relational Adaptive Neural Model for Joint Entity and Relation Extraction Duan, Guiduo Miao, Jiayu Huang, Tianxi Luo, Wenlong Hu, Dekun Front Neurorobot Neuroscience Relation extraction is a popular subtask in natural language processing (NLP). In joint entity and relation extraction, overlapping entities and multi-type relation extraction within overlapping triplets remain challenging problems. Classifying all relations in a single shared probability space ignores the correlation information among multiple relations. This paper proposes a relational-adaptive joint entity and relation extraction model based on multi-head self-attention and a densely connected graph convolutional network, called MA-DCGCN. In the model, the multi-head attention mechanism assigns weights to the multiple relation types between entities, so that the probability spaces of different relations are not mutually exclusive, and flexibly predicts the strength of association between each relation type and entity pair. The densely connected graph convolutional network extracts deeper structural information from the text graph and captures the interaction information between entities and relations. To demonstrate the performance of our model, we conducted extensive experiments on two widely used public datasets, NYT and WebNLG. The results show that our model achieves state-of-the-art performance; in particular, the detection of overlapping triplets is significantly improved compared with several existing mainstream methods. Frontiers Media S.A. 2021-03-16 /pmc/articles/PMC8008121/ /pubmed/33796016 http://dx.doi.org/10.3389/fnbot.2021.635492 Text en Copyright © 2021 Duan, Miao, Huang, Luo and Hu. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Neuroscience Duan, Guiduo Miao, Jiayu Huang, Tianxi Luo, Wenlong Hu, Dekun A Relational Adaptive Neural Model for Joint Entity and Relation Extraction |
title | A Relational Adaptive Neural Model for Joint Entity and Relation Extraction |
title_full | A Relational Adaptive Neural Model for Joint Entity and Relation Extraction |
title_fullStr | A Relational Adaptive Neural Model for Joint Entity and Relation Extraction |
title_full_unstemmed | A Relational Adaptive Neural Model for Joint Entity and Relation Extraction |
title_short | A Relational Adaptive Neural Model for Joint Entity and Relation Extraction |
title_sort | relational adaptive neural model for joint entity and relation extraction |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8008121/ https://www.ncbi.nlm.nih.gov/pubmed/33796016 http://dx.doi.org/10.3389/fnbot.2021.635492 |
work_keys_str_mv | AT duanguiduo arelationaladaptiveneuralmodelforjointentityandrelationextraction AT miaojiayu arelationaladaptiveneuralmodelforjointentityandrelationextraction AT huangtianxi arelationaladaptiveneuralmodelforjointentityandrelationextraction AT luowenlong arelationaladaptiveneuralmodelforjointentityandrelationextraction AT hudekun arelationaladaptiveneuralmodelforjointentityandrelationextraction AT duanguiduo relationaladaptiveneuralmodelforjointentityandrelationextraction AT miaojiayu relationaladaptiveneuralmodelforjointentityandrelationextraction AT huangtianxi relationaladaptiveneuralmodelforjointentityandrelationextraction AT luowenlong relationaladaptiveneuralmodelforjointentityandrelationextraction AT hudekun relationaladaptiveneuralmodelforjointentityandrelationextraction |