Construction and Application of Text Entity Relation Joint Extraction Model Based on Multi-Head Attention Neural Network

Bibliographic Details
Main Authors: Xue, Yafei, Zhu, Jing, Lyu, Jing
Format: Online Article Text
Language: English
Published: Hindawi 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9155959/
https://www.ncbi.nlm.nih.gov/pubmed/35655501
http://dx.doi.org/10.1155/2022/1530295
_version_ 1784718350135328768
author Xue, Yafei
Zhu, Jing
Lyu, Jing
author_facet Xue, Yafei
Zhu, Jing
Lyu, Jing
author_sort Xue, Yafei
collection PubMed
description Entity relation extraction is a key area of information extraction and an important research topic in natural language processing. Building on prior work, this paper proposes a joint extraction model based on a multi-head attention neural network. On top of the BERT pre-trained model architecture, the model performs the textual entity and relation extraction tasks jointly, while integrating named-entity features, terminology-labeling features, and relation information. A multi-head attention mechanism and improved neural structures are added to the model to strengthen its feature extraction capacity. A study of the multi-head attention parameters shows that the optimal settings are h = 8 and dv = 16, at which the model achieves its best classification performance. In the experimental analysis, the traditional text entity relation extraction model is compared with the multi-head attention joint extraction model, and extraction quality is evaluated in terms of the comprehensive evaluation index F1, the accuracy rate P, and the system time consumed. The experiments show the following. First, on the accuracy metric, Xception performs best, reaching 87.7%, indicating that the model's feature extraction is enhanced. Second, as the number of iterations increases, the validation-set and training-set curves rise to 96% and 98%, respectively, so the model has strong generalization ability. Third, the model completes extraction over the entire test set in 1005 ms, which is an acceptable speed. Overall, the test results reported in this article are good, and the model has strong practical value.
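Note: the abstract reports the multi-head attention configuration h = 8 and dv = 16 without further detail. The following minimal PyTorch sketch shows what a multi-head self-attention layer with those settings looks like; the model dimension of 128, the input shapes, and the class name are illustrative assumptions, not details taken from the paper.

# Illustrative sketch only: multi-head self-attention with h = 8 heads and
# per-head value dimension d_v = 16 (so d_model = 8 * 16 = 128 is assumed).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadSelfAttention(nn.Module):
    def __init__(self, d_model: int = 128, num_heads: int = 8, d_v: int = 16):
        super().__init__()
        assert num_heads * d_v == d_model, "heads * d_v must equal d_model in this sketch"
        self.num_heads, self.d_v = num_heads, d_v
        # Separate linear projections for queries, keys, values, and the output.
        self.w_q = nn.Linear(d_model, d_model)
        self.w_k = nn.Linear(d_model, d_model)
        self.w_v = nn.Linear(d_model, d_model)
        self.w_o = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model), e.g. BERT token representations.
        b, n, _ = x.shape

        def split(t: torch.Tensor) -> torch.Tensor:
            # Reshape to (batch, heads, seq_len, d_v).
            return t.view(b, n, self.num_heads, self.d_v).transpose(1, 2)

        q, k, v = split(self.w_q(x)), split(self.w_k(x)), split(self.w_v(x))
        # Scaled dot-product attention computed independently per head.
        scores = q @ k.transpose(-2, -1) / (self.d_v ** 0.5)
        attn = F.softmax(scores, dim=-1)
        out = attn @ v                                  # (batch, heads, seq_len, d_v)
        out = out.transpose(1, 2).reshape(b, n, -1)     # concatenate the heads
        return self.w_o(out)

# Example usage: 2 sentences of 10 tokens with 128-dimensional encoder outputs.
mha = MultiHeadSelfAttention(d_model=128, num_heads=8, d_v=16)
print(mha(torch.randn(2, 10, 128)).shape)  # torch.Size([2, 10, 128])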
format Online
Article
Text
id pubmed-9155959
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Hindawi
record_format MEDLINE/PubMed
spelling pubmed-91559592022-06-01 Construction and Application of Text Entity Relation Joint Extraction Model Based on Multi-Head Attention Neural Network Xue, Yafei Zhu, Jing Lyu, Jing Comput Intell Neurosci Research Article Entity relationship extraction is one of the key areas of information extraction and is an important research content in the field of natural language processing. Based on past research, this paper proposes a combined extraction model based on a multi-headed attention neural network. Based on the BERT training model architecture, this paper extracts textual entities and relations tasks. At the same time, it integrates the naming entity feature, the terminology labeling characteristics, and the training relationship. The multi-attention mechanism and improved neural structures are added to the model to enhance the characteristic extraction capacity of the model. By studying the parameters of the multi-head attention mechanism, it is shown that the optimal parameters of the multi-head attention are h = 8, dv = 16, and the classification effect of the model is the best at this time. After experimental analysis, comparing the traditional text entity relationship extraction model and the multi-head attention neural network joint extraction model, the model entity relationship extraction effect was evaluated from the aspects of comprehensive evaluation index F1, accuracy rate P, and system time consumed. Experiments show: First, in the accuracy indicator, Xception performance is best, reaching 87.7%, indicating that the model extraction feature effect is enhanced. Second, with the increase of the number of iterative times, the verification set curve and the training set curve have increased to 96% and 98%, respectively, and the model has a strong generalization ability. Third, the model completes the extraction of all data in the test set in 1005 ms, which is an acceptable speed. Therefore, the model test results in this article are good, with a strong practical value. Hindawi 2022-05-24 /pmc/articles/PMC9155959/ /pubmed/35655501 http://dx.doi.org/10.1155/2022/1530295 Text en Copyright © 2022 Yafei Xue et al. https://creativecommons.org/licenses/by/4.0/This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Research Article
Xue, Yafei
Zhu, Jing
Lyu, Jing
Construction and Application of Text Entity Relation Joint Extraction Model Based on Multi-Head Attention Neural Network
title Construction and Application of Text Entity Relation Joint Extraction Model Based on Multi-Head Attention Neural Network
title_full Construction and Application of Text Entity Relation Joint Extraction Model Based on Multi-Head Attention Neural Network
title_fullStr Construction and Application of Text Entity Relation Joint Extraction Model Based on Multi-Head Attention Neural Network
title_full_unstemmed Construction and Application of Text Entity Relation Joint Extraction Model Based on Multi-Head Attention Neural Network
title_short Construction and Application of Text Entity Relation Joint Extraction Model Based on Multi-Head Attention Neural Network
title_sort construction and application of text entity relation joint extraction model based on multi-head attention neural network
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9155959/
https://www.ncbi.nlm.nih.gov/pubmed/35655501
http://dx.doi.org/10.1155/2022/1530295
work_keys_str_mv AT xueyafei constructionandapplicationoftextentityrelationjointextractionmodelbasedonmultiheadattentionneuralnetwork
AT zhujing constructionandapplicationoftextentityrelationjointextractionmodelbasedonmultiheadattentionneuralnetwork
AT lyujing constructionandapplicationoftextentityrelationjointextractionmodelbasedonmultiheadattentionneuralnetwork