Multipath Cross Graph Convolution for Knowledge Representation Learning
In the past, most embedding-based entity prediction methods lacked training of local core relationships, resulting in a deficiency in end-to-end training. To address this problem, we propose an end-to-end knowledge graph embedding representation method that combines local graph convolution with global cross learning, called the TransC graph convolutional network (TransC-GCN).
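The record's abstract describes a pipeline in which a translation model maps entities and relations into vectors that feed a graph convolutional network, with max pooling used to retain the strongest local feature signals. The snippet below is a minimal sketch of that kind of pipeline in PyTorch, assuming a TransE-style translation embedding and a single GCN layer; all class names, dimensions, and the toy adjacency matrix are illustrative assumptions, not the article's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical sketch only: a TransE-style translation embedding feeding a
# single graph-convolution layer with max pooling. Names, dimensions, and the
# toy adjacency matrix are illustrative and not taken from the article.

class TranslationEmbedding(nn.Module):
    """TransE-style embeddings: for a plausible triple (h, r, t), h + r ≈ t."""
    def __init__(self, num_entities: int, num_relations: int, dim: int):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)
        nn.init.xavier_uniform_(self.ent.weight)
        nn.init.xavier_uniform_(self.rel.weight)

    def score(self, h, r, t):
        # Lower score = more plausible triple.
        return torch.norm(self.ent(h) + self.rel(r) - self.ent(t), p=1, dim=-1)

class LocalGCNLayer(nn.Module):
    """One graph convolution: aggregate neighbor features via the adjacency matrix."""
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # x: (N, in_dim) node features; adj: (N, N) normalized adjacency with self-loops.
        return F.relu(self.linear(adj @ x))

# Toy usage: 5 entities, 2 relations, 16-dimensional embeddings.
emb = TranslationEmbedding(num_entities=5, num_relations=2, dim=16)
gcn = LocalGCNLayer(16, 16)

adj = torch.eye(5)                 # placeholder adjacency (self-loops only)
hidden = gcn(emb.ent.weight, adj)  # (5, 16) local entity features
pooled, _ = hidden.max(dim=0)      # max pooling keeps the strongest signal per channel

triple_score = emb.score(torch.tensor([0]), torch.tensor([0]), torch.tensor([1]))
print(pooled.shape, triple_score.item())
```

In the full method described in the abstract, the translation embeddings and the GCN would be trained end to end against a posterior loss based on mutual information entropy; this sketch stops at the forward pass.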
Main Authors: | Tian, Luogeng; Yang, Bailong; Yin, Xinli; Kang, Kai; Wu, Jing
---|---
Format: | Online Article Text
Language: | English
Published: | Hindawi, 2021
Subjects: | Research Article
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8727103/ https://www.ncbi.nlm.nih.gov/pubmed/34992642 http://dx.doi.org/10.1155/2021/2547905
_version_ | 1784626443523719168 |
---|---
author | Tian, Luogeng Yang, Bailong Yin, Xinli Kang, Kai Wu, Jing |
author_facet | Tian, Luogeng Yang, Bailong Yin, Xinli Kang, Kai Wu, Jing |
author_sort | Tian, Luogeng |
collection | PubMed |
description | In the past, most embedding-based entity prediction methods lacked training of local core relationships, resulting in a deficiency in end-to-end training. To address this problem, we propose an end-to-end knowledge graph embedding representation method that combines local graph convolution with global cross learning, called the TransC graph convolutional network (TransC-GCN). Firstly, multiple local semantic spaces are partitioned according to the largest neighborhood. Secondly, a translation model maps the local entities and relationships into a cross vector, which serves as the input to the GCN. Thirdly, through training on the local semantic relations, the best entities and strongest relations are found, and the optimal ranking of entity-relation combinations is obtained by evaluating a posterior loss function based on mutual information entropy. Experiments show that the proposed method extracts local entity features more accurately through the convolution operations of a lightweight convolutional neural network, while max pooling captures the strongest signals in the local features and thereby avoids globally redundant features. Compared with mainstream triple prediction baseline models, the proposed algorithm effectively reduces computational complexity while remaining robust, and it increases the inference accuracy of entities and relations by 8.1% and 4.4%, respectively. In short, this new method not only effectively extracts the local node and relationship features of a knowledge graph but also satisfies the requirements of multilayer penetration and relationship derivation in a knowledge graph. |
format | Online Article Text |
id | pubmed-8727103 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | Hindawi |
record_format | MEDLINE/PubMed |
spelling | pubmed-8727103 2022-01-05 Multipath Cross Graph Convolution for Knowledge Representation Learning Tian, Luogeng Yang, Bailong Yin, Xinli Kang, Kai Wu, Jing Comput Intell Neurosci Research Article In the past, most embedding-based entity prediction methods lacked training of local core relationships, resulting in a deficiency in end-to-end training. To address this problem, we propose an end-to-end knowledge graph embedding representation method that combines local graph convolution with global cross learning, called the TransC graph convolutional network (TransC-GCN). Firstly, multiple local semantic spaces are partitioned according to the largest neighborhood. Secondly, a translation model maps the local entities and relationships into a cross vector, which serves as the input to the GCN. Thirdly, through training on the local semantic relations, the best entities and strongest relations are found, and the optimal ranking of entity-relation combinations is obtained by evaluating a posterior loss function based on mutual information entropy. Experiments show that the proposed method extracts local entity features more accurately through the convolution operations of a lightweight convolutional neural network, while max pooling captures the strongest signals in the local features and thereby avoids globally redundant features. Compared with mainstream triple prediction baseline models, the proposed algorithm effectively reduces computational complexity while remaining robust, and it increases the inference accuracy of entities and relations by 8.1% and 4.4%, respectively. In short, this new method not only effectively extracts the local node and relationship features of a knowledge graph but also satisfies the requirements of multilayer penetration and relationship derivation in a knowledge graph. Hindawi 2021-12-28 /pmc/articles/PMC8727103/ /pubmed/34992642 http://dx.doi.org/10.1155/2021/2547905 Text en Copyright © 2021 Luogeng Tian et al. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. |
spellingShingle | Research Article Tian, Luogeng Yang, Bailong Yin, Xinli Kang, Kai Wu, Jing Multipath Cross Graph Convolution for Knowledge Representation Learning |
title | Multipath Cross Graph Convolution for Knowledge Representation Learning |
title_full | Multipath Cross Graph Convolution for Knowledge Representation Learning |
title_fullStr | Multipath Cross Graph Convolution for Knowledge Representation Learning |
title_full_unstemmed | Multipath Cross Graph Convolution for Knowledge Representation Learning |
title_short | Multipath Cross Graph Convolution for Knowledge Representation Learning |
title_sort | multipath cross graph convolution for knowledge representation learning |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8727103/ https://www.ncbi.nlm.nih.gov/pubmed/34992642 http://dx.doi.org/10.1155/2021/2547905 |
work_keys_str_mv | AT tianluogeng multipathcrossgraphconvolutionforknowledgerepresentationlearning AT yangbailong multipathcrossgraphconvolutionforknowledgerepresentationlearning AT yinxinli multipathcrossgraphconvolutionforknowledgerepresentationlearning AT kangkai multipathcrossgraphconvolutionforknowledgerepresentationlearning AT wujing multipathcrossgraphconvolutionforknowledgerepresentationlearning |