
Text-Graph Enhanced Knowledge Graph Representation Learning

Knowledge Graphs (KGs) such as Freebase and YAGO have been widely adopted in a variety of NLP tasks. Representation learning of KGs aims to map entities and relationships into a continuous low-dimensional vector space. Conventional KG embedding methods (such as TransE and ConvE) utilize only KG triplets and thus suffer from structure sparsity. Some recent works address this issue by incorporating auxiliary texts of entities, typically entity descriptions. However, these methods usually focus only on local consecutive word sequences and seldom explicitly use global word co-occurrence information in a corpus. In this paper, we propose to model the whole auxiliary text corpus as a graph and present an end-to-end text-graph enhanced KG embedding model, named Teger. Specifically, we model the auxiliary texts with a heterogeneous entity-word graph (called the text-graph), which captures both local and global semantic relationships among entities and words. We then apply graph convolutional networks to learn informative entity embeddings that aggregate high-order neighborhood information. These embeddings are further integrated with the KG triplet embeddings via a gating mechanism, thus enriching the KG representations and alleviating the inherent structure sparsity. Experiments on benchmark datasets show that our method significantly outperforms several state-of-the-art methods.
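The heterogeneous entity-word text-graph described in the abstract can be sketched in code. The construction below is a hypothetical illustration, not the paper's published implementation: it assumes TextGCN-style edge weights (TF-IDF for entity-word edges, capturing global corpus statistics, and positive PMI over sliding windows for word-word edges, capturing local co-occurrence); `build_text_graph` and its parameters are names invented for this sketch.

```python
import math
from collections import Counter
from itertools import combinations

def build_text_graph(entity_descriptions, window=5):
    """Build a heterogeneous entity-word graph (hypothetical sketch).

    Nodes are entities and vocabulary words. Edge weights (assumed, in the
    style of TextGCN): entity-word = TF-IDF; word-word = positive PMI over
    sliding windows of the descriptions.
    """
    # Document frequency of each word across entity descriptions.
    df = Counter()
    for tokens in entity_descriptions.values():
        df.update(set(tokens))
    n_docs = len(entity_descriptions)

    edges = {}  # (node_u, node_v) -> weight

    # Entity-word edges: TF-IDF ties an entity to the words describing it.
    for ent, tokens in entity_descriptions.items():
        tf = Counter(tokens)
        for w, c in tf.items():
            edges[(ent, w)] = (c / len(tokens)) * math.log(n_docs / df[w])

    # Word-word edges: PMI over fixed-size sliding windows.
    win_count = Counter()   # how many windows each word appears in
    pair_count = Counter()  # how many windows each word pair co-occurs in
    n_windows = 0
    for tokens in entity_descriptions.values():
        for i in range(max(1, len(tokens) - window + 1)):
            win = set(tokens[i:i + window])
            n_windows += 1
            win_count.update(win)
            pair_count.update(combinations(sorted(win), 2))

    for (w1, w2), c in pair_count.items():
        p_xy = c / n_windows
        p_x = win_count[w1] / n_windows
        p_y = win_count[w2] / n_windows
        pmi = math.log(p_xy / (p_x * p_y))
        if pmi > 0:  # keep only positively associated pairs
            edges[(w1, w2)] = pmi
    return edges
```

Running graph convolutions over such a graph lets an entity aggregate information from words it never directly co-occurs with, via shared word neighbors — which is what gives the model access to global co-occurrence structure.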


Bibliographic Details
Main Authors: Hu, Linmei, Zhang, Mengmei, Li, Shaohua, Shi, Jinghan, Shi, Chuan, Yang, Cheng, Liu, Zhiyuan
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2021
Subjects: Artificial Intelligence
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8418144/
https://www.ncbi.nlm.nih.gov/pubmed/34490421
http://dx.doi.org/10.3389/frai.2021.697856
_version_ 1783748524700073984
author Hu, Linmei
Zhang, Mengmei
Li, Shaohua
Shi, Jinghan
Shi, Chuan
Yang, Cheng
Liu, Zhiyuan
author_facet Hu, Linmei
Zhang, Mengmei
Li, Shaohua
Shi, Jinghan
Shi, Chuan
Yang, Cheng
Liu, Zhiyuan
author_sort Hu, Linmei
collection PubMed
description Knowledge Graphs (KGs) such as Freebase and YAGO have been widely adopted in a variety of NLP tasks. Representation learning of KGs aims to map entities and relationships into a continuous low-dimensional vector space. Conventional KG embedding methods (such as TransE and ConvE) utilize only KG triplets and thus suffer from structure sparsity. Some recent works address this issue by incorporating auxiliary texts of entities, typically entity descriptions. However, these methods usually focus only on local consecutive word sequences and seldom explicitly use global word co-occurrence information in a corpus. In this paper, we propose to model the whole auxiliary text corpus as a graph and present an end-to-end text-graph enhanced KG embedding model, named Teger. Specifically, we model the auxiliary texts with a heterogeneous entity-word graph (called the text-graph), which captures both local and global semantic relationships among entities and words. We then apply graph convolutional networks to learn informative entity embeddings that aggregate high-order neighborhood information. These embeddings are further integrated with the KG triplet embeddings via a gating mechanism, thus enriching the KG representations and alleviating the inherent structure sparsity. Experiments on benchmark datasets show that our method significantly outperforms several state-of-the-art methods.
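The remaining pipeline stages named in the abstract — graph convolution over the text-graph, gated fusion with triplet-based embeddings, and a TransE-style triplet score — can be illustrated with a minimal NumPy sketch. The element-wise sigmoid gate and the symmetric GCN normalization are standard choices assumed here, not details taken from the paper; all function names (`gcn_layer`, `gate_fuse`, `transe_score`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension

def gcn_layer(A, H, W):
    """One graph convolution: ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])                    # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ H @ W, 0.0)

def gate_fuse(e_kg, e_text, gate_logits):
    """Element-wise gate blending triplet-based and text-graph embeddings."""
    g = 1.0 / (1.0 + np.exp(-gate_logits))            # sigmoid gate in (0, 1)
    return g * e_kg + (1.0 - g) * e_text

def transe_score(h, r, t):
    """TransE energy: lower ||h + r - t|| means a more plausible triplet."""
    return np.linalg.norm(h + r - t)

# Toy weighted text-graph over 4 nodes (2 entities, 2 words).
A = np.array([[0., 0., 1., 1.],
              [0., 0., 1., 0.],
              [1., 1., 0., 1.],
              [1., 0., 1., 0.]])
H = rng.standard_normal((4, d))   # initial node features
W1 = rng.standard_normal((d, d))  # layer weights
Z = gcn_layer(A, H, W1)           # text-graph node embeddings

# Fuse entity 0's text-graph embedding with a triplet-based embedding.
e_kg = rng.standard_normal(d)
e_fused = gate_fuse(e_kg, Z[0], rng.standard_normal(d))
```

The gate lets the model fall back on text-graph evidence exactly where the triplet structure is sparse: for a well-connected entity the learned gate can lean toward `e_kg`, while a rarely-linked entity can draw more heavily on `e_text`.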
format Online
Article
Text
id pubmed-8418144
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-8418144 2021-09-05 Text-Graph Enhanced Knowledge Graph Representation Learning Hu, Linmei Zhang, Mengmei Li, Shaohua Shi, Jinghan Shi, Chuan Yang, Cheng Liu, Zhiyuan Front Artif Intell Artificial Intelligence Frontiers Media S.A. 2021-08-17 /pmc/articles/PMC8418144/ /pubmed/34490421 http://dx.doi.org/10.3389/frai.2021.697856 Text en Copyright © 2021 Hu, Zhang, Li, Shi, Shi, Yang and Liu. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). 
The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Artificial Intelligence
Hu, Linmei
Zhang, Mengmei
Li, Shaohua
Shi, Jinghan
Shi, Chuan
Yang, Cheng
Liu, Zhiyuan
Text-Graph Enhanced Knowledge Graph Representation Learning
title Text-Graph Enhanced Knowledge Graph Representation Learning
title_full Text-Graph Enhanced Knowledge Graph Representation Learning
title_fullStr Text-Graph Enhanced Knowledge Graph Representation Learning
title_full_unstemmed Text-Graph Enhanced Knowledge Graph Representation Learning
title_short Text-Graph Enhanced Knowledge Graph Representation Learning
title_sort text-graph enhanced knowledge graph representation learning
topic Artificial Intelligence
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8418144/
https://www.ncbi.nlm.nih.gov/pubmed/34490421
http://dx.doi.org/10.3389/frai.2021.697856
work_keys_str_mv AT hulinmei textgraphenhancedknowledgegraphrepresentationlearning
AT zhangmengmei textgraphenhancedknowledgegraphrepresentationlearning
AT lishaohua textgraphenhancedknowledgegraphrepresentationlearning
AT shijinghan textgraphenhancedknowledgegraphrepresentationlearning
AT shichuan textgraphenhancedknowledgegraphrepresentationlearning
AT yangcheng textgraphenhancedknowledgegraphrepresentationlearning
AT liuzhiyuan textgraphenhancedknowledgegraphrepresentationlearning