
Embedding Learning with Triple Trustiness on Noisy Knowledge Graph

Bibliographic Details
Main Authors: Zhao, Yu, Feng, Huali, Gallinari, Patrick
Format: Online Article Text
Language: English
Published: MDPI 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7514427/
http://dx.doi.org/10.3390/e21111083
_version_ 1783586585643581440
author Zhao, Yu
Feng, Huali
Gallinari, Patrick
author_facet Zhao, Yu
Feng, Huali
Gallinari, Patrick
author_sort Zhao, Yu
collection PubMed
description Embedding learning on knowledge graphs (KGs) aims to encode all entities and relationships into a continuous vector space, which provides an effective and flexible way to support downstream knowledge-driven artificial intelligence (AI) and natural language processing (NLP) tasks. Since KG construction usually relies on automatic mechanisms with little human supervision, it inevitably introduces a considerable amount of noise into KGs. However, most conventional KG embedding approaches inappropriately assume that all facts in existing KGs are completely correct and ignore this noise, which can lead to serious errors. To address this issue, in this paper we propose a novel approach to learning embeddings with triple trustiness on KGs, which takes possible noise into consideration. Specifically, we calculate the trustiness value of each triple from the rich and relatively reliable information provided by the large number of entity type instances and entity descriptions in KGs. In addition, we present a cross-entropy-based loss function for model optimization. We evaluate our models on KG noise detection, KG completion, and classification; through extensive experiments on three datasets, we demonstrate that our proposed model learns better embeddings than all baselines on noisy KGs. (An illustrative code sketch of this idea follows the record below.)
format Online
Article
Text
id pubmed-7514427
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-7514427 2020-11-09 Embedding Learning with Triple Trustiness on Noisy Knowledge Graph Zhao, Yu Feng, Huali Gallinari, Patrick Entropy (Basel) Article MDPI 2019-11-06 /pmc/articles/PMC7514427/ http://dx.doi.org/10.3390/e21111083 Text en © 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Zhao, Yu
Feng, Huali
Gallinari, Patrick
Embedding Learning with Triple Trustiness on Noisy Knowledge Graph
title Embedding Learning with Triple Trustiness on Noisy Knowledge Graph
title_full Embedding Learning with Triple Trustiness on Noisy Knowledge Graph
title_fullStr Embedding Learning with Triple Trustiness on Noisy Knowledge Graph
title_full_unstemmed Embedding Learning with Triple Trustiness on Noisy Knowledge Graph
title_short Embedding Learning with Triple Trustiness on Noisy Knowledge Graph
title_sort embedding learning with triple trustiness on noisy knowledge graph
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7514427/
http://dx.doi.org/10.3390/e21111083
work_keys_str_mv AT zhaoyu embeddinglearningwithtripletrustinessonnoisyknowledgegraph
AT fenghuali embeddinglearningwithtripletrustinessonnoisyknowledgegraph
AT gallinaripatrick embeddinglearningwithtripletrustinessonnoisyknowledgegraph
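
For readers who want a concrete picture of the idea summarized in the abstract above, here is a minimal, hypothetical Python sketch of a trustiness-weighted, cross-entropy-style objective over triples. It assumes a TransE-like scoring function and a precomputed trustiness value in [0, 1] for each triple; the names (E, R, score, triple_loss, trustiness) are illustrative assumptions, not the authors' implementation, and the paper's actual trustiness computation from entity type instances and entity descriptions is not reproduced here.

# Illustrative sketch only: a trustiness-weighted, cross-entropy-style
# objective for triple scoring, assuming a TransE-like score
# s(h, r, t) = -||h + r - t||. All names and numbers are placeholders.

import numpy as np

rng = np.random.default_rng(0)
dim, n_entities, n_relations = 50, 1000, 20

E = rng.normal(scale=0.1, size=(n_entities, dim))   # entity embeddings
R = rng.normal(scale=0.1, size=(n_relations, dim))  # relation embeddings

def score(h, r, t):
    """TransE-style plausibility score: higher means more plausible."""
    return -np.linalg.norm(E[h] + R[r] - E[t])

def triple_loss(h, r, t, label, trustiness):
    """Cross-entropy on the sigmoid of the triple score, down-weighted
    by the triple's estimated trustiness (reliability in [0, 1])."""
    p = 1.0 / (1.0 + np.exp(-score(h, r, t)))          # P(triple is true)
    ce = -(label * np.log(p + 1e-12) + (1 - label) * np.log(1 - p + 1e-12))
    return trustiness * ce

# Usage: an observed (possibly noisy) triple with estimated trustiness 0.8,
# and a corrupted negative triple given full weight since it is generated.
print(triple_loss(h=3, r=5, t=42, label=1, trustiness=0.8))
print(triple_loss(h=3, r=5, t=7, label=0, trustiness=1.0))

Weighting the per-triple loss by trustiness means that facts suspected of being noisy contribute less to the gradient, which is one natural way to read "taking possible noise into consideration" in the abstract.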