VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification

Bibliographic Details
Main Authors: Lu, Zhibin, Du, Pan, Nie, Jian-Yun
Format: Online Article Text
Language: English
Published: 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7148240/
http://dx.doi.org/10.1007/978-3-030-45439-5_25
_version_ 1783520551207174144
author Lu, Zhibin
Du, Pan
Nie, Jian-Yun
author_facet Lu, Zhibin
Du, Pan
Nie, Jian-Yun
author_sort Lu, Zhibin
collection PubMed
description Much progress has been made recently on text classification with methods based on neural networks. In particular, models using attention mechanisms, such as BERT, have been shown to capture the contextual information within a sentence or document. However, their ability to capture global information about the vocabulary of a language is more limited. The latter is the strength of Graph Convolutional Networks (GCN). In this paper, we propose the VGCN-BERT model, which combines the capability of BERT with a Vocabulary Graph Convolutional Network (VGCN). Local and global information interact through different layers of BERT, allowing them to influence each other and to jointly build a final representation for classification. In our experiments on several text classification datasets, our approach outperforms BERT and GCN alone, and achieves higher effectiveness than that reported in previous studies.
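To make the idea in the description concrete, below is a minimal, self-contained sketch of the kind of architecture it outlines: embeddings produced by one graph-convolution step over a fixed vocabulary graph are combined, per token position, with ordinary token embeddings, and both are processed jointly by self-attention layers so that local (contextual) and global (vocabulary-level) information can interact. This is not the authors' implementation: the class names (VocabularyGCN, ToyVGCNBert), the dimensions, the per-position concatenation, and the use of a generic nn.TransformerEncoder as a stand-in for BERT's encoder are all illustrative assumptions.

import torch
import torch.nn as nn


class VocabularyGCN(nn.Module):
    # One graph-convolution step over a fixed, row-normalized vocabulary graph A (|V| x |V|),
    # e.g. built from word co-occurrence statistics over the training corpus (an assumption here).
    def __init__(self, adjacency: torch.Tensor, embed_dim: int, graph_dim: int):
        super().__init__()
        self.register_buffer("adjacency", adjacency)
        self.project = nn.Linear(embed_dim, graph_dim)

    def forward(self, token_ids: torch.Tensor, embedding: nn.Embedding) -> torch.Tensor:
        # adjacency[token_ids]: (batch, seq, |V|) -- each token's row of the vocabulary graph.
        # Multiplying by the embedding table mixes in the embeddings of graph neighbours,
        # which is where the "global" vocabulary-level information comes from.
        neighbour_mix = self.adjacency[token_ids] @ embedding.weight  # (batch, seq, embed_dim)
        return torch.relu(self.project(neighbour_mix))                # (batch, seq, graph_dim)


class ToyVGCNBert(nn.Module):
    # Token (local) embeddings and vocabulary-graph (global) embeddings are concatenated per
    # position and fed through self-attention layers, then mean-pooled for classification.
    def __init__(self, adjacency: torch.Tensor, vocab_size: int, embed_dim: int = 256,
                 graph_dim: int = 64, num_classes: int = 2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.vgcn = VocabularyGCN(adjacency, embed_dim, graph_dim)
        d_model = embed_dim + graph_dim
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)  # stand-in for BERT's encoder
        self.classifier = nn.Linear(d_model, num_classes)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        local_emb = self.embedding(token_ids)               # local word information
        global_emb = self.vgcn(token_ids, self.embedding)   # global vocabulary-graph information
        hidden = self.encoder(torch.cat([local_emb, global_emb], dim=-1))
        return self.classifier(hidden.mean(dim=1))


# Usage with a random, symmetric, row-normalized vocabulary graph (purely illustrative):
vocab_size = 1000
graph = torch.rand(vocab_size, vocab_size)
graph = (graph + graph.T) / 2
graph = graph / graph.sum(dim=1, keepdim=True)
model = ToyVGCNBert(graph, vocab_size)
logits = model(torch.randint(0, vocab_size, (4, 32)))  # (batch=4, seq_len=32) -> (4, 2)

In the actual VGCN-BERT model the graph embedding is fed into a pre-trained BERT rather than a small Transformer trained from scratch; the sketch only illustrates how the local and global signals can meet inside self-attention layers.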
format Online
Article
Text
id pubmed-7148240
institution National Center for Biotechnology Information
language English
publishDate 2020
record_format MEDLINE/PubMed
spelling pubmed-7148240 2020-04-13 VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification. Lu, Zhibin; Du, Pan; Nie, Jian-Yun. Advances in Information Retrieval (Article). Much progress has been made recently on text classification with methods based on neural networks. In particular, models using attention mechanisms, such as BERT, have been shown to capture the contextual information within a sentence or document. However, their ability to capture global information about the vocabulary of a language is more limited. The latter is the strength of Graph Convolutional Networks (GCN). In this paper, we propose the VGCN-BERT model, which combines the capability of BERT with a Vocabulary Graph Convolutional Network (VGCN). Local and global information interact through different layers of BERT, allowing them to influence each other and to jointly build a final representation for classification. In our experiments on several text classification datasets, our approach outperforms BERT and GCN alone, and achieves higher effectiveness than that reported in previous studies. 2020-03-17 /pmc/articles/PMC7148240/ http://dx.doi.org/10.1007/978-3-030-45439-5_25 Text en © Springer Nature Switzerland AG 2020. This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic.
spellingShingle Article
Lu, Zhibin
Du, Pan
Nie, Jian-Yun
VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification
title VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification
title_full VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification
title_fullStr VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification
title_full_unstemmed VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification
title_short VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification
title_sort vgcn-bert: augmenting bert with graph embedding for text classification
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7148240/
http://dx.doi.org/10.1007/978-3-030-45439-5_25
work_keys_str_mv AT luzhibin vgcnbertaugmentingbertwithgraphembeddingfortextclassification
AT dupan vgcnbertaugmentingbertwithgraphembeddingfortextclassification
AT niejianyun vgcnbertaugmentingbertwithgraphembeddingfortextclassification