
Attention enhanced capsule network for text classification by encoding syntactic dependency trees with graph convolutional neural network

Text classification is a fundamental task in many applications such as topic labeling, sentiment analysis, and spam detection. The syntactic relationships and word sequence of a text are important and useful for text classification; how to model and incorporate them to improve performance is one key challenge.


Bibliographic Details
Main Authors: Jia, Xudong, Wang, Li
Format: Online Article Text
Language: English
Published: PeerJ Inc. 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8771801/
https://www.ncbi.nlm.nih.gov/pubmed/35111918
http://dx.doi.org/10.7717/peerj-cs.831
_version_ 1784635694256226304
author Jia, Xudong
Wang, Li
author_facet Jia, Xudong
Wang, Li
author_sort Jia, Xudong
collection PubMed
description Text classification is a fundamental task in many applications such as topic labeling, sentiment analysis, and spam detection. The syntactic relationships and word sequence of a text are important and useful for text classification; how to model and incorporate them to improve performance is one key challenge. Inspired by human behavior in understanding text, in this paper we combine syntactic relationships, sequence structure, and semantics for text representation and propose an attention-enhanced capsule-network-based text classification model. Specifically, we use a graph convolutional neural network to encode syntactic dependency trees, build multi-head attention to encode dependency relationships in the text sequence, and finally merge both with semantic information through a capsule network. Extensive experiments on five datasets demonstrate that our approach effectively improves text classification performance compared with state-of-the-art methods. The results also show that the capsule network, graph convolutional neural network, and multi-head attention complement one another when integrated for text classification tasks.
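The combination the abstract outlines (syntax encoded by a GCN over the dependency tree, sequence dependencies encoded by multi-head attention, and a capsule network merging both views with the semantics) can be sketched roughly as below. This is a minimal illustrative sketch, not the authors' released code: the layer sizes, the mean-pooling of each view into two "primary capsules", the three routing iterations, and all class and parameter names are assumptions made for the example.

# Minimal sketch (assumed, not the paper's implementation) of a GCN + multi-head
# attention + capsule-network text classifier, in PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphConv(nn.Module):
    """One GCN layer: aggregates each token's neighbors in the dependency tree."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):                # x: (B, T, in_dim), adj: (B, T, T)
        deg = adj.sum(-1, keepdim=True).clamp(min=1)   # simple degree normalization
        return F.relu(self.linear(adj @ x) / deg)

def squash(v, dim=-1):
    """Capsule squashing non-linearity: keeps direction, bounds length in (0, 1)."""
    norm2 = (v ** 2).sum(dim, keepdim=True)
    return (norm2 / (1 + norm2)) * v / (norm2.sqrt() + 1e-8)

class CapsuleLayer(nn.Module):
    """Class capsules with dynamic routing-by-agreement over input capsules."""
    def __init__(self, in_caps, in_dim, out_caps, out_dim, iters=3):
        super().__init__()
        self.iters = iters
        self.W = nn.Parameter(0.01 * torch.randn(out_caps, in_caps, out_dim, in_dim))

    def forward(self, u):                     # u: (B, in_caps, in_dim)
        u_hat = torch.einsum('oidj,bij->boid', self.W, u)   # prediction vectors
        b = torch.zeros(u_hat.shape[:3], device=u.device)   # routing logits
        for _ in range(self.iters):
            c = b.softmax(dim=1)                              # coupling coefficients
            v = squash((c.unsqueeze(-1) * u_hat).sum(2))      # (B, out_caps, out_dim)
            b = b + (u_hat * v.unsqueeze(2)).sum(-1)          # agreement update
        return v

class AttCapsGCNClassifier(nn.Module):
    """Hypothetical end-to-end model: syntactic view + sequence view -> capsules."""
    def __init__(self, vocab, emb=128, heads=4, n_classes=5, caps_dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.gcn = GraphConv(emb, emb)                        # dependency-tree view
        self.attn = nn.MultiheadAttention(emb, heads, batch_first=True)  # sequence view
        self.caps = CapsuleLayer(in_caps=2, in_dim=emb, out_caps=n_classes, out_dim=caps_dim)

    def forward(self, tokens, adj):           # tokens: (B, T) ids, adj: (B, T, T) dependency arcs
        x = self.embed(tokens)
        syn = self.gcn(x, adj).mean(1)                        # pooled syntactic representation
        seq, _ = self.attn(x, x, x)
        seq = seq.mean(1)                                     # pooled sequence representation
        caps_in = torch.stack([syn, seq], dim=1)              # two "primary capsules"
        class_caps = self.caps(caps_in)                       # (B, n_classes, caps_dim)
        return class_caps.norm(dim=-1)                        # capsule length ~ class score

# Usage (shapes only): scores = AttCapsGCNClassifier(vocab=10000)(token_ids, dependency_adjacency)

How the two views are fused is a design choice; this sketch simply treats the pooled syntactic and sequence representations as two input capsules and lets routing-by-agreement weight them per class.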
format Online
Article
Text
id pubmed-8771801
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher PeerJ Inc.
record_format MEDLINE/PubMed
spelling pubmed-8771801 2022-02-01 Attention enhanced capsule network for text classification by encoding syntactic dependency trees with graph convolutional neural network Jia, Xudong Wang, Li PeerJ Comput Sci Artificial Intelligence Text classification is a fundamental task in many applications such as topic labeling, sentiment analysis, and spam detection. The syntactic relationships and word sequence of a text are important and useful for text classification; how to model and incorporate them to improve performance is one key challenge. Inspired by human behavior in understanding text, in this paper we combine syntactic relationships, sequence structure, and semantics for text representation and propose an attention-enhanced capsule-network-based text classification model. Specifically, we use a graph convolutional neural network to encode syntactic dependency trees, build multi-head attention to encode dependency relationships in the text sequence, and finally merge both with semantic information through a capsule network. Extensive experiments on five datasets demonstrate that our approach effectively improves text classification performance compared with state-of-the-art methods. The results also show that the capsule network, graph convolutional neural network, and multi-head attention complement one another when integrated for text classification tasks. PeerJ Inc. 2022-01-05 /pmc/articles/PMC8771801/ /pubmed/35111918 http://dx.doi.org/10.7717/peerj-cs.831 Text en © 2022 Jia and Wang https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose provided that it is properly attributed. For attribution, the original author(s), title, publication source (PeerJ Computer Science) and either DOI or URL of the article must be cited.
spellingShingle Artificial Intelligence
Jia, Xudong
Wang, Li
Attention enhanced capsule network for text classification by encoding syntactic dependency trees with graph convolutional neural network
title Attention enhanced capsule network for text classification by encoding syntactic dependency trees with graph convolutional neural network
title_full Attention enhanced capsule network for text classification by encoding syntactic dependency trees with graph convolutional neural network
title_fullStr Attention enhanced capsule network for text classification by encoding syntactic dependency trees with graph convolutional neural network
title_full_unstemmed Attention enhanced capsule network for text classification by encoding syntactic dependency trees with graph convolutional neural network
title_short Attention enhanced capsule network for text classification by encoding syntactic dependency trees with graph convolutional neural network
title_sort attention enhanced capsule network for text classification by encoding syntactic dependency trees with graph convolutional neural network
topic Artificial Intelligence
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8771801/
https://www.ncbi.nlm.nih.gov/pubmed/35111918
http://dx.doi.org/10.7717/peerj-cs.831
work_keys_str_mv AT jiaxudong attentionenhancedcapsulenetworkfortextclassificationbyencodingsyntacticdependencytreeswithgraphconvolutionalneuralnetwork
AT wangli attentionenhancedcapsulenetworkfortextclassificationbyencodingsyntacticdependencytreeswithgraphconvolutionalneuralnetwork