Interactive Dual Attention Network for Text Sentiment Classification


Bibliographic Details
Main Authors: Zhu, Yinglin; Zheng, Wenbin; Tang, Hong
Format: Online Article Text
Language: English
Published: Hindawi 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7657682/
https://www.ncbi.nlm.nih.gov/pubmed/33204245
http://dx.doi.org/10.1155/2020/8858717
_version_ 1783608546880913408
author Zhu, Yinglin
Zheng, Wenbin
Tang, Hong
author_facet Zhu, Yinglin
Zheng, Wenbin
Tang, Hong
author_sort Zhu, Yinglin
collection PubMed
description Text sentiment classification is an essential research field of natural language processing. Recently, numerous deep learning-based methods for sentiment classification have been proposed and achieved better performances compared with conventional machine learning methods. However, most of the proposed methods ignore the interactive relationship between contextual semantics and sentimental tendency while modeling their text representation. In this paper, we propose a novel Interactive Dual Attention Network (IDAN) model that aims to interactively learn the representation between contextual semantics and sentimental tendency information. Firstly, we design an algorithm that utilizes linguistic resources to obtain sentimental tendency information from text and then extract word embeddings from the BERT (Bidirectional Encoder Representations from Transformers) pretraining model as the embedding layer of IDAN. Next, we use two Bidirectional LSTM (BiLSTM) networks to learn the long-range dependencies of contextual semantics and sentimental tendency information, respectively. Finally, two types of attention mechanisms are implemented in IDAN. One is multihead attention, which is the next layer of BiLSTM and is used to learn the interactive relationship between contextual semantics and sentimental tendency information. The other is global attention that aims to make the model focus on the important parts of the sequence and generate the final representation for classification. These two attention mechanisms enable IDAN to interactively learn the relationship between semantics and sentimental tendency information and improve the classification performance. A large number of experiments on four benchmark datasets show that our IDAN model is superior to competitive methods. Moreover, both the result analysis and the attention weight visualization further demonstrate the effectiveness of our proposed method.
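The abstract describes a multihead attention layer in which the BiLSTM hidden states for contextual semantics and for sentimental tendency attend to one another. The following is an illustrative sketch only, not the authors' implementation: a generic scaled dot-product multihead cross-attention in NumPy, where one sequence supplies the queries and the other supplies the keys and values. The projection matrices are random stand-ins for learned parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multihead_cross_attention(ctx, sent, num_heads=4, seed=0):
    """Scaled dot-product multihead attention where the context
    sequence attends to the sentiment sequence. Running it again
    with the arguments swapped gives the other direction of the
    interaction.

    ctx:  (len_c, d_model) context-semantics hidden states
    sent: (len_s, d_model) sentiment-tendency hidden states
    """
    len_c, d_model = ctx.shape
    assert d_model % num_heads == 0
    d_k = d_model // num_heads
    rng = np.random.default_rng(seed)
    # Random projections stand in for learned weight matrices.
    Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                      for _ in range(4))
    Q, K, V = ctx @ Wq, sent @ Wk, sent @ Wv

    def split_heads(x):
        # (len, d_model) -> (num_heads, len, d_k)
        return x.reshape(x.shape[0], num_heads, d_k).transpose(1, 0, 2)

    Qh, Kh, Vh = split_heads(Q), split_heads(K), split_heads(V)
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_k)  # (h, len_c, len_s)
    weights = softmax(scores)                           # rows sum to 1
    heads = weights @ Vh                                # (h, len_c, d_k)
    # Concatenate heads and apply the output projection.
    out = heads.transpose(1, 0, 2).reshape(len_c, d_model) @ Wo
    return out, weights
```

The attention weights returned here are what the paper's visualization would inspect: each row shows how strongly one context position attends to each sentiment position.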
format Online
Article
Text
id pubmed-7657682
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Hindawi
record_format MEDLINE/PubMed
spelling pubmed-76576822020-11-16 Interactive Dual Attention Network for Text Sentiment Classification Zhu, Yinglin Zheng, Wenbin Tang, Hong Comput Intell Neurosci Research Article Hindawi 2020-11-03 /pmc/articles/PMC7657682/ /pubmed/33204245 http://dx.doi.org/10.1155/2020/8858717 Text en Copyright © 2020 Yinglin Zhu et al. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Research Article
Zhu, Yinglin
Zheng, Wenbin
Tang, Hong
Interactive Dual Attention Network for Text Sentiment Classification
title Interactive Dual Attention Network for Text Sentiment Classification
title_full Interactive Dual Attention Network for Text Sentiment Classification
title_fullStr Interactive Dual Attention Network for Text Sentiment Classification
title_full_unstemmed Interactive Dual Attention Network for Text Sentiment Classification
title_short Interactive Dual Attention Network for Text Sentiment Classification
title_sort interactive dual attention network for text sentiment classification
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7657682/
https://www.ncbi.nlm.nih.gov/pubmed/33204245
http://dx.doi.org/10.1155/2020/8858717
work_keys_str_mv AT zhuyinglin interactivedualattentionnetworkfortextsentimentclassification
AT zhengwenbin interactivedualattentionnetworkfortextsentimentclassification
AT tanghong interactivedualattentionnetworkfortextsentimentclassification