
A BERT-Based Aspect-Level Sentiment Analysis Algorithm for Cross-Domain Text


Bibliographic Details
Main Authors: Liu, Ning, Zhao, Jianhua
Format: Online Article Text
Language: English
Published: Hindawi 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9252649/
https://www.ncbi.nlm.nih.gov/pubmed/35795761
http://dx.doi.org/10.1155/2022/8726621
_version_ 1784740312717983744
author Liu, Ning
Zhao, Jianhua
author_facet Liu, Ning
Zhao, Jianhua
author_sort Liu, Ning
collection PubMed
description Cross-domain text sentiment analysis is a sentiment classification task that uses existing annotated data from a source domain to assist a target domain; it not only reduces the annotation workload for new domains but also significantly improves the utilization of source-domain annotation resources. To improve the performance of cross-domain text sentiment classification, this paper proposes a BERT-based aspect-level sentiment analysis algorithm for cross-domain text that performs fine-grained sentiment analysis. First, the algorithm combines the aspect-level and sentence-level corpora into sequence sentence pairs, uses the BERT encoder to extract sentence-level and aspect-level representation vectors, and extracts local features with an improved convolutional neural network. Then, a domain-adversarial neural network makes the feature representations extracted from different domains as indistinguishable as possible, so that features from the source and target domains become more similar. Finally, a sentiment classifier is trained on the sentiment-labeled source-domain dataset, with the aim that it classifies well in both the source and target domains, at both the sentence and aspect levels. The combined losses of the sentiment classifier and the domain adversary are backpropagated to update and optimize the model parameters, yielding a model with cross-domain analysis capability. Experiments on the Amazon product review dataset, using accuracy and F1 score as evaluation metrics, show that the proposed algorithm outperforms other classical algorithms.
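The abstract above describes a pipeline of BERT sentence/aspect pair encoding, CNN-based local-feature extraction, domain-adversarial training, and a source-domain sentiment loss. The following Python sketch (assuming PyTorch and the Hugging Face transformers library) illustrates how such a pipeline can be wired together; layer sizes, filter widths, the lambda value, and the example labels are illustrative placeholders, not the authors' reported configuration.

# Minimal sketch of the described pipeline, assuming PyTorch and Hugging Face
# transformers. Hyperparameters are illustrative, not the paper's settings.
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import BertModel, BertTokenizer


class GradientReversal(torch.autograd.Function):
    """Identity on the forward pass; flips the gradient sign (scaled by lambda)
    on the backward pass -- the standard domain-adversarial (DANN) trick for
    making source and target features indistinguishable to the domain head."""

    @staticmethod
    def forward(ctx, x, lambda_):
        ctx.lambda_ = lambda_
        return x.clone()

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambda_ * grad_output, None


class CrossDomainABSA(nn.Module):
    def __init__(self, bert_name="bert-base-uncased",
                 num_classes=3, filter_sizes=(2, 3, 4), num_filters=100):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size
        # Stand-in for the "improved CNN": parallel 1-D convolutions over
        # the BERT token states, followed by max-over-time pooling.
        self.convs = nn.ModuleList(
            [nn.Conv1d(hidden, num_filters, k) for k in filter_sizes])
        feat_dim = num_filters * len(filter_sizes)
        self.sentiment_head = nn.Linear(feat_dim, num_classes)
        self.domain_head = nn.Linear(feat_dim, 2)  # source vs. target

    def features(self, input_ids, attention_mask, token_type_ids):
        # BERT jointly encodes the sentence/aspect sequence pair.
        out = self.bert(input_ids=input_ids,
                        attention_mask=attention_mask,
                        token_type_ids=token_type_ids)
        h = out.last_hidden_state.transpose(1, 2)      # (B, hidden, seq_len)
        pooled = [F.relu(conv(h)).max(dim=2).values    # max-over-time pooling
                  for conv in self.convs]
        return torch.cat(pooled, dim=1)                # (B, feat_dim)

    def forward(self, input_ids, attention_mask, token_type_ids, lambda_=1.0):
        feats = self.features(input_ids, attention_mask, token_type_ids)
        sentiment_logits = self.sentiment_head(feats)
        # The domain branch sees gradient-reversed features (adversarial signal).
        domain_logits = self.domain_head(GradientReversal.apply(feats, lambda_))
        return sentiment_logits, domain_logits


# Example training step: the sentiment loss uses labelled source data only,
# while the domain loss would use batches from both source and target domains.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = CrossDomainABSA()
enc = tokenizer(["The battery lasts all day."], ["battery"],
                return_tensors="pt", padding=True, truncation=True)
sent_logits, dom_logits = model(**enc)
loss = (F.cross_entropy(sent_logits, torch.tensor([2]))      # positive sentiment
        + F.cross_entropy(dom_logits, torch.tensor([0])))    # source domain
loss.backward()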
format Online
Article
Text
id pubmed-9252649
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Hindawi
record_format MEDLINE/PubMed
spelling pubmed-9252649 2022-07-05 A BERT-Based Aspect-Level Sentiment Analysis Algorithm for Cross-Domain Text Liu, Ning Zhao, Jianhua Comput Intell Neurosci Research Article Hindawi 2022-06-27 /pmc/articles/PMC9252649/ /pubmed/35795761 http://dx.doi.org/10.1155/2022/8726621 Text en Copyright © 2022 Ning Liu and Jianhua Zhao. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Research Article
Liu, Ning
Zhao, Jianhua
A BERT-Based Aspect-Level Sentiment Analysis Algorithm for Cross-Domain Text
title A BERT-Based Aspect-Level Sentiment Analysis Algorithm for Cross-Domain Text
title_full A BERT-Based Aspect-Level Sentiment Analysis Algorithm for Cross-Domain Text
title_fullStr A BERT-Based Aspect-Level Sentiment Analysis Algorithm for Cross-Domain Text
title_full_unstemmed A BERT-Based Aspect-Level Sentiment Analysis Algorithm for Cross-Domain Text
title_short A BERT-Based Aspect-Level Sentiment Analysis Algorithm for Cross-Domain Text
title_sort bert-based aspect-level sentiment analysis algorithm for cross-domain text
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9252649/
https://www.ncbi.nlm.nih.gov/pubmed/35795761
http://dx.doi.org/10.1155/2022/8726621
work_keys_str_mv AT liuning abertbasedaspectlevelsentimentanalysisalgorithmforcrossdomaintext
AT zhaojianhua abertbasedaspectlevelsentimentanalysisalgorithmforcrossdomaintext
AT liuning bertbasedaspectlevelsentimentanalysisalgorithmforcrossdomaintext
AT zhaojianhua bertbasedaspectlevelsentimentanalysisalgorithmforcrossdomaintext