An Improved BERT and Syntactic Dependency Representation Model for Sentiment Analysis
Text representation of social media posts is an important task for user sentiment analysis. With a better representation, we can more accurately capture the real semantic information expressed by online users. However, existing approaches still do not achieve the best results. In this paper, we construct an...
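The record's abstract is truncated, so the following is only a minimal illustrative sketch, not the authors' actual model: it assumes one common way of combining a BERT sentence representation with syntactic dependency features for sentiment classification. The class name `BertDepSentimentClassifier`, the dependency feature vector, and its dimension are all hypothetical.

```python
# Hypothetical sketch: fuse a BERT [CLS] embedding with a simple syntactic
# dependency feature vector (e.g. counts of dependency relation types) for
# sentiment classification. This is NOT the model described in the paper.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class BertDepSentimentClassifier(nn.Module):
    def __init__(self, dep_feat_dim: int = 45, num_classes: int = 2):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        hidden = self.bert.config.hidden_size  # 768 for bert-base
        # Fusion by concatenation, followed by a small feed-forward classifier.
        self.classifier = nn.Sequential(
            nn.Linear(hidden + dep_feat_dim, 256),
            nn.ReLU(),
            nn.Linear(256, num_classes),
        )

    def forward(self, input_ids, attention_mask, dep_features):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]  # [CLS] token representation
        fused = torch.cat([cls, dep_features], dim=-1)
        return self.classifier(fused)


tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
enc = tokenizer("The service was great!", return_tensors="pt")
dep_features = torch.zeros(1, 45)  # placeholder for dependency-relation features
model = BertDepSentimentClassifier()
logits = model(enc["input_ids"], enc["attention_mask"], dep_features)
print(logits.shape)  # torch.Size([1, 2])
```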
Main authors: Liu, Wenfeng; Yi, Jing; Hu, Zhanliang; Gao, Yaling
Format: Online Article Text
Language: English
Published: Hindawi, 2022
Subjects:
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9098270/ ; https://www.ncbi.nlm.nih.gov/pubmed/35571684 ; http://dx.doi.org/10.1155/2022/5754151
Similar Items
- Arabic Syntactic Diacritics Restoration Using BERT Models
  by: Nazih, Waleed, et al.
  Published: (2022)
- CharAs-CBert: Character Assist Construction-Bert Sentence Representation Improving Sentiment Classification
  by: Chen, Bo, et al.
  Published: (2022)
- Transfer Learning for Sentiment Classification Using Bidirectional Encoder Representations from Transformers (BERT) Model
  by: Areshey, Ali, et al.
  Published: (2023)
- What does Chinese BERT learn about syntactic knowledge?
  by: Zheng, Jianyu, et al.
  Published: (2023)
- A BERT Framework to Sentiment Analysis of Tweets
  by: Bello, Abayomi, et al.
  Published: (2023)