
Construction and Research on Chinese Semantic Mapping Based on Linguistic Features and Sparse Self-Learning Neural Networks

In this paper, we adopt linguistic feature fusion and a sparse self-learning neural network to conduct an in-depth study and analysis of Chinese semantic mapping, which complements the emotion-semantic representation ability of traditional word embeddings and fully exploits the emotion...


Bibliographic Details
Main Authors: Zhang, Haiping, Chao, Bo, Huang, Zhijing, Li, Tingyu
Format: Online Article Text
Language: English
Published: Hindawi 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9236845/
https://www.ncbi.nlm.nih.gov/pubmed/35769283
http://dx.doi.org/10.1155/2022/2315802
_version_ 1784736632377704448
author Zhang, Haiping
Chao, Bo
Huang, Zhijing
Li, Tingyu
author_facet Zhang, Haiping
Chao, Bo
Huang, Zhijing
Li, Tingyu
author_sort Zhang, Haiping
collection PubMed
description In this paper, we adopt linguistic feature fusion and a sparse self-learning neural network to conduct an in-depth study and analysis of Chinese semantic mapping, which complements the emotion-semantic representation ability of traditional word embeddings and fully exploits the emotion-semantic information contained in the text during the task preprocessing stage. We incorporate multiple semantic features, such as lexical information and position information, to give the model a richer expression of emotion semantics, and the model also uses an attention mechanism that allows the various features to interact and abstract deeper contextual semantic associations, improving the model's sentiment classification performance. Experiments on two publicly available English sentiment classification corpora show that the model outperforms the comparison models and effectively improves sentiment classification performance. The model uses deep memory networks and capsule networks to construct a transfer learning framework, effectively exploiting the transfer properties of capsule networks to move knowledge embedded in large-scale labeled data from similar domains to the target domain and thereby improving classification performance on small data sets. The use of multidimensional combined features compensates for the limitations of a one-dimensional feature attention mechanism, while multiple domain-category-based attention layers are stacked to obtain deeper domain-specific sentiment feature information. Based on a segmented convolutional neural network, the model first introduces the dependency subtree of relational attributes to obtain the position weight of each word in the sentence, then introduces domain ontology knowledge in the output layer to constrain the extraction results, and verifies the validity of the model through experimental comparison on the data set. This ensures the accuracy of network-term entity and relational-attribute recognition and extraction and makes the knowledge map constructed in this paper more factual.
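As a rough illustration of the feature-fusion idea described in the abstract, the sketch below concatenates word, lexical (part-of-speech), and position embeddings and lets a self-attention layer relate them before sentiment classification. This is a minimal sketch assuming PyTorch; the class name, layer sizes, and mean-pooling choice are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumed, not the paper's code): fuse word, POS, and position
# embeddings, let them interact via self-attention, then classify sentiment.
import torch
import torch.nn as nn

class FusedSentimentClassifier(nn.Module):
    def __init__(self, vocab_size, pos_tag_size, max_len,
                 word_dim=200, pos_dim=50, loc_dim=50,
                 num_heads=4, num_classes=2):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)   # word embeddings
        self.pos_emb = nn.Embedding(pos_tag_size, pos_dim)    # lexical (POS) features
        self.loc_emb = nn.Embedding(max_len, loc_dim)         # position features
        fused_dim = word_dim + pos_dim + loc_dim
        self.attn = nn.MultiheadAttention(fused_dim, num_heads, batch_first=True)
        self.classifier = nn.Linear(fused_dim, num_classes)

    def forward(self, word_ids, pos_ids, loc_ids):
        # Concatenate the three feature views for every token.
        x = torch.cat([self.word_emb(word_ids),
                       self.pos_emb(pos_ids),
                       self.loc_emb(loc_ids)], dim=-1)
        # Self-attention lets the fused features interact across the sentence.
        attended, _ = self.attn(x, x, x)
        # Mean-pool over tokens and classify sentiment.
        return self.classifier(attended.mean(dim=1))

# Toy usage: a batch of 2 sentences, 10 tokens each.
model = FusedSentimentClassifier(vocab_size=5000, pos_tag_size=40, max_len=10)
words = torch.randint(0, 5000, (2, 10))
tags = torch.randint(0, 40, (2, 10))
locs = torch.arange(10).expand(2, 10)
logits = model(words, tags, locs)   # shape: (2, 2)
```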
format Online
Article
Text
id pubmed-9236845
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Hindawi
record_format MEDLINE/PubMed
spelling pubmed-9236845 2022-06-28 Construction and Research on Chinese Semantic Mapping Based on Linguistic Features and Sparse Self-Learning Neural Networks Zhang, Haiping Chao, Bo Huang, Zhijing Li, Tingyu Comput Intell Neurosci Research Article Hindawi 2022-06-20 /pmc/articles/PMC9236845/ /pubmed/35769283 http://dx.doi.org/10.1155/2022/2315802 Text en Copyright © 2022 Haiping Zhang et al. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Research Article
Zhang, Haiping
Chao, Bo
Huang, Zhijing
Li, Tingyu
Construction and Research on Chinese Semantic Mapping Based on Linguistic Features and Sparse Self-Learning Neural Networks
title Construction and Research on Chinese Semantic Mapping Based on Linguistic Features and Sparse Self-Learning Neural Networks
title_full Construction and Research on Chinese Semantic Mapping Based on Linguistic Features and Sparse Self-Learning Neural Networks
title_fullStr Construction and Research on Chinese Semantic Mapping Based on Linguistic Features and Sparse Self-Learning Neural Networks
title_full_unstemmed Construction and Research on Chinese Semantic Mapping Based on Linguistic Features and Sparse Self-Learning Neural Networks
title_short Construction and Research on Chinese Semantic Mapping Based on Linguistic Features and Sparse Self-Learning Neural Networks
title_sort construction and research on chinese semantic mapping based on linguistic features and sparse self-learning neural networks
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9236845/
https://www.ncbi.nlm.nih.gov/pubmed/35769283
http://dx.doi.org/10.1155/2022/2315802
work_keys_str_mv AT zhanghaiping constructionandresearchonchinesesemanticmappingbasedonlinguisticfeaturesandsparseselflearningneuralnetworks
AT chaobo constructionandresearchonchinesesemanticmappingbasedonlinguisticfeaturesandsparseselflearningneuralnetworks
AT huangzhijing constructionandresearchonchinesesemanticmappingbasedonlinguisticfeaturesandsparseselflearningneuralnetworks
AT litingyu constructionandresearchonchinesesemanticmappingbasedonlinguisticfeaturesandsparseselflearningneuralnetworks