
Leveraging Contextual Sentences for Text Classification by Using a Neural Attention Model

Bibliographic Details
Main Authors: Yan, DanFeng, Guo, Shiyao
Format: Online Article Text
Language: English
Published: Hindawi 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6701294/
https://www.ncbi.nlm.nih.gov/pubmed/31467518
http://dx.doi.org/10.1155/2019/8320316
Description
Summary: We explored several approaches to incorporating context information into a deep learning framework for text classification, including designing different attention mechanisms based on different neural networks and extracting additional features from the text with traditional methods to serve as part of the representation. We propose two classification algorithms: one based on a convolutional neural network that fuses context information, and the other based on a bidirectional long short-term memory (BiLSTM) network. We integrate the context information into the final feature representation by designing attention structures at the sentence level and the word level, which increases the diversity of the feature information. Our experimental results on two datasets validate the advantages of the two models in terms of time efficiency and accuracy compared with models built on fundamental attention mechanism (AM) architectures.
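
To make the word-level attention idea concrete, the sketch below shows a BiLSTM encoder with additive word-level attention pooling in PyTorch. It is only an illustration of the general pattern the abstract describes, not the authors' released code; the class name, layer sizes, and hyperparameters are assumptions.

import torch
import torch.nn as nn

class BiLSTMWordAttention(nn.Module):
    """Minimal sketch: BiLSTM encoder + additive word-level attention pooling."""
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=64, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Additive attention: score each hidden state, then pool the sequence
        # with the resulting softmax weights.
        self.attn_proj = nn.Linear(2 * hidden_dim, 2 * hidden_dim)
        self.attn_vector = nn.Linear(2 * hidden_dim, 1, bias=False)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):                                  # (batch, seq_len)
        h, _ = self.bilstm(self.embedding(token_ids))              # (batch, seq_len, 2*hidden)
        scores = self.attn_vector(torch.tanh(self.attn_proj(h)))   # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)                     # attention over words
        context = (weights * h).sum(dim=1)                         # weighted sum -> (batch, 2*hidden)
        return self.classifier(context)

# Example usage with random token ids (vocabulary size is a placeholder).
model = BiLSTMWordAttention(vocab_size=10000)
logits = model(torch.randint(0, 10000, (4, 32)))                   # -> (4, num_classes)

A sentence-level variant follows the same pattern, applying a second attention layer over per-sentence vectors instead of per-word hidden states.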