Large Scale Subject Category Classification of Scholarly Papers With Deep Attentive Neural Networks

Subject categories of scholarly papers generally refer to the knowledge domain(s) to which the papers belong, examples being computer science or physics. Subject category classification is a prerequisite for bibliometric studies, organizing scientific publications for domain knowledge extraction, and facilitating faceted searches for digital library search engines. Unfortunately, many academic papers do not have such information as part of their metadata. Most existing methods for solving this task focus on unsupervised learning that often relies on citation networks. However, a complete list of papers citing the current paper may not be readily available. In particular, new papers that have few or no citations cannot be classified using such methods. Here, we propose a deep attentive neural network (DANN) that classifies scholarly papers using only their abstracts. The network is trained using nine million abstracts from Web of Science (WoS). We also use the WoS schema that covers 104 subject categories. The proposed network consists of two bi-directional recurrent neural networks followed by an attention layer. We compare our model against baselines by varying the architecture and text representation. Our best model achieves a micro-F1 measure of 0.76, with F1 of individual subject categories ranging from 0.50 to 0.95. The results showed the importance of retraining word embedding models to maximize the vocabulary overlap and the effectiveness of the attention mechanism. The combination of word vectors with TFIDF outperforms character and sentence level embedding models. We discuss imbalanced samples and overlapping categories and suggest possible strategies for mitigation. We also determine the subject category distribution in CiteSeerX by classifying a random sample of one million academic papers.
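The abstract describes the model only at a high level: two bi-directional recurrent layers followed by an attention layer that pools the per-token states into a single vector for classification over 104 categories. A minimal NumPy sketch of that attention-pooling step is shown below; the hidden size, the scoring vector `w`, and the function name are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def attention_pool(H, w, eps=1e-12):
    """Attention pooling over BiRNN token states.

    Scores each token state with a learned vector, softmaxes the
    scores over the sequence, and returns the weighted sum.

    H : (seq_len, hidden) matrix of token states (illustrative)
    w : (hidden,) scoring vector (illustrative; learned in practice)
    """
    scores = H @ w                  # one scalar score per token
    scores -= scores.max()          # shift for numerical stability
    alpha = np.exp(scores)
    alpha /= alpha.sum() + eps      # attention weights, sum to ~1
    pooled = alpha @ H              # (hidden,) document vector
    return pooled, alpha

# Toy example: 4 token states with hidden size 3.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))
w = rng.normal(size=3)
pooled, alpha = attention_pool(H, w)
assert pooled.shape == (3,)
assert np.isclose(alpha.sum(), 1.0)
```

In the full model, the pooled vector would feed a softmax classifier over the 104 WoS subject categories; the sketch covers only the pooling mechanism the abstract names.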


Bibliographic Details
Main Authors: Kandimalla, Bharath; Rohatgi, Shaurya; Wu, Jian; Giles, C. Lee
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2021
Subjects: Research Metrics and Analytics
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8025978/
https://www.ncbi.nlm.nih.gov/pubmed/33870061
http://dx.doi.org/10.3389/frma.2020.600382
Published online 2021-02-10 in Frontiers in Research Metrics and Analytics (Front Res Metr Anal), PMC ID PMC8025978. Copyright © 2021 Kandimalla, Rohatgi, Wu and Giles. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY): https://creativecommons.org/licenses/by/4.0/