An Entropy Metric for Regular Grammar Classification and Learning with Recurrent Neural Networks

Recently, there has been a resurgence of formal language theory in deep learning research. However, most of this research has focused on the practical problem of representing symbolic knowledge with machine learning models; comparatively little has explored the fundamental connection between the two. To better understand the internal structure of regular grammars and their corresponding complexity, we categorize regular grammars using both theoretical analysis and empirical evidence. Specifically, motivated by the concentric ring representation, we relax the original order information and introduce an entropy metric that describes the complexity of a regular grammar. Based on this metric, we categorize regular grammars into three disjoint subclasses: the polynomial, exponential, and proportional classes. In addition, we provide several classification theorems for different representations of regular grammars. We validate our analysis by examining how multiple recurrent neural networks learn these grammars. Our results show that, as expected, more complex grammars are generally more difficult to learn.
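The paper defines its entropy metric through the concentric ring representation; the exact construction is given in the article itself. As a rough, illustrative sketch only, the Python snippet below computes the standard growth-rate entropy of a regular language from a DFA, H = lim (1/n) log2 |L_n|, where |L_n| counts the accepted strings of length n. Polynomial growth of |L_n| drives this quantity to zero while exponential growth keeps it positive, which mirrors the polynomial/exponential distinction in the abstract. The DFA encoding and the helper names (count_accepted, growth_entropy) are assumptions for illustration, not the authors' code.

import math

# Illustrative sketch (assumed encoding, not the authors' implementation).
# A DFA is a dict mapping (state, symbol) -> state; missing entries are
# dead transitions. count_accepted returns |L_n|, the number of length-n
# strings the DFA accepts, via dynamic programming over path counts.

def count_accepted(delta, start, accepting, n_states, length):
    counts = [1 if s == start else 0 for s in range(n_states)]
    symbols = {sym for (_, sym) in delta}
    for _ in range(length):
        nxt = [0] * n_states
        for s, c in enumerate(counts):
            if c:
                for sym in symbols:
                    t = delta.get((s, sym))
                    if t is not None:
                        nxt[t] += c
        counts = nxt
    return sum(counts[s] for s in accepting)

def growth_entropy(delta, start, accepting, n_states, length=60):
    # Finite-length estimate of H = lim (1/n) * log2 |L_n|.
    m = count_accepted(delta, start, accepting, n_states, length)
    return math.log2(m) / length if m > 0 else float("-inf")

# Example: binary strings with no "00" substring. |L_n| follows a
# Fibonacci recurrence, so the estimate approaches log2 of the golden
# ratio (about 0.694) -- an "exponential class" grammar in the sense above.
delta = {(0, "1"): 0, (0, "0"): 1, (1, "1"): 0}
print(growth_entropy(delta, start=0, accepting={0, 1}, n_states=2))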

Bibliographic Details
Main Authors: Zhang, Kaixuan; Wang, Qinglong; Giles, C. Lee
Format: Online Article (Text)
Language: English
Published: MDPI, 2021
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7835824/
https://www.ncbi.nlm.nih.gov/pubmed/33478020
http://dx.doi.org/10.3390/e23010127
collection PubMed
id pubmed-7835824
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
journal Entropy (Basel)
published 2021-01-19
rights © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).