An Entropy Metric for Regular Grammar Classification and Learning with Recurrent Neural Networks
Recently, there has been a resurgence of formal language theory in deep learning research. However, most research has focused on the more practical problem of attempting to represent symbolic knowledge with machine learning. In contrast, there has been limited research on exploring the fundamental connec...
Main Authors: Zhang, Kaixuan; Wang, Qinglong; Giles, C. Lee
Format: Online Article Text
Language: English
Published: MDPI, 2021
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7835824/ https://www.ncbi.nlm.nih.gov/pubmed/33478020 http://dx.doi.org/10.3390/e23010127
Similar Items
- An Entropy Model for Artificial Grammar Learning
  by: Pothos, Emmanuel M.
  Published: (2010)
- Metrical Presentation Boosts Implicit Learning of Artificial Grammar
  by: Selchenkova, Tatiana, et al.
  Published: (2014)
- Comparing feedforward and recurrent neural network architectures with human behavior in artificial grammar learning
  by: Alamia, Andrea, et al.
  Published: (2020)
- Synthesizing Context-free Grammars from Recurrent Neural Networks
  by: Yellin, Daniel M., et al.
  Published: (2021)
- Can Recurrent Neural Networks Validate Usage-Based Theories of Grammar Acquisition?
  by: Pannitto, Ludovica, et al.
  Published: (2022)