
Fractional-Order Deep Backpropagation Neural Network

Bibliographic Details
Main Authors: Bao, Chunhui; Pu, Yifei; Zhang, Yi
Format: Online Article Text
Language: English
Published: Hindawi 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6051328/
https://www.ncbi.nlm.nih.gov/pubmed/30065757
http://dx.doi.org/10.1155/2018/7361628
Description
Summary: In recent years, research on artificial neural networks based on fractional calculus has attracted much attention. In this paper, we propose a fractional-order deep backpropagation (BP) neural network model with L2 regularization. The proposed network is optimized by the fractional gradient descent method with the Caputo derivative. We also establish necessary conditions for the convergence of the proposed network, and we analyze the influence of L2 regularization on convergence using the fractional-order variational method. Experiments on the MNIST dataset demonstrate that the proposed network is deterministically convergent and can effectively avoid overfitting.
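
The summary refers to training with a fractional gradient descent method based on the Caputo derivative together with an L2 penalty. The Python sketch below illustrates the general idea only; the fractional order alpha, the use of the previous iterate as the lower terminal of the Caputo derivative, and the first-order truncation of its series expansion are illustrative assumptions, not details taken from the paper, and a single linear layer stands in for a deep BP network.

```python
# Minimal sketch of fractional-order gradient descent with L2 regularization.
# Assumed update rule (first-order truncation of the Caputo derivative, lower
# terminal at the previous iterate):
#     w <- w - lr * (dL/dw + lam * w) * |w - w_prev|**(1 - alpha) / Gamma(2 - alpha)
import numpy as np
from math import gamma

rng = np.random.default_rng(0)

# Toy regression data: learn y = x @ w_true with one linear layer.
x = rng.normal(size=(256, 10))
w_true = rng.normal(size=(10, 1))
y = x @ w_true

alpha = 0.9          # fractional order, 0 < alpha < 1 (hypothetical value)
lr = 0.05            # learning rate
lam = 1e-3           # L2 regularization coefficient
w = rng.normal(size=(10, 1)) * 0.1
w_prev = np.zeros_like(w)   # assumed lower terminal of the Caputo derivative
eps = 1e-12                 # keeps the step nonzero when w == w_prev

for step in range(200):
    err = x @ w - y
    grad = x.T @ err / len(x) + lam * w        # gradient of half-MSE plus L2 penalty
    # First-order Caputo approximation rescales the integer-order gradient.
    caputo_scale = np.abs(w - w_prev) ** (1.0 - alpha) / gamma(2.0 - alpha)
    w_new = w - lr * grad * (caputo_scale + eps)
    w_prev, w = w, w_new

print("final MSE:", float(np.mean((x @ w - y) ** 2)))
```

When alpha is set to 1, the Caputo scale factor reduces to 1 and the update falls back to ordinary gradient descent with weight decay, which is one way to sanity-check such a sketch.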