
Equivalent Neural Network Optimal Coefficients Using Forgetting Factor with Sliding Modes


Bibliographic Details
Main Authors: Aguilar Cruz, Karen Alicia; Medel Juárez, José de Jesús; Urbieta Parrazales, Romeo
Format: Online Article Text
Language: English
Published: Hindawi Publishing Corporation, 2016
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5187595/
https://www.ncbi.nlm.nih.gov/pubmed/28058045
http://dx.doi.org/10.1155/2016/4642052
Description
Summary: The Artificial Neural Network (ANN) concept is familiar from methods whose task is, for example, the identification or approximation of the outputs of complex systems that are difficult to model. In general, the objective is to determine online the parameters that give the best point-to-point convergence rate. This paper therefore presents parameter estimation for an equivalent ANN (EANN), obtaining a recursive identification of a stochastic system, first with constant parameters and then under nonstationary output conditions. In the latter case the parameters themselves have stochastic properties, so traditional approximation methods are no longer adequate because they lose their convergence rate. To address this problem, we propose a nonconstant exponential forgetting factor (NCEFF) with sliding modes, which yields an exponentially decreasing convergence rate at almost all points. Theoretical results for both identification stages are simulated in MATLAB® and compared, showing an improvement when the new proposal is applied under nonstationary output conditions.
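For context, the identification scheme described in the abstract belongs to the family of recursive estimators with a forgetting factor. The sketch below is a generic recursive least-squares (RLS) estimator with a constant exponential forgetting factor, written in Python rather than the MATLAB® used in the article; it only illustrates that baseline family, not the paper's NCEFF-with-sliding-modes algorithm, and the scalar first-order signal model, parameter values, and variable names are all assumptions made for the example.

```python
# Minimal sketch: recursive least-squares identification with a constant
# exponential forgetting factor. This is a generic baseline, NOT the paper's
# NCEFF-with-sliding-modes estimator; the scalar model y_k = a*y_{k-1} + noise
# and all numeric values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

a_true = 0.8        # unknown parameter to be identified
n_steps = 200
noise_std = 0.05

a_hat = 0.0         # parameter estimate
P = 1e3             # (scalar) covariance of the estimate
lam = 0.98          # constant exponential forgetting factor, 0 < lam <= 1

y_prev = 0.1
for k in range(n_steps):
    y = a_true * y_prev + noise_std * rng.standard_normal()

    # Standard RLS-with-forgetting update for the scalar regressor phi = y_prev.
    phi = y_prev
    gain = P * phi / (lam + phi * P * phi)
    err = y - a_hat * phi            # a priori identification error
    a_hat = a_hat + gain * err
    P = (P - gain * phi * P) / lam   # discount old data through lam

    y_prev = y

print(f"true a = {a_true:.3f}, estimated a = {a_hat:.3f}")
```

In this baseline, a smaller constant lam discounts old data faster but amplifies noise. The article's proposal, as summarized above, replaces the constant factor with a nonconstant exponential forgetting factor combined with sliding modes for the nonstationary case; those details are given in the article itself.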