Equivalent Neural Network Optimal Coefficients Using Forgetting Factor with Sliding Modes


Bibliographic Details
Main Authors: Aguilar Cruz, Karen Alicia, Medel Juárez, José de Jesús, Urbieta Parrazales, Romeo
Format: Online Article Text
Language: English
Published: Hindawi Publishing Corporation 2016
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5187595/
https://www.ncbi.nlm.nih.gov/pubmed/28058045
http://dx.doi.org/10.1155/2016/4642052
_version_ 1782486874978254848
author Aguilar Cruz, Karen Alicia
Medel Juárez, José de Jesús
Urbieta Parrazales, Romeo
author_facet Aguilar Cruz, Karen Alicia
Medel Juárez, José de Jesús
Urbieta Parrazales, Romeo
author_sort Aguilar Cruz, Karen Alicia
collection PubMed
description The Artificial Neural Network (ANN) concept is familiar from methods whose task is, for example, the identification or approximation of the outputs of complex systems that are difficult to model. In general, the objective is to determine online the parameters that achieve a better point-to-point convergence rate. This paper therefore presents the parameter estimation for an equivalent ANN (EANN), obtaining a recursive identification for a stochastic system, first with constant parameters and then with nonstationary output conditions. In the latter case the parameters themselves have stochastic properties, so traditional approximation methods are not adequate because they lose their convergence rate. To solve this problem, we propose a nonconstant exponential forgetting factor (NCEFF) with sliding modes, obtaining an exponential convergence rate at almost all points. Theoretical results of both identification stages are obtained using MATLAB® and compared, showing an improvement when the new proposal for nonstationary output conditions is applied. (An illustrative sketch of this kind of recursion appears after the record below.)
format Online
Article
Text
id pubmed-5187595
institution National Center for Biotechnology Information
language English
publishDate 2016
publisher Hindawi Publishing Corporation
record_format MEDLINE/PubMed
spelling pubmed-5187595 2017-01-05 Equivalent Neural Network Optimal Coefficients Using Forgetting Factor with Sliding Modes Aguilar Cruz, Karen Alicia Medel Juárez, José de Jesús Urbieta Parrazales, Romeo Comput Intell Neurosci Research Article The Artificial Neural Network (ANN) concept is familiar from methods whose task is, for example, the identification or approximation of the outputs of complex systems that are difficult to model. In general, the objective is to determine online the parameters that achieve a better point-to-point convergence rate. This paper therefore presents the parameter estimation for an equivalent ANN (EANN), obtaining a recursive identification for a stochastic system, first with constant parameters and then with nonstationary output conditions. In the latter case the parameters themselves have stochastic properties, so traditional approximation methods are not adequate because they lose their convergence rate. To solve this problem, we propose a nonconstant exponential forgetting factor (NCEFF) with sliding modes, obtaining an exponential convergence rate at almost all points. Theoretical results of both identification stages are obtained using MATLAB® and compared, showing an improvement when the new proposal for nonstationary output conditions is applied. Hindawi Publishing Corporation 2016 2016-12-13 /pmc/articles/PMC5187595/ /pubmed/28058045 http://dx.doi.org/10.1155/2016/4642052 Text en Copyright © 2016 Karen Alicia Aguilar Cruz et al. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Research Article
Aguilar Cruz, Karen Alicia
Medel Juárez, José de Jesús
Urbieta Parrazales, Romeo
Equivalent Neural Network Optimal Coefficients Using Forgetting Factor with Sliding Modes
title Equivalent Neural Network Optimal Coefficients Using Forgetting Factor with Sliding Modes
title_full Equivalent Neural Network Optimal Coefficients Using Forgetting Factor with Sliding Modes
title_fullStr Equivalent Neural Network Optimal Coefficients Using Forgetting Factor with Sliding Modes
title_full_unstemmed Equivalent Neural Network Optimal Coefficients Using Forgetting Factor with Sliding Modes
title_short Equivalent Neural Network Optimal Coefficients Using Forgetting Factor with Sliding Modes
title_sort equivalent neural network optimal coefficients using forgetting factor with sliding modes
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5187595/
https://www.ncbi.nlm.nih.gov/pubmed/28058045
http://dx.doi.org/10.1155/2016/4642052
work_keys_str_mv AT aguilarcruzkarenalicia equivalentneuralnetworkoptimalcoefficientsusingforgettingfactorwithslidingmodes
AT medeljuarezjosedejesus equivalentneuralnetworkoptimalcoefficientsusingforgettingfactorwithslidingmodes
AT urbietaparrazalesromeo equivalentneuralnetworkoptimalcoefficientsusingforgettingfactorwithslidingmodes
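
The abstract in the record above describes recursive parameter identification whose adaptation is governed by an exponential forgetting factor. As a point of reference only, the sketch below shows the standard recursive least-squares (RLS) recursion with a fixed exponential forgetting factor, written in Python for illustration (the paper itself reports MATLAB® results). The function name rls_identify, the regressor/output layout, and the fixed forgetting_factor value are assumptions made for this example; the paper's NCEFF with sliding modes would replace the fixed factor with a time-varying rule that is not reproduced here.

```python
# Minimal sketch (not the paper's implementation): recursive least-squares
# identification with a fixed exponential forgetting factor.
import numpy as np


def rls_identify(inputs, outputs, forgetting_factor=0.98, delta=1e3):
    """Recursively estimate theta in y_k ~ x_k^T theta.

    inputs  : (N, n) array of regressor vectors x_k
    outputs : (N,)   array of measured outputs y_k
    forgetting_factor : fixed value in (0, 1]; the paper's NCEFF with
                        sliding modes would make this time-varying.
    """
    n = inputs.shape[1]
    theta = np.zeros(n)                 # parameter estimates
    P = delta * np.eye(n)               # inverse-correlation matrix
    history = []
    for x, y in zip(inputs, outputs):
        x = x.reshape(-1, 1)
        # gain vector for the current sample
        k = P @ x / (forgetting_factor + (x.T @ P @ x).item())
        # a priori estimation error
        e = y - (x.T @ theta).item()
        # parameter and covariance updates
        theta = theta + k.flatten() * e
        P = (P - k @ x.T @ P) / forgetting_factor
        history.append(theta.copy())
    return theta, np.array(history)


if __name__ == "__main__":
    # Identify a hypothetical two-parameter stochastic system.
    rng = np.random.default_rng(0)
    true_theta = np.array([0.8, -0.3])
    X = rng.normal(size=(500, 2))
    Y = X @ true_theta + 0.05 * rng.normal(size=500)
    estimate, _ = rls_identify(X, Y)
    print("estimated parameters:", estimate)
```

A forgetting factor below 1 discounts old samples exponentially, which is what allows the recursion to track parameters that drift under nonstationary output conditions; making that factor nonconstant, as the paper proposes, is intended to preserve the convergence rate when the parameters themselves behave stochastically.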