Investigation of Effectiveness of Shuffled Frog-Leaping Optimizer in Training a Convolution Neural Network
One of the leading algorithms and architectures in deep learning is the Convolutional Neural Network (CNN). It is a distinctive method for image processing, object detection, and classification, and it has proven to be an efficient approach in the machine learning and computer vision fields. A CNN is composed of several filters accompanied by nonlinear functions and pooling layers. It imposes constraints on the weights and interconnections of the neural network, creating a structure well suited to processing spatially and temporally distributed data, and its weight-sharing property restrains the number of free parameters in the network. Training a CNN, however, remains challenging. Several optimization techniques have recently been employed to optimize a CNN's weights and biases, such as Ant Colony Optimization, Genetic Algorithms, Harmony Search, and Simulated Annealing. This paper employs the well-known nature-inspired Shuffled Frog-Leaping Algorithm (SFLA) to train a classical CNN structure (LeNet-5), which has not been attempted before. The training method is evaluated on four different datasets. To validate the study, the results are compared with some of the best-known evolutionary trainers: the Whale Optimization Algorithm (WOA), Bacterial Swarm Foraging Optimization (BFSO), and Ant Colony Optimization (ACO). The outcomes demonstrate that the SFLA considerably improves the performance of the original LeNet-5, although it slightly increases training computation time. The results also show that the proposed algorithm achieves high classification accuracy and good approximation in its mechanism.
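To illustrate the class of trainer the abstract describes, here is a minimal Python sketch of the Shuffled Frog-Leaping Algorithm applied to a generic flat parameter vector, a toy stand-in for a network's weights and biases. This is not the paper's implementation; all function names, parameter defaults, and the memeplex partitioning details below are illustrative assumptions.

```python
import random

def sfla_minimize(loss, dim, n_memeplexes=4, frogs_per_memeplex=5,
                  n_shuffles=30, local_steps=5, bounds=(-2.0, 2.0), seed=0):
    """Toy SFLA sketch: minimize `loss` over a flat parameter vector."""
    rng = random.Random(seed)
    lo, hi = bounds
    n = n_memeplexes * frogs_per_memeplex
    # Each "frog" is one candidate parameter vector.
    frogs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]

    def clamp(x):
        return max(lo, min(hi, x))

    for _ in range(n_shuffles):
        frogs.sort(key=loss)                       # best frog first
        best_global = frogs[0]
        # Partition by rank: frog i goes to memeplex i mod n_memeplexes.
        memeplexes = [frogs[m::n_memeplexes] for m in range(n_memeplexes)]
        for mem in memeplexes:
            for _ in range(local_steps):
                mem.sort(key=loss)
                worst, best_local = mem[-1], mem[0]
                # Leap the worst frog toward the memeplex's best frog.
                r = rng.random()
                cand = [clamp(w + r * (b - w))
                        for w, b in zip(worst, best_local)]
                if loss(cand) >= loss(worst):
                    # No improvement: leap toward the global best instead.
                    r = rng.random()
                    cand = [clamp(w + r * (g - w))
                            for w, g in zip(worst, best_global)]
                if loss(cand) >= loss(worst):
                    # Still no improvement: replace with a random frog.
                    cand = [rng.uniform(lo, hi) for _ in range(dim)]
                mem[-1] = cand
        # Shuffle: merge all memeplexes back into one population.
        frogs = [f for mem in memeplexes for f in mem]

    return min(frogs, key=loss)
```

In a CNN-training setting, `loss` would evaluate the network on a batch after loading the vector into its weights and biases; here a simple convex test function suffices to see the population contract toward the optimum, e.g. `sfla_minimize(lambda v: sum(x * x for x in v), dim=3)`.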
Main Authors: | Baseri Saadi, Soroush; Tataei Sarshar, Nazanin; Sadeghi, Soroush; Ranjbarzadeh, Ramin; Kooshki Forooshani, Mersedeh; Bendechache, Malika |
Format: | Online Article Text |
Language: | English |
Published: | Hindawi, 2022 |
Subjects: | Research Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8967525/ https://www.ncbi.nlm.nih.gov/pubmed/35368933 http://dx.doi.org/10.1155/2022/4703682 |
author | Baseri Saadi, Soroush; Tataei Sarshar, Nazanin; Sadeghi, Soroush; Ranjbarzadeh, Ramin; Kooshki Forooshani, Mersedeh; Bendechache, Malika
collection | PubMed
id | pubmed-8967525
institution | National Center for Biotechnology Information
record_format | MEDLINE/PubMed
journal | J Healthc Eng (Research Article)
published | Hindawi, 2022-03-23
license | Copyright © 2022 Soroush Baseri Saadi et al. Distributed under the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.