Optimized Particle Swarm Optimization (OPSO) and its application to artificial neural network training
BACKGROUND: Particle Swarm Optimization (PSO) is an established method for parameter optimization. It represents a population-based adaptive optimization technique that is influenced by several "strategy parameters". Choosing reasonable parameter values for the PSO is crucial for its convergence behavior, and depends on the optimization task. We present a method for parameter meta-optimization based on PSO and its application to neural network training. The concept of Optimized Particle Swarm Optimization (OPSO) is to optimize the free parameters of the PSO by having swarms within a swarm.
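To make the "swarms within a swarm" idea concrete, here is a minimal, illustrative Python sketch. It is not the authors' implementation: it assumes the meta-optimized free parameters are the inertia weight w and the acceleration coefficients c1 and c2 of a standard global-best PSO, and the names `pso`, `opso` and `meta_fitness` are hypothetical. An outer swarm searches the parameter space, and each outer particle is scored by how well an inner swarm configured with those parameters minimizes the target fitness function.

```python
import random

def pso(fitness, dim, w, c1, c2, n_particles=20, iters=100, bounds=(-5.0, 5.0)):
    """Basic global-best PSO (minimization); returns (best position, best value)."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Standard velocity update: inertia + cognitive + social term.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = fitness(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

def opso(fitness, dim, meta_particles=10, meta_iters=20):
    """Swarms within a swarm: an outer PSO searches over (w, c1, c2), scoring
    each candidate by the result of an inner PSO run that uses those values."""
    def meta_fitness(params):
        w, c1, c2 = params
        return pso(fitness, dim, w, c1, c2)[1]
    # Assumption for this sketch: the outer swarm itself uses fixed,
    # commonly cited default coefficients and searches parameters in [0, 2.5].
    return pso(meta_fitness, dim=3, w=0.729, c1=1.494, c2=1.494,
               n_particles=meta_particles, iters=meta_iters, bounds=(0.0, 2.5))

if __name__ == "__main__":
    def sphere(x):  # simple artificial test function
        return sum(xi * xi for xi in x)
    (w, c1, c2), score = opso(sphere, dim=10)
    print("meta-optimized parameters:", w, c1, c2, "inner best fitness:", score)
```

Note that in this sketch the meta-fitness is stochastic (each inner PSO run is random), so the outer swarm optimizes a noisy objective; how the published OPSO method configures its outer swarm and which free parameters it exposes is detailed in the article itself.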
Main Authors: | Meissner, Michael; Schmuker, Michael; Schneider, Gisbert |
---|---|
Format: | Text |
Language: | English |
Published: | BioMed Central, 2006 |
Subjects: | Methodology Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1464136/ https://www.ncbi.nlm.nih.gov/pubmed/16529661 http://dx.doi.org/10.1186/1471-2105-7-125 |
_version_ | 1782127541177286656 |
---|---|
author | Meissner, Michael; Schmuker, Michael; Schneider, Gisbert |
author_facet | Meissner, Michael; Schmuker, Michael; Schneider, Gisbert |
author_sort | Meissner, Michael |
collection | PubMed |
description | BACKGROUND: Particle Swarm Optimization (PSO) is an established method for parameter optimization. It represents a population-based adaptive optimization technique that is influenced by several "strategy parameters". Choosing reasonable parameter values for the PSO is crucial for its convergence behavior, and depends on the optimization task. We present a method for parameter meta-optimization based on PSO and its application to neural network training. The concept of Optimized Particle Swarm Optimization (OPSO) is to optimize the free parameters of the PSO by having swarms within a swarm. We assessed the performance of the OPSO method on a set of five artificial fitness functions and compared it to the performance of two popular PSO implementations. RESULTS: Our results indicate that PSO performance can be improved if meta-optimized parameter sets are applied. In addition, we were able to improve on the optimization speed and quality of the other PSO methods in the majority of our experiments. We applied the OPSO method to neural network training with the aim of building a quantitative model for predicting blood-brain barrier permeation of small organic molecules. On average, training time decreased by factors of four and two compared to the two other PSO methods, respectively. By applying the OPSO method, we obtained a prediction model showing good correlation with training, test and validation data. CONCLUSION: Optimizing the free parameters of the PSO method can result in performance gains. The OPSO approach yields parameter combinations that improve overall optimization performance. Its conceptual simplicity makes implementing the method a straightforward task. |
format | Text |
id | pubmed-1464136 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2006 |
publisher | BioMed Central |
record_format | MEDLINE/PubMed |
spelling | pubmed-1464136 2006-06-07 Optimized Particle Swarm Optimization (OPSO) and its application to artificial neural network training Meissner, Michael Schmuker, Michael Schneider, Gisbert BMC Bioinformatics Methodology Article BACKGROUND: Particle Swarm Optimization (PSO) is an established method for parameter optimization. It represents a population-based adaptive optimization technique that is influenced by several "strategy parameters". Choosing reasonable parameter values for the PSO is crucial for its convergence behavior, and depends on the optimization task. We present a method for parameter meta-optimization based on PSO and its application to neural network training. The concept of Optimized Particle Swarm Optimization (OPSO) is to optimize the free parameters of the PSO by having swarms within a swarm. We assessed the performance of the OPSO method on a set of five artificial fitness functions and compared it to the performance of two popular PSO implementations. RESULTS: Our results indicate that PSO performance can be improved if meta-optimized parameter sets are applied. In addition, we were able to improve on the optimization speed and quality of the other PSO methods in the majority of our experiments. We applied the OPSO method to neural network training with the aim of building a quantitative model for predicting blood-brain barrier permeation of small organic molecules. On average, training time decreased by factors of four and two compared to the two other PSO methods, respectively. By applying the OPSO method, we obtained a prediction model showing good correlation with training, test and validation data. CONCLUSION: Optimizing the free parameters of the PSO method can result in performance gains. The OPSO approach yields parameter combinations that improve overall optimization performance. Its conceptual simplicity makes implementing the method a straightforward task. BioMed Central 2006-03-10 /pmc/articles/PMC1464136/ /pubmed/16529661 http://dx.doi.org/10.1186/1471-2105-7-125 Text en Copyright © 2006 Meissner et al; licensee BioMed Central Ltd. |
spellingShingle | Methodology Article Meissner, Michael Schmuker, Michael Schneider, Gisbert Optimized Particle Swarm Optimization (OPSO) and its application to artificial neural network training |
title | Optimized Particle Swarm Optimization (OPSO) and its application to artificial neural network training |
title_full | Optimized Particle Swarm Optimization (OPSO) and its application to artificial neural network training |
title_fullStr | Optimized Particle Swarm Optimization (OPSO) and its application to artificial neural network training |
title_full_unstemmed | Optimized Particle Swarm Optimization (OPSO) and its application to artificial neural network training |
title_short | Optimized Particle Swarm Optimization (OPSO) and its application to artificial neural network training |
title_sort | optimized particle swarm optimization (opso) and its application to artificial neural network training |
topic | Methodology Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1464136/ https://www.ncbi.nlm.nih.gov/pubmed/16529661 http://dx.doi.org/10.1186/1471-2105-7-125 |
work_keys_str_mv | AT meissnermichael optimizedparticleswarmoptimizationopsoanditsapplicationtoartificialneuralnetworktraining AT schmukermichael optimizedparticleswarmoptimizationopsoanditsapplicationtoartificialneuralnetworktraining AT schneidergisbert optimizedparticleswarmoptimizationopsoanditsapplicationtoartificialneuralnetworktraining |