
Generalized Self-Adapting Particle Swarm Optimization algorithm with archive of samples

In this paper we enhance the Generalized Self-Adapting Particle Swarm Optimization algorithm (GAPSO), initially introduced at the Parallel Problem Solving from Nature 2018 conference, and investigate its properties. The research on GAPSO rests on two assumptions: (1) good performance of an optimization algorithm can be achieved through utilization of all of the gathered samples, and (2) the best performance can be accomplished by combining specialized sampling behaviors (Particle Swarm Optimization, Differential Evolution, and locally fitted square functions). From a software engineering point of view, GAPSO treats a standard Particle Swarm Optimization algorithm as an ideal starting point for creating a general-purpose global optimization framework. Within this framework, hybrid optimization algorithms are developed and various additional techniques (such as algorithm restart management or adaptation schemes) are tested. The paper introduces a new version of the algorithm, abbreviated M-GAPSO. Compared with the original GAPSO formulation, it includes the following four features: a global restart management scheme, sample gathering within an R-Tree based index (an archive/memory of samples), adaptation of a sampling behavior based on global particle performance, and a specific approach to local search. These enhancements result in improved performance of M-GAPSO over GAPSO, observed on both the COCO BBOB testbed and in the Black-Box Optimization Competition (BBComp). Moreover, for lower-dimensionality functions (up to 5D), the results of M-GAPSO are better than or comparable to a state-of-the-art version of CMA-ES (namely the KL-BIPOP-CMA-ES algorithm presented at the GECCO 2017 conference).
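
To make the hybrid-framework idea concrete, the sketch below illustrates one possible reading of it: each particle draws its next sample either with a standard PSO velocity update or a DE/rand/1 move, every evaluated point is stored in an archive, and the probability of choosing each behavior is adapted from observed improvements. This is a minimal illustrative sketch, not the authors' M-GAPSO implementation; the function names, parameter values, and the plain-list archive (standing in for the paper's R-Tree index) are assumptions.

```python
import numpy as np


def sphere(x):
    """Toy objective used only for demonstration."""
    return float(np.sum(x ** 2))


def pso_step(pos, vel, pbest, gbest, w=0.72, c1=1.49, c2=1.49):
    """Standard PSO velocity and position update for one particle."""
    r1, r2 = np.random.rand(len(pos)), np.random.rand(len(pos))
    new_vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    return pos + new_vel, new_vel


def de_rand_1_step(pos, population, f=0.5, cr=0.9):
    """DE/rand/1 mutation with binomial crossover against the current particle."""
    a, b, c = population[np.random.choice(len(population), 3, replace=False)]
    mutant = a + f * (b - c)
    cross = np.random.rand(len(pos)) < cr
    return np.where(cross, mutant, pos)


def hybrid_swarm_sketch(obj, dim=5, swarm_size=20, iters=200, bounds=(-5.0, 5.0)):
    """GAPSO-like loop: per-particle choice of sampling behavior plus a sample archive."""
    lo, hi = bounds
    pos = np.random.uniform(lo, hi, (swarm_size, dim))
    vel = np.zeros_like(pos)
    fit = np.array([obj(p) for p in pos])
    pbest, pbest_fit = pos.copy(), fit.copy()
    archive = [(p.copy(), f) for p, f in zip(pos, fit)]   # all evaluated samples
    behavior_weights = np.array([0.5, 0.5])               # P(PSO), P(DE)

    for _ in range(iters):
        gbest = pbest[np.argmin(pbest_fit)]
        for i in range(swarm_size):
            if np.random.rand() < behavior_weights[0]:
                cand, vel[i] = pso_step(pos[i], vel[i], pbest[i], gbest)
                which = 0
            else:
                cand = de_rand_1_step(pos[i], pos)
                which = 1
            cand = np.clip(cand, lo, hi)
            cand_fit = obj(cand)
            archive.append((cand.copy(), cand_fit))
            # Crude adaptation: slightly reward the behavior that produced an improvement.
            if cand_fit < fit[i]:
                behavior_weights[which] += 0.01
                behavior_weights /= behavior_weights.sum()
            pos[i], fit[i] = cand, cand_fit
            if cand_fit < pbest_fit[i]:
                pbest[i], pbest_fit[i] = cand.copy(), cand_fit

    return pbest[np.argmin(pbest_fit)], float(pbest_fit.min()), archive


if __name__ == "__main__":
    best_x, best_f, samples = hybrid_swarm_sketch(sphere)
    print(f"best f = {best_f:.3e} from {len(samples)} archived samples")
```

In the actual algorithm the archive would additionally feed the locally fitted square functions and the restart management, which this sketch omits for brevity.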
