1 code implementation • 29 Jun 2023 • Maxim Buzdalov
In many applications of evolutionary algorithms the computational cost of applying operators and storing populations is comparable to the cost of fitness evaluation.
no code implementations • 23 Feb 2023 • Deyao Chen, Maxim Buzdalov, Carola Doerr, Nguyen Dang
Dynamic Algorithm Configuration (DAC) tackles the question of how to automatically learn policies to control parameters of algorithms in a data-driven fashion.
no code implementations • 14 Apr 2021 • Denis Antipov, Maxim Buzdalov, Benjamin Doerr
On the other hand, this algorithm is also very efficient on jump functions, where the best static parameters are very different from those necessary to optimize simple problems.
no code implementations • 23 Feb 2021 • Kirill Antonov, Maxim Buzdalov, Arina Buzdalova, Carola Doerr
With the goal to provide absolute lower bounds for the best possible running times that can be achieved by $(1+\lambda)$-type search heuristics on common benchmark problems, we recently suggested a dynamic programming approach that computes optimal expected running times and the regret values inferred when deviating from the optimal parameter choice.
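As a minimal illustration of the dynamic-programming idea — not the authors' framework, which covers $(1+\lambda)$-type heuristics and regret values — one can compute exact expected running times of randomized local search (RLS) on OneMax by iterating backwards over fitness levels; the function name is illustrative:

```python
def expected_rls_times(n):
    # Dynamic programming over fitness levels for RLS on OneMax:
    # from a state with k one-bits, a single-bit flip improves with
    # probability (n - k) / n, so E[T_k] = n / (n - k) + E[T_{k+1}].
    E = [0.0] * (n + 1)  # E[k] = expected evaluations to reach n ones from k ones
    for k in range(n - 1, -1, -1):
        E[k] = n / (n - k) + E[k + 1]
    return E
```

For example, for $n = 2$ this yields $E[T_1] = 2$ and $E[T_0] = 3$; summing the per-level waiting times in this way is the same backward recursion that, with parameters added per state, yields optimal expected running times and regrets.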
no code implementations • 9 Feb 2021 • Maxim Buzdalov, Carola Doerr
However, little is known so far about the influence of these distributions on the performance of evolutionary algorithms, and about the relationships between (dynamic) parameter control and (static) parameter sampling.
no code implementations • 22 Jun 2020 • Denis Antipov, Maxim Buzdalov, Benjamin Doerr
The mathematical runtime analysis of evolutionary algorithms traditionally regards the time an algorithm needs to find a solution of a certain quality when initialized with a random population.
no code implementations • 20 Jun 2020 • Maxim Buzdalov, Carola Doerr
With this in hand, we compute for all population sizes $\lambda \in \{2^i \mid 0 \le i \le 18\}$ and for problem dimension $n \in \{1000, 2000, 5000\}$ which mutation rates minimize the expected running time and which ones maximize the expected progress.
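The distinction between runtime-minimizing and progress-maximizing mutation rates can be sketched with a Monte Carlo estimate of the expected one-step progress of a $(1+\lambda)$ EA on OneMax under standard bit mutation; the function name and the sampling-based approach are illustrative only (the cited work computes these quantities exactly):

```python
import random

def estimate_progress(n, k_ones, lmbda, p, trials=3000):
    # Monte Carlo estimate of the expected fitness gain of one (1+lambda) EA
    # step on OneMax, starting from a parent with k_ones one-bits and using
    # standard bit mutation with rate p. Elitist selection means the gain
    # is never negative.
    gain = 0.0
    for _ in range(trials):
        best = k_ones
        for _ in range(lmbda):
            # Flips among the one-bits decrease fitness, flips among the
            # zero-bits increase it; both counts are binomially distributed.
            down = sum(random.random() < p for _ in range(k_ones))
            up = sum(random.random() < p for _ in range(n - k_ones))
            best = max(best, k_ones - down + up)
        gain += best - k_ones
    return gain / trials
```

Scanning such an estimate over a grid of rates $p$ shows, for a fixed state, which rate maximizes progress; the rate minimizing the total expected running time is generally a different one.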
no code implementations • 20 Apr 2020 • Maxim Buzdalov, Benjamin Doerr, Carola Doerr, Dmitry Vinokurov
In this work, we conduct an in-depth study on the advantages and the limitations of fixed-target analyses.
no code implementations • 18 Apr 2020 • Anton Bassin, Maxim Buzdalov
The $(1+(\lambda,\lambda))$ genetic algorithm is a prime example of an evolutionary algorithm that was developed based on insights from theoretical findings.
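One iteration of the $(1+(\lambda,\lambda))$ GA can be sketched as follows; this is a simplified, hedged illustration (the function name is hypothetical, and implementation details such as tie-breaking are omitted), with mutation probability $p = \lambda/n$ and crossover bias $c = 1/\lambda$ as in the standard formulation:

```python
import random

def one_plus_ll_step(parent, f, lmbda):
    # One iteration of the (1+(lambda,lambda)) GA on a bit string,
    # maximizing fitness f. Mutation probability p = lambda/n,
    # crossover bias c = 1/lambda.
    n = len(parent)
    p = lmbda / n
    c = 1.0 / lmbda
    # Mutation phase: sample one mutation strength ell ~ Bin(n, p), then
    # create lambda mutants, each flipping ell bits chosen at random.
    ell = sum(random.random() < p for _ in range(n))
    mutants = []
    for _ in range(lmbda):
        x = parent[:]
        for i in random.sample(range(n), ell):
            x[i] = 1 - x[i]
        mutants.append(x)
    x_best = max(mutants, key=f)
    # Crossover phase: lambda biased uniform crossovers between the parent
    # and the best mutant, taking each bit from the mutant with probability c.
    offspring = []
    for _ in range(lmbda):
        y = [xm if random.random() < c else xp
             for xp, xm in zip(parent, x_best)]
        offspring.append(y)
    y_best = max(offspring, key=f)
    # Elitist selection: keep the better of parent and best crossover result.
    return y_best if f(y_best) >= f(parent) else parent
```

The two-phase structure — aggressive mutation followed by a repairing crossover toward the parent — is exactly what makes the algorithm amenable to the theoretical analysis it grew out of.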
1 code implementation • 14 Apr 2020 • Denis Antipov, Maxim Buzdalov, Benjamin Doerr
In this first runtime analysis of a crossover-based algorithm using a heavy-tailed choice of the mutation rate, we show an even stronger impact.
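A heavy-tailed choice of the mutation rate typically means sampling the mutation strength from a power-law distribution; a minimal sketch, assuming exponent $\beta = 1.5$ and support $\{1, \dots, n/2\}$ (both values are assumptions for illustration, not taken from the paper):

```python
import random

def sample_power_law(n, beta=1.5):
    # Sample a mutation strength alpha from a power-law distribution on
    # {1, ..., n/2} with exponent beta: Pr[alpha = i] proportional to i^(-beta).
    upper = max(1, n // 2)
    weights = [i ** (-beta) for i in range(1, upper + 1)]
    total = sum(weights)
    r = random.random() * total
    for i, w in enumerate(weights, start=1):
        r -= w
        if r <= 0:
            return i
    return upper  # guard against floating-point leftovers
```

Small strengths remain the most likely outcomes, but large jumps retain non-negligible probability, which is what lets a single parameter-free distribution cover problems whose optimal static rates differ widely.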
no code implementations • 15 Apr 2019 • Anton Bassin, Maxim Buzdalov
In particular, the one-fifth rule, which guides the adaptation in the example above, can raise the population size too quickly on problems that are too far away from a perfect fitness-distance correlation.
no code implementations • 9 Apr 2019 • Nina Bulanova, Maxim Buzdalov
The binary value function, or BinVal, has appeared in several studies in the theory of evolutionary computation as one of the extreme examples of linear pseudo-Boolean functions.
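BinVal assigns the bit string its value as a binary number, i.e. weights $2^{n-1}, \dots, 2, 1$ — the most skewed weights a linear pseudo-Boolean function can have, in contrast to OneMax, where all weights equal 1. A small sketch of both:

```python
def binval(bits):
    # BinVal: interpret the bit string as a binary number, i.e. a linear
    # pseudo-Boolean function with weights 2^(n-1), ..., 2, 1.
    return sum(b << (len(bits) - 1 - i) for i, b in enumerate(bits))

def onemax(bits):
    # OneMax for comparison: the linear function with all weights equal to 1.
    return sum(bits)
```

Under BinVal, flipping the most significant bit outweighs flipping all remaining bits combined, which is why it sits at the opposite extreme from OneMax in analyses of linear functions.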
no code implementations • 15 Apr 2018 • Nina Bulanova, Maxim Buzdalov
In their GECCO'12 paper, Doerr and Doerr proved that the $k$-ary unbiased black-box complexity of OneMax on $n$ bits is $O(n/k)$ for $2\le k\le O(\log n)$.
no code implementations • 14 Apr 2018 • Ilya Yakupov, Maxim Buzdalov
We have implemented an asynchronous steady-state version of the NSGA-II algorithm.
no code implementations • 14 Apr 2017 • Maxim Buzdalov, Benjamin Doerr
We show that this problem can be overcome by equipping the self-adjusting GA with an upper limit for the population size.
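A capped one-fifth success rule can be sketched as follows; the function name and the update factor are illustrative assumptions, not the paper's exact constants:

```python
def update_lambda(lmbda, success, factor=1.5, lambda_min=1.0, lambda_max=None):
    # One-fifth success rule for a self-adjusting population size lambda:
    # on success divide lambda by a factor F, on failure multiply it by
    # F^(1/4), so that a success rate of one fifth keeps lambda stable.
    # The optional lambda_max implements the upper limit discussed above.
    if success:
        lmbda = max(lambda_min, lmbda / factor)
    else:
        lmbda = lmbda * factor ** 0.25
    if lambda_max is not None:
        lmbda = min(lmbda, lambda_max)
    return lmbda
```

Without the cap, a long run of failures inflates $\lambda$ geometrically; the upper limit bounds the cost per iteration while leaving the rule's equilibrium behaviour unchanged.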
no code implementations • 1 Oct 2015 • Viktor Arkhipov, Maxim Buzdalov, Anatoly Shalyto
We present our asynchronous implementation of the LM-CMA-ES algorithm, which is a modern evolution strategy for solving complex large-scale continuous optimization problems.