Information-geometric optimization with natural selection

6 Dec 2019 · Jakub Otwinowski, Colin LaMont

Evolutionary algorithms, inspired by natural evolution, aim to optimize difficult objective functions without computing derivatives. Here we detail the relationship between population genetics and evolutionary optimization and formulate a new evolutionary algorithm. Optimization of a continuous objective function is analogous to searching for high-fitness phenotypes on a fitness landscape. We summarize how natural selection moves a population along the non-Euclidean gradient that the population induces on the fitness landscape (the natural gradient). Under normal approximations common in quantitative genetics, we show how selection is related to Newton's method in optimization. We find that intermediate selection strength is most informative of the fitness landscape. We describe the generation of new phenotypes and introduce an operator that recombines the whole population to generate variants that preserve normal statistics. Finally, we introduce a proof-of-principle algorithm that combines natural selection, our recombination operator, and an adaptive method to increase selection strength. Our algorithm is similar to covariance matrix adaptation and natural evolutionary strategies in optimization, and has similar performance. The algorithm is extremely simple to implement, requires no matrix inversion, factorization, or storage of a covariance matrix, and may form the basis of more general model-based optimization algorithms with natural gradient updates.
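As a worked illustration of the "normal approximations" sentence (standard quantitative-genetics notation, not reproduced from the paper): Lande's equation gives the per-generation change of the mean phenotype as the covariance-preconditioned, i.e. natural, gradient of log mean fitness, which has the same form as a Newton step whenever the population covariance tracks the inverse Hessian of the landscape:

```latex
% Lande's equation: selection moves the mean phenotype \bar{z} along the
% covariance-preconditioned (natural) gradient of log mean fitness \bar{W},
% with C the population covariance matrix:
\Delta \bar{z} \;=\; C \, \nabla_{\bar{z}} \ln \bar{W}(\bar{z})
% For comparison, a Newton step on an objective f with Hessian H:
\Delta x \;=\; -\,H^{-1} \nabla f(x)
% When selection shapes C toward H^{-1} (up to a positive scale),
% the selection update approximates Newton's method.
```

And a minimal sketch of the kind of loop the abstract describes: exponential selection, a whole-population recombination operator that preserves the empirical mean and covariance without forming a covariance matrix, and a schedule that increases selection strength. The function names, the recombination construction, and all parameter choices here are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    """Toy objective: negative squared norm, maximized at the origin."""
    return -np.sum(x * x, axis=-1)

def select(pop, fitness, beta):
    """Exponential (Boltzmann) selection: resample the population with
    weights proportional to exp(beta * fitness). beta is the selection
    strength; the paper's adaptive scheme is replaced by a simple
    multiplicative schedule in the loop below."""
    w = np.exp(beta * (fitness - fitness.max()))
    w /= w.sum()
    idx = rng.choice(len(pop), size=len(pop), p=w)
    return pop[idx]

def recombine(pop):
    """Whole-population recombination: each offspring is the population
    mean plus a Gaussian-weighted mixture of all deviations from the
    mean. In expectation this preserves the empirical mean and
    covariance without ever forming a covariance matrix. (One standard
    construction, assumed here; the paper's operator may differ.)"""
    n = len(pop)
    mean = pop.mean(axis=0)
    weights = rng.normal(size=(n, n)) / np.sqrt(n)
    return mean + weights @ (pop - mean)

# Minimal optimization loop with illustrative parameter choices.
pop = rng.normal(size=(64, 10))   # 64 individuals, 10 dimensions
beta = 1.0
for _ in range(200):
    pop = recombine(select(pop, sphere(pop), beta))
    beta *= 1.02                  # gradually strengthen selection
print("best fitness:", sphere(pop).max())
```

The recombination step draws each offspring as a random Gaussian mixture of the population's deviations from its mean, matching the parent population's first two moments in expectation; this is one way to realize the "preserve normal statistics" property with no matrix inversion, factorization, or stored covariance.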
