Search Results for author: Konstantin Riedl

Found 2 papers, 2 papers with code

Gradient is All You Need?

2 code implementations • 16 Jun 2023 • Konstantin Riedl, Timo Klock, Carina Geldhauser, Massimo Fornasier

The fundamental value of such a link between CBO and SGD lies in the fact that CBO is provably globally convergent to global minimizers for ample classes of nonsmooth and nonconvex objective functions, thereby offering a novel explanation for the success of stochastic relaxations of gradient descent.

Leveraging Memory Effects and Gradient Information in Consensus-Based Optimization: On Global Convergence in Mean-Field Law

3 code implementations • 22 Nov 2022 • Konstantin Riedl

In this paper, we study consensus-based optimization (CBO), a versatile, flexible, and customizable optimization method suitable for nonconvex and nonsmooth global optimization in high dimensions.
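To make the listed method concrete, the following is a minimal, hypothetical sketch of the basic CBO particle dynamics, not the authors' released code: particles drift toward a softmax-weighted consensus point and are perturbed by Gaussian noise scaled componentwise by their distance to it (an anisotropic-diffusion variant). All parameter values and the test objective are illustrative assumptions.

```python
import numpy as np

def cbo_minimize(f, dim, n_particles=100, n_steps=500,
                 alpha=30.0, lam=1.0, sigma=0.8, dt=0.05, seed=0):
    """Hypothetical sketch of consensus-based optimization (CBO).

    f : callable mapping an (n, dim) array to an (n,) array of objective values.
    Returns the final consensus point (an approximate global minimizer).
    """
    rng = np.random.default_rng(seed)
    # Initialize particles uniformly in a box (an illustrative choice).
    X = rng.uniform(-3.0, 3.0, size=(n_particles, dim))
    for _ in range(n_steps):
        vals = f(X)
        # Gibbs-type weights concentrate mass on the currently best particles;
        # subtracting the minimum is a standard numerical-stability shift.
        w = np.exp(-alpha * (vals - vals.min()))
        v = (w[:, None] * X).sum(axis=0) / w.sum()  # consensus point
        diff = X - v
        noise = rng.standard_normal(X.shape)
        # Deterministic drift toward the consensus point plus
        # componentwise-scaled (anisotropic) stochastic exploration.
        X = X - lam * dt * diff + sigma * np.sqrt(dt) * np.abs(diff) * noise
    return v

# Illustrative nonconvex, Rastrigin-style objective with global minimum at 0.
def rastrigin(X):
    return np.sum(X**2 + 2.5 * (1.0 - np.cos(2.0 * np.pi * X)), axis=1)

x_star = cbo_minimize(rastrigin, dim=4)
```

Note that the noise term vanishes as particles reach consensus, so the dynamics collapse to a single point; the cited papers analyze when that point is (close to) the global minimizer.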
