Search Results for author: Ulrich Terstiege

Found 2 papers, 0 papers with code

Convergence of gradient descent for learning linear neural networks

no code implementations · 4 Aug 2021 · Gabin Maxime Nguegnang, Holger Rauhut, Ulrich Terstiege

In the case of three or more layers, we show that gradient descent converges to a global minimum on the manifold of matrices of some fixed rank, where the rank cannot be determined a priori.
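To make the setting concrete, below is a minimal sketch (illustrative, not code from the paper) of plain gradient descent on a three-layer linear network, minimizing a squared loss over the end-to-end product matrix. The dimensions, initialization scale, step size, and synthetic data are assumptions; with a hidden layer narrower than the target map, the learned product matrix ends up with reduced rank, echoing the fixed-rank-manifold statement above.

```python
import numpy as np

# Minimal sketch (illustrative assumptions throughout): gradient descent on a
# three-layer linear network W3 @ W2 @ W1, minimizing
#   L = 1/(2n) * ||W3 W2 W1 X - Y||_F^2.
rng = np.random.default_rng(0)
d_in, d_hidden, d_out, n = 6, 4, 5, 50

X = rng.standard_normal((d_in, n))
Y = rng.standard_normal((d_out, d_in)) @ X      # targets from a full-rank linear map

W1 = 0.2 * rng.standard_normal((d_hidden, d_in))
W2 = 0.2 * rng.standard_normal((d_hidden, d_hidden))
W3 = 0.2 * rng.standard_normal((d_out, d_hidden))

eta = 0.05
for _ in range(20000):
    P = W3 @ W2 @ W1                            # end-to-end product matrix
    G = (P @ X - Y) @ X.T / n                   # gradient of L with respect to P
    # Chain rule: gradient of L with respect to each factor
    W1_new = W1 - eta * (W3 @ W2).T @ G
    W2_new = W2 - eta * W3.T @ G @ W1.T
    W3_new = W3 - eta * G @ (W2 @ W1).T
    W1, W2, W3 = W1_new, W2_new, W3_new

P = W3 @ W2 @ W1
print("final loss:", 0.5 / n * np.linalg.norm(P @ X - Y) ** 2)
print("rank of end-to-end matrix:", np.linalg.matrix_rank(P))  # at most d_hidden = 4
```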

Learning deep linear neural networks: Riemannian gradient flows and convergence to global minimizers

no code implementations · 12 Oct 2019 · Bubacarr Bah, Holger Rauhut, Ulrich Terstiege, Michael Westdickenberg

We study the convergence of gradient flows related to learning deep linear neural networks (where the activation function is the identity map) from data.
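For orientation, the object studied here can be written in a standard generic form (the notation below is a common formulation of the problem, not copied from the paper): a deep linear network with identity activation has predictions $W_N \cdots W_1 X$, and the gradient flow evolves each factor along the negative gradient of the squared loss.

```latex
\begin{aligned}
  L(W_1,\dots,W_N) &= \tfrac{1}{2}\,\bigl\lVert W_N W_{N-1} \cdots W_1 X - Y \bigr\rVert_F^2, \\
  \dot W_j(t) &= -\nabla_{W_j} L\bigl(W_1(t),\dots,W_N(t)\bigr), \qquad j = 1,\dots,N .
\end{aligned}
```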
