1 code implementation • NeurIPS 2023 • Christian Kümmerle, Johannes Maly
We prove locally quadratic convergence of the iterates to a simultaneously structured data matrix in a regime of minimal sample complexity (up to constants and a logarithmic factor), which is known to be impossible for a combination of convex surrogates.
1 code implementation • CVPR 2023 • Liangzu Peng, Christian Kümmerle, René Vidal
Outlier-robust estimation involves estimating some parameters (e.g., 3D rotations) from data samples in the presence of outliers, and is typically formulated as a non-convex and non-smooth problem.
no code implementations • 1 Dec 2022 • Christian Kümmerle, Mauro Maggioni, Sui Tang
This Spatio-Temporal Transition Operator Recovery problem is motivated by the recent interest in learning time-varying graph signals that are driven by graph operators depending on the underlying graph topology.
1 code implementation • 3 Jun 2021 • Christian Kümmerle, Claudio Mayrink Verdun
We propose an iterative algorithm for low-rank matrix completion that can be interpreted as an iteratively reweighted least squares (IRLS) algorithm, a saddle-escaping smoothing Newton method or a variable metric proximal gradient method applied to a non-convex rank surrogate.
no code implementations • NeurIPS 2021 • Christian Kümmerle, Claudio Mayrink Verdun, Dominik Stöger
The recovery of sparse data is at the core of many applications in machine learning and signal processing.
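To illustrate the generic idea behind the iteratively reweighted least squares (IRLS) approach that recurs in these papers, here is a minimal sketch of IRLS for sparse vector recovery. This is an illustrative textbook-style variant, not the authors' specific algorithm: each step solves a weighted least-squares problem in closed form, and the smoothing parameter `eps` is annealed with a simple geometric schedule chosen here for the demo.

```python
import numpy as np

def irls_sparse(A, b, num_iters=60, eps=1.0):
    """Sparse recovery by iteratively reweighted least squares (IRLS).

    Each step minimizes sum_i x_i^2 / sqrt(x_i^2 + eps^2) subject to Ax = b,
    which has the closed form
        x <- D A^T (A D A^T)^{-1} b,   D = diag(sqrt(x_i^2 + eps^2)).
    Shrinking eps drives the smoothed surrogate toward the l1 norm.
    """
    x = A.T @ np.linalg.solve(A @ A.T, b)  # least-norm initialization
    for _ in range(num_iters):
        d = np.sqrt(x**2 + eps**2)         # diagonal of D from current iterate
        DAt = d[:, None] * A.T             # D A^T without forming D explicitly
        x = DAt @ np.linalg.solve(A @ DAt, b)
        eps = max(0.8 * eps, 1e-6)         # anneal the smoothing parameter
    return x

# Hypothetical demo: recover a 5-sparse vector from 40 Gaussian measurements.
rng = np.random.default_rng(0)
n, m, s = 100, 40, 5
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
b = A @ x_true
x_hat = irls_sparse(A, b)
```

The closed-form update exploits that the constrained weighted quadratic has a KKT system solvable with one symmetric positive-definite solve per iteration, which is what makes each IRLS step a "low complexity linear problem".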
1 code implementation • 7 Sep 2020 • Christian Kümmerle, Claudio M. Verdun
We propose an iterative algorithm for low-rank matrix completion that can be interpreted as both an iteratively reweighted least squares (IRLS) algorithm and a saddle-escaping smoothing Newton method applied to a non-convex rank surrogate objective.
no code implementations • 17 Jan 2019 • Dominik Alfke, Weston Baines, Jan Blechschmidt, Mauricio J. del Razo Sarmina, Amnon Drory, Dennis Elbrächter, Nando Farchmin, Matteo Gambara, Silke Glas, Philipp Grohs, Peter Hinz, Danijel Kivaranovic, Christian Kümmerle, Gitta Kutyniok, Sebastian Lunz, Jan Macdonald, Ryan Malthaner, Gregory Naisat, Ariel Neufeld, Philipp Christian Petersen, Rafael Reisenhofer, Jun-Da Sheng, Laura Thesing, Philipp Trunschke, Johannes von Lindheim, David Weber, Melanie Weber
We present a novel technique based on deep learning and set theory which yields exceptional classification and prediction results.
1 code implementation • 15 Mar 2017 • Christian Kümmerle, Juliane Sigl
We propose a new iteratively reweighted least squares (IRLS) algorithm for the recovery of a matrix $X \in \mathbb{C}^{d_1\times d_2}$ of rank $r \ll\min(d_1, d_2)$ from incomplete linear observations, solving a sequence of low complexity linear problems.
Numerical Analysis · Information Theory · Optimization and Control