1 code implementation • 20 Aug 2020 • Yaroslav Averyanov, Alain Celisse
We treat the choice of this hyperparameter as an iterative procedure (over $k$) and propose a strategy that is easy to implement in practice, based on the idea of early stopping and the minimum discrepancy principle.
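For intuition, a discrepancy-based stopping rule of this kind can be sketched as follows. This is an illustrative implementation, not the authors' code: it assumes functional gradient descent on kernel least squares, a known noise level `sigma2`, and the classical threshold $n\sigma^2$ on the squared residual norm.

```python
import numpy as np

def fit_with_discrepancy_stop(K, y, sigma2, lr=None, max_iter=5000):
    """Functional gradient descent on kernel least squares, halted by
    the minimum discrepancy principle: stop at the first iteration k
    whose squared residual norm drops below the noise level n*sigma2."""
    n = len(y)
    if lr is None:
        # step size 1/lambda_max(K) keeps the residual monotone
        lr = 1.0 / np.linalg.eigvalsh(K).max()
    alpha = np.zeros(n)                    # dual (kernel) coefficients
    for k in range(1, max_iter + 1):
        residual = y - K @ alpha           # y - f_k at the design points
        if residual @ residual <= n * sigma2:
            return alpha, k                # discrepancy rule fired
        alpha = alpha + lr * residual      # functional gradient step
    return alpha, max_iter
```

With a smooth signal and a smooth kernel, the rule typically fires long before the iterates start interpolating the noise, which is exactly the bias-variance trade-off early stopping is meant to balance.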
no code implementations • 14 Jul 2020 • Yaroslav Averyanov, Alain Celisse
In this paper, we study the problem of early stopping for iterative learning algorithms in a reproducing kernel Hilbert space (RKHS) in the nonparametric regression framework.
no code implementations • 17 Apr 2020 • Alain Celisse, Martin Wahl
We investigate the construction of early stopping rules in the nonparametric regression problem where iterative learning algorithms are used and the optimal iteration number is unknown.
no code implementations • 12 Oct 2017 • Alain Celisse, Guillemette Marot, Morgane Pierre-Jean, Guillem Rigaill
Finally, simulations confirm the higher statistical accuracy of kernel-based approaches for detecting changes that affect more than the mean alone.
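A minimal sketch of the kind of kernel scan statistic involved, assuming a single change point and a Gaussian kernel; the biased MMD estimator and all names below are illustrative assumptions, not the authors' procedure:

```python
import numpy as np

def kernel_change_scan(x, bandwidth=1.0, margin=10):
    """Scan for a single change point by maximising the squared MMD
    between the left and right segments, under a Gaussian kernel.
    Sensitive to distributional changes, not only shifts in the mean."""
    n = len(x)
    d2 = (x[:, None] - x[None, :]) ** 2
    K = np.exp(-d2 / (2.0 * bandwidth ** 2))
    best_t, best_stat = margin, -np.inf
    for t in range(margin, n - margin):
        left, right = K[:t, :t], K[t:, t:]
        cross = K[:t, t:]
        # biased MMD^2 between the two segments
        mmd2 = left.mean() + right.mean() - 2.0 * cross.mean()
        if mmd2 > best_stat:
            best_t, best_stat = t, mmd2
    return best_t, best_stat
```

Because the statistic compares full kernel similarities rather than segment means, it reacts to a pure variance change, which a mean-based CUSUM statistic would miss entirely.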
no code implementations • 23 Aug 2016 • Alain Celisse, Benjamin Guedj
The present paper provides a new generic strategy leading to non-asymptotic theoretical guarantees on the Leave-one-Out procedure applied to a broad class of learning algorithms.
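For concreteness, the Leave-one-Out procedure itself (not the paper's theoretical analysis) can be written generically for any learning algorithm; the `fit_predict` interface and the ridge example below are illustrative assumptions:

```python
import numpy as np

def leave_one_out_risk(X, y, fit_predict):
    """Leave-one-Out estimate of the prediction risk: for each i,
    refit on the n-1 remaining points and score the held-out one."""
    n = len(y)
    errors = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        pred = fit_predict(X[mask], y[mask], X[i:i + 1])
        errors[i] = (pred[0] - y[i]) ** 2
    return errors.mean()

def ridge_fit_predict(X_train, y_train, X_test, lam=1e-6):
    """Ridge regression used as the plug-in learning algorithm."""
    d = X_train.shape[1]
    w = np.linalg.solve(X_train.T @ X_train + lam * np.eye(d),
                        X_train.T @ y_train)
    return X_test @ w
```

The strategy's genericity is the point: any algorithm exposing a fit-then-predict interface can be dropped in for `fit_predict`.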