Search Results for author: Alain Celisse

Found 5 papers, 1 paper with code

Minimum discrepancy principle strategy for choosing $k$ in $k$-NN regression

1 code implementation · 20 Aug 2020 · Yaroslav Averyanov, Alain Celisse

We treat the choice of the hyperparameter as an iterative procedure (over $k$) and propose a strategy, easily implemented in practice, based on the ideas of early stopping and the minimum discrepancy principle.
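As an illustration of the idea (not the paper's exact procedure), one can sweep $k$ upward and stop the first time the training residual reaches the noise level. The sketch below assumes the noise variance is known; all data and constants are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = rng.uniform(0.0, 1.0, size=n)
sigma = 0.3
y = np.sin(2 * np.pi * X) + sigma * rng.standard_normal(n)

# For each point, neighbor indices sorted by distance (self comes first).
order = np.argsort(np.abs(X[:, None] - X[None, :]), axis=1)

sigma2 = sigma ** 2  # noise variance, assumed known in this sketch

chosen_k = None
for k in range(1, n + 1):
    f_k = y[order[:, :k]].mean(axis=1)      # k-NN predictions on the training set
    discrepancy = np.mean((y - f_k) ** 2)   # empirical (training) risk at this k
    if discrepancy >= sigma2:               # residual has reached the noise level: stop
        chosen_k = k
        break
```

At $k = 1$ each point predicts itself, so the residual is zero; as $k$ grows the fit smooths out and the residual rises, and the rule stops at the first crossing of the noise level.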

Model Selection · regression

Early stopping and polynomial smoothing in regression with reproducing kernels

no code implementations · 14 Jul 2020 · Yaroslav Averyanov, Alain Celisse

In this paper, we study the problem of early stopping for iterative learning algorithms in a reproducing kernel Hilbert space (RKHS) in the nonparametric regression framework.

regression

Analyzing the discrepancy principle for kernelized spectral filter learning algorithms

no code implementations · 17 Apr 2020 · Alain Celisse, Martin Wahl

We investigate the construction of early stopping rules in the nonparametric regression problem where iterative learning algorithms are used and the optimal iteration number is unknown.
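A minimal sketch of the general idea, not the paper's construction: gradient descent on kernel least squares is one such spectral filter, and a discrepancy-type rule stops it once the residual drops to (a multiple of) the noise level. The bandwidth, step size, and the safety factor 1.5 are illustrative choices, and the noise variance is assumed known.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
X = np.sort(rng.uniform(0.0, 1.0, size=n))
sigma = 0.2
y = np.cos(2 * np.pi * X) + sigma * rng.standard_normal(n)

# Gaussian kernel Gram matrix (bandwidth 0.1 is an illustrative choice)
K = np.exp(-((X[:, None] - X[None, :]) ** 2) / (2 * 0.1 ** 2))

eta = 1.0 / np.linalg.eigvalsh(K).max()  # step size ensuring stable iterations
alpha = np.zeros(n)

tau = None
for t in range(1, 5001):
    residual = y - K @ alpha
    alpha = alpha + eta * residual       # one kernel gradient-descent step
    discrepancy = np.mean((y - K @ alpha) ** 2)
    # discrepancy-type stopping rule; the factor 1.5 is an illustrative margin
    if discrepancy <= 1.5 * sigma ** 2:
        tau = t
        break
```

Each iteration shrinks the residual along every eigendirection of the Gram matrix, so the discrepancy decreases monotonically until the rule fires at some finite iteration $\tau$.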

New efficient algorithms for multiple change-point detection with kernels

no code implementations · 12 Oct 2017 · Alain Celisse, Guillemette Marot, Morgane Pierre-Jean, Guillem Rigaill

Finally, simulations confirmed the higher statistical accuracy of kernel-based approaches to detect changes that are not only in the mean.
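To illustrate why kernels detect changes beyond the mean (this is a toy single-change scan, not the paper's multiple-change-point algorithms): a variance change leaves the mean untouched, yet a kernel two-sample statistic such as the squared MMD peaks at the true break. The bandwidth comes from the median heuristic; everything else is synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
# Signal with a variance change (mean unchanged) at index 100
x = np.concatenate([rng.normal(0.0, 0.5, 100), rng.normal(0.0, 2.0, 100)])
n = len(x)

# Gaussian Gram matrix; bandwidth from the median heuristic
d2 = (x[:, None] - x[None, :]) ** 2
h = np.median(d2[d2 > 0])
K = np.exp(-d2 / h)

def mmd2(K, t):
    """Biased squared MMD between x[:t] and x[t:], read off the Gram matrix."""
    A, B, C = K[:t, :t], K[t:, t:], K[:t, t:]
    return A.mean() + B.mean() - 2 * C.mean()

# Scan candidate splits (edges excluded to keep segments non-trivial)
scores = [mmd2(K, t) for t in range(10, n - 10)]
t_hat = 10 + int(np.argmax(scores))
```

A mean-based scan statistic would see nothing here, since both segments have mean zero; the kernel statistic localizes the variance break.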

Change Point Detection

Stability revisited: new generalisation bounds for the Leave-one-Out

no code implementations · 23 Aug 2016 · Alain Celisse, Benjamin Guedj

The present paper provides a new generic strategy leading to non-asymptotic theoretical guarantees on the Leave-one-Out procedure applied to a broad class of learning algorithms.
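For context on the procedure being analyzed (this sketch is not the paper's bounds): for a linear smoother such as ridge regression, the Leave-one-Out residuals have a closed form, $e_i^{(-i)} = (y_i - \hat y_i)/(1 - H_{ii})$, so the full LOO error comes from a single fit. The data and the ridge parameter below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 50, 3
X = rng.standard_normal((n, p))
beta = np.array([1.0, -2.0, 0.5])
y = X @ beta + 0.1 * rng.standard_normal(n)

lam = 0.1  # illustrative ridge parameter
# Ridge hat matrix H = X (X^T X + lam I)^{-1} X^T
H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
y_hat = H @ y

# Exact LOO residuals via the shortcut (y_i - yhat_i) / (1 - H_ii)
loo_resid = (y - y_hat) / (1 - np.diag(H))
loo_error = np.mean(loo_resid ** 2)

# Brute-force check for one held-out point
i = 0
Xi, yi = np.delete(X, i, axis=0), np.delete(y, i)
beta_i = np.linalg.solve(Xi.T @ Xi + lam * np.eye(p), Xi.T @ yi)
assert np.isclose(y[i] - X[i] @ beta_i, loo_resid[i])
```

The shortcut follows from the Sherman–Morrison formula and is exact for ridge; for general learning algorithms LOO requires $n$ refits, which is where non-asymptotic guarantees on the procedure become valuable.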

regression
