Search Results for author: Rémi Bardenet

Found 21 papers, 11 papers with code

Point Processes and spatial statistics in time-frequency analysis

no code implementations 29 Feb 2024 Barbara Pascal, Rémi Bardenet

The zeros of the spectrogram of a noisy signal are then the zeros of a random analytic function, hence forming a point process in $\mathbb{C}$.

Denoising, Point Processes
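
For intuition, a minimal sketch (illustrative, not the authors' code): compute the spectrogram of white Gaussian noise with SciPy and flag grid points where the magnitude attains a local minimum, a common numerical proxy for the zeros discussed above. The window length and the 3x3 minimum filter are arbitrary choices made for this example.

# Illustrative sketch: spectrogram zeros of white noise as local minima on the grid.
import numpy as np
from scipy.signal import stft
from scipy.ndimage import minimum_filter

rng = np.random.default_rng(0)
noise = rng.standard_normal(4096)        # white Gaussian noise
_, _, Z = stft(noise, nperseg=256)       # short-time Fourier transform
S = np.abs(Z)                            # spectrogram magnitude

is_min = S == minimum_filter(S, size=3)  # points attaining the local minimum of a 3x3 neighbourhood
freq_idx, time_idx = np.nonzero(is_min)
print(f"{len(freq_idx)} candidate zeros in the time-frequency plane")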

Monte Carlo with kernel-based Gibbs measures: Guarantees for probabilistic herding

no code implementations 18 Feb 2024 Martin Rouault, Rémi Bardenet, Mylène Maïda

In spite of strong experimental support, it has proven difficult to show that this worst-case error decreases faster than the standard $N^{-1/2}$ rate, where $N$ is the number of quadrature nodes, at least in the usual case where the RKHS is infinite-dimensional.
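
For context on the quantity discussed above, the worst-case quadrature error over the unit ball of an RKHS with kernel $k$ and target measure $\mu$ has the standard closed form

$$\mathrm{wce}(x, w)^2 = \iint k(u,v)\,\mathrm{d}\mu(u)\,\mathrm{d}\mu(v) \; - \; 2\sum_{i=1}^{N} w_i \int k(x_i, v)\,\mathrm{d}\mu(v) \; + \; \sum_{i,j=1}^{N} w_i w_j\, k(x_i, x_j),$$

where $x_1,\dots,x_N$ are the quadrature nodes and $w_1,\dots,w_N$ their weights; plain i.i.d. Monte Carlo with weights $1/N$ typically yields $\mathrm{wce} = O(N^{-1/2})$. This identity is standard background, not a statement taken from the paper itself.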

Benchmarking multi-component signal processing methods in the time-frequency plane

no code implementations 13 Feb 2024 Juan M. Miramont, Rémi Bardenet, Pierre Chainais, Francois Auger

For instance, detection and denoising based on the zeros of the spectrogram have been proposed since 2015, contrasting with a long history of focusing on larger values of the spectrogram.

Benchmarking, Denoising

On sampling determinantal and Pfaffian point processes on a quantum computer

1 code implementation 25 May 2023 Rémi Bardenet, Michaël Fanuel, Alexandre Feller

Most applications require sampling from a DPP, and given their quantum origin, it is natural to wonder whether sampling a DPP on a quantum computer is easier than on a classical one.

Point Processes

Sparsification of the regularized magnetic Laplacian with multi-type spanning forests

1 code implementation 31 Aug 2022 Michaël Fanuel, Rémi Bardenet

We provide statistical guarantees for a choice of natural estimators of the connection Laplacian, and investigate two practical applications of our sparsifiers: ranking with angular synchronization and graph-based semi-supervised learning.


A covariant, discrete time-frequency representation tailored for zero-based signal detection

1 code implementation 8 Feb 2022 Barbara Pascal, Rémi Bardenet

Recent work in time-frequency analysis proposed to switch the focus from the maxima of the spectrogram toward its zeros, which, for signals corrupted by Gaussian noise, form a random point pattern with a very stable structure; modern spatial statistics tools leverage this structure to perform component disentanglement and signal detection.

Disentanglement

On proportional volume sampling for experimental design in general spaces

no code implementations 9 Nov 2020 Arnaud Poinas, Rémi Bardenet

Optimal design for linear regression is a fundamental task in statistics.

Computation
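
A brute-force sketch of the volume-sampling idea underlying this line of work (toy sizes, uniform base measure, not the paper's algorithm or guarantees): a size-k design is drawn with probability proportional to $\det(X_S^\top X_S)$.

# Toy sketch: sample a size-k design with probability proportional to det(X_S^T X_S).
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 12, 3, 4
X = rng.standard_normal((n, d))                    # candidate design points (rows)

subsets = list(itertools.combinations(range(n), k))
vols = np.array([np.linalg.det(X[list(S)].T @ X[list(S)]) for S in subsets])
probs = vols / vols.sum()                          # volume-proportional probabilities

S = subsets[rng.choice(len(subsets), p=probs)]
print("sampled design:", S)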

Learning from DPPs via Sampling: Beyond HKPV and symmetry

no code implementations 8 Jul 2020 Rémi Bardenet, Subhroshekhar Ghosh

Our approach is scalable and applies to very general DPPs, beyond traditional symmetric kernels.

feature selection, Point Processes +2

Kernel interpolation with continuous volume sampling

no code implementations ICML 2020 Ayoub Belhadji, Rémi Bardenet, Pierre Chainais

A fundamental task in kernel methods is to pick nodes and weights, so as to approximate a given function from an RKHS by the weighted sum of kernel translates located at the nodes.

Density Estimation, Point Processes
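
A minimal sketch of the node-and-weight task described above, with a Gaussian kernel and nodes drawn uniformly at random for illustration (the paper studies nodes drawn by continuous volume sampling instead):

# Toy sketch: approximate f by a weighted sum of Gaussian kernel translates at a few nodes.
import numpy as np

def gauss_kernel(x, y, s=0.2):
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * s ** 2))

f = lambda x: np.sin(2 * np.pi * x)
rng = np.random.default_rng(0)
nodes = np.sort(rng.uniform(0, 1, 10))                       # interpolation nodes
weights = np.linalg.solve(gauss_kernel(nodes, nodes) + 1e-10 * np.eye(10), f(nodes))

grid = np.linspace(0, 1, 200)
approx = gauss_kernel(grid, nodes) @ weights                  # weighted sum of kernel translates
print("max abs error on grid:", np.abs(approx - f(grid)).max())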

On two ways to use determinantal point processes for Monte Carlo integration

1 code implementation NeurIPS 2019 Guillaume Gautier, Rémi Bardenet, Michal Valko

In the absence of DPP machinery to derive an efficient sampler and analyze their estimator, the idea of Monte Carlo integration with DPPs was stored in the cellar of numerical integration.

Numerical Integration, Point Processes

Kernel quadrature with DPPs

1 code implementation NeurIPS 2019 Ayoub Belhadji, Rémi Bardenet, Pierre Chainais

We study quadrature rules for functions from an RKHS, using nodes sampled from a determinantal point process (DPP).

A determinantal point process for column subset selection

no code implementations 23 Dec 2018 Ayoub Belhadji, Rémi Bardenet, Pierre Chainais

We give bounds on the ratio of the expected approximation error for this DPP over the optimal error of PCA.

Dimensionality Reduction, feature selection
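
A toy sketch of the comparison behind that ratio, using uniformly random columns as a stand-in for the paper's DPP (so the numbers only illustrate the two error terms, not the paper's bound):

# Toy sketch: error of projecting A onto a few of its columns vs the optimal rank-k (PCA) error.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20)) @ np.diag(np.logspace(0, -2, 20))
k = 5

cols = rng.choice(A.shape[1], size=k, replace=False)   # uniform columns (stand-in for the DPP)
C = A[:, cols]
P = C @ np.linalg.pinv(C)                               # orthogonal projector onto span(C)
err_cols = np.linalg.norm(A - P @ A, "fro")

U, s, Vt = np.linalg.svd(A, full_matrices=False)
err_pca = np.linalg.norm(A - U[:, :k] @ np.diag(s[:k]) @ Vt[:k], "fro")
print(f"column-subset error {err_cols:.3f} vs optimal PCA error {err_pca:.3f}")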

DPPy: Sampling DPPs with Python

2 code implementations 19 Sep 2018 Guillaume Gautier, Guillermo Polito, Rémi Bardenet, Michal Valko

Determinantal point processes (DPPs) are specific probability distributions over clouds of points that are used as models and computational tools across physics, probability, statistics, and more recently machine learning.

BIG-bench Machine Learning, Point Processes
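
A short usage sketch, assuming the FiniteDPP interface described in the DPPy documentation (kernel type, sample_exact, list_of_samples); check the repository if the API has evolved:

# Sketch assuming DPPy's documented FiniteDPP interface (pip install dppy).
import numpy as np
from dppy.finite_dpps import FiniteDPP

rng = np.random.default_rng(0)
Phi = rng.standard_normal((50, 100))
L = Phi.T @ Phi                        # likelihood (L-ensemble) kernel, 100 x 100

dpp = FiniteDPP("likelihood", **{"L": L})
dpp.sample_exact()                     # exact spectral sampler
print(dpp.list_of_samples[-1])         # indices of the last sampled subset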

Time-frequency transforms of white noises and Gaussian analytic functions

1 code implementation 30 Jul 2018 Rémi Bardenet, Adrien Hardy

Finally, we provide quantitative estimates concerning the finite-dimensional approximations of these white noises, which is of practical interest when it comes to implementing signal processing algorithms based on GAFs.

Probability, Classical Analysis and ODEs, Methodology
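
To make the finite-dimensional approximation concrete, here is a toy sketch (truncation level and seed arbitrary) that samples a truncated planar GAF, $f(z) = \sum_{k=0}^{N} \xi_k z^k / \sqrt{k!}$ with i.i.d. complex Gaussian $\xi_k$, and computes the zeros of the resulting polynomial:

# Toy sketch: zeros of a degree-N truncation of the planar Gaussian analytic function.
import numpy as np
from math import factorial

rng = np.random.default_rng(0)
N = 30
xi = (rng.standard_normal(N + 1) + 1j * rng.standard_normal(N + 1)) / np.sqrt(2)
coeffs = xi / np.sqrt([float(factorial(j)) for j in range(N + 1)])   # coefficients, degree 0..N

zeros = np.roots(coeffs[::-1])         # np.roots expects the highest degree first
print(f"{len(zeros)} zeros, mean modulus {np.abs(zeros).mean():.2f}")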

Zonotope hit-and-run for efficient sampling from projection DPPs

1 code implementation ICML 2017 Guillaume Gautier, Rémi Bardenet, Michal Valko

Previous theoretical results yield a fast mixing time of our chain when targeting a distribution that is close to a projection DPP, but not a DPP in general.

Point Processes, Recommendation Systems

Monte Carlo with Determinantal Point Processes

1 code implementation 2 May 2016 Rémi Bardenet, Adrien Hardy

We show that repulsive random variables can yield Monte Carlo methods with faster convergence rates than the typical $N^{-1/2}$, where $N$ is the number of integrand evaluations.

Probability, Classical Analysis and ODEs, Computation, Methodology

Inference for determinantal point processes without spectral knowledge

no code implementations NeurIPS 2015 Rémi Bardenet, Michalis K. Titsias

DPPs possess desirable properties, such as exact sampling or analyticity of the moments, but learning the parameters of the kernel $K$ through likelihood-based inference is not straightforward.

Point Processes, Variational Inference
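
For context, a minimal sketch of the likelihood that makes such inference delicate, written here in L-ensemble form (the paper's setting and notation may differ): the probability of observing a subset S is $\det(L_S)/\det(L+I)$.

# Toy sketch: log-likelihood of an observed subset S under an L-ensemble DPP.
import numpy as np

def dpp_log_likelihood(L, S):
    _, logdet_S = np.linalg.slogdet(L[np.ix_(S, S)])
    _, logdet_norm = np.linalg.slogdet(L + np.eye(L.shape[0]))
    return logdet_S - logdet_norm

rng = np.random.default_rng(0)
Phi = rng.standard_normal((20, 5))
L = Phi @ Phi.T + 1e-6 * np.eye(20)    # a positive semi-definite kernel matrix
print(dpp_log_likelihood(L, [0, 3, 7]))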

On Markov chain Monte Carlo methods for tall data

1 code implementation 11 May 2015 Rémi Bardenet, Arnaud Doucet, Chris Holmes

Finally, so far we have only been able to propose subsampling-based methods that display good performance in scenarios where the Bernstein-von Mises approximation of the target posterior distribution is excellent.

Bayesian Inference
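
To illustrate what "subsampling-based" means here, a toy sketch of the basic ingredient (the naive rescaled estimator, not the specific samplers studied in the paper): the full-data log-likelihood is estimated from a random subsample of size m, rescaled by n/m so that it is unbiased.

# Toy sketch: unbiased subsampling estimate of a full-data log-likelihood.
import numpy as np

rng = np.random.default_rng(0)
n, m = 100_000, 1_000
data = rng.normal(loc=1.0, scale=2.0, size=n)        # a "tall" dataset

def loglik(theta, x):                                 # Gaussian log-density, variance 4
    return -0.5 * ((x - theta) ** 2 / 4.0 + np.log(2 * np.pi * 4.0))

theta = 0.9
idx = rng.choice(n, size=m, replace=False)
estimate = n / m * loglik(theta, data[idx]).sum()     # rescaled subsample sum
exact = loglik(theta, data).sum()
print(f"subsampled estimate {estimate:.1f} vs exact {exact:.1f}")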

Algorithms for Hyper-Parameter Optimization

no code implementations NeurIPS 2011 James S. Bergstra, Rémi Bardenet, Yoshua Bengio, Balázs Kégl

Random search has been shown to be sufficiently efficient for learning neural networks for several datasets, but we show it is unreliable for training DBNs.

Image Classification
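
A toy random-search sketch (synthetic objective, hypothetical hyper-parameter ranges, nothing to do with the paper's DBN experiments): draw configurations at random and keep the best one found.

# Toy sketch: random search over a learning rate and a number of hidden units.
import numpy as np

rng = np.random.default_rng(0)

def validation_error(lr, n_hidden):
    # Stand-in for training a model and measuring its validation error.
    return (np.log10(lr) + 2.0) ** 2 + (n_hidden - 64) ** 2 / 1e4 + rng.normal(0, 0.01)

best_err, best_cfg = float("inf"), None
for _ in range(100):
    lr = 10 ** rng.uniform(-4, 0)            # log-uniform learning rate
    n_hidden = int(rng.integers(16, 256))    # number of hidden units
    err = validation_error(lr, n_hidden)
    if err < best_err:
        best_err, best_cfg = err, (lr, n_hidden)

print(f"best error {best_err:.3f} with lr={best_cfg[0]:.4f}, n_hidden={best_cfg[1]}")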
