Search Results for author: Jean Barbier

Found 19 papers, 4 papers with code

Fundamental limits of overparametrized shallow neural networks for supervised learning

no code implementations11 Jul 2023 Francesco Camilli, Daria Tieplova, Jean Barbier

We carry out an information-theoretical analysis of a two-layer neural network trained from input-output pairs generated by a teacher network with matching architecture, in overparametrized regimes.
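
As a rough illustration of the setup, the sketch below generates input-output pairs from a random two-layer teacher network. The sizes, the tanh activation, the noiseless labels and the Gaussian inputs are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 500, 100, 2_000      # samples, input dimension, hidden width (k >> n, d: one notion of overparametrization)

# Teacher two-layer network with random weights; the student shares this architecture
W = rng.standard_normal((k, d)) / np.sqrt(d)   # first-layer weights
a = rng.standard_normal(k) / np.sqrt(k)        # readout weights

X = rng.standard_normal((n, d))                # i.i.d. Gaussian inputs
y = np.tanh(X @ W.T) @ a                       # noiseless teacher labels (activation choice is an assumption)
```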

Mismatched estimation of non-symmetric rank-one matrices corrupted by structured noise

no code implementations7 Feb 2023 Teng Fu, Yuhao Liu, Jean Barbier, Marco Mondelli, Shansuo Liang, Tianqi Hou

We study the performance of a Bayesian statistician who estimates a rank-one signal corrupted by non-symmetric rotationally invariant noise with a generic distribution of singular values.
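
A minimal way to simulate this kind of observation model, under an assumed normalisation and an arbitrary uniform singular-value profile for the rotationally invariant noise; the SVD step at the end is only a naive baseline, not the Bayesian estimator analysed in the paper.

```python
import numpy as np

def haar_orthogonal(n, rng):
    """Sample an orthogonal matrix from the Haar measure via QR with a sign correction."""
    Q, R = np.linalg.qr(rng.standard_normal((n, n)))
    return Q * np.sign(np.diag(R))

rng = np.random.default_rng(1)
n, m, snr = 400, 300, 2.0           # matrix shape and signal strength (illustrative)

u = rng.standard_normal(n)           # left factor of the rank-one signal
v = rng.standard_normal(m)           # right factor

# Rotationally invariant noise A = U diag(d) V^T with Haar U, V and a generic singular-value profile
r = min(n, m)
U = haar_orthogonal(n, rng)[:, :r]
V = haar_orthogonal(m, rng)[:, :r]
d = rng.uniform(0.5, 1.5, size=r)    # assumed (non-Gaussian) singular values
A = (U * d) @ V.T

Y = (np.sqrt(snr) / n) * np.outer(u, v) + A     # observed matrix, one possible normalisation

# Naive baseline: estimate the spike from the leading singular vectors of Y
U_hat, s_hat, Vt_hat = np.linalg.svd(Y, full_matrices=False)
u_hat, v_hat = U_hat[:, 0], Vt_hat[0]
```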

Bayes-optimal limits in structured PCA, and how to reach them

1 code implementation3 Oct 2022 Jean Barbier, Francesco Camilli, Marco Mondelli, Manuel Saenz

To answer this, we study the paradigmatic spiked matrix model of principal components analysis (PCA), where a rank-one matrix is corrupted by additive noise.
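
The spiked matrix model and the standard PCA estimate can be written down in a few lines; the Rademacher prior, the normalisation and the SNR value below are illustrative assumptions, and the paper provides its own code implementation.

```python
import numpy as np

rng = np.random.default_rng(2)
n, snr = 1_000, 2.0                            # dimension and SNR lambda (illustrative)

x = rng.choice([-1.0, 1.0], size=n)            # rank-one spike (Rademacher prior as an example)
Z = rng.standard_normal((n, n))
Z = (Z + Z.T) / np.sqrt(2)                     # symmetric Gaussian noise with unit-variance entries

Y = np.sqrt(snr / n) * np.outer(x, x) + Z      # spiked matrix observation

# Standard PCA estimate: leading eigenvector of Y, rescaled to the norm of the spike
eigvals, eigvecs = np.linalg.eigh(Y)
x_hat = np.sqrt(n) * eigvecs[:, -1]
overlap = abs(x_hat @ x) / n                   # informative above the spectral threshold (lambda = 1 here)
print(f"PCA overlap with the planted spike: {overlap:.2f}")
```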

The price of ignorance: how much does it cost to forget noise structure in low-rank matrix estimation?

no code implementations20 May 2022 Jean Barbier, Tianqi Hou, Marco Mondelli, Manuel Sáenz

We consider the problem of estimating a rank-1 signal corrupted by structured rotationally invariant noise, and address the following question: how well do inference algorithms perform when the noise statistics are unknown and hence Gaussian noise is assumed?
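
A sketch of the mismatched scenario, assuming a particular rotationally invariant noise ensemble (Haar eigenvectors with a non-semicircle spectrum) and using vanilla PCA as a simple stand-in for an estimator derived under a Gaussian-noise assumption; the algorithms and scalings in the paper may differ.

```python
import numpy as np

rng = np.random.default_rng(3)
n, snr = 800, 4.0                              # dimension and SNR (illustrative)

x = rng.choice([-1.0, 1.0], size=n)            # rank-1 signal (Rademacher, as an example prior)

# Structured rotationally invariant noise: Haar eigenvectors with a non-semicircle spectrum (assumption)
Q, R = np.linalg.qr(rng.standard_normal((n, n)))
O = Q * np.sign(np.diag(R))
d = rng.choice([-1.0, 1.0], size=n) * rng.uniform(0.5, 1.5, size=n)
Z = (O * d) @ O.T

Y = (np.sqrt(snr) / n) * np.outer(x, x) + Z    # observed matrix, one possible normalisation

# Mismatched estimate: pretend the noise were Gaussian and take the top eigenvector (vanilla PCA)
eigvals, eigvecs = np.linalg.eigh(Y)
x_hat = np.sqrt(n) * eigvecs[:, -1]
overlap = abs(x_hat @ x) / n                   # correlation with the planted signal
```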

Statistical limits of dictionary learning: random matrix theory and the spectral replica method

no code implementations14 Sep 2021 Jean Barbier, Nicolas Macris

We consider increasingly complex models of matrix denoising and dictionary learning in the Bayes-optimal setting, in the challenging regime where the matrices to infer have a rank growing linearly with the system size.

Denoising, Dictionary Learning
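
One common way to write down an extensive-rank observation of this kind (the matrix denoising variant) is sketched below; the symmetric XX^T form, the aspect ratio and the scaling are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 400
gamma = 0.5                    # aspect ratio M/N: the rank grows linearly with the system size
M = int(gamma * N)
snr = 1.0

X = rng.standard_normal((N, M))            # hidden factor to be inferred
Z = rng.standard_normal((N, N))
Z = (Z + Z.T) / np.sqrt(2)                 # symmetric Gaussian noise

Y = np.sqrt(snr / N) * (X @ X.T) + Z       # extensive-rank matrix observation
```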

Performance of Bayesian linear regression in a model with mismatch

no code implementations14 Jul 2021 Jean Barbier, Wei-Kuo Chen, Dmitry Panchenko, Manuel Sáenz

Here we consider a model in which the responses are corrupted by Gaussian noise and are known to be generated as linear combinations of the covariates, but the distributions of the ground-truth regression coefficients and of the noise are unknown.

regression
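
A small numerical illustration of such a mismatch, assuming a Laplace prior on the coefficients and Gaussian noise of unknown variance. The mismatched estimate is ridge regression, i.e. the posterior mean a statistician would compute under an assumed Gaussian prior and unit noise variance; the regularisation strength is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(5)
n, d = 400, 200                              # samples and covariate dimension (illustrative)
X = rng.standard_normal((n, d)) / np.sqrt(d)

# Ground truth drawn from a non-Gaussian prior; noise variance also unknown to the statistician (assumption)
beta = rng.laplace(scale=1.0, size=d)
noise = 2.0 * rng.standard_normal(n)
y = X @ beta + noise

# Mismatched Bayesian estimate: posterior mean under an assumed Gaussian prior and unit noise variance,
# which reduces to ridge regression with an assumed regularisation strength lam
lam = 1.0
beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
mse = np.mean((beta_hat - beta) ** 2)        # per-coordinate estimation error
```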

High-dimensional inference: a statistical mechanics perspective

no code implementations28 Oct 2020 Jean Barbier

In modern signal processing and machine learning, inference is done in very high dimension: a large number of unknown characteristics of the system must be deduced from large amounts of high-dimensional noisy data.

Information theoretic limits of learning a sparse rule

no code implementations NeurIPS 2020 Clément Luneau, Jean Barbier, Nicolas Macris

We consider generalized linear models in regimes where the number of nonzero components of the signal and accessible data points are sublinear with respect to the size of the signal.
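
A data-generation sketch of the sublinear-sparsity regime, with illustrative exponents for the number of non-zero components and the number of samples, and a sign output as one example of a GLM channel.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 10_000
k = int(n ** 0.5)          # number of non-zero components, sublinear in n (exponent is illustrative)
m = int(n ** 0.7)          # number of samples, also sublinear in n (exponent is illustrative)

beta = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
beta[support] = rng.standard_normal(k)         # sparse signal

X = rng.standard_normal((m, n)) / np.sqrt(n)   # sensing matrix
y = np.sign(X @ beta)                          # one possible GLM channel (noiseless sign output)
```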

All-or-nothing statistical and computational phase transitions in sparse spiked matrix estimation

no code implementations NeurIPS 2020 Jean Barbier, Nicolas Macris, Cynthia Rush

We determine statistical and computational limits for estimation of a rank-one matrix (the spike) corrupted by an additive Gaussian noise matrix, in a sparse limit where the underlying hidden vector (that constructs the rank-one matrix) has a number of non-zero components that scales sub-linearly with the total dimension of the vector, and the signal-to-noise ratio tends to infinity at an appropriate speed.
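
A data-generation sketch of the sparse spiked model, with an illustrative sparsity exponent and an ad hoc rate of growth for the signal-to-noise ratio; identifying the appropriate speed is precisely what the paper does.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2_000
k = int(n ** 0.5)              # non-zero components scale sub-linearly with n (exponent is illustrative)

x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = 1.0               # sparse hidden vector building the rank-one spike

snr = n / k                    # SNR diverging with n (illustrative rate; the paper pins down the right speed)
Z = rng.standard_normal((n, n))
Z = (Z + Z.T) / np.sqrt(2)     # symmetric Gaussian noise

Y = np.sqrt(snr / n) * np.outer(x, x) + Z
```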

Information-theoretic limits of a multiview low-rank symmetric spiked matrix model

no code implementations16 May 2020 Jean Barbier, Galen Reeves

We consider a generalization of an important class of high-dimensional inference problems, namely spiked symmetric matrix models, often used as probabilistic models for principal component analysis.
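
A minimal special case of a multiview observation, assuming every view shares the same rank-one spike and carries independent Gaussian noise with its own SNR; the model treated in the paper is more general.

```python
import numpy as np

rng = np.random.default_rng(8)
n, snrs = 300, [0.5, 1.0, 2.0]                # dimension and one SNR per view (illustrative)

x = rng.choice([-1.0, 1.0], size=n)           # low-rank signal shared by all views

views = []
for lam in snrs:
    Z = rng.standard_normal((n, n))
    Z = (Z + Z.T) / np.sqrt(2)                # independent symmetric Gaussian noise in each view
    views.append(np.sqrt(lam / n) * np.outer(x, x) + Z)
```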

0-1 phase transitions in sparse spiked matrix estimation

no code implementations12 Nov 2019 Jean Barbier, Nicolas Macris

We consider statistical models of estimation of a rank-one matrix (the spike) corrupted by an additive Gaussian noise matrix in the sparse limit.

Overlap matrix concentration in optimal Bayesian inference

no code implementations4 Apr 2019 Jean Barbier

We show that, under a proper perturbation, these models are replica symmetric in the sense that the overlap matrix concentrates.

Information Theory, Disordered Systems and Neural Networks, Probability

Rank-one matrix estimation: analysis of algorithmic and information theoretic limits by the spatial coupling method

no code implementations6 Dec 2018 Jean Barbier, Mohamad Dia, Nicolas Macris, Florent Krzakala, Lenka Zdeborová

We characterize the detectability phase transitions in a large set of estimation problems, showing that there is a gap between what currently known polynomial-time algorithms (in particular spectral methods and approximate message-passing) can achieve and what is expected information-theoretically.

Community Detection, Compressive Sensing
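
For the rank-one symmetric case with Gaussian noise, a generic approximate message-passing iteration can be sketched as follows, assuming a Rademacher spike and a standard normalisation; this is a plain AMP sketch, not the spatially coupled construction studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(9)
n, snr = 2_000, 2.0                            # dimension and SNR lambda (illustrative)

x = rng.choice([-1.0, 1.0], size=n)            # Rademacher spike (assumption)
Z = rng.standard_normal((n, n))
Z = (Z + Z.T) / np.sqrt(2)
Y = np.sqrt(snr / n) * np.outer(x, x) + Z      # spiked matrix observation

A = Y / np.sqrt(n)

def f(s):
    # Scalar posterior-mean denoiser for a Rademacher prior under this scaling
    return np.tanh(np.sqrt(snr) * s)

m_old = np.zeros(n)
m = 0.1 * rng.standard_normal(n)               # small random initialization
for _ in range(30):
    fm = f(m)
    onsager = np.sqrt(snr) * np.mean(1.0 - fm ** 2)   # Onsager reaction term
    m, m_old = A @ fm - onsager * f(m_old), m

overlap = abs(f(m) @ x) / n                    # overlap of the AMP estimate with the planted spike
print(f"AMP overlap: {overlap:.2f}")
```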

The committee machine: Computational to statistical gaps in learning a two-layers neural network

1 code implementation NeurIPS 2018 Benjamin Aubin, Antoine Maillard, Jean Barbier, Florent Krzakala, Nicolas Macris, Lenka Zdeborová

Heuristic tools from statistical physics have been used in the past to locate the phase transitions and compute the optimal learning and generalization errors in the teacher-student scenario in multi-layer neural networks.
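
The committee machine teacher can be sketched as a majority vote of K perceptrons; the sizes, the sign activations and the noiseless channel below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(10)
n, d, K = 2_000, 100, 3        # samples, input dimension, hidden units (K odd so the vote never ties)

W = rng.standard_normal((K, d))                 # teacher weights
X = rng.standard_normal((n, d)) / np.sqrt(d)    # i.i.d. Gaussian inputs
y = np.sign(np.sign(X @ W.T).sum(axis=1))       # committee machine: majority vote of K perceptrons
```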

Optimal Errors and Phase Transitions in High-Dimensional Generalized Linear Models

1 code implementation10 Aug 2017 Jean Barbier, Florent Krzakala, Nicolas Macris, Léo Miolane, Lenka Zdeborová

Non-rigorous predictions for the optimal errors existed for special cases of GLMs, e.g. for the perceptron, in the statistical physics literature, based on the so-called replica method.
