Search Results for author: Simone Brugiapaglia

Found 14 papers, 4 papers with code

Learning smooth functions in high dimensions: from sparse polynomials to deep neural networks

no code implementations • 4 Apr 2024 • Ben Adcock, Simone Brugiapaglia, Nick Dexter, Sebastian Moraga

For deep neural networks, there is currently a significant gap between their approximation theory and the practical performance of deep learning.

Uncertainty Quantification

Neural Rank Collapse: Weight Decay and Small Within-Class Variability Yield Low-Rank Bias

no code implementations • 6 Feb 2024 • Emanuele Zangrando, Piero Deidda, Simone Brugiapaglia, Nicola Guglielmi, Francesco Tudisco

Recent work in deep learning has shown strong empirical and theoretical evidence of an implicit low-rank bias: weight matrices in deep networks tend to be approximately low-rank, and removing relatively small singular values during training or from available trained models may significantly reduce model size while maintaining or even improving model performance.
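The compression idea in this excerpt (drop small singular values, keep the thin SVD factors) fits in a few lines of NumPy. This is a minimal illustrative sketch, not the paper's analysis; the relative threshold `tol` and the synthetic test matrix are assumptions:

```python
import numpy as np

def truncate_singular_values(W, tol=1e-2):
    """Best low-rank approximation of W, dropping singular values
    below tol times the largest one."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    k = max(1, int(np.sum(s >= tol * s[0])))   # retained rank
    # Keep two thin factors instead of W: m*k + k*n numbers
    # rather than m*n, a saving whenever k < m*n / (m + n).
    return U[:, :k] * s[:k], Vt[:k, :]

# Synthetic example: a nearly rank-10 weight matrix
rng = np.random.default_rng(0)
W = rng.standard_normal((256, 10)) @ rng.standard_normal((10, 128))
W += 1e-3 * rng.standard_normal((256, 128))    # small full-rank perturbation
A, B = truncate_singular_values(W)
print(A.shape[1], np.linalg.norm(W - A @ B) / np.linalg.norm(W))
```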

A practical existence theorem for reduced order models based on convolutional autoencoders

no code implementations • 1 Feb 2024 • Nicola Rares Franco, Simone Brugiapaglia

In recent years, deep learning has gained increasing popularity in the fields of Partial Differential Equations (PDEs) and Reduced Order Modeling (ROM), providing domain practitioners with new powerful data-driven techniques such as Physics-Informed Neural Networks (PINNs), Neural Operators, Deep Operator Networks (DeepONets) and Deep Learning-based ROMs (DL-ROMs).

Model-adapted Fourier sampling for generative compressed sensing

no code implementations • 8 Oct 2023 • Aaron Berk, Simone Brugiapaglia, Yaniv Plan, Matthew Scott, Xia Sheng, Ozgur Yilmaz

We study generative compressed sensing when the measurement matrix is randomly subsampled from a unitary matrix (with the DFT as an important special case).
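A minimal NumPy sketch of this measurement model, with uniform random row subsampling as a placeholder (adapting that distribution to the generative model is precisely what the paper proposes):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 1024, 128                     # ambient dimension, number of measurements

F = np.fft.fft(np.eye(n), axis=0) / np.sqrt(n)   # unitary DFT matrix
rows = rng.choice(n, size=m, replace=False)      # uniform subsampling placeholder;
                                                 # the paper adapts this distribution
                                                 # to the generative model
A = np.sqrt(n / m) * F[rows, :]      # normalized subsampled measurement matrix

x = rng.standard_normal(n)           # stand-in for a signal in the range of a GNN
y = A @ x                            # noiseless measurements
print(A.shape, y.shape)              # (128, 1024) (128,)
```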

Generalization Limits of Graph Neural Networks in Identity Effects Learning

1 code implementation • 30 Jun 2023 • Giuseppe Alessio D'Inverno, Simone Brugiapaglia, Mirco Ravanelli

Graph Neural Networks (GNNs) are usually based on a message-passing mechanism and have gained increasing popularity for their intuitive formulation, which is closely linked to the Weisfeiler-Lehman (WL) test for graph isomorphism, to which they have been proven equivalent in terms of expressive power.
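For concreteness, a minimal sketch of the 1-WL colour refinement mentioned above, applied to the classic pair of graphs it cannot distinguish; the fixed round count and tuple-based relabelling are implementation assumptions:

```python
def wl_refinement(adj, rounds=3):
    """1-dimensional Weisfeiler-Lehman colour refinement.
    adj: dict mapping each node to a list of neighbours."""
    colors = {v: 0 for v in adj}            # uniform initial colouring
    for _ in range(rounds):
        # New colour = (own colour, sorted multiset of neighbour colours)
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        relabel = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        colors = {v: relabel[signatures[v]] for v in adj}
    return colors

# Two non-isomorphic graphs that 1-WL (and hence message-passing GNNs)
# cannot distinguish: a 6-cycle vs. two disjoint triangles.
c6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
two_c3 = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [4, 5], 4: [3, 5], 5: [3, 4]}
print(sorted(wl_refinement(c6).values()) == sorted(wl_refinement(two_c3).values()))  # True
```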

Monte Carlo is a good sampling strategy for polynomial approximation in high dimensions

no code implementations • 18 Aug 2022 • Ben Adcock, Simone Brugiapaglia

We show that there is a least-squares approximation based on $m$ Monte Carlo samples whose error decays algebraically fast in $m/\log(m)$, with a rate that is the same as that of the best $n$-term polynomial approximation.

Uncertainty Quantification
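A one-dimensional toy version of this sampling strategy (the paper's setting is high-dimensional and its claims concern rates in $m$): draw $m$ i.i.d. uniform Monte Carlo samples and fit an $n$-dimensional Legendre polynomial space by least squares. The test function and the ratio m = 10n are assumptions:

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(2)
f = lambda x: np.exp(x) / (1 + 0.5 * x**2)   # assumed smooth test function

n = 20                       # dimension of the polynomial space (degree n-1)
m = 200                      # number of Monte Carlo samples, m >> n
x = rng.uniform(-1, 1, m)    # i.i.d. uniform (Monte Carlo) sample points

A = legendre.legvander(x, n - 1)              # m-by-n Legendre design matrix
c, *_ = np.linalg.lstsq(A, f(x), rcond=None)  # least-squares coefficients

xt = np.linspace(-1, 1, 1000)                 # estimate the uniform error
err = np.max(np.abs(legendre.legval(xt, c) - f(xt)))
print(f"max error with m={m} samples: {err:.2e}")
```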

A coherence parameter characterizing generative compressed sensing with Fourier measurements

no code implementations • 19 Jul 2022 • Aaron Berk, Simone Brugiapaglia, Babhru Joshi, Yaniv Plan, Matthew Scott, Özgür Yılmaz

In Bora et al. (2017), a mathematical framework was developed for compressed sensing guarantees in the setting where the measurement matrix is Gaussian and the signal structure is the range of a generative neural network (GNN).
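A minimal sketch of recovery in that framework: estimate the signal by minimizing $\|A G(z) - y\|^2$ over the latent variable $z$, as in Bora et al. The tiny untrained generator, the optimizer and the iteration budget below are placeholders:

```python
import torch

torch.manual_seed(0)
k, n, m = 8, 200, 50                     # latent dim, signal dim, measurements

G = torch.nn.Sequential(                  # placeholder generative network
    torch.nn.Linear(k, 64), torch.nn.ReLU(), torch.nn.Linear(64, n)
)
for p in G.parameters():                  # the generator is fixed, not trained here
    p.requires_grad_(False)

x_true = G(torch.randn(k))               # ground truth lies in the range of G
A = torch.randn(m, n) / m**0.5           # Gaussian measurement matrix
y = A @ x_true                           # noiseless measurements

z = torch.zeros(k, requires_grad=True)   # latent variable to optimize
opt = torch.optim.Adam([z], lr=0.05)
for _ in range(2000):
    opt.zero_grad()
    loss = torch.sum((A @ G(z) - y) ** 2)
    loss.backward()
    opt.step()

rel_err = (torch.norm(G(z).detach() - x_true) / torch.norm(x_true)).item()
print(rel_err)                           # relative recovery error
```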

Compressive Fourier collocation methods for high-dimensional diffusion equations with periodic boundary conditions

no code implementations • 2 Jun 2022 • Weiqi Wang, Simone Brugiapaglia

We also present numerical experiments that illustrate the accuracy and stability of the method for the approximation of sparse and compressible solutions.

Compressive Sensing

On efficient algorithms for computing near-best polynomial approximations to high-dimensional, Hilbert-valued functions from limited samples

no code implementations • 25 Mar 2022 • Ben Adcock, Simone Brugiapaglia, Nick Dexter, Sebastian Moraga

On the one hand, there is a well-developed theory of best $s$-term polynomial approximation, which asserts exponential or algebraic rates of convergence for holomorphic functions.

Uncertainty Quantification
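For reference, the best $s$-term approximation error the excerpt invokes can be written as follows (a standard definition; the notation for the basis and index set is assumed):

```latex
% Best s-term approximation error of f in an orthonormal polynomial
% system (Psi_nu), indexed by multi-indices nu in a set F:
\[
  E_s(f) \;=\; \inf\Big\{ \Big\| f - \sum_{\nu \in S} c_\nu \Psi_\nu \Big\|
  \;:\; S \subseteq \mathcal{F},\ |S| \le s \Big\},
\]
% where the coefficients c_nu live in the Hilbert space in which f takes
% its values; for holomorphic f, E_s(f) decays exponentially or
% algebraically in s.
```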

Deep Neural Networks Are Effective At Learning High-Dimensional Hilbert-Valued Functions From Limited Data

no code implementations • 11 Dec 2020 • Ben Adcock, Simone Brugiapaglia, Nick Dexter, Sebastian Moraga

Such problems are challenging: 1) pointwise samples are expensive to acquire, 2) the function domain is high dimensional, and 3) the range lies in a Hilbert space.

Generalizing Outside the Training Set: When Can Neural Networks Learn Identity Effects?

1 code implementation • 9 May 2020 • Simone Brugiapaglia, Matthew Liu, Paul Tupper

Finally, we demonstrate our theory with computational experiments in which we explore the effect of different input encodings on the ability of algorithms to generalize to novel inputs.

Do log factors matter? On optimal wavelet approximation and the foundations of compressed sensing

no code implementations • 24 May 2019 • Ben Adcock, Simone Brugiapaglia, Matthew King-Roskamp

A signature result in compressed sensing is that Gaussian random sampling achieves stable and robust recovery of sparse vectors under optimal conditions on the number of measurements.

Image Reconstruction • Information Theory
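To make "stable and robust recovery" concrete, a minimal sketch of sparse recovery from Gaussian measurements via iterative soft-thresholding (ISTA) for the LASSO; the sparsity level, noise level, regularization parameter and iteration count are assumptions, and this does not reproduce the paper's wavelet-approximation analysis:

```python
import numpy as np

rng = np.random.default_rng(3)
n, m, s = 400, 120, 10                    # dimension, measurements, sparsity

x = np.zeros(n)
x[rng.choice(n, s, replace=False)] = rng.standard_normal(s)  # s-sparse signal
A = rng.standard_normal((m, n)) / np.sqrt(m)                 # Gaussian matrix
y = A @ x + 1e-3 * rng.standard_normal(m)                    # noisy measurements

lam = 1e-3                                 # assumed regularization level
t = 1.0 / np.linalg.norm(A, 2) ** 2        # step size 1/L, L = ||A||_2^2
z = np.zeros(n)
for _ in range(5000):                      # ISTA: gradient step + soft threshold
    g = z - t * A.T @ (A @ z - y)
    z = np.sign(g) * np.maximum(np.abs(g) - t * lam, 0)

print(np.linalg.norm(z - x) / np.linalg.norm(x))  # relative recovery error
```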

A compressive spectral collocation method for the diffusion equation under the restricted isometry property

1 code implementation • 17 Jul 2018 • Simone Brugiapaglia

The approach is based on a spectral Sturm-Liouville approximation of the solution and on the collocation of the PDE in strong form at random points, by taking advantage of the compressive sensing principle.

Numerical Analysis
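A one-dimensional sketch of the approach just described, for $-u'' = f$ with homogeneous Dirichlet conditions: expand $u$ in a sine basis (a Sturm-Liouville eigenbasis), collocate the equation in strong form at random points with $m$ smaller than the basis size, and recover the coefficients by basis pursuit. The problem sizes, the column-normalization preconditioner and the linear-programming formulation of basis pursuit are my assumptions:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(4)
N, m = 50, 30                      # sine-basis size, random collocation points (m < N)

# -u'' = f on (0,1) with u(0) = u(1) = 0; a sparse exact solution for testing:
u_exact = lambda x: np.sin(np.pi * x) + 0.1 * np.sin(5 * np.pi * x)
f = lambda x: np.pi**2 * np.sin(np.pi * x) + 0.1 * (5 * np.pi)**2 * np.sin(5 * np.pi * x)

x = rng.uniform(0, 1, m)           # collocate the PDE in strong form at random points
k = np.arange(1, N + 1)
A = (k * np.pi)**2 * np.sin(np.pi * np.outer(x, k))   # -(sin(k pi x))'' at the points
d = np.linalg.norm(A, axis=0)      # column normalization (a simple preconditioner)
An = A / d

# Basis pursuit min ||c||_1 s.t. An c = f(x), as a linear program with c = p - q:
res = linprog(np.ones(2 * N), A_eq=np.hstack([An, -An]), b_eq=f(x), bounds=(0, None))
c = (res.x[:N] - res.x[N:]) / d    # undo the normalization

xt = np.linspace(0, 1, 500)
u_hat = np.sin(np.pi * np.outer(xt, k)) @ c
print(np.max(np.abs(u_hat - u_exact(xt))))   # uniform error of the recovered solution
```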

On oracle-type local recovery guarantees in compressed sensing

1 code implementation • 11 Jun 2018 • Ben Adcock, Claire Boyer, Simone Brugiapaglia

We present improved sampling complexity bounds for stable and robust sparse recovery in compressed sensing.

Information Theory
