Search Results for author: Sandra Fortini

Found 7 papers, 1 paper with code

Non-asymptotic approximations of Gaussian neural networks via second-order Poincaré inequalities

no code implementations • 8 Apr 2023 • Alberto Bordino, Stefano Favaro, Sandra Fortini

There is growing interest in the large-width asymptotic properties of Gaussian neural networks (NNs), namely NNs whose weights are initialized according to Gaussian distributions.
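A minimal sketch of the setting this abstract describes (not code from the paper): a one-hidden-layer NN with iid Gaussian weights and a 1/sqrt(n) output scaling, whose output becomes approximately Gaussian as the width n grows; the paper quantifies how fast that happens. The activation, input, and constants below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_nn_output(x, n, sigma_w=1.0, sigma_b=1.0):
    # f(x) = (1 / sqrt(n)) * sum_j w2_j * tanh(w1_j * x + b1_j),
    # with all parameters iid Gaussian at initialization.
    w1 = rng.normal(0.0, sigma_w, size=n)
    b1 = rng.normal(0.0, sigma_b, size=n)
    w2 = rng.normal(0.0, sigma_w, size=n)
    return (w2 * np.tanh(w1 * x + b1)).sum() / np.sqrt(n)

# For large n, the law of f(x) over re-initializations is close to Gaussian.
samples = np.array([gaussian_nn_output(0.5, n=2048) for _ in range(1000)])
print(samples.mean(), samples.std())
```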

Infinitely wide limits for deep Stable neural networks: sub-linear, linear and super-linear activation functions

no code implementations • 8 Apr 2023 • Alberto Bordino, Stefano Favaro, Sandra Fortini

As a novelty with respect to previous works, our results rely on the use of a generalized central limit theorem for heavy-tailed distributions, which allows for a unified treatment of infinitely wide limits for deep Stable NNs.
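A hedged numerical illustration of the generalized central limit theorem invoked here (a toy sketch, not the paper's construction): sums of iid heavy-tailed variables with tail index α < 2, scaled by n^{1/α}, converge to an α-Stable law rather than a Gaussian. The Pareto summands and all constants below are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import levy_stable, pareto

alpha, n, reps = 1.5, 2000, 1000
rng = np.random.default_rng(1)

# Symmetrized Pareto summands with tail index alpha (heavy-tailed, mean 0).
signs = rng.choice([-1.0, 1.0], size=(reps, n))
x = signs * pareto.rvs(alpha, size=(reps, n), random_state=rng)

# Generalized CLT scaling: n^(1/alpha) instead of the Gaussian sqrt(n).
scaled_sums = x.sum(axis=1) / n ** (1.0 / alpha)

# Reference draws from a symmetric alpha-Stable law; the quantiles match
# those of the scaled sums only up to an alpha-dependent scale constant
# omitted in this sketch.
ref = levy_stable.rvs(alpha, 0.0, size=reps, random_state=rng)
print(np.quantile(scaled_sums, [0.05, 0.5, 0.95]))
print(np.quantile(ref, [0.05, 0.5, 0.95]))
```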

Large-width asymptotics for ReLU neural networks with $\alpha$-Stable initializations

no code implementations • 16 Jun 2022 • Stefano Favaro, Sandra Fortini, Stefano Peluchetti

In contrast to the Gaussian setting, our result shows that the choice of the activation function affects the scaling of the NN: to achieve the infinitely wide $\alpha$-Stable process, the ReLU activation requires an additional logarithmic term in the scaling relative to sub-linear activations.

Regression
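An illustrative sketch of the scaling difference stated in this abstract (the precise statement and constants are in the paper; the two-layer network, input, and activation choices below are assumptions): with α-Stable weights, a sub-linear activation such as tanh uses an n^{1/α} normalization, while ReLU carries an extra logarithmic factor, roughly (n log n)^{1/α}.

```python
import numpy as np
from scipy.stats import levy_stable

alpha, n, x = 1.7, 4096, 0.3
rng = np.random.default_rng(2)
w = levy_stable.rvs(alpha, 0.0, size=n, random_state=rng)  # symmetric Stable
v = levy_stable.rvs(alpha, 0.0, size=n, random_state=rng)

# ReLU layer: normalization carries the additional logarithmic term.
out_relu = (v * np.maximum(w * x, 0.0)).sum() / (n * np.log(n)) ** (1.0 / alpha)

# Sub-linear activation (tanh): the plain n^(1/alpha) normalization suffices.
out_sub = (v * np.tanh(w * x)).sum() / n ** (1.0 / alpha)
print(out_relu, out_sub)
```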

Deep Stable neural networks: large-width asymptotics and convergence rates

no code implementations • 2 Aug 2021 • Stefano Favaro, Sandra Fortini, Stefano Peluchetti

Then, we establish sup-norm convergence rates of the rescaled deep Stable NN to the Stable SP, under both a "joint growth" and a "sequential growth" of the width over the NN's layers.

Bayesian Inference

Large-width functional asymptotics for deep Gaussian neural networks

no code implementations • ICLR 2021 • Daniele Bracale, Stefano Favaro, Sandra Fortini, Stefano Peluchetti

In this paper, we consider fully connected feed-forward deep neural networks where weights and biases are independent and identically distributed according to Gaussian distributions.

Gaussian Processes
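A minimal sketch of the architecture this abstract describes, under common assumptions that are not from the paper (tanh activation, 1/sqrt(fan-in) weight scaling): a fully connected feed-forward network whose weights and biases are iid Gaussian at initialization.

```python
import numpy as np

rng = np.random.default_rng(3)

def deep_gaussian_nn(x, widths, sigma_w=1.0, sigma_b=1.0):
    # Feed-forward pass with iid Gaussian weights and biases; the
    # 1 / sqrt(fan-in) scaling keeps a well-defined large-width limit.
    h = np.atleast_1d(np.asarray(x, dtype=float))
    for n_out in widths:
        n_in = h.shape[0]
        W = rng.normal(0.0, sigma_w / np.sqrt(n_in), size=(n_out, n_in))
        b = rng.normal(0.0, sigma_b, size=n_out)
        h = np.tanh(W @ h + b)
    return h

print(deep_gaussian_nn(0.7, widths=[512, 512, 1]))
```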

Infinite-channel deep stable convolutional neural networks

no code implementations • 7 Feb 2021 • Daniele Bracale, Stefano Favaro, Sandra Fortini, Stefano Peluchetti

The interplay between infinite-width neural networks (NNs) and classes of Gaussian processes (GPs) has been well known since the seminal work of Neal (1996).

Gaussian Processes

Stable behaviour of infinitely wide deep neural networks

1 code implementation • 1 Mar 2020 • Stefano Favaro, Sandra Fortini, Stefano Peluchetti

We consider fully connected feed-forward deep neural networks (NNs) whose weights and biases are independent and identically distributed according to symmetric centered Stable distributions.

Gaussian Processes
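A hedged sketch of the initialization described in this entry (the paper links its own implementation; this is not that code): weights and biases drawn iid from a symmetric, centered α-Stable law, i.e. beta = 0 and loc = 0 in scipy's parameterization, with an n^{-1/α} layer scaling. The activation and layer sizes are illustrative assumptions.

```python
import numpy as np
from scipy.stats import levy_stable

alpha = 1.8
rng = np.random.default_rng(4)

def stable_layer(h, n_out, alpha):
    n_in = h.shape[0]
    # Symmetric centered alpha-Stable weights and biases: beta=0, loc=0.
    W = levy_stable.rvs(alpha, 0.0, loc=0.0, size=(n_out, n_in), random_state=rng)
    b = levy_stable.rvs(alpha, 0.0, loc=0.0, size=n_out, random_state=rng)
    # n^(-1/alpha) scaling, the Stable analogue of the Gaussian 1/sqrt(n).
    return np.tanh(W @ h / n_in ** (1.0 / alpha) + b)

h = np.array([0.2, -0.5, 1.0])
for n_out in (256, 256, 1):
    h = stable_layer(h, n_out, alpha)
print(h)
```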
