Search Results for author: Giulio Franzese

Found 10 papers, 5 papers with code

S$Ω$I: Score-based O-INFORMATION Estimation

1 code implementation • 8 Feb 2024 • Mustapha Bounoua, Giulio Franzese, Pietro Michiardi

The analysis of scientific data and complex multivariate systems requires information quantities that capture relationships among multiple random variables.

MINDE: Mutual Information Neural Diffusion Estimation

1 code implementation • 13 Oct 2023 • Giulio Franzese, Mustapha Bounoua, Pietro Michiardi

In this work we present a new method for the estimation of Mutual Information (MI) between random variables.

Mutual Information Estimation
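As a point of reference for what such estimators must recover (this sketch is generic and is not the MINDE method, which uses diffusion-based score models): for jointly Gaussian variables the mutual information has a closed form, and a Monte-Carlo average of the log density ratio log p(x, y) − log p(x)p(y) converges to the same value. All names and parameter choices below are illustrative.

```python
import math, random

def gaussian_mi(rho):
    # Closed-form MI (in nats) between two unit-variance Gaussians
    # with correlation coefficient rho: -0.5 * log(1 - rho^2).
    return -0.5 * math.log(1.0 - rho * rho)

# Monte-Carlo cross-check: average the pointwise log density ratio
# log p(x, y) - log p(x) p(y) over samples from the joint.
rng = random.Random(0)
rho, n = 0.8, 100_000
s = math.sqrt(1.0 - rho * rho)      # conditional std of y given x
total = 0.0
for _ in range(n):
    x = rng.gauss(0.0, 1.0)
    y = rho * x + s * rng.gauss(0.0, 1.0)          # correlated pair
    log_joint = (-0.5 * (x * x - 2 * rho * x * y + y * y) / (s * s)
                 - math.log(2 * math.pi * s))       # bivariate normal log-density
    log_marg = -0.5 * (x * x + y * y) - math.log(2 * math.pi)
    total += log_joint - log_marg
print(gaussian_mi(rho), total / n)  # both ≈ 0.51 nats
```

Closed-form cases like this are the standard sanity check for neural MI estimators, since ground truth is rarely available otherwise.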

Multi-modal Latent Diffusion

1 code implementation • 7 Jun 2023 • Mustapha Bounoua, Giulio Franzese, Pietro Michiardi

Multi-modal datasets are ubiquitous in modern applications, and multi-modal Variational Autoencoders are a popular family of models that aim to learn a joint representation of the different modalities.

Multimodal Generation

One-Line-of-Code Data Mollification Improves Optimization of Likelihood-based Generative Models

1 code implementation • NeurIPS 2023 • Ba-Hien Tran, Giulio Franzese, Pietro Michiardi, Maurizio Filippone

Generative Models (GMs) have attracted considerable attention due to their tremendous success in various domains, such as computer vision, where they are capable of generating impressively realistic images.

Density Estimation

Continuous-Time Functional Diffusion Processes

1 code implementation • NeurIPS 2023 • Giulio Franzese, Giulio Corallo, Simone Rossi, Markus Heinonen, Maurizio Filippone, Pietro Michiardi

We introduce Functional Diffusion Processes (FDPs), which generalize score-based diffusion models to infinite-dimensional function spaces.

Image Generation

How Much is Enough? A Study on Diffusion Times in Score-based Generative Models

no code implementations • 10 Jun 2022 • Giulio Franzese, Simone Rossi, Lixuan Yang, Alessandro Finamore, Dario Rossi, Maurizio Filippone, Pietro Michiardi

Score-based diffusion models are a class of generative models whose dynamics are described by stochastic differential equations that map noise into data.

Computational Efficiency
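The forward half of this mechanism can be shown in a few lines (a generic toy sketch, not the paper's contribution, which concerns how large the diffusion time needs to be; beta, T, and the step count below are arbitrary illustrative choices): the variance-preserving SDE dx = −0.5·β·x dt + √β dW, discretized with Euler–Maruyama, drives any data distribution toward a standard Gaussian.

```python
import math, random

# Forward VP-SDE, Euler-Maruyama discretization. Starting from a data
# point x0, repeated drift-plus-noise steps push the sample toward N(0, 1).
def diffuse(x0, beta=1.0, T=10.0, steps=500, rng=random.Random(0)):
    x, dt = x0, T / steps
    for _ in range(steps):
        x += -0.5 * beta * x * dt + math.sqrt(beta * dt) * rng.gauss(0.0, 1.0)
    return x

# "Data" concentrated at x0 = 4; after diffusing, the empirical
# distribution is close to a standard Gaussian.
samples = [diffuse(4.0) for _ in range(2000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)  # ≈ 0 and ≈ 1
```

Sampling then amounts to simulating the reverse-time SDE, which requires the score of the noised distribution; the question of how large T must be for the forward marginal to be close enough to the Gaussian prior is exactly what this paper studies.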

Revisiting the Effects of Stochasticity for Hamiltonian Samplers

no code implementations • 30 Jun 2021 • Giulio Franzese, Dimitrios Milios, Maurizio Filippone, Pietro Michiardi

We revisit the theoretical properties of Hamiltonian stochastic differential equations (SDEs) for Bayesian posterior sampling, and we study the two types of errors that arise from numerical SDE simulation: the discretization error, and the error due to noisy gradient estimates in the context of data subsampling.

Numerical Integration
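Both error sources are easy to exhibit on a toy target (a generic illustration, not the paper's Hamiltonian samplers: this uses unadjusted overdamped Langevin dynamics on N(0, 1), with artificial Gaussian perturbations standing in for subsampling noise; all parameters are arbitrary). A finite step size alone inflates the stationary variance above the true value of 1, and gradient noise inflates it further.

```python
import math, random

# Unadjusted Langevin dynamics targeting N(0, 1), whose score is -x.
# grad_noise adds zero-mean noise to the gradient, mimicking the effect
# of estimating it from a data subsample. Returns the empirical E[x^2],
# which equals 1 for the exact target.
def langevin_var(step=0.1, grad_noise=0.0, iters=200_000, rng=random.Random(0)):
    x, acc = 0.0, 0.0
    for _ in range(iters):
        grad = -x + grad_noise * rng.gauss(0.0, 1.0)
        x += step * grad + math.sqrt(2.0 * step) * rng.gauss(0.0, 1.0)
        acc += x * x
    return acc / iters

v_exact = langevin_var(grad_noise=0.0)  # ≈ 1.05: discretization error alone
v_noisy = langevin_var(grad_noise=1.0)  # larger: gradient-noise error on top
print(v_exact, v_noisy)
```

For this linear case the inflation is computable in closed form (the exact-gradient chain has stationary variance 1/(1 − step/2)), which is why such Gaussian targets are a common testbed for analyzing both error types.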

Isotropic SGD: a Practical Approach to Bayesian Posterior Sampling

no code implementations • 9 Jun 2020 • Giulio Franzese, Rosa Candela, Dimitrios Milios, Maurizio Filippone, Pietro Michiardi

In this work we define a unified mathematical framework to deepen our understanding of the role of stochastic gradient (SG) noise in the behavior of Markov chain Monte Carlo sampling (SGMCMC) algorithms.

Sparsification as a Remedy for Staleness in Distributed Asynchronous SGD

no code implementations • 21 Oct 2019 • Rosa Candela, Giulio Franzese, Maurizio Filippone, Pietro Michiardi

Large scale machine learning is increasingly relying on distributed optimization, whereby several machines contribute to the training process of a statistical model.

Distributed Optimization

Deep Information Networks

no code implementations • 6 Mar 2018 • Giulio Franzese, Monica Visintin

We describe a novel classifier with a tree structure, designed using information theory concepts.
