1 code implementation • 8 Feb 2024 • Mustapha Bounoua, Giulio Franzese, Pietro Michiardi
The analysis of scientific data and complex multivariate systems requires information quantities that capture relationships among multiple random variables.
1 code implementation • 13 Oct 2023 • Giulio Franzese, Mustapha Bounoua, Pietro Michiardi
In this work we present a new method for the estimation of Mutual Information (MI) between random variables.
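The paper's estimator itself is not reproduced here; as background, the sketch below computes MI directly from a known discrete joint distribution with the textbook plug-in formula, which is what any learned estimator is approximating.

```python
import math

def mutual_information(joint):
    """Plug-in MI estimate (in nats) from a discrete joint distribution.

    joint[x][y] holds P(X=x, Y=y); entries must sum to 1.
    """
    px = [sum(row) for row in joint]               # marginal of X
    py = [sum(col) for col in zip(*joint)]         # marginal of Y
    mi = 0.0
    for x, row in enumerate(joint):
        for y, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log(pxy / (px[x] * py[y]))
    return mi

# Perfectly correlated binary variables: MI = H(X) = log 2 ≈ 0.693 nats.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))
```

For continuous or high-dimensional variables this exact computation is intractable, which is precisely the regime the neural estimator of the paper targets.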
1 code implementation • 7 Jun 2023 • Mustapha Bounoua, Giulio Franzese, Pietro Michiardi
Multi-modal datasets are ubiquitous in modern applications, and multi-modal Variational Autoencoders are a popular family of models that aim to learn a joint representation of the different modalities.

1 code implementation • NeurIPS 2023 • Ba-Hien Tran, Giulio Franzese, Pietro Michiardi, Maurizio Filippone
Generative Models (GMs) have attracted considerable attention due to their tremendous success in various domains, such as computer vision, where they are capable of generating impressively realistic images.
1 code implementation • NeurIPS 2023 • Giulio Franzese, Giulio Corallo, Simone Rossi, Markus Heinonen, Maurizio Filippone, Pietro Michiardi
We introduce Functional Diffusion Processes (FDPs), which generalize score-based diffusion models to infinite-dimensional function spaces.
Ranked #21 on Image Generation on CelebA 64x64
no code implementations • 10 Jun 2022 • Giulio Franzese, Simone Rossi, Lixuan Yang, Alessandro Finamore, Dario Rossi, Maurizio Filippone, Pietro Michiardi
Score-based diffusion models are a class of generative models whose dynamics are described by stochastic differential equations that map noise into data.
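To make the noise-to-data mapping concrete, here is a minimal Euler-Maruyama simulation of the standard variance-preserving forward SDE, dx = -½βx dt + √β dW, which progressively maps a data point toward a standard Gaussian (the generative model learns to reverse this process; that reverse dynamics is not shown here, and the constant β schedule is a simplifying assumption).

```python
import math
import random

def vp_sde_forward(x0, beta=0.5, dt=1e-3, steps=1000, rng=None):
    """Euler-Maruyama simulation of the variance-preserving forward SDE
    dx = -0.5*beta*x dt + sqrt(beta) dW over time T = steps * dt."""
    rng = rng or random.Random(0)
    x = x0
    for _ in range(steps):
        x += -0.5 * beta * x * dt + math.sqrt(beta * dt) * rng.gauss(0.0, 1.0)
    return x

# After time T = 1, the mean of x(T) contracts to x0 * exp(-0.5 * beta * T).
samples = [vp_sde_forward(3.0, rng=random.Random(s)) for s in range(2000)]
mean = sum(samples) / len(samples)
```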
no code implementations • 30 Jun 2021 • Giulio Franzese, Dimitrios Milios, Maurizio Filippone, Pietro Michiardi
We revisit the theoretical properties of Hamiltonian stochastic differential equations (SDEs) for Bayesian posterior sampling, and we study the two types of errors that arise from numerical SDE simulation: the discretization error and the error due to noisy gradient estimates in the context of data subsampling.
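The discretization error studied in the paper originates in integrators like the one sketched below: a leapfrog step for deterministic Hamiltonian dynamics on a standard Gaussian target (the paper's stochastic-gradient and friction terms are omitted). For this integrator the Hamiltonian H = U(q) + p²/2 is only approximately conserved, with an error of order step².

```python
def leapfrog(q, p, grad_U, step, n_steps):
    """Leapfrog integration of dq/dt = p, dp/dt = -grad U(q)."""
    p -= 0.5 * step * grad_U(q)          # initial half-step for momentum
    for _ in range(n_steps - 1):
        q += step * p                    # full position step
        p -= step * grad_U(q)            # full momentum step
    q += step * p
    p -= 0.5 * step * grad_U(q)          # final half-step for momentum
    return q, p

# Standard Gaussian target: U(q) = q^2 / 2, so grad U(q) = q.
q1, p1 = leapfrog(1.0, 0.0, lambda q: q, step=0.1, n_steps=63)
energy_drift = abs((q1 ** 2 + p1 ** 2) / 2 - 0.5)   # initial H = 0.5
```

Shrinking the step size reduces this drift quadratically, which is the trade-off against computational cost that the error analysis makes precise.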
no code implementations • 9 Jun 2020 • Giulio Franzese, Rosa Candela, Dimitrios Milios, Maurizio Filippone, Pietro Michiardi
In this work we define a unified mathematical framework to deepen our understanding of the role of stochastic gradient (SG) noise on the behavior of Markov chain Monte Carlo sampling (SGMCMC) algorithms.
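As a minimal illustration of the SGMCMC family analyzed here, the sketch below runs stochastic gradient Langevin dynamics (SGLD) on a one-dimensional Gaussian target. For simplicity the full gradient is used in place of a minibatch estimate, so this isolates the injected-noise mechanism rather than the SG noise the paper studies.

```python
import math
import random

def sgld_step(theta, grad_log_post, step, rng):
    """One SGLD update: a gradient step on the log-posterior plus
    Gaussian noise with variance 2 * step."""
    return theta + step * grad_log_post(theta) + math.sqrt(2 * step) * rng.gauss(0.0, 1.0)

rng = random.Random(0)
grad = lambda t: -(t - 2.0)            # log-gradient of an N(2, 1) target
theta, samples = 0.0, []
for i in range(50000):
    theta = sgld_step(theta, grad, step=0.01, rng=rng)
    if i >= 5000:                      # discard burn-in
        samples.append(theta)
posterior_mean = sum(samples) / len(samples)
```

With a constant step size the chain samples from a slightly biased approximation of the target; characterizing such biases under stochastic gradients is the kind of question the unified framework addresses.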
no code implementations • 21 Oct 2019 • Rosa Candela, Giulio Franzese, Maurizio Filippone, Pietro Michiardi
Large scale machine learning is increasingly relying on distributed optimization, whereby several machines contribute to the training process of a statistical model.
no code implementations • 6 Mar 2018 • Giulio Franzese, Monica Visintin
We describe a novel classifier with a tree structure, designed using information theory concepts.
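The specific tree construction of the paper is not reproduced here, but the core information-theoretic ingredient for choosing a split is information gain, i.e. the entropy reduction a feature yields over the class labels:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(features, labels):
    """H(labels) minus the entropy of labels conditioned on a
    categorical feature: the split criterion of ID3-style trees."""
    n = len(labels)
    groups = {}
    for f, y in zip(features, labels):
        groups.setdefault(f, []).append(y)
    conditional = sum(len(ys) / n * entropy(ys) for ys in groups.values())
    return entropy(labels) - conditional

# A feature that perfectly predicts a balanced binary label gains 1 bit.
print(information_gain(['a', 'a', 'b', 'b'], [0, 0, 1, 1]))
```

A greedy tree builder applies this criterion recursively, splitting each node on the feature with the largest gain.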