1 code implementation • 25 Oct 2023 • Thomas Guilmeau, Nicola Branchini, Emilie Chouzenoux, Víctor Elvira
We then show that the $\alpha$-divergence can be approximated by a generalized notion of effective sample size and leverage this new perspective to adapt the tail parameter with Bayesian optimization.
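The abstract's notion of a generalized effective sample size can be illustrated with the Rényi-entropy-based family of ESS estimators, where order $\alpha=2$ recovers the classic $1/\sum \bar{w}_i^2$. This is a minimal sketch under that assumption; the function name and exact form are illustrative, not necessarily the paper's estimator.

```python
import math

def generalized_ess(weights, alpha=2.0):
    """ESS generalized via the Renyi entropy of the normalized weights.
    alpha=2 recovers the classic 1 / sum(w_i^2); alpha -> 1 gives the
    exponential of the Shannon entropy. (Illustrative sketch only.)"""
    total = sum(weights)
    wn = [w / total for w in weights]          # normalize the weights
    if alpha == 1.0:                           # limit case: exp(Shannon entropy)
        return math.exp(-sum(w * math.log(w) for w in wn if w > 0))
    return sum(w ** alpha for w in wn) ** (1.0 / (1.0 - alpha))

# Uniform weights give the full sample size for any alpha;
# a single dominant weight collapses the ESS toward 1.
print(generalized_ess([1.0] * 10))        # 10.0
print(generalized_ess([1.0, 0.0, 0.0]))   # 1.0
```

Because the whole family agrees on the uniform and degenerate extremes, the order $\alpha$ only changes how intermediate weight imbalance is penalized, which is what makes it a natural tuning target.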
no code implementations • 20 Jul 2023 • Víctor Elvira, Émilie Chouzenoux, Jordi Cerdà, Gustau Camps-Valls
Granger causality (GC) is often considered not to be an actual form of causality.
no code implementations • 20 Feb 2023 • Wenhan Li, Xiongjie Chen, Wenwu Wang, Víctor Elvira, Yunpeng Li
Differentiable particle filters are an emerging class of particle filtering methods that use neural networks to construct and learn parametric state-space models.
1 code implementation • 30 Sep 2022 • Oskar Kviman, Ricky Molén, Alexandra Hotti, Semih Kurt, Víctor Elvira, Jens Lagergren
In this work, we also demonstrate that increasing the number of mixture components improves the latent-representation capabilities of the VAE on both image and single-cell datasets.
no code implementations • 27 Sep 2022 • Ali Mousavi, Reza Monsefi, Víctor Elvira
Importance sampling (IS) is a powerful Monte Carlo (MC) methodology for approximating integrals, for instance in the context of Bayesian inference.
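The basic mechanism behind IS can be sketched with a self-normalized estimator: draw from a tractable proposal, reweight by the (unnormalized) target-to-proposal ratio, and average. The target, proposal, and test function below are illustrative choices, not taken from the paper.

```python
import math
import random

random.seed(0)

def log_target(x):                 # unnormalized target: N(0, 1)
    return -0.5 * x * x

def log_proposal(x, s=2.0):        # proposal density: N(0, s^2)
    return -0.5 * (x / s) ** 2 - math.log(s * math.sqrt(2 * math.pi))

N = 200_000
xs = [random.gauss(0.0, 2.0) for _ in range(N)]       # draws from the proposal
logw = [log_target(x) - log_proposal(x) for x in xs]  # log importance weights
m = max(logw)
w = [math.exp(lw - m) for lw in logw]                 # stabilized weights
# Self-normalized IS estimate of E[X^2] under the target (true value: 1)
est = sum(wi * x * x for wi, x in zip(w, xs)) / sum(w)
print(round(est, 3))
```

Subtracting the maximum log-weight before exponentiating avoids overflow and cancels in the self-normalized ratio, a standard trick when weights span many orders of magnitude.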
1 code implementation • 22 Feb 2022 • Oskar Kviman, Harald Melin, Hazal Koptagel, Víctor Elvira, Jens Lagergren
In variational inference (VI), the marginal log-likelihood is estimated using the standard evidence lower bound (ELBO) or improved versions such as the importance weighted ELBO (IWELBO).
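The ELBO/IWELBO relationship can be checked numerically on a toy conjugate model: by Jensen's inequality, the log of an average of importance weights dominates the average of their logs, group by group. The model, observation, and variational family below are illustrative assumptions.

```python
import math
import random

random.seed(1)

def log_norm(x, mu, var):
    return -0.5 * ((x - mu) ** 2 / var + math.log(2 * math.pi * var))

x_obs = 1.0
def log_joint(z):                  # log p(x, z) = log p(z) + log p(x | z)
    return log_norm(z, 0.0, 1.0) + log_norm(x_obs, z, 1.0)

q_mu, q_var = 0.3, 1.0             # a deliberately crude variational q(z)

M, K = 2000, 8                     # M groups of K importance samples each
logw = [[log_joint(z) - log_norm(z, q_mu, q_var)
         for z in (random.gauss(q_mu, math.sqrt(q_var)) for _ in range(K))]
        for _ in range(M)]

elbo = sum(lw for grp in logw for lw in grp) / (M * K)
iwelbo = sum(math.log(sum(math.exp(lw) for lw in grp) / K)
             for grp in logw) / M
print(elbo <= iwelbo)              # True: Jensen's inequality, group-wise
```

Reusing the same weights for both estimators makes the ordering hold pointwise, not just in expectation, which is a convenient sanity check when implementing IWELBO.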
no code implementations • 18 Jul 2021 • Luca Martino, Víctor Elvira, Javier López-Santiago, Gustau Camps-Valls
In many inference problems, the evaluation of complex and costly models is often required.
no code implementations • 18 Jul 2021 • Luca Martino, Víctor Elvira
In its basic version, C-MC is closely related to the stratification technique, a well-known method used for variance reduction.
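The variance-reduction effect of stratification is easy to demonstrate on a one-dimensional integral: partition the domain into equal strata and draw one uniform sample per stratum instead of sampling the whole domain i.i.d. The integrand below is an illustrative choice.

```python
import random

random.seed(2)
f = lambda x: x * x            # integrand; E[f(U)] = 1/3 for U ~ Uniform(0, 1)
N = 1000

# Plain Monte Carlo: N i.i.d. uniform draws over [0, 1].
plain = sum(f(random.random()) for _ in range(N)) / N

# Stratified sampling: one uniform draw inside each of N equal strata.
strat = sum(f((i + random.random()) / N) for i in range(N)) / N

# The stratified error is typically orders of magnitude smaller here.
print(abs(plain - 1/3), abs(strat - 1/3))
```

Within each narrow stratum the integrand is nearly constant, so the per-stratum variance shrinks with the stratum width; summing these small variances is what drives the overall error down.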
no code implementations • 18 Nov 2020 • Nicola Branchini, Víctor Elvira
In this work, we propose optimized auxiliary particle filters, a framework where the traditional APF auxiliary variables are interpreted as weights in an importance sampling mixture proposal.
no code implementations • 4 Dec 2018 • Ömer Deniz Akyildiz, Émilie Chouzenoux, Víctor Elvira, Joaquín Míguez
In this paper, we propose a probabilistic optimization method, named probabilistic incremental proximal gradient (PIPG) method, by developing a probabilistic interpretation of the incremental proximal gradient algorithm.
no code implementations • 11 Sep 2016 • Steven Van Vaerenbergh, Jesus Fernandez-Bes, Víctor Elvira
We study the relationship between online Gaussian process (GP) regression and kernel least mean squares (KLMS) algorithms.
no code implementations • 27 Jan 2015 • Jesus Fernandez-Bes, Víctor Elvira, Steven Van Vaerenbergh
We introduce a probabilistic approach to the LMS filter.
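For context, the classic deterministic LMS update that the paper reinterprets probabilistically is `w <- w + mu * e * x`. The sketch below is only this standard baseline identifying a toy FIR system; the system coefficients, step size, and function name are illustrative assumptions, and the paper's probabilistic version additionally maintains uncertainty over `w`.

```python
import random

random.seed(3)

def lms_identify(xs, ds, mu=0.05, order=2):
    """Classic LMS adaptive filter (deterministic baseline)."""
    w = [0.0] * order
    for n in range(order, len(xs)):
        x = xs[n - order:n][::-1]                 # most recent samples first
        y = sum(wi * xi for wi, xi in zip(w, x))  # filter output
        e = ds[n] - y                             # a-priori error
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]
    return w

# Identify the FIR system d[n] = 0.7*x[n-1] + 0.2*x[n-2].
xs = [random.gauss(0, 1) for _ in range(5000)]
ds = [0.0, 0.0] + [0.7 * xs[n - 1] + 0.2 * xs[n - 2]
                   for n in range(2, len(xs))]
w_hat = lms_identify(xs, ds)
print([round(wi, 2) for wi in w_hat])
```

With noiseless data and a step size well inside the stability region, the weight vector converges close to the true coefficients `[0.7, 0.2]`.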