no code implementations • 2 Jan 2024 • Ioar Casado, Luis A. Ortega, Andrés R. Masegosa, Aritz Pérez
This result can be understood as a PAC-Bayesian version of the Cramér-Chernoff bound.
no code implementations • 2 Oct 2023 • Yijie Zhang, Yi-Shan Wu, Luis A. Ortega, Andrés R. Masegosa
The cold posterior effect (CPE) (Wenzel et al., 2020) in Bayesian deep learning shows that, for posteriors tempered with $T<1$, the resulting posterior predictive can perform better than that of the standard Bayesian posterior ($T=1$).
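The tempered posterior behind the CPE rescales the log-posterior by $1/T$, i.e. $p_T(\theta \mid D) \propto p(\theta \mid D)^{1/T}$. As a minimal illustration (not this paper's method), in a conjugate Normal model with known noise variance this tempering leaves the posterior mean unchanged and multiplies the posterior variance by $T$, so a cold posterior ($T<1$) is simply a sharper version of the Bayesian one:

```python
import numpy as np

def cold_posterior_normal(data, prior_mu=0.0, prior_var=1.0,
                          noise_var=1.0, T=1.0):
    """Tempered posterior over the mean of a Normal likelihood with known
    noise variance, under a conjugate Normal prior.

    Raising a Gaussian density to the power 1/T keeps it Gaussian:
    the mean is unchanged and the variance is multiplied by T, so
    T < 1 ("cold") concentrates the posterior around its mode.
    """
    n = len(data)
    # Standard conjugate Normal-Normal update (T = 1 case).
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mu = post_var * (prior_mu / prior_var + np.sum(data) / noise_var)
    # Tempering scales the variance by T; the mean stays the same.
    return post_mu, T * post_var

rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.0, size=50)
mu_warm, var_warm = cold_posterior_normal(data, T=1.0)
mu_cold, var_cold = cold_posterior_normal(data, T=0.1)
# Same mean, but the cold posterior is 10x more concentrated.
```

In deep learning the posterior is not Gaussian and must be sampled (e.g. with SGMCMC), but the same intuition applies: lowering $T$ concentrates mass around high-probability regions, which is what makes the observed performance gain at $T<1$ surprising from a strictly Bayesian viewpoint.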
no code implementations • 19 Jun 2023 • Andrés R. Masegosa, Luis A. Ortega
This paper introduces a distribution-dependent PAC-Chernoff bound that exhibits perfect tightness for interpolators, even within over-parameterized model classes.
1 code implementation • 24 Feb 2023 • Luis A. Ortega, Simón Rodríguez Santana, Daniel Hernández-Lobato
Specifically, its training cost is independent of the number of training points.
no code implementations • 21 Jul 2022 • Simón Rodríguez Santana, Luis A. Ortega, Daniel Hernández-Lobato, Bryan Zaldívar
Model selection in machine learning (ML) is a crucial part of the Bayesian learning procedure.
1 code implementation • 14 Jun 2022 • Luis A. Ortega, Simón Rodríguez Santana, Daniel Hernández-Lobato
This generalization is similar to that of deep GPs over GPs, but it is more flexible due to the use of IPs as the prior distribution over the latent functions.
1 code implementation • 26 Oct 2021 • Luis A. Ortega, Rafael Cabañas, Andrés R. Masegosa
In this work, we combine and expand previously published results in a theoretically sound framework that describes the relationship between diversity and ensemble performance for a wide range of ensemble methods.
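One classical instance of such a diversity-performance relationship (shown here as background, not as this paper's framework) is the ambiguity decomposition of Krogh & Vedelsby (1995) for averaging ensembles under squared loss: the ensemble error equals the average member error minus the average disagreement ("ambiguity") of the members with the ensemble, so greater diversity at fixed member error strictly helps.

```python
import numpy as np

def ambiguity_decomposition(preds, target):
    """Ambiguity decomposition for an averaging ensemble, squared loss:
        (y_bar - t)^2 = mean_i (y_i - t)^2 - mean_i (y_i - y_bar)^2
    i.e. ensemble error = average member error - diversity (ambiguity).
    """
    preds = np.asarray(preds, dtype=float)
    ens = preds.mean()                        # ensemble prediction
    avg_err = np.mean((preds - target) ** 2)  # average member error
    diversity = np.mean((preds - ens) ** 2)   # ambiguity term
    ens_err = (ens - target) ** 2             # ensemble error
    return ens_err, avg_err, diversity

# Three member predictions for a single example with target 2.5.
e, a, d = ambiguity_decomposition([1.0, 2.0, 4.0], target=2.5)
# The identity holds exactly: ensemble error = avg error - diversity.
```

The decomposition is exact for squared loss but does not transfer directly to other losses, which is one reason a unified theoretical treatment across ensemble methods is non-trivial.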