no code implementations • 23 Feb 2024 • Julien Zhou, Pierre Gaillard, Thibaud Rahier, Houssam Zenati, Julyan Arbel
We address the problem of stochastic combinatorial semi-bandits, where a player can select from P subsets of a set containing d base items.
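The semi-bandit setting can be illustrated with a minimal toy simulation (the action set, sizes, and noise model below are illustrative assumptions, not taken from the paper): the player picks one of P fixed subsets of d base items and observes a noisy reward for every selected item, not just the subset's total.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5  # number of base items (illustrative)
# Action set: P = 3 subsets of the d base items, as 0/1 indicator vectors.
actions = np.array([[1, 1, 0, 0, 0],
                    [0, 0, 1, 1, 0],
                    [1, 0, 1, 0, 1]])
mu = rng.uniform(size=d)  # unknown mean reward of each base item

def play(action):
    """Semi-bandit feedback: a noisy reward for every selected item,
    zeros for unselected ones."""
    noise = rng.normal(scale=0.1, size=d)
    return action * (mu + noise)

feedback = play(actions[0])
reward = feedback.sum()  # the player's reward is the sum over the subset
```

The per-item feedback (as opposed to observing only `reward`) is what distinguishes semi-bandits from full bandit feedback.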
no code implementations • 1 Feb 2024 • Theodore Papamarkou, Maria Skoularidou, Konstantina Palla, Laurence Aitchison, Julyan Arbel, David Dunson, Maurizio Filippone, Vincent Fortuin, Philipp Hennig, Jose Miguel Hernandez Lobato, Aliaksandr Hubin, Alexander Immer, Theofanis Karaletsos, Mohammad Emtiyaz Khan, Agustinus Kristiadi, Yingzhen Li, Stephan Mandt, Christopher Nemeth, Michael A. Osborne, Tim G. J. Rudner, David Rügamer, Yee Whye Teh, Max Welling, Andrew Gordon Wilson, Ruqi Zhang
In the current landscape of deep learning research, there is a predominant emphasis on achieving high predictive accuracy in supervised tasks involving large image and language datasets.
no code implementations • 20 Nov 2023 • Minh Tri Lê, Pierre Wolinski, Julyan Arbel
It then explores MEMS-based applications on ultra-low power MCUs, highlighting their potential for enabling TinyML on resource-constrained devices.
no code implementations • 4 Oct 2023 • Konstantinos Pitas, Julyan Arbel
We present a method to improve the calibration of deep ensembles in the small training data regime in the presence of unlabeled data.
1 code implementation • 28 Sep 2023 • Julyan Arbel, Konstantinos Pitas, Mariia Vladimirova, Vincent Fortuin
Neural networks have achieved remarkable performance across various problem domains, but their widespread applicability is hindered by inherent limitations such as overconfidence in predictions, lack of interpretability, and vulnerability to adversarial attacks.
no code implementations • 11 Sep 2023 • Konstantinos Pitas, Julyan Arbel
Contrary to previous results, we first show that for realistic models and datasets, and for the tightly controlled case of the Laplace approximation to the posterior, stochasticity does not in general improve test accuracy.
1 code implementation • 14 Jun 2023 • Daria Bystrova, Charles K. Assaad, Julyan Arbel, Emilie Devijver, Eric Gaussier, Wilfried Thuiller
In the second class, a constraint-based strategy is applied to identify a skeleton, which is then oriented using a noise-based strategy.
no code implementations • 22 Jun 2022 • Konstantinos Pitas, Julyan Arbel
We investigate the cold posterior effect through the lens of PAC-Bayes generalization bounds.
1 code implementation • 24 May 2022 • Pierre Wolinski, Julyan Arbel
The study of feature propagation at initialization in neural networks lies at the root of numerous initialization designs.
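A small sketch of what "feature propagation at initialization" means in practice (the widths, depth, and scales below are illustrative assumptions): push a random input through a randomly initialized ReLU network and track the per-layer activation variance, which initialization designs aim to keep stable with depth.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_variances(widths, scale):
    """Propagate a standard-normal input through a random ReLU network
    at initialization, recording each layer's activation variance."""
    x = rng.normal(size=(widths[0],))
    variances = []
    for n_in, n_out in zip(widths[:-1], widths[1:]):
        W = rng.normal(scale=scale / np.sqrt(n_in), size=(n_out, n_in))
        x = np.maximum(W @ x, 0.0)  # ReLU activation
        variances.append(x.var())
    return variances

# He-style initialization (scale = sqrt(2)) roughly preserves the variance;
# a smaller scale makes activations shrink geometrically with depth.
stable = forward_variances([512] * 10, scale=np.sqrt(2))
vanishing = forward_variances([512] * 10, scale=1.0)
```

Comparing `stable` and `vanishing` across the nine layers shows the exponential decay that poorly scaled initializations induce.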
no code implementations • 29 Nov 2021 • Mariia Vladimirova, Julyan Arbel, Stéphane Girard
The connection between Bayesian neural networks and Gaussian processes gained a lot of attention in the last few years, with the flagship result that hidden units converge to a Gaussian process limit when the layers' widths tend to infinity.

no code implementations • 6 Oct 2021 • Mariia Vladimirova, Julyan Arbel, Stéphane Girard
The connection between Bayesian neural networks and Gaussian processes gained a lot of attention in the last few years.
2 code implementations • AABI Symposium 2021 • Daria Bystrova, Julyan Arbel, Guillaume Kon Kam King, François Deslandes
In Bayesian nonparametrics, knowledge of the prior distribution induced on the number of clusters is key for prior specification and calibration.
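This induced prior can be explored by simulation. As a hedged sketch (the Dirichlet process with the Chinese restaurant process representation is a standard example, not necessarily the models treated in the paper), the following draws the number of clusters among n observations and compares its mean to the known approximation alpha * log(n):

```python
import numpy as np

def crp_num_clusters(n, alpha, rng):
    """Sample the number of clusters among n observations under a
    Dirichlet process prior, via the Chinese restaurant process."""
    counts = []  # current cluster sizes
    for _ in range(n):
        # Join an existing cluster proportionally to its size,
        # or open a new one proportionally to alpha.
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):
            counts.append(1)  # new cluster
        else:
            counts[k] += 1
    return len(counts)

rng = np.random.default_rng(0)
draws = [crp_num_clusters(100, alpha=1.0, rng=rng) for _ in range(500)]
# The prior mean number of clusters grows like alpha * log(n):
# roughly 5 for n = 100 and alpha = 1.
```

Inspecting the histogram of `draws` is exactly the kind of prior calibration the snippet refers to: one tunes alpha until the induced distribution on the number of clusters matches prior knowledge.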
1 code implementation • 14 May 2019 • Hien D. Nguyen, Julyan Arbel, Hongliang Lü, Florence Forbes
Furthermore, we propose a consistent V-statistic estimator of the energy statistic, under which we show that the large-sample result holds, and we prove that the rejection ABC algorithm based on the energy statistic generates pseudo-posterior distributions that converge to the correct limits when implemented, in the finite-sample setting, with rejection thresholds that converge to zero.
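The mechanism can be sketched in a few lines (a toy illustration under assumed choices: a one-dimensional Gaussian model, a flat prior, and an arbitrary fixed threshold, none of which come from the paper): compute the plug-in energy statistic between observed and simulated samples, and keep the parameter draws whose statistic falls below the threshold.

```python
import numpy as np

def energy_distance(x, y):
    """Plug-in (V-statistic) energy statistic between two 1-D samples."""
    a = np.abs(x[:, None] - y[None, :]).mean()  # cross term
    b = np.abs(x[:, None] - x[None, :]).mean()  # within-x term
    c = np.abs(y[:, None] - y[None, :]).mean()  # within-y term
    return 2 * a - b - c

rng = np.random.default_rng(0)
observed = rng.normal(loc=1.0, size=100)  # data with unknown mean

accepted = []
for _ in range(1000):
    theta = rng.uniform(-5, 5)  # draw from a flat prior
    simulated = rng.normal(loc=theta, size=100)
    if energy_distance(observed, simulated) < 0.1:  # rejection step
        accepted.append(theta)
# accepted draws approximate the pseudo-posterior; shrinking the
# threshold toward zero concentrates it, as the convergence result states.
```

The accepted `theta` values cluster around the true mean, and the paper's result concerns exactly this scheme as the threshold tends to zero.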
no code implementations • 11 Oct 2018 • Mariia Vladimirova, Jakob Verbeek, Pablo Mesejo, Julyan Arbel
We investigate deep Bayesian neural networks with Gaussian weight priors and a class of ReLU-like nonlinearities.
no code implementations • 8 Jun 2016 • Julyan Arbel, Igor Prünster
Completely random measures (CRM) represent the key building block of a wide variety of popular stochastic models and play a pivotal role in modern Bayesian Nonparametrics.