no code implementations • 13 Feb 2024 • Xuexin Chen, Ruichu Cai, Zhengting Huang, Yuxuan Zhu, Julien Horwood, Zhifeng Hao, Zijian Li, Jose Miguel Hernandez-Lobato
We investigate the problem of explainability in machine learning. To address this problem, Feature Attribution Methods (FAMs) measure the contribution of each feature through a perturbation test, in which differences in prediction are compared under different perturbations. However, such perturbation tests may fail to distinguish the contributions of different features when their changes in prediction after perturbation are the same. To enhance the ability of FAMs to distinguish different features' contributions in this challenging setting, we propose using the probability (PNS) that perturbing a feature is a necessary and sufficient cause of the prediction change as a measure of feature importance. Our approach, Feature Attribution with Necessity and Sufficiency (FANS), computes the PNS via a two-stage (factual and interventional) perturbation test. In practice, to generate counterfactual samples, we use a resampling-based approach on the observed samples to approximate the required conditional distribution. Finally, we combine FANS with gradient-based optimization to extract the subset with the largest PNS. We demonstrate that FANS outperforms existing feature attribution methods on six benchmarks.
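Since this snippet walks through the method's ingredients (perturbation test, PNS, resampling), the general idea can be illustrated with a toy Monte Carlo score. This is a hypothetical sketch only: the names (`pns_score`, `predict`, `pool`) are invented, and it uses crude sufficiency/necessity proxies rather than FANS's actual two-stage factual/interventional test or its conditional-distribution approximation.

```python
import numpy as np

def pns_score(predict, x, subset, pool, n_draws=200, rng=None):
    """Crude Monte Carlo proxy for a PNS-style importance score of a
    feature subset.  `predict` maps a batch of inputs to class labels,
    `x` is the 1-D sample being explained, and `pool` holds observed
    samples used for resampling-based perturbations."""
    rng = np.random.default_rng(rng)
    base = predict(x[None])[0]
    comp = np.setdiff1d(np.arange(x.size), subset)

    # "Sufficiency" proxy: how often does resampling the subset's values
    # from the observed pool flip the prediction?
    xs = np.tile(x, (n_draws, 1))
    xs[:, subset] = pool[rng.integers(len(pool), size=n_draws)][:, subset]
    p_suf = np.mean(predict(xs) != base)

    # "Necessity" proxy: how often does the prediction survive when the
    # complement is perturbed while the subset is held fixed?
    xn = np.tile(x, (n_draws, 1))
    xn[:, comp] = pool[rng.integers(len(pool), size=n_draws)][:, comp]
    p_nec = np.mean(predict(xn) == base)

    return p_suf * p_nec
```

On a toy classifier that depends only on feature 0, the subset `[0]` scores higher than `[1]`, matching the intuition that perturbing a causally relevant subset is both necessary and sufficient for the prediction to change. Extracting the best subset by gradient-based optimization, as the paper does, would require a differentiable relaxation of this score.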
no code implementations • 23 Aug 2023 • Richard Bergna, Felix Opolka, Pietro Liò, Jose Miguel Hernandez-Lobato
We present Graph Neural Stochastic Differential Equations (Graph Neural SDEs), a novel model class.
no code implementations • 12 Mar 2020 • Zichao Wang, Sebastian Tschiatschek, Simon Woodhead, Jose Miguel Hernandez-Lobato, Simon Peyton Jones, Richard G. Baraniuk, Cheng Zhang
Online education platforms enable teachers to share a large number of educational resources such as questions to form exercises and quizzes for students.
no code implementations • AABI (Advances in Approximate Bayesian Inference) Symposium 2019 • Chao Ma, Sebastian Tschiatschek, Yingzhen Li, Richard Turner, Jose Miguel Hernandez-Lobato, Cheng Zhang
In this paper, we focus on improving VAEs for real-valued data with heterogeneous marginal distributions.
no code implementations • CVPR 2016 • Viktoriia Sharmanska, Daniel Hernandez-Lobato, Jose Miguel Hernandez-Lobato, Novi Quadrianto
On the technical side, we propose a framework to incorporate annotation disagreements into the classifiers.
no code implementations • NeurIPS 2015 • Yingzhen Li, Jose Miguel Hernandez-Lobato, Richard E. Turner
Expectation propagation (EP) is a deterministic approximation algorithm that is often used to perform approximate Bayesian parameter learning.
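The snippet only names EP; as a concrete illustration (not code from this paper), the core EP step is moment matching: each exact factor is replaced by a Gaussian "site" chosen so the approximation matches the moments of the "tilted" distribution (cavity times exact factor). A minimal 1-D sketch, computing the tilted moments by numerical integration on a grid:

```python
import numpy as np
from scipy.stats import norm

def ep_1d(prior_mean, prior_var, factors, iters=20):
    """Toy 1-D EP: approximate p(theta) proportional to
    N(theta; prior_mean, prior_var) * prod_i factors[i](theta)
    by a Gaussian, with one Gaussian site per factor."""
    grid = np.linspace(-10.0, 10.0, 4001)
    dx = grid[1] - grid[0]
    n = len(factors)
    tau = np.zeros(n)   # site precisions
    nu = np.zeros(n)    # site precision-times-mean
    p_tau, p_nu = 1.0 / prior_var, prior_mean / prior_var
    for _ in range(iters):
        for i in range(n):
            # Cavity: divide site i out of the current approximation.
            cav_tau = p_tau + tau.sum() - tau[i]
            cav_nu = p_nu + nu.sum() - nu[i]
            cav_mean, cav_sd = cav_nu / cav_tau, np.sqrt(1.0 / cav_tau)
            # Tilted distribution: cavity Gaussian times the exact factor.
            w = norm.pdf(grid, cav_mean, cav_sd) * factors[i](grid)
            Z = w.sum() * dx
            m = (grid * w).sum() * dx / Z
            v = ((grid - m) ** 2 * w).sum() * dx / Z
            # Moment matching: set the approximation to the tilted moments,
            # then recover site i by dividing out the cavity.
            tau[i] = 1.0 / v - cav_tau
            nu[i] = m / v - cav_nu
    post_tau = p_tau + tau.sum()
    return (p_nu + nu.sum()) / post_tau, 1.0 / post_tau
```

With a N(0, 1) prior and a single probit factor Φ(θ), the matched mean agrees with the analytic value φ(0) / (Φ(0)·√2) ≈ 0.564, since EP with one site is exact in the first two moments.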