Calibration of Model Uncertainty for Dropout Variational Inference

20 Jun 2020 · Max-Heinrich Laves, Sontje Ihler, Karl-Philipp Kortmann, Tobias Ortmaier

The model uncertainty obtained by variational Bayesian inference with Monte Carlo dropout is prone to miscalibration. In this paper, different logit scaling methods are extended to dropout variational inference to recalibrate model uncertainty...
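The paper covers several logit scaling variants; purely as an illustration, below is a minimal PyTorch sketch of one such variant, temperature scaling applied to the logits of each Monte Carlo dropout forward pass before the softmax and averaging. The function names `mc_dropout_predict` and `fit_temperature`, the sample counts, and the optimizer settings are assumptions for this example and are not taken from the paper.

```python
import torch
import torch.nn.functional as F


def mc_dropout_predict(model, x, num_samples=25, temperature=1.0):
    """Predictive distribution from MC dropout with temperature-scaled logits.

    Dropout stays active at test time; each stochastic forward pass's logits
    are divided by the temperature before the softmax, and the resulting
    probabilities are averaged.
    """
    model.train()  # keeps dropout layers stochastic at inference time
    with torch.no_grad():
        probs = [F.softmax(model(x) / temperature, dim=-1)
                 for _ in range(num_samples)]
    return torch.stack(probs).mean(dim=0)


def fit_temperature(model, val_loader, num_samples=25, steps=50, lr=0.01):
    """Fit a single scalar temperature on held-out data by minimizing the
    negative log-likelihood of the MC-averaged predictive distribution."""
    log_t = torch.zeros(1, requires_grad=True)  # optimize log T so T stays positive
    optimizer = torch.optim.Adam([log_t], lr=lr)
    model.train()  # dropout active, matching the MC sampling at test time
    for _ in range(steps):
        for x, y in val_loader:
            with torch.no_grad():  # network weights stay fixed
                logits = torch.stack([model(x) for _ in range(num_samples)])
            optimizer.zero_grad()
            probs = F.softmax(logits / log_t.exp(), dim=-1).mean(dim=0)
            loss = F.nll_loss(probs.clamp_min(1e-12).log(), y)
            loss.backward()
            optimizer.step()
    return log_t.exp().item()
```

In this sketch only the temperature is optimized on the held-out set, while the network weights are left untouched, which mirrors the general idea of post-hoc recalibration by logit scaling.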


Code

No code implementations yet.


Methods used in the Paper