Out-of-Distribution Detection

328 papers with code • 51 benchmarks • 23 datasets

Detect out-of-distribution or anomalous examples.

Most implemented papers

SSD: A Unified Framework for Self-Supervised Outlier Detection

inspire-group/SSD ICLR 2021

We demonstrate that SSD outperforms most existing detectors based on unlabeled data by a large margin.

A Simple Fix to Mahalanobis Distance for Improving Near-OOD Detection

google/uncertainty-baselines 16 Jun 2021

Mahalanobis distance (MD) is a simple and popular post-processing method for detecting out-of-distribution (OOD) inputs in neural networks.
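The MD baseline the entry refers to fits one Gaussian per class in feature space with a shared covariance, and scores an input by its distance to the nearest class mean. A minimal sketch, with illustrative random features standing in for a network's penultimate-layer activations:

```python
import numpy as np

# Sketch of the class-conditional Mahalanobis OOD score with a tied
# covariance across classes. Feature arrays, labels, and dimensions
# here are illustrative placeholders, not from any real model.
rng = np.random.default_rng(0)
feats = rng.normal(size=(200, 8))       # in-distribution "training" features
labels = rng.integers(0, 4, size=200)   # 4 hypothetical classes

# Per-class means and one shared covariance from class-centered features
means = np.stack([feats[labels == c].mean(axis=0) for c in range(4)])
centered = feats - means[labels]
cov = centered.T @ centered / len(feats)
cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(cov.shape[0]))

def md_score(x):
    """Negative min Mahalanobis distance to any class mean (higher = more ID)."""
    diffs = x - means                   # (num_classes, dim)
    d2 = np.einsum('cd,de,ce->c', diffs, cov_inv, diffs)
    return -d2.min()

score = md_score(rng.normal(size=8))
```

A test input far from every class mean gets a large distance and hence a low score, which is what makes MD usable as a post-hoc detector without retraining.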

Natural Synthetic Anomalies for Self-Supervised Anomaly Detection and Localization

hmsch/natural-synthetic-anomalies 30 Sep 2021

We introduce a simple and intuitive self-supervision task, Natural Synthetic Anomalies (NSA), for training an end-to-end model for anomaly detection and localization using only normal training data.
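The core self-supervision idea is to manufacture anomalies from normal data by relocating image content, then train a model to detect them. A much-simplified sketch of the cut-and-paste step (NSA proper uses seamless Poisson blending and varied patch shapes; the plain paste and sizes below are illustrative):

```python
import numpy as np

# Simplified synthetic-anomaly generator: copy a random patch from one
# normal image into another at a random location. Patch size and image
# shapes are illustrative assumptions.
rng = np.random.default_rng(0)

def synthetic_anomaly(img_a, img_b, patch=8):
    out = img_b.copy()
    h, w = img_b.shape
    sy, sx = rng.integers(0, h - patch, 2)   # source corner in img_a
    dy, dx = rng.integers(0, h - patch, 2)   # destination corner in img_b
    out[dy:dy + patch, dx:dx + patch] = img_a[sy:sy + patch, sx:sx + patch]
    return out

a = rng.random((32, 32))
b = rng.random((32, 32))
anom = synthetic_anomaly(a, b)
```

Training on (normal, synthetic-anomaly) pairs gives the end-to-end detector a localization signal without ever seeing real defects.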

Generalized Out-of-Distribution Detection: A Survey

jingkang50/openood 21 Oct 2021

In this survey, we first present a unified framework called generalized OOD detection, which encompasses the five aforementioned problems, i.e., AD, ND, OSR, OOD detection, and OD.

Out of Distribution Detection via Neural Network Anchoring

llnl/amp 8 Jul 2022

Our goal in this paper is to exploit heteroscedastic temperature scaling as a calibration strategy for out of distribution (OOD) detection.
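Temperature scaling divides the logits by a temperature T before the softmax, which spreads out the confidences and, in ODIN-style detectors, sharpens the ID/OOD separation of the max-softmax score. A minimal sketch with a fixed T and made-up logits (the paper's heteroscedastic variant instead predicts a per-input temperature):

```python
import numpy as np

# Temperature-scaled maximum-softmax confidence. T and the logits are
# illustrative; a real detector would take logits from a trained network.
def max_softmax_score(logits, T=1000.0):
    z = logits / T
    z = z - z.max()                    # numerical stability
    p = np.exp(z) / np.exp(z).sum()
    return p.max()                     # higher = more in-distribution

score = max_softmax_score(np.array([5.0, 1.0, -2.0]))
```

Inputs are then flagged as OOD when this score falls below a threshold chosen on held-out in-distribution data.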

OpenOOD: Benchmarking Generalized Out-of-Distribution Detection

jingkang50/openood 13 Oct 2022

Out-of-distribution (OOD) detection is vital to safety-critical machine learning applications and has thus been extensively studied, with a plethora of methods developed in the literature.

Probabilistic MIMO U-Net: Efficient and Accurate Uncertainty Estimation for Pixel-wise Regression

antonbaumann/mimo-unet 14 Aug 2023

For that purpose, we adapted the U-Net architecture to train multiple subnetworks within a single model, harnessing the overparameterization in deep neural networks.

On Mixup Training: Improved Calibration and Predictive Uncertainty for Deep Neural Networks

paganpasta/onmixup NeurIPS 2019

In this work, we discuss a hitherto untouched aspect of mixup training -- the calibration and predictive uncertainty of models trained with mixup.
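Mixup trains on convex combinations of input pairs and their one-hot labels, with the mixing weight drawn from a Beta distribution; it is this label smoothing effect that the paper links to better calibration. A minimal sketch of the batch construction (shapes and alpha are illustrative):

```python
import numpy as np

# Sketch of mixup batch construction: lam ~ Beta(alpha, alpha), then mix
# each example with a randomly permuted partner. Shapes and alpha=0.2
# are illustrative assumptions.
rng = np.random.default_rng(0)

def mixup_batch(x, y_onehot, alpha=0.2):
    lam = rng.beta(alpha, alpha)
    perm = rng.permutation(len(x))
    x_mix = lam * x + (1 - lam) * x[perm]
    y_mix = lam * y_onehot + (1 - lam) * y_onehot[perm]
    return x_mix, y_mix

x = rng.normal(size=(16, 32))
y = np.eye(10)[rng.integers(0, 10, size=16)]
xm, ym = mixup_batch(x, y)
```

The mixed labels remain valid probability vectors (rows still sum to one), so the usual cross-entropy loss applies unchanged.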

Detecting Underspecification with Local Ensembles

dmadras/local-ensembles ICLR 2020

We present local ensembles, a method for detecting underspecification -- when many possible predictors are consistent with the training data and model class -- at test time in a pre-trained model.

Generalized ODIN: Detecting Out-of-distribution Image without Learning from Out-of-distribution Data

sayakpaul/Generalized-ODIN-TF CVPR 2020

Deep neural networks have attained remarkable performance when applied to data that comes from the same distribution as that of the training set, but can significantly degrade otherwise.