Out-of-Distribution Detection based on In-Distribution Data Patterns Memorization with Modern Hopfield Energy

Out-of-Distribution (OOD) detection is essential for safety-critical applications of deep neural networks. OOD detection is challenging because DNN models may produce very high logits even for OOD samples, so directly applying Softmax to the output logits as a confidence score cannot reliably discriminate OOD data. Instead, we detect OOD samples with Hopfield energy in a store-then-compare paradigm. Specifically, penultimate-layer outputs on the training set are taken as representations of in-distribution (ID) data; they can be transformed into stored patterns that serve as anchors for measuring the discrepancy of unseen data in OOD detection. Starting from the energy function defined in the Modern Hopfield Network for the discrepancy-score calculation, we derive a simplified version, SHE, with theoretical analysis. In SHE, each class is represented by only one stored pattern, obtained by simply averaging the penultimate-layer outputs of the training samples within that class. SHE is hyperparameter-free and computationally efficient. Evaluations on nine widely used OOD datasets show the promising performance of this simple yet effective approach and its superiority over state-of-the-art models. Code is available at https://github.com/zjs975584714/SHE_ood_detection.
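To make the store-then-compare idea concrete, here is a minimal NumPy sketch. Per the abstract, each class's stored pattern is the mean of the penultimate-layer outputs of that class's training samples. The scoring rule below, which takes the inner product between a test feature and the stored pattern of its predicted class as a (negative-energy) ID score, is an assumption about the simplified energy and not a verbatim reproduction of the paper's method; function names are illustrative.

```python
import numpy as np

def class_prototypes(features, labels, num_classes):
    """One stored pattern per class: the mean penultimate-layer
    feature of the training samples belonging to that class."""
    return np.stack(
        [features[labels == c].mean(axis=0) for c in range(num_classes)]
    )

def she_score(feature, prototypes, predicted_class):
    """Assumed discrepancy score: inner product between the test
    feature and the stored pattern of its predicted class.
    Higher values suggest the sample is in-distribution."""
    return float(feature @ prototypes[predicted_class])

# Toy usage with 2-D features and two classes.
feats = np.array([[1.0, 0.0], [3.0, 0.0], [0.0, 2.0], [0.0, 4.0]])
labels = np.array([0, 0, 1, 1])
protos = class_prototypes(feats, labels, num_classes=2)  # [[2,0],[0,3]]

# An OOD sample far from both prototypes scores low; thresholding
# this score (e.g., to target FPR95) yields the OOD decision.
score_id = she_score(np.array([2.0, 0.0]), protos, predicted_class=0)
score_ood = she_score(np.array([-1.0, 0.0]), protos, predicted_class=0)
```

The appeal of this formulation is that no parameters are fit beyond the per-class means, which is what makes the approach hyperparameter-free and cheap at inference time.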

| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Out-of-Distribution Detection | ImageNet-1k vs iNaturalist | SHE | FPR95 | 34.22 | #14 |
| Out-of-Distribution Detection | ImageNet-1k vs Places | SHE | FPR95 | 45.35 | #11 |
| Out-of-Distribution Detection | ImageNet-1k vs SUN | SHE | FPR95 | 54.19 | #16 |
| Out-of-Distribution Detection | ImageNet-1k vs Textures | SHE | FPR95 | 45.09 | #17 |
