Rethinking Softmax with Cross-Entropy: Neural Network Classifier as Mutual Information Estimator

25 Nov 2019 · Zhenyue Qin, Dongwoo Kim, Tom Gedeon

Mutual information is widely applied to learn latent representations of observations, whilst its implication in classification neural networks remains to be better explained. We show that optimising the parameters of classification neural networks with softmax cross-entropy is equivalent to maximising the mutual information between inputs and labels under the balanced data assumption...
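The leaderboard below lists PC-Softmax (probability-corrected softmax) as the paper's classifier for imbalanced data. As an illustrative sketch, not the paper's exact formulation, one common way to realise such a correction is to divide each class's exponentiated logit by its empirical class prior (equivalently, subtract the log-prior from the logit) before normalising, so that frequent classes no longer dominate the prediction:

```python
import numpy as np

def softmax(z):
    # Standard softmax with the usual max-shift for numerical stability.
    e = np.exp(z - z.max())
    return e / e.sum()

def pc_softmax(z, prior):
    # Illustrative probability-corrected softmax: divide each class's
    # exponentiated logit by its empirical prior p(y) before normalising.
    # This is equivalent to subtracting log p(y) from the logits and is
    # an assumption about the correction, not the paper's exact recipe.
    e = np.exp(z - z.max()) / prior
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.5])
prior = np.array([0.7, 0.2, 0.1])   # hypothetical imbalanced class frequencies

print(softmax(logits).round(3))
print(pc_softmax(logits, prior).round(3))
```

Relative to the standard softmax, the correction boosts the predicted probability of rare classes (here class 2, with prior 0.1) and suppresses the majority class.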

| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Weakly-Supervised Object Localization | CUB-200-2011 | InfoCAM | Top-1 Error Rate | 54.17 | #3 |
| Weakly-Supervised Object Localization | CUB-200-2011 | InfoCAM | Top-1 Localization Accuracy | 55.83 | #1 |
| Fine-Grained Image Classification | Imbalanced CUB-200-2011 | PC-Softmax | Average Per-Class Accuracy | 87.69 | #1 |
| Fine-Grained Image Classification | Imbalanced CUB-200-2011 | PC-Softmax | Accuracy | 89.73 | #1 |
| Image Classification | Imbalanced CUB-200-2011 | PC-Softmax | Average Per-Class Accuracy | 87.69 | #1 |
| Image Classification | Imbalanced CUB-200-2011 | PC-Softmax | Accuracy | 89.73 | #1 |
| Weakly-Supervised Object Localization | Tiny ImageNet | InfoCAM | Top-1 Localization Accuracy | 43.34 | #1 |
