Imbalanced Image Classification with Complement Cross Entropy

4 Sep 2020 · Yechan Kim, Younkwan Lee, Moongu Jeon

Recently, deep learning models have achieved great success in computer vision applications, relying on large-scale, class-balanced datasets. However, imbalanced class distributions still limit the wide applicability of these models because of the performance degradation they cause. To address this problem, this paper focuses on cross entropy, which largely ignores output scores on incorrect classes. This work finds that neutralizing the predicted probabilities on incorrect classes improves prediction accuracy for imbalanced image classification. Based on this finding, the paper proposes a simple but effective loss named complement cross entropy. The proposed loss makes the ground-truth class overwhelm the other classes in terms of softmax probability by neutralizing the probabilities of incorrect classes, without any additional training procedure. This loss also helps models learn key information, especially from samples in minority classes, yielding more accurate and robust classification on imbalanced distributions. Extensive experiments on imbalanced datasets demonstrate the effectiveness of the proposed method.
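To make the idea concrete, below is a minimal PyTorch sketch of a loss in this spirit: standard cross entropy plus a balanced entropy term computed over the incorrect classes, so that probability mass on wrong classes is flattened ("neutralized") during training. The function name, the gamma default, and the epsilon constant are illustrative assumptions, not code from the paper.

import torch
import torch.nn.functional as F

def complement_cross_entropy(logits, targets, gamma=-1.0, eps=1e-7):
    """Sketch of a complement cross entropy-style loss.

    Combines standard cross entropy with an entropy term over the
    incorrect classes; with gamma < 0, minimizing the total loss
    maximizes that entropy, flattening the incorrect-class probabilities.
    """
    K = logits.size(1)                        # number of classes
    probs = F.softmax(logits, dim=1)          # predicted probabilities, (N, K)
    ce = F.cross_entropy(logits, targets)     # standard cross entropy

    # Probability assigned to the ground-truth class, per sample.
    p_true = probs.gather(1, targets.unsqueeze(1))        # shape (N, 1)

    # Renormalize the remaining probability mass over the incorrect classes.
    p_comp = probs / (1.0 - p_true + eps)

    # Zero out the ground-truth entry so only incorrect classes contribute.
    mask = torch.ones_like(probs).scatter_(1, targets.unsqueeze(1), 0.0)
    comp_entropy = -(mask * p_comp * torch.log(p_comp + eps)).sum(dim=1).mean()

    # Balance by the number of incorrect classes and weight by gamma.
    return ce + gamma * comp_entropy / (K - 1)

In use it is a drop-in replacement for cross entropy, e.g. loss = complement_cross_entropy(model(x), y); the 1/(K - 1) factor keeps the complement term on a comparable scale to the cross entropy term as the number of classes grows.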
