The Majority Can Help The Minority: Context-rich Minority Oversampling for Long-tailed Classification

The problem with class-imbalanced data is that the generalization performance of the classifier deteriorates due to the lack of data from minority classes. In this paper, we propose a novel minority over-sampling method that augments diversified minority samples by leveraging the rich context of the majority classes. Our key idea is to paste an image from a minority class onto a rich-context image from a majority class, which serves as the background. Our method is simple and can be easily combined with existing long-tailed recognition methods. We empirically demonstrate the effectiveness of the proposed oversampling method through extensive experiments and ablation studies. Without any architectural changes or complex algorithms, our method achieves state-of-the-art performance on various long-tailed classification benchmarks. Our code is made available at https://github.com/naver-ai/cmo.
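The pasting step described above resembles CutMix: a random box from the minority-class image is copied onto the majority-class background, and the label is weighted by the pasted area. The sketch below is an illustrative reconstruction in NumPy, not the authors' implementation (see the linked repository for that); the function names `rand_bbox` and `cmo_mix` are our own.

```python
import numpy as np

def rand_bbox(h, w, lam, rng):
    # Sample a box covering roughly (1 - lam) of the image, CutMix-style.
    cut = np.sqrt(1.0 - lam)
    cut_h, cut_w = int(h * cut), int(w * cut)
    cy, cx = rng.integers(h), rng.integers(w)
    y1 = int(np.clip(cy - cut_h // 2, 0, h))
    y2 = int(np.clip(cy + cut_h // 2, 0, h))
    x1 = int(np.clip(cx - cut_w // 2, 0, w))
    x2 = int(np.clip(cx + cut_w // 2, 0, w))
    return y1, y2, x1, x2

def cmo_mix(bg_img, fg_img, alpha=1.0, rng=None):
    """Paste a random patch of the minority-class image (fg_img) onto the
    majority-class background (bg_img).  Returns the mixed image and the
    fraction of the image still attributed to the background label."""
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    h, w = bg_img.shape[:2]
    y1, y2, x1, x2 = rand_bbox(h, w, lam, rng)
    mixed = bg_img.copy()
    mixed[y1:y2, x1:x2] = fg_img[y1:y2, x1:x2]
    # Recompute lambda from the actual pasted area (clipping may shrink the box).
    lam = 1.0 - (y2 - y1) * (x2 - x1) / (h * w)
    return mixed, lam
```

In training, the mixed label would be `lam * y_background + (1 - lam) * y_minority`; the paper additionally biases foreground sampling toward minority classes so that rare classes are pasted more often.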

CVPR 2022

Results from the Paper


Task                 Dataset                 Model                     Metric          Value   Rank
Long-tail Learning   CIFAR-100-LT (ρ=100)    RIDE (3 experts) + CMO    Error Rate      50      #27
Long-tail Learning   CIFAR-100-LT (ρ=100)    CE-DRW                    Error Rate      58.9    #60
Long-tail Learning   CIFAR-100-LT (ρ=100)    Balanced Softmax + CMO    Error Rate      53.4    #38
Long-tail Learning   CIFAR-100-LT (ρ=100)    LDAM-DRW + CMO            Error Rate      52.8    #36
Long-tail Learning   ImageNet-LT             BS-CMO (ResNet-50)        Top-1 Accuracy  58.0    #20
Image Classification iNaturalist 2018        BS-CMO (ResNet-50)        Top-1 Accuracy  74.0%   #25

Methods


No methods listed for this paper.