Weight-guided class complementing for long-tailed image recognition

Real-world data often follow a long-tailed distribution and contain many classes. This characteristic leads to a significant performance drop for various models. One reason is the gradient shift caused by unsampled classes in each training iteration. In this paper, we propose a Weight-Guided Class Complementing framework to address this issue. Specifically, the framework first complements the unsampled classes in each training iteration using a dynamically updated data slot. Then, to counter the over-fitting induced by class complementing, we treat the classifier weights as learned knowledge and encourage the model to discover more class-specific characteristics. Finally, we design a weight refining scheme to deal with the long-tailed bias present in the classifier weights. Experimental results show that our framework can be integrated into different existing approaches, achieving consistent improvements on various benchmarks and new state-of-the-art performance.
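The abstract's first step, complementing unsampled classes with a dynamically updated data slot, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the class name `DataSlot`, the exponential-moving-average update, and the complementing rule are all assumptions made for exposition.

```python
import numpy as np

class DataSlot:
    """Hypothetical sketch of a per-class data slot: one stored feature
    per class, refreshed whenever that class appears in a batch, and used
    to complement classes missing from the current batch so that every
    class contributes a gradient in each iteration."""

    def __init__(self, num_classes, feat_dim, momentum=0.9):
        self.slots = np.zeros((num_classes, feat_dim))
        self.filled = np.zeros(num_classes, dtype=bool)  # which slots hold data
        self.momentum = momentum  # assumed EMA rate, not from the paper

    def update(self, feats, labels):
        # Refresh the slot of every class present in this batch.
        for f, y in zip(feats, labels):
            if self.filled[y]:
                self.slots[y] = (self.momentum * self.slots[y]
                                 + (1.0 - self.momentum) * f)
            else:
                self.slots[y] = f
                self.filled[y] = True

    def complement(self, feats, labels):
        # Append one stored feature for each class absent from the batch.
        present = set(labels)
        missing = [c for c in range(len(self.slots))
                   if c not in present and self.filled[c]]
        if not missing:
            return feats, list(labels)
        return np.vstack([feats, self.slots[missing]]), list(labels) + missing
```

A training loop would call `update` on each batch's features and then `complement` before computing the classification loss, so gradients flow to every classifier weight rather than only the sampled classes.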

| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
| --- | --- | --- | --- | --- | --- |
| Long-tail Learning | CIFAR-100-LT (ρ=100) | LDAM-DRW + WGCC | Error Rate | 56.4 | # 53 |
| Long-tail Learning | CIFAR-100-LT (ρ=100) | NCL* + WGCC (ensemble) | Error Rate | 44.9 | # 12 |
| Long-tail Learning | CIFAR-10-LT (ρ=100) | NCL* + WGCC (ensemble) | Error Rate | 15.4 | # 11 |
