Long-Tailed Classification Based on Coarse-Grained Leading Forest and Multi-Center Loss

12 Oct 2023  ·  Jinye Yang, Ji Xu, Di Wu, Jianhang Tang, Shaobo Li, Guoyin Wang

Long-tailed (LT) classification is an unavoidable and challenging problem in the real world. Most existing long-tailed classification methods focus only on solving the class-wise imbalance while ignoring the attribute-wise imbalance, yet the deviation of a classification model is caused by both. Because attributes are implicit in most datasets and their combinations are complex, attribute-wise imbalance is more difficult to handle. To this end, we propose a novel long-tailed classification framework that builds a multi-granularity classification model by means of invariant feature learning. The framework first constructs a Coarse-Grained Leading Forest (CLF) in an unsupervised manner to better characterize the distribution of attributes within a class. Depending on the distribution of attributes, one can customize suitable sampling strategies to construct different imbalanced datasets. We then introduce a multi-center loss (MCL) that aims to gradually eliminate confusing attributes during the feature learning process. The proposed framework is not coupled to a specific LT classification model structure and can be integrated with any existing LT method as an independent component. Extensive experiments show that our approach achieves state-of-the-art performance on the existing benchmarks ImageNet-GLT and MSCOCO-GLT and can improve the performance of existing LT methods. Our code is available on GitHub: \url{https://github.com/jinyery/cognisance}
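
To make the multi-center loss idea concrete, below is a minimal sketch of one plausible formulation: each class keeps several learnable centers, and every feature is pulled toward the nearest center of its own class (a multi-center extension of the classic center loss). This is an illustration only; the class name `MultiCenterLoss`, the `num_centers` parameter, and the nearest-center formulation are assumptions, not the authors' actual implementation from the linked repository.

```python
import torch
import torch.nn as nn

class MultiCenterLoss(nn.Module):
    """Illustrative multi-center loss (not the paper's exact MCL):
    each class holds several learnable centers, and each sample's
    feature is attracted to the nearest center of its own class."""

    def __init__(self, num_classes: int, num_centers: int, feat_dim: int):
        super().__init__()
        # One set of centers per class: (num_classes, num_centers, feat_dim)
        self.centers = nn.Parameter(
            torch.randn(num_classes, num_centers, feat_dim))

    def forward(self, features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # features: (batch, feat_dim), labels: (batch,)
        class_centers = self.centers[labels]            # (batch, num_centers, feat_dim)
        # Distance from each feature to each of its class's centers
        dists = torch.cdist(features.unsqueeze(1), class_centers).squeeze(1)
        nearest, _ = dists.min(dim=1)                   # nearest intra-class center
        return nearest.pow(2).mean()

# Hypothetical usage: combine with a standard classification loss, e.g.
#   loss = ce_loss(logits, labels) + lambda_mcl * mcl(features, labels)
```

In such a formulation, pulling features toward per-class sub-centers (rather than a single class center) is what allows attribute-level variation within a class to be modeled, which is the role the paper assigns to MCL together with the CLF structure.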
