Contraction Mapping of Feature Norms for Classifier Learning on the Data with Different Quality

27 Jul 2020 · Weihua Liu, Xiabi Liu, Murong Wang, Ling Ma

The popular softmax loss and its recent extensions have achieved great success in deep learning-based image classification. However, the data used to train image classifiers often varies in quality, and ignoring this issue makes low-quality samples hard to classify correctly. In this paper, we discover a positive correlation between the feature norm of an image and its quality through careful experiments on various applications and various deep neural networks. Based on this finding, we propose a contraction mapping function that compresses the range of feature norms of training images according to their quality, and we embed this function into the softmax loss or its extensions to produce novel learning objectives. Experiments on various classification applications, including handwritten digit recognition, lung nodule classification, face verification, and face recognition, demonstrate that the proposed approach effectively handles learning on data of different quality and yields significant and stable improvements in classification accuracy.
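To make the idea concrete, below is a minimal PyTorch sketch of embedding a norm-contracting step into a softmax cross-entropy objective. The specific mapping g(r) = alpha * log(1 + r), the `alpha` parameter, and the class/function names are illustrative assumptions; the paper's actual contraction mapping function is not specified in this abstract.

```python
# Minimal sketch (not the paper's exact formulation): rescale each feature
# vector so its L2 norm passes through a range-compressing mapping, then
# apply standard softmax cross-entropy on the rescaled features.
# The log-based mapping g(r) = alpha * log(1 + r) is an illustrative assumption.
import torch
import torch.nn as nn
import torch.nn.functional as F


def contract_feature_norms(features: torch.Tensor, alpha: float = 10.0) -> torch.Tensor:
    """Compress the range of per-sample feature norms.

    features: (batch, dim) embeddings from the backbone network.
    alpha:    scale of the assumed contraction mapping g(r) = alpha * log(1 + r).
    """
    norms = features.norm(p=2, dim=1, keepdim=True).clamp_min(1e-12)
    contracted = alpha * torch.log1p(norms)   # monotone, compresses large norms more
    return features / norms * contracted      # keep direction, replace the norm


class ContractedSoftmaxLoss(nn.Module):
    """Softmax cross-entropy computed on norm-contracted features."""

    def __init__(self, feat_dim: int, num_classes: int, alpha: float = 10.0):
        super().__init__()
        self.alpha = alpha
        self.fc = nn.Linear(feat_dim, num_classes)  # classifier layer

    def forward(self, features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        contracted = contract_feature_norms(features, self.alpha)
        logits = self.fc(contracted)
        return F.cross_entropy(logits, labels)


if __name__ == "__main__":
    # Embeddings with widely varying norms, mimicking data of different quality.
    feats = torch.randn(8, 128) * torch.rand(8, 1) * 20
    labels = torch.randint(0, 10, (8,))
    loss = ContractedSoftmaxLoss(128, 10)(feats, labels)
    loss.backward()
    print(loss.item())
```

The same contraction step could, in principle, precede margin-based softmax variants as well, since it only rescales feature norms and leaves their directions unchanged.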
