Provably End-to-end Label-Noise Learning without Anchor Points

4 Feb 2021 · Xuefeng Li, Tongliang Liu, Bo Han, Gang Niu, Masashi Sugiyama

In label-noise learning, the transition matrix plays a key role in building statistically consistent classifiers. Existing consistent estimators for the transition matrix have been developed by exploiting anchor points. However, the anchor-point assumption is not always satisfied in real scenarios. In this paper, we propose an end-to-end framework for solving label-noise learning without anchor points, in which we simultaneously optimize two objectives: the cross-entropy loss between the noisy labels and the noisy class-posterior probabilities predicted by the neural network, and the volume of the simplex formed by the columns of the transition matrix. Our proposed framework can identify the transition matrix if the clean class-posterior probabilities are sufficiently scattered. This is by far the mildest assumption under which the transition matrix is provably identifiable and the learned classifier is statistically consistent. Experimental results on benchmark datasets demonstrate the effectiveness and robustness of the proposed method.
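The two-term objective described above can be sketched numerically. In this illustrative NumPy snippet (not the authors' implementation), the clean posteriors, the trainable row-stochastic transition matrix `T`, and the trade-off weight `lam` are all assumptions for demonstration; `log|det T|` is used as the standard proxy for the (log-)volume of the simplex spanned by the columns of `T`:

```python
import numpy as np

def volminnet_loss(clean_probs, noisy_labels, T, lam=1e-2):
    """Sketch of a VolMinNet-style objective.

    clean_probs  : (n, C) predicted clean class-posteriors f(x)
    noisy_labels : (n,)   observed noisy labels
    T            : (C, C) row-stochastic transition matrix,
                   T[i, j] ~= P(noisy label = j | clean label = i)
    lam          : weight of the volume-regularization term (assumed value)
    """
    # Noisy class-posterior: p(noisy y | x) = f(x)^T T, computed row-wise.
    noisy_probs = clean_probs @ T
    # Cross-entropy between noisy labels and predicted noisy posteriors.
    picked = noisy_probs[np.arange(len(noisy_labels)), noisy_labels]
    ce = -np.mean(np.log(picked + 1e-12))
    # log|det T| as a proxy for the log-volume of the simplex formed
    # by the columns of T; minimizing it shrinks the simplex.
    log_vol = np.log(np.abs(np.linalg.det(T)) + 1e-12)
    return ce + lam * log_vol

# Toy usage: perfectly confident clean posteriors and a noise-free T.
probs = np.eye(3)
labels = np.array([0, 1, 2])
loss_clean = volminnet_loss(probs, labels, np.eye(3))
```

With the identity transition matrix and one-hot posteriors that match the labels, both terms are essentially zero; a noisier `T` (e.g. symmetric label flipping) has `|det T| < 1`, so the volume term rewards tighter simplices while the cross-entropy term keeps the noisy posteriors consistent with the observed noisy labels.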

Task                        Dataset               Model      Metric           Value   Rank
Learning with noisy labels  CIFAR-100N            VolMinNet  Accuracy (mean)  57.80   #14
Learning with noisy labels  CIFAR-10N-Aggregate   VolMinNet  Accuracy (mean)  89.70   #20
Learning with noisy labels  CIFAR-10N-Random1     VolMinNet  Accuracy (mean)  88.30   #20
Learning with noisy labels  CIFAR-10N-Random2     VolMinNet  Accuracy (mean)  88.27   #17
Learning with noisy labels  CIFAR-10N-Random3     VolMinNet  Accuracy (mean)  88.19   #17
Learning with noisy labels  CIFAR-10N-Worst       VolMinNet  Accuracy (mean)  80.53   #20
