Learning Pseudo Labels for Semi-and-Weakly Supervised Semantic Segmentation

In this paper, we aim to tackle semi- and weakly supervised semantic segmentation (SWSSS), where many image-level classification labels and a few pixel-level annotations are available. We believe the most crucial point for solving SWSSS is producing high-quality pseudo labels, and our method addresses this from two perspectives. First, we introduce a class-aware cross entropy (CCE) loss for network training. Compared to the conventional cross entropy loss, CCE loss encourages the model to distinguish only between concurrent classes, which simplifies the learning target for pseudo label generation. Second, we propose a progressive cross training (PCT) method that builds cross supervision between two networks with a dynamic evaluation mechanism, progressively introducing high-quality predictions as additional supervision for network training. Our method significantly improves the quality of generated pseudo labels in the regime of extremely limited annotations. Extensive experiments demonstrate that our approach significantly outperforms state-of-the-art methods.
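One plausible reading of the CCE loss is that it restricts the per-pixel softmax to the classes known to be present in the image (from the image-level labels), so the model only has to distinguish concurrent classes. The sketch below is a minimal NumPy illustration of that idea under this assumption; the function name, the mask convention, and the use of a large negative constant are all illustrative, not the paper's implementation.

```python
import numpy as np

def class_aware_cross_entropy(logits, target, image_classes):
    """Hypothetical sketch of a class-aware cross entropy (CCE) loss.

    logits:        (N, C, H, W) segmentation logits
    target:        (N, H, W) per-pixel class indices (assumed present in image)
    image_classes: (N, C) binary indicator of classes present in each image,
                   derived from image-level labels (background assumed present)
    """
    # Suppress logits of classes absent from the image so the softmax
    # only competes among classes that actually co-occur in it.
    masked = np.where(image_classes[:, :, None, None] > 0, logits, -1e9)
    # Numerically stable log-softmax over the class axis.
    m = masked.max(axis=1, keepdims=True)
    log_probs = masked - m - np.log(np.exp(masked - m).sum(axis=1, keepdims=True))
    # Negative log-likelihood of the target class at each pixel.
    nll = -np.take_along_axis(log_probs, target[:, None, :, :], axis=1)
    return nll.mean()
```

Because absent classes are removed from the softmax competition, the probability mass assigned to the (present) target class can only grow, so this masked loss is never larger than the unmasked cross entropy on the same predictions.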


Datasets


| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Semi-Supervised Semantic Segmentation | PASCAL VOC 2012 (6.25% labeled) | PCT (DeepLab v3+ with ResNet-50 pretrained on ImageNet-1K) | Validation mIoU | 71.35% | #12 |
| Semi-Supervised Semantic Segmentation | PASCAL VOC 2012 (12.5% labeled) | PCT (DeepLab v3+ with ResNet-50 pretrained on ImageNet-1K) | Validation mIoU | 75.52% | #14 |
| Semi-Supervised Semantic Segmentation | PASCAL VOC 2012 (25% labeled) | PCT (DeepLab v3+ with ResNet-50 pretrained on ImageNet-1K) | Validation mIoU | 76.47% | #14 |
| Semi-Supervised Semantic Segmentation | PASCAL VOC 2012 (50% labeled) | PCT (DeepLab v3+ with ResNet-50 pretrained on ImageNet-1K) | Validation mIoU | 77.26% | #9 |

Methods


No methods listed for this paper.