Semi-Supervised Image Classification

124 papers with code • 58 benchmarks • 13 datasets

Semi-supervised image classification leverages unlabelled data as well as labelled data to improve classification performance.


(Image credit: Self-Supervised Semi-Supervised Learning)

Libraries

Use these libraries to find Semi-Supervised Image Classification models and implementations
See all 16 libraries.

Most implemented papers

Unsupervised Data Augmentation for Consistency Training

google-research/uda NeurIPS 2020

In this work, we present a new perspective on how to effectively noise unlabeled examples and argue that the quality of noising, specifically those produced by advanced data augmentation methods, plays a crucial role in semi-supervised learning.
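The consistency objective behind this is compact: the model's prediction on a strongly augmented copy of an unlabeled example is pulled toward its (fixed) prediction on the clean example. A minimal PyTorch sketch, where `model`, `x_unlabeled`, and `strong_augment` (e.g. a RandAugment-style callable on tensors) are illustrative names, not the official repo's API:

```python
import torch
import torch.nn.functional as F

def uda_consistency_loss(model, x_unlabeled, strong_augment, temperature=0.4):
    """Consistency term of UDA-style training: KL divergence between the
    sharpened, detached prediction on the clean input and the prediction
    on a strongly augmented copy."""
    with torch.no_grad():
        # Target distribution from the clean example; sharpened and fixed.
        target = F.softmax(model(x_unlabeled) / temperature, dim=-1)
    # Prediction on the strongly augmented copy.
    log_pred = F.log_softmax(model(strong_augment(x_unlabeled)), dim=-1)
    # KL(target || pred), averaged over the batch.
    return F.kl_div(log_pred, target, reduction="batchmean")
```

In practice this term is added, with a ramp-up weight, to the ordinary supervised cross-entropy on the labeled batch.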

Unsupervised Learning of Visual Features by Contrasting Cluster Assignments

facebookresearch/swav NeurIPS 2020

We also propose a new data augmentation strategy, multi-crop, that uses a mix of views with different resolutions in place of two full-resolution views, without increasing the memory or compute requirements much.
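A sketch of the multi-crop idea using torchvision transforms; the crop counts and scale ranges below are illustrative rather than the paper's exact settings:

```python
import torchvision.transforms as T

def multi_crop(image, n_local=6):
    """Multi-crop augmentation in the spirit of SwAV: two large crops plus
    several cheap low-resolution crops of the same image."""
    global_crop = T.Compose([T.RandomResizedCrop(224, scale=(0.4, 1.0)),
                             T.RandomHorizontalFlip(), T.ToTensor()])
    local_crop = T.Compose([T.RandomResizedCrop(96, scale=(0.05, 0.4)),
                            T.RandomHorizontalFlip(), T.ToTensor()])
    # Two full-resolution views plus n_local low-resolution views.
    return ([global_crop(image) for _ in range(2)] +
            [local_crop(image) for _ in range(n_local)])
```

The low-resolution crops make many extra views nearly free, which is why the memory and compute overhead stays small.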

Virtual Adversarial Training: A Regularization Method for Supervised and Semi-Supervised Learning

takerum/vat_tf 13 Apr 2017

In our experiments, we applied VAT to supervised and semi-supervised learning tasks on multiple benchmark datasets.
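VAT regularizes the model to be smooth in the direction that most changes its output. A simplified single-power-iteration sketch (names and hyperparameters here are illustrative, not the repo's API):

```python
import torch
import torch.nn.functional as F

def vat_loss(model, x, xi=1e-6, eps=8.0, n_power=1):
    """Approximate virtual adversarial loss via power iteration: find the
    perturbation direction that maximizes the KL from the current
    prediction, then penalize the KL at that adversarial point."""
    with torch.no_grad():
        p = F.softmax(model(x), dim=-1)          # current prediction, fixed
    d = torch.randn_like(x)                      # random initial direction
    for _ in range(n_power):
        d = xi * F.normalize(d.flatten(1), dim=1).view_as(x)
        d.requires_grad_(True)
        kl = F.kl_div(F.log_softmax(model(x + d), dim=-1), p,
                      reduction="batchmean")
        d = torch.autograd.grad(kl, d)[0]        # direction of steepest KL
    r_adv = eps * F.normalize(d.flatten(1), dim=1).view_as(x).detach()
    return F.kl_div(F.log_softmax(model(x + r_adv), dim=-1), p,
                    reduction="batchmean")
```

Because the loss needs no labels, it applies equally to labeled and unlabeled batches, which is what makes VAT a natural semi-supervised regularizer.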

Semi-Supervised Learning with Ladder Networks

arasmus/ladder NeurIPS 2015

We combine supervised learning with unsupervised learning in deep neural networks.
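The full ladder network adds denoising targets at every layer; the heavily reduced sketch below keeps only the core combination of a supervised loss with a single denoising reconstruction term, with all names (`encoder`, `decoder`, `classifier`) assumed for illustration:

```python
import torch
import torch.nn.functional as F

def ladder_style_loss(encoder, decoder, classifier, x_lab, y_lab, x_unlab,
                      noise_std=0.3, recon_weight=1.0):
    """Simplified ladder-style objective: supervised cross-entropy on
    labeled data plus a denoising reconstruction term on unlabeled data."""
    # Supervised path: classify noise-corrupted labeled examples.
    z_lab = encoder(x_lab + noise_std * torch.randn_like(x_lab))
    sup = F.cross_entropy(classifier(z_lab), y_lab)
    # Unsupervised path: reconstruct clean latents from noisy ones.
    with torch.no_grad():
        z_clean = encoder(x_unlab)               # clean targets, fixed
    z_noisy = encoder(x_unlab + noise_std * torch.randn_like(x_unlab))
    recon = F.mse_loss(decoder(z_noisy), z_clean)
    return sup + recon_weight * recon
```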

Meta Pseudo Labels

google-research/google-research CVPR 2021

We present Meta Pseudo Labels, a semi-supervised learning method that achieves a new state-of-the-art top-1 accuracy of 90.2% on ImageNet, which is 1.6% better than the existing state-of-the-art.
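The distinctive part is the feedback loop: the teacher that generates pseudo labels is itself updated based on how much those labels help the student on labeled data. A heavily simplified first-order sketch (the real method differentiates through the student update; every name below is illustrative):

```python
import torch
import torch.nn.functional as F

def mpl_step(teacher, student, opt_t, opt_s, x_unlab, x_lab, y_lab):
    """One simplified Meta Pseudo Labels step (first-order approximation)."""
    # 1) Student loss on labeled data *before* learning from pseudo labels.
    loss_before = F.cross_entropy(student(x_lab), y_lab).item()
    # 2) Student takes a gradient step on the teacher's hard pseudo labels.
    pseudo = teacher(x_unlab).argmax(dim=-1).detach()
    opt_s.zero_grad()
    F.cross_entropy(student(x_unlab), pseudo).backward()
    opt_s.step()
    # 3) Teacher feedback: reinforce pseudo labels that improved the student.
    loss_after = F.cross_entropy(student(x_lab), y_lab).item()
    h = loss_before - loss_after          # scalar "did the labels help" signal
    opt_t.zero_grad()
    (h * F.cross_entropy(teacher(x_unlab), pseudo)).backward()
    opt_t.step()
```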

Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results

CuriousAI/mean-teacher NeurIPS 2017

Without changing the network architecture, Mean Teacher achieves an error rate of 4.35% on SVHN with 250 labels, outperforming Temporal Ensembling trained with 1000 labels.
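The mechanism is simple to state: the teacher's weights are an exponential moving average of the student's weights, and the student is trained to match the teacher's predictions under different input noise. A minimal sketch, assuming `student` and `teacher` are two models with identical architecture and `augment` is some stochastic augmentation:

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def update_teacher(student, teacher, ema_decay=0.999):
    """Teacher weights = exponential moving average of student weights."""
    for p_t, p_s in zip(teacher.parameters(), student.parameters()):
        p_t.mul_(ema_decay).add_(p_s, alpha=1.0 - ema_decay)

def consistency_loss(student, teacher, x_unlab, augment):
    """Student should match the teacher's prediction under different noise."""
    with torch.no_grad():
        target = torch.softmax(teacher(augment(x_unlab)), dim=-1)
    pred = torch.softmax(student(augment(x_unlab)), dim=-1)
    return F.mse_loss(pred, target)
```

Averaging weights rather than predictions lets the teacher improve after every step instead of once per epoch, which is the paper's key change relative to Temporal Ensembling.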

Big Self-Supervised Models are Strong Semi-Supervised Learners

google-research/simclr NeurIPS 2020

The proposed semi-supervised learning algorithm can be summarized in three steps: unsupervised pretraining of a big ResNet model using SimCLRv2, supervised fine-tuning on a few labeled examples, and distillation with unlabeled examples for refining and transferring the task-specific knowledge.
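The third step, distillation on unlabeled data, can be written as a plain teacher-student cross-entropy. A minimal sketch of that step, with `teacher` standing for the fine-tuned model and `student` for the (possibly smaller) network being trained; names are illustrative:

```python
import torch
import torch.nn.functional as F

def distill_loss(teacher, student, x_unlab, tau=1.0):
    """Distillation step in sketch form: the student matches the fine-tuned
    teacher's temperature-softened predictions on unlabeled images."""
    with torch.no_grad():
        target = F.softmax(teacher(x_unlab) / tau, dim=-1)
    log_pred = F.log_softmax(student(x_unlab) / tau, dim=-1)
    # Cross-entropy between teacher and student distributions.
    return -(target * log_pred).sum(dim=-1).mean()
```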

Temporal Ensembling for Semi-Supervised Learning

smlaine2/tempens 7 Oct 2016

In this paper, we present a simple and efficient method for training deep neural networks in a semi-supervised setting where only a small portion of training data is labeled.
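The "temporal ensemble" is an exponential moving average of the network's own past predictions for each training example, used as a consistency target. A simplified sketch (the caller is assumed to increment `epoch` once per pass over the data; all names are illustrative):

```python
import torch
import torch.nn.functional as F

class TemporalEnsemble:
    """EMA of per-example predictions across epochs, used as targets."""
    def __init__(self, n_examples, n_classes, alpha=0.6):
        self.Z = torch.zeros(n_examples, n_classes)  # accumulated predictions
        self.alpha = alpha
        self.epoch = 0  # incremented by the caller once per epoch

    def update_and_target(self, idx, preds):
        self.Z[idx] = (self.alpha * self.Z[idx]
                       + (1 - self.alpha) * preds.detach().cpu())
        # Startup-bias correction, as in the paper.
        return self.Z[idx] / (1 - self.alpha ** (self.epoch + 1))

def unsupervised_loss(model, x, idx, ensemble):
    preds = torch.softmax(model(x), dim=-1)
    target = ensemble.update_and_target(idx, preds)
    return F.mse_loss(preds, target.to(preds.device))
```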

Semi-Supervised Learning with Context-Conditional Generative Adversarial Networks

edenton/cc-gan 19 Nov 2016

We introduce a simple semi-supervised learning approach for images based on in-painting using an adversarial loss.
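In sketch form: a random patch of each image is masked out, the generator fills the hole, and the discriminator (whose features are later reused for classification) learns to tell in-painted images from real ones. A simplified version, with all names assumed and a single mask position per batch for brevity:

```python
import torch
import torch.nn.functional as F

def ccgan_losses(generator, discriminator, x_real, patch=32):
    """Context-conditional in-painting sketch: hide a patch, let the
    generator fill it, and train the discriminator to spot the fake."""
    b, _, h, w = x_real.shape
    top = torch.randint(0, h - patch, (1,)).item()
    left = torch.randint(0, w - patch, (1,)).item()
    mask = torch.ones_like(x_real)
    mask[:, :, top:top + patch, left:left + patch] = 0.0
    x_holed = x_real * mask
    x_fake = x_holed + generator(x_holed) * (1.0 - mask)  # fill the hole only
    real_logit = discriminator(x_real)
    fake_logit = discriminator(x_fake.detach())
    d_loss = (F.binary_cross_entropy_with_logits(
                  real_logit, torch.ones_like(real_logit))
              + F.binary_cross_entropy_with_logits(
                  fake_logit, torch.zeros_like(fake_logit)))
    g_logit = discriminator(x_fake)
    g_loss = F.binary_cross_entropy_with_logits(
        g_logit, torch.ones_like(g_logit))
    return d_loss, g_loss
```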

Self-Supervised Learning of Pretext-Invariant Representations

facebookresearch/vissl CVPR 2020

The goal of self-supervised learning from images is to construct image representations that are semantically meaningful via pretext tasks that do not require semantic annotations for a large training set of images.
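Pretext invariance means the representation of an image and of its pretext-transformed copy should match. A simplified contrastive sketch with in-batch negatives (the paper itself uses a jigsaw transform and a memory bank of negatives; `encoder` and `pretext_transform` are assumed names):

```python
import torch
import torch.nn.functional as F

def pretext_invariance_loss(encoder, x, pretext_transform, tau=0.07):
    """Pull each image's embedding toward the embedding of its
    pretext-transformed copy, against the rest of the batch."""
    z = F.normalize(encoder(x), dim=-1)                       # original views
    z_t = F.normalize(encoder(pretext_transform(x)), dim=-1)  # transformed
    logits = z @ z_t.t() / tau         # pairwise cosine similarities
    labels = torch.arange(x.size(0), device=x.device)  # positives on diagonal
    return F.cross_entropy(logits, labels)
```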