
# Semi-Supervised Image Classification

41 papers with code · Computer Vision

Semi-supervised image classification leverages unlabelled data as well as labelled data to increase classification performance.


(Image credit: Self-Supervised Semi-Supervised Learning)

# Big Self-Supervised Models are Strong Semi-Supervised Learners

The proposed semi-supervised learning algorithm can be summarized in three steps: unsupervised pretraining of a big ResNet model using SimCLRv2 (a modification of SimCLR), supervised fine-tuning on a few labeled examples, and distillation with unlabeled examples for refining and transferring the task-specific knowledge.
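The three-step pipeline above can be sketched as follows. This is a toy NumPy illustration of the data flow only: the random projection, least-squares head, and softmax pseudo-labels are stand-ins (all function names are assumptions, not the SimCLRv2 code), where the real method trains a big ResNet with the SimCLRv2 contrastive objective.

```python
import numpy as np

rng = np.random.default_rng(0)

def pretrain_encoder(x_unlabeled, dim=4):
    # Step 1 (stand-in): unsupervised pretraining. A random projection
    # plays the role of a big ResNet trained with the SimCLRv2 objective.
    return rng.standard_normal((x_unlabeled.shape[1], dim))

def fine_tune(encoder_w, x_labeled, y_labeled, n_classes):
    # Step 2 (stand-in): supervised fine-tuning on a few labeled examples,
    # here a least-squares linear head on the frozen features.
    feats = x_labeled @ encoder_w
    onehot = np.eye(n_classes)[y_labeled]
    head, *_ = np.linalg.lstsq(feats, onehot, rcond=None)
    return head

def distill(encoder_w, head, x_unlabeled):
    # Step 3 (stand-in): distillation. The teacher's soft predictions on
    # unlabeled data become targets for training a (possibly smaller) student.
    logits = x_unlabeled @ encoder_w @ head
    exp = np.exp(logits - logits.max(axis=1, keepdims=True))
    return exp / exp.sum(axis=1, keepdims=True)  # soft pseudo-labels

x_unlab = rng.standard_normal((100, 8))   # many unlabeled examples
x_lab = rng.standard_normal((10, 8))      # few labeled examples
y_lab = rng.integers(0, 3, size=10)

w = pretrain_encoder(x_unlab)
head = fine_tune(w, x_lab, y_lab, n_classes=3)
pseudo = distill(w, head, x_unlab)
print(pseudo.shape)  # (100, 3): one soft label per unlabeled example
```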

1,019
17 Jun 2020

# Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning

13 Jun 2020 · lucidrains/byol-pytorch

From an augmented view of an image, we train the online network to predict the target network representation of the same image under a different augmented view.
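The online/target prediction objective can be sketched numerically. This is a minimal sketch, assuming linear maps in place of BYOL's ResNet encoder and MLP projector/predictor; the variable names are illustrative, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_normalize(z):
    return z / np.linalg.norm(z, axis=-1, keepdims=True)

# Stand-ins for the two networks: the target starts as a copy of the online
# network and then only tracks it via an exponential moving average.
online_w = rng.standard_normal((8, 4))
target_w = online_w.copy()
predictor_w = rng.standard_normal((4, 4))

def byol_loss(view1, view2):
    # The online network sees view1 and must predict the target network's
    # representation of view2, a different augmentation of the same image.
    pred = l2_normalize(view1 @ online_w @ predictor_w)
    target = l2_normalize(view2 @ target_w)  # stop-gradient in real BYOL
    return float(np.mean(np.sum((pred - target) ** 2, axis=-1)))

def ema_update(target, online, tau=0.99):
    # Target weights track online weights by exponential moving average.
    return tau * target + (1 - tau) * online

x = rng.standard_normal((16, 8))
v1 = x + 0.1 * rng.standard_normal(x.shape)  # two augmented views
v2 = x + 0.1 * rng.standard_normal(x.shape)
loss = byol_loss(v1, v2)
target_w = ema_update(target_w, online_w)
```

Because both representations are L2-normalized, the per-sample squared distance is bounded by 4, and symmetrizing the loss over the two views (as the paper does) is a one-line extension.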

166
13 Jun 2020

# SCAN: Learning to Classify Images without Labels

25 May 2020 · wvangansbeke/Unsupervised-Classification

First, a self-supervised task from representation learning is employed to obtain semantically meaningful features.

242
25 May 2020

# Milking CowMask for Semi-Supervised Image Classification

Using it to provide perturbations for semi-supervised consistency regularization, we achieve a state-of-the-art result on ImageNet with 10% labeled data, with a top-5 error of 8.76% and top-1 error of 26.06%.

11,374
26 Mar 2020

# A Simple Framework for Contrastive Learning of Visual Representations

This paper presents SimCLR: a simple framework for contrastive learning of visual representations.

1,019
13 Feb 2020

# Subspace Capsule Network

7 Feb 2020 · MarziEd/SubSpace-Capsule-Network

In this paper, we propose the SubSpace Capsule Network (SCN) that exploits the idea of capsule networks to model possible variations in the appearance or implicitly defined properties of an entity through a group of capsule subspaces instead of simply grouping neurons to create capsules.

12
07 Feb 2020

# FixMatch: Simplifying Semi-Supervised Learning with Consistency and Confidence

Semi-supervised learning (SSL) provides an effective means of leveraging unlabeled data to improve a model's performance.

450
21 Jan 2020

# batchboost: regularization for stabilizing training with resistance to underfitting & overfitting

21 Jan 2020 · maciejczyzewski/batchboost

The pairing stage calculates the error per sample, sorts the samples, and pairs them with the strategy of hardest with easiest; the mixing stage then merges each pair using mixup, $\lambda x_1 + (1-\lambda)x_2$.
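The pair-then-mix procedure can be sketched in a few lines of NumPy. This is an assumption-laden sketch (function and variable names are hypothetical, not from the batchboost repo): it sorts one batch by per-sample error, pairs the hardest half with the easiest half, and applies mixup with a Beta-sampled coefficient.

```python
import numpy as np

rng = np.random.default_rng(0)

def pair_and_mix(x, per_sample_error, alpha=0.2):
    # Pairing stage: sort by error, then pair hardest with easiest.
    order = np.argsort(per_sample_error)
    half = len(order) // 2
    easiest = order[:half]
    hardest = order[::-1][:half]
    # Mixing stage: mixup, lam * x1 + (1 - lam) * x2.
    lam = rng.beta(alpha, alpha)
    return lam * x[hardest] + (1 - lam) * x[easiest]

x = rng.standard_normal((8, 4))       # a batch of 8 feature vectors
err = rng.random(8)                   # per-sample errors from the last step
mixed = pair_and_mix(x, err)
print(mixed.shape)  # (4, 4): one mixed sample per (hard, easy) pair
```

In the paper the same convex combination is also applied to the labels, so each mixed sample carries a mixed target.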

45
21 Jan 2020

# Semi-Supervised Learning with Normalizing Flows

Normalizing flows transform a latent distribution through an invertible neural network for a flexible and pleasingly simple approach to generative modelling, while preserving an exact likelihood.

39
30 Dec 2019

# SESS: Self-Ensembling Semi-Supervised 3D Object Detection

The performance of existing point cloud-based 3D object detection methods heavily relies on large-scale high-quality 3D annotations.

40
26 Dec 2019