Contrastive Learning

26 papers with code · Computer Vision

Leaderboards

No evaluation results yet. Help compare methods by submitting evaluation metrics.

Latest papers with code

Debiased Contrastive Learning

1 Jul 2020 · chingyaoc/DCL

A prominent technique for self-supervised representation learning has been to contrast semantically similar and dissimilar pairs of samples.

CONTRASTIVE LEARNING · REPRESENTATION LEARNING

58 · 01 Jul 2020
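
A minimal sketch of the plain (biased) InfoNCE-style loss that this setup refers to, not the paper's debiased correction, is shown below; function and argument names are illustrative, and the sketch assumes PyTorch:

import torch
import torch.nn.functional as F

def info_nce(anchor, positive, negatives, temperature=0.1):
    # anchor, positive: (B, D); negatives: (B, K, D). Positives come from
    # another view of the same sample; negatives are other samples.
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    negatives = F.normalize(negatives, dim=-1)
    pos = (anchor * positive).sum(-1, keepdim=True) / temperature      # (B, 1)
    neg = torch.einsum("bd,bkd->bk", anchor, negatives) / temperature  # (B, K)
    logits = torch.cat([pos, neg], dim=1)                              # (B, 1+K)
    # The positive pair sits at index 0 of each row of logits.
    labels = torch.zeros(len(logits), dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits, labels)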

Approximate Nearest Neighbor Negative Contrastive Learning for Dense Text Retrieval

1 Jul 2020 · microsoft/ANCE

In this paper, we identify that the main bottleneck is in the training mechanisms, where the negative instances used in training are not representative of the irrelevant documents in testing.

CONTRASTIVE LEARNING

8 · 01 Jul 2020

Video Representation Learning with Visual Tempo Consistency

28 Jun 2020 · decisionforce/VTHCL

Visual tempo, which describes how fast an action goes, has shown its potential in supervised action recognition.

ACTION DETECTION · CONTRASTIVE LEARNING · REPRESENTATION LEARNING

10 · 28 Jun 2020

Unsupervised Discovery of Object Landmarks via Contrastive Learning

26 Jun 2020 · cvl-umass/ContrastLandmark

We show that when a deep network is trained to be invariant to geometric and photometric transformations, representations from its intermediate layers are highly predictive of object landmarks.

CONTRASTIVE LEARNING · REPRESENTATION LEARNING

10 · 26 Jun 2020

GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training

17 Jun 2020 · THUDM/GCC

Graph representation learning has emerged as a powerful technique for addressing real-world problems.

CONTRASTIVE LEARNING · GRAPH CLASSIFICATION · GRAPH REPRESENTATION LEARNING · LINK PREDICTION · NODE CLASSIFICATION

34 · 17 Jun 2020

Learning Diverse and Discriminative Representations via the Principle of Maximal Coding Rate Reduction

15 Jun 2020 · ryanchankh/mcr2

To learn intrinsic low-dimensional structures from high-dimensional data that most discriminate between classes, we propose the principle of Maximal Coding Rate Reduction ($\text{MCR}^2$), an information-theoretic measure that maximizes the coding rate difference between the whole dataset and the sum of each individual class.

CONTRASTIVE LEARNING

11 · 15 Jun 2020
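
For reference, the coding rate difference described in this abstract is usually written in the form below (standard $\text{MCR}^2$ notation, which may differ in constants from the paper); here $Z \in \mathbb{R}^{d \times m}$ are the learned features, $\Pi_j$ are diagonal class-membership matrices for the $k$ classes, and $\epsilon$ is the allowed distortion:

$$\Delta R(Z, \Pi, \epsilon) \;=\; \frac{1}{2}\log\det\!\Big(I + \frac{d}{m\epsilon^{2}} Z Z^{\top}\Big) \;-\; \sum_{j=1}^{k} \frac{\operatorname{tr}(\Pi_j)}{2m}\log\det\!\Big(I + \frac{d}{\operatorname{tr}(\Pi_j)\,\epsilon^{2}} Z \Pi_j Z^{\top}\Big),$$

which the method maximizes over (norm-constrained) representations $Z$.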

Knowledge Distillation Meets Self-Supervision

12 Jun 2020 · xuguodong03/SSKD

Knowledge distillation, which involves extracting the "dark knowledge" from a teacher network to guide the learning of a student network, has emerged as an important technique for model compression and transfer learning.

CONTRASTIVE LEARNING · MODEL COMPRESSION · TRANSFER LEARNING

46 · 12 Jun 2020
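
The "dark knowledge" mentioned in this abstract refers to the classic soft-target distillation objective; a minimal sketch of that baseline loss (not SSKD's self-supervised auxiliary task) is below, with illustrative names and hyperparameters:

import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soften both distributions with temperature T so the student can match
    # the teacher's relative probabilities over non-target classes.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)                                   # rescale to keep gradient magnitude
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard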

DisCont: Self-Supervised Visual Attribute Disentanglement using Context Vectors

10 Jun 2020 · sarthak268/DisCont

Disentangling the underlying feature attributes within an image with no prior supervision is a challenging task.

CONTRASTIVE LEARNING

3 · 10 Jun 2020

Deep Graph Contrastive Representation Learning

7 Jun 2020 · CRIPAC-DIG/GRACE

Inspired by recent success of contrastive methods, in this paper, we propose a novel framework for unsupervised graph representation learning by leveraging a contrastive objective at the node level.

CONTRASTIVE LEARNING · GRAPH REPRESENTATION LEARNING · NODE CLASSIFICATION

7 · 07 Jun 2020

Self-paced Contrastive Learning with Hybrid Memory for Domain Adaptive Object Re-ID

4 Jun 2020 · open-mmlab/OpenUnReID

Our method outperforms state-of-the-arts on multiple domain adaptation tasks of object re-ID and even boosts the performance on the source domain without any extra annotations.

CONTRASTIVE LEARNING · UNSUPERVISED DOMAIN ADAPTATION

64 · 04 Jun 2020