All nine models share the same training techniques and architecture:

Training Techniques | SwAV, Weight Decay, SGD with Momentum |
---|---|
Architecture | 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax |

Model IDs:

- rn50_in1k_swav_100ep_batch4k
- rn50_in1k_swav_200ep_batch256
- rn50_in1k_swav_200ep_batch4k
- rn50_in1k_swav_2x224_400ep_batch4k
- rn50_in1k_swav_400ep_batch256
- rn50_in1k_swav_400ep_batch4k
- rn50_in1k_swav_800ep_batch4k
- rn50_w2_in1k_swav_400ep
- rn50_w4_in1k_swav_400ep
SwAV, or Swapping Assignments Between Views, is a self-supervised learning approach that takes advantage of contrastive methods without requiring pairwise comparisons to be computed. Specifically, it simultaneously clusters the data while enforcing consistency between the cluster assignments produced for different augmentations (or "views") of the same image, instead of comparing features directly as in contrastive learning. Simply put, SwAV uses a swapped prediction mechanism: the cluster assignment of one view is predicted from the representation of another view.
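The swapped prediction mechanism can be sketched in a few lines of PyTorch. This is a minimal illustration, not the VISSL implementation: the function names (`sinkhorn`, `swav_loss`) and default hyperparameters are illustrative, following the loss described in the SwAV paper (online clustering via Sinkhorn-Knopp, then cross-entropy between one view's assignment and the other view's prediction).

```python
import torch
import torch.nn.functional as F

def sinkhorn(scores, eps=0.05, n_iters=3):
    """Sinkhorn-Knopp: turn (batch, prototypes) scores into soft
    cluster assignments whose rows each sum to 1 and whose columns
    are balanced across the batch (the equipartition constraint)."""
    Q = torch.exp(scores / eps).T          # (K, B)
    Q = Q / Q.sum()
    K, B = Q.shape
    for _ in range(n_iters):
        Q = Q / Q.sum(dim=1, keepdim=True) / K   # normalize rows (prototypes)
        Q = Q / Q.sum(dim=0, keepdim=True) / B   # normalize columns (samples)
    return (Q * B).T                       # (B, K), rows sum to 1

def swav_loss(z1, z2, prototypes, temp=0.1):
    """Swapped prediction loss for two views.
    z1, z2: L2-normalized features, shape (B, D); prototypes: (D, K)."""
    p = F.normalize(prototypes, dim=0)
    s1, s2 = z1 @ p, z2 @ p                # similarity scores, (B, K)
    with torch.no_grad():                  # assignments are targets, no gradient
        q1, q2 = sinkhorn(s1), sinkhorn(s2)
    # Predict view 2's assignment from view 1's scores, and vice versa.
    return -0.5 * ((q1 * F.log_softmax(s2 / temp, dim=1)).sum(1).mean()
                 + (q2 * F.log_softmax(s1 / temp, dim=1)).sum(1).mean())
```

In training, `z1` and `z2` would be the projected features of two augmentations of the same batch, and `prototypes` a learnable parameter matrix; the multi-crop setup used by the models above (2x224 + 6x96) extends the same loss to more than two views.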
Get started with VISSL by trying one of the Colab tutorial notebooks.
@misc{caron2021unsupervised,
title={Unsupervised Learning of Visual Features by Contrasting Cluster Assignments},
author={Mathilde Caron and Ishan Misra and Julien Mairal and Priya Goyal and Piotr Bojanowski and Armand Joulin},
year={2021},
eprint={2006.09882},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
Model | Top-1 Accuracy |
---|---|
SwAV ResNet-50-w4 (400 epochs, 2x224+6x96, 2560 bs) | 77.03% |
SwAV ResNet-50-w2 (400 epochs, 2x224+6x96, 4096 bs) | 77.01% |
SwAV ResNet-50 (800 epochs, 2x224+6x96, 4096 bs) | 74.92% |
SwAV ResNet-50 (400 epochs, 2x224+6x96, 4096 bs) | 74.81% |
SwAV ResNet-50 (400 epochs, 2x224+6x96, 256 bs) | 74.30% |
SwAV ResNet-50 (200 epochs, 2x224+6x96, 4096 bs) | 73.85% |
SwAV ResNet-50 (200 epochs, 2x224+6x96, 256 bs) | 73.07% |
SwAV ResNet-50 (100 epochs, 2x224+6x96, 4096 bs) | 71.99% |
SwAV ResNet-50 (400 epochs, 2x224, 4096 bs) | 69.53% |