Improved Embeddings with Easy Positive Triplet Mining

8 Apr 2019 · Hong Xuan, Abby Stylianou, Robert Pless

Deep metric learning seeks an embedding in which semantically similar images lie close together and semantically dissimilar images lie far apart. Substantial work has focused on loss functions and strategies that push all images from the same class as close together in the embedding space as possible. In this paper, we propose an alternative, loosened embedding strategy that only requires the embedding function to map each training image near the most similar examples from the same class, an approach we call "Easy Positive" mining. We present a collection of experiments and visualizations showing that Easy Positive mining leads to embeddings that are more flexible and generalize better to new, unseen data. This simple mining strategy yields recall that exceeds state-of-the-art approaches (including those with complicated loss functions and ensemble methods) on image retrieval datasets including CUB, Stanford Online Products, In-Shop Clothes and Hotels-50K.
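The core idea above, selecting the *most* similar same-class example for each anchor rather than forcing all same-class pairs together, can be sketched as a mining step. The code below is a minimal illustration, not the paper's implementation: it pairs each anchor's easy positive with its hardest negative under cosine similarity, whereas the full EPSHN method combines easy positives with semi-hard negatives and an NCA-style loss.

```python
import numpy as np

def easy_positive_triplets(embeddings, labels):
    """For each anchor, mine the most similar same-class example
    (the "easy positive") and the most similar different-class example.

    Simplified sketch: the hardest-negative choice stands in for the
    semi-hard negative selection used in the actual EPSHN method.
    """
    # Cosine similarity between L2-normalized embeddings.
    x = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = x @ x.T
    labels = np.asarray(labels)
    triplets = []
    for a in range(len(labels)):
        same = labels == labels[a]
        same[a] = False                   # exclude the anchor itself
        diff = labels != labels[a]
        if not same.any() or not diff.any():
            continue                      # no valid positive or negative
        # Easy positive: closest example from the same class.
        pos = np.where(same)[0][np.argmax(sim[a, same])]
        # Negative: closest example from a different class.
        neg = np.where(diff)[0][np.argmax(sim[a, diff])]
        triplets.append((a, pos, neg))
    return triplets
```

The easy-positive choice is what loosens the embedding objective: the anchor only has to stay close to its nearest same-class neighbor, so distinct modes within a class are not forced to collapse into one point.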


Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Metric Learning | CARS196 | EPSHN(64) | R@1 | 75.5 | #35 |
| Image Retrieval | CARS196 | EPSHN(512) | R@1 | 82.7 | #7 |
| Metric Learning | CARS196 | EPSHN(512) | R@1 | 82.7 | #31 |
| Metric Learning | CUB-200-2011 | EPSHN(512) | R@1 | 64.9 | #22 |
| Metric Learning | CUB-200-2011 | EPSHN(64) | R@1 | 57.3 | #29 |
| Image Retrieval | CUB-200-2011 | EPSHN(512) | R@1 | 64.9 | #8 |
| Metric Learning | In-Shop | EPSHN(512) | R@1 | 87.8 | #15 |
| Image Retrieval | In-Shop | EPSHN(512) | R@1 | 87.8 | #6 |
| Image Retrieval | SOP | EPSHN(512) | R@1 | 78.3 | #10 |
| Metric Learning | Stanford Online Products | EPSHN(512) | R@1 | 78.3 | #28 |
