Metric Learning for 3D Point Clouds Using Optimal Transport

Learning embeddings of any data largely depends on the ability of the target space to capture semantic relations. The widely used Euclidean space, where embeddings are represented as point vectors, is known to be lacking in its potential to exploit complex structures and relations. Contrary to standard Euclidean embeddings, in this work, we embed point clouds as discrete probability distributions in Wasserstein space. We build a contrastive learning setup to learn Wasserstein embeddings that can be used as a pre-training method, with or without supervision, for any downstream task. We show that the features captured by Wasserstein embeddings are better at preserving the point cloud geometry, including both global and local information, thus resulting in improved-quality embeddings. We perform exhaustive experiments and demonstrate the effectiveness of our method for point cloud classification, transfer learning, segmentation, and interpolation tasks over multiple datasets, including synthetic and real-world objects. We also compare against recent methods that use Wasserstein space and show that our method outperforms them in all downstream tasks. Additionally, our study reveals a promising interpretation of capturing critical points of point clouds that makes our proposed method self-explainable.
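The abstract treats point clouds as discrete probability distributions compared in Wasserstein space. As a rough, self-contained illustration of that underlying distance (not the authors' implementation or training setup), the sketch below views two point clouds as discrete uniform distributions over their points and computes an entropy-regularized Wasserstein distance with Sinkhorn iterations; the function name and the `reg` and `n_iters` hyperparameters are illustrative assumptions.

```python
# Minimal sketch, assuming point clouds are modeled as uniform discrete
# distributions over their points (as in the Wasserstein-space view above).
import numpy as np

def sinkhorn_wasserstein(x, y, reg=0.5, n_iters=200):
    """Entropy-regularized Wasserstein cost between clouds x (n,3) and y (m,3)."""
    n, m = x.shape[0], y.shape[0]
    a = np.full(n, 1.0 / n)          # uniform weight on each point of x
    b = np.full(m, 1.0 / m)          # uniform weight on each point of y
    # pairwise squared Euclidean ground cost
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    K = np.exp(-C / reg)             # Gibbs kernel (larger reg = more stable)
    u = np.ones(n)
    for _ in range(n_iters):         # Sinkhorn fixed-point updates
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]  # entropic optimal transport plan
    return float((P * C).sum())      # regularized transport cost

# usage: distance between two random point clouds of different sizes
x = np.random.randn(128, 3)
y = np.random.randn(96, 3) + 0.5
print(sinkhorn_wasserstein(x, y))
```

A distance of this form can serve as the metric between embedded distributions; exact linear-programming solvers are also possible, but the entropic version is the usual differentiable choice in learning pipelines.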
