1 code implementation • 2 Jun 2022 • Zebang Shen, Zhenfu Wang, Satyen Kale, Alejandro Ribeiro, Amin Karbasi, Hamed Hassani
In this paper, we exploit this concept to design a potential function of the hypothesis velocity fields, and prove that, if this function diminishes to zero during training, then the trajectory of the densities generated by the hypothesis velocity fields converges, in the Wasserstein-2 sense, to the solution of the Fokker-Planck equation (FPE).
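The "trajectory of densities generated by a velocity field" can be visualized with a minimal particle simulation: pushing samples forward along a velocity field by Euler steps, so their empirical distribution traces the density trajectory. This is only an illustrative sketch (the paper works with densities and a Wasserstein-2 metric, not with particles), and the linear field `v(x, t) = -x` below is a hypothetical stand-in for a learned hypothesis velocity field.

```python
import numpy as np

def transport_particles(v, x0, t_grid):
    """Push particles along a velocity field v(x, t) with forward Euler steps.

    The empirical distribution of the particles at each time on t_grid
    approximates the density trajectory generated by v.
    """
    x = x0.copy()
    traj = [x.copy()]
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        x = x + (t1 - t0) * v(x, t0)  # Euler step along the field
        traj.append(x.copy())
    return traj

# Hypothetical contracting field: each Euler step with dt = 0.1 scales x by 0.9.
v = lambda x, t: -x
rng = np.random.default_rng(0)
traj = transport_particles(v, rng.standard_normal(1000), np.linspace(0.0, 1.0, 11))
```

Under this field the particle cloud contracts toward the origin, so the standard deviation of the empirical distribution shrinks by a factor of 0.9 per step.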
no code implementations • NeurIPS 2020 • Zebang Shen, Zhenfu Wang, Alejandro Ribeiro, Hamed Hassani
In this regard, we propose Sinkhorn Natural Gradient (SiNG), a novel algorithm that acts as a steepest-descent method on the probability space endowed with the Sinkhorn divergence.
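For context, the Sinkhorn divergence underlying SiNG can be computed with standard Sinkhorn scaling iterations. The sketch below shows only the divergence itself, not the SiNG algorithm; the debiased form `S(a, b) = OT(a, b) - (OT(a, a) + OT(b, b)) / 2` (with the transport cost of the entropic plan) is one common convention, used here as an assumption.

```python
import numpy as np

def sinkhorn_cost(a, b, C, eps=0.1, n_iter=200):
    """Transport cost <P, C> of the entropic plan between weights a, b."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):       # alternating marginal-matching scalings
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]  # entropic transport plan
    return float(np.sum(P * C))

def sinkhorn_divergence(a, b, Cab, Caa, Cbb, eps=0.1):
    """Debiased divergence: S(a, b) = OT(a, b) - (OT(a, a) + OT(b, b)) / 2."""
    return (sinkhorn_cost(a, b, Cab, eps)
            - 0.5 * (sinkhorn_cost(a, a, Caa, eps) + sinkhorn_cost(b, b, Cbb, eps)))
```

By construction the debiasing term makes `S(a, a) = 0`, which is what qualifies the Sinkhorn divergence (unlike the raw entropic cost) as a divergence on the probability space.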
no code implementations • NeurIPS 2020 • Zebang Shen, Zhenfu Wang, Alejandro Ribeiro, Hamed Hassani
In this paper, we consider the problem of computing the barycenter of a set of probability distributions under the Sinkhorn divergence.
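A simple baseline for the barycenter problem, useful for intuition, is the fixed-support entropic barycenter computed by iterative Bregman projections (in the style of Benamou et al.). This is an illustrative stand-in, not the paper's method: it fixes the support grid in advance, whereas a functional-gradient scheme moves the barycenter itself.

```python
import numpy as np

def sinkhorn_barycenter(dists, C, weights, eps=0.01, n_iter=100):
    """Entropic barycenter of histograms `dists` on a fixed support with cost C.

    Each iteration rescales the transport plans to match the input marginals,
    then sets the barycenter to the weighted geometric mean of the plans'
    other marginals.
    """
    K = np.exp(-C / eps)
    u = np.ones((len(dists), C.shape[0]))
    for _ in range(n_iter):
        v = np.array([a / (K.T @ ui) for a, ui in zip(dists, u)])
        Kv = np.array([K @ vi for vi in v])
        # weighted geometric mean of the current marginals -> barycenter
        b = np.exp(sum(w * np.log(ui * kv) for w, ui, kv in zip(weights, u, Kv)))
        u = np.array([b / kv for kv in Kv])
    return b

# Hypothetical example: two mirrored Gaussians on a 1-D grid.
x = np.linspace(0.0, 1.0, 40)
a1 = np.exp(-((x - 0.3) ** 2) / 0.005); a1 /= a1.sum()
a2 = np.exp(-((x - 0.7) ** 2) / 0.005); a2 /= a2.sum()
C = (x[:, None] - x[None, :]) ** 2
b = sinkhorn_barycenter([a1, a2], C, [0.5, 0.5])
```

With equal weights and mirror-symmetric inputs, the barycenter is itself symmetric, so its mean sits at the midpoint 0.5 of the two input means.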