Search Results for author: Minh Ha Quang

Found 6 papers, 0 papers with code

Kullback-Leibler and Renyi divergences in reproducing kernel Hilbert space and Gaussian process settings

no code implementations18 Jul 2022 Minh Ha Quang

In this work, we present formulations for regularized Kullback-Leibler and Rényi divergences via the Alpha Log-Determinant (Log-Det) divergences between positive Hilbert-Schmidt operators on Hilbert spaces in two different settings, namely (i) covariance operators and Gaussian measures defined on reproducing kernel Hilbert spaces (RKHS); and (ii) Gaussian processes with square integrable sample paths.

Gaussian Processes
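
To fix ideas, here is a hedged finite-dimensional sketch, not the paper's regularized operator formulation: it computes the Kullback-Leibler divergence between two centered Gaussians directly from their covariance matrices, together with the Alpha Log-Det divergence of Chebbi and Moakher that the Log-Det machinery above generalizes. The covariance matrices and the value of alpha are illustrative assumptions.

```python
# Hedged finite-dimensional sketch: KL divergence between centered Gaussians
# N(0, A), N(0, B) and the Alpha Log-Det divergence between SPD matrices.
# The paper regularizes and extends these quantities to covariance operators on
# RKHS and to Gaussian processes; this is only the classical matrix case.
import numpy as np

def kl_centered_gaussians(A, B):
    """KL(N(0, A) || N(0, B)) = 0.5 * [tr(B^{-1} A) - d + log det B - log det A]."""
    d = A.shape[0]
    Binv_A = np.linalg.solve(B, A)
    _, logdet_A = np.linalg.slogdet(A)
    _, logdet_B = np.linalg.slogdet(B)
    return 0.5 * (np.trace(Binv_A) - d + logdet_B - logdet_A)

def alpha_logdet_divergence(A, B, alpha):
    """Alpha Log-Det divergence (Chebbi-Moakher), alpha in (-1, 1):
    D_alpha(A, B) = 4/(1 - alpha^2) * log[ det(((1-alpha)A + (1+alpha)B)/2)
                    / (det(A)^{(1-alpha)/2} * det(B)^{(1+alpha)/2}) ]."""
    _, ld_mix = np.linalg.slogdet(((1 - alpha) * A + (1 + alpha) * B) / 2)
    _, ld_A = np.linalg.slogdet(A)
    _, ld_B = np.linalg.slogdet(B)
    return (4.0 / (1 - alpha ** 2)) * (ld_mix - 0.5 * (1 - alpha) * ld_A - 0.5 * (1 + alpha) * ld_B)

# Illustrative SPD matrices standing in for covariance operators.
rng = np.random.default_rng(0)
A = np.cov(rng.standard_normal((5, 200))) + 1e-6 * np.eye(5)
B = np.cov(1.5 * rng.standard_normal((5, 200))) + 1e-6 * np.eye(5)
print(kl_centered_gaussians(A, B), alpha_logdet_divergence(A, B, alpha=0.5))
```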

Finite sample approximations of exact and entropic Wasserstein distances between covariance operators and Gaussian processes

no code implementations26 Apr 2021 Minh Ha Quang

Using this representation, we show that the Sinkhorn divergence between two centered Gaussian processes can be consistently and efficiently estimated from the divergence between their corresponding normalized finite-dimensional covariance matrices, or alternatively, their sample covariance operators.

Gaussian Processes
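
Below is a generic Monte-Carlo sketch of the quantity being estimated, under illustrative assumptions: the RBF kernels, grid size, sample size, and regularization parameter eps are all made up, and the plain sample-based Sinkhorn loop is not the covariance-matrix-based estimator whose consistency the paper establishes.

```python
# Generic sketch: Sinkhorn divergence between two centered Gaussian processes
# after restriction to a finite grid, i.e. between N(0, K1) and N(0, K2),
# estimated from finite sample paths.  All parameters below are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def rbf(s, t, ell):
    """Squared-exponential GP covariance function."""
    return np.exp(-(s - t) ** 2 / (2.0 * ell ** 2))

m = 20                                    # grid size
grid = np.linspace(0.0, 1.0, m)
K1 = rbf(grid[:, None], grid[None, :], ell=0.2) + 1e-8 * np.eye(m)
K2 = rbf(grid[:, None], grid[None, :], ell=0.5) + 1e-8 * np.eye(m)

n = 200                                   # number of sample paths per process
X = rng.multivariate_normal(np.zeros(m), K1, size=n)
Y = rng.multivariate_normal(np.zeros(m), K2, size=n)

def entropic_ot(X, Y, eps, n_iter=500):
    """Entropic OT transport cost <P, M> between uniform empirical measures."""
    M = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)   # squared Euclidean cost
    a = np.full(X.shape[0], 1.0 / X.shape[0])
    b = np.full(Y.shape[0], 1.0 / Y.shape[0])
    K = np.exp(-M / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):                               # Sinkhorn fixed-point updates
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]                       # regularized transport plan
    return float((P * M).sum())

def sinkhorn_divergence(X, Y, eps):
    """Debiased divergence; conventions differ on keeping the entropic term."""
    return entropic_ot(X, Y, eps) - 0.5 * (entropic_ot(X, X, eps) + entropic_ot(Y, Y, eps))

print(sinkhorn_divergence(X, Y, eps=5.0))   # eps chosen large for numerical stability
```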

Convergence and finite sample approximations of entropic regularized Wasserstein distances in Gaussian and RKHS settings

no code implementations5 Jan 2021 Minh Ha Quang

Our first main result is that for Gaussian measures on an infinite-dimensional Hilbert space, convergence in the 2-Sinkhorn divergence is strictly weaker than convergence in the exact 2-Wasserstein distance.
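
For reference, the exact 2-Wasserstein distance appearing in this statement has a well-known closed form between Gaussian measures on a Hilbert space (a standard fact, not a result of this paper):

W_2^2\big(\mathcal{N}(m_1, C_1), \mathcal{N}(m_2, C_2)\big) = \|m_1 - m_2\|^2 + \mathrm{tr}(C_1) + \mathrm{tr}(C_2) - 2\,\mathrm{tr}\big[(C_1^{1/2} C_2\, C_1^{1/2})^{1/2}\big].

The result above says that, in infinite dimensions, convergence of the 2-Sinkhorn divergence to zero does not force this quantity to converge to zero.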

Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes

no code implementations15 Nov 2020 Minh Ha Quang

This work studies the entropic regularization formulation of the 2-Wasserstein distance on an infinite-dimensional Hilbert space, in particular for the Gaussian setting.

Gaussian Processes
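
As background, the entropic regularization in question is the standard one (this formulation is well known and not specific to the paper): for probability measures \mu, \nu and a regularization parameter \epsilon > 0,

\mathrm{OT}_\epsilon(\mu, \nu) = \inf_{\pi \in \Pi(\mu, \nu)} \Big[ \int \|x - y\|^2 \, d\pi(x, y) + \epsilon\, \mathrm{KL}(\pi \,\|\, \mu \otimes \nu) \Big],

where \Pi(\mu, \nu) is the set of couplings of \mu and \nu. The paper analyzes this formulation when \mu and \nu are Gaussian measures on an infinite-dimensional Hilbert space.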

Infinite-dimensional Log-Determinant divergences II: Alpha-Beta divergences

no code implementations13 Oct 2016 Minh Ha Quang

This work presents a parametrized family of divergences, namely Alpha-Beta Log-Determinant (Log-Det) divergences, between positive definite unitized trace class operators on a Hilbert space.
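
In finite dimensions, one common parametrization of this family, due to Cichocki, Cruces, and Amari, can be evaluated from the generalized eigenvalues of the two matrices; the sketch below uses that parametrization and should be read as an assumption, since the paper's unitized trace-class-operator version may be normalized differently.

```python
# Hedged sketch: finite-dimensional Alpha-Beta Log-Det divergence between SPD
# matrices P, Q, written via the eigenvalues of P Q^{-1} (Cichocki-Cruces-Amari
# parametrization; the paper's unitized trace-class-operator version may differ).
import numpy as np
from scipy.linalg import eigh

def ab_logdet_divergence(P, Q, alpha, beta):
    """D^{(alpha,beta)}(P, Q) = 1/(alpha*beta) * sum_i log[(alpha*l_i^beta + beta*l_i^(-alpha)) / (alpha + beta)],
    where l_i are the generalized eigenvalues of (P, Q); assumes alpha, beta, alpha+beta != 0."""
    lam = eigh(P, Q, eigvals_only=True)   # eigenvalues of Q^{-1} P (same spectrum as P Q^{-1})
    terms = np.log((alpha * lam ** beta + beta * lam ** (-alpha)) / (alpha + beta))
    return terms.sum() / (alpha * beta)

# Illustrative SPD inputs.
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 6)); P = A @ A.T + 1e-3 * np.eye(6)
B = rng.standard_normal((6, 6)); Q = B @ B.T + 1e-3 * np.eye(6)
print(ab_logdet_divergence(P, Q, alpha=0.5, beta=0.5))   # with this formula, alpha=beta=1/2 gives 4x the symmetric Stein divergence
```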

Log-Hilbert-Schmidt metric between positive definite operators on Hilbert spaces

no code implementations NeurIPS 2014 Minh Ha Quang, Marco San Biagio, Vittorio Murino

This paper introduces a novel mathematical and computational framework, namely the Log-Hilbert-Schmidt metric between positive definite operators on a Hilbert space.

Image Classification
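
In the finite-dimensional case this metric reduces to the familiar Log-Euclidean distance between symmetric positive definite matrices; the minimal sketch below shows only that special case, with illustrative inputs, and does not reproduce the paper's infinite-dimensional unitized construction.

```python
# Minimal finite-dimensional sketch: the Log-Euclidean distance between SPD
# matrices, d(A, B) = || logm(A) - logm(B) ||_F.  The paper's Log-Hilbert-Schmidt
# metric extends this idea to positive definite (unitized) operators on an
# infinite-dimensional Hilbert space; that extension is not reproduced here.
import numpy as np
from scipy.linalg import logm

def log_euclidean_distance(A, B):
    """Frobenius norm of the difference of matrix logarithms of SPD matrices."""
    return np.linalg.norm(logm(A) - logm(B), ord='fro')

# Illustrative SPD matrices, e.g. regularized covariance descriptors of two images.
rng = np.random.default_rng(2)
X = rng.standard_normal((100, 8)); A = X.T @ X / 100 + 1e-6 * np.eye(8)
Y = rng.standard_normal((100, 8)); B = Y.T @ Y / 100 + 1e-6 * np.eye(8)
print(log_euclidean_distance(A, B))
```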
