1 code implementation • 7 Mar 2023 • Soroosh Shafieezadeh-Abadeh, Liviu Aolaritei, Florian Dörfler, Daniel Kuhn
We study optimal transport-based distributionally robust optimization problems where a fictitious adversary, often envisioned as nature, can choose the distribution of the uncertain problem parameters by reshaping a prescribed reference distribution at a finite transportation cost.
no code implementations • 2 Mar 2022 • Bahar Taşkesen, Soroosh Shafieezadeh-Abadeh, Daniel Kuhn, Karthik Natarajan
We study the computational complexity of the optimal transport problem that evaluates the Wasserstein distance between the distributions of two K-dimensional discrete random vectors.
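For intuition, the Wasserstein distance between two discrete distributions is the optimal value of a linear program (the Kantorovich formulation). The sketch below is illustrative only and is not the paper's algorithm; the helper name, data, and ground cost are assumptions.

```python
# Illustrative sketch: 1-Wasserstein distance between two discrete
# distributions via linear programming (Kantorovich formulation).
# All names and data here are hypothetical examples.
import numpy as np
from scipy.optimize import linprog

def wasserstein_discrete(p, q, cost):
    """Solve min_T <C, T> s.t. T @ 1 = p, T.T @ 1 = q, T >= 0."""
    m, n = cost.shape
    c = cost.ravel()  # objective: sum_ij C_ij * T_ij over flattened T
    # Row-sum constraints: sum_j T_ij = p_i
    A_rows = np.zeros((m, m * n))
    for i in range(m):
        A_rows[i, i * n:(i + 1) * n] = 1.0
    # Column-sum constraints: sum_i T_ij = q_j
    A_cols = np.zeros((n, m * n))
    for j in range(n):
        A_cols[j, j::n] = 1.0
    res = linprog(c, A_eq=np.vstack([A_rows, A_cols]),
                  b_eq=np.concatenate([p, q]),
                  bounds=(0, None), method="highs")
    return res.fun

# Two distributions on the points {0, 1, 2} with |x - y| ground cost.
p = np.array([0.5, 0.5, 0.0])
q = np.array([0.0, 0.5, 0.5])
pts = np.array([0.0, 1.0, 2.0])
cost = np.abs(pts[:, None] - pts[None, :])
print(wasserstein_discrete(p, q, cost))  # 1.0: all mass shifts one unit right
```

The LP has m*n variables, which is exactly why the complexity question studied in the paper matters for large K-dimensional supports.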
1 code implementation • 10 Mar 2021 • Bahar Taşkesen, Soroosh Shafieezadeh-Abadeh, Daniel Kuhn
Semi-discrete optimal transport problems, which evaluate the Wasserstein distance between a discrete and a generic (possibly non-discrete) probability measure, are believed to be computationally hard.
1 code implementation • 8 Nov 2019 • Viet Anh Nguyen, Soroosh Shafieezadeh-Abadeh, Daniel Kuhn, Peyman Mohajerin Esfahani
The proposed model can be viewed as a zero-sum game between a statistician and a fictitious adversary. The statistician chooses an estimator (a measurable function of the observation) to minimize the expected squared estimation error, while the adversary chooses a prior (a pair of signal and noise distributions ranging over independent Wasserstein balls) to maximize it.
1 code implementation • NeurIPS 2019 • Viet Anh Nguyen, Soroosh Shafieezadeh-Abadeh, Man-Chung Yue, Daniel Kuhn, Wolfram Wiesemann
The likelihood function is a fundamental component in Bayesian statistics.
1 code implementation • NeurIPS 2019 • Viet Anh Nguyen, Soroosh Shafieezadeh-Abadeh, Man-Chung Yue, Daniel Kuhn, Wolfram Wiesemann
A fundamental problem arising in many areas of machine learning is the evaluation of the likelihood of a given observation under different nominal distributions.
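A minimal numerical illustration of this evaluation problem, assuming Gaussian nominal distributions (the choice of distributions and parameters is purely illustrative, not taken from the paper):

```python
# Evaluating the likelihood of one observation under two different
# nominal distributions; Gaussians and their parameters are assumptions.
from scipy.stats import norm

x = 1.5  # a single observation
lik_a = norm.pdf(x, loc=0.0, scale=1.0)  # likelihood under nominal A
lik_b = norm.pdf(x, loc=2.0, scale=1.0)  # likelihood under nominal B
# x = 1.5 lies closer to the mean of B, so B assigns it higher likelihood.
print(lik_a, lik_b)
```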
no code implementations • 23 Aug 2019 • Daniel Kuhn, Peyman Mohajerin Esfahani, Viet Anh Nguyen, Soroosh Shafieezadeh-Abadeh
The goal of data-driven decision-making is to learn a decision from finitely many training samples that will perform well on unseen test samples.
1 code implementation • NeurIPS 2018 • Soroosh Shafieezadeh-Abadeh, Viet Anh Nguyen, Daniel Kuhn, Peyman Mohajerin Esfahani
Despite the non-convex nature of the ambiguity set, we prove that the estimation problem is equivalent to a tractable convex program.
1 code implementation • 27 Oct 2017 • Soroosh Shafieezadeh-Abadeh, Daniel Kuhn, Peyman Mohajerin Esfahani
The goal of regression and classification methods in supervised learning is to minimize the empirical risk, that is, the expectation under the empirical distribution of a loss function quantifying the prediction error.
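As a concrete instance of empirical risk minimization, the sketch below fits a linear predictor by minimizing the sample average of the squared loss; the data and variable names are illustrative assumptions:

```python
# Illustrative sketch: empirical risk minimization with squared loss.
# Synthetic data; names and constants are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=100)

def empirical_risk(w, X, y):
    # Expectation of the squared loss under the empirical distribution,
    # i.e. the sample average of the squared prediction error.
    return np.mean((X @ w - y) ** 2)

# Least squares minimizes this empirical risk in closed form.
w_hat = np.linalg.lstsq(X, y, rcond=None)[0]
print(empirical_risk(w_hat, X, y))  # small: close to the noise variance
```

The distributionally robust approach studied in the paper replaces the empirical distribution in this objective with a worst-case distribution from a Wasserstein ball around it.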
no code implementations • NeurIPS 2015 • Soroosh Shafieezadeh-Abadeh, Peyman Mohajerin Esfahani, Daniel Kuhn
This paper proposes a distributionally robust approach to logistic regression.
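Wasserstein distributionally robust logistic regression is known to reduce, under suitable transport metrics, to the empirical logistic loss plus a norm penalty on the weights scaled by the ball radius. The sketch below shows only that simplified regularized form; the radius, norm choice, solver, and data are assumptions, not the paper's exact formulation.

```python
# Hedged sketch: a regularized surrogate of distributionally robust
# logistic regression (empirical logistic loss + eps * ||w||).
# Radius eps, the norm, and the synthetic data are assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] + 0.3 * rng.normal(size=200) > 0, 1.0, -1.0)

def dro_logistic_objective(w, X, y, eps):
    # Average logistic loss plus a worst-case surcharge eps * ||w||_2.
    margins = y * (X @ w)
    return np.mean(np.log1p(np.exp(-margins))) + eps * np.linalg.norm(w)

# Start away from w = 0, where the norm term is non-smooth; use a
# derivative-free method for robustness in this small 2-D example.
res = minimize(dro_logistic_objective, x0=np.full(2, 0.1),
               args=(X, y, 0.1), method="Nelder-Mead")
w_robust = res.x
accuracy = np.mean(np.sign(X @ w_robust) == y)
print(w_robust, accuracy)
```

The penalty term is what distinguishes the robust fit from plain empirical risk minimization: it hedges against distributions within transport distance eps of the empirical one.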