no code implementations • 19 Dec 2023 • Reza Belbasi, Aras Selvi, Wolfram Wiesemann
A key challenge in this context is the presence of estimation errors in the prediction models, which tend to be amplified by the subsequent optimization model -- a phenomenon that is often referred to as the Optimizer's Curse or the Error-Maximization Effect of Optimization.
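The Optimizer's Curse is easy to reproduce in a few lines: when several alternatives have identical true values but are ranked by noisy estimates, the estimated value of the chosen alternative is systematically optimistic. A minimal simulation (not from the paper; all numbers illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n_alternatives, n_trials = 10, 100_000

# All alternatives have the same true value of 0; only the estimates differ.
estimates = rng.normal(loc=0.0, scale=1.0, size=(n_trials, n_alternatives))

# The optimization step picks the alternative with the best estimate.
picked_estimate = estimates.max(axis=1).mean()

# The true value of the picked alternative is always 0, so the gap below
# is pure estimation error amplified by the selection step.
print(f"mean estimated value of chosen alternative: {picked_estimate:.2f}")
print("true value of chosen alternative: 0.00")
```

For ten standard-normal estimates, the optimistic bias is roughly 1.5, even though every alternative is truly worth zero.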
no code implementations • 11 Nov 2023 • Florian Joseph Baader, Stefano Moret, Wolfram Wiesemann, Iain Staffell, André Bardow
Uncertainties surrounding the energy transition often lead modelers to present large sets of scenarios that are challenging for policymakers to interpret and act upon.
1 code implementation • 25 Apr 2023 • Aras Selvi, Huikang Liu, Wolfram Wiesemann
We show that the problem affords a strong dual, and we exploit this duality to develop converging hierarchies of finite-dimensional upper and lower bounding problems.
no code implementations • 27 May 2022 • Chin Pang Ho, Marek Petrik, Wolfram Wiesemann
In recent years, robust Markov decision processes (MDPs) have emerged as a prominent modeling framework for dynamic decision problems affected by uncertainty.
1 code implementation • 16 Jun 2020 • Chin Pang Ho, Marek Petrik, Wolfram Wiesemann
Robust Markov decision processes (MDPs) make it possible to compute reliable solutions for dynamic decision problems whose evolution is modeled by rewards and partially known transition probabilities.
no code implementations • 15 Apr 2020 • Man-Chung Yue, Daniel Kuhn, Wolfram Wiesemann
In this technical note we prove that the Wasserstein ball is weakly compact under mild conditions, and we offer necessary and sufficient conditions for the existence of optimal solutions.
1 code implementation • NeurIPS 2019 • Viet Anh Nguyen, Soroosh Shafieezadeh-Abadeh, Man-Chung Yue, Daniel Kuhn, Wolfram Wiesemann
The likelihood function is a fundamental component in Bayesian statistics.
1 code implementation • NeurIPS 2019 • Viet Anh Nguyen, Soroosh Shafieezadeh-Abadeh, Man-Chung Yue, Daniel Kuhn, Wolfram Wiesemann
A fundamental problem arising in many areas of machine learning is the evaluation of the likelihood of a given observation under different nominal distributions.
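The basic task this line of work starts from can be stated concretely: given one observation, compute its likelihood under each candidate nominal distribution and compare. A minimal sketch with Gaussian nominals (the model names and parameters here are illustrative, not from the paper):

```python
import math

def gaussian_log_likelihood(x: float, mean: float, std: float) -> float:
    """Log-density of a single observation under a Gaussian nominal."""
    return -0.5 * math.log(2 * math.pi * std**2) - (x - mean) ** 2 / (2 * std**2)

observation = 1.2
nominals = {"model_a": (0.0, 1.0), "model_b": (1.0, 0.5), "model_c": (2.0, 1.0)}

scores = {name: gaussian_log_likelihood(observation, m, s)
          for name, (m, s) in nominals.items()}
best = max(scores, key=scores.get)
print(best)  # prints "model_b": its mean is closest relative to its spread
```

The paper's contribution concerns robustifying exactly this kind of comparison when the nominal distributions are themselves estimated.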
no code implementations • ICML 2018 • Chin Pang Ho, Marek Petrik, Wolfram Wiesemann
The first algorithm uses a homotopy continuation method to compute updates for L1-constrained (s, a)-rectangular ambiguity sets.
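For L1-constrained (s, a)-rectangular sets, the inner worst-case problem for a single state-action pair has a well-known greedy solution: shift probability mass, up to half the L1 budget, from high-value successor states onto the lowest-value one. A hedged sketch of that inner problem (this is the basic subproblem, not the paper's homotopy algorithm; all numbers illustrative):

```python
import numpy as np

def worst_case_expected_value(p_nominal, values, budget):
    """Minimise p.v over {p >= 0, sum(p) = 1, ||p - p_nominal||_1 <= budget}.

    Greedy rule: move at most budget/2 of probability mass onto the
    worst successor state, taking it from the best successors first.
    """
    p = np.array(p_nominal, dtype=float)
    v = np.asarray(values, dtype=float)
    target = int(np.argmin(v))                    # lowest-value successor
    mass_to_move = min(budget / 2.0, 1.0 - p[target])
    for i in np.argsort(v)[::-1]:                 # highest-value states first
        if i == target or mass_to_move <= 0:
            continue
        shift = min(p[i], mass_to_move)
        p[i] -= shift
        p[target] += shift
        mass_to_move -= shift
    return float(p @ v)

# Nominal kernel favours a good state; the adversary moves what the
# budget allows onto the bad state.
print(worst_case_expected_value([0.7, 0.2, 0.1], [10.0, 5.0, 0.0], budget=0.4))
# → 6.0 (the nominal expected value is 8.0)
```

The homotopy approach traces how this worst-case solution changes as the budget varies, rather than re-solving from scratch at each step.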
no code implementations • 22 May 2017 • Napat Rujeerapaiboon, Kilian Schindler, Daniel Kuhn, Wolfram Wiesemann
Plain vanilla K-means clustering has proven to be successful in practice, yet it suffers from outlier sensitivity and may produce highly unbalanced clusters.
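The outlier sensitivity the abstract mentions is easy to demonstrate: a single far-away point can capture a centroid of its own, leaving all remaining data in one large cluster. A minimal Lloyd's-algorithm sketch on synthetic data (the data and seeds are illustrative, not from the paper):

```python
import numpy as np

def kmeans(X, k, n_iters=50, seed=0):
    """Plain Lloyd's algorithm: alternate assignments and centroid updates."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(n_iters):
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

rng = np.random.default_rng(1)
cluster_a = rng.normal([0.0, 0.0], 0.3, size=(20, 2))
cluster_b = rng.normal([2.0, 0.0], 0.3, size=(20, 2))
outlier = np.array([[50.0, 50.0]])
X = np.vstack([cluster_a, cluster_b, outlier])

labels, _ = kmeans(X, k=2)
sizes = np.bincount(labels, minlength=2)
print(sorted(sizes))  # the outlier pulls one centroid far away from the data
```

With k=2, the single outlier typically ends up alone in its cluster while the 40 regular points collapse into the other, which is exactly the imbalance the paper's cardinality-constrained formulation guards against.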