no code implementations • 28 Feb 2019 • Hiva Ghanbari, Minhan Li, Katya Scheinberg
In this work, we show that in the case of linear predictors, the expected error and the expected ranking loss can be effectively approximated by smooth functions whose closed-form expressions, and those of their first- (and second-) order derivatives, depend on the first and second moments of the data distribution, which can be precomputed.
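As an illustration of the idea, the sketch below evaluates a smooth approximation of the expected 0-1 error of a linear predictor from precomputed class moments, under the simplifying assumption (mine, not necessarily the paper's exact model) that each class is Gaussian with a shared covariance:

```python
import numpy as np
from math import erf, sqrt

def normal_cdf(z):
    # Standard normal CDF written via the error function.
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def smooth_expected_error(w, mu_pos, mu_neg, sigma, prior_pos=0.5):
    """Smooth surrogate for the expected 0-1 error of sign(w @ x).

    Assumes (for illustration) class-conditional Gaussians with
    precomputed means mu_pos / mu_neg and shared covariance sigma,
    so the error has a closed form in the first and second moments.
    """
    s = sqrt(w @ sigma @ w)                    # std of the score w @ x
    err_pos = normal_cdf(-(w @ mu_pos) / s)    # P(w @ x < 0 | y = +1)
    err_neg = normal_cdf((w @ mu_neg) / s)     # P(w @ x > 0 | y = -1)
    return prior_pos * err_pos + (1.0 - prior_pos) * err_neg
```

Because the surrogate is smooth in `w`, its gradient (and Hessian) can be derived analytically from the same moments, which is what makes gradient-based training of the predictor possible.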
no code implementations • 7 Feb 2018 • Hiva Ghanbari, Katya Scheinberg
We show that even when the data are not normally distributed, the computed derivatives are sufficiently accurate to yield an efficient optimization method and high-quality solutions.
no code implementations • 20 Mar 2017 • Hiva Ghanbari, Katya Scheinberg
In this work, we utilize a trust-region-based derivative-free optimization (DFO-TR) method to directly maximize the area under the receiver operating characteristic curve (AUC), which is a nonsmooth, noisy function.
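To make the setting concrete, the sketch below treats the empirical AUC of a linear scorer as a black-box objective and maximizes it with a crude derivative-free search. The shrinking sampling radius loosely mimics a trust region, but this is a simplified stand-in of my own, not the paper's DFO-TR algorithm:

```python
import numpy as np

def empirical_auc(w, X_pos, X_neg):
    # Mann-Whitney form of AUC: the probability that a positive
    # example scores above a negative one (ties count half).
    diff = (X_pos @ w)[:, None] - (X_neg @ w)[None, :]
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

def random_search_auc(X_pos, X_neg, iters=200, seed=0):
    # Derivative-free search: propose a perturbed w, keep it if the
    # (nonsmooth, piecewise-constant) AUC improves, otherwise shrink
    # the sampling radius, loosely in the spirit of a trust region.
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(X_pos.shape[1])
    best = empirical_auc(w, X_pos, X_neg)
    step = 1.0
    for _ in range(iters):
        cand = w + step * rng.standard_normal(w.size)
        val = empirical_auc(cand, X_pos, X_neg)
        if val > best:
            w, best = cand, val
        else:
            step *= 0.99
    return w, best
```

The piecewise-constant, step-like shape of `empirical_auc` is exactly why gradient-based methods cannot be applied to it directly and a derivative-free approach is natural.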
no code implementations • 11 Jul 2016 • Hiva Ghanbari, Katya Scheinberg
In [19], a general, inexact, efficient proximal quasi-Newton algorithm for composite optimization problems was proposed and a sublinear global convergence rate was established.
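For context, a composite problem has the form min f(x) + g(x) with f smooth and g simple but possibly nonsmooth. The sketch below shows the basic proximal gradient step (ISTA) for the lasso instance of this problem; the algorithm in [19] refines this template by replacing the scalar step with a quasi-Newton metric and allowing inexact subproblem solves, neither of which is shown here:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (the "simple" nonsmooth part).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, steps=500):
    """Proximal gradient for min 0.5*||Ax - b||^2 + lam*||x||_1.

    A plain illustration of the proximal step that proximal
    quasi-Newton methods accelerate with curvature information.
    """
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of grad f
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)         # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

Each iteration is a gradient step on the smooth part followed by the proximal map of the nonsmooth part, which is the structure the quasi-Newton variant preserves while reweighting the step.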