no code implementations • 9 Jan 2023 • Chuan He, Zhaosong Lu, Ting Kei Pong
In particular, we first propose a new Newton-CG method for finding an approximate SOSP of unconstrained optimization and show that it enjoys a substantially better complexity than the Newton-CG method [56].
no code implementations • 10 Feb 2019 • Peiran Yu, Guoyin Li, Ting Kei Pong
In addition, for nonconvex models, we show that the KL exponent of many difference-of-convex functions can be derived from that of their natural majorant functions, and the KL exponent of the Bregman envelope of a function is the same as that of the function itself.
no code implementations • 19 Apr 2018 • Tianxiang Liu, Ting Kei Pong, Akiko Takeda
Moreover, for a large class of loss functions and regularizers, the KL exponent of the corresponding potential function is shown to be 1/2, which implies that pDCA$_e$ is locally linearly convergent when applied to these problems.
no code implementations • 22 Oct 2017 • Peiran Yu, Ting Kei Pong
The iteratively reweighted $\ell_1$ algorithm is a popular method for solving a large class of optimization problems whose objective is the sum of a Lipschitz differentiable loss function and a possibly nonconvex sparsity-inducing regularizer.
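As a minimal illustration of the reweighting idea, the sketch below applies iteratively reweighted $\ell_1$ to a least-squares loss with a log penalty; the specific penalty, the inner proximal-gradient loop, and all parameter values are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

def soft_threshold(z, t):
    # componentwise soft-thresholding: prox of the (weighted) l1 norm
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def irl1(A, b, lam=0.1, eps=1e-2, n_outer=30, n_inner=50):
    """Iteratively reweighted l1 sketch for the log-penalty model
    min 0.5*||Ax - b||^2 + lam * sum_i log(|x_i| + eps),
    a common nonconvex sparsity-inducing surrogate (illustrative choice)."""
    n = A.shape[1]
    x = np.zeros(n)
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the smooth part
    for _ in range(n_outer):
        w = lam / (np.abs(x) + eps)      # weights from majorizing the log penalty
        for _ in range(n_inner):         # proximal gradient on the weighted l1 model
            g = A.T @ (A @ x - b)
            x = soft_threshold(x - g / L, w / L)
    return x

# small sparse-recovery example
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 80))
x_true = np.zeros(80); x_true[[3, 17, 42]] = [1.5, -2.0, 1.0]
b = A @ x_true
x_hat = irl1(A, b)
```

Each outer pass linearizes the concave log penalty at the current iterate, so coordinates that are already small receive larger weights and are pushed harder toward zero.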
no code implementations • 16 Oct 2017 • Tianxiang Liu, Ting Kei Pong, Akiko Takeda
We consider a class of nonconvex nonsmooth optimization problems whose objective is the sum of a smooth function and a finite number of nonnegative proper closed possibly nonsmooth functions (whose proximal mappings are easy to compute), some of which are further composed with linear maps.
no code implementations • 18 May 2017 • Lei Yang, Ting Kei Pong, Xiaojun Chen
Finally, we conduct some numerical experiments using real datasets to compare our method with some existing efficient methods for non-negative matrix factorization and matrix completion.
no code implementations • 1 May 2016 • Tianxiang Liu, Ting Kei Pong
In this paper, we further study the forward-backward envelope first introduced in [28] and [30] for problems whose objective is the sum of a proper closed convex function and a twice continuously differentiable, possibly nonconvex function with Lipschitz continuous gradient.
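For concreteness, the sketch below evaluates the forward-backward envelope at a point using its standard definition via the proximal-gradient model; the LASSO-type instance and all parameter choices are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fbe(x, f, grad_f, prox_g, g, gamma):
    """Forward-backward envelope of F = f + g at x:
    FBE_gamma(x) = f(x) + <grad f(x), T - x> + ||T - x||^2 / (2*gamma) + g(T),
    where T = prox_{gamma*g}(x - gamma*grad f(x)).
    Well behaved for gamma < 1/L, with L the Lipschitz constant of grad f."""
    T = prox_g(x - gamma * grad_f(x), gamma)
    d = T - x
    return f(x) + grad_f(x) @ d + d @ d / (2.0 * gamma) + g(T)

# illustrative instance: f(x) = 0.5||Ax - b||^2 (smooth), g = lam*||.||_1 (proper closed convex)
rng = np.random.default_rng(2)
A = rng.standard_normal((30, 60)); b = rng.standard_normal(30); lam = 0.1
f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
grad_f = lambda x: A.T @ (A @ x - b)
g = lambda x: lam * np.linalg.norm(x, 1)
prox_g = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - lam * t, 0.0)
L = np.linalg.norm(A, 2) ** 2
x0 = rng.standard_normal(60)
val = fbe(x0, f, grad_f, prox_g, g, 1.0 / (2.0 * L))
```

For gamma below 1/L the envelope is sandwiched between F evaluated at the proximal-gradient step T and F at x itself, which is what makes it useful as a smooth surrogate of the original objective.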
no code implementations • 9 Feb 2016 • Guoyin Li, Ting Kei Pong
Since many existing local convergence rate analyses for first-order methods in the nonconvex scenario rely on the KL exponent, our results enable us to obtain explicit convergence rates for various first-order methods when they are applied to a large variety of practical optimization models.
1 code implementation • 31 Dec 2015 • Bo Wen, Xiaojun Chen, Ting Kei Pong
In this paper, we study the proximal gradient algorithm with extrapolation for minimizing the sum of a Lipschitz differentiable function and a proper closed convex function.
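The scheme studied here can be sketched as follows, using the FISTA-type choice of extrapolation parameters as one common concrete instance; the LASSO example and all parameter values below are illustrative assumptions.

```python
import numpy as np

def prox_grad_extrapolation(grad_f, prox_g, L, x0, n_iter=200):
    """Proximal gradient with extrapolation for min f(x) + g(x),
    f Lipschitz differentiable with constant L, g proper closed convex.
    The FISTA-type momentum parameters below are one illustrative choice."""
    x_prev = x0.copy()
    x = x0.copy()
    t_prev = 1.0
    for _ in range(n_iter):
        t = (1.0 + np.sqrt(1.0 + 4.0 * t_prev ** 2)) / 2.0
        beta = (t_prev - 1.0) / t            # extrapolation coefficient
        y = x + beta * (x - x_prev)          # extrapolated point
        x_prev, x = x, prox_g(y - grad_f(y) / L, 1.0 / L)
        t_prev = t
    return x

# LASSO example: f(x) = 0.5||Ax - b||^2, g(x) = lam*||x||_1
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 100)); b = rng.standard_normal(50); lam = 0.1
L = np.linalg.norm(A, 2) ** 2
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - lam * t, 0.0)
x_star = prox_grad_extrapolation(grad_f, prox_g, L, np.zeros(100))
```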
no code implementations • 30 Sep 2014 • Guoyin Li, Ting Kei Pong
We then apply our nonconvex DR splitting method to finding a point in the intersection of a closed convex set $C$ and a general closed set $D$ by minimizing the squared distance to $C$ subject to $D$.
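A minimal sketch of this scheme, taking f = (1/2)d_C^2 (the smooth part) and g the indicator of D; the half-space/sphere example, the step-size gamma, and the starting point are illustrative assumptions, not the paper's experiments.

```python
import numpy as np

def dr_splitting_feasibility(proj_C, proj_D, y0, gamma=1.0, n_iter=100):
    """Douglas-Rachford splitting applied to
    min (1/2) d_C(x)^2  subject to  x in D,
    i.e. f = (1/2) d_C^2 and g = indicator of D, where proj_C / proj_D
    are the projections onto C (convex) and D (possibly nonconvex)."""
    y = y0.astype(float).copy()
    for _ in range(n_iter):
        u = (y + gamma * proj_C(y)) / (1.0 + gamma)  # prox of gamma*(1/2)d_C^2
        v = proj_D(2.0 * u - y)                      # prox of the indicator of D
        y = y + v - u                                # DR update
    return v  # v lies in D; at convergence it is also (near) C

# illustrative example: C = half-space {x : x1 >= 0.5}, D = unit circle (nonconvex)
proj_C = lambda x: np.array([max(x[0], 0.5), x[1]])
proj_D = lambda x: x / np.linalg.norm(x)
x = dr_splitting_feasibility(proj_C, proj_D, np.array([-1.0, 0.2]))
```

Since the returned point is a projection onto D, it is exactly feasible for D, and at convergence it also lands in the convex set C, i.e. in the intersection.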
no code implementations • 9 Sep 2014 • Xiaojun Chen, Zhaosong Lu, Ting Kei Pong
We consider a class of constrained optimization problems with a possibly nonconvex non-Lipschitz objective and a convex feasible set being the intersection of a polyhedron and a possibly degenerate ellipsoid.
no code implementations • 3 Jul 2014 • Guoyin Li, Ting Kei Pong
In this paper, we examine two splitting methods for solving this nonconvex optimization problem: the alternating direction method of multipliers and the proximal gradient algorithm.