Exterior-point Optimization for Sparse and Low-rank Optimization

9 Nov 2020  ·  Shuvomoy Das Gupta, Bartolomeo Stellato, Bart P. G. Van Parys

Many problems of substantial current interest in machine learning, statistics, and data science can be formulated as sparse and low-rank optimization problems. In this paper, we present the nonconvex exterior-point optimization solver NExOS -- a first-order algorithm tailored to sparse and low-rank optimization problems. We consider the problem of minimizing a convex function over a nonconvex constraint set, where the set can be decomposed as the intersection of a compact convex set and a nonconvex set involving sparse or low-rank constraints. Unlike convex relaxation approaches, NExOS exploits the nonconvex geometry directly: it finds a locally optimal point of the original problem by solving a sequence of penalized problems with strictly decreasing penalty parameters. NExOS solves each penalized problem by applying a first-order algorithm that converges linearly to a local minimum of the corresponding penalized formulation under regularity conditions. Furthermore, the local minima of the penalized problems converge to a local minimum of the original problem as the penalty parameter goes to zero. We then implement and test NExOS on many instances from a wide variety of sparse and low-rank optimization problems, empirically demonstrating that our algorithm outperforms specialized methods.
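
The abstract describes the exterior-point scheme only at a high level; the following is a minimal NumPy sketch of that strategy for a hypothetical sparse least-squares instance, where the constraint set is the intersection of a Euclidean ball (compact convex set) and the set of k-sparse vectors (nonconvex set). The penalty schedule, step sizes, and the projected-gradient inner solver are illustrative assumptions, not the authors' NExOS implementation, which uses a different first-order inner algorithm.

```python
# Conceptual sketch of an exterior-point strategy: solve a sequence of
# penalized problems f(x) + (1/(2*mu)) * dist(x, X)^2 with strictly
# decreasing penalty parameter mu, warm-starting each problem from the
# previous solution. All parameter choices below are illustrative.
import numpy as np


def project(x, k, M):
    """Project onto X = {x : ||x||_0 <= k, ||x||_2 <= M}: keep the k
    largest-magnitude entries, then shrink onto the ball of radius M."""
    z = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]      # support of the k largest entries
    z[idx] = x[idx]
    nrm = np.linalg.norm(z)
    if nrm > M:
        z *= M / nrm
    return z


def exterior_point_sparse_lsq(A, b, k, M, mu0=1.0, rho=0.5,
                              n_outer=20, n_inner=200):
    """Minimize 0.5*||Ax - b||^2 over k-sparse x with ||x||_2 <= M."""
    x = np.zeros(A.shape[1])              # warm start carried across mu values
    L = np.linalg.norm(A, 2) ** 2         # Lipschitz constant of grad f
    mu = mu0
    for _ in range(n_outer):
        step = 1.0 / (L + 1.0 / mu)       # step size for the penalized objective
        for _ in range(n_inner):
            grad_f = A.T @ (A @ x - b)
            # Where the projection is single-valued, the penalty
            # (1/(2*mu)) * ||x - proj(x)||^2 has gradient (1/mu)*(x - proj(x)).
            grad_pen = (x - project(x, k, M)) / mu
            x = x - step * (grad_f + grad_pen)
        mu *= rho                         # strictly decrease the penalty parameter
    return project(x, k, M)               # return a feasible point


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    x_true = np.zeros(100)
    x_true[:5] = rng.standard_normal(5)
    b = A @ x_true
    x_hat = exterior_point_sparse_lsq(A, b, k=5, M=10.0)
    print("residual:", np.linalg.norm(A @ x_hat - b))
```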
