Global and Quadratic Convergence of Newton Hard-Thresholding Pursuit

9 Jan 2019 · Shenglong Zhou, Naihua Xiu, Hou-Duo Qi

Algorithms based on the hard-thresholding principle have been well studied, with sound theoretical guarantees, in compressed sensing and more general sparsity-constrained optimization. It is widely observed in existing empirical studies that when a restricted Newton step is used (as the debiasing step), hard-thresholding algorithms tend to meet their halting conditions in a significantly smaller number of iterations and hence are very efficient. However, the resulting Newton hard-thresholding algorithms do not offer better theoretical guarantees than their simple hard-thresholding counterparts. This discrepancy between theory and empirical observation has been known for some time. This paper provides a theoretical justification for the use of the restricted Newton step. We build our theory and algorithm, Newton Hard-Thresholding Pursuit (NHTP), for sparsity-constrained optimization. Our main result shows that NHTP is quadratically convergent under the standard assumptions of restricted strong convexity and smoothness. We also establish its global convergence to a stationary point under a weaker assumption. In the special case of compressed sensing, NHTP eventually reduces to some existing hard-thresholding algorithms with a Newton step. Consequently, our fast convergence result explains why those algorithms perform better than their counterparts without the Newton step. The efficiency of NHTP is demonstrated on both synthetic and real data in compressed sensing and sparse logistic regression.
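To make the idea concrete, the following is a minimal Python sketch of a hard-thresholding iteration with a restricted Newton (debiasing) step for the compressed sensing case, i.e. minimizing f(x) = 0.5 * ||Ax - b||^2 subject to ||x||_0 <= s. It is an illustrative sketch of the general scheme the abstract describes, not the authors' reference implementation of NHTP; the variable names (A, b, s, eta, max_iter, tol) and the step size choice are assumptions.

import numpy as np

def newton_hard_thresholding(A, b, s, eta=1.0, max_iter=100, tol=1e-8):
    """Hard thresholding with a restricted Newton debiasing step (sketch).

    Assumes every s-column submatrix of A has full column rank, a
    RIP/restricted-strong-convexity-type condition, so the restricted
    normal equations below are solvable.
    """
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)              # gradient of f at x
        z = x - eta * grad                    # plain gradient step
        support = np.argsort(np.abs(z))[-s:]  # keep the s largest magnitudes
        # Restricted Newton (debiasing) step: for this quadratic f, one
        # Newton step on the support solves the restricted least-squares
        # problem exactly; entries off the support are set to zero.
        A_S = A[:, support]
        x_new = np.zeros(n)
        x_new[support] = np.linalg.solve(A_S.T @ A_S, A_S.T @ b)
        if np.linalg.norm(x_new - x) <= tol:  # halting condition
            return x_new
        x = x_new
    return x

For the quadratic objective of compressed sensing, the Newton step restricted to the current support coincides with an exact least-squares solve on that support, which is why the debiasing step above is a direct solve; for a general smooth f, the step would instead use the gradient and Hessian of f restricted to the support.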
