Subsampled Optimization: Statistical Guarantees, Mean Squared Error Approximation, and Sampling Method

10 Apr 2018 · Rong Zhu, Jiming Jiang

For optimization on large-scale data, exactly computing the solution may be computationally prohibitive because of the size of the data. In this paper we consider subsampled optimization for quickly approximating the exact solution. In this approach, one obtains a surrogate dataset by sampling from the full data, and then computes an approximate solution by solving the subsampled optimization problem based on the surrogate. Our main theoretical contributions are to establish the asymptotic properties of the approximate solution with respect to the exact solution as statistical guarantees, and to rigorously derive an accurate approximation of the mean squared error (MSE) together with an approximately unbiased MSE estimator. These results help diagnose subsampled optimization, in that a confidence region for the exact solution can be constructed from the approximate solution. A further consequence of our results is an optimal sampling method, Hessian-based sampling, whose sampling probabilities are proportional to the norms of the Newton directions. Numerical experiments with least-squares and logistic regression show promising performance, in line with our results.
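As a concrete illustration, below is a minimal sketch of subsampled optimization with Hessian-based sampling for the least-squares case: sampling probabilities are proportional to the norms of the per-point Newton directions, a weighted subsampled problem is then solved on the surrogate data. The pilot-estimate step, the inverse-probability weighting, and the function name `hessian_based_subsample_ls` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def hessian_based_subsample_ls(X, y, r, rng=None):
    """Sketch of subsampled least-squares with Hessian-based sampling.

    Probabilities are proportional to ||H^{-1} g_i||, where g_i is the
    gradient of point i's loss at a pilot estimate and H is the full
    Hessian X'X. Details (pilot, weighting) are assumptions, not the
    paper's exact procedure.
    """
    rng = np.random.default_rng(rng)
    n, p = X.shape

    # Pilot estimate from a small uniform subsample (assumed step).
    idx0 = rng.choice(n, size=min(10 * p, n), replace=False)
    beta0, *_ = np.linalg.lstsq(X[idx0], y[idx0], rcond=None)

    # Per-point gradients g_i = x_i (x_i' beta0 - y_i); full Hessian H = X'X.
    resid = X @ beta0 - y
    G = X * resid[:, None]            # n x p matrix of per-point gradients
    H = X.T @ X

    # Newton directions d_i = H^{-1} g_i; probabilities proportional to ||d_i||.
    D = np.linalg.solve(H, G.T).T     # n x p matrix of Newton directions
    norms = np.linalg.norm(D, axis=1) + 1e-12   # guard against all-zero norms
    probs = norms / norms.sum()

    # Draw the surrogate dataset and solve the weighted subsampled problem,
    # using inverse-probability weights 1 / (r * pi_i).
    idx = rng.choice(n, size=r, replace=True, p=probs)
    w = 1.0 / (r * probs[idx])
    Xw = X[idx] * np.sqrt(w)[:, None]
    yw = y[idx] * np.sqrt(w)
    beta_hat, *_ = np.linalg.lstsq(Xw, yw, rcond=None)
    return beta_hat
```

The same idea extends to logistic regression by replacing the residual-based gradients and X'X with the gradients and Hessian of the logistic loss evaluated at the pilot estimate.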
