Search Results for author: Nicholas D'Imperio

Found 2 papers, 0 papers with code

On the expected running time of nonconvex optimization with early stopping

no code implementations · 25 Sep 2019 · Thomas Flynn, Kwang Min Yu, Abid Malik, Shinjae Yoo, Nicholas D'Imperio

This work examines the convergence of stochastic gradient algorithms that use early stopping based on a validation function: optimization ends when the magnitude of the validation function's gradient drops below a threshold.
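The stopping rule described above can be sketched on a toy problem. Everything below is illustrative: the quadratic objective, the noise model, the stand-in validation gradient, and the threshold `eps` are assumptions for the sketch, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: minimize f(x) = 0.5 * ||x||^2 (minimizer at the origin).
# grad_train is a noisy (stochastic) gradient of the training objective;
# grad_val stands in for the validation-function gradient used in the
# stopping rule. Both choices are illustrative assumptions.
def grad_train(x):
    return x + 0.01 * rng.standard_normal(x.shape)

def grad_val(x):
    return x

def sgd_early_stop(x0, lr=0.1, eps=0.05, max_steps=10_000):
    """Run SGD, ending as soon as ||grad_val(x)|| drops below eps."""
    x = x0.copy()
    for t in range(max_steps):
        if np.linalg.norm(grad_val(x)) < eps:
            return x, t  # early stop triggered at step t
        x = x - lr * grad_train(x)
    return x, max_steps

x, steps = sgd_early_stop(np.ones(5))
```

The quantity the paper analyzes, the expected number of steps until the stop triggers, corresponds to the distribution of `steps` over random seeds.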

Layered SGD: A Decentralized and Synchronous SGD Algorithm for Scalable Deep Neural Network Training

no code implementations · 13 Jun 2019 · Kwangmin Yu, Thomas Flynn, Shinjae Yoo, Nicholas D'Imperio

The efficiency of the algorithm is tested by training a deep network on the ImageNet classification task.
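One reading of a "layered," synchronous scheme can be sketched with a two-level hierarchy: workers average within their group every step and across groups every `sync_every` steps. The group layout, period, toy per-worker objectives, and all names below are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy per-worker objective: f_i(x) = 0.5 * ||x - c_i||^2, so each
# worker's gradient pulls toward its own center c_i (illustrative data).
centers = rng.standard_normal((4, 3))  # 4 workers, 3-dim parameter

def grad(i, x):
    return x - centers[i]

def layered_sgd(groups, lr=0.1, sync_every=5, steps=200):
    """Two-level synchronous SGD sketch: local step, then a synchronous
    average inside each group, then a periodic average across groups."""
    params = [np.zeros(3) for _ in range(4)]
    for t in range(steps):
        # local SGD step on every worker
        for i in range(4):
            params[i] = params[i] - lr * grad(i, params[i])
        # level 1: synchronous average within each group
        for g in groups:
            avg = np.mean([params[i] for i in g], axis=0)
            for i in g:
                params[i] = avg.copy()
        # level 2: average across groups every sync_every steps
        if (t + 1) % sync_every == 0:
            avg = np.mean(params, axis=0)
            params = [avg.copy() for _ in range(4)]
    return params[0]

x = layered_sgd(groups=[[0, 1], [2, 3]])
```

Because the cross-group average runs only periodically, the expensive global communication happens a factor of `sync_every` less often than the cheap in-group one, which is the usual motivation for hierarchical schemes at scale.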
