Search Results for author: Kangqiao Liu

Found 4 papers, 0 papers with code

Logarithmic landscape and power-law escape rate of SGD

no code implementations • 29 Sep 2021 • Takashi Mori, Liu Ziyin, Kangqiao Liu, Masahito Ueda

Stochastic gradient descent (SGD) undergoes complicated multiplicative noise for the mean-square loss.
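The claim that SGD noise is multiplicative for the mean-square loss can be illustrated with a toy example (a hypothetical one-parameter linear regression, not taken from the paper): the minibatch gradient's fluctuation amplitude depends on where in the landscape the parameter sits, growing with distance from the minimum instead of being a constant additive term.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = 2.0 * x  # realizable target: minimum at w* = 2, zero label noise

def minibatch_grad(w, idx):
    xb, yb = x[idx], y[idx]
    # gradient of the mean-square loss mean((w*x - y)^2) w.r.t. w
    return np.mean(2 * (w * xb - yb) * xb)

def grad_noise_std(w, batch_size=10, trials=2000):
    """Std of the minibatch gradient across random batches, at fixed w."""
    grads = [minibatch_grad(w, rng.choice(len(x), batch_size, replace=False))
             for _ in range(trials)]
    return np.std(grads)

# The gradient-noise amplitude grows with |w - w*|: the noise is
# multiplicative (state-dependent), not additive.
for w in [2.0, 3.0, 5.0]:
    print(f"w = {w}: minibatch gradient std = {grad_noise_std(w):.3f}")
```

Here the per-sample gradient is 2(w - 2)x², so the batch-to-batch fluctuation scales linearly with |w - 2| and vanishes exactly at the minimum, which is the signature of multiplicative noise.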

Power-law escape rate of SGD

no code implementations • 20 May 2021 • Takashi Mori, Liu Ziyin, Kangqiao Liu, Masahito Ueda

Stochastic gradient descent (SGD) undergoes complicated multiplicative noise for the mean-square loss.

Strength of Minibatch Noise in SGD

no code implementations • ICLR 2022 • Liu Ziyin, Kangqiao Liu, Takashi Mori, Masahito Ueda

The noise in stochastic gradient descent (SGD), caused by minibatch sampling, is poorly understood despite its practical importance in deep learning.
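One well-established property of minibatch noise can be checked empirically in a toy setting (a hypothetical two-parameter linear regression, not the paper's analysis): when batches are drawn from a much larger dataset, the covariance of the minibatch gradient scales roughly as 1/B with batch size B.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 2))
w_true = np.array([1.0, -1.0])
y = X @ w_true + 0.5 * rng.normal(size=2000)  # noisy labels
w = np.zeros(2)  # evaluate the noise away from the optimum

def minibatch_grad(w, idx):
    Xb, yb = X[idx], y[idx]
    return 2 * Xb.T @ (Xb @ w - yb) / len(idx)

def noise_trace(batch_size, trials=3000):
    """Trace of the empirical covariance of the minibatch gradient."""
    g = np.array([minibatch_grad(w, rng.choice(len(X), batch_size, replace=False))
                  for _ in range(trials)])
    return np.trace(np.cov(g.T))

# For B much smaller than the dataset size, the noise covariance
# scales roughly as 1/B, so the trace ratio below should be near 8.
t8, t64 = noise_trace(8), noise_trace(64)
print(f"trace ratio (B=8 vs B=64): {t8 / t64:.1f}")
```

Sampling without replacement adds a (1 - B/N) correction to the 1/B law, which is why the measured ratio comes out slightly above 8 rather than exactly 8.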

Noise and Fluctuation of Finite Learning Rate Stochastic Gradient Descent

no code implementations • 7 Dec 2020 • Kangqiao Liu, Liu Ziyin, Masahito Ueda

In the vanishing learning rate regime, stochastic gradient descent (SGD) is now relatively well understood.

Bayesian Inference • Second-order methods
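The gap between the vanishing and finite learning rate regimes can be seen in a minimal sketch (a one-dimensional quadratic loss with additive gradient noise, chosen for illustration rather than taken from the paper): the continuous-time prediction for the stationary variance, ησ²/(2a), is accurate only for small η, while the exact discrete-time value ησ²/(a(2 − ηa)) departs from it at finite η.

```python
import numpy as np

rng = np.random.default_rng(2)
a, sigma = 1.0, 1.0  # quadratic loss (a/2) w^2 with additive gradient noise

def sgd_variance(eta, steps=200_000):
    """Empirical stationary variance of w under noisy gradient descent."""
    w = 0.0
    samples = []
    for t in range(steps):
        w -= eta * (a * w + sigma * rng.normal())
        if t > steps // 10:  # discard burn-in
            samples.append(w)
    return np.var(samples)

for eta in [0.01, 1.0]:
    emp = sgd_variance(eta)
    vanishing = eta * sigma**2 / (2 * a)          # continuous-time (small-eta) prediction
    finite = eta * sigma**2 / (a * (2 - eta * a)) # exact discrete-time value
    print(f"eta={eta}: empirical={emp:.4f}, vanishing-lr={vanishing:.4f}, finite-lr={finite:.4f}")
```

At η = 0.01 the two predictions nearly coincide, while at η = 1 the empirical variance is twice the vanishing-learning-rate value, showing why finite-η corrections matter.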
