Accelerating Stochastic Gradient Descent

Subscribers: 344,000
Video Link: https://www.youtube.com/watch?v=_UFGB2MBo4o
Duration: 1:15:47
8,505 views

There is widespread sentiment that fast gradient methods (e.g., Nesterov's acceleration, conjugate gradient, heavy ball) cannot be used effectively for stochastic optimization due to their instability and error accumulation, a notion made precise in d'Aspremont (2008) and Devolder, Glineur, and Nesterov (2014). This work considers the use of "fast gradient" methods for the special case of stochastic approximation for the least squares regression problem. Our main result refutes the conventional wisdom by showing that acceleration can be made robust to statistical errors. In particular, this work introduces an accelerated stochastic gradient method that provably achieves the minimax optimal statistical risk faster than stochastic gradient descent. Critical to the analysis is a sharp characterization of accelerated stochastic gradient descent as a stochastic process.
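To make the setting concrete, below is a minimal sketch of momentum-accelerated SGD applied to a least squares objective. It is a generic Nesterov-momentum illustration under assumed hyperparameters (the function name, step size `lr`, and `momentum` value are illustrative), not the specific algorithm or step-size schedule analyzed in the talk.

```python
import numpy as np

def accelerated_sgd_least_squares(X, y, lr=0.01, momentum=0.9, epochs=10, seed=0):
    """Sketch: Nesterov-momentum SGD for min_w (1/2n) ||Xw - y||^2.

    Illustrative only; the talk's method uses a different, carefully tuned
    parameterization to remain robust to statistical errors.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    v = np.zeros(d)  # momentum (velocity) buffer
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Evaluate the stochastic gradient at the look-ahead point w + momentum * v
            w_ahead = w + momentum * v
            grad = (X[i] @ w_ahead - y[i]) * X[i]  # gradient of 0.5 * (x_i^T w - y_i)^2
            v = momentum * v - lr * grad
            w = w + v
    return w

if __name__ == "__main__":
    # Small synthetic usage example
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 5))
    w_true = rng.normal(size=5)
    y = X @ w_true + 0.1 * rng.normal(size=500)
    w_hat = accelerated_sgd_least_squares(X, y)
    print("estimation error:", np.linalg.norm(w_hat - w_true))
```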

See more on this video at https://www.microsoft.com/en-us/research/video/accelerating-stochastic-gradient-descent/







Tags:
microsoft research