Elad Hazan: Efficient Optimization for Machine Learning: Beyond Stochastic Gradient Descent
Video Link: https://www.youtube.com/watch?v=AbM5JjcRLEk
A talk by Elad Hazan at the Quantum Machine Learning Workshop, hosted September 24-28, 2018, by the Joint Center for Quantum Information and Computer Science (QuICS) at the University of Maryland.
Abstract: In this talk we will describe recent advances in optimization that gave rise to state-of-the-art algorithms in machine learning. While stochastic gradient descent is the workhorse behind the recent deep learning revolution, improving its performance both in theory and in practice has proven challenging. We will explore recent innovations in stochastic second-order methods and adaptive regularization that yield some of the fastest known algorithms.
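As a rough illustration of the adaptive-regularization idea the abstract mentions, the sketch below implements a minimal AdaGrad-style update, where each coordinate's step size is scaled by its accumulated squared gradients. The toy quadratic objective, function name, and all parameter values are illustrative assumptions, not taken from the talk itself.

```python
import numpy as np

def adagrad_step(x, grad, accum, lr=0.5, eps=1e-8):
    """One AdaGrad-style update: per-coordinate step sizes shrink with
    the accumulated sum of squared gradients (illustrative sketch)."""
    accum = accum + grad ** 2                      # running sum of squared gradients
    x = x - lr * grad / (np.sqrt(accum) + eps)     # coordinate-wise scaled step
    return x, accum

# Toy badly-scaled quadratic: f(x) = 0.5 * x^T diag(D) x, gradient D * x.
D = np.array([100.0, 1.0])
x = np.array([1.0, 1.0])
accum = np.zeros_like(x)
for _ in range(200):
    grad = D * x
    x, accum = adagrad_step(x, grad, accum)

print(np.linalg.norm(x))  # the iterate shrinks toward the minimizer at 0
```

The point of the per-coordinate scaling is that ill-conditioned directions (here the coordinate with curvature 100) automatically receive smaller steps, without hand-tuning a separate learning rate per coordinate.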
Tags: machine learning, optimization