On Gradient-Based Optimization: Accelerated, Stochastic and Nonconvex
Video Link: https://www.youtube.com/watch?v=ft9Abw6VULo
Many new theoretical challenges have arisen in the area of gradient-based optimization for large-scale statistical data analysis, driven by the needs of applications and the opportunities provided by new hardware and software platforms. I discuss several recent, related results in this area: (1) a new framework for understanding Nesterov acceleration, obtained by taking a continuous-time, Lagrangian/Hamiltonian/symplectic perspective, (2) a discussion of how to escape saddle points efficiently in nonconvex optimization, and (3) the acceleration of Langevin diffusion.
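For a rough sense of the three ingredients named above, the sketch below gives minimal, textbook Python/NumPy versions of each: Nesterov's accelerated gradient method, gradient descent with small random perturbations so that iterates do not stall at saddle points, and the unadjusted Langevin algorithm. The double-well test function, step sizes, and perturbation rule are assumptions chosen for illustration; they are not the specific algorithms or analyses presented in the talk.

```python
import numpy as np

# Illustrative nonconvex objective: f(x) = sum(x_i^4/4 - x_i^2/2).
# The origin is a strict saddle/local max; minimizers lie at x_i = +-1.
def grad(x):
    return x**3 - x

def nesterov(x0, lr=0.1, steps=200):
    """Nesterov's accelerated gradient method (momentum form)."""
    x, y = x0.copy(), x0.copy()
    for k in range(1, steps + 1):
        x_next = y - lr * grad(y)                      # gradient step at the lookahead point
        y = x_next + (k - 1) / (k + 2) * (x_next - x)  # momentum extrapolation
        x = x_next
    return x

def perturbed_gd(x0, lr=0.1, steps=200, radius=1e-2, tol=1e-3, seed=0):
    """Gradient descent that injects a small random perturbation whenever the
    gradient is tiny, so iterates can escape saddle points."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(steps):
        g = grad(x)
        if np.linalg.norm(g) < tol:                    # near a stationary point
            x = x + radius * rng.standard_normal(x.shape)
            g = grad(x)
        x = x - lr * g
    return x

def langevin(x0, lr=0.01, steps=2000, seed=0):
    """Unadjusted Langevin algorithm: gradient step plus Gaussian noise,
    producing approximate samples from the Gibbs measure exp(-f)."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(steps):
        x = x - lr * grad(x) + np.sqrt(2 * lr) * rng.standard_normal(x.shape)
    return x

if __name__ == "__main__":
    print(nesterov(np.full(4, 3.0)))   # converges toward a minimizer
    print(perturbed_gd(np.zeros(4)))   # escapes the saddle at the origin
    print(langevin(np.zeros(4)))       # approximate draw from the double-well measure
```

These discrete updates connect to the continuous-time perspective of the talk: Nesterov's method can be viewed as a discretization of a second-order ODE, and the Langevin update is a discretization of a gradient flow with added Brownian noise.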
See more at https://www.microsoft.com/en-us/research/videos/ai-distinguished-lecture-series/