Introduction to large-scale optimization - Part 2
Channel: Microsoft Research
Subscribers: 351,000
Video Link: https://www.youtube.com/watch?v=rNwkvvm2Hes
These lectures cover both the basics and cutting-edge topics in large-scale convex and nonconvex optimization (continuous case only). Examples include stochastic convex optimization, variance-reduced stochastic gradient methods, coordinate descent methods, proximal methods, operator splitting techniques, and more. The lectures also cover the relevant mathematical background, as well as pointers to interesting directions for future research.
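As a rough illustration of one topic named above, here is a minimal sketch of a variance-reduced stochastic gradient (SVRG-style) loop for a least-squares problem. The problem data, step size, and epoch counts are illustrative assumptions and are not taken from the lecture itself.

```python
# SVRG-style sketch for least squares: min_x (1/2n) * ||A x - b||^2.
# All problem data and hyperparameters below are illustrative, not from the lecture.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

def grad_i(x, i):
    # Gradient of the i-th component f_i(x) = 0.5 * (a_i^T x - b_i)^2.
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    # Full gradient of the average loss.
    return A.T @ (A @ x - b) / n

x = np.zeros(d)
step = 0.01
for epoch in range(30):
    x_snap = x.copy()          # snapshot iterate
    mu = full_grad(x_snap)     # full gradient at the snapshot
    for _ in range(n):
        i = rng.integers(n)
        # Variance-reduced gradient estimate: unbiased, with variance that
        # shrinks as x approaches the snapshot.
        g = grad_i(x, i) - grad_i(x_snap, i) + mu
        x -= step * g

print("distance to x_true:", np.linalg.norm(x - x_true))
```

The key design point, covered in the lectures, is that the correction term built from the snapshot keeps the stochastic gradient unbiased while driving its variance to zero, which allows a constant step size rather than the decaying step sizes plain SGD requires.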
Tags: microsoft research