Introduction to large-scale optimization - Part 1
Channel: Microsoft Research
Subscribers: 351,000
Video Link: https://www.youtube.com/watch?v=AYcfpq5hH5g
These lectures will cover both the basics and cutting-edge topics in large-scale convex and nonconvex optimization (continuous case only). Examples include stochastic convex optimization, variance-reduced stochastic gradient methods, coordinate descent methods, proximal methods, operator splitting techniques, and more. The lectures will also cover the relevant mathematical background, as well as some pointers to interesting directions for future research (time permitting).
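As a rough illustration of the kind of method the lectures discuss, below is a minimal sketch of plain stochastic gradient descent applied to a least-squares problem. The problem setup, step size, and iteration count are illustrative assumptions, not material from the lecture itself.

```python
import numpy as np

# Minimal sketch of stochastic gradient descent (SGD) on the least-squares
# problem min_x (1/2n) * ||A x - b||^2. The data, step size, and iteration
# count below are illustrative assumptions, not taken from the lecture.

rng = np.random.default_rng(0)
n, d = 1000, 20
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

x = np.zeros(d)
step = 0.01
for t in range(5000):
    i = rng.integers(n)                  # sample one data point uniformly at random
    grad_i = (A[i] @ x - b[i]) * A[i]    # stochastic gradient of (1/2)(a_i^T x - b_i)^2
    x -= step * grad_i                   # SGD update

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

Variance-reduced methods such as SVRG and coordinate descent, which the lectures list among their topics, modify the per-iteration update above to reduce the variance of the stochastic gradient or to update only one coordinate at a time.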
Tags: microsoft research