A Primal-dual Analysis of Margin Maximization by Steepest Descent Methods

Video Link: https://www.youtube.com/watch?v=Rc76ZBP6fHM



Duration: 41:42
1,041 views


Matus Telgarsky (University of Illinois, Urbana-Champaign)
https://simons.berkeley.edu/talks/tbd-56
Frontiers of Deep Learning




Other Videos By Simons Institute for the Theory of Computing


2019-07-17  Lessons Learned from Evaluating the Robustness of Defenses to Adversarial Examples
2019-07-17  Interpreting Deep Neural Networks (DNNs)
2019-07-17  Provably Robust Deep Learning via Adversarially Trained Smoothed Classifiers
2019-07-17  A New Perspective on Adversarial Perturbations
2019-07-17  Provable Robustness Beyond Bound Propagation
2019-07-16  Splitting Gradient Descent for Incremental Learning of Neural Architectures
2019-07-16  Mad Max: Affine Spline Insights into Deep Learning
2019-07-16  Computation in Very Wide Neural Networks
2019-07-16  Training on the Test Set and Other Heresies
2019-07-16  Explaining Landscape Connectivity of Low-cost Solutions for Multilayer Nets
2019-07-16  A Primal-dual Analysis of Margin Maximization by Steepest Descent Methods
2019-07-16  On the Foundations of Deep Learning: SGD, Overparametrization, and Generalization
2019-07-15  Size-free Generalization Bounds for Convolutional Neural Networks
2019-07-15  Learning and Generalization in Over-parametrized Neural Networks, Going Beyond Kernels
2019-07-15  From Classical Statistics to Modern Machine Learning
2019-07-15  Analyzing Optimization and Generalization in Deep Learning via Trajectories of Gradient Descent
2019-07-15  Practical Model-based Algorithms for Reinforcement Learning and Imitation Learning, with...
2019-07-15  Benign Overfitting in Linear Prediction
2019-07-12  Expansion from a Cohomological Viewpoint
2019-07-12  Spectral HDX III: Random Walks
2019-07-12  Connection Between Codes and HDX Part 2



Tags:
Frontiers of Deep Learning
Matus Telgarsky
Simons Institute
Theory of Computing
Theory of Computation
Theoretical Computer Science
Computer Science
UC Berkeley