Complete Statistical Theory of Learning (Vladimir Vapnik) | MIT Deep Learning Series

Channel: Lex Fridman
Subscribers: 4,820,000
Published on: 2020-02-15
Video Link: https://www.youtube.com/watch?v=Ow25mjFjSmg
Duration: 1:19:21
Views: 69,162
Likes: 1,896


A lecture by Vladimir Vapnik, delivered in January 2020 as part of the MIT Deep Learning Lecture Series.
Slides: http://bit.ly/2ORVofC
Associated podcast conversation: https://www.youtube.com/watch?v=bQa7hpUpMzM
Series website: https://deeplearning.mit.edu
Playlist: http://bit.ly/deep-learning-playlist

OUTLINE:
0:00 - Introduction
0:46 - Overview: Complete Statistical Theory of Learning
3:47 - Part 1: VC Theory of Generalization
11:04 - Part 2: Target Functional for Minimization
27:13 - Part 3: Selection of Admissible Set of Functions
37:26 - Part 4: Complete Solution in Reproducing Kernel Hilbert Space (RKHS)
53:16 - Part 5: LUSI Approach in Neural Networks
59:28 - Part 6: Examples of Predicates
1:10:39 - Conclusion
1:16:10 - Q&A: Overfitting
1:17:18 - Q&A: Language

CONNECT:
- If you enjoyed this video, please subscribe to this channel.
- Twitter: https://twitter.com/lexfridman
- LinkedIn: https://www.linkedin.com/in/lexfridman
- Facebook: https://www.facebook.com/lexfridman
- Instagram: https://www.instagram.com/lexfridman

Other Videos By Lex Fridman


2020-02-25  What is Statistics? (Michael I. Jordan) | AI Podcast Clips
2020-02-24  Michael I. Jordan: Machine Learning, Recommender Systems, and Future of AI | Lex Fridman Podcast #74
2020-02-23  Sleep and Burnout | AMA #2 - Ask Me Anything with Lex Fridman
2020-02-22  Scott Aaronson: Quantum Supremacy | AI Podcast Clips
2020-02-21  Andrew Ng: Advice on Getting Started in Deep Learning | AI Podcast Clips
2020-02-20  Andrew Ng: Deep Learning, Education, and Real-World AI | Lex Fridman Podcast #73
2020-02-19  Do Something Difficult Every Day | AMA #1 - Ask Me Anything with Lex Fridman
2020-02-18  Scott Aaronson: What is a Quantum Computer? | AI Podcast Clips
2020-02-17  Scott Aaronson: Quantum Computing | Lex Fridman Podcast #72
2020-02-16  Jim Keller: Abstraction Layers from the Atom to the Data Center | AI Podcast Clips
2020-02-15  Complete Statistical Theory of Learning (Vladimir Vapnik) | MIT Deep Learning Series
2020-02-14  Vladimir Vapnik: Predicates, Invariants, and the Essence of Intelligence | Lex Fridman Podcast #71
2020-02-10  Jim Keller: Most People Don't Think Simple Enough | AI Podcast Clips
2020-02-09  Moore's Law is Not Dead (Jim Keller) | AI Podcast Clips
2020-02-08  Favorite Boris Pasternak Poem of Buvaisar Saitiev | Joe Rogan Experience
2020-02-07  Jim Keller: Elon Musk and Tesla Autopilot | AI Podcast Clips
2020-02-06  Roll the Dice (Go All the Way) by Charles Bukowski | Joe Rogan Experience
2020-02-05  Jim Keller: Moore's Law, Microprocessors, and First Principles | Lex Fridman Podcast #70
2020-02-01  Joe Rogan Podcast Theme Music (Guitar)
2020-01-30  David Chalmers: What is Consciousness? | AI Podcast Clips
2020-01-29  David Chalmers: The Hard Problem of Consciousness | Lex Fridman Podcast #69



Tags:
statistical learning theory
vladimir vapnik
deep learning
artificial intelligence
mit deep learning