Extending Generalization Theory Towards Addressing Modern Challenges in Machine Learning

Video Link: https://www.youtube.com/watch?v=m9IvoZpZF-8



Duration: 40:21


Shay Moran (Technion)
https://simons.berkeley.edu/talks/extending-generalization-theory-towards-addressing-modern-challenges-machine-learning
Lower Bounds, Learning, and Average-Case Complexity

Abstract
Recent years have witnessed tremendous progress in the field of Machine Learning (ML). However, many of the recent breakthroughs exhibit phenomena that lack explanations and sometimes even contradict conventional wisdom. One main reason is that classical ML theory adopts a worst-case perspective, which seems too pessimistic to explain practical ML: in reality, data is rarely worst-case, and experiments indicate that far less data is often needed than traditional theory predicts.
In this talk we will discuss two variants of classical learning theory. These models adopt a distribution- and data-dependent perspective that complements the distribution-free, worst-case perspective of classical theory and is suited to exploiting specific properties of a given learning task.
A common theme of these models is their combinatorial nature. This can be seen as a continuation of the fruitful link between machine learning and combinatorics, which goes back to the discovery of the VC dimension more than 50 years ago.
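
For readers unfamiliar with the term, here is the standard definition of the VC dimension (background added for reference; not part of the talk abstract). A hypothesis class \mathcal{H} \subseteq \{0,1\}^X shatters a finite set S \subseteq X if \{ h|_S : h \in \mathcal{H} \} = \{0,1\}^S, i.e., \mathcal{H} realizes all 2^{|S|} binary labelings of S. Then

\[
  \mathrm{VC}(\mathcal{H}) \;=\; \sup\bigl\{\, |S| \;:\; S \subseteq X \text{ is finite and } \mathcal{H} \text{ shatters } S \,\bigr\}.
\]

The fundamental theorem of PAC learning ties this quantity to the worst-case sample complexity discussed above: a binary hypothesis class is PAC learnable if and only if its VC dimension is finite.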


Tags:
Simons Institute
theoretical computer science
UC Berkeley
Computer Science
Theory of Computation
Theory of Computing
Lower Bounds, Learning, and Average-Case Complexity
Shay Moran