An Introduction to Concentration Inequalities and Statistical Learning Theory
The aim of this tutorial is to introduce tools and techniques that are used to analyze machine learning algorithms in statistical settings. Our focus will be on learning problems such as classification, regression, and ranking. We will look at concentration inequalities and other commonly used techniques such as uniform convergence and symmetrization, and use them to prove learning-theoretic guarantees for algorithms in these settings.

The talk will be largely self-contained. However, it would help if the audience could brush up on basic probability and statistics concepts such as random variables, events, probability of events, Boole's inequality, etc. There are several good resources for these online and I do not wish to recommend one over the other. However, a few nice resources are given below:

https://www.khanacademy.org/math/probability
http://ocw.mit.edu/courses/mathematics/18-05-introduction-to-probability-and-statistics-spring-2014/
https://en.wikipedia.org/wiki/Boole's_inequality
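
To preview the flavor of statement a concentration inequality makes, the short Python sketch below compares the empirical probability that a sample mean deviates from the true mean with the bound given by Hoeffding's inequality for variables bounded in [0, 1]. The choice of Hoeffding's inequality, the Bernoulli distribution, and the particular values of n, eps, and the trial count are illustrative assumptions of mine, not material taken from the tutorial itself.

# A minimal sketch (illustrative, not part of the tutorial materials):
# for i.i.d. random variables in [0, 1] with mean mu, Hoeffding's inequality
# states P(|empirical mean - mu| >= eps) <= 2*exp(-2*n*eps^2).
# The distribution, n, eps, and trial count below are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
n, eps, trials = 200, 0.1, 100_000
p = 0.5  # true mean of a Bernoulli(0.5) variable, which is bounded in [0, 1]

# Draw `trials` independent samples of size n and record how often the
# empirical mean misses the true mean by more than eps.
samples = rng.binomial(1, p, size=(trials, n))
deviations = np.abs(samples.mean(axis=1) - p)
empirical_prob = np.mean(deviations >= eps)

hoeffding_bound = 2 * np.exp(-2 * n * eps**2)
print(f"empirical P(|mean - p| >= {eps}): {empirical_prob:.4f}")
print(f"Hoeffding bound:                  {hoeffding_bound:.4f}")

Running the sketch shows the empirical deviation probability sitting below the Hoeffding bound, which is the pattern the tutorial's guarantees formalize and then lift, via uniform convergence and symmetrization, from a single mean to an entire class of predictors.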