Machine Learning Algorithms Workshop

Subscribers: 349,000
Video Link: https://www.youtube.com/watch?v=AUbK_cL_-4I
Duration: 1:39:55
474 views

Machine Learning Algorithms Workshop: Logarithmic Time Online Multiclass Prediction & Log-Concave Sampling with SGD
Logarithmic Time Online Multiclass Prediction: We study the problem of multiclass classification with an extremely large number of classes (k), with the goal of obtaining train and test time complexity logarithmic in the number of classes. We develop top-down approaches for constructing logarithmic-depth trees. On the theoretical front, we formulate a new objective function, optimized at each node of the tree, which creates dynamic partitions of the data that are both pure (in terms of class labels) and balanced. We demonstrate that under favorable conditions we can construct logarithmic-depth trees whose leaves have low label entropy. The objective function at the nodes, however, is computationally challenging to optimize. We address the empirical problem with a new online decision tree construction procedure. Experiments demonstrate that this online algorithm quickly achieves improvement in test error compared to more common logarithmic-training-time approaches, making it a plausible method for computationally constrained large-k applications.

Log-Concave Sampling with SGD: We extend the Langevin Monte Carlo (LMC) algorithm to compactly supported measures via a projection step, akin to projected Stochastic Gradient Descent (SGD). We show that (projected) LMC allows sampling in polynomial time from a log-concave distribution with a smooth potential. This yields a new Markov chain for sampling from a log-concave distribution.
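To illustrate why a logarithmic-depth label tree gives O(log k) prediction time, the following sketch builds a balanced binary tree over per-class centroids and routes test points down hyperplanes between the two halves at each node. This is a simplified, static illustration under our own assumptions (centroid-based splitting, midpoint hyperplanes), not the talk's online construction procedure or its node objective:

```python
import numpy as np

class LabelTreeNode:
    def __init__(self, classes):
        self.classes = classes  # class labels reachable from this node
        self.left = self.right = None
        self.w = None  # routing hyperplane normal (internal nodes only)
        self.b = 0.0   # routing hyperplane offset

def build_tree(centroids, classes):
    """Recursively split the class set into two balanced halves, giving a
    tree of depth O(log k). Classes are ordered along the direction from
    the centroid mean to the farthest centroid, then cut at the median;
    each internal node routes by the hyperplane midway between the two
    halves' mean centroids."""
    node = LabelTreeNode(classes)
    if len(classes) == 1:
        return node
    C = centroids[classes]
    direction = C[np.argmax(np.linalg.norm(C - C.mean(0), axis=1))] - C.mean(0)
    order = np.argsort(C @ direction)
    half = len(classes) // 2
    left_cls = [classes[i] for i in order[:half]]
    right_cls = [classes[i] for i in order[half:]]
    mu_l = centroids[left_cls].mean(0)
    mu_r = centroids[right_cls].mean(0)
    node.w = mu_r - mu_l
    node.b = -0.5 * (mu_r + mu_l) @ node.w
    node.left = build_tree(centroids, left_cls)
    node.right = build_tree(centroids, right_cls)
    return node

def predict(node, x):
    """Descend from the root to a leaf: O(depth) = O(log k) dot products."""
    while node.left is not None:
        node = node.right if x @ node.w + node.b > 0 else node.left
    return node.classes[0]

# Toy example: four well-separated classes in the plane.
centroids = np.array([[0., 0.], [0., 10.], [10., 0.], [10., 10.]])
root = build_tree(centroids, [0, 1, 2, 3])
```

Because the split at every node is balanced by construction, the tree has depth ceil(log2 k), in contrast to one-vs-all prediction which costs O(k) per example.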

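The projected LMC update described above can be sketched as a Langevin step followed by a Euclidean projection onto the compact support, exactly mirroring projected SGD. The potential `grad_U` and the unit-ball support below are illustrative assumptions, not details from the talk:

```python
import numpy as np

def projected_lmc(grad_U, project, x0, step=1e-2, n_steps=5000, rng=None):
    """Projected Langevin Monte Carlo:
    x_{t+1} = Proj( x_t - step * grad U(x_t) + sqrt(2 * step) * N(0, I) ).
    The projection keeps every iterate inside the compact support."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - step * grad_U(x) + np.sqrt(2.0 * step) * noise
        x = project(x)
        samples.append(x.copy())
    return np.array(samples)

# Example: standard Gaussian potential U(x) = ||x||^2 / 2 (log-concave,
# smooth) restricted to the unit ball via Euclidean projection.
grad_U = lambda x: x

def project_ball(x, radius=1.0):
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

samples = projected_lmc(grad_U, project_ball, x0=np.zeros(2),
                        n_steps=2000, rng=0)
```

Without the projection this is plain LMC; the projection step is what extends the chain to compactly supported log-concave targets.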



Other Videos By Microsoft Research


2016-07-07 Sharp higher order corrections for the critical value of a bootstrap percolation model
2016-07-07 Oral Session 8
2016-07-07 MSR NYC Data Science Seminar Series: From "In" to "Over"
2016-07-07 Oral Session 9
2016-07-07 IMS-Microsoft Research Workshop: Foundations of Data Science - The small clustering problem
2016-07-07 Advances in Quantum Algorithms and Devices
2016-07-07 Tutorial Session B - Causes and Counterfactuals: Concepts, Principles and Tools.
2016-07-07 Modeling human intelligence with Probabilistic Programs and Program Induction
2016-07-07 Invited Talk: Post-selection Inference for Forward Stepwise Regression, Lasso and other procedures
2016-07-07 Tutorials Session A - Deep Learning for Computer Vision
2016-07-07 Machine Learning Algorithms Workshop
2016-07-07 Joint Talk: Automatic Statistician
2016-07-07 MSR NYC Data Science Seminar Series #4 - What Makes us Human?
2016-07-07 Quantifiers meet their match(ing loop)
2016-07-07 MSR Gender Diversity Lecture Series 4: Diversity Driving Innovation Moving from Research to Action
2016-07-07 Social Computing Symposium 2015: Consequences of Humanizing Systems - Darius Kazemi
2016-07-07 Spatially Defined Measures of Mobility from Call Data Records
2016-07-07 The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World
2016-07-07 Strategy Rules: Five Timeless Lessons from Bill Gates, Andy Grove, and Steve Jobs
2016-07-07 One Second Ahead: Enhance Your Performance at Work with Mindfulness
2016-07-07 Asymptotic Behavior of the Eden Model with Positively Homogeneous Edge Weights



Tags:
microsoft research