Machine Learning Algorithms Workshop

Subscribers:
344,000
Video Link: https://www.youtube.com/watch?v=VpNaQR8jbN8



Duration: 1:39:55
1,631 views


Logarithmic Time Online Multiclass prediction: We study the problem of multiclass classification with an extremely large number of classes (k), with the goal of obtaining train and test time complexity logarithmic in the number of classes. We develop top-down tree construction approaches for constructing logarithmic depth trees. On the theoretical front, we formulate a new objective function, which is optimized at each node of the tree and creates dynamic partitions of the data which are both pure (in terms of class labels) and balanced. We demonstrate that under favorable conditions, we can construct logarithmic depth trees that have leaves with low label entropy. However, the objective function at the nodes is challenging to optimize computationally. We address the empirical problem with a new online decision tree construction procedure. Experiments demonstrate that this online algorithm quickly achieves improvement in test error compared to more common logarithmic training time approaches, which makes it a plausible method in computationally constrained large-k applications.
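The per-node objective combines purity and balance in a single quantity. As a minimal sketch (assuming the objective takes the form J(h) = 2 Σ_y π_y |P(h(x) > 0) − P(h(x) > 0 | y)|, where π_y is the prior of class y and h routes examples left or right; the function name `node_objective` is hypothetical):

```python
# Hypothetical sketch of a balanced-and-pure split objective:
#   J(h) = 2 * sum_y pi_y * | P(route left) - P(route left | y) |
# It is maximized by splits that send each class entirely to one
# side (purity) while keeping both sides equally loaded (balance).

def node_objective(class_priors, left_prob_given_class):
    """class_priors[y]: P(y); left_prob_given_class[y]: P(route left | y)."""
    p_left = sum(p * q for p, q in zip(class_priors, left_prob_given_class))
    return 2.0 * sum(
        p * abs(p_left - q)
        for p, q in zip(class_priors, left_prob_given_class)
    )

priors = [0.25, 0.25, 0.25, 0.25]
pure_balanced = [1.0, 1.0, 0.0, 0.0]   # classes 0,1 go left; 2,3 go right
degenerate    = [1.0, 1.0, 1.0, 1.0]   # everything routed left

print(node_objective(priors, pure_balanced))  # → 1.0 (best possible split)
print(node_objective(priors, degenerate))     # → 0.0 (useless split)
```

In the online procedure described above, each node would maintain running estimates of these routing statistics and update its predictor toward the split that increases this objective.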

Log-concave Sampling with SGD: We extend the Langevin Monte Carlo (LMC) algorithm to compactly supported measures via a projection step, akin to projected Stochastic Gradient Descent (SGD). We show that (projected) LMC allows sampling in polynomial time from a log-concave distribution with smooth potential. This gives a new Markov chain for sampling from a log-concave distribution.
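The projection step mirrors projected SGD: take a Langevin step (gradient descent on the potential plus Gaussian noise), then project back onto the support. A minimal sketch, assuming a target with potential U and convex support K (the names `projected_lmc` and `project_ball` are illustrative, not from the talk):

```python
import numpy as np

def projected_lmc(grad_U, project, x0, step, n_steps, rng):
    """Projected Langevin Monte Carlo:
    x <- Proj_K( x - step * grad_U(x) + sqrt(2*step) * N(0, I) )."""
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = project(x - step * grad_U(x) + np.sqrt(2.0 * step) * noise)
        samples.append(x.copy())
    return np.array(samples)

# Example: a standard Gaussian truncated to the unit ball, a compactly
# supported log-concave target. U(x) = ||x||^2 / 2, so grad_U(x) = x.
def project_ball(x, radius=1.0):
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

rng = np.random.default_rng(0)
chain = projected_lmc(lambda x: x, project_ball, np.zeros(2), 1e-2, 5000, rng)
# Every iterate stays inside the support after projection.
assert np.all(np.linalg.norm(chain, axis=1) <= 1.0 + 1e-12)
```

Without the projection this is plain LMC; the projection is what extends the chain to compactly supported measures, exactly as projection extends SGD to constrained optimization.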




Other Videos By Microsoft Research


2016-06-13Opportunities and Challenges in Global Network Cameras
2016-06-13Nature in the City: Changes in Bangalore over Time and Space
2016-06-13Making Small Spaces Feel Large: Practical Illusions in Virtual Reality
2016-06-13Machine Learning as Creative Tool for Designing Real-Time Expressive Interactions
2016-06-13Recent Developments in Combinatorial Optimization
2016-06-13Computational Limits in Statistical Inference: Hidden Cliques and Sum of Squares
2016-06-13Coloring the Universe: An Insider's Look at Making Spectacular Images of Space
2016-06-13Towards Understandable Neural Networks for High Level AI Tasks - Part 6
2016-06-13The 37th UW/MS Symposium in Computational Linguistics
2016-06-13The Linear Algebraic Structure of Word Meanings
2016-06-13Machine Learning Algorithms Workshop
2016-06-13Interactive and Interpretable Machine Learning Models for Human Machine Collaboration
2016-06-13Improving Access to Clinical Data Locked in Narrative Reports: An Informatics Approach
2016-06-13Representation Power of Neural Networks
2016-06-13Green Security Games
2016-06-13e-NABLE: A Global Network of Digital Humanitarians on an Infrastructure of Electronic Communications
2016-06-10Microsoft Research New England: An introduction
2016-06-06Python+Machine Learning tutorial - Data munging for predictive modeling with pandas and scikit-learn
2016-06-06Symposium: Deep Learning - Xiaogang Wang
2016-06-06Symposium: Deep Learning - Leon Gatys
2016-06-06Symposium: Brains, Minds and Machines - Surya Ganguli



Tags:
microsoft research
machine learning
algorithms
lmc
log-concave sampling
sgd
classification