Oral Session: Learning Theory and Algorithms for Forecasting Non-stationary Time Series

Subscribers: 343,000
Video Link: https://www.youtube.com/watch?v=iQu9QmWtWWc
Duration: 20:50
Views: 878

We present data-dependent learning bounds for the general scenario of non-stationary non-mixing stochastic processes. Our learning guarantees are expressed in terms of a data-dependent measure of sequential complexity and a discrepancy measure that can be estimated from data under some mild assumptions. We use our learning bounds to devise new algorithms for non-stationary time series forecasting for which we report some preliminary experimental results.
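The discrepancy measure in these bounds compares the loss behavior at the prediction time with the losses observed over the sample, and the abstract notes it can be estimated from data under mild assumptions. As a rough illustration only (not the estimator from the paper), the Python sketch below computes a crude empirical discrepancy over a deliberately simple hypothesis class of constant predictors; the name empirical_discrepancy, the trailing-block length recent, and the candidate grid are hypothetical choices made for this example.

import numpy as np

# Illustrative sketch, not the authors' estimator: take the largest gap in
# average squared loss between the most recent block of the series and the
# full sample, over a finite grid of constant one-step predictors.
def empirical_discrepancy(y, candidates, recent=50):
    y = np.asarray(y, dtype=float)
    gaps = []
    for c in candidates:
        loss = (y - c) ** 2                       # squared loss of always predicting c
        gaps.append(abs(loss[-recent:].mean() - loss.mean()))
    return max(gaps)

# Usage: a series whose mean drifts upward should show a larger
# discrepancy than a stationary one.
rng = np.random.default_rng(0)
stationary = rng.normal(0.0, 1.0, 500)
drifting = rng.normal(0.0, 1.0, 500) + np.linspace(0.0, 3.0, 500)
grid = np.linspace(-2.0, 5.0, 71)                 # hypothetical candidate predictors
print(empirical_discrepancy(stationary, grid))    # small
print(empirical_discrepancy(drifting, grid))      # noticeably larger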




Other Videos By Microsoft Research


2016-06-22  Symposium: Deep Learning - Max Jaderberg
2016-06-22  Satisfiability of Ordering CSPs Above Average Is Fixed-Parameter Tractable
2016-06-22  Symposium: Deep Learning - Harri Valpola
2016-06-22  Machine Learning as Creative Tool for Designing Real-Time Expressive Interactions
2016-06-22  Symposium: Deep Learning - Sergey Ioffe
2016-06-22  Multi-rate neural networks for efficient acoustic modeling
2016-06-22  Robust Spectral Inference for Joint Stochastic Matrix Factorization and Topic Modeling
2016-06-22  Computational Limits in Statistical Inference: Hidden Cliques and Sum of Squares
2016-06-22  Extreme Classification: A New Paradigm for Ranking & Recommendation
2016-06-22  A Lasserre-Based (1+epsilon)-Approximation for Makespan Scheduling with Precedence Constraints
2016-06-22  Oral Session: Learning Theory and Algorithms for Forecasting Non-stationary Time Series
2016-06-22  Recent Developments in Combinatorial Optimization
2016-06-22  Invited Talks: Computational Principles for Deep Neuronal Architectures
2016-06-22  Coalescence in Branching Trees and Branching Random Walks
2016-06-22  Oral Session: Randomized Block Krylov Methods
2016-06-22  Oral Session: Fast Convergence of Regularized Learning in Games
2016-06-22  Ito Processes, Correlated Sampling and Applications
2016-06-22  Statistical and computational trade-offs in estimation of sparse principal components
2016-06-22  How to Write a Great Research Paper
2016-06-22  Graphical Bandits
2016-06-22  Tutorial: Monte Carlo Inference Methods



Tags:
microsoft research
algorithms