Machine Learning Workshop - A Proximal-Gradient Homotopy Method for the Sparse Least-Squares Problem

Subscribers: 344,000
Video Link: https://www.youtube.com/watch?v=ZmNmxWXV_Ho



Duration: 23:24
Views: 809


Machine Learning Workshop, Session 5 - Lin Xiao - "A Proximal-Gradient Homotopy Method for the Sparse Least-Squares Problem"

We consider the l1-regularized least-squares problem in the context of sparse recovery or compressed sensing. The standard proximal gradient method (iterative soft-thresholding) has a low computational cost per iteration but a rather slow convergence rate. Nevertheless, when the solution is sparse, it often exhibits fast linear convergence in the end. We exploit this local linear convergence using a homotopy continuation strategy: we minimize the objective for a sequence of decreasing values of the regularization parameter, and use the approximate solution at the end of each stage to warm-start the next stage. Similar strategies have been studied in the literature, but there has been no theoretical analysis of their global iteration complexity. We show that under suitable assumptions for sparse recovery, the proposed homotopy strategy ensures that all iterates along the homotopy solution path are sparse. The objective function is therefore effectively strongly convex along the path, and geometric convergence can be established at each stage. As a result, the overall iteration complexity of our method is O(log(1/epsilon)) for finding an epsilon-optimal solution.
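The strategy described in the abstract can be sketched in a few lines of NumPy. This is not the paper's exact algorithm (the talk's method includes specific stage tolerances and parameter choices); it is a minimal illustration of the two ingredients named above: iterative soft-thresholding (the proximal gradient step for the l1 penalty) and homotopy continuation, i.e., geometrically decreasing the regularization parameter from lambda_max down to the target value, warm-starting each stage from the previous one. All function names and the decrease factor `eta` are assumptions for this sketch.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||x||_1: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, x0, step, tol=1e-7, max_iter=2000):
    # Standard proximal gradient (iterative soft-thresholding) for
    # minimize (1/2)||A x - b||^2 + lam * ||x||_1, warm-started at x0.
    x = x0.copy()
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)
        x_new = soft_threshold(x - step * grad, step * lam)
        if np.linalg.norm(x_new - x) <= tol:
            return x_new
        x = x_new
    return x

def prox_grad_homotopy(A, b, lam_target, eta=0.7, tol=1e-7):
    # Homotopy continuation: solve a sequence of lasso problems with
    # decreasing lambda, using each approximate solution to warm-start
    # the next stage. eta in (0, 1) is the per-stage decrease factor.
    step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1/L with L = ||A||_2^2
    lam = np.max(np.abs(A.T @ b))              # lambda_max: x = 0 is optimal
    x = np.zeros(A.shape[1])
    while lam > lam_target:
        lam = max(eta * lam, lam_target)
        x = ista(A, b, lam, x, step, tol=tol * max(lam, 1.0))
    return ista(A, b, lam_target, x, step, tol=tol)
```

In a typical sparse-recovery setup (random Gaussian `A`, a k-sparse ground truth), the iterates stay sparse along the path, which is exactly the property the talk exploits to argue effective strong convexity and geometric convergence at each stage.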




Other Videos By Microsoft Research


2016-08-11Future Perfect: The Case for Progress in A Networked Age
2016-08-11Towards ad hoc interactions with robots
2016-08-11Dynamically Enforcing Knowledge-based Security Policies
2016-08-11Real Applications of Non-Real Numbers
2016-08-11From the Information Extraction Pipeline to Global Models, and Back
2016-08-11Some Algorithmic Problems in High Dimensions
2016-08-11Machine Learning Course - Lecture 2
2016-08-11Panel: Open Data for Open Science - Data Interoperability
2016-08-11Cloud Computing - What Do Researchers Want? - A Panel Discussion
2016-08-11Machine Learning Work Shop - Recovery of Simultaneously Structured Models by Convex Optimization
2016-08-11Machine Learning Work Shop- A Proximal-Gradient Homotopy Method for the Sparse Least-Squares Problem
2016-08-11Machine Learning Work Shop - Combining Machine and Human Intelligence in Crowdsourcing
2016-08-11Graph Drawing 2012 Day 3 - Session 4
2016-08-11Machine Learning Work Shop-Session 4 - Hariharan Narayanan - Testing the Manifold Hypothesis
2016-08-11Machine Learning Work Shop-Session 3 - Pedro Domingos - Learning Tractable but Expressive Models
2016-08-11Machine Learning Work Shop - Graphical Event Models for Temporal Event Streams
2016-08-11Machine Learning Work Shop - Online Learning Against Adaptive Adversaries
2016-08-11Machine Learning Work Shop - Counterfactual Measurements and Learning Systems
2016-08-11Machine Learning Work Shop - Why Submodularity is Important to Machine Learning
2016-08-11Machine Learning Work Shop - Bayesian Nonparametrics for Complex Dynamical Phenomena
2016-08-11Machine Learning Work Shop - GraphLab: Large-scale Machine Learning on Natural Graphs



Tags:
microsoft research