Machine Learning Work Shop - Recovery of Simultaneously Structured Models by Convex Optimization

Subscribers:
344,000
Published on: 2016-08-11
Video Link: https://www.youtube.com/watch?v=UXig6s6DP7E



Duration: 26:02
Views: 1,211
Likes: 4


Machine Learning Work Shop - Session 5 - Maryam Fazel - "Recovery of Simultaneously Structured Models by Convex Optimization"

The topic of deriving a structured model from a small number of linear observations by solving a convex optimization problem has been well studied in recent years. Examples include the recovery of sparse or group-sparse vectors (which gave rise to the area of compressed sensing), low-rank matrices (arising in collaborative filtering and system identification), and sums of sparse and low-rank matrices (arising in PCA with sparse outliers and in graphical models).

In many applications in signal processing and machine learning, the model of interest is known to be structured in several ways at the same time, for example, a matrix that is simultaneously sparse and low-rank. An application in signal processing is the classic sparse phase retrieval problem, where the goal is to recover a sparse signal from phaseless (magnitude-only) measurements. In machine learning, the problem comes up when combining several regularizers that each promote a certain desired structure.

In this work, we address two questions: what convex relaxation should we use for the recovery of a "simultaneously structured" model, and how many measurements are needed generically? Often, penalties (norms) that promote each structure are known (e.g., the l1 norm for sparsity, the nuclear norm for matrix rank), so it is reasonable to minimize a combination of these norms. We show that, surprisingly, if we use as the objective function any convex joint function of the individual norms, we can do no better than an algorithm that exploits only one of the several structures. We then specialize our result to the case of simultaneously sparse and low-rank matrices, and present numerical simulations that support the theoretical bounds on the required number of observations.
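To make the combined-norm relaxation discussed in the abstract concrete, here is a minimal sketch in Python using the cvxpy package. The problem sizes, the Gaussian measurement model, and the weights lam1 and lam2 are illustrative assumptions, not taken from the talk:

    import numpy as np
    import cvxpy as cp

    np.random.seed(0)

    # Ground truth: a simultaneously sparse and low-rank matrix
    # (rank one, with only a few nonzero rows/columns). Sizes are illustrative.
    n, k = 20, 3
    u = np.zeros(n)
    u[:k] = np.random.randn(k)
    X_true = np.outer(u, u)

    # Generic linear measurements y_i = <A_i, X_true> with Gaussian A_i.
    m = 120
    A = np.random.randn(m, n, n)
    y = np.tensordot(A, X_true, axes=2)

    # Combined convex relaxation: a weighted sum of the elementwise l1 norm
    # (promotes sparsity) and the nuclear norm (promotes low rank), subject
    # to consistency with the measurements. The weights are assumptions.
    X = cp.Variable((n, n))
    lam1, lam2 = 1.0, 1.0
    objective = cp.Minimize(lam1 * cp.sum(cp.abs(X)) + lam2 * cp.normNuc(X))
    constraints = [cp.sum(cp.multiply(A[i], X)) == y[i] for i in range(m)]
    cp.Problem(objective, constraints).solve()

    print("relative recovery error:",
          np.linalg.norm(X.value - X_true) / np.linalg.norm(X_true))

The talk's main result says that no convex joint function of the individual norms, including weighted sums like the one above, needs fewer generic measurements (up to constants) than the best single-norm relaxation alone; the sketch is only meant to show the kind of objective the result covers.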




Other Videos By Microsoft Research


2016-08-11 Design-led Innovation
2016-08-11 Future Perfect: The Case for Progress in A Networked Age
2016-08-11 Towards ad hoc interactions with robots
2016-08-11 Dynamically Enforcing Knowledge-based Security Policies
2016-08-11 Real Applications of Non-Real Numbers
2016-08-11 From the Information Extraction Pipeline to Global Models, and Back
2016-08-11 Some Algorithmic Problems in High Dimensions
2016-08-11 Machine Learning Course - Lecture 2
2016-08-11 Panel: Open Data for Open Science - Data Interoperability
2016-08-11 Cloud Computing - What Do Researchers Want? - A Panel Discussion
2016-08-11 Machine Learning Work Shop - Recovery of Simultaneously Structured Models by Convex Optimization
2016-08-11 Machine Learning Work Shop - A Proximal-Gradient Homotopy Method for the Sparse Least-Squares Problem
2016-08-11 Machine Learning Work Shop - Combining Machine and Human Intelligence in Crowdsourcing
2016-08-11 Graph Drawing 2012 Day 3 - Session 4
2016-08-11 Machine Learning Work Shop - Session 4 - Hariharan Narayanan - Testing the Manifold Hypothesis
2016-08-11 Machine Learning Work Shop - Session 3 - Pedro Domingos - Learning Tractable but Expressive Models
2016-08-11 Machine Learning Work Shop - Graphical Event Models for Temporal Event Streams
2016-08-11 Machine Learning Work Shop - Online Learning Against Adaptive Adversaries
2016-08-11 Machine Learning Work Shop - Counterfactual Measurements and Learning Systems
2016-08-11 Machine Learning Work Shop - Why Submodularity is Important to Machine Learning
2016-08-11 Machine Learning Work Shop - Bayesian Nonparametrics for Complex Dynamical Phenomena



Tags:
microsoft research