Machine Learning Workshop - Recovery of Simultaneously Structured Models by Convex Optimization

Published 12 August 2016, 3:06
Machine Learning Workshop, Session 5 - Maryam Fazel - 'Recovery of Simultaneously Structured Models by Convex Optimization'

The problem of recovering a structured model from a small number of linear observations by solving a convex optimization problem has been well studied in recent years. Examples include the recovery of sparse or group-sparse vectors (which gave rise to the field of compressed sensing), low-rank matrices (arising in collaborative filtering and system identification), and sums of sparse and low-rank matrices (arising in PCA with sparse outliers and in graphical models). In many applications in signal processing and machine learning, the model of interest is known to be structured in several ways at the same time, for example a matrix that is simultaneously sparse and low-rank. An application in signal processing is the classic sparse phase retrieval problem, where the goal is to recover a sparse signal from phaseless (magnitude-only) measurements. In machine learning, the problem arises when combining several regularizers that each promote a desired structure.

In this work, we address two questions: what convex relaxation should we use to recover a "simultaneously structured" model, and how many measurements are needed generically? Penalties (norms) that promote each individual structure are often known (e.g., the ℓ1 norm for sparsity, the nuclear norm for low matrix rank), so it is natural to minimize a combination of these norms. We show that, surprisingly, if we use as objective function any convex joint function of the individual norms, we can do no better than an algorithm that exploits only one of the several structures. We then specialize this result to the case of simultaneously sparse and low-rank matrices and present numerical simulations that support the theoretical bounds on the required number of observations.
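As a concrete illustration of the combined-norm relaxation discussed above, here is a minimal sketch in Python using the CVXPY modeling package. It is not the speaker's code: the problem sizes, the Gaussian measurement model, and the trade-off weight lam are illustrative assumptions.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, k, m = 20, 3, 120  # matrix size, sparsity of u, number of measurements (illustrative)

# Ground truth: X0 = u u^T is simultaneously k-sparse and rank one.
u = np.zeros(n)
u[:k] = rng.standard_normal(k)
X0 = np.outer(u, u)

# Random Gaussian measurement matrices A_i and observations y_i = <A_i, X0>.
A = rng.standard_normal((m, n, n))
y = np.array([np.sum(Ai * X0) for Ai in A])

# Convex recovery: minimize a weighted combination of the entrywise l1 norm
# (promoting sparsity) and the nuclear norm (promoting low rank), subject to
# consistency with the linear measurements. The weight lam is an assumption.
lam = 1.0
X = cp.Variable((n, n))
objective = cp.Minimize(cp.sum(cp.abs(X)) + lam * cp.norm(X, "nuc"))
constraints = [cp.sum(cp.multiply(Ai, X)) == yi for Ai, yi in zip(A, y)]
cp.Problem(objective, constraints).solve()

print("relative recovery error:", np.linalg.norm(X.value - X0) / np.linalg.norm(X0))
```

Per the talk's main result, no choice of lam (or, more generally, no convex joint function of the two norms) reduces the generic number of measurements below what the better of the two individual norms would need on its own.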