Understanding non-convex optimization for sparse coding

Published June 27, 2016, 21:01
Sparse coding is a basic algorithmic primitive in many machine learning applications, such as image denoising, edge detection, compression and deep learning. Most existing algorithms for sparse coding minimize a non-convex function by heuristics like alternating minimization, gradient descent or their variants. Despite their success in practice, these heuristics are not mathematically rigorous because they could converge to local minima. In this work we prove that alternating minimization and some other variants provably converge to global minima from a suitable starting point, under a generative model and incoherence assumptions on the ground-truth codes. We also provide a new spectral-method-based initialization procedure that returns such a good starting point. Our framework of analysis is potentially useful for analyzing other alternating-minimization-type algorithms for problems with hidden variables. No prior knowledge is assumed, and this is joint work with Sanjeev Arora, Rong Ge and Ankur Moitra.
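To make the setting concrete, here is a minimal NumPy sketch of the generic alternating-minimization scheme the abstract refers to: data `Y` is modeled as `A @ X` with an unknown dictionary `A` and `k`-sparse codes `X`, and the two factors are updated in turn. The thresholding rule, dictionary update, and random initialization below are illustrative assumptions, not the authors' exact algorithm (which uses a spectral initialization and comes with convergence guarantees).

```python
import numpy as np

def sparse_coding_altmin(Y, m, k, n_iters=50, seed=0):
    """Illustrative alternating minimization for the model Y ~ A @ X.

    Y : (d, n) data matrix; m : dictionary size; k : sparsity per column.
    Alternates a sparse-coding step (hard thresholding to the k largest
    entries per column of A^T Y) with a dictionary update (least squares).
    """
    rng = np.random.default_rng(seed)
    d, n = Y.shape
    A = rng.standard_normal((d, m))
    A /= np.linalg.norm(A, axis=0)  # unit-norm dictionary columns
    X = np.zeros((m, n))
    for _ in range(n_iters):
        # Sparse-coding step: project data onto the current dictionary,
        # then keep only the k largest-magnitude entries in each column.
        Z = A.T @ Y
        X = np.zeros_like(Z)
        idx = np.argsort(-np.abs(Z), axis=0)[:k]
        np.put_along_axis(X, idx, np.take_along_axis(Z, idx, axis=0), axis=0)
        # Dictionary update: least-squares fit of A given the codes X.
        A = Y @ np.linalg.pinv(X)
        A /= np.linalg.norm(A, axis=0) + 1e-12  # re-normalize columns
    return A, X
```

Because the objective is non-convex, a random initialization like the one above may converge to a poor local minimum; the result discussed in the talk is that, from a sufficiently good starting point (which their spectral procedure supplies), such iterations reach a global minimum.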