When Are Nonconvex Optimization Problems Not Scary?

Published 11 August 2016, 8:16
General nonconvex optimization problems are NP-hard. In applied disciplines, however, nonconvex problems abound, and heuristic algorithms are often surprisingly effective. The ability of nonconvex heuristics to find high-quality solutions to practical problems remains largely mysterious. In this talk, I will describe a family of nonconvex problems that can be solved efficiently. This family has the characteristic structure that every local minimizer is also global, and the function has directional negative curvature around all saddle points (“ridable saddles”). Natural formulations of a number of important problems in signal processing and machine learning lie in this family, including the eigenvector problem, complete dictionary learning (DL), generalized phase retrieval (PR), and orthogonal tensor decomposition. This benign geometric structure allows a number of optimization methods to find a global minimizer efficiently, without special initialization. To corroborate this, I will describe a second-order trust-region algorithm. This geometric approach to solving nonconvex problems has led to new computational guarantees for several practical problems, such as DL and PR. Completing and enriching this framework is an ongoing research effort; I will highlight challenges on both the theoretical and algorithmic sides.

Joint work with Qing Qu and John Wright. An overview article is available online: arxiv.org/abs/1510.06096
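To make the "every local minimizer is global, every saddle is ridable" picture concrete, here is a minimal numerical sketch (Python, not from the talk) of the simplest member of the family, the eigenvector problem: minimizing f(x) = -xᵀAx over the unit sphere. Its local minimizers are exactly the top eigenvectors of A, so even plain Riemannian gradient descent from a random start should recover the global optimum; the matrix, step size, and iteration count below are illustrative assumptions.

```python
import numpy as np

# Eigenvector problem: min f(x) = -x^T A x subject to ||x|| = 1.
# Every local minimizer is a top eigenvector of A (hence global),
# and every saddle point has a direction of negative curvature,
# so a simple first-order method with a random start suffices here.

rng = np.random.default_rng(0)
n = 50
B = rng.standard_normal((n, n))
A = (B + B.T) / 2                   # random symmetric test matrix

x = rng.standard_normal(n)
x /= np.linalg.norm(x)              # random initialization on the sphere

step = 0.1 / np.linalg.norm(A, 2)   # conservative, illustrative step size
for _ in range(5000):
    g = -2 * A @ x                  # Euclidean gradient of f at x
    g -= (g @ x) * x                # project onto the sphere's tangent space
    x -= step * g                   # gradient step
    x /= np.linalg.norm(x)          # retract back onto the sphere

w, _ = np.linalg.eigh(A)            # reference answer via dense eigensolver
print("f(x) found  :", -x @ A @ x)
print("-lambda_max :", -w[-1])      # the two values should agree closely
```

This sketch leans on random initialization to avoid landing exactly on a saddle; the second-order trust-region algorithm described in the talk instead uses Hessian information to move along negative-curvature directions, which is what yields the stated computational guarantees.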