Algorithms for Near-Separable Nonnegative Matrix Factorization

Published August 8, 2016, 18:14
The goal in nonnegative matrix factorization (NMF) is to express, exactly or approximately, a given matrix as a product of two nonnegative matrices of smaller inner dimension. NMFs arise naturally in a variety of signal separation and unsupervised feature extraction problems, such as modeling topics in text and analyzing hyperspectral images. Computing an NMF has been shown to be NP-hard (Vavasis, 2009). Popular methods for solving the NMF problem use local search to reach a locally optimal solution. Very recently, a new class of algorithms has been proposed that solves NMF exactly under certain separability assumptions on the generative model. In this talk, I will first survey these recent developments, evaluate their potential for topic modeling applications, and then present new algorithms that solve the factorization problem exactly under the same assumptions but are superior in terms of scalability and noise robustness. I will also briefly discuss extensions to other factorization losses, such as the $\ell_1$-loss and Bregman divergences, and show applications to foreground-background separation in video. This talk is based on joint work with Vikas Sindhwani and Prabhanjan Kambadur.
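To make the separability assumption concrete: it says the columns of the left factor W appear verbatim among the columns of the data matrix X, so factorization reduces to identifying those "anchor" columns. The sketch below illustrates one standard method in this family, the successive projection algorithm (SPA); it is an illustrative example, not necessarily one of the algorithms presented in the talk.

```python
import numpy as np

def spa(X, r):
    """Successive projection algorithm: greedily pick r columns of X
    that (under the separability assumption) span all other columns."""
    R = X.astype(float).copy()
    anchors = []
    for _ in range(r):
        # Vertices of the convex hull maximize the norm, so the
        # largest-norm residual column is an anchor.
        j = int(np.argmax((R * R).sum(axis=0)))
        anchors.append(j)
        u = R[:, j] / np.linalg.norm(R[:, j])
        R = R - np.outer(u, u @ R)  # project out the chosen direction
    return anchors

# Synthetic separable instance: H starts with an identity block,
# so columns 0..2 of X are exactly the columns of W.
rng = np.random.default_rng(0)
W = rng.random((6, 3))
H = np.hstack([np.eye(3), rng.dirichlet(np.ones(3), size=7).T])
X = W @ H
print(sorted(spa(X, 3)))  # → [0, 1, 2]
```

On noiseless separable data SPA provably recovers the anchor set; the talk's contributions concern behaving well when X is only approximately separable and contaminated by noise.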