Higher Order Principal Components: Complexity and Applications

Published 17 August 2016, 1:02
Standard principal components are directions that optimize the second moments of a given set of points and have proven to be a powerful tool. Here we consider higher-order principal components, i.e., directions that optimize higher moments of a data set (equivalently, the spectral norm of higher-order arrays). These appear to be much less structured: there can be exponentially many, they need not be pairwise orthogonal, and it is NP-hard to find global maxima for arbitrary inputs. We discuss applications to combinatorial optimization and learning: (a) finding a planted clique in a random graph, where higher-order maxima would be effective even for semi-random inputs, and (b) learning an unknown function of a low-dimensional subspace from labeled examples (a k-subspace junta, generalizing the well-known class of k-juntas), where *local* optima suffice and can be approximated efficiently for a wide class of input distributions. Most of the talk is joint work with Ying Xiao.
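To make the objective concrete: a higher-order principal component is a unit direction u maximizing the k-th moment f(u) = E[(x·u)^k] over the data. A minimal illustrative sketch (not the talk's algorithm; the function name and parameters are ours) approximates a *local* maximizer by normalized gradient ascent, which for this objective coincides with power iteration on the empirical k-th moment tensor:

```python
import numpy as np

def higher_order_pc(X, k=3, iters=200, seed=0):
    """Approximate a local maximizer of f(u) = mean((X @ u)**k) over unit u.

    Illustrative sketch: normalized gradient ascent, i.e. power iteration
    on the empirical k-th moment tensor of the rows of X.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)
    for _ in range(iters):
        p = X @ u                                 # projections x_i . u
        g = k * (X.T @ (p ** (k - 1))) / n        # gradient of the k-th moment
        nrm = np.linalg.norm(g)
        if nrm == 0:
            break                                  # stationary point
        u = g / nrm                                # project back to the unit sphere
    return u
```

For k = 2 this recovers ordinary power iteration and hence the top singular direction; for k > 2 different random starts can land on different local maxima, reflecting the lack of structure (many optima, no pairwise orthogonality) noted above.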