Microsoft Research
Published August 17, 2016 · 1:02
Standard principal components are directions that optimize the second moments of a given set of points, and they have proven to be a powerful tool. Here we consider higher-order principal components, i.e., directions that optimize higher moments of a data set (equivalently, the spectral norm of higher-order tensors). These appear to be much less structured: there can be exponentially many of them, they need not be pairwise orthogonal, and finding global maxima for arbitrary inputs is NP-hard. We discuss applications to combinatorial optimization and learning: (a) finding a planted clique in a random graph, where higher-order maxima would be effective even for semi-random inputs, and (b) learning an unknown function of a low-dimensional subspace from labeled examples (a k-subspace junta, generalizing the well-known class of k-juntas), where *local* optima suffice and can be approximated efficiently for a wide class of input distributions. Most of the talk is joint work with Ying Xiao.
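To make the objective concrete, here is a minimal sketch (not from the talk; the function name and parameters are illustrative) of finding one higher-order principal component by projected gradient ascent: a unit vector w that locally maximizes the k-th moment of the projections, mean((x_i . w)^k). As the abstract notes, this finds a *local* optimum, not a global one.

```python
import numpy as np

def higher_order_pc(X, k=4, iters=500, lr=0.1, seed=0):
    """Illustrative sketch: approximate a k-th order principal component
    of the rows of X, i.e., a unit vector w locally maximizing the
    k-th moment mean((x_i . w)^k), via projected gradient ascent."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for _ in range(iters):
        proj = X @ w                              # inner products x_i . w
        grad = k * X.T @ (proj ** (k - 1)) / n    # gradient of mean((x.w)^k)
        w = w + lr * grad
        w /= np.linalg.norm(w)                    # project back to unit sphere
    return w
```

For k = 2 this recovers an ordinary (uncentered) principal component; for k > 2 different random initializations may converge to different local optima, reflecting the lack of structure described above.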