Supervised Dimensionality Reduction with Principal Component Analysis

Published 6 September 2016
Principal component analysis (PCA) is widely applied for unsupervised dimensionality reduction. When labels are available, e.g., in a classification or regression task, PCA is, however, unable to use this information. In this talk I will present our recent work on supervised dimensionality reduction, where the outputs (i.e., the supervised information) can be (multi-label) classification labels or regression values. The first approach is based on a supervised latent variable model and turns out to solve a generalized eigenvalue problem. The second approach goes beyond the first and can handle the more interesting semi-supervised setting, i.e., where only part of the data is labeled. In this approach, learning is done via an efficient EM algorithm. Both approaches can be kernelized to handle non-linear mappings. They are compared with other competitors on various data sets and show very encouraging performance.
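To give a feel for the eigen-decomposition flavour of the first approach, the sketch below implements a simple supervised-PCA-style reduction in which the projection directions are the top eigenvectors of a matrix coupling the (centered) data with a label kernel. This is only a minimal illustration under assumed choices (the function name `supervised_pca`, the label kernel `Y @ Y.T`, and the toy data are all illustrative), not the exact model presented in the talk.

```python
import numpy as np

def supervised_pca(X, Y, k):
    """Project X onto k directions that align with the targets Y.

    X : (n, d) data matrix
    Y : (n, c) targets (one-hot labels or regression values)
    k : number of components to keep
    """
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    L = Y @ Y.T                           # simple linear label kernel (assumption)
    Q = X.T @ H @ L @ H @ X               # d x d matrix whose top eigenvectors
    eigvals, eigvecs = np.linalg.eigh(Q)  # are the supervised directions
    W = eigvecs[:, ::-1][:, :k]           # eigh is ascending, so reverse and take k
    return (H @ X) @ W                    # centered data projected to k dimensions

# Toy usage: two-class data in 5 dimensions reduced to 2.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = (X[:, 0] + 0.1 * rng.normal(size=100) > 0).astype(int)
Y = np.eye(2)[y]                          # one-hot labels
Z = supervised_pca(X, Y, k=2)
print(Z.shape)                            # (100, 2)
```

Because only an eigen-decomposition of a d x d matrix is needed, such formulations scale with the input dimension rather than the sample size; the kernelized variants mentioned in the talk instead work with n x n kernel matrices to capture non-linear mappings.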