Supervised Dimension Reduction

Published 12 August 2016, 2:11
We look at the problem of supervised dimension reduction (SDR) from three perspectives. The SDR problem is: given observations, x-y pairs, infer a subspace of the input data without losing predictive accuracy. We first study it as a geometric problem, show that the gradient of the regression or classification function is a central quantity, and show that when the inputs are concentrated on a manifold, the convergence rate of estimators depends on the dimension of the manifold rather than that of the ambient space. We then examine the same problem from a probabilistic modeling perspective and give a Bayesian solution; in this setting, factor models and the Grassmann manifold are central. We close with an algorithmic perspective, where we show that randomized algorithms can be used for SDR on massive data, and we discuss the statistical implications of these randomized algorithms: they are a form of regularization.
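To make the geometric perspective concrete, here is a minimal sketch of one standard way the gradient enters SDR: estimate the gradient of the regression function at each sample by a weighted local linear fit, average the gradient outer products, and take the top eigenvectors as the predictive subspace. This is an illustrative assumption on my part, not the speaker's code; the function name gradient_outer_product_sdr, the Gaussian-kernel bandwidth, and the small ridge term are all placeholder choices.

import numpy as np

def gradient_outer_product_sdr(X, y, d, bandwidth=1.0):
    # Estimate a d-dimensional SDR subspace from the average gradient outer product.
    n, p = X.shape
    G = np.zeros((p, p))
    for i in range(n):
        diffs = X - X[i]                                   # displacements from the i-th point
        w = np.exp(-np.sum(diffs ** 2, axis=1) / (2 * bandwidth ** 2))  # Gaussian kernel weights
        A = diffs * w[:, None]
        # Weighted local linear fit: solve for the gradient of the regression function at X[i].
        grad = np.linalg.solve(A.T @ diffs + 1e-6 * np.eye(p), A.T @ (y - y[i]))
        G += np.outer(grad, grad)
    G /= n
    # Top eigenvectors of the averaged gradient outer product span the predictive subspace.
    _, eigvecs = np.linalg.eigh(G)
    return eigvecs[:, ::-1][:, :d]                          # p x d orthonormal basis

# Toy check: the response depends on X only through its first coordinate,
# so the recovered one-dimensional basis should align with that axis.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)
B = gradient_outer_product_sdr(X, y, d=1)

The cubic-in-p eigendecomposition and the per-sample local fits are exactly the costs that the randomized algorithms mentioned above are meant to reduce on massive data.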