Achieving information-theoretic limits in high-dimensional regression.

Published 28 July 2016, 0:10
Problems in high-dimensional regression have attracted immense interest lately. Examples include graphical model selection, multi-label prediction, computer vision, and genomics. Information-theoretic limits relate four quantities, viz. sample size, dimension, sparsity, and signal-to-noise ratio, required for accurate variable selection. We provide an analysis of an iterative algorithm, similar in spirit to forward stepwise regression, for a linear model with a specific coefficient structure. We demonstrate that the algorithm achieves optimal performance relative to these information-theoretic limits. Beyond providing a practical solution to a long-standing problem in communication, these results also contribute to the understanding of thresholds for variable selection in high-dimensional regression.
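The abstract does not specify the algorithm, only that it is similar in spirit to forward stepwise regression. As background, here is a minimal sketch of classical forward stepwise selection for a sparse linear model: greedily add the column most correlated with the current residual, then refit least squares on the selected support. All names and the synthetic data below are illustrative, not from the talk.

```python
import numpy as np

def forward_stepwise(X, y, k):
    """Greedy forward selection: at each of k steps, add the column most
    correlated with the residual, then refit least squares on the support.
    A classical baseline; the talk's algorithm is only said to resemble it."""
    n, p = X.shape
    support = []
    residual = y.copy()
    for _ in range(k):
        # Score unselected columns by absolute correlation with the residual.
        scores = np.abs(X.T @ residual)
        scores[support] = -np.inf  # exclude already-selected columns
        support.append(int(np.argmax(scores)))
        # Refit least squares on the current support and update the residual.
        beta, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        residual = y - X[:, support] @ beta
    return sorted(support), beta

# Synthetic sparse model: 3 of 50 coefficients are nonzero, high SNR.
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[[3, 17, 41]] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.1 * rng.standard_normal(n)
support, beta = forward_stepwise(X, y, k=3)
print(support)
```

At this sample size and signal-to-noise ratio the procedure recovers the true support; the information-theoretic limits mentioned above characterize exactly how small n or the SNR can get before any such procedure must fail.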