Achieving information-theoretic limits in high-dimensional regression.

Subscribers: 344,000
Video Link: https://www.youtube.com/watch?v=-lgKB3QoBMA
Category: Guide
Duration: 56:15
Views: 104

Problems in high-dimensional regression have attracted immense interest lately, with examples ranging from graphical model selection and multi-label prediction to computer vision and genomics. Information-theoretic limits relate four quantities, namely sample size, dimension, sparsity, and signal-to-noise ratio, that govern accurate variable selection. We analyze an iterative algorithm, similar in spirit to forward stepwise regression, for a linear model with a specific coefficient structure, and we show that its performance is optimal when compared to these information-theoretic limits. Beyond providing a practical solution to a long-standing problem in communication, these results contribute to the understanding of thresholds for variable selection in high-dimensional regression.
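To make the setting concrete, the sketch below shows plain forward stepwise variable selection on a synthetic sparse linear model. It is only an illustration of the generic greedy procedure the abstract alludes to, not the speaker's specific algorithm or coefficient structure; the function name, data sizes, and stopping rule (a fixed sparsity level k) are assumptions for the example.

```python
# Minimal sketch (not the talk's exact algorithm): greedy forward stepwise
# selection for a sparse linear model y = X @ beta + noise. Each step adds the
# column most correlated with the current residual, then refits by least
# squares on the selected support.
import numpy as np

def forward_stepwise(X, y, k):
    """Greedily select k variables; return the support and fitted coefficients."""
    support = []
    residual = y.copy()
    beta_s = np.zeros(0)
    for _ in range(k):
        scores = np.abs(X.T @ residual)     # correlation of each column with residual
        scores[support] = -np.inf           # never re-select a variable
        support.append(int(np.argmax(scores)))
        # Refit on the current support and update the residual.
        beta_s, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        residual = y - X[:, support] @ beta_s
    return support, beta_s

# Synthetic instance: n samples, dimension p >> n, k-sparse coefficients,
# Gaussian noise whose scale sigma sets the signal-to-noise ratio.
rng = np.random.default_rng(0)
n, p, k, sigma = 200, 1000, 5, 0.5
X = rng.standard_normal((n, p)) / np.sqrt(n)
true_support = rng.choice(p, size=k, replace=False)
beta = np.zeros(p)
beta[true_support] = 1.0
y = X @ beta + sigma * rng.standard_normal(n)

est_support, _ = forward_stepwise(X, y, k)
print("true support     :", sorted(true_support.tolist()))
print("estimated support:", sorted(est_support))
```

The information-theoretic question the talk addresses is how large n must be, as a function of p, k, and the noise level, for any procedure (and in particular a greedy one like this) to recover the true support exactly.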

Tags:
microsoft research