Basis pursuit (BP), basis pursuit denoising (BPDN), and the least absolute shrinkage and selection operator (LASSO) are popular methods for identifying important predictors in the high-dimensional linear regression model $Y = X\beta + \varepsilon$. By definition, when $\varepsilon = 0$, BP uniquely recovers $\beta$ when $X\beta = Xb$ and $b \neq \beta$ implies $\|b\|_1 > \|\beta\|_1$ (identifiability condition). Furthermore, LASSO can recover the sign of $\beta$ only under a much stronger irrepresentability condition. Meanwhile, it is known that the model selection properties of LASSO can be improved by hard thresholding its estimates. This article supports these findings by proving that thresholded LASSO, thresholded BPDN, and thresholded BP recover the sign of $\beta$ in both the noisy and noiseless cases if and only if $\beta$ is identifiable and large enough. In particular, if $X$ has i.i.d. Gaussian entries and the number of predictors grows linearly with the sample size, then these thresholded estimators can recover the sign of $\beta$ when the signal sparsity is asymptotically below the Donoho–Tanner transition curve. This is in contrast to the regular LASSO, which asymptotically recovers the sign of $\beta$ only when the signal sparsity tends to 0. Numerical experiments show that the identifiability condition, unlike the irrepresentability condition, does not seem to be affected by the structure of the correlations in the $X$ matrix.
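The thresholding procedure the abstract refers to can be sketched as follows: fit LASSO, then set to zero every coefficient whose magnitude falls below a hard threshold, and compare signs with the true $\beta$. This is a minimal illustrative sketch only; the sample size, sparsity, regularization level `alpha`, and threshold `tau` below are assumptions for demonstration, not the paper's tuning.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Illustrative setup (values are assumptions, not from the paper):
# i.i.d. Gaussian design, a sparse beta with large (easily separable) signals.
rng = np.random.default_rng(0)
n, p, k = 100, 200, 5                      # sample size, predictors, signal sparsity
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:k] = 10.0                            # "large enough" signal, as sign recovery requires
y = X @ beta + 0.5 * rng.standard_normal(n)

# Step 1: fit the regular LASSO.
lasso = Lasso(alpha=0.1, max_iter=10_000).fit(X, y)

# Step 2: hard-threshold the estimates (tau is an illustrative choice).
tau = 1.0
beta_thr = np.where(np.abs(lasso.coef_) > tau, lasso.coef_, 0.0)

# Check whether the sign pattern of beta was recovered.
sign_recovered = np.array_equal(np.sign(beta_thr), np.sign(beta))
print(sign_recovered)
```

The point of the second step is that LASSO's spurious nonzero coefficients tend to be small in magnitude, so a hard threshold can remove them even when the irrepresentability condition fails and the plain LASSO support is wrong.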
Subject classification (UKÄ)
- Probability Theory and Statistics