“Preconditioning” for feature selection and regression in high-dimensional problems
Paul, Debashis ; Bair, Eric ; Hastie, Trevor ; Tibshirani, Robert
Ann. Statist., Vol. 36 (2008) no. 4, pp. 1595-1618 / Harvested from Project Euclid
We consider regression problems where the number of predictors greatly exceeds the number of observations. We propose a method for variable selection that first estimates the regression function, yielding a “preconditioned” response variable. The primary method used for this initial regression is supervised principal components. Then we apply a standard procedure such as forward stepwise selection or the LASSO to the preconditioned response variable. In a number of simulated and real data examples, this two-step procedure outperforms forward stepwise selection or the usual LASSO (applied directly to the raw outcome). We also show that under a certain Gaussian latent variable model, application of the LASSO to the preconditioned response variable is consistent as the number of predictors and observations increases. Moreover, when the observational noise is rather large, the suggested procedure can give a more accurate estimate than LASSO. We illustrate our method on some real problems, including survival analysis with microarray data.
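As a rough illustration of the two-step procedure described in the abstract, the following Python sketch first forms a supervised-principal-components estimate of the regression function and then runs the LASSO on the resulting preconditioned response. This is a minimal sketch, not the authors' implementation: the screening threshold theta, the use of a single principal component, the LassoCV settings, and the synthetic data are all illustrative assumptions.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LassoCV

def precondition_then_lasso(X, y, theta=0.2, n_components=1):
    """Two-step sketch: supervised principal components, then the LASSO."""
    n = len(y)
    # Step 1a: screen features by absolute univariate correlation with y
    # (theta is an illustrative threshold, not a value from the paper).
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    ys = (y - y.mean()) / y.std()
    corr = np.abs(Xs.T @ ys) / n
    keep = corr > theta
    # Step 1b: take the leading principal component(s) of the screened
    # features and regress y on the scores to get the preconditioned response.
    scores = PCA(n_components=n_components).fit_transform(Xs[:, keep])
    coef, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)
    y_precond = y.mean() + scores @ coef
    # Step 2: apply the LASSO to the preconditioned response instead of y.
    return LassoCV(cv=5).fit(X, y_precond)

# Example: p >> n, five truly active predictors, large observation noise.
rng = np.random.default_rng(0)
n, p = 100, 1000
X = rng.standard_normal((n, p))
y = X[:, :5].sum(axis=1) + 3.0 * rng.standard_normal(n)
fit = precondition_then_lasso(X, y)
print("selected features:", np.flatnonzero(fit.coef_))

Because the LASSO in step 2 is fit to the denoised response rather than the raw outcome, the selection step faces far less observational noise, which is the intuition behind the consistency result stated in the abstract.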
Published: 2008-08-15
Classification: 62J07. Keywords: model selection, prediction error, lasso
@article{1216237293,
     author = {Paul, Debashis and Bair, Eric and Hastie, Trevor and Tibshirani, Robert},
     title = {``Preconditioning'' for feature selection and regression in high-dimensional problems},
     journal = {Ann. Statist.},
     volume = {36},
     number = {4},
     year = {2008},
     pages = {1595--1618},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1216237293}
}
Paul, Debashis; Bair, Eric; Hastie, Trevor; Tibshirani, Robert. “Preconditioning” for feature selection and regression in high-dimensional problems. Ann. Statist., Vol. 36 (2008) no. 4, pp. 1595-1618. http://gdmltest.u-ga.fr/item/1216237293/