Some sharp performance bounds for least squares regression with L1 regularization
Zhang, Tong
Ann. Statist., Volume 37 (2009) no. 5A, pp. 2109-2144 / Harvested from Project Euclid
We derive sharp performance bounds for least squares regression with L1 regularization from the perspectives of parameter estimation accuracy and feature selection quality. The main result proved for L1 regularization extends a similar result in [Ann. Statist. 35 (2007) 2313–2351] for the Dantzig selector. It gives an affirmative answer to an open question in [Ann. Statist. 35 (2007) 2358–2364]. Moreover, the result leads to an extended view of feature selection that allows less restrictive conditions than some recent work. Based on the theoretical insights, a novel two-stage L1-regularization procedure with selective penalization is analyzed. It is shown that if the target parameter vector can be decomposed as the sum of a sparse parameter vector with large coefficients and another less sparse vector with relatively small coefficients, then the two-stage procedure can lead to improved performance.
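For orientation, the following display is a minimal sketch of the estimators the abstract refers to, in standard Lasso notation (design matrix $X \in \mathbb{R}^{n \times p}$, response $Y \in \mathbb{R}^n$, regularization parameter $\lambda > 0$); the hard-threshold form of the selection step is an illustrative assumption, not taken from this record. The L1-regularized least squares (Lasso) estimator is

\[
\hat{\beta} \;=\; \operatorname*{arg\,min}_{\beta \in \mathbb{R}^p} \; \frac{1}{n}\,\lVert X\beta - Y \rVert_2^2 \;+\; \lambda \lVert \beta \rVert_1 .
\]

A two-stage procedure with selective penalization, sketched under the same assumptions: compute $\hat{\beta}$ as above, keep the set of large coefficients $F = \{\, j : |\hat{\beta}_j| > \alpha \,\}$ for some threshold $\alpha > 0$, then re-solve with the L1 penalty applied only to coefficients outside $F$:

\[
\hat{\beta}^{(2)} \;=\; \operatorname*{arg\,min}_{\beta \in \mathbb{R}^p} \; \frac{1}{n}\,\lVert X\beta - Y \rVert_2^2 \;+\; \lambda \sum_{j \notin F} |\beta_j| .
\]

Leaving the large stage-one coefficients unpenalized reduces the estimation bias that a uniform L1 penalty introduces on them, which matches the abstract's claim of improved performance when the target decomposes into a sparse large-coefficient part plus a small residual part.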
Published: 2009-10-15
Keywords: L1 regularization, Lasso, regression, sparsity, variable selection, parameter estimation. MSC Classification: 62G05, 62J05
@article{1247663750,
     author = {Zhang, Tong},
     title = {Some sharp performance bounds for least squares regression with $L_1$ regularization},
     journal = {Ann. Statist.},
     volume = {37},
     number = {5A},
     year = {2009},
     pages = {2109--2144},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1247663750}
}
Zhang, Tong. Some sharp performance bounds for least squares regression with L1 regularization. Ann. Statist., Volume 37 (2009) no. 5A, pp. 2109-2144. http://gdmltest.u-ga.fr/item/1247663750/