Since the time of Gauss, it has been generally accepted that
$\ell_2$-methods of combining observations by minimizing sums of squared errors
have significant computational advantages over earlier $\ell_1$-methods based
on minimization of absolute errors advocated by Boscovich, Laplace and others.
However, $\ell_1$-methods are known to have significant robustness advantages
over $\ell_2$-methods in many applications, and related quantile regression
methods provide a useful, complementary approach to classical least-squares
estimation of statistical models. Combining recent advances in interior point
methods for solving linear programs with a new statistical preprocessing
approach for $\ell_1$-type problems, we obtain a 10- to 100-fold improvement in
computational speeds over current (simplex-based) $\ell_1$-algorithms in large
problems, demonstrating that $\ell_1$-methods can be made competitive with
$\ell_2$-methods in terms of computational speed throughout the entire range of
problem sizes. Formal complexity results suggest that $\ell_1$-regression can
be made faster than least-squares regression for $n$ sufficiently large
and $p$ modest.
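The paper's speedups come from its interior point and preprocessing machinery; as a minimal, self-contained illustration of the underlying linear programming formulation (our sketch, not the authors' algorithm or code), the median regression fit minimizing $\sum_i |y_i - x_i^\top\beta|$ can be written as $\min_{\beta, u, v} \mathbf{1}^\top(u + v)$ subject to $X\beta + u - v = y$, $u, v \ge 0$. The Python snippet below solves this LP with SciPy's HiGHS interior point option ("highs-ipm"); the function name "lad_fit" and the toy data are our own inventions.

import numpy as np
from scipy.optimize import linprog

def lad_fit(X, y):
    # Median (L1) regression as a linear program:
    #   min 1'(u + v)  s.t.  X @ beta + u - v = y,  u, v >= 0,
    # where u and v are the positive and negative parts of the residuals.
    n, p = X.shape
    c = np.concatenate([np.zeros(p), np.ones(2 * n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs-ipm")
    assert res.success, res.message
    return res.x[:p]

# Toy data with one gross outlier (hypothetical example, not from the paper).
rng = np.random.default_rng(0)
n, p = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 2.0, -1.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)
y[0] += 50.0

beta_l1 = lad_fit(X, y)                          # robust to the outlier
beta_l2, *_ = np.linalg.lstsq(X, y, rcond=None)  # pulled toward the outlier
print("l1:", np.round(beta_l1, 2))
print("l2:", np.round(beta_l2, 2))

On such data the least-squares coefficients are visibly distorted by the single contaminated observation while the $\ell_1$ coefficients are not, which is the robustness advantage the abstract refers to; the paper's contribution is making the $\ell_1$ fit fast at scale.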
Published: 1997-11-14
Classification: $\ell_1$, $L_1$, least absolute deviations, median, regression quantiles, interior point, statistical preprocessing, linear programming, simplex method, simultaneous confidence bands
@article{1030037960,
  author = {Portnoy, Stephen and Koenker, Roger},
  title = {The Gaussian hare and the Laplacian tortoise: computability of squared-error versus absolute-error estimators},
  journal = {Statist. Sci.},
  volume = {12},
  number = {1},
  year = {1997},
  pages = {279--300},
  language = {en},
  url = {http://dml.mathdoc.fr/item/1030037960}
}
Portnoy, Stephen; Koenker, Roger. The Gaussian hare and the Laplacian tortoise: computability of squared-error versus absolute-error estimators. Statist. Sci., Volume 12 (1997) no. 1, pp. 279-300. http://gdmltest.u-ga.fr/item/1030037960/