This paper provides a comparative sensitivity analysis of one-step
Newton–Raphson estimators for linear regression. Such estimators have
been proposed as a way to combine the global stability of high breakdown
estimators with the local stability of generalized maximum likelihood
estimators. We analyze this strategy, obtaining upper bounds for the maximum
bias induced by $\varepsilon$-contamination of the model. These bounds yield
breakdown points and local rates of convergence of the bias as
$\varepsilon$ decreases to zero. We treat a unified class of
Newton–Raphson estimators, including one-step versions of the well-known
Schweppe, Mallows and Hill–Ryan GM estimators. Of the three
well-known types, the Hill–Ryan form emerges as the most stable in terms
of one-step estimation. The Schweppe form is susceptible to a breakdown of the
Hessian matrix. For this reason it fails to improve on the local stability of
the initial estimator, and it may lead to falsely optimistic estimates of
precision.
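To make the one-step idea concrete, here is a minimal numerical sketch of a Mallows-type one-step GM update, not the paper's exact formulation: it assumes a Huber score function with an illustrative tuning constant, a user-supplied high-breakdown initial estimate, a robust residual scale, and design-based weights, all of which are placeholders rather than the authors' specific choices.

```python
import numpy as np

def huber_psi(u, c=1.345):
    """Huber psi function and its derivative (illustrative choice of score)."""
    psi = np.clip(u, -c, c)
    dpsi = (np.abs(u) <= c).astype(float)
    return psi, dpsi

def one_step_mallows(X, y, beta0, sigma, w_x, c=1.345):
    """One Newton-Raphson step from beta0 with Mallows-type design weights.

    X     : (n, p) design matrix
    y     : (n,) responses
    beta0 : (p,) initial estimate (e.g. from a high-breakdown estimator)
    sigma : robust residual scale estimate
    w_x   : (n,) design-based weights downweighting high-leverage points
    """
    r = (y - X @ beta0) / sigma
    psi, dpsi = huber_psi(r, c)
    # Score: sigma * sum_i w_i * psi(r_i) * x_i
    score = X.T @ (w_x * psi) * sigma
    # Hessian: sum_i w_i * psi'(r_i) * x_i x_i^T
    H = (X * (w_x * dpsi)[:, None]).T @ X
    # Single Newton-Raphson correction of the initial estimate
    return beta0 + np.linalg.solve(H, score)
```

As a usage illustration, `beta0` could come from a least-trimmed-squares fit and `w_x` from robust Mahalanobis distances of the carriers; Schweppe and Hill–Ryan variants differ in how these weights enter the residual scaling and the Hessian, which is where the stability differences discussed above arise.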
Published: 1998-06-14
Classification: Breakdown point, maximum bias function, Newton–Raphson, robust statistics, weighted least squares, 62F35, 62J02, 62F10, 62J05
@article{1024691092,
author = {Simpson, Douglas G. and Yohai, Victor J.},
title = {Functional stability of one-step GM-estimators in
approximately linear regression},
journal = {Ann. Statist.},
volume = {26},
number = {3},
year = {1998},
pages = {1147--1169},
language = {en},
url = {http://dml.mathdoc.fr/item/1024691092}
}
Simpson, Douglas G.; Yohai, Victor J. Functional stability of one-step GM-estimators in
approximately linear regression. Ann. Statist., Volume 26 (1998) no. 3, pp. 1147-1169. http://gdmltest.u-ga.fr/item/1024691092/