We study estimation of the parameters of a Gaussian linear model $\mathscr{M}_0$ when we entertain the possibility that $\mathscr{M}_0$ is invalid and a larger model $\mathscr{M}_1$ should be assumed. Estimates are robust if their maximum risk over $\mathscr{M}_1$ is finite, and the most robust estimate is the least squares estimate under $\mathscr{M}_1$. We apply notions of Hodges and Lehmann (1952) and Efron and Morris (1971) to obtain (biased) estimates which do well under $\mathscr{M}_0$ at a small price in robustness. Extensions to confidence intervals, simultaneous estimation of several parameters, and large sample approximations applying to nested parametric models are also discussed.
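The tradeoff the abstract describes can be seen numerically in the simplest one-dimensional special case. The sketch below is an illustration under assumed simplifications, not code from the paper: it takes $\mathscr{M}_1$: $X \sim N(\mu, 1)$ with submodel $\mathscr{M}_0$: $\mu = 0$, so the least squares estimate under $\mathscr{M}_1$ is $X$ itself (constant risk 1, hence most robust), the least squares estimate under $\mathscr{M}_0$ is 0 (risk $\mu^2$, unbounded over $\mathscr{M}_1$), and an Efron-Morris-style limited-translation rule sits between them. The function names and the Monte Carlo setup are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def risk(estimator, mu, n=200_000):
    """Monte Carlo mean squared error of `estimator` at a given mu."""
    x = rng.normal(mu, 1.0, size=n)
    return float(np.mean((estimator(x) - mu) ** 2))

def ls_full(x):
    # Least squares under M1: unbiased, constant risk 1 -- the most robust rule.
    return x

def ls_restricted(x):
    # Least squares under M0 (mu = 0): risk mu^2, so its maximum risk over M1 is infinite.
    return np.zeros_like(x)

def limited_translation(x, d=1.0):
    # Efron-Morris-style limited-translation rule: move x toward 0, but by at most d.
    return np.sign(x) * np.maximum(np.abs(x) - d, 0.0)

mus = np.linspace(0.0, 6.0, 25)  # grid standing in for the supremum over M1
for name, est in [("LS under M1", ls_full),
                  ("LS under M0", ls_restricted),
                  ("limited translation", limited_translation)]:
    risks = [risk(est, mu) for mu in mus]
    print(f"{name:20s} risk at M0 = {risks[0]:.3f}, max risk on grid = {max(risks):.3f}")
```

In this toy setting the limited-translation rule with $d = 1$ cuts the risk at $\mathscr{M}_0$ from 1 to about 0.15, while its maximum risk grows only from 1 to about $1 + d^2 = 2$: a small bias buys a large gain under $\mathscr{M}_0$ at a small price in robustness, which is the point of the title.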
@article{1176346707,
author = {Bickel, P. J.},
title = {Parametric Robustness: Small Biases can be Worthwhile},
journal = {Ann. Statist.},
volume = {12},
number = {1},
year = {1984},
pages = {864--879},
language = {en},
url = {http://dml.mathdoc.fr/item/1176346707}
}
Bickel, P. J. Parametric Robustness: Small Biases can be Worthwhile. Ann. Statist., Vol. 12 (1984), no. 1, pp. 864-879. http://gdmltest.u-ga.fr/item/1176346707/