We consider the problem of robust inference for the binomial $(m,
\pi)$ model. The discreteness of the data and the fact that the parameter and
sample spaces are bounded mean that standard robustness theory gives surprising
results. For example, the maximum likelihood estimator (MLE) is quite robust:
it cannot be improved on for $m=1$ but can be for $m>1$. We discuss four
other classes of estimators: $M$-estimators, minimum disparity estimators,
optimal MGP estimators, and a new class of estimators which we call
$E$-estimators. We show that $E$-estimators have a non-standard asymptotic
theory which challenges the accepted relationships between robustness concepts
and thereby provides new perspectives on these concepts.
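
As a rough illustration of the contrast the abstract draws, the sketch below compares the binomial MLE with a minimum-Hellinger-distance fit (one familiar example of a minimum disparity estimator, not necessarily the estimators studied in the paper) on contaminated binomial$(m, \pi)$ counts. The contamination scheme, sample sizes, and all function names are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: binomial MLE vs. a minimum-Hellinger-distance (disparity)
# estimator under contamination. Illustrative only; not the paper's method.
import numpy as np
from scipy.stats import binom
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
m, pi_true, n = 10, 0.2, 200          # assumed settings for the illustration

# Binomial(m, pi) counts, with a small fraction replaced by the extreme value m
x = rng.binomial(m, pi_true, size=n)
x[: int(0.05 * n)] = m                # 5% contamination at the boundary

# MLE: sample mean of the counts divided by m
pi_mle = x.mean() / m

# Minimum Hellinger distance: match empirical frequencies of 0..m to the pmf
p_hat = np.bincount(x, minlength=m + 1) / n

def hellinger(pi):
    f = binom.pmf(np.arange(m + 1), m, pi)
    return np.sum((np.sqrt(p_hat) - np.sqrt(f)) ** 2)

res = minimize_scalar(hellinger, bounds=(1e-6, 1 - 1e-6), method="bounded")
pi_mhd = res.x

print(f"true pi = {pi_true}, MLE = {pi_mle:.3f}, min-Hellinger = {pi_mhd:.3f}")
```

Under this setup the contaminated cell at $x=m$ pulls the MLE upward, while the disparity fit downweights the discrepant cell; the precise behaviour of the estimators treated in the paper (M-, MGP- and E-estimators) is developed there, not here.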
@article{1013699996,
author = {Ruckstuhl, A. F. and Welsh, A. H.},
title = {Robust fitting of the binomial model},
journal = {Ann. Statist.},
volume = {29},
number = {2},
year = {2001},
pages = {1117--1136},
language = {en},
url = {http://dml.mathdoc.fr/item/1013699996}
}
Ruckstuhl, A. F.; Welsh, A. H. Robust fitting of the binomial model. Ann. Statist., Vol. 29 (2001), no. 2, pp. 1117-1136. http://gdmltest.u-ga.fr/item/1013699996/