Asymptotic Lower Bounds for Risk in Robust Estimation
Beran, Rudolf
Ann. Statist., Volume 8 (1980), no. 6, pp. 1252-1264 / Harvested from Project Euclid
Robustness and efficiency of a parameter estimate $T$ can be assessed by comparing the fitted parametric distribution $P_T$ with the actual distribution, which is assumed to lie near the parametric family $\{P_\theta:\theta\in\Theta\}$. Asymptotic lower bounds are established for the minimax risk over distributions near the parametric model, taking as loss function a monotone increasing function of the Hellinger distance between the actual distribution of the sample and the fitted distribution determined by $T$. The set of marginal distributions considered in the minimax calculation is a subset of the Hellinger ball of radius $O(n^{-1/2})$ centered at $P_\theta$, where $n$ is the sample size. When the loss function is bounded, the lower bound on maximum risk can be attained asymptotically. However, an estimator of $\theta$ that is asymptotically minimax for bounded loss functions may be far from optimal when the loss function is unbounded. Such divergent behavior is exhibited, for instance, by the sample mean in nearly normal models.
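For orientation, here is a hedged sketch of the quantities named in the abstract; the notation and normalization below are assumptions for illustration, not taken verbatim from the paper. The Hellinger distance between distributions $P$ and $Q$ with densities $p$ and $q$ relative to a dominating measure $\mu$ is, in one common normalization,
\[
H(P,Q) \;=\; \Bigl[\tfrac{1}{2}\int \bigl(p^{1/2}-q^{1/2}\bigr)^{2}\, d\mu\Bigr]^{1/2},
\]
and the maximum risk being bounded from below is, schematically, of the form
\[
\sup_{Q \in B_{n}(\theta,c)} \; \mathbb{E}_{Q^{n}}\, \ell\bigl(H(Q^{n}, P_{T_{n}}^{\,n})\bigr),
\qquad
B_{n}(\theta,c) \;=\; \bigl\{Q : H(Q, P_{\theta}) \le c\, n^{-1/2}\bigr\},
\]
where $\ell$ is a monotone increasing loss function, $T_{n} = T(X_{1},\ldots,X_{n})$ is the estimate computed from an i.i.d. sample with marginal distribution $Q$, $Q^{n}$ and $P_{T_{n}}^{\,n}$ denote the corresponding $n$-fold product (joint sample) distributions, and $c > 0$ fixes the radius of the shrinking Hellinger neighborhood of $P_\theta$.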
Published: 1980-11-14
Classification: Robust estimation, parametric models, risk, asymptotic minimax bounds, asymptotic minimax estimators, 62G35, 62F10
@article{1176345198,
     author = {Beran, Rudolf},
     title = {Asymptotic Lower Bounds for Risk in Robust Estimation},
     journal = {Ann. Statist.},
     volume = {8},
     number = {6},
     year = {1980},
     pages = {1252--1264},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1176345198}
}
Beran, Rudolf. Asymptotic Lower Bounds for Risk in Robust Estimation. Ann. Statist., Volume 8 (1980), no. 6, pp. 1252-1264. http://gdmltest.u-ga.fr/item/1176345198/