A Note on Admissibility When Precision is Unbounded
Anderson, Charles; Pal, Nabendu
Ann. Statist., Volume 23 (1995) no. 6, pp. 593-597 / Harvested from Project Euclid
The estimation of a common mean vector $\theta$ given two independent normal observations $X \sim N_p(\theta, \sigma^2_x I)$ and $Y \sim N_p(\theta, \sigma^2_y I)$ is reconsidered. It is known that the estimator $\eta X + (1 - \eta)Y$ is inadmissible for $\eta \in (0, 1)$; we show that, in contrast, when $\eta$ is 0 or 1 the estimator is admissible. The general situation is that an estimator $X^\ast$ can be improved by shrinkage when there exists a statistic $B$ which, in a certain sense, estimates a lower bound on the risk of $X^\ast$. On the other hand, an estimator is admissible under very general conditions if there is no reasonable way to detect that its risk is small.
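As an illustration of the shrinkage phenomenon cited in the abstract, the following is a minimal Monte Carlo sketch, not taken from the paper, under the simplifying assumption that both variances are known: for $p \ge 3$, a positive-part James-Stein shrinkage of the combined estimator $W = \eta X + (1-\eta)Y$ has smaller empirical risk than $W$ itself. The dimension, weight $\eta$, variances, and mean vector below are arbitrary illustrative choices, and the paper's actual setting of unbounded (unknown) precision is not reproduced here.

```python
# Illustrative sketch (not from the paper): with KNOWN variances and p >= 3,
# a positive-part James-Stein shrinkage of W = eta*X + (1 - eta)*Y has smaller
# empirical risk than W, matching the inadmissibility statement in the abstract.
import numpy as np

rng = np.random.default_rng(0)
p, eta = 10, 0.5                 # dimension and mixing weight (assumed values)
sigma_x, sigma_y = 1.0, 2.0      # assumed known standard deviations
theta = np.full(p, 3.0)          # an arbitrary true mean vector
n_rep = 100_000

# Each coordinate of W has variance tau2 = eta^2*sigma_x^2 + (1-eta)^2*sigma_y^2.
tau2 = (eta * sigma_x) ** 2 + ((1 - eta) * sigma_y) ** 2

X = theta + sigma_x * rng.standard_normal((n_rep, p))
Y = theta + sigma_y * rng.standard_normal((n_rep, p))
W = eta * X + (1 - eta) * Y

# Positive-part James-Stein shrinkage of W toward 0, using the known tau2.
shrink = np.maximum(0.0, 1.0 - (p - 2) * tau2 / np.sum(W ** 2, axis=1, keepdims=True))
W_js = shrink * W

risk_W = np.mean(np.sum((W - theta) ** 2, axis=1))
risk_js = np.mean(np.sum((W_js - theta) ** 2, axis=1))
print(f"empirical risk of W:          {risk_W:.3f}")
print(f"empirical risk of shrunken W: {risk_js:.3f}  (smaller for p >= 3)")
```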
Published: 1995-04-14
Keywords: Inadmissibility, shrinkage estimation, Stein's normal identity
Classification (MSC): 62C15, 62H12
@article{1176324537,
     author = {Anderson, Charles and Pal, Nabendu},
     title = {A Note on Admissibility When Precision is Unbounded},
     journal = {Ann. Statist.},
     volume = {23},
     number = {6},
     year = {1995},
     pages = {593--597},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1176324537}
}
Anderson, Charles; Pal, Nabendu. A Note on Admissibility When Precision is Unbounded. Ann. Statist., Volume 23 (1995) no. 6, pp. 593-597. http://gdmltest.u-ga.fr/item/1176324537/