Ever since Stein's result that the sample mean vector $\mathbf{X}$ of a $k \geqq 3$ dimensional normal distribution is an inadmissible estimator of its expectation $\boldsymbol{\theta}$, statisticians have searched for uniformly better (minimax) estimators. Unbiased estimators of the risk of arbitrary orthogonally invariant and scale invariant estimators of $\boldsymbol{\theta}$ are derived here for the case in which the dispersion matrix $\boldsymbol{\Sigma}$ of $\mathbf{X}$ is unknown and must be estimated. Stein obtained this result earlier for known $\boldsymbol{\Sigma}$. Minimax conditions weaker than any yet published are derived by finding all estimators whose unbiased estimate of risk is bounded uniformly by $k$, the risk of $\mathbf{X}$. One sequence of risk functions and risk estimates applies simultaneously to the various assumptions about $\boldsymbol{\Sigma}$, resulting in a unified theory for these situations.
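For context, here is a brief sketch of the known-$\boldsymbol{\Sigma}$ case that the abstract cites as the precursor, taken with $\boldsymbol{\Sigma} = I$ for simplicity; this is Stein's unbiased estimate of risk, not the unknown-$\boldsymbol{\Sigma}$ result derived in the paper itself. For an estimator of the form $\hat{\boldsymbol{\theta}}(\mathbf{X}) = \mathbf{X} + \mathbf{g}(\mathbf{X})$ with $\mathbf{X} \sim N_k(\boldsymbol{\theta}, I)$ and $\mathbf{g}$ weakly differentiable, Stein's identity gives
$$
E\,\|\hat{\boldsymbol{\theta}}(\mathbf{X}) - \boldsymbol{\theta}\|^2
  = E\bigl[\,k + 2\,\nabla\!\cdot\mathbf{g}(\mathbf{X}) + \|\mathbf{g}(\mathbf{X})\|^2\,\bigr],
$$
so $k + 2\,\nabla\!\cdot\mathbf{g}(\mathbf{x}) + \|\mathbf{g}(\mathbf{x})\|^2$ is an unbiased estimate of the risk. For the James–Stein choice $\mathbf{g}(\mathbf{x}) = -(k-2)\,\mathbf{x}/\|\mathbf{x}\|^2$ this estimate becomes
$$
k + 2\left(-\frac{(k-2)^2}{\|\mathbf{x}\|^2}\right) + \frac{(k-2)^2}{\|\mathbf{x}\|^2}
  = k - \frac{(k-2)^2}{\|\mathbf{x}\|^2} \;\le\; k,
$$
i.e., it is bounded uniformly by $k$, the risk of $\mathbf{X}$, so the estimator is minimax. The paper's contribution is the analogous unbiased risk estimate and bound when $\boldsymbol{\Sigma}$ is unknown and must be estimated.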
Published: 1976-01-14
Classification:
Estimation,
minimax estimators,
risk of invariant estimators,
mean of a multivariate normal distribution,
Stein's estimator,
62F10,
62C99
@article{1176343344,
author = {Efron, Bradley and Morris, Carl},
title = {Families of Minimax Estimators of the Mean of a Multivariate Normal Distribution},
journal = {Ann. Statist.},
volume = {4},
number = {1},
year = {1976},
pages = {11--21},
language = {en},
url = {http://dml.mathdoc.fr/item/1176343344}
}
Efron, Bradley; Morris, Carl. Families of Minimax Estimators of the Mean of a Multivariate Normal Distribution. Ann. Statist., Volume 4 (1976) no. 1, pp. 11-21. http://gdmltest.u-ga.fr/item/1176343344/