On Minimax Estimation of a Sparse Normal Mean Vector
Johnstone, Iain M.
Ann. Statist., Vol. 22 (1994), no. 1, pp. 271-289
Mallows has conjectured that among distributions which are Gaussian except for occasional contamination by additive noise, the one having least Fisher information has (two-sided) geometric contamination. A very similar problem arises in estimating a nonnegative vector parameter in Gaussian white noise when it is also known that most [i.e., a fraction $(1 - \varepsilon)$ of the] components are zero. We provide a partial asymptotic expansion of the minimax risk as $\varepsilon \rightarrow 0$. While the conjecture seems unlikely to be exactly true for finite $\varepsilon$, we verify it asymptotically up to the accuracy of the expansion. Numerical work suggests the expansion is accurate for $\varepsilon$ as large as 0.05. The best $l_1$-estimation rule is first- but not second-order minimax. The results bear on an earlier study of maximum entropy estimation and on various questions in robustness and in function estimation using wavelet bases.
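As an illustrative note: one standard $l_1$-estimation rule in this setting is soft thresholding, the coordinatewise minimizer of $\frac{1}{2}(y - \theta)^2 + \lambda|\theta|$. The Python sketch below simulates a nearly black mean vector and compares the empirical risk of soft thresholding with the first-order benchmark $2\varepsilon\log(1/\varepsilon)$ from the related sparse-means literature; the threshold $\lambda = \sqrt{2\log(1/\varepsilon)}$, the signal height, and the benchmark are assumptions for illustration, not results stated in this paper.

import numpy as np

rng = np.random.default_rng(0)

def soft_threshold(y, lam):
    # Solution of min_t 0.5*(y - t)**2 + lam*|t|  (the l_1 / lasso rule).
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

n = 1_000_000                       # number of coordinates
eps = 0.05                          # fraction of nonzero means ("nearly black")
mu = np.sqrt(2 * np.log(1 / eps))   # hypothetical signal height (assumption)

theta = np.where(rng.random(n) < eps, mu, 0.0)   # mostly-zero mean vector
y = theta + rng.standard_normal(n)               # Gaussian white noise observations

lam = np.sqrt(2 * np.log(1 / eps))               # threshold tied to sparsity (assumption)
risk = np.mean((soft_threshold(y, lam) - theta) ** 2)

print(f"empirical per-coordinate risk:     {risk:.4f}")
print(f"benchmark 2*eps*log(1/eps):        {2 * eps * np.log(1 / eps):.4f}")

The two printed quantities should be of the same order for small $\varepsilon$; the paper's expansion concerns precisely the higher-order behavior that such a first-order comparison misses.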
Published: 1994-03-14
Classification: Fisher information, minimax decision theory, least favorable prior, nearly black object, robustness, white noise model; MSC: 62C20, 62C10, 62G05
@article{1176325368,
     author = {Johnstone, Iain M.},
     title = {On Minimax Estimation of a Sparse Normal Mean Vector},
     journal = {Ann. Statist.},
     volume = {22},
     number = {1},
     year = {1994},
     pages = {271--289},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1176325368}
}
Johnstone, Iain M. On Minimax Estimation of a Sparse Normal Mean Vector. Ann. Statist. 22 (1994), no. 1, pp. 271-289. http://gdmltest.u-ga.fr/item/1176325368/