Suppose that the mean $\tau$ of a vector of Poisson variates is known to lie in a bounded domain $T$ in $\lbrack 0,\infty)^p$. How much does this a priori information increase the precision of estimation of $\tau$? Using the error measure $\sum_i(\hat\tau_i - \tau_i)^2/\tau_i$ and minimax risk $\rho(T)$, we give analytical and numerical results for small intervals when $p = 1$. Usually, however, approximations are needed. If $T$ is "rectangularly convex" at 0, there exist linear estimators with risk at most $1.26\,\rho(T)$. For general $T$, $\rho(T) \geq p^2/(p + \lambda(\Omega))$, where $\lambda(\Omega)$ is the principal eigenvalue of the Laplace operator on the polydisc transform $\Omega = \Omega(T)$, a domain in $2p$-dimensional space. The bound is asymptotically sharp: $\rho(mT) = p - \lambda(\Omega)/m + o(m^{-1})$. Explicit forms are given for $T$ a simplex or a hyperrectangle. We explore the curious parallel between the results for $T$ and those for a Gaussian vector of double the dimension constrained to lie in $\Omega$.
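As a hedged illustration of the lower bound, assume (as a reading of the abstract, not the paper's exact definition) that the polydisc transform sends $T$ to the set of complex vectors $z$ with $(|z_1|^2,\dots,|z_p|^2)\in T$. For $p = 1$ and $T = [0,a]$, $\Omega(T)$ is then the disc of radius $\sqrt{a}$, whose principal Dirichlet eigenvalue is $\lambda(\Omega) = j_{0,1}^2/a$, with $j_{0,1} \approx 2.405$ the first zero of the Bessel function $J_0$, so the stated formulas give
\[
\rho([0,a]) \;\geq\; \frac{1}{1 + j_{0,1}^2/a} \;=\; \frac{a}{a + j_{0,1}^2},
\qquad
\rho([0,ma]) \;=\; 1 - \frac{j_{0,1}^2}{ma} + o(m^{-1}).
\]
Under the same assumption, a hyperrectangle $T = \prod_i [0,a_i]$ maps to a polydisc, separation of variables gives $\lambda(\Omega) = \sum_i j_{0,1}^2/a_i$, and the bound reads $\rho(T) \geq p^2\big/\big(p + \sum_i j_{0,1}^2/a_i\big)$.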
Published: 1992-06-14
Classification: polydisc transform, Bayes risk lower bound, second-order minimax, Laplace operator, principal eigenvalue, loss estimation, Fisher information, linear risk, minimax risk, hardest rectangular subproblem, isoperimetric inequalities; 62F10, 62F11, 62C20
@article{1176348658,
  author = {Johnstone, Iain M. and MacGibbon, K. Brenda},
  title = {Minimax Estimation of a Constrained Poisson Vector},
  journal = {Ann. Statist.},
  volume = {20},
  number = {1},
  year = {1992},
  pages = {807--831},
  language = {en},
  url = {http://dml.mathdoc.fr/item/1176348658}
}
Johnstone, Iain M.; MacGibbon, K. Brenda. Minimax Estimation of a Constrained Poisson Vector. Ann. Statist., Vol. 20 (1992), no. 1, pp. 807-831. http://gdmltest.u-ga.fr/item/1176348658/