Improved minimax predictive densities under Kullback–Leibler loss
George, Edward I. ; Liang, Feng ; Xu, Xinyi
Ann. Statist., Vol. 34 (2006), no. 1, pp. 78-91 / Harvested from Project Euclid
Let X|μ ∼ N_p(μ, v_x I) and Y|μ ∼ N_p(μ, v_y I) be independent p-dimensional multivariate normal vectors with common unknown mean μ. Based on only observing X = x, we consider the problem of obtaining a predictive density p̂(y|x) for Y that is close to p(y|μ) as measured by expected Kullback–Leibler loss. A natural procedure for this problem is the (formal) Bayes predictive density p̂_U(y|x) under the uniform prior π_U(μ) ≡ 1, which is best invariant and minimax. We show that any Bayes predictive density will be minimax if it is obtained by a prior yielding a marginal that is superharmonic or whose square root is superharmonic. This yields wide classes of minimax procedures that dominate p̂_U(y|x), including Bayes predictive densities under superharmonic priors. Fundamental similarities and differences with the parallel theory of estimating a multivariate normal mean under quadratic loss are described.
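As a concrete illustration of the setup (not taken from the paper): it is a standard fact that the Bayes predictive density under the uniform prior is the closed form p̂_U(y|x) = N_p(x, (v_x + v_y)I), whose Kullback–Leibler risk is the constant (p/2) log(1 + v_x/v_y). The sketch below checks this by Monte Carlo and compares it with the naive plug-in density N_p(x, v_y I), which p̂_U dominates. All variable names and the `kl_risk` helper are my own notation, not from the article.

```python
import numpy as np

# Setup from the abstract: X|mu ~ N_p(mu, vx I), Y|mu ~ N_p(mu, vy I).
rng = np.random.default_rng(0)
p, vx, vy = 5, 1.0, 1.0
mu = rng.normal(size=p)                          # arbitrary true mean

n = 20000
x = mu + np.sqrt(vx) * rng.normal(size=(n, p))   # n independent draws of X
sq = np.sum((x - mu) ** 2, axis=1)               # ||X - mu||^2 for each draw

def kl_risk(v_pred):
    """Average KL( N_p(mu, vy I) || N_p(X, v_pred I) ) over the draws of X,
    using the closed form for KL between spherical Gaussians."""
    return np.mean(0.5 * (p * np.log(v_pred / vy) + p * vy / v_pred
                          + sq / v_pred - p))

# Bayes predictive density under the uniform prior: N_p(x, (vx + vy) I)
risk_u = kl_risk(vx + vy)
# Naive plug-in density N_p(x, vy I), for comparison
risk_plug = kl_risk(vy)

# Constant risk of p-hat_U, the best-invariant minimax value
theory_u = 0.5 * p * np.log(1 + vx / vy)
print(risk_u, theory_u, risk_plug)
```

Since log(1 + t) < t for t > 0, the uniform-prior predictive risk (p/2) log(1 + v_x/v_y) is always below the plug-in risk p·v_x/(2 v_y); the simulation reflects this. The paper's contribution is to go further, dominating p̂_U itself via superharmonic priors.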
Published: 2006-02-14
Classification:  Bayes rules,  heat equation,  inadmissibility,  multiple shrinkage,  multivariate normal,  prior distributions,  shrinkage estimation,  superharmonic marginals,  superharmonic priors,  unbiased estimate of risk,  62C20,  62C10,  62F15
@article{1146576256,
     author = {George, Edward I. and Liang, Feng and Xu, Xinyi},
     title = {Improved minimax predictive densities under Kullback--Leibler loss},
     journal = {Ann. Statist.},
     volume = {34},
     number = {1},
     year = {2006},
     pages = {78--91},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1146576256}
}
George, Edward I.; Liang, Feng; Xu, Xinyi. Improved minimax predictive densities under Kullback–Leibler loss. Ann. Statist., Vol. 34 (2006), no. 1, pp. 78-91. http://gdmltest.u-ga.fr/item/1146576256/