Asymptotically minimax Bayes predictive densities
Aslan, Mihaela
Ann. Statist., Volume 34 (2006) no. 6, pp. 2921-2938 / Harvested from Project Euclid
Given a random sample from a distribution with density function that depends on an unknown parameter θ, we are interested in accurately estimating the true parametric density function at a future observation from the same distribution. The asymptotic risk of Bayes predictive density estimates with Kullback–Leibler loss function D(f_θ‖f̂) = ∫ f_θ log(f_θ/f̂) is used to examine various ways of choosing prior distributions; the principal type of choice studied is minimax. We seek asymptotically least favorable predictive densities for which the corresponding asymptotic risk is minimax. A result resembling Stein’s paradox for estimating normal means by maximum likelihood holds for the uniform prior in the multivariate location family case: when the dimensionality of the model is at least three, the Jeffreys prior is minimax, though inadmissible. The Jeffreys prior is both admissible and minimax for one- and two-dimensional location problems.
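The Kullback–Leibler loss above can be made concrete with a small numerical sketch. The example below (not from the paper; the sample mean and sample size are hypothetical values chosen for illustration) uses the standard fact that, for the normal location family N(θ, 1) with known variance, the Bayes predictive density under the uniform prior given n observations with mean x̄ is N(x̄, 1 + 1/n), and computes D(f_θ‖f̂) by numerical integration:

```python
import numpy as np

def kl_divergence(p, q, grid):
    """Approximate D(p || q) = ∫ p log(p/q) by a Riemann sum on a uniform grid."""
    dx = grid[1] - grid[0]
    return np.sum(p * np.log(p / q)) * dx

# True density: N(theta, 1). Bayes predictive density under the uniform
# prior from n observations of N(theta, 1) is N(xbar, 1 + 1/n).
# theta, n, and xbar below are illustrative assumptions, not paper values.
theta, n = 0.0, 10
xbar = 0.3  # hypothetical sample mean
grid = np.linspace(-8.0, 8.0, 4001)

f_true = np.exp(-0.5 * (grid - theta) ** 2) / np.sqrt(2 * np.pi)
s2 = 1.0 + 1.0 / n  # predictive variance 1 + 1/n
f_hat = np.exp(-0.5 * (grid - xbar) ** 2 / s2) / np.sqrt(2 * np.pi * s2)

kl = kl_divergence(f_true, f_hat, grid)
print(kl)
```

For two normals the divergence also has a closed form, D(N(θ,1)‖N(x̄,s²)) = ½ log s² + (1 + (θ − x̄)²)/(2s²) − ½, against which the numerical value can be checked.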
Published: 2006-12-15
Classification: Bayes predictive density, Kullback–Leibler loss, the Jeffreys prior, asymptotically least favorable priors, minimax risk, 62G07, 62C20
@article{1179935070,
     author = {Aslan, Mihaela},
     title = {Asymptotically minimax Bayes predictive densities},
     journal = {Ann. Statist.},
     volume = {34},
     number = {6},
     year = {2006},
     pages = {2921-2938},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1179935070}
}