We present some general results determining minimax bounds on
statistical risk for density estimation based on certain information-theoretic
considerations. These bounds depend only on metric entropy conditions and are
used to identify the minimax rates of convergence.
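For orientation, a hedged sketch of the kind of determination described (notation ours, not quoted from the paper): writing $M(\varepsilon)$ for the metric entropy of the density class (the logarithm of its $\varepsilon$-covering number under a suitable distance), the critical rate $\varepsilon_n$ is obtained by balancing entropy against sample size,
\[
  M(\varepsilon_n) \asymp n\,\varepsilon_n^2 ,
\]
and the minimax squared risk is then of order $\varepsilon_n^2$. For example, if $M(\varepsilon) \asymp \varepsilon^{-1/\alpha}$ (as for $\alpha$-smooth densities on a compact interval), this balance gives $\varepsilon_n^2 \asymp n^{-2\alpha/(2\alpha+1)}$.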
Published: 1999-10-14
Keywords: Minimax risk, density estimation, metric entropy, Kullback-Leibler distance
Classification (MSC): 62G07, 62B10, 62C20, 94A29
@article{1017939142,
  author = {Yang, Yuhong and Barron, Andrew},
  title = {Information-theoretic determination of minimax rates of convergence},
  journal = {Ann. Statist.},
  volume = {27},
  number = {4},
  year = {1999},
  pages = {1564--1599},
  language = {en},
  url = {http://dml.mathdoc.fr/item/1017939142}
}
Yang, Yuhong; Barron, Andrew. Information-theoretic determination of minimax rates of convergence. Ann. Statist., Vol. 27 (1999), no. 4, pp. 1564-1599. http://gdmltest.u-ga.fr/item/1017939142/