On Kullback-Leibler Loss and Density Estimation
Hall, Peter
Ann. Statist., Volume 15 (1987), no. 4, pp. 1491-1519 / Harvested from Project Euclid
"Discrimination information," or Kullback-Leibler loss, is an appropriate measure of distance in problems of discrimination. We examine it in the context of nonparametric kernel density estimation and show that its asymptotic properties are profoundly influenced by tail properties of the kernel and of the unknown density. We suggest ways of choosing the kernel so as to reduce loss, and describe the extent to which likelihood cross-validation asymptotically minimises loss. Likelihood cross-validation generally leads to selection of a window width of the correct order of magnitude, but not necessarily to a window with the correct first-order properties. However, if the kernel is chosen appropriately, then likelihood cross-validation does result in asymptotic minimisation of Kullback-Leibler loss.
Published: 1987-12-14
Classification: Density estimation, discrimination, kernel method, Kullback-Leibler loss, likelihood cross-validation, 62G99, 62H99
@article{1176350606,
     author = {Hall, Peter},
     title = {On Kullback-Leibler Loss and Density Estimation},
     journal = {Ann. Statist.},
     volume = {15},
     number = {4},
     year = {1987},
     pages = {1491--1519},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1176350606}
}
Hall, Peter. On Kullback-Leibler Loss and Density Estimation. Ann. Statist., Volume 15 (1987), no. 4, pp. 1491-1519. http://gdmltest.u-ga.fr/item/1176350606/