Information theory and superefficiency
Barron, Andrew ; Hengartner, Nicolas
Ann. Statist., Volume 26 (1998) no. 3, pp. 1800-1825 / Harvested from Project Euclid
The asymptotic risk of efficient estimators with Kullback–Leibler loss in smoothly parametrized statistical models is $k/(2n)$, where $k$ is the parameter dimension and $n$ is the sample size. Under fairly general conditions, we give a simple information-theoretic proof that the set of parameter values where an arbitrary estimator is superefficient is negligible. The proof is based on a result of Rissanen that codes have asymptotic redundancy not smaller than $(k/2)\log n$, except in a set of measure 0.
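In outline (a sketch of the argument as stated in the abstract; the notation $\hat\theta_i$ for the estimator based on the first $i$ observations and $D(\cdot\,\|\,\cdot)$ for the Kullback–Leibler divergence is assumed here for illustration, not quoted from the paper): coding i.i.d. data $X_1,\dots,X_n$ with the predictive density built from the estimator has redundancy
$$ R_n(\theta) \;=\; E_\theta \log \frac{\prod_{i=1}^n p_\theta(X_i)}{\prod_{i=1}^n p_{\hat\theta_{i-1}}(X_i)} \;=\; \sum_{i=1}^n E_\theta\, D\bigl(p_\theta \,\big\|\, p_{\hat\theta_{i-1}}\bigr). $$
If the estimator were superefficient at $\theta$, its risk would fall below $k/(2i)$ by a fixed factor for all large $i$, so the cumulative redundancy would fall below $(k/2)\log n$ (since $\sum_{i\le n} 1/i \sim \log n$), which Rissanen's lower bound rules out except on a set of Lebesgue measure 0.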
Published: 1998-10-14
Classification: Superefficiency, information theory, data compression, Kullback–Leibler loss; MSC: 62F12, 94A65, 94A29, 62G20
@article{1024691358,
     author = {Barron, Andrew and Hengartner, Nicolas},
     title = {Information theory and superefficiency},
     journal = {Ann. Statist.},
     volume = {26},
     number = {3},
     year = {1998},
     pages = {1800--1825},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1024691358}
}
Barron, Andrew; Hengartner, Nicolas. Information theory and superefficiency. Ann. Statist., Volume 26 (1998) no. 3, pp. 1800-1825. http://gdmltest.u-ga.fr/item/1024691358/