Certain Inequalities in Information Theory and the Cramer-Rao Inequality
Kullback, S.
Ann. Math. Statist., Vol. 25 (1954), no. 4, pp. 745-751
The Cramer-Rao inequality provides, under certain regularity conditions, a lower bound for the variance of an estimator [7], [15]. Various generalizations, extensions, and improvements in the bound have been made by Barankin [1], [2], Bhattacharyya [3], Chapman and Robbins [5], Fraser and Guttman [11], Kiefer [12], and Wolfowitz [16], among others. Further consideration of certain inequality properties of a measure of information, discussed by Kullback and Leibler [14], yields a greater lower bound for the information measure (formula (4.11)) and leads to a result that may be considered a generalization of the Cramer-Rao inequality, the latter following as a special case. The results are used to define discrimination efficiency and estimation efficiency at a point in parameter space.
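For orientation, the two quantities the abstract refers to can be stated in standard notation; the symbols below are ours and need not match the paper's. The Kullback-Leibler information measure between densities f_1 and f_2 (with respect to a dominating measure \lambda), and the classical Cramer-Rao bound for an unbiased estimator T of \theta under the usual regularity conditions, read

\[ I(1:2) = \int f_1(x) \,\log \frac{f_1(x)}{f_2(x)} \, d\lambda(x), \qquad \operatorname{Var}_\theta(T) \;\ge\; \frac{1}{E_\theta\!\left[ \bigl( \tfrac{\partial}{\partial\theta} \log f(x;\theta) \bigr)^{2} \right]}. \]

Formula (4.11) of the paper sharpens the lower bound on I(1:2), and the Cramer-Rao inequality is then recovered as a special case.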
Published: 1954-12-14
@article{1177728660,
     author = {Kullback, S.},
     title = {Certain Inequalities in Information Theory and the Cramer-Rao Inequality},
     journal = {Ann. Math. Statist.},
     volume = {25},
     number = {4},
     year = {1954},
     pages = {745--751},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1177728660}
}
Kullback, S. Certain Inequalities in Information Theory and the Cramer-Rao Inequality. Ann. Math. Statist., Vol. 25 (1954), no. 4, pp. 745-751. http://gdmltest.u-ga.fr/item/1177728660/