Published exactly seventy years ago, Jeffreys's Theory of Probability (1939) has had a unique impact on the Bayesian community and is now considered one of the main classics in Bayesian statistics, as well as the initiator of the objective Bayes school. In particular, its advances in the derivation of noninformative priors and in the scaling of Bayes factors have had a lasting impact on the field. However, the book reflects the characteristics of its time, especially in terms of mathematical rigor. In this paper we point out the fundamental aspects of this reference work, especially its thorough coverage of testing problems and its construction of both estimation and testing noninformative priors based on functional divergences. Our major aim here is to help modern readers navigate this difficult text and concentrate on the passages that are still relevant today.
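For readers unfamiliar with the two constructions the abstract highlights, the standard modern statements (given here for orientation only, not reproduced from the paper itself) are Jeffreys's invariant prior, built from the Fisher information, and the Bayes factor, defined as a ratio of marginal likelihoods:

\[
  \pi_J(\theta) \;\propto\; \bigl|\,I(\theta)\,\bigr|^{1/2},
  \qquad
  I(\theta) \;=\; -\,\mathbb{E}_\theta\!\left[\frac{\partial^2 \log f(X\mid\theta)}{\partial\theta\,\partial\theta^{\mathsf T}}\right],
\]
\[
  B_{01}(x) \;=\;
  \frac{\displaystyle\int_{\Theta_0} f(x\mid\theta)\,\pi_0(\theta)\,\mathrm{d}\theta}
       {\displaystyle\int_{\Theta_1} f(x\mid\theta)\,\pi_1(\theta)\,\mathrm{d}\theta}.
\]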
Published: 2009-05-15
Keywords: Bayesian foundations, noninformative prior, σ-finite measure, Jeffreys's prior, Kullback divergence, tests, Bayes factor, p-values, goodness of fit
@article{1263478373,
  author   = {Robert, Christian P. and Chopin, Nicolas and Rousseau, Judith},
  title    = {Harold Jeffreys's Theory of Probability Revisited},
  journal  = {Statist. Sci.},
  volume   = {24},
  number   = {1},
  year     = {2009},
  pages    = {141--172},
  language = {en},
  url      = {http://dml.mathdoc.fr/item/1263478373}
}
Robert, Christian P.; Chopin, Nicolas; Rousseau, Judith. Harold Jeffreys's Theory of Probability Revisited. Statist. Sci., Vol. 24 (2009), no. 1, pp. 141-172. http://gdmltest.u-ga.fr/item/1263478373/