Why Least Squares and Maximum Entropy? An Axiomatic Approach to Inference for Linear Inverse Problems
Csiszar, Imre
Ann. Statist., Volume 19 (1991) no. 4, pp. 2032-2066 / Harvested from Project Euclid
An attempt is made to determine the logically consistent rules for selecting a vector from any feasible set defined by linear constraints, when either all $n$-vectors or those with positive components or the probability vectors are permissible. Some basic postulates are satisfied if and only if the selection rule is to minimize a certain function which, if a "prior guess" is available, is a measure of distance from the prior guess. Two further natural postulates restrict the permissible distances to the author's $f$-divergences and Bregman's divergences, respectively. As corollaries, axiomatic characterizations of the methods of least squares and minimum discrimination information are arrived at. Alternatively, the latter are also characterized by a postulate of composition consistency. As a special case, a derivation of the method of maximum entropy from a small set of natural axioms is obtained.
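For orientation, the two distance families named in the abstract have the following standard forms (standard definitions in common notation, not reproduced from the paper's own formulation). For probability vectors $p, q$ and a convex function $f$ with $f(1)=0$, the $f$-divergence is
$$D_f(p\|q) = \sum_i q_i\, f\!\left(\frac{p_i}{q_i}\right),$$
while for a differentiable, strictly convex function $\phi$ the Bregman divergence is
$$D_\phi(x,y) = \phi(x) - \phi(y) - \langle \nabla\phi(y),\, x - y\rangle.$$
Both families contain the I-divergence (Kullback-Leibler distance) $\sum_i p_i \log(p_i/q_i)$, obtained from $f(t)=t\log t$ and, on probability vectors, from $\phi(x)=\sum_i x_i\log x_i$; the choice $\phi(x)=\|x\|^2$ yields the squared Euclidean distance of least squares, and minimizing the I-divergence from a uniform prior guess over a linearly constrained set of probability vectors is the maximum entropy selection referred to above.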
Published: 1991-12-14
Classification:  Image reconstruction,  linear constraints,  logically consistent inference,  minimum discrimination information,  nonlinear projection,  nonsymmetric distance,  selection rules,  62A99,  68T01,  94A17,  92C55
@article{1176348385,
     author = {Csiszar, Imre},
     title = {Why Least Squares and Maximum Entropy? An Axiomatic Approach to Inference for Linear Inverse Problems},
     journal = {Ann. Statist.},
     volume = {19},
     number = {4},
     year = {1991},
     pages = {2032--2066},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1176348385}
}
Csiszar, Imre. Why Least Squares and Maximum Entropy? An Axiomatic Approach to Inference for Linear Inverse Problems. Ann. Statist., Volume 19 (1991) no. 4, pp. 2032-2066. http://gdmltest.u-ga.fr/item/1176348385/