Adjustment by Minimum Discriminant Information
Haberman, Shelby J.
Ann. Statist., Volume 12 (1984) no. 3, pp. 971-988 / Harvested from Project Euclid
Minimum discriminant information adjustment has primarily been used in the analysis of multinomial data; however, no such restriction is necessary. Let $P$ be a distribution on $R^a$, and let $\mathscr{C}$ be a convex set of distributions on $R^a$. Let $\mathbf{X}_i, 1 \leq i \leq n$, be independent and identically distributed observations with common distribution $P$. The minimum discriminant information adjustment (MDIA) of $P$ relative to $\mathscr{C}$ is the element $Q$ of $\mathscr{C}$ that is closest to $P$ in the sense of Kullback-Leibler discriminant information. If $\bar{P}_n$ is the empirical distribution of the $\mathbf{X}_i, 1 \leq i \leq n$, and $\bar{Q}_n$ is the MDIA of $\bar{P}_n$ relative to $\mathscr{C}$, then $\bar{Q}_n$ is the maximum likelihood estimate in $\mathscr{C}$. Let $\mathscr{C}$ consist of the distributions $A$ on $R^a$ such that $\int T\,dA = t$, where $T$ is a measurable transformation from $R^a$ to $R^b$ and $t \in R^b$. It is shown that, under mild regularity conditions, $\bar{Q}_n$ converges weakly to $Q$, the MDIA of the true $P$, with probability 1, and that $\bar{E}_n(D) = \int D\,d\bar{Q}_n$ is an asymptotically normal and asymptotically unbiased estimate of $E(D) = \int D\,dQ$.
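For concreteness, here is a minimal numerical sketch (not part of the paper) of computing $\bar{Q}_n$ under the moment constraint $\int T\,dA = t$: by classical convex duality, the MDIA of the empirical distribution reweights the sample points by an exponential tilt $w_i \propto \exp(\lambda' T(\mathbf{X}_i))$, where the tilting parameter $\lambda$ minimizes a smooth convex dual. The function and variable names below are illustrative, not from the paper.

import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

def mdia_weights(T, t):
    # MDIA of the empirical distribution of n sample points subject to
    # the constraint sum_i w_i T[i] = t, where T is an (n, b) array of
    # values T(X_i).  The minimizer has the exponential-tilting form
    # w_i proportional to exp(lam @ T[i]); solve the convex dual for lam.
    n, b = T.shape
    dual = lambda lam: logsumexp(T @ lam) - np.log(n) - lam @ t
    lam = minimize(dual, np.zeros(b), method="BFGS").x
    s = T @ lam
    return np.exp(s - logsumexp(s))   # normalized weights w_i

# Example: reweight a standard normal sample so its mean is 0.25; the
# weighted mean of any statistic D of the data is then E_n(D).
rng = np.random.default_rng(0)
x = rng.normal(size=1000)
w = mdia_weights(x.reshape(-1, 1), np.array([0.25]))
print(w @ x)   # approximately 0.25

When $t$ lies in the interior of the convex hull of the observed values $T(\mathbf{X}_i)$, the dual has a unique minimizer, and the reweighted distribution is the maximum likelihood estimate in $\mathscr{C}$ referred to in the abstract.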
Published: 1984-09-14
Classification: 62E99, 62D05. Keywords: consistency, asymptotic normality, empirical distribution, weighting
@article{1176346715,
     author = {Haberman, Shelby J.},
     title = {Adjustment by Minimum Discriminant Information},
     journal = {Ann. Statist.},
     volume = {12},
     number = {3},
     year = {1984},
     pages = {971--988},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1176346715}
}