Rates of Convergence of Minimum Distance Estimators and Kolmogorov's Entropy
Yatracos, Yannis G.
Ann. Statist., Tome 13 (1985) no. 1, p. 768-774 / Harvested from Project Euclid
Let $(\mathscr{X}, \mathscr{A})$ be a space with a $\sigma$-field, let $M = \{P_s; s \in \Theta\}$ be a family of probability measures on $\mathscr{A}$ with $\Theta$ arbitrary, and let $X_1, \cdots, X_n$ be i.i.d. observations on $P_\theta$. Define $\mu_n(A) = (1/n) \sum^n_{i=1} I_A(X_i)$, the empirical measure indexed by $A \in \mathscr{A}$. Assume that $\Theta$ is totally bounded when metrized by the $L_1$ distance between measures. Robust minimum distance estimators $\hat{\theta}_n$ are constructed for $\theta$, and the resulting rate of convergence is shown to depend naturally on an entropy function for $\Theta$.
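The abstract's estimator can be illustrated with a minimal sketch: choose the parameter whose model probabilities are closest, in sup distance over a class of test sets, to the empirical measure $\mu_n$. Everything below is a toy assumption, not the paper's construction: a Bernoulli family, a finite grid standing in for $\Theta$, and a hand-picked class of sets; the function names are illustrative.

```python
import random

def empirical_measure(sample, A):
    """mu_n(A) = (1/n) * sum of indicators I_A(X_i)."""
    return sum(1 for x in sample if x in A) / len(sample)

# Toy setting (assumption, not from the paper): Theta is a finite grid of
# Bernoulli success probabilities, and the test sets A are the nontrivial
# subsets of the sample space {0, 1}.
theta_grid = [i / 20 for i in range(1, 20)]
test_sets = [{0}, {1}]

def model_prob(theta, A):
    """P_theta(A) for the Bernoulli(theta) model."""
    return sum(theta if x == 1 else 1 - theta for x in A)

def minimum_distance_estimate(sample):
    """Pick theta minimizing sup over A of |P_theta(A) - mu_n(A)|."""
    return min(
        theta_grid,
        key=lambda t: max(
            abs(model_prob(t, A) - empirical_measure(sample, A))
            for A in test_sets
        ),
    )

random.seed(0)
sample = [1 if random.random() < 0.3 else 0 for _ in range(1000)]
theta_hat = minimum_distance_estimate(sample)
```

With a grid of mesh $1/20$ and $n = 1000$, the estimate lands at the grid point nearest the sample mean; in the paper the grid is replaced by a totally bounded $\Theta$ and the mesh is tied to $n$ through the entropy function.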
Published: 1985-06-14
Classification: Minimum distance estimation, rates of convergence, Kolmogorov's entropy, density estimation, 62G05, 62G30
@article{1176349553,
     author = {Yatracos, Yannis G.},
     title = {Rates of Convergence of Minimum Distance Estimators and Kolmogorov's Entropy},
     journal = {Ann. Statist.},
     volume = {13},
     number = {1},
     year = {1985},
     pages = {768--774},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1176349553}
}
Yatracos, Yannis G. Rates of Convergence of Minimum Distance Estimators and Kolmogorov's Entropy. Ann. Statist., Tome 13 (1985) no. 1, pp. 768-774. http://gdmltest.u-ga.fr/item/1176349553/