A Lower Bound on the Error in Nonparametric Regression Type Problems
Yatracos, Yannis G.
Ann. Statist., Volume 16 (1988) no. 1, pp. 1180-1187 / Harvested from Project Euclid
Let $(X_1, Y_1), \cdots, (X_n, Y_n)$ be a sample, let the conditional density of $Y_i \mid X_i = x_i$ be denoted $f(y \mid x_i, \theta(x_i))$, and let $\theta$ be an element of a metric space $(\Theta, d)$. A lower bound is provided for the $d$-error in estimating $\theta$. The order of the bound depends on the local behavior of the Kullback information of the conditional density. As an application, we consider the case where $\Theta$ is the space of $q$-smooth functions on $\lbrack 0, 1 \rbrack^d$ metrized with the $L_r$ distance, $1 \leq r < \infty$.
Published: 1988-09-14
Classification: Nonparametric regression, lower bound on minimax risk, lower bound of loss in probability, optimal rates of convergence, Kullback information, 62G20
@article{1176350954,
     author = {Yatracos, Yannis G.},
     title = {A Lower Bound on the Error in Nonparametric Regression Type Problems},
     journal = {Ann. Statist.},
     volume = {16},
     number = {1},
     year = {1988},
     pages = {1180--1187},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1176350954}
}
Yatracos, Yannis G. A Lower Bound on the Error in Nonparametric Regression Type Problems. Ann. Statist., Volume 16 (1988) no. 1, pp. 1180-1187. http://gdmltest.u-ga.fr/item/1176350954/