Optimal Global Rates of Convergence for Nonparametric Regression
Stone, Charles J.
Ann. Statist., Volume 10 (1982) no. 1, pp. 1040-1053 / Harvested from Project Euclid
Consider an unknown $p$-times differentiable regression function $\theta$ of a $d$-dimensional measurement variable. Let $T(\theta)$ denote a derivative of $\theta$ of order $m$ and set $r = (p - m)/(2p + d)$. Let $\hat{T}_n$ denote an estimator of $T(\theta)$ based on a training sample of size $n$, and let $\| \hat{T}_n - T(\theta)\|_q$ be the usual $L^q$ norm of the restriction of $\hat{T}_n - T(\theta)$ to a fixed compact set. Under appropriate regularity conditions, it is shown that the optimal rate of convergence for $\| \hat{T}_n - T(\theta)\|_q$ is $n^{-r}$ if $0 < q < \infty$, while $(n^{-1} \log n)^r$ is the optimal rate if $q = \infty$.
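As a quick illustration of the rate exponent, here is a minimal sketch (the function name and checks are ours, not from the paper) that evaluates $r = (p - m)/(2p + d)$ for a given smoothness $p$, derivative order $m$, and dimension $d$:

```python
def rate_exponent(p, m, d):
    """Exponent r = (p - m) / (2p + d) in the optimal L^q rate n^{-r}
    for estimating an order-m derivative of a p-times differentiable
    regression function of a d-dimensional measurement variable.
    (Illustrative helper; parameter checks are an assumption.)"""
    if not (0 <= m < p) or d < 1:
        raise ValueError("require 0 <= m < p and d >= 1")
    return (p - m) / (2 * p + d)

# Example: estimating the function itself (m = 0) when it is twice
# differentiable (p = 2) in one dimension (d = 1) gives the familiar
# n^{-2/5} rate of convergence.
print(rate_exponent(2, 0, 1))  # 0.4
```

Note how the exponent degrades as the dimension $d$ grows, the usual curse-of-dimensionality effect, and improves with additional smoothness $p$.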
Published: 1982-12-14
Classification: 62G20, 62G05. Keywords: optimal rate of convergence, nonparametric regression
@article{1176345969,
     author = {Stone, Charles J.},
     title = {Optimal Global Rates of Convergence for Nonparametric Regression},
     journal = {Ann. Statist.},
     volume = {10},
     number = {1},
     year = {1982},
     pages = {1040-1053},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1176345969}
}