A Risk Bound in Sobolev Class Regression
Golubev, Grigori K. ; Nussbaum, Michael
Ann. Statist., Volume 18 (1990), no. 1, pp. 758-778 / Harvested from Project Euclid
For nonparametric regression estimation, when the unknown function belongs to a Sobolev smoothness class, sharp risk bounds for the integrated mean squared error have recently been found which improve on optimal rate-of-convergence results. The key to these results is the fact that, under normality of the errors, the minimax linear estimator is asymptotically minimax within the class of all estimators. We extend this result to the nonnormal case, when the noise distribution is unknown. The pertaining lower asymptotic risk bound is established, based on an analogy with a location model in the independent identically distributed case. Attainment of the bound and its relation to adaptive optimal smoothing are discussed.
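For orientation (not part of the original record), the sharp bound in question is of Pinsker type. The following is a minimal sketch in the Gaussian sequence-space notation, adopted here as an assumption of convenience rather than the authors' exact regression setup: observations y_j, coefficients theta_j lying in an ellipsoid with weights a_j and radius Q, and noise level epsilon.
\[
  y_j = \theta_j + \varepsilon\,\xi_j, \qquad \xi_j \ \text{i.i.d. } N(0,1), \qquad
  \Theta = \Big\{\theta : \sum_j a_j^2 \theta_j^2 \le Q \Big\},
\]
\[
  \hat\theta_j = (1 - \kappa a_j)_+\, y_j, \qquad
  \varepsilon^2 \sum_j a_j (1 - \kappa a_j)_+ = \kappa\, Q,
\]
\[
  \inf_{\hat\theta\ \text{linear}}\ \sup_{\theta \in \Theta} \mathbf{E}\,\|\hat\theta - \theta\|_2^2
  \;=\; \varepsilon^2 \sum_j (1 - \kappa a_j)_+ .
\]
Pinsker's theorem states that, for Gaussian noise and weights a_j increasing to infinity, this linear minimax risk is attained asymptotically (as epsilon tends to 0) within the class of all estimators; the paper establishes the corresponding lower bound when the errors are nonnormal with unknown distribution.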
Published: 1990-06-14
Classification: Nonparametric regression, asymptotic minimax $L_2$ risk, smoothness ellipsoid, location model, shrinking Hellinger neighborhoods, adaptive bandwidth choice, experimental design, robust smoothing, 62G20, 62G05, 62C20
@article{1176347624,
     author = {Golubev, Grigori K. and Nussbaum, Michael},
     title = {A Risk Bound in Sobolev Class Regression},
     journal = {Ann. Statist.},
     volume = {18},
     number = {1},
     year = {1990},
     pages = {758--778},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1176347624}
}
Golubev, Grigori K.; Nussbaum, Michael. A Risk Bound in Sobolev Class Regression. Ann. Statist., Volume 18 (1990), no. 1, pp. 758-778. http://gdmltest.u-ga.fr/item/1176347624/