For a regression model $y_i = \theta(x_i) + \varepsilon_i$, the unknown function $\theta$ is estimated by least squares on a subspace $\Lambda_m = \operatorname{span}\{\psi_1, \psi_2, \ldots, \psi_m\}$, where the basis functions $\psi_i$ are predetermined and $m$ is varied. Assuming that the design is suitably approximated by an asymptotic design measure, a general method is presented for approximating the bias and variance in a scale of Hilbertian norms natural to the problem. The general theory is illustrated with two examples: truncated Fourier series regression and polynomial regression. For these examples, we give rates of convergence of derivative estimates in (weighted) $L_2$ norms and establish consistency in supremum norm.
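To fix ideas, the following Python sketch (not from the paper; the target function $\theta$, the noise level, the uniform design, and the dimensions $m$ are all illustrative choices) fits least squares on nested subspaces spanned by a truncated Fourier basis and reports the empirical $L_2$ error for several $m$, exhibiting the bias-variance trade-off that the paper's approximations quantify.

import numpy as np

rng = np.random.default_rng(0)

n = 200                            # number of design points
x = np.linspace(0.0, 1.0, n)       # uniform design, approximating a uniform design measure
theta = lambda t: np.sin(2 * np.pi * t) + 0.3 * np.cos(6 * np.pi * t)  # illustrative target
sigma = 0.2                        # illustrative noise level
y = theta(x) + sigma * rng.standard_normal(n)

def fourier_basis(t, m):
    """First m functions of the Fourier basis on [0, 1]: 1, cos, sin, cos, sin, ..."""
    cols = [np.ones_like(t)]
    k = 1
    while len(cols) < m:
        cols.append(np.cos(2 * np.pi * k * t))
        if len(cols) < m:
            cols.append(np.sin(2 * np.pi * k * t))
        k += 1
    return np.column_stack(cols)

# Least squares fit on Lambda_m for a range of model dimensions m.
for m in (3, 7, 15, 31):
    Psi = fourier_basis(x, m)                       # n x m design matrix
    coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)  # least squares coefficients
    fit = Psi @ coef
    # Empirical L2 error against the true theta: the bias term shrinks and the
    # variance term grows as m increases.
    err = np.sqrt(np.mean((fit - theta(x)) ** 2))
    print(f"m = {m:2d}: empirical L2 error = {err:.4f}")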
Published: 1988-06-14
Classification: regression, nonparametric regression, bias approximation, polynomial regression, model selection, rates of convergence, orthogonal polynomials; MSC: 62J05, 62F12, 41A10
@article{1176350830,
  author = {Cox, Dennis D.},
  title = {Approximation of Least Squares Regression on Nested Subspaces},
  journal = {Ann. Statist.},
  volume = {16},
  number = {1},
  year = {1988},
  pages = {713--732},
  language = {en},
  url = {http://dml.mathdoc.fr/item/1176350830}
}
Cox, Dennis D. Approximation of Least Squares Regression on Nested Subspaces. Ann. Statist. 16 (1988), no. 1, pp. 713-732. http://gdmltest.u-ga.fr/item/1176350830/