A reproducing kernel Hilbert space approach to functional linear regression
Yuan, Ming ; Cai, T. Tony
Ann. Statist., Vol. 38 (2010) no. 6, pp. 3412-3444 / Harvested from Project Euclid
We study in this paper a smoothness regularization method for functional linear regression and provide a unified treatment of both the prediction and estimation problems. By developing a tool for the simultaneous diagonalization of two positive definite kernels, we obtain sharper results on the minimax rates of convergence and show that smoothness regularized estimators achieve the optimal rates of convergence for both prediction and estimation under conditions weaker than those required for the functional principal components based methods developed in the literature. Despite the generality of the method of regularization, we show that the procedure is easily implementable. Numerical results are obtained to illustrate the merits of the method and to demonstrate the theoretical developments.
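For context, here is a minimal sketch of the setting the abstract refers to; the notation is illustrative rather than taken verbatim from the paper (the paper allows a general squared semi-norm penalty, of which the squared RKHS norm below is a special case). A scalar response Y is related to a square-integrable predictor process X through a slope function \beta_0, and the smoothness regularization method estimates \beta_0 by penalized least squares over a reproducing kernel Hilbert space \mathcal{H}(K):

\[
Y = \alpha_0 + \int_{\mathcal{T}} X(t)\,\beta_0(t)\,dt + \varepsilon,
\qquad
(\hat\alpha, \hat\beta_{n\lambda}) \;=\;
\operatorname*{arg\,min}_{\alpha \in \mathbb{R},\; \beta \in \mathcal{H}(K)}
\left\{ \frac{1}{n}\sum_{i=1}^{n}\Big( Y_i - \alpha - \int_{\mathcal{T}} X_i(t)\,\beta(t)\,dt \Big)^{2}
\;+\; \lambda\, \|\beta\|_{\mathcal{H}(K)}^{2} \right\},
\]

where \lambda > 0 is a tuning parameter and \|\cdot\|_{\mathcal{H}(K)} is the norm of the RKHS whose penalty encodes the smoothness of \beta. The simultaneous diagonalization mentioned in the abstract concerns the reproducing kernel K and the covariance kernel C(s,t) = \mathrm{Cov}(X(s), X(t)).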
Published: 2010-12-15
Classification: Covariance, eigenfunction, eigenvalue, functional linear regression, minimax, optimal convergence rate, principal component analysis, reproducing kernel Hilbert space, Sacks–Ylvisaker conditions, simultaneous diagonalization, slope function, Sobolev space, 62J05, 62G20
@article{1291126962,
     author = {Yuan, Ming and Cai, T. Tony},
     title = {A reproducing kernel Hilbert space approach to functional linear regression},
     journal = {Ann. Statist.},
     volume = {38},
     number = {6},
     year = {2010},
     pages = {3412--3444},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1291126962}
}
Yuan, Ming; Cai, T. Tony. A reproducing kernel Hilbert space approach to functional linear regression. Ann. Statist., Vol. 38 (2010) no. 6, pp. 3412-3444. http://gdmltest.u-ga.fr/item/1291126962/