Kernel dimension reduction in regression
Fukumizu, Kenji ; Bach, Francis R. ; Jordan, Michael I.
Ann. Statist., Volume 37 (2009) no. 1, pp. 1871-1905 / Harvested from Project Euclid
We present a new methodology for sufficient dimension reduction (SDR). Our methodology derives directly from the formulation of SDR in terms of the conditional independence of the covariate X from the response Y, given the projection of X on the central subspace [cf. J. Amer. Statist. Assoc. 86 (1991) 316–342 and Regression Graphics (1998) Wiley]. We show that this conditional independence assertion can be characterized in terms of conditional covariance operators on reproducing kernel Hilbert spaces and we show how this characterization leads to an M-estimator for the central subspace. The resulting estimator is shown to be consistent under weak conditions; in particular, we do not have to impose linearity or ellipticity conditions of the kinds that are generally invoked for SDR methods. We also present empirical results showing that the new methodology is competitive in practice.
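As a rough illustration of the criterion described in the abstract, the following sketch (our notation, kernel choices, and parameter values, not the authors' code) evaluates an empirical conditional-covariance objective of the general form Tr[G_Y (G_{B^T X} + n*eps*I)^{-1}] built from centered Gram matrices; in the paper the central subspace is estimated by minimizing a criterion of this kind over candidate projections B.

```python
# Minimal numerical sketch (assumptions: Gaussian kernels, a fixed bandwidth
# sigma, and a fixed regularization eps) of a kernel dimension reduction
# objective based on conditional covariance operators.  Not the authors'
# implementation.
import numpy as np

def gram_gaussian(Z, sigma):
    """Centered Gaussian Gram matrix of the rows of Z."""
    sq = np.sum(Z**2, axis=1)
    D = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T
    K = np.exp(-D / (2.0 * sigma**2))
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    return H @ K @ H

def kdr_objective(X, Y, B, sigma=1.0, eps=1e-3):
    """Empirical criterion Tr[G_Y (G_{XB} + n*eps*I)^{-1}] for projection B."""
    n = X.shape[0]
    Gz = gram_gaussian(X @ B, sigma)
    Gy = gram_gaussian(Y, sigma)
    return np.trace(Gy @ np.linalg.inv(Gz + n * eps * np.eye(n)))

# Toy check: Y depends on X only through its first coordinate, so the
# objective should typically be smaller for the true direction than for a
# random one (the estimator would minimize this over B).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
Y = np.sin(X[:, [0]]) + 0.1 * rng.normal(size=(200, 1))
B_true = np.eye(5)[:, [0]]
B_rand, _ = np.linalg.qr(rng.normal(size=(5, 1)))
print(kdr_objective(X, Y, B_true), kdr_objective(X, Y, B_rand))
```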
Published: 2009-08-15
Classification: Dimension reduction, regression, positive definite kernel, reproducing kernel, consistency, 62H99, 62J02
@article{1245332835,
     author = {Fukumizu, Kenji and Bach, Francis R. and Jordan, Michael I.},
     title = {Kernel dimension reduction in regression},
     journal = {Ann. Statist.},
     volume = {37},
     number = {1},
     year = {2009},
     pages = {1871-1905},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1245332835}
}
Fukumizu, Kenji; Bach, Francis R.; Jordan, Michael I. Kernel dimension reduction in regression. Ann. Statist., Volume 37 (2009) no. 1, pp. 1871-1905. http://gdmltest.u-ga.fr/item/1245332835/