Fisher Lecture: Dimension Reduction in Regression
Cook, R. Dennis
Statist. Sci., Volume 22 (2007) no. 1, p. 1-26 / Harvested from Project Euclid
Beginning with a discussion of R. A. Fisher’s early written remarks that relate to dimension reduction, this article revisits principal components as a reductive method in regression, develops several model-based extensions and ends with descriptions of general approaches to model-based and model-free dimension reduction in regression. It is argued that the role for principal components and related methodology may be broader than previously seen and that the common practice of conditioning on observed values of the predictors may unnecessarily limit the choice of regression methodology.
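For context on the inverse-regression methods listed under Classification below, the following is a minimal, illustrative Python sketch of the standard sliced inverse regression (SIR) estimator. It is not code from the article; the function name, slice count and toy single-index model are assumptions chosen purely for illustration.

import numpy as np

def sir_directions(X, y, n_slices=10, n_directions=1):
    """Estimate sufficient dimension-reduction directions by sliced inverse regression."""
    n, p = X.shape
    # Standardize the predictors: Z = (X - mean) Sigma^{-1/2}.
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Slice the observations by the response and average Z within each slice.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of the SIR kernel, mapped back to the original scale.
    w, v = np.linalg.eigh(M)
    return inv_sqrt @ v[:, ::-1][:, :n_directions]

# Toy usage (hypothetical data): a single-index model y = (b'X)^3 + noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
b = np.array([1.0, -1.0, 0.0, 0.0, 0.0])
y = (X @ b) ** 3 + 0.5 * rng.normal(size=500)
print(sir_directions(X, y, n_slices=10, n_directions=1).ravel())

On this toy model, the printed direction should be close to a scalar multiple of (1, -1, 0, 0, 0), up to sign.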
Published: 2007-02-14
Classification: Central subspace, Grassmann manifolds, inverse regression, minimum average variance estimation, principal components, principal fitted components, sliced inverse regression, sufficient dimension reduction
@article{1185975631,
     author = {Cook, R. Dennis},
     title = {Fisher Lecture: Dimension Reduction in Regression},
     journal = {Statist. Sci.},
     volume = {22},
     number = {1},
     year = {2007},
     pages = {1--26},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1185975631}
}
Cook, R. Dennis. Fisher Lecture: Dimension Reduction in Regression. Statist. Sci., Volume 22 (2007) no. 1, pp. 1-26. http://gdmltest.u-ga.fr/item/1185975631/