Rates of convergence of estimates, Kolmogorov's entropy and the dimensionality reduction principle in regression
Nicoleris, Theodoros ; Yatracos, Yannis G.
Ann. Statist., Volume 25 (1997), no. 6, pp. 2493-2511 / Harvested from Project Euclid
$L_1$-optimal minimum distance estimators are provided for a projection pursuit regression type function with smooth functional components that are either additive or multiplicative, with or without interactions. The rates of convergence obtained for the estimate of the true parameter depend on Kolmogorov's entropy of the assumed model and confirm Stone's heuristic dimensionality reduction principle. Rates of convergence are also obtained for the error in estimating the derivatives of a regression type function.
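For orientation, a standard form of the objects named in the abstract can be sketched as follows; this is an illustration using the classical formulations from the nonparametric regression literature (Stone's work on additive models), not notation taken from the paper itself:

```latex
% Illustrative sketch (standard forms, not quoted from the paper).
% A projection pursuit regression type function with additive smooth
% components: ridge functions g_k applied to projections of x.
\[
  f(x) \;=\; \sum_{k=1}^{K} g_k\bigl(\theta_k^{\top} x\bigr),
  \qquad x \in \mathbb{R}^d .
\]
% Stone's heuristic dimensionality reduction principle: if each component
% g_k is p-smooth and effectively univariate, the achievable rate of
% convergence matches the one-dimensional case,
\[
  n^{-p/(2p+1)}
  \quad\text{rather than the full $d$-dimensional rate}\quad
  n^{-p/(2p+d)} .
\]
```

The point of the principle is that imposing additive (or multiplicative) structure lets the estimator escape the curse of dimensionality: the rate is governed by the dimension of the components, not of the full covariate vector.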
Published: 1997-12-14
Classification: Nonparametric regression, optimal rates of convergence, Kolmogorov's entropy, Hoeffding's inequality, dimensionality reduction principle, additive and multiplicative regression, projection pursuit, interactions, model selection, 62J02, 62G20, 62G05, 62G30
@article{1030741082,
     author = {Nicoleris, Theodoros and Yatracos, Yannis G.},
     title = {Rates of convergence of estimates, Kolmogorov's entropy and the dimensionality reduction principle in regression},
     journal = {Ann. Statist.},
     volume = {25},
     number = {6},
     year = {1997},
     pages = {2493--2511},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1030741082}
}
Nicoleris, Theodoros; Yatracos, Yannis G. Rates of convergence of estimates, Kolmogorov's entropy and the dimensionality reduction principle in regression. Ann. Statist., Volume 25 (1997), no. 6, pp. 2493-2511. http://gdmltest.u-ga.fr/item/1030741082/