Nonparametric bootstrap prediction
Fushiki, Tadayoshi ; Komaki, Fumiyasu ; Aihara, Kazuyuki
Bernoulli, Volume 11 (2005) no. 1, p. 293-307 / Harvested from Project Euclid
Ensemble learning has recently been intensively studied in the field of machine learning. `Bagging' is an ensemble learning method that uses bootstrap data to construct various predictors; the required prediction is then obtained by averaging these predictors. Harris proposed using this technique with the parametric bootstrap to construct predictive distributions, and showed that the parametric bootstrap predictive distribution gives asymptotically better prediction than a plug-in distribution with the maximum likelihood estimator. In this paper, we investigate nonparametric bootstrap predictive distributions. The nonparametric bootstrap predictive distribution is precisely the one obtained by applying bagging to the statistical prediction problem. We show that the nonparametric bootstrap predictive distribution gives predictions asymptotically as good as the parametric bootstrap predictive distribution.
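The construction described in the abstract can be sketched in code. The following is a minimal illustration (not the paper's derivation): for each nonparametric bootstrap resample of the data, a model is fitted by maximum likelihood, and the resulting plug-in densities are averaged to form the bootstrap predictive density. A Gaussian model and the helper names (`mle_normal`, `bootstrap_predictive`) are assumptions made for this sketch only.

```python
import math
import random

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def mle_normal(sample):
    """Maximum likelihood estimates (mean, std) for a Gaussian model."""
    n = len(sample)
    mu = sum(sample) / n
    var = sum((x - mu) ** 2 for x in sample) / n
    return mu, math.sqrt(var)

def bootstrap_predictive(data, x, n_boot=200, seed=0):
    """Nonparametric bootstrap predictive density at x.

    Resample the data with replacement, plug each resample's MLE into
    the model density, and average the plug-in densities -- i.e. bagging
    applied to the prediction problem, as in the abstract.
    """
    rng = random.Random(seed)
    n = len(data)
    total = 0.0
    for _ in range(n_boot):
        resample = [data[rng.randrange(n)] for _ in range(n)]
        mu, sigma = mle_normal(resample)
        total += normal_pdf(x, mu, sigma)
    return total / n_boot

# Hypothetical usage: data drawn from a standard normal.
rng = random.Random(1)
data = [rng.gauss(0.0, 1.0) for _ in range(50)]
density_at_zero = bootstrap_predictive(data, 0.0)
```

The averaging over resamples is what distinguishes this predictive density from the single plug-in density `normal_pdf(x, *mle_normal(data))`; the paper's result concerns how the two compare asymptotically under Kullback-Leibler risk.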
Published: 2005-04-14
Classification:  asymptotic theory,  bagging,  bootstrap predictive distribution,  information geometry,  Kullback-Leibler divergence
@article{1116340296,
     author = {Fushiki, Tadayoshi and Komaki, Fumiyasu and Aihara, Kazuyuki},
     title = {Nonparametric bootstrap prediction},
     journal = {Bernoulli},
     volume = {11},
     number = {1},
     year = {2005},
     pages = {293--307},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1116340296}
}
Fushiki, Tadayoshi; Komaki, Fumiyasu; Aihara, Kazuyuki. Nonparametric bootstrap prediction. Bernoulli, Volume 11 (2005) no. 1, pp. 293-307. http://gdmltest.u-ga.fr/item/1116340296/