Process consistency for AdaBoost
Jiang, Wenxin
Ann. Statist., Volume 32 (2004), no. 1, pp. 13-29 / Harvested from Project Euclid
Recent experiments and theoretical studies show that AdaBoost can overfit in the limit of large time. If running the algorithm forever is suboptimal, a natural question is how low the prediction error can be during the process of AdaBoost. We show, under general regularity conditions, that during the process of AdaBoost a consistent prediction is generated, whose prediction error approaches the optimal Bayes error as the sample size increases. This result suggests that, while running the algorithm forever can be suboptimal, some regularization method via truncation of the process may lead to near-optimal performance for sufficiently large sample sizes.
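The truncation idea in the abstract can be illustrated with a minimal discrete AdaBoost over decision stumps. This is an illustrative sketch, not the paper's construction: all function names and the held-out-error stopping rule below are assumptions for demonstration, standing in for the general "stop the process early" regularization the abstract alludes to.

```python
import numpy as np

def stump_predict(X, feat, thresh, sign):
    # a decision stump: sign * (+1 if x[feat] <= thresh else -1)
    return sign * np.where(X[:, feat] <= thresh, 1.0, -1.0)

def fit_stump(X, y, w):
    # exhaustive search for the stump with minimum weighted 0-1 error
    best = None
    for feat in range(X.shape[1]):
        for thresh in np.unique(X[:, feat]):
            for sign in (1.0, -1.0):
                pred = stump_predict(X, feat, thresh, sign)
                err = np.sum(w * (pred != y))
                if best is None or err < best[0]:
                    best = (err, feat, thresh, sign)
    return best

def adaboost(X, y, T):
    # discrete AdaBoost: reweight examples, accumulate weighted stumps
    n = len(y)
    w = np.full(n, 1.0 / n)
    model = []
    for _ in range(T):
        err, feat, thresh, sign = fit_stump(X, y, w)
        err = max(err, 1e-12)          # guard against log(0)
        if err >= 0.5:                 # no weak learner better than chance
            break
        alpha = 0.5 * np.log((1.0 - err) / err)
        pred = stump_predict(X, feat, thresh, sign)
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        model.append((alpha, feat, thresh, sign))
    return model

def predict(model, X, t=None):
    # prediction of the process truncated after t rounds (t=None: all rounds)
    score = np.zeros(len(X))
    for alpha, feat, thresh, sign in model[:t]:
        score += alpha * stump_predict(X, feat, thresh, sign)
    return np.sign(score)

def staged_errors(model, X, y):
    # error of each truncation point; argmin on held-out data picks the stop time
    return [np.mean(predict(model, X, t) != y) for t in range(1, len(model) + 1)]

# toy 1-D example: label is the sign of (x - 1.5)
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
model = adaboost(X, y, T=5)
train_acc = np.mean(predict(model, X) == y)
best_t = 1 + int(np.argmin(staged_errors(model, X, y)))
```

In practice `staged_errors` would be evaluated on a held-out sample rather than the training data, and `best_t` is the truncation time the abstract's regularization suggestion corresponds to.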
Published: 2004-02-14
Classification: AdaBoost, Bayes error, boosting, consistency, prediction error, VC dimension, 62G99, 68T99
@article{1079120128,
     author = {Jiang, Wenxin},
     title = {Process consistency for AdaBoost},
     journal = {Ann. Statist.},
     volume = {32},
     number = {1},
     year = {2004},
     pages = {13-29},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1079120128}
}
Jiang, Wenxin. Process consistency for AdaBoost. Ann. Statist., Volume 32 (2004), no. 1, pp. 13-29. http://gdmltest.u-ga.fr/item/1079120128/