On the Bayes-risk consistency of regularized boosting methods
Lugosi, Gábor ; Vayatis, Nicolas
Ann. Statist., Volume 32 (2004), no. 1, pp. 30-55 / Harvested from Project Euclid
The probability of error of classification methods based on convex combinations of simple base classifiers by "boosting" algorithms is investigated. The main result of the paper is that certain regularized boosting algorithms provide Bayes-risk consistent classifiers under the sole assumption that the Bayes classifier may be approximated by a convex combination of the base classifiers. Nonasymptotic distribution-free bounds are also developed, which offer new insight into how boosting works and help explain its success in practical classification problems.
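To make the setting concrete, the following is a minimal sketch, not the authors' procedure: it greedily minimizes an empirical exponential cost (one of the convex cost functions such analyses cover) over the convex hull of decision stumps scaled by a smoothing parameter lam, using Frank-Wolfe steps so that the fitted function stays a (scaled) convex combination of base classifiers. The stump base class, the exponential cost, the fixed value of lam, and all function names here are illustrative assumptions.

import numpy as np

def make_stumps(X):
    # All axis-aligned decision stumps h(x) = s * sign(x_j - t), with
    # thresholds at the observed feature values and both orientations.
    stumps = []
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            stumps.append((j, t, 1.0))
            stumps.append((j, t, -1.0))
    return stumps

def eval_stumps(stumps, X):
    # n x m matrix of {-1, +1} base-classifier predictions, one column per stump.
    return np.column_stack(
        [s * np.where(X[:, j] > t, 1.0, -1.0) for (j, t, s) in stumps])

def regularized_boost(X, y, lam=5.0, n_iter=200):
    # Greedy (Frank-Wolfe style) minimization of the empirical exponential
    # cost (1/n) * sum_i exp(-y_i f(x_i)) over f in lam * conv(stumps);
    # lam plays the role of the smoothing parameter.
    stumps = make_stumps(X)
    H = eval_stumps(stumps, X)
    w = np.zeros(len(stumps))          # convex-combination weights (simplex)
    for it in range(n_iter):
        f = lam * (H @ w)              # current scores f(x_i)
        g = -y * np.exp(-y * f)        # gradient of the cost w.r.t. f(x_i)
        k = int(np.argmin(g @ H))      # stump best aligned with -gradient
        gamma = 2.0 / (it + 2)         # classical Frank-Wolfe step size
        w *= 1.0 - gamma
        w[k] += gamma                  # update keeps w on the simplex
    def classify(Xnew):
        return np.sign(lam * (eval_stumps(stumps, Xnew) @ w))
    return classify

# Toy usage on a noisy linear problem.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + 0.3 * rng.normal(size=200) > 0, 1.0, -1.0)
clf = regularized_boost(X, y)
print("training error:", np.mean(clf(X) != y))

In the regularized setting studied by the paper, the smoothing parameter would be a sequence lam_n growing slowly with the sample size n, rather than the fixed constant used in this toy run.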
Published: 2004-02-14
Classification: Boosting, classification, Bayes-risk consistency, penalized model selection, smoothing parameter, convex cost functions, empirical processes. MSC: 60G99, 62C12, 62G99
@article{1079120129,
     author = {Lugosi, G\'abor and Vayatis, Nicolas},
     title = {On the Bayes-risk consistency of regularized boosting methods},
     journal = {Ann. Statist.},
     volume = {32},
     number = {1},
     year = {2004},
     pages = {30--55},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1079120129}
}
Lugosi, Gábor; Vayatis, Nicolas. On the Bayes-risk consistency of regularized boosting methods. Ann. Statist., Volume 32 (2004), no. 1, pp. 30-55. http://gdmltest.u-ga.fr/item/1079120129/