Optimal aggregation of classifiers in statistical learning
Tsybakov, Alexander B.
Ann. Statist., Volume 32 (2004), no. 1, pp. 135-166 / Harvested from Project Euclid
Classification can be considered as nonparametric estimation of sets, where the risk is defined by means of a specific distance between sets associated with misclassification error. It is shown that the rates of convergence of classifiers depend on two parameters: the complexity of the class of candidate sets and the margin parameter. The dependence is explicitly given, indicating that optimal fast rates approaching $O(n^{-1})$ can be attained, where $n$ is the sample size, and that the proposed classifiers have the property of robustness to the margin. The main result of the paper concerns optimal aggregation of classifiers: we suggest a classifier that automatically adapts both to the complexity and to the margin, and attains the optimal fast rates, up to a logarithmic factor.
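For orientation, the following is a minimal sketch of the margin condition and the fast rate the abstract refers to, written in the parameterization with margin parameter $\kappa \ge 1$ and complexity exponent $\rho > 0$ commonly associated with this line of work; the exact notation and constants here are an assumption, not quoted from the paper.

% Sketch, assuming: margin parameter \kappa >= 1, complexity exponent
% \rho > 0 (bracketing entropy of the candidate class of order
% \varepsilon^{-\rho}), and regression function \eta(x) = P(Y=1 | X=x).
% Margin (low-noise) condition:
\[
  P_X\bigl(0 < \lvert \eta(X) - \tfrac12 \rvert \le t\bigr)
  \le c\, t^{1/(\kappa - 1)}, \qquad 0 < t \le t_0 .
\]
% Under this condition the optimal rate for the excess risk is
\[
  \mathbb{E}\, R(\hat G_n) - R(G^{*}) \asymp n^{-\kappa/(2\kappa + \rho - 1)},
\]
% which approaches the fast rate n^{-1} as \kappa \to 1 and \rho \to 0,
% matching the O(n^{-1}) claim in the abstract.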
Published: 2004-02-14
Classification: classification, statistical learning, aggregation of classifiers, optimal rates, empirical processes, margins, complexity of classes of sets; MSC: 62G07, 62G08, 62H30, 68T10
@article{1079120131,
     author = {Tsybakov, Alexander B.},
     title = {Optimal aggregation of classifiers in statistical learning},
     journal = {Ann. Statist.},
     volume = {32},
     number = {1},
     year = {2004},
     pages = {135--166},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1079120131}
}
Tsybakov, Alexander B. Optimal aggregation of classifiers in statistical learning. Ann. Statist., Volume 32 (2004), no. 1, pp. 135-166. http://gdmltest.u-ga.fr/item/1079120131/
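The adaptive classifier described in the abstract is obtained by aggregation over a family of candidate classifiers. As a rough illustration of that structure only, here is a minimal Python sketch of split-sample aggregation by empirical risk minimization; the candidate family (decision trees of varying depth) is an illustrative stand-in for the paper's sieve of candidate sets, and the function and parameter names are hypothetical.

# Hedged sketch: split-sample aggregation by empirical risk minimization.
# The candidate family below is illustrative, not the paper's construction.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def aggregate_by_erm(X, y, depths=(1, 2, 3, 5, 8), split=0.5, seed=0):
    """Fit one candidate classifier per complexity level on the first
    part of the sample, then select the candidate with the smallest
    empirical misclassification risk on the held-out part."""
    X, y = np.asarray(X), np.asarray(y)
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))          # random split of the sample
    cut = int(split * len(X))
    train, hold = idx[:cut], idx[cut:]

    best_clf, best_risk = None, np.inf
    for d in depths:                       # increasing complexity
        clf = DecisionTreeClassifier(max_depth=d).fit(X[train], y[train])
        risk = np.mean(clf.predict(X[hold]) != y[hold])  # empirical risk
        if risk < best_risk:
            best_clf, best_risk = clf, risk
    return best_clf, best_risk

# Usage: clf, risk = aggregate_by_erm(X, y) for array-like X of shape
# (n, d) and binary labels y in {0, 1}.

The split ratio and the candidate grid are arbitrary choices here; the point is only the mechanism, fitting candidates of increasing complexity on one half of the data and selecting by holdout misclassification risk on the other, which is what yields adaptation to unknown complexity and margin.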