We propose new bounds on the error of learning algorithms in terms of a data-dependent notion of complexity. The estimates we establish give optimal rates and are based on a local and empirical version of Rademacher averages, in the sense that the Rademacher averages are computed from the data, on a subset of functions with small empirical error. We present some applications to classification and prediction with convex function classes, and with kernel classes in particular.
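The quantity described above, a Rademacher average computed from the sample and restricted to functions with small empirical error, can be illustrated with a minimal Monte Carlo sketch for a finite hypothesis class. This is not the paper's method, only an illustrative assumption: the function name, the 0-1-loss localization, and the finite class are all choices made here for the example.

```python
import numpy as np

def empirical_local_rademacher(preds, y, radius, n_rounds=200, seed=0):
    """Monte Carlo estimate of a localized empirical Rademacher average.

    preds : (m, n) array; preds[j, i] is hypothesis j's prediction on point i
    y     : (n,) array of labels, used only to localize the class
    radius: keep only hypotheses whose empirical 0-1 error is <= radius

    Illustrative sketch for a finite class; the paper treats general classes.
    """
    rng = np.random.default_rng(seed)
    m, n = preds.shape
    # Localization step: restrict the supremum to the "small empirical error" subset.
    emp_err = (preds != y).mean(axis=1)
    local = preds[emp_err <= radius]
    if local.shape[0] == 0:
        return 0.0
    total = 0.0
    for _ in range(n_rounds):
        sigma = rng.choice([-1.0, 1.0], size=n)   # i.i.d. Rademacher signs
        total += np.max(local @ sigma) / n        # sup over the localized class
    return total / n_rounds
```

Shrinking `radius` restricts the supremum to a smaller subset of the class, so the estimate is monotone in the radius; this is the sense in which localization yields tighter, data-dependent complexity terms than the global Rademacher average.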
@article{1123250221,
author = {Bartlett, Peter L. and Bousquet, Olivier and Mendelson, Shahar},
title = {Local Rademacher complexities},
journal = {Ann. Statist.},
volume = {33},
number = {4},
year = {2005},
pages = {1497--1537},
language = {en},
url = {http://dml.mathdoc.fr/item/1123250221}
}
Bartlett, Peter L.; Bousquet, Olivier; Mendelson, Shahar. Local Rademacher complexities. Ann. Statist., Vol. 33 (2005), no. 4, pp. 1497–1537. http://gdmltest.u-ga.fr/item/1123250221/