For binary classification we establish learning rates up to the order of
$n^{-1}$ for support vector machines (SVMs) with hinge loss and Gaussian RBF
kernels. These rates hold under two assumptions on the underlying
distribution: Tsybakov's noise assumption, which yields a small estimation
error, and a new geometric noise condition, which is used to bound the
approximation error. Unlike previously proposed concepts for bounding the
approximation error, the geometric noise condition does not require any
smoothness of the distribution.
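For concreteness, the two ingredients of the classifier studied here can be written down directly: the hinge loss $\max(0, 1 - yf(x))$ and the Gaussian RBF kernel. The snippet below is an illustrative sketch, not code from the paper; the kernel parameterization $\exp(-\|x - x'\|^2 / \sigma^2)$ is one common convention (the paper's own scaling of the width parameter may differ).

```python
import math

def gaussian_kernel(x, xp, sigma):
    # Gaussian RBF kernel: k(x, x') = exp(-||x - x'||^2 / sigma^2).
    # sigma is the kernel width; small sigma gives a narrow, localized kernel.
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, xp))
    return math.exp(-sq_dist / sigma ** 2)

def hinge_loss(y, fx):
    # Hinge loss for labels y in {-1, +1}: L(y, f(x)) = max(0, 1 - y * f(x)).
    # Zero iff the example is classified correctly with margin at least 1.
    return max(0.0, 1.0 - y * fx)

print(gaussian_kernel((0.0, 0.0), (0.0, 0.0), 1.0))  # 1.0 when x = x'
print(hinge_loss(+1, 2.0))   # 0.0: correct side, outside the margin
print(hinge_loss(+1, 0.5))   # 0.5: correct side, but inside the margin
```

The SVM decision function is then a kernel expansion $f(x) = \sum_i \alpha_i y_i k(x_i, x) + b$ whose coefficients minimize the regularized empirical hinge risk.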
@article{0708.1838,
author = {Steinwart, Ingo and Scovel, Clint},
title = {Fast rates for support vector machines using Gaussian kernels},
journal = {arXiv},
volume = {2007},
number = {0},
year = {2007},
language = {en},
url = {http://dml.mathdoc.fr/item/0708.1838}
}