Fast rates for support vector machines using Gaussian kernels
Steinwart, Ingo ; Scovel, Clint
Ann. Statist., Vol. 35 (2007) no. 1, pp. 575-607 / Harvested from Project Euclid
For binary classification we establish learning rates up to the order of n^{-1} for support vector machines (SVMs) with hinge loss and Gaussian RBF kernels. These rates are expressed in terms of two assumptions on the considered distributions: Tsybakov's noise assumption, used to establish a small estimation error, and a new geometric noise condition, used to bound the approximation error. Unlike previously proposed concepts for bounding the approximation error, the geometric noise condition does not require any smoothness assumption.
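For concreteness, the estimator class the paper analyzes — a soft-margin SVM with hinge loss and a Gaussian RBF kernel — can be sketched with scikit-learn's `SVC` on synthetic data. This is an illustrative sketch, not the paper's code; the data, and the values of the kernel width parameter `gamma` and regularization parameter `C`, are arbitrary choices for the example (in the paper, the corresponding quantities are tuned with the sample size to obtain the stated rates).

```python
# Illustrative sketch (not the authors' code): an SVM with hinge loss and a
# Gaussian RBF kernel k(x, x') = exp(-gamma * ||x - x'||^2), the classifier
# whose learning rates the paper studies. gamma plays the role of the inverse
# squared kernel width; C controls regularization.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Toy binary classification data: two Gaussian blobs with labels -1 and +1.
n = 200
X = np.vstack([rng.normal(-1.0, 1.0, size=(n, 2)),
               rng.normal(+1.0, 1.0, size=(n, 2))])
y = np.concatenate([-np.ones(n), np.ones(n)])

# Soft-margin SVM with Gaussian kernel; gamma and C chosen arbitrarily here.
clf = SVC(kernel="rbf", C=1.0, gamma=0.5)
clf.fit(X, y)

train_accuracy = clf.score(X, y)
```

In the paper's analysis, the excess classification risk of such an estimator decays toward the Bayes risk at a rate governed by the Tsybakov noise exponent and the geometric noise exponent of the underlying distribution.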
Published: 2007-04-14
Classification:  Support vector machines,  classification,  nonlinear discrimination,  learning rates,  noise assumption,  Gaussian RBF kernels,  68Q32,  62G20,  62G99,  68T05,  68T10,  41A46,  41A99
@article{1183667285,
     author = {Steinwart, Ingo and Scovel, Clint},
     title = {Fast rates for support vector machines using Gaussian kernels},
     journal = {Ann. Statist.},
     volume = {35},
     number = {1},
     year = {2007},
     pages = {575--607},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1183667285}
}
Steinwart, Ingo; Scovel, Clint. Fast rates for support vector machines using Gaussian kernels. Ann. Statist., Vol. 35 (2007) no. 1, pp. 575-607. http://gdmltest.u-ga.fr/item/1183667285/