This article studies model selection via penalized empirical loss minimization in nonparametric classification problems. Data-dependent penalties are constructed from estimates of the complexity of a small subclass of each model class, containing only those functions with small empirical loss. These penalties are novel in that the penalties considered in the literature are typically based on the entire model class. Oracle inequalities using the new penalties are established, and their advantage over penalties based on the complexity of the whole model class is demonstrated.
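To fix ideas, a schematic version of the penalized selection rule is sketched below; the notation ($\mathcal{F}_k$, $\hat{L}_n$, $\widehat{\mathrm{pen}}_n$, $\varepsilon_n$) is introduced here purely for illustration, and the precise form of the localized random penalty is given in the paper itself.

\[
\hat{f}_k \in \operatorname*{arg\,min}_{f \in \mathcal{F}_k} \hat{L}_n(f),
\qquad
\hat{L}_n(f) = \frac{1}{n}\sum_{i=1}^{n} \mathbf{1}\{f(X_i) \neq Y_i\},
\]
\[
\tilde{f} = \hat{f}_{\hat{k}},
\qquad
\hat{k} \in \operatorname*{arg\,min}_{k \ge 1} \Bigl\{ \hat{L}_n(\hat{f}_k) + \widehat{\mathrm{pen}}_n(k) \Bigr\},
\]

where the data-dependent penalty $\widehat{\mathrm{pen}}_n(k)$ is computed from a complexity estimate (for instance, a random, Rademacher-type average) of only the localized subclass $\{ f \in \mathcal{F}_k : \hat{L}_n(f) \le \hat{L}_n(\hat{f}_k) + \varepsilon_n \}$ rather than of the whole class $\mathcal{F}_k$. An oracle inequality for such a rule bounds the excess risk of $\tilde{f}$ by the best trade-off over $k$ between the approximation error of $\mathcal{F}_k$ and the (expected) penalty, up to lower-order terms.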
@article{1091626183,
author = {Lugosi, G\'abor and Wegkamp, Marten},
title = {Complexity regularization via localized random penalties},
journal = {Ann. Statist.},
volume = {32},
number = {4},
year = {2004},
pages = {1679--1697},
language = {en},
url = {http://dml.mathdoc.fr/item/1091626183}
}
Lugosi, Gábor; Wegkamp, Marten. Complexity regularization via localized random penalties. Ann. Statist., Vol. 32 (2004) no. 4, pp. 1679-1697. http://gdmltest.u-ga.fr/item/1091626183/