A Finite Sample Distribution-Free Performance Bound for Local Discrimination Rules
Rogers, W. H. ; Wagner, T. J.
Ann. Statist., Vol. 6 (1978), no. 1, pp. 506-514 / Harvested from Project Euclid
In the discrimination problem the random variable $\theta$, known to take values in $\{1,\cdots, M\}$, is estimated from the random vector $X$. All that is known about the joint distribution of $(X, \theta)$ is that which can be inferred from a sample $(X_1, \theta_1),\cdots, (X_n, \theta_n)$ of size $n$ drawn from that distribution. A discrimination rule is any procedure which determines a decision $\hat{\theta}$ for $\theta$ from $X$ and $(X_1, \theta_1),\cdots, (X_n, \theta_n)$. A rule is called $k$-local if the decision $\hat{\theta}$ depends only on $X$ and the pairs $(X_i, \theta_i)$ for which $X_i$ is one of the $k$ closest to $X$ from $X_1,\cdots, X_n$. It is shown that for any $k$-local discrimination rule, the mean-square difference between the probability of error for the rule and its deleted estimate is bounded by $A/n$, where $A$ is an explicitly given small constant which depends only on $M$ and $k$. Thus distribution-free confidence intervals can be placed about probability of error estimates for $k$-local discrimination rules.
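As an illustration (not taken from the paper), the deleted estimate referred to above is the leave-one-out error count: each $X_i$ is classified by the rule applied to the remaining $n - 1$ pairs, and the fraction of misclassifications is recorded. The sketch below computes this for a $k$-nearest-neighbor majority-vote rule, one example of a $k$-local rule; the function names and the toy data are hypothetical and chosen only for illustration.

```python
# Illustrative sketch: deleted (leave-one-out) error estimate for a k-NN
# majority-vote rule, a particular k-local discrimination rule. The paper's
# bound controls the mean-square difference between this estimate and the
# rule's true probability of error; nothing here reproduces that proof.
import numpy as np

def k_nn_decision(x, X, theta, k):
    """Majority vote among the k sample points closest to x (ties broken by label order)."""
    d = np.linalg.norm(X - x, axis=1)          # Euclidean distances to every sample point
    nearest = np.argsort(d)[:k]                # indices of the k closest points
    labels, counts = np.unique(theta[nearest], return_counts=True)
    return labels[np.argmax(counts)]

def deleted_estimate(X, theta, k):
    """Leave-one-out estimate: classify each X_i using the other n-1 pairs only."""
    n = len(theta)
    errors = 0
    for i in range(n):
        mask = np.arange(n) != i               # delete the i-th pair from the sample
        if k_nn_decision(X[i], X[mask], theta[mask], k) != theta[i]:
            errors += 1
    return errors / n

# Toy usage (hypothetical data): two Gaussian classes in the plane, k = 3.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
theta = np.array([1] * 50 + [2] * 50)
print(deleted_estimate(X, theta, k=3))
```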
Published: 1978-05-14
Classification: Discrimination, distribution-free bound, local inference, nearest neighbor rules, finite sample bound, deleted estimate, error rate estimate; MSC: 62H30, 62G05, 62G15
@article{1176344196,
     author = {Rogers, W. H. and Wagner, T. J.},
     title = {A Finite Sample Distribution-Free Performance Bound for Local Discrimination Rules},
     journal = {Ann. Statist.},
     volume = {6},
     number = {1},
     year = {1978},
     pages = {506--514},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1176344196}
}
Rogers, W. H.; Wagner, T. J. A Finite Sample Distribution-Free Performance Bound for Local Discrimination Rules. Ann. Statist., Vol. 6 (1978), no. 1, pp. 506-514. http://gdmltest.u-ga.fr/item/1176344196/