In a recent paper, C. F. J. Wu showed that the jackknife estimator of a distribution function has optimal convergence rate $O(n^{-1/2})$, where $n$ denotes the sample size. This rate is achieved by retaining $O(n)$ data values from the original sample during the jackknife algorithm. Wu's result is particularly important since it permits a direct comparison of jackknife and bootstrap methods for distribution estimation. In the present paper we show that a very simple, nonempirical modification of the jackknife estimator improves the convergence rate from $O(n^{-1/2})$ to $O(n^{-5/6})$, and that this rate may be achieved by retaining only $O(n^{2/3})$ data values from the original sample. Our technique consists of mixing the jackknife distribution estimator with the standard normal distribution in an appropriate proportion. The convergence rate of $O(n^{-5/6})$ makes the jackknife significantly more competitive with the bootstrap, which enjoys a convergence rate of $O(n^{-1})$ in this particular problem.
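The mixture idea in the abstract can be sketched in code. The block below is an illustrative Monte Carlo version only: the choice of statistic (the standardized sample mean), the finite-population standardization of the subset mean, and the mixing weight `lam` are all assumptions for the sake of a runnable example, not the exact construction or the optimal proportion derived in the paper.

```python
import math
import random

def phi(x):
    """Standard normal CDF, computed from the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def jackknife_cdf(data, x, d, n_subsets=1000, seed=0):
    """Delete-d jackknife ("jackknife-histogram") estimate of the
    distribution of the standardized sample mean at the point x,
    approximated by Monte Carlo over subsets of size n - d drawn
    without replacement from the sample."""
    rng = random.Random(seed)
    n = len(data)
    r = n - d                        # number of retained data values
    xbar = sum(data) / n
    s2 = sum((v - xbar) ** 2 for v in data) / (n - 1)
    # Finite-population standard error of a without-replacement
    # subset mean (an assumed standardization; the paper's exact
    # scaling may differ in lower-order terms).
    se = math.sqrt(s2 * (n - r) / (r * (n - 1)))
    hits = 0
    for _ in range(n_subsets):
        sub = rng.sample(data, r)    # sampling without replacement
        t = (sum(sub) / r - xbar) / se
        if t <= x:
            hits += 1
    return hits / n_subsets

def mixed_cdf(data, x, d, lam=0.5, **kw):
    """Mix the jackknife distribution estimator with the standard
    normal CDF: (1 - lam) * jackknife + lam * Phi. The weight
    lam = 0.5 is purely illustrative, not the paper's proportion."""
    return (1.0 - lam) * jackknife_cdf(data, x, d, **kw) + lam * phi(x)
```

With `d` of order `n - n^{2/3}` the subsets retain only `O(n^{2/3})` data values, which is the regime the abstract refers to; here `d` is left as a free parameter.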
Published: 1993-09-14
Classification: Asymptotic normality, bootstrap, convergence rate, distribution estimation, Edgeworth expansion, jackknife, mixture, sampling without replacement; MSC: 62G05, 62D05
@article{1176349268,
author = {Booth, James G. and Hall, Peter},
title = {An Improvement of the Jackknife Distribution Function Estimator},
journal = {Ann. Statist.},
volume = {21},
number = {1},
year = {1993},
pages = {1476--1485},
language = {en},
url = {http://dml.mathdoc.fr/item/1176349268}
}
Booth, James G.; Hall, Peter. An Improvement of the Jackknife Distribution Function Estimator. Ann. Statist., Vol. 21 (1993) no. 1, pp. 1476-1485. http://gdmltest.u-ga.fr/item/1176349268/