The delete-1 jackknife is known to give inconsistent variance estimators for nonsmooth estimators such as sample quantiles. This well-known deficiency can be rectified by using a more general jackknife in which $d$, the number of observations deleted, depends on a smoothness measure of the point estimator. Our general theory explains why the jackknife works or fails. It also shows that (i) for "sufficiently smooth" estimators, the jackknife variance estimators with bounded $d$ are consistent and asymptotically unbiased, and (ii) for "nonsmooth" estimators, $d$ must tend to infinity at a rate explicitly determined by a smoothness measure to ensure consistency and asymptotic unbiasedness. Improved results are obtained for several classes of estimators. In particular, for the sample $p$-quantiles, the jackknife variance estimators with $d$ satisfying $n^{1/2}/d \rightarrow 0$ and $n - d \rightarrow \infty$ are consistent and asymptotically unbiased.
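The delete-$d$ estimator described in the abstract recomputes the statistic on every size-$(n-d)$ subsample and scales the spread of the recomputed values. A minimal sketch, not taken from the paper (the function name and example data are illustrative, and the exhaustive enumeration over all $\binom{n}{d}$ deletions is practical only for small $n$):

```python
from itertools import combinations
from statistics import mean, median

def delete_d_jackknife_var(data, estimator, d):
    """Delete-d jackknife variance estimate (exhaustive sketch).

    Recomputes `estimator` on every size-(n - d) subsample and scales
    the sum of squared deviations by (n - d) / (d * C(n, d)).
    """
    n = len(data)
    stats = [estimator([data[i] for i in kept])
             for kept in combinations(range(n), n - d)]
    center = sum(stats) / len(stats)
    return (n - d) / (d * len(stats)) * sum((t - center) ** 2 for t in stats)

# Illustrative data; the sample median is a nonsmooth estimator, the case
# for which the abstract requires d to grow with n.
data = [2.1, 3.5, 1.2, 4.8, 2.9, 3.3, 0.7, 4.1, 2.4, 3.9, 1.8, 4.5]
v1 = delete_d_jackknife_var(data, median, d=1)  # delete-1 (inconsistent for quantiles)
v4 = delete_d_jackknife_var(data, median, d=4)  # delete-d with larger d
```

As a sanity check on the formula, for the sample mean the delete-$d$ estimator reproduces the classical $s^2/n$ exactly for any $d$, a standard identity for the jackknife.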
Published: 1989-09-14
Classification:
Asymptotic unbiasedness,
balanced subsampling,
consistency,
Fréchet differentiability,
grouped jackknife,
$L$-estimator,
$M$-estimator,
sample quantile,
smoothness of an estimator,
$U$-statistic,
von Mises expansion
MSC: 62G05, 62E20, 62G99
@article{1176347263,
author = {Shao, Jun and Wu, C. F. J.},
title = {A General Theory for Jackknife Variance Estimation},
journal = {Ann. Statist.},
volume = {17},
number = {1},
year = {1989},
pages = {1176--1197},
language = {en},
url = {http://dml.mathdoc.fr/item/1176347263}
}
Shao, Jun; Wu, C. F. J. A General Theory for Jackknife Variance Estimation. Ann. Statist., Vol. 17 (1989), no. 1, pp. 1176-1197. http://gdmltest.u-ga.fr/item/1176347263/