Bootstrap Methods: Another Look at the Jackknife
Efron, B.
Ann. Statist., Volume 7 (1979), no. 1, pp. 1-26 / Harvested from Project Euclid
We discuss the following problem: given a random sample $\mathbf{X} = (X_1, X_2, \cdots, X_n)$ from an unknown probability distribution $F$, estimate the sampling distribution of some prespecified random variable $R(\mathbf{X}, F)$, on the basis of the observed data $\mathbf{x}$. (Standard jackknife theory gives an approximate mean and variance in the case $R(\mathbf{X}, F) = \theta(\hat{F}) - \theta(F)$, $\theta$ some parameter of interest.) A general method, called the "bootstrap," is introduced, and shown to work satisfactorily on a variety of estimation problems. The jackknife is shown to be a linear approximation method for the bootstrap. The exposition proceeds by a series of examples: variance of the sample median, error rates in a linear discriminant analysis, ratio estimation, estimating regression parameters, etc.
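As an illustration of the resampling idea described in the abstract, the following is a minimal sketch (not code from the paper) of the Monte Carlo bootstrap applied to one of the paper's examples, the variance of the sample median. Resamples are drawn with replacement from the data, i.e., i.i.d. from the empirical distribution $\hat{F}$; the function name, the replication count `n_boot`, and the example data are illustrative assumptions, not part of the original.

```python
import numpy as np

def bootstrap_variance(x, statistic, n_boot=2000, seed=0):
    """Monte Carlo bootstrap sketch: resample x with replacement, recompute
    the statistic on each resample, and return the variance of the replicates."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x)
    n = len(x)
    # Each bootstrap sample X* consists of n draws with replacement from the
    # observed data, i.e., an i.i.d. sample from the empirical distribution F-hat.
    replicates = np.array(
        [statistic(x[rng.integers(0, n, size=n)]) for _ in range(n_boot)]
    )
    return replicates.var(ddof=1)

# Illustrative data: bootstrap estimate of the variance of the sample median.
data = np.array([3.1, 2.7, 4.8, 3.3, 2.9, 5.1, 3.6, 4.0, 2.5, 3.8])
print(bootstrap_variance(data, np.median))
```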
Published: 1979-01-14
Classification:  Jackknife,  bootstrap,  resampling,  subsample values,  nonparametric variance estimation,  error rate estimation,  discriminant analysis,  nonlinear regression,  62G05,  62G15,  62H30,  62J05
@article{1176344552,
     author = {Efron, B.},
     title = {Bootstrap Methods: Another Look at the Jackknife},
     journal = {Ann. Statist.},
     volume = {7},
     number = {1},
     year = {1979},
     pages = {1--26},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1176344552}
}
Efron, B. Bootstrap Methods: Another Look at the Jackknife. Ann. Statist., Volume 7 (1979), no. 1, pp. 1-26. http://gdmltest.u-ga.fr/item/1176344552/