Outperforming the Gibbs sampler empirical estimator for nearest-neighbor random fields
Greenwood, Priscilla E. ; McKeague, Ian W. ; Wefelmeyer, Wolfgang
Ann. Statist., Tome 24 (1996) no. 6, p. 1433-1456 / Harvested from Project Euclid
Given a Markov chain sampling scheme, does the standard empirical estimator make best use of the data? We show that this is not so and construct better estimators. We restrict attention to nearest-neighbor random fields and to Gibbs samplers with deterministic sweep, but our approach applies to any sampler that uses reversible variable-at-a-time updating with deterministic sweep. The structure of the transition distribution of the sampler is exploited to construct further empirical estimators that are combined with the standard empirical estimator to reduce asymptotic variance. The extra computational cost is negligible. When the random field is spatially homogeneous, symmetrizations of our estimator lead to further variance reduction. The performance of the estimators is evaluated in a simulation study of the Ising model.
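The setting of the paper can be illustrated with a minimal sketch: a Gibbs sampler with deterministic (raster-scan) sweep for the two-dimensional Ising model, together with the standard empirical estimator that the paper takes as its baseline. All function names, the lattice size, the inverse temperature, and the choice of functional (mean absolute magnetization) are assumptions for illustration; the improved combined estimators constructed in the paper are not reproduced here.

```python
import numpy as np

def gibbs_sweep(grid, beta, rng):
    """One deterministic raster-scan sweep of the Gibbs sampler
    for the nearest-neighbor Ising model (illustrative sketch)."""
    n = grid.shape[0]
    for i in range(n):
        for j in range(n):
            # sum of the four nearest neighbors (periodic boundary)
            s = (grid[(i - 1) % n, j] + grid[(i + 1) % n, j]
                 + grid[i, (j - 1) % n] + grid[i, (j + 1) % n])
            # conditional probability that site (i, j) is +1 given its neighbors
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * s))
            grid[i, j] = 1 if rng.random() < p_plus else -1
    return grid

def empirical_estimator(beta=0.3, n=16, burn_in=200, sweeps=1000, seed=0):
    """Standard empirical estimator (sample average over sweeps)
    of the mean absolute magnetization E|m|."""
    rng = np.random.default_rng(seed)
    grid = rng.choice([-1, 1], size=(n, n))
    for _ in range(burn_in):
        gibbs_sweep(grid, beta, rng)
    total = 0.0
    for _ in range(sweeps):
        gibbs_sweep(grid, beta, rng)
        total += abs(grid.mean())
    return total / sweeps
```

The paper's point is that this plain sample average over sweeps, while consistent, is not the best use of the sampler's output: the known structure of the variable-at-a-time transition kernel can be exploited to build additional empirical estimators whose combination with the one above has smaller asymptotic variance at negligible extra cost.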
Published: 1996-08-14
Classification: Markov chain Monte Carlo, Metropolis-Hastings algorithm, asymptotic relative efficiency, variance reduction, Ising model, parallel updating, 62M40, 65U05, 60J05, 62G20, 62M05
@article{1032298276,
     author = {Greenwood, Priscilla E. and McKeague, Ian W. and Wefelmeyer, Wolfgang},
     title = {Outperforming the Gibbs sampler empirical estimator for nearest-neighbor random fields},
     journal = {Ann. Statist.},
     volume = {24},
     number = {6},
     year = {1996},
     pages = {1433-1456},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1032298276}
}
Greenwood, Priscilla E.; McKeague, Ian W.; Wefelmeyer, Wolfgang. Outperforming the Gibbs sampler empirical estimator for nearest-neighbor random fields. Ann. Statist., Tome 24 (1996) no. 6, pp. 1433-1456. http://gdmltest.u-ga.fr/item/1032298276/