Gibbs posterior for variable selection in high-dimensional classification and data mining
Jiang, Wenxin ; Tanner, Martin A.
Ann. Statist., Volume 36 (2008), no. 1, pp. 2207-2231 / Harvested from Project Euclid
In the popular approach of “Bayesian variable selection” (BVS), prior and posterior distributions are used to select a subset of candidate variables to enter the model. Here we consider a completely new direction: studying BVS with a Gibbs posterior, a construction originating in statistical mechanics. The Gibbs posterior is built from a risk function of practical interest (such as the classification error) and aims to minimize that risk without modeling the data probabilistically. This can improve performance over the usual Bayesian approach, which depends on a probability model that may be misspecified. We provide conditions under which good risk performance is achieved even in the presence of high dimensionality, where the number of candidate variables K can be much larger than the sample size n. In addition, we develop a convenient Markov chain Monte Carlo algorithm to implement BVS with the Gibbs posterior.
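A Gibbs posterior of this kind replaces the likelihood in Bayes' rule with an exponentiated empirical risk, giving a posterior of the form pi_n(theta) proportional to exp{-n * psi * R_n(theta)} * pi(theta), where R_n is the empirical risk and psi a scaling (inverse-temperature) parameter. The following Python code is only a rough illustrative sketch of this idea, not the paper's data-augmentation sampler: a simple Metropolis-Hastings chain over binary inclusion indicators gamma and coefficients beta, with the empirical classification error as R_n. The temperature psi, prior inclusion probability q, proposal step size, and all function names are hypothetical choices made for the illustration.

import numpy as np

rng = np.random.default_rng(0)

def empirical_risk(gamma, beta, X, y):
    # 0-1 classification error of the linear rule 1{X_gamma @ beta_gamma > 0}.
    mask = gamma.astype(bool)
    scores = X[:, mask] @ beta[mask] if mask.any() else np.zeros(len(y))
    return np.mean((scores > 0).astype(int) != y)

def log_gibbs_target(gamma, beta, X, y, psi, q):
    # log Gibbs posterior: -n * psi * R_n(gamma, beta) plus a log prior with
    # independent Bernoulli(q) inclusion indicators and N(0, 1) coefficients.
    n = len(y)
    log_prior = np.sum(gamma * np.log(q) + (1 - gamma) * np.log(1 - q))
    log_prior -= 0.5 * np.sum(beta ** 2)
    return -n * psi * empirical_risk(gamma, beta, X, y) + log_prior

def mcmc(X, y, psi=2.0, q=0.1, n_iter=5000, step=0.2):
    n, K = X.shape
    gamma, beta = np.zeros(K, dtype=int), np.zeros(K)
    cur = log_gibbs_target(gamma, beta, X, y, psi, q)
    inclusion = np.zeros(K)
    for _ in range(n_iter):
        # Symmetric proposal: flip one inclusion indicator, jitter all coefficients.
        g_new = gamma.copy()
        g_new[rng.integers(K)] ^= 1
        b_new = beta + step * rng.standard_normal(K)
        new = log_gibbs_target(g_new, b_new, X, y, psi, q)
        if np.log(rng.uniform()) < new - cur:  # Metropolis accept/reject step
            gamma, beta, cur = g_new, b_new, new
        inclusion += gamma  # no burn-in discarded, for brevity
    return inclusion / n_iter  # estimated posterior inclusion frequencies

# Toy data: only the first 2 of K = 50 candidate variables matter.
n, K = 200, 50
X = rng.standard_normal((n, K))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
print(mcmc(X, y).round(2))

A small prior inclusion probability q plays the sparsity-inducing role of the prior in BVS; because no likelihood is specified, the chain targets low classification error directly rather than fit under a possibly misspecified probability model.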
Published: 2008-10-15
Classification: Data augmentation, data mining, Gibbs posterior, high-dimensional data, linear classification, Markov chain Monte Carlo, prior distribution, risk performance, sparsity, variable selection, 62F99, 82-08
@article{1223908090,
     author = {Jiang, Wenxin and Tanner, Martin A.},
     title = {Gibbs posterior for variable selection in high-dimensional classification and data mining},
     journal = {Ann. Statist.},
     volume = {36},
     number = {1},
     year = {2008},
     pages = {2207-2231},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1223908090}
}
Jiang, Wenxin; Tanner, Martin A. Gibbs posterior for variable selection in high-dimensional classification and data mining. Ann. Statist., Volume 36 (2008), no. 1, pp. 2207-2231. http://gdmltest.u-ga.fr/item/1223908090/