Admissible Selection of an Accurate and Parsimonious Normal Linear Regression Model
Stone, Charles J.
Ann. Statist., Vol. 9 (1981), no. 1, pp. 475-485 / Harvested from Project Euclid
Let $M_0$ be a normal linear regression model and let $M_1,\cdots, M_K$ be distinct proper linear submodels of $M_0$. Let $\hat k \in \{0,\cdots, K\}$ be a model selection rule based on observed data from the true model. Given $\hat k$, the unknown parameters of the selected model $M_{\hat k}$ are fitted by maximum likelihood. A loss function is introduced that is the sum of two parts: (i) a measure of the discrepancy between the fitted model $M_{\hat k}$ and the true model; and (ii) a measure $C_{\hat k}$ of the "complexity" of the selected model. A natural model selection rule $\bar{k}$, which minimizes an empirical version of this loss, is shown to be admissible and very nearly Bayes.
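The selection scheme described in the abstract -- fit each candidate submodel by maximum likelihood (least squares under normal errors), then choose the candidate minimizing a fit term plus a complexity term $C_k$ -- can be sketched numerically. The sketch below is illustrative only: the function select_model and the particular fit measure n*log(RSS/n) are assumptions made for demonstration and are not the paper's actual loss, which is defined differently in the article itself.

import numpy as np

def select_model(y, X, submodels, complexity):
    """Pick a candidate linear submodel by minimizing an empirical loss
    of the form (lack-of-fit measure) + (complexity penalty C_k).

    y          : (n,) response vector
    X          : (n, p) design matrix of the full model M_0
    submodels  : list of column-index lists; submodels[k] gives the
                 columns of X spanning candidate model M_k
                 (index 0 taken to be the full model M_0)
    complexity : list of penalties C_k, one per candidate model
    """
    n = len(y)
    best_k, best_loss = None, np.inf
    for k, cols in enumerate(submodels):
        Xk = X[:, cols]
        # Maximum likelihood fit of M_k under normal errors = least squares.
        beta_k, *_ = np.linalg.lstsq(Xk, y, rcond=None)
        rss_k = np.sum((y - Xk @ beta_k) ** 2)
        # Illustrative empirical loss: residual-based fit measure plus the
        # complexity C_k of the candidate model (not the paper's exact form).
        loss_k = n * np.log(rss_k / n) + complexity[k]
        if loss_k < best_loss:
            best_k, best_loss = k, loss_k
    return best_k, best_loss

With complexity[k] set to twice the number of fitted parameters this illustrative criterion has the familiar AIC form, and with complexity[k] proportional to log(n) times the parameter count it has the BIC form; the paper's contribution concerns the decision-theoretic properties (admissibility, near-Bayesness) of the rule minimizing its particular empirical loss.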
Published: 1981-05-14
Classification: Admissibility, normal linear regression model, generalized Bayes, parsimony, complexity, 62J05, 62C15
@article{1176345452,
     author = {Stone, Charles J.},
     title = {Admissible Selection of an Accurate and Parsimonious Normal Linear Regression Model},
     journal = {Ann. Statist.},
     volume = {9},
     number = {1},
     year = {1981},
     pages = {475--485},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1176345452}
}
Stone, Charles J. Admissible Selection of an Accurate and Parsimonious Normal Linear Regression Model. Ann. Statist., Vol. 9 (1981), no. 1, pp. 475-485. http://gdmltest.u-ga.fr/item/1176345452/