Let Y be a Gaussian vector whose components are independent with a common unknown variance. We consider the problem of estimating the mean μ of Y by model selection. More precisely, we start with a collection $\mathcal{S}=\{S_{m},\, m\in\mathcal{M}\}$ of linear subspaces of $\mathbb{R}^n$ and associate with each of them the least-squares estimator of μ on $S_m$. We then use a data-driven penalized criterion to select one estimator among these. Our first objective is to analyze the performance of the estimators associated with classical criteria such as FPE, AIC, BIC and AMDL. Our second objective is to propose better penalties that are versatile enough to take into account both the complexity of the collection $\mathcal{S}$ and the sample size. We then apply these penalties to various statistical problems, including variable selection, change-point detection and signal estimation. Our results are based on a nonasymptotic risk bound, with respect to the Euclidean loss, for the selected estimator. Analogous results are also established for the Kullback loss.
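As a rough illustration of the penalized criteria mentioned above (FPE, AIC, BIC, AMDL) in the unknown-variance setting, the following minimal Python sketch selects among nested polynomial models. The formulas in the comments are the standard textbook forms of these criteria, not the refined penalties proposed in the paper, and the data-generating setup is invented purely for the example.

```python
import numpy as np

# Minimal sketch (not the paper's penalties): selecting among nested
# polynomial models with classical criteria when the variance is unknown.
# With RSS_m the residual sum of squares and D_m the model dimension,
# common textbook forms are
#   AIC:  n*log(RSS_m/n) + 2*D_m
#   BIC:  n*log(RSS_m/n) + D_m*log(n)
#   AMDL: n*log(RSS_m/n) + 3*D_m*log(n)
#   FPE:  (RSS_m/n) * (n + D_m)/(n - D_m)
# These expressions are stated here only to illustrate the idea of a
# penalized criterion; they are assumptions of this sketch.

rng = np.random.default_rng(0)
n = 100
x = np.linspace(0.0, 1.0, n)
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(scale=0.3, size=n)  # unknown variance

def rss(degree):
    """Residual sum of squares of the least-squares fit on S_m (polynomials of given degree)."""
    X = np.vander(x, degree + 1, increasing=True)   # design matrix of the model
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.sum((y - X @ beta) ** 2))

criteria = {}
for d in range(0, 8):            # candidate models S_m, dimension D_m = d + 1
    D, R = d + 1, rss(d)
    criteria[d] = {
        "AIC":  n * np.log(R / n) + 2 * D,
        "BIC":  n * np.log(R / n) + D * np.log(n),
        "AMDL": n * np.log(R / n) + 3 * D * np.log(n),
        "FPE":  (R / n) * (n + D) / (n - D),
    }

for name in ("AIC", "BIC", "AMDL", "FPE"):
    best = min(criteria, key=lambda d: criteria[d][name])
    print(f"{name}: selected polynomial degree {best}")
```

Running the sketch prints, for each criterion, the polynomial degree that minimizes it; heavier penalties (e.g. AMDL) tend to select smaller models than lighter ones (e.g. AIC or FPE).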
Published: 2009-04-15
Classification:
Model selection,
penalized criterion,
AIC,
FPE,
BIC,
AMDL,
variable selection,
change-point detection,
adaptive estimation,
62G08
@article{1236693145,
author = {Baraud, Yannick and Giraud, Christophe and Huet, Sylvie},
title = {Gaussian model selection with an unknown variance},
journal = {Ann. Statist.},
volume = {37},
number = {1},
year = {2009},
pages = {630--672},
language = {en},
url = {http://dml.mathdoc.fr/item/1236693145}
}
Baraud, Yannick; Giraud, Christophe; Huet, Sylvie. Gaussian model selection with an unknown variance. Ann. Statist., Volume 37 (2009), no. 1, pp. 630-672. http://gdmltest.u-ga.fr/item/1236693145/