Estimation up to a Change-Point
Foster, Dean P.; George, Edward I.
Ann. Statist., Volume 21 (1993), no. 1, pp. 625-644 / Harvested from Project Euclid
Consider the problem of estimating $\mu$, based on the observation of $Y_0, Y_1, \ldots, Y_n$, where it is assumed only that $Y_0, Y_1, \ldots, Y_\kappa \operatorname{iid} N(\mu, \sigma^2)$ for some unknown $\kappa$. Unlike the traditional change-point problem, the focus here is not on estimating $\kappa$, which is now a nuisance parameter. When it is known that $\kappa = k$, the sample mean $\bar{Y}_k = \sum^k_{i=0} Y_i/(k + 1)$ provides, in addition to wonderful efficiency properties, safety in the sense that it is minimax under squared error loss. Unfortunately, this safety breaks down when $\kappa$ is unknown; indeed, if $k > \kappa$, the risk of $\bar{Y}_k$ is unbounded. To address this problem, a generalized minimax criterion is considered whereby each estimator is evaluated by its maximum risk under $Y_0, Y_1, \ldots, Y_\kappa \operatorname{iid} N(\mu, \sigma^2)$ for each possible value of $\kappa$. An essentially complete class under this criterion is obtained. Generalizations to other situations, such as variance estimation, are illustrated.
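The abstract's central claim — that $\bar{Y}_k$ is safe when $k \le \kappa$ but has unbounded risk when $k > \kappa$ — can be seen in a minimal Monte Carlo sketch. This code is not from the paper; the contamination model (a fixed mean shift for observations after $\kappa$), the function names, and all parameter values are illustrative assumptions, since the abstract only says the distribution of $Y_{\kappa+1}, \ldots, Y_n$ is unrestricted.

```python
import random

def pooled_mean(ys, k):
    """The estimator bar{Y}_k from the abstract: mean of Y_0, ..., Y_k."""
    return sum(ys[:k + 1]) / (k + 1)

def mse(estimator_k, kappa, mu=0.0, sigma=1.0, shift=0.0, n=10,
        reps=2000, seed=0):
    """Monte Carlo MSE of bar{Y}_k when only Y_0..Y_kappa are N(mu, sigma^2).

    Observations after kappa are drawn N(mu + shift, sigma^2) -- one
    illustrative (assumed) way the iid model can fail past the change-point.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        ys = [rng.gauss(mu, sigma) if i <= kappa
              else rng.gauss(mu + shift, sigma)
              for i in range(n + 1)]
        total += (pooled_mean(ys, estimator_k) - mu) ** 2
    return total / reps

# Safe: k <= kappa, so the pooled mean uses only valid data and its
# MSE stays near sigma^2/(k+1) no matter how large the shift is.
safe = mse(estimator_k=4, kappa=4, shift=50.0)

# Unsafe: k > kappa pools in shifted observations, and the MSE grows
# without bound as the shift grows -- the unbounded-risk phenomenon.
unsafe = mse(estimator_k=9, kappa=4, shift=50.0)
```

Increasing `shift` leaves `safe` essentially unchanged while driving `unsafe` arbitrarily large, which is why the paper evaluates each estimator by its maximum risk over every possible $\kappa$.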
Published: 1993-06-14
Classification: Change-point problems, equivariance, Hunt-Stein theorem, minimax procedures, risk, pooling data, 62F10, 62C20, 62L12
@article{1176349141,
     author = {Foster, Dean P. and George, Edward I.},
     title = {Estimation up to a Change-Point},
     journal = {Ann. Statist.},
     volume = {21},
     number = {1},
     year = {1993},
     pages = {625-644},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1176349141}
}
Foster, Dean P.; George, Edward I. Estimation up to a Change-Point. Ann. Statist., Volume 21 (1993), no. 1, pp. 625-644. http://gdmltest.u-ga.fr/item/1176349141/