On Minimax Estimation in the Presence of Side Information About Remote Data
Ahlswede, R.; Burnashev, M. V.
Ann. Statist., Volume 18 (1990), no. 1, pp. 141-171 / Harvested from Project Euclid
We analyze the following model: One person, called the "helper," observes an outcome $x^n = (x_1, \cdots, x_n) \in \mathscr{X}^n$ of the sequence $X^n = (X_1, \cdots, X_n)$ of i.i.d. RV's, and the statistician gets a sample $y^n = (y_1, \cdots, y_n)$ of the sequence $Y^n(\theta, x^n)$ of RV's with density $\prod^n_{t = 1} f(y_t \mid \theta, x_t)$. The helper can give some (side) information about $x^n$ to the statistician via an encoding function $s_n: \mathscr{X}^n \rightarrow \mathbb{N}$ with $\operatorname{rate}(s_n) \overset{\mathrm{def}}{=} (1/n)\log \#\operatorname{range}(s_n) \leq R$. Based on the knowledge of $s_n(x^n)$ and $y^n$, the statistician tries to estimate $\theta$ by an estimator $\hat{\theta}_n$. For the maximal mean square error $e_n(R) \overset{\mathrm{def}}{=} \inf_{\hat\theta_n} \inf_{s_n:\, \operatorname{rate}(s_n) \leq R} \sup_{\theta \in \Theta} E_\theta|\hat{\theta}_n - \theta|^2$ we establish a Cramér-Rao type bound and, in the case of a finite $\mathscr{X}$, prove asymptotic achievability of this bound under certain conditions. The proof involves a nonobvious combination of results (some of which are novel) for both coding and estimation.
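For readability, the two quantities defined inline in the abstract can be restated as displayed equations (same notation as above; this is only a typeset restatement, not additional material from the paper):

```latex
% Rate constraint on the helper's encoding function s_n : X^n -> N
\[
\operatorname{rate}(s_n) \;\overset{\mathrm{def}}{=}\;
\frac{1}{n}\,\log \#\operatorname{range}(s_n) \;\leq\; R
\]

% Maximal (minimax) mean square error achievable at rate R
\[
e_n(R) \;\overset{\mathrm{def}}{=}\;
\inf_{\hat{\theta}_n}\;
\inf_{s_n:\, \operatorname{rate}(s_n) \leq R}\;
\sup_{\theta \in \Theta}
E_\theta\,\bigl|\hat{\theta}_n - \theta\bigr|^2
\]
```

The inner infimum ranges over all encoders meeting the rate constraint, the outer one over all estimators based on $s_n(x^n)$ and $y^n$; the supremum makes the criterion minimax over the parameter set $\Theta$.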
Published: 1990-03-14
Classification: Side information, Cramér-Rao-type inequality, efficiency, multiuser source coding, information measures, 62A99, 62F12, 94A15, 62N99
@article{1176347496,
     author = {Ahlswede, R. and Burnashev, M. V.},
     title = {On Minimax Estimation in the Presence of Side Information About Remote Data},
     journal = {Ann. Statist.},
     volume = {18},
     number = {1},
     year = {1990},
     pages = {141-171},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1176347496}
}
Ahlswede, R.; Burnashev, M. V. On Minimax Estimation in the Presence of Side Information About Remote Data. Ann. Statist., Volume 18 (1990), no. 1, pp. 141-171. http://gdmltest.u-ga.fr/item/1176347496/