Estimation of the Last Mean of a Monotone Sequence
Cohen, Arthur; Sackrowitz, Harold B.
Ann. Math. Statist., Volume 41 (1970), no. 6, pp. 2021-2034 / Harvested from Project Euclid
The problem of estimating the larger of two translation parameters, when it is not known which population has the larger parameter, was studied by Blumenthal and Cohen (1968a). Blumenthal and Cohen (1968b) also investigated various estimators for the larger of the means of two independent normal populations. Problems of estimating the largest of a set of ordered parameters, when it is known which population corresponds to each ordered parameter, have been studied in the discrete case by Sackrowitz (1969). Some questions for the continuous case, where one does not know which population corresponds to each ordered parameter, have been studied by Dudewicz (1969). In this paper we study various problems of estimating the largest of a set of ordered parameters when it is known which population corresponds to each ordered parameter. The observed random variables are either normally distributed or are continuous and characterized by a translation parameter. The main portion of the study is devoted to estimating the larger of two normal means when we know which population has the larger mean. We note that one of the results below provides an example of an estimator which is admissible with respect to a convex loss function but is not generalized Bayes.
We proceed to state the models and list the results. Let $X_i$, $i = 1, 2$, be independent normal random variables with means $\theta_i$ and known variances. Without loss of generality we let the variance of $X_1$ be $\tau$ and the variance of $X_2$ be 1. Assume $\theta_2 \geqq \theta_1$, and consider the problem of estimating $\theta_2$ with respect to a squared error loss function. Let $\delta(X_2)$ be any estimator based on $X_2$ alone, and consider only those $\delta(X_2)$ which are admissible for estimating $\theta_2$ when $X_1$ is not observed. The following results are obtained.
(1) If the risk of $\delta(X_2)$ is bounded, then $\delta(X_2)$ is inadmissible. This result can be generalized in several directions. In fact, if the $\theta_i$ are translation parameters of identical symmetric densities, then for any nonnegative strictly convex loss function $W(\cdot)$ with a minimum at $0$, $X_2$ is an inadmissible estimator. Suitable generalizations for arbitrary sample sizes are given. Another generalization is that if $C$ is any positive constant, then $X_2 \pm C$ is inadmissible as a confidence interval for $\theta_2$.
(2) Let $U_\tau$ be the positive solution of the equation $a^2 + (\tau + 1)a - \tau = 0$; the quantity $U_\tau$ satisfies $0 \leqq U_\tau < 1$. Then the estimators $aX_2$, for $0 \leqq a < U_\tau$, are admissible. It will be shown that no $\delta(X_2)$ which is unbounded below can be generalized Bayes. Thus this result provides an example of an estimator which is not generalized Bayes, but which is admissible for the squared error loss function. These results also hold for estimating the largest of $k$ ordered means with known order in the case of equal variances. It is interesting that for some $a > 0$, $aX_k$ is admissible, regardless of the size of $k$. The proof of admissibility of $aX_2$ uses the methods of Blyth (1951) and Farrell (1968).
(3) Consider the analogue of the Pitman estimator, that is, the estimator which is generalized Bayes with respect to the uniform prior on the space $\theta_2 \geqq \theta_1$. We prove that this estimator is admissible and minimax.
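For reference, the constant $U_\tau$ in result (2) is not given explicitly in the abstract, but it follows at once from the quadratic formula that
$$U_\tau = \tfrac{1}{2}\big[\sqrt{(\tau + 1)^2 + 4\tau} - (\tau + 1)\big] = \tfrac{1}{2}\big[\sqrt{\tau^2 + 6\tau + 1} - (\tau + 1)\big],$$
and the bound $0 \leqq U_\tau < 1$ is immediate since $\tau + 1 \leqq \sqrt{\tau^2 + 6\tau + 1} < \tau + 3$ for $\tau \geqq 0$. For example, $\tau = 1$ gives $U_1 = \sqrt{2} - 1 \approx 0.414$.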
In Section 2 the inadmissibility of $\delta(X_2)$, given that the risk of $\delta(X_2)$ is bounded, is proved. In Section 3 we prove $aX_2$ admissible for $0 \leqq a < U_\tau$; we also extend this result to the problem of estimating the largest of $k$ ordered means with known order, again in the case of equal variances. In Section 4 we show that if $\delta(X_2)$ is unbounded below then it cannot be generalized Bayes. Generalizations to the symmetric translation case and arbitrary but equal sample sizes are given in Section 5. The confidence interval result is in Section 6, and the final results on the admissibility and minimaxity of the analogue of the Pitman estimator are given in Section 7. Throughout, the letters $C$, $K$, and $M$, with or without subscripts, are used to denote positive constants, not necessarily the same in all cases. Also, the symbols $\varphi$ and $\Phi$ are used to denote the probability density function and the cumulative distribution function, respectively, of the standard normal.
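As orientation for result (3): under squared error loss, the generalized Bayes estimator with respect to the uniform prior on $\{\theta_2 \geqq \theta_1\}$ is the formal posterior mean of $\theta_2$. A minimal sketch of its form for the two-mean model above, under this standard reduction (our notation, not quoted from the paper), is
$$\delta(x_1, x_2) = \frac{\int_{-\infty}^{\infty} \theta_2\, \varphi(x_2 - \theta_2)\, \Phi\big((\theta_2 - x_1)/\tau^{1/2}\big)\, d\theta_2}{\int_{-\infty}^{\infty} \varphi(x_2 - \theta_2)\, \Phi\big((\theta_2 - x_1)/\tau^{1/2}\big)\, d\theta_2} = x_2 + \frac{1}{(\tau + 1)^{1/2}}\, \frac{\varphi(w)}{\Phi(w)}, \qquad w = \frac{x_2 - x_1}{(\tau + 1)^{1/2}},$$
where the integration over $\theta_1 \leqq \theta_2$ has already been carried out and the second equality uses standard normal identities. The correction term vanishes as $x_2 - x_1 \to \infty$, so the estimator approaches $X_2$ when the ordering constraint is far from binding.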
Published: 1970-12-14
@article{1177696702,
     author = {Cohen, Arthur and Sackrowitz, Harold B.},
     title = {Estimation of the Last Mean of a Monotone Sequence},
     journal = {Ann. Math. Statist.},
     volume = {41},
     number = {6},
     year = {1970},
     pages = {2021--2034},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1177696702}
}
Cohen, Arthur; Sackrowitz, Harold B. Estimation of the Last Mean of a Monotone Sequence. Ann. Math. Statist., Volume 41 (1970), no. 6, pp. 2021-2034. http://gdmltest.u-ga.fr/item/1177696702/