Gauss-Markov Estimation for Multivariate Linear Models: A Coordinate Free Approach
Eaton, Morris L.
Ann. Math. Statist., Volume 41 (1970), no. 6, pp. 528-538
The coordinate free (geometric) approach to univariate linear models has added both insight and understanding to the problems of Gauss-Markov (GM) estimation and hypothesis testing. One of the initial papers emphasizing the geometric aspects of univariate linear models is Kruskal's (1961). The coordinate free approach is used in this paper to treat GM estimation in a multivariate analysis context. In contrast to the univariate situation, a central question for multivariate linear models is the existence of GM estimates. Of course, it is the more complicated covariance structure in the multivariate case that creates the concern over the existence of GM estimates. As the emphasis is on GM estimation, first and second moment assumptions (as opposed to distributional assumptions) play the key role. Classical results for the univariate linear model are outlined in Section 1. In addition, a recent theorem due to Kruskal (1968) concerning the equality of GM and Least Squares (LS) estimates is discussed. A minor modification of Kruskal's result gives a very useful necessary and sufficient condition for the existence of GM estimators for arbitrary covariance structures and a fixed regression manifold. In Section 2, the outer product of two vectors and the Kronecker product of linear transformations are discussed and applied to describe the covariance structure of a random matrix. This application includes the case of a random sample from a multivariate population with covariance matrix $\Sigma > 0$ ($\Sigma > 0$ means that $\Sigma$ is positive definite). The question of GM estimation in the standard multivariate linear model is taken up in Section 3. This model is described as follows: a random matrix $Y: n \times p$, whose rows are uncorrelated and each row has covariance matrix $\Sigma > 0$, is observed. The mean matrix $\mu$ of $Y$ is assumed to have the form $\mu = ZB$, where $Z: n \times q$ is known and of rank $q$, and $B: q \times p$ is a matrix of regression coefficients. For this model, GM estimators for $\mu$ and $B$ exist and are well known (see Anderson (1958), Chapter 8). The main result in Section 3 establishes a converse to this classical result. Explicitly, let $Y$ have the covariance structure as above and assume $\Omega$ is a fixed regression manifold. It is shown that if a GM estimator for $\mu \in \Omega$ exists, then each element $\mu \in \Omega$ can be written as $\mu = ZB$, where $Z: n \times q$ is fixed and $B: q \times p$ ranges over all $q \times p$ real matrices. The results in Section 4 and Section 5 are similar to the main result of Section 3. A complete description of all regression manifolds for which GM estimators exist is given for two different kinds of covariance assumptions concerning $\Sigma$ ($\Sigma$ as above). In Section 4, it is assumed that $\Sigma$ has a block diagonal form with two blocks. Section 5 is concerned with the case when $\Sigma$ has the so-called intra-class correlation form.
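The following is a minimal numerical sketch, not taken from the paper, of the standard multivariate linear model described above for Section 3: $Y = ZB + E$ with uncorrelated rows and common row covariance $\Sigma > 0$. Under this covariance structure the GM estimator of $B$ coincides with the least squares estimator $\hat{B} = (Z'Z)^{-1}Z'Y$ and $\hat{\mu} = Z\hat{B}$. The dimensions, matrices, and random seed are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

n, q, p = 20, 3, 2                     # observations, regressors, responses (illustrative)
Z = rng.standard_normal((n, q))        # known design matrix of rank q
B = rng.standard_normal((q, p))        # true regression coefficients
Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])         # row covariance, positive definite

# Errors: rows uncorrelated, each row with covariance Sigma
E = rng.standard_normal((n, p)) @ np.linalg.cholesky(Sigma).T
Y = Z @ B + E

# Least squares (= Gauss-Markov, for this covariance structure) estimates
B_hat = np.linalg.solve(Z.T @ Z, Z.T @ Y)   # (Z'Z)^{-1} Z'Y
mu_hat = Z @ B_hat                          # estimated mean matrix
```

Note that $\hat{B}$ does not depend on $\Sigma$; it is the block diagonal and intra-class correlation structures of Sections 4 and 5 that restrict which regression manifolds admit GM estimators.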
Published: 1970-04-14
@article{1177697093,
     author = {Eaton, Morris L.},
     title = {Gauss-Markov Estimation for Multivariate Linear Models: A Coordinate Free Approach},
     journal = {Ann. Math. Statist.},
     volume = {41},
     number = {6},
     year = {1970},
     pages = {528--538},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1177697093}
}
Eaton, Morris L. Gauss-Markov Estimation for Multivariate Linear Models: A Coordinate Free Approach. Ann. Math. Statist., Volume 41 (1970), no. 6, pp. 528-538. http://gdmltest.u-ga.fr/item/1177697093/