Generalizations of a Gaussian Theorem
Dwyer, Paul S.
Ann. Math. Statist., Vol. 29 (1958), no. 4, pp. 106-117 / Harvested from Project Euclid
Plackett [1] has discussed the history and generalizations of the Gaussian theorem, which states that least squares estimates are linear unbiased estimates with minimum variance. General forms of the theorem are due to Aitken [2], [3] and Rao [4], [5]. The essence of the proof for Aitken's general case consists in minimizing, simultaneously, certain quadratic forms involving linear combinations of the parameters. Plackett derived Aitken's result by using a matrix relation; the proof of the theorem follows quickly once the relation is established. A somewhat similar but simpler matrix relation is used by Rao ([4], page 10). Aitken [2] and Rao [4], [5] obtain minimum variance with the use of Lagrange multipliers. Unless one has a method of working with matrices of derivatives, it seems necessary to differentiate with respect to the many scalars constituting the matrices and to assemble the results in the desired matrix form. Authors frequently give only the assembled results ([4], page 10; [5], page 17; [6], page 83). The question arises as to whether it is possible to use the logically preferable matrix derivative methods of minimization. It is shown below that the use of matrices of partial derivatives [7] leads logically to the solution without the necessity of changing to and from scalar notation or of establishing some relation which implicitly contains the solution. Matrix derivative methods seem to be preferable for undertaking solutions of problems of simultaneous matrix minimization with side conditions, for the same reason that derivative methods are preferable to the use of some (unknown) relation in solving problems of minimization involving scalars. They may also be used in establishing the relation, which may then be verified without their use. The paper includes generalizations of the results of Aitken [2], [3], Rao [4], [5], and David and Neyman [8].
It gives a general formula for simultaneous unbiased estimators of linear functions of parameters when the parameters are subject to linear restrictions and shows how the results are applicable to special cases. It provides formulas for the variance matrix of these estimators. It generalizes a matrix relation used by Plackett [1]. It uses the matrix square root transformation in establishing the general result for the variance of (weighted) residuals when there may be linear restrictions on the parameters. It provides a generalization of a formula of David and Neyman [8] in estimating the variance matrix of the unbiased linear estimators.
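The Aitken setting the abstract refers to can be made concrete with a small numerical sketch. The following is not code from the paper but a hypothetical NumPy illustration of generalized least squares for the model y = Xb + e with Cov(e) = V: the estimator (X'V⁻¹X)⁻¹X'V⁻¹y, its variance matrix (X'V⁻¹X)⁻¹, and the matrix square root (Cholesky) transformation that reduces V to the identity; the data X, b_true, and V are made up for the example.

```python
import numpy as np

# Hypothetical illustration (not from the paper): Aitken's generalized
# least squares for the model y = X b + e with Cov(e) = V known and
# positive definite.
rng = np.random.default_rng(0)
n, p = 6, 2
X = rng.normal(size=(n, p))
b_true = np.array([1.0, -2.0])

# A positive definite error covariance V.
A = rng.normal(size=(n, n))
V = A @ A.T + n * np.eye(n)

# Matrix square root transformation: with V = L L', the transformed
# errors L^{-1} e have identity covariance, reducing GLS to ordinary
# least squares on (L^{-1} X, L^{-1} y).
L = np.linalg.cholesky(V)

# Noise-free response, so the unbiased estimator should recover b_true.
y = X @ b_true

# GLS estimate: b_hat = (X' V^{-1} X)^{-1} X' V^{-1} y
Vi = np.linalg.inv(V)
b_hat = np.linalg.solve(X.T @ Vi @ X, X.T @ Vi @ y)

# Variance matrix of the estimator (up to the error scale): (X' V^{-1} X)^{-1}
var_b = np.linalg.inv(X.T @ Vi @ X)

print(np.allclose(b_hat, b_true))
```

With a noise-free y the estimate reproduces b_true exactly (up to rounding), which is a quick check that the formula is the minimum-variance linear unbiased estimator the theorem describes.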
Published: 1958-03-14
@article{1177706708,
     author = {Dwyer, Paul S.},
     title = {Generalizations of a Gaussian Theorem},
     journal = {Ann. Math. Statist.},
     volume = {29},
     number = {4},
     year = {1958},
     pages = {106-117},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1177706708}
}
Dwyer, Paul S. Generalizations of a Gaussian Theorem. Ann. Math. Statist., Vol. 29 (1958), no. 4, pp. 106-117. http://gdmltest.u-ga.fr/item/1177706708/