Application of a Measure of Information to the Design and Comparison of Regression Experiments
Stone, M.
Ann. Math. Statist., Vol. 30 (1959), no. 4, pp. 55-70
A normal regression experiment can be represented by \begin{equation*}\tag{1.1} Y_i = \sum_{j=1}^k X_{ij} \theta_j + \eta_i\qquad (i = 1, \cdots, n)\end{equation*} where $\{\eta_i \mid i = 1, \cdots, n\}$ is a set of normally distributed random variables with zero means and non-singular dispersion matrix $C$, $\theta = (\theta_1, \cdots, \theta_k)$ is the parameter vector of interest, and $X = (X_{ij})$ is a known $n \times k$ matrix which will be called the allocation matrix. The rows of $X$ will be called the allocation vectors. We denote the experiment by $\varepsilon(X, C)$. We assume that $C$ is known; generally it will be a function of $X$, say $C(X)$. The particular realisation of $Y$ will be denoted $y$. The matrix $F = X'C^{-1}X$ is the Fisher information matrix of $\varepsilon(X, C)$. When $F$ is non-singular, one answer to the question "What information does $y$ give about $\theta$?" is to quote $F^{-1}$, the dispersion matrix of the maximum-likelihood estimates of $\theta$. A strong argument in favour of this is that $F^{-1}$ is independent of both $\theta$ and $y$. The fact that it is independent of $\theta$ means that the answer is not "local"; the fact that it is independent of $y$ leads to simplicity. This approach is taken by Box and Hunter [1] in their work on rotatable designs. However, we must accept the fact that many experimenters wish to have a one-dimensional answer to the question, i.e., we must associate with $\varepsilon(X, C)$ a single number which we call the "information". For instance, Elfving [5] has developed the use of trace $F^{-1}$. In this paper we adopt the measure of information introduced by Lindley [7]. In Section 2 we generalise Lindley's treatment of the regression situation to include the singular case, explain the uses of the measure and compare it with that of Elfving. Section 3 deals with the analogue of Elfving's main theorem. Theorems 4.1 and 4.2 of Section 4 provide links with the traditional variance approach. In Section 5 we derive the asymptotic form of the measure as the $n$ of (1.1) increases and show that this form can be derived also from Neyman-Pearsonian theory. In Section 6 the influence of nuisance parameters is discussed and an analogue of a theorem of Chernoff [2] is established.
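The quantities named in the abstract are easy to compute for a concrete design. The following Python sketch (not from the paper) assumes a small hypothetical allocation matrix X and an identity dispersion matrix C; it forms the Fisher information matrix F = X'C^{-1}X, its inverse (the dispersion matrix of the maximum-likelihood estimates), Elfving's trace criterion, and a log-determinant scalar included only as an illustrative stand-in for the asymptotic form of Lindley's measure discussed in Section 5.

import numpy as np

# Hypothetical design: n = 4 observations, k = 2 parameters.
# X is the allocation matrix of (1.1); C is the known error dispersion matrix.
X = np.array([[1.0, -1.0],
              [1.0, -0.5],
              [1.0,  0.5],
              [1.0,  1.0]])
C = np.eye(4)  # uncorrelated, unit-variance errors, chosen for illustration

# Fisher information matrix of the experiment eps(X, C): F = X' C^{-1} X
F = X.T @ np.linalg.solve(C, X)

# When F is non-singular, F^{-1} is the dispersion matrix of the
# maximum-likelihood estimates of theta; it depends on neither theta nor y.
F_inv = np.linalg.inv(F)

# Elfving's one-dimensional criterion quoted in the abstract: trace F^{-1}.
elfving_criterion = np.trace(F_inv)

# Illustrative scalar summary (assumption, not the paper's exact expression):
# 0.5 * log det F, a determinant-based single number of the kind to which
# Lindley's measure is related asymptotically.
log_det_summary = 0.5 * np.log(np.linalg.det(F))

print("F =\n", F)
print("F^{-1} =\n", F_inv)
print("trace F^{-1} (Elfving):", elfving_criterion)
print("0.5 log det F:", log_det_summary)

Usage note: comparing two candidate allocation matrices amounts to recomputing these scalars for each and preferring, e.g., the smaller trace F^{-1} or the larger determinant-based summary, depending on which one-dimensional measure of information is adopted.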
Published: 1959-03-14
@article{1177706359,
     author = {Stone, M.},
     title = {Application of a Measure of Information to the Design and Comparison of Regression Experiments},
     journal = {Ann. Math. Statist.},
     volume = {30},
     number = {4},
     year = {1959},
     pages = {55-70},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1177706359}
}
Stone, M. Application of a Measure of Information to the Design and Comparison of Regression Experiments. Ann. Math. Statist., Vol. 30 (1959), no. 4, pp. 55-70. http://gdmltest.u-ga.fr/item/1177706359/