Consider independent and normally distributed random variables $X_1,\cdots, X_n$ such that $\operatorname{Var} X_i = \sigma^2 > 0$, $i = 1,\cdots, n$, and $E(X_1,\cdots, X_n)' = A'\beta$, where $A'$ is a known $n \times k$ matrix and $\beta = (\beta_1,\cdots, \beta_k)'$ is an unknown column vector. (The prime denotes transposition.) The cases of a known and a totally unknown $\sigma^2$ are considered simultaneously. Denote the experiment obtained by observing $X_1,\cdots, X_n$ by $\mathscr{E}_A$. Let $A$ and $B$ be matrices of dimensions, respectively, $k \times n_A$ and $k \times n_B$. Then, if $\sigma^2$ is known (if $\sigma^2$ is totally unknown), $\mathscr{E}_A$ is more informative than $\mathscr{E}_B$ if and only if $AA' - BB'$ is nonnegative definite (and $n_A \geqq n_B + \operatorname{rank}(AA' - BB')$).
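As a purely illustrative sketch (not part of the original statement), the criterion can be checked numerically for given design matrices $A$ ($k \times n_A$) and $B$ ($k \times n_B$): one tests whether $AA' - BB'$ is nonnegative definite and, in the unknown-variance case, whether $n_A \geqq n_B + \operatorname{rank}(AA' - BB')$. The function name, tolerance, and example matrices below are choices made here for illustration and do not come from the paper.

```python
import numpy as np

def is_more_informative(A, B, sigma_known=True, tol=1e-10):
    """Check the stated criterion for comparing linear normal experiments.

    A, B : arrays of shape (k, n_A) and (k, n_B) -- design matrices, so that
           the mean vectors of the two experiments are A'beta and B'beta.
    Returns True if the criterion holds: AA' - BB' is nonnegative definite,
    and, when sigma^2 is totally unknown, n_A >= n_B + rank(AA' - BB').
    (Names and tolerance are illustrative assumptions, not from the source.)
    """
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float)
    D = A @ A.T - B @ B.T                  # k x k symmetric difference matrix
    eigvals = np.linalg.eigvalsh(D)        # real eigenvalues of symmetric D
    nonneg_def = eigvals.min() >= -tol     # nonnegative definiteness (up to tol)
    if sigma_known:
        return nonneg_def
    rank_D = np.linalg.matrix_rank(D, tol=tol)
    return nonneg_def and A.shape[1] >= B.shape[1] + rank_D

# Example: E_A extends E_B's design by one extra observation, so the
# criterion holds in both the known- and unknown-variance cases.
A = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 2.0]])            # k = 2, n_A = 3
B = A[:, :2]                               # k = 2, n_B = 2
print(is_more_informative(A, B, sigma_known=True))   # True
print(is_more_informative(A, B, sigma_known=False))  # True
```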