The least squares estimate of the parameter matrix $\mathbf{B}$ in the model $\mathbf{y}_t = \mathbf{B}'\mathbf{x}_t + \mathbf{u}_t$, where $\mathbf{u}_t$ is an $m$-component vector of unobservable disturbances and $\mathbf{x}_t$ is a $p$-component vector, converges to $\mathbf{B}$ with probability one under certain conditions on the behavior of $\mathbf{x}_t$ and $\mathbf{u}_t$. Suppose $\mathbf{x}_t$ is stochastic and the conditional expectation of $\mathbf{u}_t$, given $\mathbf{x}_s$ for $s \leqslant t$ and $\mathbf{u}_s$ for $s < t$, is zero. Then the least squares estimates are strongly consistent if the inverse of $\mathbf{A}_T = \sum_{t=1}^{T} \mathbf{x}_t\mathbf{x}_t'$, where $T$ is the sample size, converges to the zero matrix and if the ratio of the largest to the smallest characteristic root of $\mathbf{A}_T$ is bounded with probability one.
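As a numerical illustration (not from the paper), the sketch below simulates the model with i.i.d. standard normal regressors and disturbances, forms $\mathbf{A}_T$, and checks the two sufficient conditions: $\mathbf{A}_T^{-1}$ becomes small as $T$ grows, and the ratio of its largest to smallest characteristic root stays bounded. The dimensions $p = 2$, $m = 1$, the sample size, and the true $\mathbf{B}$ are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate y_t = B' x_t + u_t with p = 2 regressors, m = 1 response.
p, T = 2, 5000
B = np.array([[1.5], [-0.7]])          # true p x m parameter matrix (arbitrary)
X = rng.normal(size=(T, p))            # stochastic regressors x_t
U = rng.normal(size=(T, 1))            # disturbances with E[u_t | past] = 0
Y = X @ B + U

A_T = X.T @ X                          # A_T = sum_{t=1}^{T} x_t x_t'
B_hat = np.linalg.solve(A_T, X.T @ Y)  # least squares estimate of B

roots = np.linalg.eigvalsh(A_T)       # characteristic roots of A_T
ratio = roots[-1] / roots[0]           # largest over smallest root

print("||A_T^{-1}||     :", np.linalg.norm(np.linalg.inv(A_T)))
print("root ratio       :", ratio)
print("||B_hat - B||    :", np.linalg.norm(B_hat - B))
```

With i.i.d. regressors, $\mathbf{A}_T$ grows like $T$ times the identity, so its inverse shrinks toward the zero matrix and the root ratio remains near one, and the estimation error correspondingly shrinks.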