Let $\{x_t\}$ be a linear stationary process of the form $x_t + \sum_{i=1}^{\infty} a_i x_{t-i} = e_t$, where $\{e_t\}$ is a sequence of i.i.d. normal random variables with mean $0$ and variance $\sigma^2$. Given observations $x_1, \cdots, x_n$, the least squares estimates $\hat{a}(k)$ of $a' = (a_1, a_2, \cdots)$ and $\hat{\sigma}^2_k$ of $\sigma^2$ are obtained under the assumption of a $k$th-order autoregressive model. From $\hat{a}(k)$ we can also estimate the coefficients of the best predictor based on $k$ successive realizations. An asymptotic lower bound is obtained for the mean squared error of the estimated predictor when $k$ is selected from the data. If $k$ is selected so as to minimize $S_n(k) = (n + 2k)\hat{\sigma}^2_k$, then this bound is attained in the limit. The key assumption is that the true autoregression of $\{x_t\}$ is of infinite order.
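
To make the selection rule concrete, here is a minimal sketch in Python (not part of the paper; the function name `select_order`, the use of NumPy's least squares routine, and the handling of $k = 0$ are assumptions): for each $k \le k_{\max}$, fit the $k$th-order autoregression by least squares, take $\hat{\sigma}^2_k$ to be the residual variance, and pick the $k$ minimizing $S_n(k)$.

```python
import numpy as np

def select_order(x, k_max):
    """Select k minimizing S_n(k) = (n + 2k) * sigma_hat_k^2.

    A sketch under the assumption of a mean-zero observed series x_1, ..., x_n.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    # k = 0: no regressors; for a mean-zero process, sigma_hat_0^2 = mean(x_t^2).
    best_k, best_s = 0, n * np.mean(x**2)
    for k in range(1, k_max + 1):
        # Design matrix: row t holds (x_{t-1}, ..., x_{t-k}) for t = k+1, ..., n.
        X = np.column_stack([x[k - i : n - i] for i in range(1, k + 1)])
        y = x[k:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # equals -a_hat(k) in the model's sign convention
        sigma2_k = np.mean((y - X @ coef) ** 2)       # residual variance hat{sigma}^2_k
        s_k = (n + 2 * k) * sigma2_k                  # the criterion S_n(k)
        if s_k < best_s:
            best_k, best_s = k, s_k
    return best_k
```

Note the trade-off the criterion encodes: increasing $k$ can only decrease the residual variance $\hat{\sigma}^2_k$, while the factor $(n + 2k)$ grows with $k$, so the minimizer balances fit against model order.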