For the problem of optimum prediction by means of $k$th degree polynomial regression, it is shown in [3] how to choose the observation points and the respective proportions of observations in the interval $\lbrack -1, 1 \rbrack$ so as to obtain the minimax variance of the predicted regression value over the interval $\lbrack -1, t \rbrack$ for all $t \geqq t_1 > 1$, where $t_1$ is the point outside the interval of observations at which the Chebyshev polynomial of degree $k$ equals the maximum value of the variance of the least squares estimate in $\lbrack -1, 1 \rbrack$. It is shown herein that if the observation points and proportions are chosen as specified in [3], then the maximum of the "least squares" variance in the interval $\lbrack -1, 1 \rbrack$ occurs at $-1$. As a consequence, an equation is developed which permits the evaluation of $t_1$ as a function of $k$. Moreover, it is shown that $t_1 \rightarrow 1$ as $k \rightarrow \infty$, so that, for large $k$, the solution given in [3] yields an approximation to the minimax variance over the interval $\lbrack -1, t \rbrack$ for all $t > 1$.
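The defining relation for $t_1$ can be inverted in closed form once the maximum least squares variance on $\lbrack -1, 1 \rbrack$ is in hand, since for $t \geqq 1$ the Chebyshev polynomial satisfies $T_k(t) = \cosh(k \operatorname{arccosh} t)$. The following is a minimal numerical sketch of that inversion; the placeholder value `v_max` used in the loop is a hypothetical stand-in for the paper's maximum variance at $-1$, not the expression developed in the text, and the function name `t1_from_max_variance` is introduced here only for illustration.

```python
import math

def t1_from_max_variance(k: int, v_max: float) -> float:
    """Solve T_k(t_1) = v_max for t_1 >= 1, where T_k is the Chebyshev
    polynomial of the first kind.  For t >= 1, T_k(t) = cosh(k * arccosh(t)),
    so the defining relation inverts in closed form."""
    if v_max <= 1.0:
        # T_k(1) = 1, so a maximum variance not exceeding 1 pins t_1 at 1.
        return 1.0
    return math.cosh(math.acosh(v_max) / k)

# Hypothetical illustration: if the maximum "least squares" variance at -1
# grows only polynomially in k (here v_max = (k + 1)**2 is an assumed
# placeholder, NOT the paper's expression), then t_1 -> 1 as k -> infinity.
for k in (2, 5, 10, 50, 200):
    v_max = (k + 1) ** 2  # assumed placeholder for the true maximum variance
    print(k, t1_from_max_variance(k, v_max))
```

Under this assumed growth of the maximum variance, the printed values of $t_1$ decrease toward $1$ as $k$ increases, which is consistent with the asymptotic statement $t_1 \rightarrow 1$ as $k \rightarrow \infty$.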