Let $x_t\ (t = 1, 2, \cdots)$ be defined recursively by
\begin{equation*}\tag{1.1}
x_t = \alpha x_{t-1} + u_t, \quad t = 1, 2, \cdots,
\end{equation*}
where $x_0$ is a constant, $\varepsilon u_t = 0$, $\varepsilon u^2_t = \sigma^2$, and $\varepsilon u_t u_s = 0$, $t \neq s$. ($\varepsilon$ denotes mathematical expectation.) An estimate of $\alpha$ based on $x_1, \cdots, x_T$ (which is the maximum likelihood estimate of $\alpha$ if the $u$'s are normally distributed) is
\begin{equation*}\tag{1.2}
\hat \alpha = \bigg(\sum^T_{t=1} x_t x_{t-1}\bigg)\bigg/\bigg(\sum^T_{t=1} x^2_{t-1}\bigg).
\end{equation*}
If $|\alpha| < 1$, then $\sqrt T (\hat \alpha - \alpha)$ has a limiting normal distribution with mean 0 under fairly general conditions, such as independence of the $u$'s and uniformly bounded moments of the $u$'s of order $4 + \epsilon$ for some $\epsilon > 0$. (See [2], Chapter II, for example.) If $|\alpha| > 1$, White [3] has shown that $(\hat \alpha - \alpha)|\alpha|^T/(\alpha^2 - 1)$ has a limiting Cauchy distribution under the assumption that $x_0 = 0$ and the $u$'s are normally distributed; he has also found the limiting distribution when $x_0 \neq 0$. His results can easily be modified and restated in the following form: $\big(\sum^T_{t=1} x^2_{t-1}\big)^{\frac{1}{2}}(\hat \alpha - \alpha)$ has a limiting normal distribution if the $u$'s are normally distributed and $|\alpha| \neq 1$. Peculiarly, for $|\alpha| = 1$ this statistic has a limiting distribution which is not normal (and is not even symmetric for $x_0 = 0$). One purpose of this paper is to characterize the limiting distributions for $|\alpha| > 1$ when the $u$'s are not necessarily normally distributed; it will be shown that for $|\alpha| > 1$ the results depend on the distribution of the $u$'s, and central limit theorems are not applicable. Secondly, the limiting normal distribution for $|\alpha| < 1$ will be shown to hold under the assumption that the $u$'s are independently and identically distributed with finite variance. This was conjectured by White.
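As a purely numerical illustration (not part of the paper's argument), the following Python sketch simulates the process (1.1) with independent normal $u_t$, computes the estimate (1.2), and examines the normalized error $\big(\sum^T_{t=1} x^2_{t-1}\big)^{\frac{1}{2}}(\hat \alpha - \alpha)$ for one value of $\alpha$ inside and one outside the unit interval. The particular choices of $\alpha$, $T$, $\sigma$, and the number of replications are illustrative assumptions, not values taken from the paper.
\begin{verbatim}
# Minimal simulation sketch: generate (1.1) with Gaussian u_t, compute the
# estimator (1.2), and tabulate (sum x_{t-1}^2)^{1/2} (alpha_hat - alpha).
# All parameter values below are illustrative assumptions.
import numpy as np

def simulate_x(alpha, T, x0=0.0, sigma=1.0, rng=None):
    """x_t = alpha * x_{t-1} + u_t with i.i.d. N(0, sigma^2) u_t; returns x_0,...,x_T."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.normal(0.0, sigma, size=T)
    x = np.empty(T + 1)
    x[0] = x0
    for t in range(1, T + 1):
        x[t] = alpha * x[t - 1] + u[t - 1]
    return x

def alpha_hat(x):
    """Estimator (1.2): sum x_t x_{t-1} / sum x_{t-1}^2 over t = 1,...,T."""
    den = np.sum(x[:-1] ** 2)
    return np.sum(x[1:] * x[:-1]) / den, den

rng = np.random.default_rng(0)
T, reps = 500, 2000
for alpha in (0.5, 1.2):            # one |alpha| < 1 case, one |alpha| > 1 case
    stats = []
    for _ in range(reps):
        x = simulate_x(alpha, T, rng=rng)
        a_hat, den = alpha_hat(x)
        stats.append(np.sqrt(den) * (a_hat - alpha))
    stats = np.asarray(stats)
    print(f"alpha = {alpha}: mean = {stats.mean():.3f}, std = {stats.std():.3f}")
\end{verbatim}
With normal $u$'s the sample mean and standard deviation of the normalized error should be close to $0$ and $\sigma$ in both regimes, in line with the restatement of White's result quoted above; replacing the normal $u_t$ by another finite-variance distribution lets one observe numerically the dependence on the distribution of the $u$'s that the paper establishes for $|\alpha| > 1$.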