Let $x_1, x_2, \cdots$ be independent random variables with mean 0 and variance 1. Let $s_n = x_1 + \cdots + x_n$, $n \geqq 1$, and for each $c > 0$ define
\begin{align*}
\tau_1 = \tau_1(c) &= \text{first } n \geqq 1 \text{ for which } s_n > cn^{\frac{1}{2}} \\
&= \infty \text{ if } s_n \leqq cn^{\frac{1}{2}} \text{ for all } n,
\end{align*}
and
\begin{align*}
\tau_2 = \tau_2(c) &= \text{first } n \geqq 1 \text{ for which } |s_n| > cn^{\frac{1}{2}} \\
&= \infty \text{ if } |s_n| \leqq cn^{\frac{1}{2}} \text{ for all } n.
\end{align*}
The stopping times $\tau_1$ and $\tau_2$ have received considerable attention recently (see, for example, [1], [2], [3], [6], and [8]), and the question has naturally arisen whether $P\{\tau_k < \infty\} = 1$, $k = 1, 2$. One contribution of this note is

(1) THEOREM. If $s_n/n^{\frac{1}{2}}$ does not tend in probability to 0, then for each $c > 0$, $P\{\tau_2 < \infty\} = 1$.

We show by examples that (1) is no longer true with $\tau_2$ replaced by $\tau_1$. The final section contains two remarks bearing on the converse to (1).
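For illustration of the hypothesis of (1), suppose in addition that the $x_i$ are identically distributed (an assumption not made above). Then by the central limit theorem
\begin{align*}
P\{|s_n| > \epsilon n^{\frac{1}{2}}\} \rightarrow 2(1 - \Phi(\epsilon)) > 0 \qquad (n \rightarrow \infty)
\end{align*}
for every $\epsilon > 0$, where $\Phi$ denotes the standard normal distribution function. Hence $s_n/n^{\frac{1}{2}}$ cannot tend in probability to 0, and (1) gives $P\{\tau_2 < \infty\} = 1$ for every $c > 0$ in the i.i.d.\ case.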