This paper considers random processes of the form $X_{n+1} = a_nX_n + b_n \pmod{p}$, where $X_0 = 0$ and the sequences $a_n$ and $b_n$ are independent, with the $a_n$ identically distributed for $n = 0, 1, 2, \ldots$ and the $b_n$ identically distributed for $n = 0, 1, 2, \ldots$. Chung, Diaconis and Graham studied such processes in the case $a_n = 2$ always; this paper considers more general distributions for $a_n$ and $b_n$. The question addressed is how long it takes such a process to get close to the uniform distribution on $\{0, 1, \ldots, p-1\}$. If $a_n$ has a distribution on $\mathbf{Z}^+$ that does not vary with $p$ and $b_n$ has a distribution on $\mathbf{Z}$ that also does not vary with $p$, then, with appropriate restrictions on $p$, an upper bound on this time is $O((\log p)^2)$ unless $a_n = 1$ always, $b_n = 0$ always, or $a_n$ and $b_n$ can each take on only one value. The bound is obtained via a recursive relation involving the discrete Fourier transform. Under more restrictive conditions on $a_n$ and $b_n$, a generalization of the technique of Chung, Diaconis and Graham shows that $O(\log p \log \log p)$ steps suffice.
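The convergence in question can be observed numerically by tracking the exact distribution of $X_n$ and its total variation distance to uniform. The following sketch does this for the Chung–Diaconis–Graham case $a_n = 2$ always and $b_n$ uniform on $\{-1, 0, 1\}$; the modulus $p = 101$, the step count, and the choice of $b_n$-distribution are illustrative assumptions, not values taken from the paper.

```python
def step(dist, p, a_probs, b_probs):
    """One step of X_{n+1} = a_n * X_n + b_n (mod p): exact update of the
    distribution of X_n, given (value, probability) pairs for a_n and b_n."""
    new = [0.0] * p
    for x, px in enumerate(dist):
        if px == 0.0:
            continue
        for a, pa in a_probs:
            ax = (a * x) % p
            for b, pb in b_probs:
                new[(ax + b) % p] += px * pa * pb
    return new

def tv_to_uniform(dist, p):
    """Total variation distance to the uniform distribution on Z_p."""
    return 0.5 * sum(abs(q - 1.0 / p) for q in dist)

p = 101                                      # illustrative prime modulus
a_probs = [(2, 1.0)]                         # a_n = 2 always (the CDG case)
b_probs = [(-1, 1/3), (0, 1/3), (1, 1/3)]    # b_n uniform on {-1, 0, 1}

dist = [0.0] * p
dist[0] = 1.0                                # X_0 = 0
for n in range(40):                          # well past c * log p steps
    dist = step(dist, p, a_probs, b_probs)

print(tv_to_uniform(dist, p))
```

Since $\log_2 101 \approx 6.7$, forty steps is several multiples of $\log p$, and the printed total variation distance is already small; rerunning the loop while printing after each step shows the distance decaying as $n$ grows.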