For a wide class of stationary random sequences possessing a spectral density function, the variance of the best linear unbiased estimator for the mean is shown to depend asymptotically only on the behavior of the spectral density near the origin. Asymptotically efficient estimators based only on this behavior can be chosen. For spectral densities behaving like $\lambda^\nu$ at the origin, where $\nu > -1$ is a constant, the minimum variance decreases like $n^{-\nu-1}$, where $n$ is the sample size. Asymptotically efficient estimators depending on $\nu$ are given. Finally, the consequences of over- or under-estimating the value of $\nu$ in choosing an estimator are considered.
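As a numerical illustration (not part of the paper), the stated $n^{-\nu-1}$ decay can be checked directly for the model spectral density $f(\lambda) = |\lambda|^\nu$ on $[-\pi, \pi]$: the autocovariances are $r(k) = 2\int_0^\pi \lambda^\nu \cos(k\lambda)\,d\lambda$, the covariance matrix $\Sigma_n$ is Toeplitz in $r$, and the minimum variance is $V_n = (\mathbf{1}'\Sigma_n^{-1}\mathbf{1})^{-1}$. The function names below are invented for the sketch; the log-log slope of $V_n$ against $n$ should approach $-(\nu+1)$.

```python
import numpy as np
from scipy.integrate import quad
from scipy.linalg import toeplitz

def autocovariance(k, nu):
    # r(k) = 2 * int_0^pi lam^nu cos(k*lam) d lam for the model spectral
    # density f(lam) = |lam|^nu on [-pi, pi] (an illustrative choice).
    if k == 0:
        return 2.0 * np.pi ** (nu + 1) / (nu + 1)
    # QUADPACK's oscillatory rule handles the cos(k*lam) factor accurately.
    val, _ = quad(lambda lam: lam ** nu, 0.0, np.pi, weight='cos', wvar=k)
    return 2.0 * val

def blue_variance(n, nu):
    # V_n = 1 / (1' Sigma^{-1} 1): variance of the best linear unbiased
    # estimator of the mean from n consecutive observations.
    r = np.array([autocovariance(k, nu) for k in range(n)])
    sigma = toeplitz(r)          # symmetric Toeplitz covariance matrix
    ones = np.ones(n)
    return 1.0 / (ones @ np.linalg.solve(sigma, ones))

def decay_exponent(nu, ns=(50, 100, 200, 400)):
    # Log-log slope of V_n against n; the theory predicts -(nu + 1).
    vs = [blue_variance(n, nu) for n in ns]
    return np.polyfit(np.log(ns), np.log(vs), 1)[0]

print(f"nu=0: slope {decay_exponent(0.0):+.3f}  (theory -1)")
print(f"nu=1: slope {decay_exponent(1.0):+.3f}  (theory -2)")
```

For $\nu = 0$ the process is white noise and the slope is exactly $-1$; for $\nu = 1$ the process is antipersistent ($f(0) = 0$) and the estimated exponent moves toward $-2$, faster than the $n^{-1}$ rate available from an i.i.d. sequence.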