Let $\{X_t\}_{t=1}^\infty$ be a stationary Gaussian time series with zero mean, unit variance, an absolutely summable autocorrelation function, and an at least once differentiable spectral density function that is strictly positive on $[0, \pi]$. In this paper it is shown that, if $M_n$ denotes the maximum of the normalized periodogram of $\{X_1, \ldots, X_n\}$ over the interval $[0, \pi]$, then, almost surely,
\begin{equation*}\tag{1} \liminf_{n\rightarrow\infty} [M_n - 2 \log n + \log \log n] \geq 0 \end{equation*}
and
\begin{equation*}\tag{2} \limsup_{n\rightarrow\infty} [M_n - 2 \log n - 2(\log n)^\delta] = -\infty \end{equation*}
for any $\delta > 0$.
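The following is a minimal numerical sketch of the quantities appearing in (1) and (2), specialized to Gaussian white noise (zero mean, unit variance, spectral density $f(\lambda) = 1/(2\pi)$, which trivially satisfies the hypotheses). The normalization of the periodogram used here, $2\,\lvert\sum_{t=1}^n X_t e^{-it\lambda}\rvert^2 / (2\pi n f(\lambda))$, is an assumption made for illustration and may differ from the paper's convention by a constant; the function name `normalized_periodogram_max` and the choice $\delta = 0.1$ are likewise hypothetical.

```python
import numpy as np

def normalized_periodogram_max(x, pad_factor=8):
    """Maximum over a fine grid of [0, pi] of the (assumed) normalized periodogram
    2 |sum_t x_t e^{-i t lambda}|^2 / (2 pi n f(lambda)), with f = 1/(2 pi) (white noise)."""
    n = len(x)
    m = pad_factor * n                       # zero-padding refines the frequency grid
    dft = np.fft.rfft(x, n=m)                # frequencies 2*pi*k/m, k = 0, ..., m//2, covering [0, pi]
    f = 1.0 / (2.0 * np.pi)                  # spectral density of unit-variance white noise
    return np.max(2.0 * np.abs(dft) ** 2 / (2.0 * np.pi * n * f))

rng = np.random.default_rng(0)
delta = 0.1
for n in (1_000, 10_000, 100_000):
    x = rng.standard_normal(n)               # Gaussian white noise, zero mean, unit variance
    m_n = normalized_periodogram_max(x)
    lower = m_n - 2 * np.log(n) + np.log(np.log(n))       # statistic bounded below in (1)
    upper = m_n - 2 * np.log(n) - 2 * np.log(n) ** delta  # statistic driven to -infinity in (2)
    print(f"n = {n:>7d}   M_n = {m_n:7.3f}   (1)-statistic = {lower:7.3f}   (2)-statistic = {upper:7.3f}")
```

Since (1) and (2) are almost-sure limit statements, a single simulated path at moderate $n$ only suggests the behaviour: the statistic in (1) should settle at or above zero, while the one in (2) drifts downward very slowly because $2(\log n)^\delta$ grows slowly for small $\delta$.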