The tail of the binomial distribution is defined as
\begin{equation*}\tag{1}
E(n, s, p) = \sum^n_{r=s} \binom{n}{r} p^r (1 - p)^{n-r}
\end{equation*}
with the restriction that $p$ be non-negative and less than (approximately) $s/n$. An estimate of $E(n, s, p)$ can be regarded as the product of two separate estimates: an estimate of the size of the leading term in (1), and an estimate of the ratio of $E(n, s, p)$ to that leading term. This paper is concerned solely with the second estimate. That is, let
\begin{equation*}\tag{2}
R = E(n, s, p) \bigg/ \binom{n}{s} p^s(1 - p)^{n-s};
\end{equation*}
two estimates of $R$ are presented.

In Section 2 an upper bound on $R$ is derived by a geometric-series argument. The resulting bound improves on previous results in that it remains useful even when $p$ is near $s/n$, whereas simpler geometric bounds, such as that of Bahadur [1], either blow up or become excessively large in that case. The bound is given in Theorem 1, Equation (9). Section 3 discusses the error incurred in using this bound. Section 4 gives a normal approximation to $R$, namely $R^\ast$ of Equation (18). Theorem 3 shows that the relative error of this estimate goes to zero as $s$ and $n - s$ go to infinity, provided $0 \leqq p \leqq (s - 1)/(n - 1)$ and under a weak restriction on $s/n$ in the limit. The uniformity of this result in $p$ contrasts with the many normal approximations to $E(n, s, p)$, such as those of Bernstein [2], Feller [6], [7], Uspensky [9], and Camp [3], in which the relative error becomes infinite as $p$ approaches zero. Section 5 presents a brief discussion of the behavior of $R^\ast$.
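
To make the definitions in (1) and (2) concrete, the following is a minimal numerical sketch (not part of the paper; the function names and example parameters are illustrative) that evaluates the tail $E(n, s, p)$ and the ratio $R$ by direct summation, under the assumption $0 < p < s/n$ so that the leading term is positive.

```python
from math import comb

def binomial_tail(n: int, s: int, p: float) -> float:
    """Upper tail E(n, s, p) = sum_{r=s}^{n} C(n, r) p^r (1-p)^(n-r), as in (1)."""
    return sum(comb(n, r) * p**r * (1 - p)**(n - r) for r in range(s, n + 1))

def ratio_R(n: int, s: int, p: float) -> float:
    """Ratio R of the tail to its leading term C(n, s) p^s (1-p)^(n-s), as in (2)."""
    leading = comb(n, s) * p**s * (1 - p)**(n - s)
    return binomial_tail(n, s, p) / leading

# Illustrative values: n = 20, s = 12, p = 0.4, so p < s/n = 0.6.
print(ratio_R(20, 12, 0.4))
```

Such direct summation is only practical for moderate $n$; the bounds and approximations developed in the following sections are intended precisely for the cases where term-by-term evaluation is inconvenient.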