Let $ \{Z_{n}, n \ge 1 \}$ be
a single-type supercritical Galton--Watson
process with mean
$EZ_{1} \equiv m$,
initiated by a single ancestor.
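(For orientation, a sketch of the standard construction; the offspring variables $X_{n,i}$ are our illustrative notation, not the paper's: $Z_0 = 1$ and
\[
Z_{n+1} = \sum_{i=1}^{Z_n} X_{n,i}, \qquad n \ge 0,
\]
where the $X_{n,i}$ are i.i.d. copies of $Z_1$. In particular, $Z_{n+1}/Z_n$ is the empirical mean of the generation-$n$ offspring counts, the Lotka--Nagaev estimator of $m$, and it converges almost surely to $m$ on the survival set.)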
This paper studies the large
deviation behavior of the
sequence $\{R_n \equiv \frac{Z_{n+1}}{Z_n}\dvtx n \ge 1 \}$
and establishes a "phase transition"
in rates depending on whether $r$,
the maximal number of moments
possessed by the
offspring distribution,
is less than, equal to, or
greater than the Schröder
constant $\alpha$.
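(For readers unfamiliar with the term: under the common normalization $p_1 \equiv P(Z_1 = 1) > 0$, which we adopt here for illustration and which may differ from the paper's precise hypotheses, the Schr\"oder constant is determined by
\[
p_1 = m^{-\alpha}, \qquad \text{that is,} \qquad \alpha = -\frac{\log p_1}{\log m}.)
\]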
This is done via a careful analysis of the harmonic moments of $Z_n$.
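(A heuristic sketch of why $\alpha$ is the threshold, stated under the normalization above and not as a summary of the actual proof: on the survival set $Z_n \approx W m^n$ with $P(W \le x)$ of order $x^{\alpha}$ near $0$, while $P(Z_n = 1)$ is of order $p_1^{n} = m^{-\alpha n}$. Hence, for $s > 0$,
\[
E\bigl[Z_n^{-s}\bigr] \asymp
\begin{cases}
m^{-sn}, & s < \alpha, \\[2pt]
m^{-\alpha n} \ \text{(up to polynomial factors in $n$)}, & s \ge \alpha,
\end{cases}
\]
and since, conditionally on $Z_n$, $R_n$ is an average of $Z_n$ i.i.d. offspring variables possessing $r$ moments, tail bounds for such averages reduce $P(|R_n - m| > \epsilon)$ to harmonic moments of this kind; the crossover at $s = \alpha$ is what produces the phase transition in $r$.)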