Let $X$ be a normal random variable with unit variance and mean either $a/2$ or $-a/2$, where $a$ is a given positive constant. Let $C_m$ $(m \geqq 0)$ denote the class of all two-stage rules with first sample size $m$ for deciding, after two successive samples of independent observations on $X$, which of the two mean values is correct. This paper investigates the class of Bayes rules in $C_m$, parametrized by the a priori probabilities of the hypotheses and by simple wrong-decision losses. The cost per observation is taken throughout to be unity. Section 1 gives some general properties of Bayes rules in $C_m$ for decisions between any two continuous densities for $X$; Sections 2, 3, and 4 concern the densities specified above. Section 2 consists of a detailed development of the properties of the Bayes second sample size in terms of the Bayes parameters and the first sample outcomes. For example, Theorem 2.1 gives non-trivial lower and upper bounds for positive values of the Bayes second sample size corresponding to any fixed value of the minimum wrong-decision loss. In Section 3, sufficient conditions are given under which the losses may be chosen so as to obtain Bayes rules with preassigned invariant error probabilities. (Invariance is taken with respect to changes in the prior probabilities.) It is shown how this result leads to rules which minimize the maximum expected sample size among rules in $C_m$ with error probabilities less than or equal to specified values. An illustrative example is considered for the case in which these specified bounds are equal. The selection of an optimum first sample size for this example is treated in Section 4. The resulting rule possesses the minimax expected sample size property described above among all two-stage rules (of any first sample size) subject to these bounds. Tables are included giving optimum first and second sample sizes and the values of auxiliary functions when this common bound on the error probabilities is .05 or .01.
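For concreteness, the underlying testing problem can be sketched as follows (the labels $H_{\pm}$, the prior $\pi$, and the likelihood-ratio display are illustrative notation, not fixed by the abstract):

$$
H_{-}: X \sim N(-a/2,\, 1), \qquad H_{+}: X \sim N(a/2,\, 1),
$$

with densities $f_{\pm}(x) = (2\pi)^{-1/2} \exp\{-(x \mp a/2)^2/2\}$. If $\pi$ denotes the prior probability of $H_{+}$, then after a first sample $x_1, \ldots, x_m$ the posterior odds in favor of $H_{+}$ reduce to

$$
\frac{\pi}{1-\pi} \prod_{i=1}^{m} \frac{f_{+}(x_i)}{f_{-}(x_i)} \;=\; \frac{\pi}{1-\pi}\, \exp\Big\{a \sum_{i=1}^{m} x_i\Big\},
$$

so the first-stage outcome enters a Bayes rule only through the sample sum, and the second sample size may be chosen as a function of these posterior odds.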