We consider a population composed of two subpopulations whose distributions are given by known univariate distribution functions, $G(x)$ and $H(x)$, respectively. An observation comes from the first subpopulation with probability $\theta$ and from the second with probability $1 - \theta$. We treat $\theta$ as a random variable with a prior distribution on $(0, 1)$ and derive the Bayes rule for classifying $n$ observations as drawn from $G$ or from $H$ when the loss function equals the number of misclassifications. The main results of the paper give the asymptotic properties of the Bayes rule and of several proposed approximations.
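As an illustrative sketch (not the paper's construction), consider the case $n = 1$: under misclassification loss, the Bayes rule reduces to classifying the observation to $G$ whenever $E[\theta]\,g(x) > (1 - E[\theta])\,h(x)$, where $g$ and $h$ are the densities of $G$ and $H$. The code below assumes hypothetical choices of a $\mathrm{Beta}(a, b)$ prior for $\theta$ and normal components $N(0,1)$ and $N(2,1)$ for $G$ and $H$.

```python
# Sketch of the n = 1 Bayes rule under misclassification loss.
# All distributional choices here are illustrative assumptions,
# not the paper's setup.
from scipy.stats import norm

# Hypothetical component densities g and h
g = norm(loc=0.0, scale=1.0).pdf
h = norm(loc=2.0, scale=1.0).pdf

# Prior on the mixing probability theta: Beta(a, b)
a, b = 2.0, 2.0
prior_mean = a / (a + b)  # E[theta] under Beta(a, b)

def classify(x):
    """Classify a single observation x as from G or from H.

    For n = 1 the Bayes rule compares the prior-averaged joint
    weights E[theta] * g(x) and (1 - E[theta]) * h(x).
    """
    return "G" if prior_mean * g(x) > (1 - prior_mean) * h(x) else "H"

print(classify(-0.5))  # well below the decision boundary -> "G"
print(classify(2.5))   # well above the decision boundary -> "H"
```

For $n > 1$ the rule is more involved, since the observations are dependent through the common unknown $\theta$; the posterior of $\theta$ given all $n$ observations then enters each classification.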