Random variables $X, Y_1, Y_2, \cdots$ are available for observation, with $X$ real-valued and $Y_1, Y_2, \cdots$ taking values in arbitrary spaces. The distribution of $Y = (Y_1, Y_2, \cdots)$ is $\mu_j$ for some $j \in \{1, \cdots, r\}$, and the conditional density of $X$ with respect to Lebesgue measure, given $Y_i = y_i$ $(i = 1, \cdots, n - 1)$, is $p_{jn}(x - \theta, y)$, where $y = (y_1, y_2, \cdots)$. The parameters $j$ and $\theta$ are unknown. A decision $k \in \{1, \cdots, m\}$ is to be made, with loss $W(j, k, n, y)$ when $n$ observations are taken. Following Brown's (1966) methods, admissibility is proved for the decision procedure that is Bayes within the class of invariant procedures. The result contains that of Lehmann and Stein (1953) as a special case.
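As a sketch of the invariance structure implicit in this setup (not taken from the source; the prior $\lambda$ and the constancy-of-risk step are standard assumptions for translation families, with the loss $W$ not depending on $\theta$): a procedure $\delta$ is translation invariant if shifting the $X$-observation leaves the decision unchanged,
\[
\delta(x + c,\, y) = \delta(x,\, y) \qquad \text{for all } c \in \mathbb{R},
\]
and for such $\delta$ the risk does not depend on the location parameter,
\[
R(\theta, j, \delta) = R(0, j, \delta) \qquad \text{for all } \theta.
\]
A procedure that is Bayes within the class of invariant procedures, relative to a prior $\lambda = (\lambda_1, \cdots, \lambda_r)$ on $j$, therefore minimizes
\[
\sum_{j=1}^{r} \lambda_j\, R(0, j, \delta)
\]
over all invariant $\delta$; it is the admissibility of such a procedure, within the class of all (not merely invariant) procedures, that the abstract asserts.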