For any chance variable $x = (x_1,\ldots,x_N)$ having a known distribution, the translation parameter estimation problem is to estimate an unknown constant $h$, having observed $y = (x_1 + h,\ldots,x_N + h)$. Extending the work of Pitman [2], Girshick and Savage [1] have, for any loss function depending only on the error of estimate, described an estimate whose risk is a constant $R$ independent of $h$, and have shown that under certain hypotheses their estimate is minimax. We investigate whether the Girshick-Savage estimate is admissible, i.e., whether it is impossible to find an estimate with risk $R(h) \leq R$ for all $h$ and actual inequality for some $h$. We consider only bounded discrete variables $x$, and show that, if all values of $x$ have integer coordinates and if the loss $f(d)$ from an error $d$ is, for instance, strictly convex and attains its minimum value, the Girshick-Savage estimate is admissible. Two examples in which the Girshick-Savage estimate is not admissible are given.
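For orientation, here is a minimal sketch of a translation-invariant estimate of the Pitman type in the integer case; the notation ($p$, $d(y)$, and the restriction of $h$ to the integers) is introduced only for illustration and is not taken from [1]. Writing $p(v) = \Pr\{x = v\}$ for the known probability function, one may take $d(y)$ to be a value of $d$ minimizing
\[
\sum_{h} f(d - h)\, p(y_1 - h, \ldots, y_N - h),
\]
the sum running over all integers $h$, only finitely many terms of which are positive since $x$ is bounded. Because this rule satisfies $d(y_1 + c,\ldots,y_N + c) = d(y_1,\ldots,y_N) + c$ for every integer $c$, its risk $E\,f(d(y) - h)$ is the same constant for every integer $h$, which is the constant-risk property ascribed to the Girshick-Savage estimate above.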