Two examples are presented. In each, $p$ independent normal random variables with unit variance are observed. It is desired to estimate the unknown means, $\theta_i$, under a loss of the form $L(\theta, a) = \left(\sum_{i=1}^{p} \nu(\theta_i)\right)^{-1} \sum_{i=1}^{p} \nu(\theta_i)(\theta_i - a_i)^2$. The usual estimator, $\delta_0(x) = x$, is minimax with constant risk. In the first example $\nu(t) = e^{rt}$. It is shown that when $r \neq 0$, $\delta_0$ is inadmissible if and only if $p \geqslant 2$, whereas when $r = 0$ (the case of ordinary squared-error loss, since then $\nu \equiv 1$) it is known to be inadmissible if and only if $p \geqslant 3$. In the second example $\nu(t) = (1 + t^2)^{r/2}$. It is shown that $\delta_0$ is inadmissible if $p > (2 - r)/(1 - r)$ and admissible if $p < (2 - r)/(1 - r)$. (In particular, $\delta_0$ is admissible for all $p$ when $r \geqslant 1$ and only for $p = 1$ when $r < 0$.) In the first example the first-order qualitative description of the better estimator, when $\delta_0$ is inadmissible, depends on $r$; in the second example it does not. An example closely related to the first, and of greater significance in applications, has been described by J. Berger.
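To make the setup concrete, the following sketch (illustrative only; the function names and simulation parameters are not from the paper) evaluates the weighted loss above, checks by simulation that $\delta_0(x) = x$ has constant risk $1$, and tabulates the dimension cutoff $(2 - r)/(1 - r)$ from the second example:

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(theta, a, nu):
    # Weighted quadratic loss L(theta, a) defined in the abstract.
    w = nu(theta)
    return np.sum(w * (theta - a) ** 2) / np.sum(w)

# The two weight functions studied; r is the free parameter.
r = 0.5
nu_exp = lambda t: np.exp(r * t)                # first example
nu_poly = lambda t: (1.0 + t ** 2) ** (r / 2)   # second example

# Monte Carlo risk of the usual estimator delta_0(x) = x at a fixed
# theta.  Since x_i - theta_i ~ N(0, 1), the term nu(theta_i) *
# (theta_i - x_i)^2 has expectation nu(theta_i), so the normalized
# risk equals 1 for every theta and every weight function -- i.e.,
# delta_0 has constant risk.
p, n_rep = 3, 100_000
theta = np.zeros(p)
x = theta + rng.standard_normal((n_rep, p))
risk = np.mean([loss(theta, xi, nu_exp) for xi in x])
print(f"estimated risk of delta_0: {risk:.3f}")  # ~ 1.0

# Dimension cutoff (2 - r)/(1 - r) from the second example: delta_0
# is inadmissible above it and admissible below it.  r = 0 recovers
# the classical squared-error result: inadmissible iff p >= 3.
for r_val in (-1.0, 0.0, 0.5):
    print(f"r = {r_val:+.1f}: cutoff p = {(2 - r_val) / (1 - r_val):.2f}")
```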