Let $y = \theta + e$, where $\theta$ and $e$ are independent random variables, so that the regression of $y$ on $\theta$ is linear and the conditional distribution of $y$ given $\theta$ is homoscedastic. We find prior distributions of $\theta$ that induce a linear regression of $\theta$ on $y$. If, in addition, the conditional distribution of $\theta$ given $y$ is homoscedastic (or weakly so), then $\theta$ has a normal distribution. The result is generalized to the Gauss-Markoff model $\mathbf{Y} = \mathbf{X\theta} + \mathbf{\varepsilon}$, where $\mathbf{\theta}$ and $\mathbf{\varepsilon}$ are independent vector random variables. Suppose $\bar{y}_i$ is the average of $p$ observations drawn from the $i$th normal population with mean $\theta_i$ and variance $\sigma_0^2$, for $i = 1,\cdots, k$, and the problem is the simultaneous estimation of $\theta_1,\cdots, \theta_k$. An estimator alternative to that of James and Stein is obtained and shown to have some advantage.
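The abstract does not specify the alternative estimator, so no attempt is made to reproduce it here. As a point of reference for the simultaneous-estimation setup it describes (averages $\bar{y}_i \sim N(\theta_i, \sigma_0^2/p)$ for $i = 1,\ldots,k$), the following is a minimal sketch of the classical James–Stein shrinkage estimator against which the paper's proposal is compared, with a small Monte Carlo check of its total squared-error risk; the particular values of $k$, $p$, $\sigma_0^2$, and the simulated means are illustrative assumptions only.

```python
import math
import random

def james_stein(y_bar, var):
    """Classical James-Stein estimator for k >= 3 normal means.

    Each y_bar[i] is treated as N(theta_i, var); all components are
    shrunk toward zero by a common data-dependent factor.
    """
    k = len(y_bar)
    norm_sq = sum(y * y for y in y_bar)
    shrink = 1.0 - (k - 2) * var / norm_sq
    return [shrink * y for y in y_bar]

# Illustrative Monte Carlo: compare total squared error with the
# unshrunken sample means (the usual estimator).
rng = random.Random(0)
k, p, sigma0_sq = 10, 4, 1.0          # hypothetical choices
var = sigma0_sq / p                   # variance of each average y_bar_i
theta = [rng.gauss(0.0, 1.0) for _ in range(k)]  # hypothetical true means
sse_mle = sse_js = 0.0
reps = 2000
for _ in range(reps):
    y_bar = [t + rng.gauss(0.0, math.sqrt(var)) for t in theta]
    est = james_stein(y_bar, var)
    sse_mle += sum((y - t) ** 2 for y, t in zip(y_bar, theta))
    sse_js += sum((e - t) ** 2 for e, t in zip(est, theta))
print(sse_js < sse_mle)  # shrinkage should reduce accumulated risk here
```

For $k \ge 3$ the James–Stein estimator dominates the vector of sample means in total mean squared error, which is the benchmark any alternative estimator of $\theta_1,\ldots,\theta_k$ must be measured against.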