Let $X$ be an $n \times k$ random matrix whose entries are independently normally distributed with common variance $\sigma^2$ and means given by $EX = e\mu' + \theta\lambda',$ where $e$ is the vector in $R^n$ with all coordinates equal to $1$, $\theta \in R^n$, and $\mu, \lambda \in R^k$ with $\sum^k_{j = 1} \lambda^2_j = 1.$ The problem is to estimate $\lambda$, say by $\hat{\lambda},$ under the loss function $1 - (\lambda'\hat{\lambda})^2$ when $\mu$, $\theta$, and $\sigma^2$ are unknown. It is shown that the largest principal component of $X'X - (1/n)X'ee'X$ is the best estimator among those invariant under rotations in $R^k$ and under rotations in $R^n$ that leave $e$ fixed, and that it is admissible.
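The estimator can be computed directly: $X'X - (1/n)X'ee'X$ is the mean-centered cross-product matrix of the columns of $X$, and its largest principal component is the unit eigenvector belonging to its largest eigenvalue. A minimal numerical sketch (assuming numpy; the parameter values and variable names below are illustrative, not from the source):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, sigma = 200, 5, 0.1

# Illustrative true parameters: lambda is a unit vector in R^k,
# mu in R^k and theta in R^n are arbitrary nuisance parameters.
lam = np.zeros(k)
lam[0] = 1.0
mu = rng.normal(size=k)
theta = rng.normal(size=n)
e = np.ones(n)

# X has mean e mu' + theta lam' plus iid N(0, sigma^2) errors.
X = np.outer(e, mu) + np.outer(theta, lam) + sigma * rng.normal(size=(n, k))

# S = X'X - (1/n) X' e e' X : the mean-centered cross-product matrix.
Xe = X.T @ e
S = X.T @ X - np.outer(Xe, Xe) / n

# The estimator is the unit eigenvector for the largest eigenvalue of S.
eigvals, eigvecs = np.linalg.eigh(S)   # eigh sorts eigenvalues ascending
lam_hat = eigvecs[:, -1]

# Loss 1 - (lam' lam_hat)^2 is invariant to the sign of lam_hat.
loss = 1.0 - (lam @ lam_hat) ** 2
print(loss)
```

Note that $\hat{\lambda}$ is only identified up to sign, which is why the loss is stated through $(\lambda'\hat{\lambda})^2$ rather than $\lambda'\hat{\lambda}$.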