Consider the problem of estimating the location parameter $\theta \in R^d$ based on a sample of size $n$ from $(\theta + X, Y)$, where $X$ is a $d$-dimensional random vector, $Y$ is a random element of some measurable space, and $(X, Y)$ has a known distribution. Let $\mathscr{J}^-$ denote the corresponding inverse Fisher information matrix. We show that there is always an invariant estimator $\hat{\theta}_n$ such that $\mathscr{L}(n^{\frac{1}{2}}(\hat{\theta}_n - \theta)) \rightarrow N(0, \mathscr{J}^-)$ as $n \rightarrow \infty$. Let $\rho$ be a fixed probability density on $R^d$, let $\tilde{\theta}_n$ be any estimator of $\theta$, and set $R_n(c) = \int \rho(\theta)\, E_\theta \min (c, n|\tilde{\theta}_n - \theta|^2)\, d\theta$. We show that $\lim_{c\rightarrow\infty} \liminf_{n\rightarrow\infty} R_n(c) \geqq \operatorname{trace} \mathscr{J}^-$ and that if $\lim_{c\rightarrow\infty} \limsup_{n\rightarrow\infty} R_n(c) = \operatorname{trace} \mathscr{J}^-$, then $\lim_{n\rightarrow\infty} \int \rho(\theta)\, P_\theta(n^{\frac{1}{2}}|\tilde{\theta}_n - \hat{\theta}_n| \geqq c)\, d\theta = 0$ for all $c > 0$. These results are obtained with no regularity conditions imposed on the distribution of $(X, Y)$.
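
To make the efficiency bound concrete, here is a minimal numerical sketch (not from the paper) of the simplest special case: $X \sim N(0, \Sigma)$ with $\Sigma$ known and no auxiliary component $Y$, so that $\mathscr{J} = \Sigma^{-1}$, $\mathscr{J}^- = \Sigma$, and the sample mean is an invariant estimator attaining $\mathscr{L}(n^{\frac{1}{2}}(\hat{\theta}_n - \theta)) \rightarrow N(0, \mathscr{J}^-)$. All identifiers (`Sigma`, `theta_hat`, the chosen dimensions) are illustrative assumptions.

```python
# Sketch (assumed Gaussian special case): check via Monte Carlo that
# n * E|theta_hat_n - theta|^2 matches trace(J^-) = trace(Sigma) when
# theta_hat_n is the sample mean, an invariant estimator.
import numpy as np

rng = np.random.default_rng(0)
d, n, reps = 3, 2000, 5000
Sigma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 0.5]])
theta = np.array([1.0, -2.0, 0.5])           # true location parameter
L = np.linalg.cholesky(Sigma)                # Sigma = L @ L.T

sq_errs = np.empty(reps)
for r in range(reps):
    X = rng.standard_normal((n, d)) @ L.T    # n draws from N(0, Sigma)
    theta_hat = (theta + X).mean(axis=0)     # invariant estimator: sample mean
    sq_errs[r] = n * np.sum((theta_hat - theta) ** 2)

print("n * E|theta_hat - theta|^2 ~", sq_errs.mean())   # Monte Carlo estimate
print("trace(J^-) = trace(Sigma)  =", np.trace(Sigma))  # efficiency bound
```

In this Gaussian case the identity $n\, E_\theta |\hat{\theta}_n - \theta|^2 = \operatorname{trace} \Sigma$ holds exactly for every $n$, so the Monte Carlo average should agree with $\operatorname{trace}(\Sigma) = 3.5$ up to simulation noise; the content of the theorem is that this trace bound is the asymptotic benchmark in general, with no regularity conditions on $(X, Y)$.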