Suppose the observed random variable $X$ is the sum of two independent random variables $Z$ and $Y,$ where $Z$ has a normal distribution with zero expectation and a known variance $\sigma^2,$ and $Y$ has a distribution function, say $G(y),$ which is completely unknown. Then the distribution function of $X$ may be written as \begin{equation*}\tag{1.1} F(x) = \frac{1}{\sigma\sqrt{2\pi}}\int^\infty_{-\infty}G(y)\exp\big\lbrack - \frac{(x - y)^2}{2\sigma^2}\big\rbrack dy,\end{equation*} where $F(x)$ and $G(y)$ are unknown. We consider here the problem of estimating $G(y)$ from a sample $x_1, x_2, \cdots, x_n.$

Such a problem may arise if, for example, each $x_i$ represents a serum cholesterol determination on one human being randomly selected from some population. Then each $x_i$ may be thought of as the true cholesterol value for that person, plus an "instrumental error" introduced by the complex chemical analysis. We may wish to "correct" for the instrumental error, so to speak, by estimating the distribution of true cholesterol levels in the population.

The maximum likelihood and minimum distance principles do not seem to yield estimators which may be expressed as explicit, more or less easily computable functions of the sample values. We present such an estimator, which is consistent at every continuity point of $G(y).$ (We consider only continuity points throughout the paper.) The estimator is constructed by first exhibiting an inversion formula for $G(y)$ in terms of the derivatives of $F(x),$ and then replacing the derivatives by difference quotients of the empiric distribution function $F_n(x).$ The asymptotic mean square error of the estimator is derived, and a rough rule is suggested for deciding when it is worthwhile to compute an estimate. We also indicate that the estimator remains consistent under certain kinds of dependence between $Z$ and $Y.$ Finally, some comments are made on the relationship between the present estimator and one derived by Eddington.
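For the reader's orientation we record two elementary computations; the second is heuristic and is not necessarily the exact inversion formula adopted below. First, (1.1) follows from the independence of $Z$ and $Y$: conditioning on $Y$ gives \begin{equation*} F(x) = P(Z + Y \leqq x) = \int^\infty_{-\infty}\Phi\Big(\frac{x - y}{\sigma}\Big)\, dG(y),\end{equation*} where $\Phi$ denotes the standard normal distribution function, and an integration by parts yields (1.1). Second, an inversion of the type alluded to above may be illustrated, in the spirit of Eddington's correction, by writing (1.1) as $F = \varphi_\sigma \ast G,$ with $\varphi_\sigma$ the normal density, and regarding convolution with $\varphi_\sigma$ formally as the heat operator $\exp\big\lbrack(\sigma^2/2)D^2\big\rbrack,$ $D = d/dy$; inverting this operator suggests the formal expansion \begin{equation*} G(y) = F(y) - \frac{\sigma^2}{2}F''(y) + \frac{1}{2!}\Big(\frac{\sigma^2}{2}\Big)^2 F^{(4)}(y) - \cdots,\end{equation*} whose truncations, with the derivatives replaced by difference quotients of $F_n(x),$ indicate the general character of the estimator described.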