Empirical Bayes estimators that are asymptotically optimal, with rates of convergence, are proposed. In the component problem there is a pair $(X, \omega)$ of real-valued random variables. The Lebesgue density of $X$, conditional on $\omega$, is of the form $u(x)C(\omega)e^{\omega x}$. Based on a realization of $X$, the problem is squared error loss estimation of $\omega$. Let $G$ be a prior distribution on $\omega$, and let $R(G)$ be the Bayes optimal risk with respect to $G$. Using $(X_1, \cdots, X_n)$, the observations in the past $n$ such problems, mean square consistent estimators of the derivative of $\log (\int C(\omega)e^{\omega x}\, dG(\omega))$ are proposed. These statistics, together with the present observation $X$, are then used to exhibit estimators $\psi_n$ for the present problem whose risks $R_n$ converge to the Bayes optimal risk $R(G)$ as $n \rightarrow \infty$. In particular, with no assumption on the smoothness or the form of $u$, a $\psi_n$ is exhibited for each $\gamma$ in $[0, 2)$. Sufficient conditions are given under which $c_1n^{-4/(4+3\gamma)} \leqq R_n - R(G) \leqq c_2n^{-2\gamma/(4+3\gamma)}$, where $c_1$ and $c_2$ are positive constants. The right-hand inequality holds uniformly in $G$ with support in a bounded interval of the real line, while the left-hand inequality holds for a $G$ degenerate at a point and for all $n$ sufficiently large. (Thus, with $\gamma$ close to $2$, $\psi_n$ achieves nearly the exact rate.) Examples of families, including one whose $u$ function has infinitely many discontinuities, are given in which the conditions for the above inequalities are satisfied for $\gamma$ arbitrarily close to $2$.
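The role of the derivative of $\log (\int C(\omega)e^{\omega x}\, dG(\omega))$ comes from a standard exponential-family identity, which the abstract leaves implicit. Under squared error loss the Bayes rule is the posterior mean, and the posterior of $\omega$ given $X = x$ has density proportional to $C(\omega)e^{\omega x}$ with respect to $G$; differentiating under the integral sign gives
$$\operatorname{E}[\omega \mid X = x] = \frac{\int \omega\, C(\omega) e^{\omega x}\, dG(\omega)}{\int C(\omega) e^{\omega x}\, dG(\omega)} = \frac{d}{dx} \log \left( \int C(\omega) e^{\omega x}\, dG(\omega) \right),$$
so a mean square consistent estimate of this derivative, evaluated at the present observation $X$, is exactly what is needed to approach the Bayes risk $R(G)$.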
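As a concrete illustration of this scheme (a minimal sketch, not the paper's actual construction or rates), consider the special case $u(x) = e^{-x^2/2}/\sqrt{2\pi}$ and $C(\omega) = e^{-\omega^2/2}$, i.e. $X \mid \omega \sim N(\omega, 1)$, where the identity above reduces to Tweedie's formula $\operatorname{E}[\omega \mid x] = x + f_G'(x)/f_G(x)$ with $f_G$ the marginal density of $X$. The prior $G = N(0,1)$, the Gaussian-kernel estimates of $f_G$ and $f_G'$, the bandwidth $h = 0.3$, and the sample size below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical special case: X | omega ~ N(omega, 1), i.e.
# u(x) = exp(-x^2/2)/sqrt(2*pi), C(omega) = exp(-omega^2/2).
# Prior G = N(0, 1), for which the exact Bayes rule is E[omega | x] = x/2.
n = 20000
omega = rng.normal(0.0, 1.0, size=n)   # latent parameters drawn from G
X = rng.normal(omega, 1.0)             # past observations X_1, ..., X_n

def psi_n(x, X, h=0.3):
    """Empirical Bayes estimate at x via Tweedie's formula
    x + f_G'(x)/f_G(x), with f_G and f_G' replaced by Gaussian-kernel
    estimates built from the past observations.
    The bandwidth h = 0.3 is an arbitrary illustrative choice."""
    z = (x - X) / h
    k = np.exp(-0.5 * z**2)            # unnormalized Gaussian kernel values
    f = k.sum()                        # proportional to an estimate of f_G(x)
    fprime = -(z * k).sum() / h        # same constant times an estimate of f_G'(x)
    return x + fprime / f              # the common constant cancels in the ratio

print(psi_n(1.5, X))                   # should be close to the Bayes value 0.75
```

The paper's estimators handle general, possibly non-smooth $u$ and carry the rate statements above; this snippet only illustrates the general idea of plugging a consistent estimate of the log-marginal derivative into the Bayes rule.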