Asymptotically optimal (a.o.) empirical Bayes (EB) estimators are proposed. The speeds, and the best possible speed, at which these estimators are a.o. are investigated. The underlying component problem is the squared error loss estimation of $\theta$ based on an observation $X$ whose conditional (on $\theta$) pdf is of the form $u(x)C(\theta)\exp(\theta x)$. The function $u$ may have infinitely many discontinuities; $\theta$ is distributed according to an unknown and unspecified $G$ with support in $\Theta$, and $\Theta$ may be unbounded. Using $n$ independent past experiences of the component problem, EB estimators $\phi_n$ for the present problem are exhibited for each integer $r > 1$. The risks $R(\phi_n, G)$ due to $\phi_n$ are shown to converge to the minimum Bayes risk $R(G)$. In particular, for each $\delta$ in $[r^{-1}, 1]$, sufficient conditions are given under which $c_1 n^{-2(r - 1)/(1 + 2r)} \leqslant R(\phi_n, G) - R(G) \leqslant c_2 n^{-2(\delta r - 1)/(1 + 2r)}$, where $c_1$ and $c_2$ are positive constants. The right-hand inequality holds uniformly in $G$ satisfying certain conditions, while the left-hand inequality holds at all degenerate $G$ and for all large $n$. (Thus, with $\delta$ close to one, $\phi_n$ achieves nearly the exact rate.) Examples of exponential families, including the normal, the gamma, and one whose pdfs have infinitely many discontinuities, are given in which the conditions for the above inequalities are satisfied uniformly in $G$ with $\int|\theta|^{2r\delta}\,dG(\theta) < \infty$.
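
To make the assumed form concrete, consider the normal family, which the abstract names among its examples. The display below is an illustrative sketch, not part of the original abstract: for $X \mid \theta \sim N(\theta, 1)$, the conditional pdf factors exactly as $u(x)C(\theta)\exp(\theta x)$, with $u$ and $C$ as labeled,
$$
\frac{1}{\sqrt{2\pi}}\, e^{-(x-\theta)^2/2}
= \underbrace{\frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}}_{u(x)} \cdot \underbrace{e^{-\theta^2/2}}_{C(\theta)} \cdot e^{\theta x},
$$
since $-(x-\theta)^2/2 = -x^2/2 - \theta^2/2 + \theta x$. Here $u$ happens to be continuous; the abstract's point is that the results also cover families in which $u$ has infinitely many discontinuities.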