Let $\{X_n: n \geqslant 1\}$ be a sequence of i.i.d. random variables with bounded continuous density or probability mass function $f(x)$. Suppose $E(\exp(\alpha|X_1|^\beta)) < \infty$ for some $\alpha > 0$ and $0 < \beta \leqslant 1$, let $\mu = E(X_1)$ and $c_n = o(n^{1/(2 - \beta)})$, and let $h$ be a measurable function such that $M = E(|h(X_1)|\exp(\alpha|X_1|^\beta)) < \infty$. Then $$E(h(X_1)\mid X_1 + \cdots + X_n = n\mu + c_n) = E(h(X_1)) + M\cdot O\big(\tfrac{1 + |c_n|}{n}\big)$$ uniformly in $h$. It follows that $$\|\mathscr{L} (X_1\mid X_1 + \cdots + X_n = n\mu + c_n) - \mathscr{L}(X_1)\|_{\operatorname{Var}} = O\big(\tfrac{1 + |c_n|}{n}\big).$$ Applications are given to the binomial-Poisson convergence theorem, spacings, and statistical mechanics.
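For example (a worked instance of the binomial-Poisson application, sketched here using only the standard fact that a Poisson sample conditioned on its sum is binomially distributed): if the $X_i$ are i.i.d.\ $\mathrm{Poisson}(\lambda)$, then for any integer $k$ $$\mathscr{L}(X_1 \mid X_1 + \cdots + X_n = k) = \mathrm{Bin}(k, 1/n),$$ so taking $k = n\lambda + c_n$ (an integer) the theorem yields the classical binomial-to-Poisson rate $$\|\mathrm{Bin}(n\lambda + c_n, 1/n) - \mathrm{Poisson}(\lambda)\|_{\operatorname{Var}} = O\big(\tfrac{1 + |c_n|}{n}\big).$$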