The general stochastic approximation procedure $X_{n+1} = X_n - a_n c_n^{-1}h(Y_n),\quad n = 1,2, \cdots$ is considered, where $h$ is a Borel measurable transformation applied to the random observations $Y_n$. Under mild conditions on $h$ and on the error random variables, the asymptotic properties of the procedure, namely a.s. convergence and asymptotic normality, are studied. The analysis is confined to the case where the error random variables are (conditionally) distributed according to a distribution function $G$ which is symmetric about 0 and admits a density $g$. The optimal choices of the design sequences $a_n$ and $c_n$, as well as of the transformation $h$, are studied. The optimal transformation turns out to equal $-C(g'/g)$ (a.e. with respect to $G$) for some $C > 0$, and it is the transformation which minimizes the second moment of the asymptotic distribution of $n^\beta(X_n - \theta)$. The Robbins-Monro and the Kiefer-Wolfowitz situations are emphasized as special cases. With the optimal transformation, the proposed generalized procedure is shown to yield asymptotically efficient estimators.
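The recursion above can be sketched in code. The following is a minimal illustration, not the paper's construction: it assumes the Robbins-Monro special case with $c_n \equiv 1$, a regression function $M(x) = x - \theta$ (hypothetical, for illustration), and Laplace errors, for which the score $-g'/g$ is the sign function (a.e.), so the optimal transformation reduces to the well-known sign procedure. The function names `generalized_sa` and `observe` are invented here.

```python
import math
import random

def generalized_sa(observe, h, x0, a=lambda n: 1.0 / n,
                   c=lambda n: 1.0, n_steps=10000):
    """Run X_{n+1} = X_n - a_n * c_n^{-1} * h(Y_n), the recursion in the abstract.

    observe(x) returns a noisy observation Y_n taken at the current iterate x;
    h is the (Borel measurable) transformation applied to Y_n.
    """
    x = x0
    for n in range(1, n_steps + 1):
        x = x - a(n) * h(observe(x)) / c(n)
    return x

# Illustration: root-finding for M(x) = x - theta observed with symmetric
# Laplace errors.  For the Laplace density g(y) = exp(-|y|)/2 the score
# -g'(y)/g(y) equals sign(y) (a.e.), so the optimal h is the sign transformation.
random.seed(0)
theta = 2.0

def observe(x):
    noise = random.choice([-1.0, 1.0]) * random.expovariate(1.0)  # Laplace(0, 1)
    return (x - theta) + noise

estimate = generalized_sa(observe, h=lambda y: math.copysign(1.0, y), x0=0.0)
print(estimate)  # converges to a neighborhood of theta = 2.0
```

With $a_n = a/n$ and $a$ chosen so that $2a \cdot 2g(0)M'(\theta) > 1$, the sketch matches the $n^{1/2}$-rate regime discussed in the abstract; the Kiefer-Wolfowitz case would instead use finite differences with a nonconstant $c_n$.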