Let $H(y\mid x)$ be a family of distribution functions depending upon a real parameter $x,$ and let $M(x) = \int^\infty_{-\infty} y \, dH(y \mid x)$ be the corresponding regression function. It is assumed that $M(x)$ is unknown to the experimenter, who is, however, allowed to take observations on $H(y\mid x)$ for any value $x.$ Robbins and Monro [1] give a method for defining successively a sequence $\{x_n\}$ such that $x_n$ converges to $\theta$ in probability, where $\theta$ is a root of the equation $M(x) = \alpha$ and $\alpha$ is a given number. Wolfowitz [2] generalizes these results, and Kiefer and Wolfowitz [3] solve a similar problem in the case when $M(x)$ has a maximum at $x = \theta.$ Using a lemma due to Loève [4], we show that in both cases $x_n$ converges to $\theta$ with probability one, under weaker conditions than those imposed in [2] and [3]. Further, we solve a similar problem in the case when $M(x)$ is the median of $H(y \mid x).$
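As an illustration of the Robbins-Monro procedure referenced above, the following sketch iterates $x_{n+1} = x_n + a_n(\alpha - y_n),$ where $y_n$ is a noisy observation drawn from $H(y \mid x_n).$ The concrete choices here (regression function $M(x) = x,$ Gaussian noise, gain sequence $a_n = c/n$) are illustrative assumptions, not taken from the original paper:

```python
import random

def robbins_monro(observe, alpha, x0, steps, c=1.0):
    """Robbins-Monro iteration for a root of M(x) = alpha.

    observe(x) returns a noisy observation y drawn from H(y | x),
    with E[y] = M(x).  The gain sequence a_n = c / n satisfies the
    usual conditions sum a_n = inf, sum a_n^2 < inf.
    """
    x = x0
    for n in range(1, steps + 1):
        y = observe(x)              # noisy observation of M(x)
        x = x + (c / n) * (alpha - y)
    return x

# Illustrative model: M(x) = x with unit Gaussian noise, so theta = alpha.
random.seed(0)
alpha = 2.0
estimate = robbins_monro(lambda x: x + random.gauss(0.0, 1.0),
                         alpha, x0=0.0, steps=10_000)
```

With $M(x) = x$ and $a_n = 1/n,$ the iterates reduce to a running average of the observations, so the estimate settles near $\theta = \alpha = 2$ as the step count grows.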