Approximation Methods which Converge with Probability one
Blum, Julius R.
Ann. Math. Statist., Vol. 25 (1954) no. 4, pp. 382-386 / Harvested from Project Euclid
Let $H(y\mid x)$ be a family of distribution functions depending upon a real parameter $x$, and let $M(x) = \int^\infty_{-\infty} y \, dH(y \mid x)$ be the corresponding regression function. It is assumed that $M(x)$ is unknown to the experimenter, who is, however, allowed to take observations on $H(y\mid x)$ for any value of $x$. Robbins and Monro [1] give a method for defining successively a sequence $\{x_n\}$ such that $x_n$ converges to $\theta$ in probability, where $\theta$ is a root of the equation $M(x) = \alpha$ and $\alpha$ is a given number. Wolfowitz [2] generalizes these results, and Kiefer and Wolfowitz [3] solve a similar problem in the case when $M(x)$ has a maximum at $x = \theta$. Using a lemma due to Loève [4], we show that in both cases $x_n$ converges to $\theta$ with probability one, under weaker conditions than those imposed in [2] and [3]. Further, we solve a similar problem in the case when $M(x)$ is the median of $H(y \mid x)$.
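As a concrete illustration (not part of the original abstract), the Robbins-Monro recursion referred to above can be sketched in a few lines of Python. The particular regression function $M(x) = 2x + 1$, the Gaussian observation noise, and the step sizes $a_n = c/n$ are assumptions chosen for the example, not taken from the paper.

```python
import random

def robbins_monro(observe, alpha, x0=0.0, n_steps=10_000, c=1.0):
    """Robbins-Monro recursion x_{n+1} = x_n - a_n * (y_n - alpha),
    with step sizes a_n = c / n, a standard choice satisfying
    sum a_n = infinity and sum a_n**2 < infinity."""
    x = x0
    for n in range(1, n_steps + 1):
        y = observe(x)                 # noisy observation drawn from H(y | x)
        x = x - (c / n) * (y - alpha)  # move toward the root of M(x) = alpha
    return x

if __name__ == "__main__":
    # Illustrative setup: M(x) = 2x + 1 with additive Gaussian noise,
    # so the root of M(x) = alpha = 5 is theta = 2.
    observe = lambda x: 2 * x + 1 + random.gauss(0.0, 1.0)
    print(robbins_monro(observe, alpha=5.0))  # approaches 2 as n grows
```

The paper's contribution concerns the mode of convergence of such iterates: under conditions weaker than those of Wolfowitz [2] and Kiefer and Wolfowitz [3], the sequence converges to $\theta$ with probability one, not merely in probability.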
Published: 1954-06-14
@article{1177728794,
     author = {Blum, Julius R.},
     title = {Approximation Methods which Converge with Probability one},
     journal = {Ann. Math. Statist.},
     volume = {25},
     number = {4},
     year = {1954},
     pages = {382-386},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1177728794}
}
Blum, Julius R. Approximation Methods which Converge with Probability one. Ann. Math. Statist., Vol. 25 (1954) no. 4, pp. 382-386. http://gdmltest.u-ga.fr/item/1177728794/