A Note on the Reciprocal of the Conditional Expectation of a Positive Random Variable
Robertson, Tim
Ann. Math. Statist., Volume 36 (1965) no. 6, pp. 1302-1305 / Harvested from Project Euclid
Brunk [3] discusses conditional expectations given $\sigma$-lattices. This note is concerned with the observation that the reciprocal of the conditional expectation, with respect to a measure $\mu$ and given a $\sigma$-lattice, of a positive random variable $X$ is the conditional expectation of $1/X$ given the complementary $\sigma$-lattice, with respect to another measure. (For the case of a $\sigma$-field this result is equivalent to the familiar result about the reciprocal of a Radon-Nikodym derivative.) Associated with this property of conditional expectations is a mapping of the set of all convex functions on $(0, \infty)$ into itself which leads to alternative ways of formulating certain extremum problems.

Let $(\Omega, \mathfrak{A}, \mu)$ be a measure space with $\mu(\Omega) < \infty$, and let $I_A$ denote the indicator function of a set $A$. We let $\mathscr{L}$ denote a $\sigma$-lattice of subsets of $\Omega$ ($\mathscr{L} \subset \mathfrak{A}$); a $\sigma$-lattice is, by definition, closed under countable unions and intersections and contains both $\Omega$ and the null set $\varnothing$. If $\mathscr{L}$ is such a $\sigma$-lattice, then $\mathscr{L}^c$ denotes the $\sigma$-lattice of all subsets of $\Omega$ which are complements of members of $\mathscr{L}$. We say that a random variable $X$ is $\mathscr{L}$-measurable if $\{X > a\} \in \mathscr{L}$ for each real number $a$. Let $L_2$ denote the class of square-integrable random variables and $L_2(\mathscr{L})$ the class of $\mathscr{L}$-measurable, square-integrable random variables. Later we shall restrict our attention to strictly positive random variables, so for any set $S$ of random variables we let $S^+$ denote the set of those members of $S$ which are strictly positive. Let $\mathscr{B}$ denote the class of Borel subsets of the real line.

The following is one of several available definitions of the conditional expectation, $E_\mu(X \mid \mathscr{L})$, of $X$ given $\mathscr{L}$ (see Brunk [3]).

Definition. If $X \in L_2$, then $Y \in L_2(\mathscr{L})$ is equal to $E_\mu(X \mid \mathscr{L})$ if and only if $Y$ has both of the following properties:
\begin{equation*}\tag{1}\int (X - Y)Z\,d\mu \leqq 0 \quad\text{for each } Z \in L_2(\mathscr{L}),\end{equation*}
\begin{equation*}\tag{2}\int_B (X - Y)\,d\mu = 0 \quad\text{for each } B \in Y^{-1}(\mathscr{B}).\end{equation*}
(Brunk [3] shows the existence of such a $Y$ and that it is unique in the sense that if $W$ is any other member of $L_2(\mathscr{L})$ having these properties, then $W = Y\,\lbrack\mu\rbrack$.)
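As a concrete check on the parenthetical remark about $\sigma$-fields, the following is a minimal sketch, assuming $\mathscr{L} = \mathscr{F}$ is a $\sigma$-field (so that $\mathscr{F}^c = \mathscr{F}$) and taking the second measure to be $d\nu = X\,d\mu$; this choice of $\nu$ is an assumption made here for illustration, since the abstract does not reproduce the note's construction for general $\sigma$-lattices. Write $Y = E_\mu(X \mid \mathscr{F})$; since $X > 0$, also $Y > 0\,\lbrack\mu\rbrack$, and for every $B \in \mathscr{F}$ (using that $I_B/Y$ is a nonnegative $\mathscr{F}$-measurable function),
\begin{equation*}\int_B \frac{1}{Y}\,d\nu = \int_B \frac{X}{Y}\,d\mu = \int_B \frac{1}{Y}\,E_\mu(X \mid \mathscr{F})\,d\mu = \int_B d\mu = \mu(B),\end{equation*}
while
\begin{equation*}\int_B E_\nu\left(\frac{1}{X}\,\Big|\,\mathscr{F}\right)\,d\nu = \int_B \frac{1}{X}\,d\nu = \int_B \frac{1}{X}\,X\,d\mu = \mu(B).\end{equation*}
Both $1/Y$ and $E_\nu(1/X \mid \mathscr{F})$ are $\mathscr{F}$-measurable and have the same (finite) integral over every $B \in \mathscr{F}$, so $1/E_\mu(X \mid \mathscr{F}) = E_\nu(1/X \mid \mathscr{F})\,\lbrack\nu\rbrack$, which is the Radon-Nikodym reciprocal identity referred to above.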
Published: 1965-08-14
@article{1177700004,
     author = {Robertson, Tim},
     title = {A Note on the Reciprocal of the Conditional Expectation of a Positive Random Variable},
     journal = {Ann. Math. Statist.},
     volume = {36},
     number = {6},
     year = {1965},
     pages = {1302--1305},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1177700004}
}
Robertson, Tim. A Note on the Reciprocal of the Conditional Expectation of a Positive Random Variable. Ann. Math. Statist., Volume 36 (1965) no. 6, pp. 1302-1305. http://gdmltest.u-ga.fr/item/1177700004/