A One-Sided Inequality of the Chebyshev Type
Marshall, Albert W. ; Olkin, Ingram
Ann. Math. Statist., Volume 31 (1960) no. 4, p. 488-491 / Harvested from Project Euclid
If $x$ is a random variable with mean zero and variance $\sigma^2$, then, according to Chebyshev's inequality, $P\{|x| \geqq 1\} \leqq \sigma^2$. The corresponding one-sided inequality $P\{x \geqq 1\} \leqq \sigma^2/(\sigma^2 + 1)$ is also known (see, e.g., [2, p. 198]). Both inequalities are sharp. A generalization of Chebyshev's inequality was obtained by Olkin and Pratt [1] for $P\{|x_1| \geqq 1 \text{ or } \cdots \text{ or } |x_k| \geqq 1\}$, where $Ex_i = 0$, $Ex_i^2 = \sigma^2$, $Ex_ix_j = \sigma^2\rho$ $(i \neq j)$, $i, j = 1, \cdots, k$; we give here the corresponding generalization of the one-sided inequality, and we consider also the case where only means and variances are known. To obtain an upper bound for $P\{x \in T\} \equiv P\{x_1 \geqq 1 \text{ or } \cdots \text{ or } x_k \geqq 1\}$, we consider a nonnegative function, $f(x) \equiv f(x_1, \cdots, x_k)$, such that $f(x) \geqq 1$ for $x \in T$. Then $Ef(x) \geqq \int_{\{x \in T\}} f(x)\, dP \geqq P\{x \in T\}$. Since the bound is to be a function of the covariance matrix $\Sigma$, $f(x)$ must be of the form $(x - a)A(x - a)'$, where $a = (a_1, \cdots, a_k)$ and $A = (a_{ij}): k \times k$. A "best" bound is one which minimizes $Ef(x) = \operatorname{tr} A(\Sigma + a'a)$, subject to $f(x) \geqq 0$ and $f(x) \geqq 1$ on $T$.
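The scalar one-sided (Cantelli) bound and its sharpness can be checked numerically. The sketch below is illustrative and not from the paper: it compares the bound $\sigma^2/(\sigma^2 + t^2)$ for $P\{x \geqq t\}$ against a Monte Carlo estimate for a zero-mean normal variable (an assumed distribution chosen only for illustration), and verifies by exact arithmetic that the two-point distribution attaining equality has the stated mean and variance.

```python
import random

def cantelli_bound(sigma2: float, t: float = 1.0) -> float:
    """One-sided (Cantelli) bound: P{x >= t} <= sigma^2 / (sigma^2 + t^2)
    for a random variable with mean zero and variance sigma^2."""
    return sigma2 / (sigma2 + t * t)

# Monte Carlo check with a zero-mean normal variable (illustrative choice).
random.seed(0)
sigma = 0.5
samples = [random.gauss(0.0, sigma) for _ in range(100_000)]
p_hat = sum(s >= 1.0 for s in samples) / len(samples)
bound = cantelli_bound(sigma ** 2)
assert p_hat <= bound  # empirical tail probability respects the bound

# Sharpness: the two-point distribution taking the value 1 with probability
# sigma^2/(1 + sigma^2) and the value -sigma^2 with probability 1/(1 + sigma^2)
# has mean 0 and variance sigma^2, and attains the bound with equality.
s2 = sigma ** 2
p1 = s2 / (1.0 + s2)                        # P{x = 1}
mean = 1.0 * p1 + (-s2) * (1.0 - p1)        # equals 0
var = 1.0 ** 2 * p1 + s2 ** 2 * (1.0 - p1)  # E x^2 = sigma^2, since mean = 0
assert abs(mean) < 1e-12 and abs(var - s2) < 1e-12
assert abs(p1 - bound) < 1e-12
```

With $\sigma = 0.5$ the bound is $0.25/1.25 = 0.2$, well above the true normal tail probability; the two-point construction shows no smaller constant can work.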
Published: 1960-06-14
@article{1177705913,
     author = {Marshall, Albert W. and Olkin, Ingram},
     title = {A One-Sided Inequality of the Chebyshev Type},
     journal = {Ann. Math. Statist.},
     volume = {31},
     number = {4},
     year = {1960},
     pages = {488-491},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1177705913}
}