If $x$ is a random variable with mean zero and variance $\sigma^2$, then, according to Chebyshev's inequality, $P\{|x| \geqq 1\} \leqq \sigma^2$. The corresponding one-sided inequality $P\{x \geqq 1\} \leqq \sigma^2/(\sigma^2 + 1)$ is also known (see, e.g., [2, p. 198]). Both inequalities are sharp. A generalization of Chebyshev's inequality was obtained by Olkin and Pratt [1] for $P\{|x_1| \geqq 1 \text{ or } \cdots \text{ or } |x_k| \geqq 1\}$, where $Ex_i = 0$, $Ex_i^2 = \sigma^2$, $Ex_i x_j = \sigma^2 \rho$ $(i \neq j)$, $i, j = 1, \cdots, k$; we give here the corresponding generalization of the one-sided inequality, and we also consider the case where only means and variances are known.

To obtain an upper bound for $P\{x \in T\} \equiv P\{x_1 \geqq 1 \text{ or } \cdots \text{ or } x_k \geqq 1\}$, we consider a nonnegative function $f(x) \equiv f(x_1, \cdots, x_k)$ such that $f(x) \geqq 1$ for $x \in T$. Then $Ef(x) \geqq \int_{\{x \in T\}} f(x) \, dP \geqq P\{x \in T\}$. Since the bound is to be a function of the covariance matrix $\Sigma$, $f(x)$ must be of the form $(x - a)A(x - a)'$, where $a = (a_1, \cdots, a_k)$ and $A = (a_{ij})$ is $k \times k$. A "best" bound is one which minimizes $Ef(x) = \operatorname{tr} A(\Sigma + a'a)$, subject to $f(x) \geqq 0$ for all $x$ and $f(x) \geqq 1$ on $T$.
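
To illustrate the recipe (this sketch is ours and uses an auxiliary parameter $u$ not in the original), the case $k = 1$ already recovers the one-sided bound quoted above. Take $f(x) = \bigl((x + u)/(1 + u)\bigr)^2$ with $u > 0$; this is nonnegative everywhere and satisfies $f(x) \geqq 1$ for $x \geqq 1$. Using $Ex = 0$ and $Ex^2 = \sigma^2$,
\[
Ef(x) = \frac{E(x + u)^2}{(1 + u)^2} = \frac{\sigma^2 + u^2}{(1 + u)^2},
\]
which is minimized at $u = \sigma^2$, giving
\[
P\{x \geqq 1\} \leqq \frac{\sigma^2 + \sigma^4}{(1 + \sigma^2)^2} = \frac{\sigma^2}{1 + \sigma^2}.
\]
This $f$ is of the stated quadratic form with $a = -u$ and $A = (1 + u)^{-2}$.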