For a pair of random variables $(X, Y)$ on the space $\mathscr{X} \times \mathscr{Y}$ and a positive constant $\lambda$, it is an important problem of information theory to look for subsets $\mathscr{A}$ of $\mathscr{X}$ and $\mathscr{B}$ of $\mathscr{Y}$ such that the conditional probability that $Y$ is in $\mathscr{B}$, given that $X$ is in $\mathscr{A}$, is larger than $\lambda$. In many typical situations, in order to satisfy this condition, $\mathscr{B}$ must be chosen much larger than $\mathscr{A}$. We shall deal with the most frequently investigated case, in which $X = (X_1,\cdots, X_n)$, $Y = (Y_1,\cdots, Y_n)$, and the $(X_i, Y_i)$ are independent, identically distributed pairs of random variables with finite range. Suppose that the distribution of $(X, Y)$ is positive for all pairs of values $(x, y)$. We show that if $\mathscr{A}$ and $\mathscr{B}$ satisfy the above condition with a constant $\lambda$ and the probability of $\mathscr{B}$ goes to 0, then the probability of $\mathscr{A}$ goes to 0 even faster. Generalizations and some exact estimates of the exponents of the probabilities are given. Our methods reveal an interesting connection with a so-called hypercontraction phenomenon in theoretical physics.
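In symbols, the condition on the pair of sets and the asymptotic comparison described above may be sketched as follows; the second display is only one natural reading of "goes to 0 even faster" and is not a precise statement of the results proved below:
\[
\Pr\{\, Y \in \mathscr{B} \mid X \in \mathscr{A} \,\} > \lambda,
\qquad
\Pr\{\, X \in \mathscr{A} \,\} = o\bigl(\Pr\{\, Y \in \mathscr{B} \,\}\bigr)
\quad (n \to \infty).
\]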