Rényi [19] gives a set of seven postulates that a measure of dependence for a pair of random variables should satisfy. Of the dependence measures considered by Rényi, only Gebelein's [5] maximal correlation, $S_P$, satisfies all seven postulates. Kramer [10], in considering the uncertainty principle in Fourier analysis [11], generalizes Gebelein's maximal correlation to arbitrary pairs of $\sigma$-algebras and asks whether this generalization is equivalent, in the sense of preserving order, to Shannon's mutual information $C_P$ [4, 9, 21] for pairs of $\sigma$-algebras.

The object of this note is to compare $S_P$ and the two normalizations $C'_P$ and $C''_P$ of $C_P$ as dependence measures for strictly positive probability spaces (which are necessarily generated by random variables). It is found that for such spaces, under the appropriate finiteness restrictions: (a) (Theorem 5.1) $0 \leqq S_P, C'_P, C''_P \leqq 1$; (b) (Theorem 5.2) $S_P = 0$ iff $C'_P = 0$ iff $C''_P = 0$ iff the random variables are independent; (c) (Theorem 5.4) $S_P = 1$ if the two generated algebras have a nontrivial intersection (the two conditions are equivalent for finite algebras), $C'_P = 1$ iff one of the random variables is a function of the other, and $C''_P = 1$ iff the random variables are functions of each other; consequently, (d) (Theorem 5.5) there exist probability spaces for which the dependence measures are not equivalent.

The paper is divided into six sections. Section 1 contains the introduction and summary. Section 2 introduces the terminology, notation, and preliminaries. Section 3 treats $S_P$ and the Rényi postulates. In Section 4, the basic Shannon-Feinstein-Khinchin mutual information is extended to strictly positive measure spaces, not necessarily finite. The comparison of the dependence measures and the modifications of the postulates are given in Section 5. Finally, Section 6 mentions some extensions and open problems.
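For a concrete sense of the three quantities being compared, the following minimal sketch computes them for a strictly positive finite joint distribution. It relies on two standard facts: the maximal correlation of a finite distribution equals the second-largest singular value of the matrix with entries $P(x,y)/\sqrt{P(x)P(y)}$, and $C_P$ is the usual Shannon mutual information. The particular normalizations used for $C'_P$ and $C''_P$ (division by the minimum and the maximum of the marginal entropies, respectively) are assumptions chosen only to reproduce the extreme-value behaviour stated above and need not coincide in detail with the definitions given in Section 4; the function name `dependence_measures` is illustrative.

```python
import numpy as np

def dependence_measures(P):
    """Illustrative computation for a strictly positive finite joint pmf P.

    P is a 2-D array of joint probabilities (all entries > 0, both margins
    of size >= 2).  Returns (S, C1, C2): the maximal correlation and two
    normalized mutual informations.  The normalizations
    C1 = I / min(H(X), H(Y)) and C2 = I / max(H(X), H(Y)) are assumptions
    matching the stated extreme-value properties, not the paper's exact
    definitions of C'_P and C''_P.
    """
    P = np.asarray(P, dtype=float)
    p = P.sum(axis=1)          # marginal distribution of X
    q = P.sum(axis=0)          # marginal distribution of Y

    # Maximal correlation: second-largest singular value of the
    # matrix Q[x, y] = P(x, y) / sqrt(P(x) P(y)); its largest
    # singular value is always 1.
    Q = P / np.sqrt(np.outer(p, q))
    S = np.linalg.svd(Q, compute_uv=False)[1]

    # Shannon mutual information and marginal entropies (natural log;
    # the base cancels in the normalized quantities).
    I = np.sum(P * np.log(P / np.outer(p, q)))
    Hx = -np.sum(p * np.log(p))
    Hy = -np.sum(q * np.log(q))

    return S, I / min(Hx, Hy), I / max(Hx, Hy)

# Example: a mildly dependent 2x2 distribution.
print(dependence_measures([[0.3, 0.2], [0.2, 0.3]]))
```

For the 2x2 example above the maximal correlation is 0.2 while both normalized mutual informations are roughly 0.03, a small numerical illustration of the fact that the measures can order distributions quite differently.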