It is demonstrated that the entropy of statistical mechanics and of
information theory, $S({\bf p}) = -\sum_i p_i \log p_i$, may be viewed as a
measure of correlation. Given a probability distribution on two discrete
variables, $p_{ij}$, we define the correlation-destroying transformation $C:
p_{ij} \to \pi_{ij}$, which creates a new distribution on those same variables
in which no correlation exists between them, i.e. $\pi_{ij} = P_i Q_j$, where
$P_i = \sum_j p_{ij}$ and $Q_j = \sum_i p_{ij}$ are the marginal
distributions. It is then shown that the entropy obeys the relation
$S({\bf p}) \leq S({\boldsymbol \pi}) = S({\bf P}) + S({\bf Q})$, i.e. the
entropy is non-decreasing under these correlation-destroying transformations.
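
As a minimal numerical sketch of the claimed inequality (not part of the
original argument), the following Python fragment builds the
correlation-destroyed distribution $\pi_{ij} = P_i Q_j$ from an illustrative
joint distribution and checks $S({\bf p}) \leq S({\boldsymbol \pi}) =
S({\bf P}) + S({\bf Q})$. The example values and the helper entropy() are
assumptions made here for illustration, and NumPy is assumed available.

    import numpy as np

    # Illustrative joint distribution p_ij on two discrete variables
    # (values chosen here for demonstration; they sum to 1).
    p = np.array([[0.30, 0.10],
                  [0.05, 0.55]])

    def entropy(dist):
        # Shannon entropy S = -sum p log p, skipping zero-probability entries.
        d = dist[dist > 0]
        return -np.sum(d * np.log(d))

    P = p.sum(axis=1)    # marginal of the first variable, P_i = sum_j p_ij
    Q = p.sum(axis=0)    # marginal of the second variable, Q_j = sum_i p_ij
    pi = np.outer(P, Q)  # correlation-destroying transformation: pi_ij = P_i Q_j

    assert np.isclose(entropy(pi), entropy(P) + entropy(Q))  # S(pi) = S(P) + S(Q)
    assert entropy(p) <= entropy(P) + entropy(Q) + 1e-12     # S(p) <= S(pi)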