The Boltzmann/Shannon entropy as a measure of correlation
Van Drie, John H.
arXiv, 0001024
It is demonstrated that the entropy of statistical mechanics and of information theory, $S({\bf p}) = -\sum_i p_i \log p_i$, may be viewed as a measure of correlation. Given a probability distribution $p_{ij}$ on two discrete variables, we define the correlation-destroying transformation $C: p_{ij} \to \pi_{ij}$, which produces a new distribution on the same variables in which no correlation exists between them, i.e. $\pi_{ij} = P_i Q_j$, where $P_i = \sum_j p_{ij}$ and $Q_j = \sum_i p_{ij}$ are the marginals. It is then shown that the entropy obeys the relation $S({\bf p}) \leq S({\bf \pi}) = S({\bf P}) + S({\bf Q})$, i.e. the entropy is non-decreasing under these correlation-destroying transformations.
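The stated inequality is easy to check numerically. The following is a minimal Python sketch, not taken from the paper: the specific joint distribution p and the helper entropy() are illustrative assumptions. It forms the product-of-marginals distribution pi_ij = P_i Q_j and verifies S(p) <= S(pi) = S(P) + S(Q).

    import numpy as np

    def entropy(p):
        """Shannon entropy S(p) = -sum_i p_i log p_i, with 0 log 0 = 0."""
        p = np.asarray(p, dtype=float).ravel()
        nz = p[p > 0]
        return -np.sum(nz * np.log(nz))

    # Illustrative joint distribution on two discrete variables
    # (rows index i, columns index j); values are assumptions, not the paper's.
    p = np.array([[0.30, 0.10],
                  [0.05, 0.55]])

    # Marginals P_i and Q_j, and the correlation-destroying
    # transformation C: p_ij -> pi_ij = P_i * Q_j.
    P = p.sum(axis=1)
    Q = p.sum(axis=0)
    pi = np.outer(P, Q)

    # Check S(p) <= S(pi) and S(pi) = S(P) + S(Q).
    print(entropy(p), entropy(pi), entropy(P) + entropy(Q))
    assert entropy(p) <= entropy(pi) + 1e-12
    assert np.isclose(entropy(pi), entropy(P) + entropy(Q))

The gap $S({\bf \pi}) - S({\bf p})$ is exactly the mutual information of the two variables, which is why it vanishes precisely when $p_{ij}$ already factorizes as $P_i Q_j$.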
Published: 2000-01-17
Classification: Mathematical Physics, 94A17
@article{0001024,
     author = {Van Drie, John H.},
     title = {The Boltzmann/Shannon entropy as a measure of correlation},
     journal = {arXiv},
     volume = {2000},
     number = {0},
     year = {2000},
     language = {en},
     url = {http://dml.mathdoc.fr/item/0001024}
}
Van Drie, John H. The Boltzmann/Shannon entropy as a measure of correlation. arXiv, Vol. 2000 (2000), no. 0. http://gdmltest.u-ga.fr/item/0001024/