There is a simple inequality, due to Pinsker, between the variational distance
and the informational divergence of probability measures defined on arbitrary
probability spaces. We consider probability measures on sequences taken
from countable alphabets, and derive, from Pinsker's inequality, bounds
on the $\bar{d}$-distance in terms of informational divergence. Such bounds
can be used to prove the "concentration of measure" phenomenon for some
nonproduct distributions.
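For orientation, the two inequalities in play can be written out. This is a standard formulation (divergence $D(\cdot\,\|\,\cdot)$ measured in nats, Pinsker's constant as usual), not the paper's full statement; the exact hypotheses and constants for nonproduct measures are given in the article itself. Pinsker's inequality bounds variational distance by divergence,
$$
\|P - Q\| \;\le\; \sqrt{\tfrac{1}{2}\,D(P\,\|\,Q)},
$$
while the $\bar{d}$-distance between measures $p$ and $q$ on $X^n$ is the minimal expected per-coordinate Hamming distance over couplings,
$$
\bar{d}(p,q) \;=\; \min_{\pi}\,\frac{1}{n}\sum_{i=1}^{n}\Pr\nolimits_{\pi}\{X_i \neq Y_i\},
$$
the minimum taken over joint distributions $\pi$ with marginals $p$ and $q$. As an illustration of the type of bound derived, in the simplest case one obtains
$$
\bar{d}(p,q) \;\le\; \sqrt{\tfrac{1}{2n}\,D(p\,\|\,q)} \qquad (q \text{ a product measure on } X^n),
$$
from which measure concentration follows by applying the inequality to conditional measures and using the triangle inequality for $\bar{d}$.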
@article{1039639365,
  author   = {Marton, K.},
  title    = {Bounding $\bar{d}$-distance by informational divergence: a method
              to prove measure concentration},
  journal  = {Ann. Probab.},
  volume   = {24},
  number   = {2},
  year     = {1996},
  pages    = {857--866},
  language = {en},
  url      = {http://dml.mathdoc.fr/item/1039639365}
}
Marton, K. Bounding $\bar{d}$-distance by informational divergence: a method
to prove measure concentration. Ann. Probab., Vol. 24 (1996), no. 2, pp. 857-866. http://gdmltest.u-ga.fr/item/1039639365/