Bounding $\bar{d}$-distance by informational divergence: a method to prove measure concentration
Marton, K.
Ann. Probab., Tome 24 (1996) no. 2, p. 857-866 / Harvested from Project Euclid
There is a simple inequality, due to Pinsker, between the variational distance and the informational divergence of probability measures defined on arbitrary probability spaces. We consider probability measures on sequences drawn from countable alphabets and derive, from Pinsker's inequality, bounds on the $\bar{d}$-distance in terms of informational divergence. Such bounds can be used to prove the "concentration of measure" phenomenon for some nonproduct distributions.
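For reference, the inequality of Pinsker mentioned in the abstract can be stated (in one common normalization; constants vary across the literature) as

$$\|P - Q\|_{\mathrm{TV}} \;\le\; \sqrt{\tfrac{1}{2}\, D(P \,\|\, Q)},$$

where $\|P - Q\|_{\mathrm{TV}} = \sup_{A} |P(A) - Q(A)|$ is the total variation (variational) distance and $D(P \,\|\, Q)$ denotes the informational divergence (relative entropy), measured in nats.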
Published: 1996-04-14
Classification:  Measure concentration,  isoperimetric inequality,  Markov chains,  $\bar{d}$-distance,  informational divergence,  60F10,  60G70,  60G05
@article{1039639365,
     author = {Marton, K.},
     title = {Bounding $\bar{d}$-distance by informational divergence: a method to prove measure concentration},
     journal = {Ann. Probab.},
     volume = {24},
     number = {2},
     year = {1996},
     pages = {857--866},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1039639365}
}
Marton, K. Bounding $\bar{d}$-distance by informational divergence: a method to prove measure concentration. Ann. Probab., Tome 24 (1996) no. 2, pp. 857-866. http://gdmltest.u-ga.fr/item/1039639365/