Epsilon Entropy and Data Compression
Posner, Edward C.; Rodemich, Eugene R.
Ann. Math. Statist., Vol. 42 (1971), no. 6, pp. 2079-2125
This article studies efficient data transmission, or "data compression," from the standpoint of the theory of epsilon entropy. The notion of the entropy of a "data source" is defined. This quantity gives a precise measure of the channel capacity needed to describe a data source to within a given fidelity epsilon, with probability one, when each separate "experiment" must be transmitted without storage from experiment to experiment. We also define the absolute epsilon entropy of a source, the capacity needed when storage of experiments is allowed before transmission. The absolute epsilon entropy is shown to equal Shannon's rate distortion function evaluated at zero distortion, when suitable identifications are made. The main result is that the absolute epsilon entropy and the epsilon entropy have a ratio close to one whenever either is large. Thus very little can be saved by storing the results of independent experiments before transmission.
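The quantities compared in the abstract can be sketched in LaTeX as follows. This is a plausible formalization consistent with the abstract and with the standard epsilon-entropy literature, not the paper's own statement: the notation (H_epsilon, I_epsilon, metric rho, measure mu), the diameter-at-most-epsilon partition convention, and the zero/infinity distortion identification with R(0) are all assumptions here.

% Epsilon entropy: cheapest one-shot description of a source X
% (a metric space with metric \rho and probability measure \mu),
% taken over measurable partitions \mathcal{U} = \{U_i\} of X into
% sets of diameter at most \epsilon (convention assumed):
\[
  H_{\epsilon}(X) \;=\; \inf_{\mathcal{U}\,:\,\operatorname{diam} U_i \le \epsilon}
    \Bigl( -\sum_i \mu(U_i)\,\log \mu(U_i) \Bigr).
\]
% Absolute epsilon entropy: per-experiment rate when independent
% experiments may be stored and coded jointly -- infimum of mutual
% information over joint laws reproducing X within \epsilon with
% probability one; with the distortion d(x,y)=0 if \rho(x,y)\le\epsilon
% and d(x,y)=\infty otherwise, this is the rate distortion function
% at zero distortion (identification assumed):
\[
  I_{\epsilon}(X) \;=\; \inf \bigl\{\, I(X;Y) \;:\; \Pr[\rho(X,Y)\le\epsilon]=1 \,\bigr\}
    \;=\; R(0).
\]
% Storage can only help, and the main result bounds how much:
\[
  I_{\epsilon}(X) \;\le\; H_{\epsilon}(X), \qquad
  \frac{H_{\epsilon}(X)}{I_{\epsilon}(X)} \longrightarrow 1
  \quad\text{as either quantity} \to \infty.
\]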
Published: 1971-12-14
@article{1177693077,
     author = {Posner, Edward C. and Rodemich, Eugene R.},
     title = {Epsilon Entropy and Data Compression},
     journal = {Ann. Math. Statist.},
     volume = {42},
     number = {6},
     year = {1971},
     pages = {2079--2125},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1177693077}
}
Posner, Edward C.; Rodemich, Eugene R. Epsilon Entropy and Data Compression. Ann. Math. Statist., Vol. 42 (1971), no. 6, pp. 2079-2125. http://gdmltest.u-ga.fr/item/1177693077/