$(R,S)$-information radius of type $t$ and comparison of experiments
Taneja, Inder Jeet ; Pardo, Luis ; Morales, D.
Applications of Mathematics, Tome 36 (1991), p. 440-455 / Harvested from Czech Digital Mathematics Library

Various information, divergence and distance measures have been used to compare experiments, following classical approaches such as those of Blackwell and the Bayesian one. Blackwell's [1] idea of comparing two statistical experiments is based on the existence of stochastic transformations. Using this idea of Blackwell, as well as the classical Bayesian approach, we compare statistical experiments by considering unified scalar parametric generalizations of the Jensen difference divergence measure.
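For orientation, the classical (equal-weight, two-distribution) Jensen difference divergence that the paper's $(R,S)$ measures of type $t$ generalize can be written in terms of the Shannon entropy; this is the unparametrized special case (the unified two-parameter form itself is given in the paper and is not reproduced here):

```latex
% Jensen difference divergence of P = (p_1,...,p_n) and Q = (q_1,...,q_n),
% i.e. Sibson's information radius for two distributions with equal weights;
% H denotes the Shannon entropy.
J(P,Q) \;=\; H\!\left(\frac{P+Q}{2}\right) \;-\; \frac{H(P)+H(Q)}{2},
\qquad
H(P) \;=\; -\sum_{i=1}^{n} p_i \log p_i .
```

By concavity of $H$, $J(P,Q)\ge 0$ with equality iff $P=Q$, which is what makes it usable as a divergence for comparing experiments.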

Publié le : 1991-01-01
Classification:  62B10,  62B15,  94A15,  94A17
@article{104481,
     author = {Inder Jeet Taneja and Luis Pardo and D. Morales},
     title = {$(R,S)$-information radius of type $t$ and comparison of experiments},
     journal = {Applications of Mathematics},
     volume = {36},
     year = {1991},
     pages = {440-455},
     zbl = {0748.62003},
     mrnumber = {1134921},
     language = {en},
     url = {http://dml.mathdoc.fr/item/104481}
}
Taneja, Inder Jeet; Pardo, Luis; Morales, D. $(R,S)$-information radius of type $t$ and comparison of experiments. Applications of Mathematics, Tome 36 (1991), pp. 440-455. http://gdmltest.u-ga.fr/item/104481/

Blackwell D. (1951) Comparison of experiments, Proc. 2nd Berkeley Symp. Berkeley: University of California Press, 93-102. | MR 0046002

Burbea J. (1984) The Bose-Einstein Entropy of degree $\alpha$ and its Jensen Difference, Utilitas Math. 25, 225-240. | MR 0752861

Burbea J.; Rao C. R. (1982) Entropy Differential Metric, Distance and Divergence Measures in Probability Spaces: A Unified Approach, J. Multivariate Anal. 12, 575-596. | Article | MR 0680530

Burbea J.; Rao C. R. (1982) On the Convexity of some Divergence Measures based on Entropy Functions, IEEE Trans. on Inform. Theory IT-28, 489-495. | MR 0672884

Capocelli R. M.; Taneja I. J. (1984) Generalized Divergence Measures and Error Bounds, Proc. IEEE Internat. Conf. on Systems, Man and Cybernetics, Oct. 9-12, Halifax, Canada, pp. 43-47.

Campbell L. L. (1986) An extended Čencov characterization of the Information Metric, Proc. Amer. Math. Soc. 98, 135-141. | MR 0848890

Čencov N. N. (1982) Statistical Decision Rules and Optimal Inference, Trans. of Math. Monographs, 53, Am. Math. Soc., Providence, R. I. | MR 0645898

De Groot M. H. (1970) Optimal Statistical Decisions, McGraw-Hill, New York. | MR 0356303

Ferentinos K.; Papaioannou T. (1982) Information in experiments and sufficiency, J. Statist. Plann. Inference 6, 309-317. | Article | MR 0667911

Goel P. K.; De Groot M. H. (1979) Comparison of experiments and information measures, Ann. Statist. 7, 1066-1077. | Article | MR 0536509

Kullback S.; Leibler R. A. (1951) On information and sufficiency, Ann. Math. Statist. 22, 79-86.

Lindley D. V. (1956) On a measure of information provided by an experiment, Ann. Math. Statist. 27, 986-1005. | Article | MR 0083936

Marshall A. W.; Olkin I. (1979) Inequalities: Theory of Majorization and its Applications, Academic Press, New York. | MR 0552278

Morales D.; Taneja I. J.; Pardo L. Comparison of Experiments based on $\phi$-Measures of Jensen Difference, Communicated.

Pardo L.; Morales D.; Taneja I. J. $\lambda$-measures of hypoentropy and comparison of experiments: Bayesian approach, To appear in Statistica. | MR 1173196 | Zbl 0782.62011

Rao C. R. (1982) Diversity and Dissimilarity Coefficients: A Unified Approach, J. Theoret. Pop. Biology, 21, 24-43. | Article | MR 0662520

Rao C. R.; Nayak T. K. (1985) Cross Entropy, Dissimilarity Measures and Characterization of Quadratic Entropy, IEEE Trans. on Inform. Theory IT-31(5), 589-593. | Article | MR 0808230

Sakaguchi M. (1964) Information Theory and Decision Making, Unpublished Lecture Notes, Statist. Dept., George Washington Univ., Washington DC.

Sant'Anna A. P.; Taneja I. J. Trigonometric Entropies, Jensen Difference Divergence Measures and Error Bounds, Information Sciences 25, 145-156. | MR 0794765

Shannon C. E. (1948) A Mathematical Theory of Communication, Bell Syst. Tech. J. 27, 379-423. | MR 0026286

Sibson R. (1969) Information Radius, Z. Wahrs. und verw. Geb. 14, 149-160. | MR 0258198

Taneja I. J. (1983) On characterization of J-divergence and its generalizations, J. Combin. Inform. System Sci. 8, 206-212. | MR 0783757

Taneja I. J. (1986) $\lambda$-measures of hypoentropy and their applications, Statistica, anno XLVI, n. 4, 465-478. | MR 0887303

Taneja I. J. (1986) Unified Measure of Information applied to Markov Chains and Sufficiency, J. Comb. Inform. & Syst. Sci., 11, 99-109. | MR 0966074

Taneja I. J. (1987) Statistical aspects of Divergence Measures, J. Statist. Plann. & Inferen., 16, 137-145. | MR 0895754

Taneja I. J. (1989) On Generalized Information Measures and their Applications, Adv. Elect. Phys. 76, 327-413. Academic Press.

Taneja I. J. (1990) Bounds on the Probability of Error in Terms of Generalized Information Radius, Information Sciences 46.

Taneja I. J.; Morales D.; Pardo L. (1991) $\lambda$-measures of hypoentropy and comparison of experiments: Blackwell and Lehmann approach, Kybernetika 27, 413-420. | MR 1132603

Vajda I. (1968) Bounds on the Minimal Error Probability and Checking a Finite or Countable Number of Hypotheses, Inform. Trans. Problems 4, 9-17. | MR 0267685

Vajda I. (1989) Theory of Statistical Inference and Information, Kluwer Academic Publishers, Dordrecht/Boston/London.