On Divergences and Informations in Statistics and Information Theory
- 25 September 2006
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 52 (10), 4394-4412
- https://doi.org/10.1109/tit.2006.881731
Abstract
The paper deals with the f-divergences of Csiszár, generalizing the discrimination information of Kullback, the total variation distance, the Hellinger divergence, and the Pearson divergence. All basic properties of f-divergences, including their relations to decision errors, are proved in a new manner, replacing the classical Jensen inequality by a new generalized Taylor expansion of convex functions. Some new properties are proved as well, e.g., relations to statistical sufficiency and deficiency. The generalized Taylor expansion also shows very easily that all f-divergences are average statistical informations (differences between prior and posterior Bayes errors), mutually differing only in the weights imposed on the various prior distributions. The statistical information introduced by DeGroot and the classical information of Shannon are shown to be the extremal cases corresponding to α = 0 and α = 1 in the class of so-called Arimoto α-informations introduced in this paper for 0 < α < 1 by means of the Arimoto α-entropies. Some new examples of f-divergences are introduced as well, namely, the Shannon divergences and the Arimoto α-divergences, which lead for α ↑ 1 to the Shannon divergences. Square roots of all these divergences are shown to be metrics satisfying the triangle inequality. The last section introduces statistical tests and estimators based on the minimal f-divergence with the empirical distribution achieved in the families of hypothetical distributions. For the Kullback divergence this leads to the classical likelihood ratio test and estimator.
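As a small numerical illustration (not part of the paper itself), the Csiszár f-divergence of discrete distributions P and Q with strictly positive Q is D_f(P‖Q) = Σ_x q(x) f(p(x)/q(x)) for a convex generator f with f(1) = 0. The sketch below, with illustrative function and variable names, shows how the four classical divergences named in the abstract arise from four choices of the generator.

```python
import numpy as np

def f_divergence(p, q, f):
    """Csiszar f-divergence D_f(P||Q) = sum_x q(x) * f(p(x)/q(x)).

    p, q: discrete probability vectors over the same finite alphabet,
    with q strictly positive; f: convex generator with f(1) = 0.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

# Generators of the classical divergences mentioned in the abstract:
kl        = lambda t: t * np.log(t)           # Kullback (discrimination information)
tv        = lambda t: 0.5 * np.abs(t - 1.0)   # total variation distance
hellinger = lambda t: (np.sqrt(t) - 1.0) ** 2 # squared Hellinger divergence
pearson   = lambda t: (t - 1.0) ** 2          # Pearson chi-square divergence

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.4, 0.4, 0.2])

# Generic formula agrees with the familiar closed forms:
print(f_divergence(p, q, kl))   # sum_x p(x) log(p(x)/q(x))
print(f_divergence(p, q, tv))   # 0.5 * sum_x |p(x) - q(x)|
```

Since f(1) = 0 for each generator, every divergence above vanishes at P = Q, and (per the abstract) the square roots of these divergences are metrics satisfying the triangle inequality.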
This publication has 33 references indexed in Scilit:
- Inequalities between entropy and index of coincidence derived from information diagrams. IEEE Transactions on Information Theory, 2001.
- Minimum Divergence Estimators Based on Grouped Data. Annals of the Institute of Statistical Mathematics, 2001.
- Uncertainty of discrete stochastic systems: general theory and statistical inference. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 1996.
- Divergence measures based on the Shannon entropy. IEEE Transactions on Information Theory, 1991.
- Quantization for decentralized hypothesis testing under communication constraints. IEEE Transactions on Information Theory, 1990.
- Theory of Point Estimation. Published by Springer Science and Business Media LLC, 1983.
- Information-theoretical considerations on estimation problems. Information and Control, 1971.
- The Divergence and Bhattacharyya Distance Measures in Signal Selection. IEEE Transactions on Communications, 1967.
- On Information and Sufficiency. The Annals of Mathematical Statistics, 1951.
- On Equivalence of Infinite Product Measures. Annals of Mathematics, 1948.