α-mutual information
- 1 February 2015
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
Abstract
Rényi entropy and Rényi divergence evidence a long track record of usefulness in information theory and its applications. Alfred Rényi never got around to generalizing mutual information in a similar way. In fact, in the literature there are several possible ways to accomplish such a generalization, most notably those suggested by Suguru Arimoto, Imre Csiszár, and Robin Sibson. We collect several interesting properties and applications of the proposal by Sibson, hopefully making a case for its more widespread adoption.
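Sibson's proposal admits a simple closed form for discrete channels: I_α(X;Y) = (α/(α−1)) log Σ_y (Σ_x P_X(x) P_{Y|X}(y|x)^α)^{1/α}, recovering Shannon mutual information as α → 1. The following is a minimal sketch for finite alphabets; the function name and probability-table layout are illustrative choices, not from the paper.

```python
import math

def sibson_alpha_mi(p_x, p_y_given_x, alpha):
    """Sibson's alpha-mutual information for a discrete memoryless channel.

    p_x         : list of input probabilities P_X(x)
    p_y_given_x : matrix with p_y_given_x[x][y] = P_{Y|X}(y|x)
    alpha       : Renyi order, alpha > 0, alpha != 1
    """
    n_y = len(p_y_given_x[0])
    total = 0.0
    for y in range(n_y):
        # inner sum over inputs, raised to 1/alpha
        inner = sum(px * p_y_given_x[x][y] ** alpha
                    for x, px in enumerate(p_x))
        total += inner ** (1.0 / alpha)
    return alpha / (alpha - 1.0) * math.log(total)
```

For a noiseless channel with uniform input on n symbols this evaluates to log n for every α, and it vanishes whenever the output is independent of the input, both sanity checks consistent with an α-generalization of mutual information.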
This publication has 23 references indexed in Scilit:
- On the Conditional Rényi Entropy, IEEE Transactions on Information Theory, 2014
- Conditional Rényi Entropies, IEEE Transactions on Information Theory, 2012
- A generalized divergence measure for robust image registration, IEEE Transactions on Signal Processing, 2003
- Generalized cutoff rates and Rényi's information measures, IEEE Transactions on Information Theory, 1995
- Generalizing the Fano inequality, IEEE Transactions on Information Theory, 1994
- On the converse to the coding theorem for discrete memoryless channels (Corresp.), IEEE Transactions on Information Theory, 1973
- A class of measures of informativity of observation channels, Periodica Mathematica Hungarica, 1972
- Probability of error, equivocation, and the Chernoff bound, IEEE Transactions on Information Theory, 1970
- Information radius, Probability Theory and Related Fields, 1969
- A simple derivation of the coding theorem and some applications, IEEE Transactions on Information Theory, 1965