Rényi divergence measures for commonly used univariate continuous distributions
- 10 November 2013
- journal article
- Published by Elsevier BV in Information Sciences
- Vol. 249, pp. 124-131
- https://doi.org/10.1016/j.ins.2013.06.018
Abstract
No abstract available
This publication has 19 references indexed in Scilit:
- Rényi entropy rate for Gaussian processes. Information Sciences, 2010
- Rényi statistics for testing equality of autocorrelation coefficients. Statistical Methodology, 2009
- Some properties of Rényi entropy and Rényi entropy rate. Information Sciences, 2009
- On Divergences and Informations in Statistics and Information Theory. IEEE Transactions on Information Theory, 2006
- Interpretations of Rényi entropies and divergences. Physica A: Statistical Mechanics and its Applications, 2006
- Csiszár's Cutoff Rates for the General Hypothesis Testing Problem. IEEE Transactions on Information Theory, 2004
- Beta-Normal Distribution and Its Applications. Communications in Statistics - Theory and Methods, 2002
- Entropy expressions for multivariate continuous distributions. IEEE Transactions on Information Theory, 2000
- Generalized cutoff rates and Rényi's information measures. IEEE Transactions on Information Theory, 1995
- On Information and Sufficiency. The Annals of Mathematical Statistics, 1951