The length of a typical Huffman codeword
- 1 July 1994
- Journal article
- Published by the Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 40 (4), pp. 1246-1247
- https://doi.org/10.1109/18.335944
Abstract
If $p_i$ ($i = 1, \cdots, N$) is the probability of the $i$th letter of a memoryless source, the length $l_i$ of the corresponding binary Huffman codeword can be very different from the value $-\log p_i$. For a typical letter, however, $l_i \approx -\log p_i$. More precisely, $P_m^- = \sum_{j \in \{i \mid l_i < -\log p_i - m\}} p_j < 2^{-m}$ and $P_m^+ = \sum_{j \in \{i \mid l_i > -\log p_i + m\}} p_j < 2^{-c(m-2)+2}$, where $c \approx 2.27$.
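As a rough numerical illustration (not from the paper), the sketch below builds a binary Huffman code for an assumed truncated geometric source and measures both tail sums against the bounds quoted above. The $P_m^-$ bound also follows directly from Kraft's inequality: $l_i < -\log p_i - m$ forces $p_i < 2^{-l_i - m}$, and summing over such letters gives $P_m^- < 2^{-m}$. The helper names (`huffman_lengths`, `tail_sums`), the alphabet size, and the ratio 0.6 are arbitrary choices for this example.

```python
import heapq
import math
from itertools import count

def huffman_lengths(probs):
    """Return binary Huffman codeword lengths for a probability list.

    Standard greedy construction: repeatedly merge the two least probable
    nodes; each merge adds one bit to the depth of every leaf below it.
    """
    tiebreak = count()  # unique ints so ties never compare the dicts
    # Heap entries: (probability, tiebreaker, {leaf_index: depth})
    heap = [(p, next(tiebreak), {i: 0}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)
        p2, _, d2 = heapq.heappop(heap)
        merged = {i: d + 1 for i, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    _, _, depths = heap[0]
    return [depths[i] for i in range(len(probs))]

def tail_sums(probs, lengths, m):
    """P_m^- and P_m^+: total probability of letters whose codeword is
    more than m bits shorter (resp. longer) than -log2 p_i."""
    p_minus = sum(p for p, l in zip(probs, lengths) if l < -math.log2(p) - m)
    p_plus = sum(p for p, l in zip(probs, lengths) if l > -math.log2(p) + m)
    return p_minus, p_plus

# Hypothetical memoryless source: truncated geometric with ratio 0.6.
N = 200
weights = [0.6 ** i for i in range(N)]
total = sum(weights)
probs = [w / total for w in weights]

lengths = huffman_lengths(probs)
c = 2.27  # constant from the paper's bound
for m in range(1, 6):
    p_minus, p_plus = tail_sums(probs, lengths, m)
    print(f"m={m}: P-={p_minus:.3g} (bound {2**-m:.3g}), "
          f"P+={p_plus:.3g} (bound {2**(-c*(m-2)+2):.3g})")
```

Tracking leaf depths in dictionaries keeps the sketch short; a production encoder would build an explicit tree or use the standard two-queue construction instead.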