Preprint · Conference object · 2010

Matching Dyadic Distributions to Channels

Georg Böcherer, Rudolf Mathar
Open Access English
  • Published: 20 Sep 2010
Abstract
Many communication channels with discrete input have non-uniform capacity-achieving probability mass functions (PMFs). By parsing a stream of independent and equiprobable bits according to a full prefix-free code, a modulator can generate dyadic PMFs at the channel input. In this work, we show that for discrete memoryless channels and for memoryless discrete noiseless channels, searching for good dyadic input PMFs is equivalent to minimizing the Kullback-Leibler distance between a dyadic PMF and a weighted version of the capacity-achieving PMF. We define a new algorithm called Geometric Huffman Coding (GHC) and prove that GHC finds the optimal dyadic PMF in O(m ...
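The GHC construction summarized in the abstract can be sketched in a few lines. This is an illustrative reading, not the authors' reference implementation (their code is linked from the record): it assumes the Huffman-like rule from the full paper, in which the two smallest masses p' ≥ p'' are combined into the geometric-mean node 2√(p'·p''), unless p' ≥ 4p'', in which case the smaller branch is pruned and its symbols receive zero probability. The function name `ghc` is ours.

```python
import math

def ghc(p):
    """Sketch of Geometric Huffman Coding: return a dyadic PMF d
    (all entries powers of 1/2 or zero) approximately minimizing D(d||p).

    Assumes the merge/prune rule from the paper: for the two smallest
    node values a >= b, prune b's subtree if a >= 4*b, otherwise merge
    them into a node of value 2*sqrt(a*b).
    """
    # Each node carries its value and a map {symbol: depth in the subtree}.
    nodes = [(pi, {i: 0}) for i, pi in enumerate(p)]
    pruned = set()
    while len(nodes) > 1:
        nodes.sort(key=lambda n: n[0], reverse=True)
        (a, la), (b, lb) = nodes.pop(-2), nodes.pop(-1)  # two smallest, a >= b
        if a >= 4 * b:
            # Prune the smaller subtree: its symbols get probability 0.
            pruned.update(lb)
            nodes.append((a, la))
        else:
            # Merge: every leaf moves one level deeper in the code tree.
            merged = {s: d + 1 for s, d in {**la, **lb}.items()}
            nodes.append((2 * math.sqrt(a * b), merged))
    _, leaves = nodes[0]
    # Codeword length l_i at depth l_i yields dyadic probability 2^{-l_i}.
    return [0.0 if i in pruned else 2.0 ** -leaves[i] for i in range(len(p))]
```

For example, a target PMF that is already dyadic is reproduced exactly (`ghc([0.5, 0.25, 0.25])` returns `[0.5, 0.25, 0.25]`), while a strongly skewed target such as `[0.9, 0.05, 0.05]` is pruned down to the degenerate dyadic PMF `[1.0, 0.0, 0.0]`.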
Subjects
arXiv: Computer Science::Information Theory; Physics::Biological Physics
free text keywords: Computer Science - Information Theory, Mathematics - Probability, Discrete mathematics, Probability mass function, Communication channel, Channel capacity, Kullback–Leibler divergence, Algorithm, Huffman coding, Mathematics, Prefix code, Dyadic distribution, Mutual information

