Entropy of the Mixture of Sources and Entropy Dimension

Preprint, English, open access
Smieja, Marek; Tabor, Jacek (2011)
  • Subject: Computer Science - Information Theory
    ACM: Theory of Computation (General); Data (Coding and Information Theory)

We investigate the problem of the entropy of a mixture of sources. We give an estimate of the entropy and the entropy dimension of a convex combination of measures. The proof is based on our alternative definition of entropy, which uses measures instead of partitions.
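
For orientation, a standard bound on the Shannon entropy of a mixture of discrete sources is sketched below; it is a classical fact given here only as background, not the preprint's estimate (which concerns general measures and entropy dimension). Here \mu = \sum_i p_i \mu_i denotes the mixture and H(p) the entropy of the mixing weights p = (p_1, ..., p_n).

% Classical bounds on the entropy of a discrete mixture (standard fact, not the paper's result).
% Lower bound: concavity of Shannon entropy.
% Upper bound: conditioning on the (hidden) index of the selected source.
\[
  \sum_{i=1}^{n} p_i\, H(\mu_i)
  \;\le\;
  H\!\Big(\sum_{i=1}^{n} p_i\, \mu_i\Big)
  \;\le\;
  \sum_{i=1}^{n} p_i\, H(\mu_i) + H(p),
  \qquad
  H(p) = -\sum_{i=1}^{n} p_i \log p_i .
\]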
  • References (13)

    [1] C. E. Shannon, “A mathematical theory of communication,” The Bell System Technical Journal, vol. 27, pp. 379-423, 623-656, 1948.

    [2] A. Rényi, “On the dimension and entropy of probability distributions,” Acta Mathematica Hungarica, vol. 10, no. 1-2, pp. 193-215, 1959.

    [3] R. S. Ellis, Entropy, large deviations, and statistical mechanics, 1st ed. Springer, 1985.

    [4] R. M. Gray, Entropy and Information Theory, 2nd ed. Springer, 2011.

    [5] P. Seibt, Algorithmic Information Theory. Berlin Heidelberg: Springer-Verlag, 2006.

    [6] Y. Wu and S. Verdú, “Rényi information dimension: Fundamental limits of almost lossless analog compression,” IEEE Transactions on Information Theory, vol. 56, no. 8, pp. 3721-3748, 2010.

    [7] J. D. Howroyd, “On dimension and on the existence of sets of finite positive Hausdorff measure,” Proc. London Math. Soc., vol. 70, no. 3, pp. 581-604, 1995.

    [8] --, On the theory of Hausdorff measures in metric spaces. Ph.D. thesis, University College London, 1994.

    [9] C. A. Rogers, Hausdorff measures, 2nd ed. Cambridge University Press, 1998.

    [10] A. Fan, K. Lau, and H. Rao, “Relationships between different dimensions of a measure,” Monatsh. Math., vol. 135, pp. 191-201, 2002.
