Signal Enhancement as Minimization of Relevant Information Loss

Preprint (English, open access)
Geiger, Bernhard C. ; Kubin, Gernot (2012)
  • Subject: Computer Science - Information Theory

We introduce the notion of relevant information loss in order to cast the signal enhancement problem in information-theoretic terms. We show that many algorithms from machine learning can be reformulated in terms of relevant information loss, which allows their application to this problem. As a particular example, we analyze principal component analysis for dimensionality reduction, discuss its optimality, and show that the relevant information loss can indeed vanish if the relevant information is concentrated on a lower-dimensional subspace of the input space.
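The final claim can be illustrated numerically. The sketch below (not the paper's experiment; all data and parameters are invented for illustration) plants a one-dimensional relevant signal in a ten-dimensional input with weak isotropic noise, then reduces to the top principal component and checks how much of the relevant signal survives:

```python
import numpy as np

# Hedged illustration: when the relevant information is concentrated on a
# low-dimensional subspace with dominant variance, PCA's top component
# retains it almost perfectly. Dimensions and noise level are assumptions.
rng = np.random.default_rng(0)

n, d = 2000, 10
s = rng.standard_normal(n)                 # relevant signal (1-D)
u = np.zeros(d)
u[0] = 1.0                                 # subspace carrying the relevant signal
x = 5.0 * np.outer(s, u) + 0.1 * rng.standard_normal((n, d))  # signal + weak noise

# PCA via eigendecomposition of the sample covariance matrix
xc = x - x.mean(axis=0)
cov = xc.T @ xc / n
eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
top = eigvecs[:, -1]                       # principal direction

y = xc @ top                               # one-dimensional PCA output
corr = abs(np.corrcoef(y, s)[0, 1])
print(corr)                                # close to 1: relevant signal retained
```

Here the correlation between the reduced representation and the relevant signal is close to one, i.e., essentially no relevant information is lost, even though nine of ten dimensions are discarded; shrinking the signal variance relative to the noise degrades this.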
