
[1] B. C. Geiger and G. Kubin, “On the information loss in memoryless systems: The multivariate case,” in Proc. Int. Zurich Seminar on Communications (IZS), Zurich, Feb. 2012, pp. 32–35, extended version available: arXiv:1109.4856 [cs.IT].
[2] ——, “Relative information loss in the PCA,” in Proc. IEEE Information Theory Workshop (ITW), Lausanne, Sep. 2012, pp. 562–566, extended version available: arXiv:1204.0429 [cs.IT].
[3] D. H. Johnson, “Information theory and neural information processing,” IEEE Trans. Inf. Theory, vol. 56, no. 2, pp. 653–666, Feb. 2010.
[4] M. Plumbley, “Information theory and unsupervised neural networks,” Cambridge University Engineering Department, Tech. Rep. CUED/F-INFENG/TR.78, 1991.
[5] G. Deco and D. Obradovic, An Information-Theoretic Approach to Neural Computing. New York, NY: Springer, 1996.
[6] N. Tishby, F. C. Pereira, and W. Bialek, “The information bottleneck method,” in Proc. Allerton Conf. on Communication, Control, and Computing, Sep. 1999, pp. 368–377.
[7] M. A. Sánchez-Montañés and F. J. Corbacho, “A new information processing measure for adaptive complex systems,” IEEE Trans. Neural Netw., vol. 15, no. 4, pp. 917–927, Jul. 2004.
[8] M. A. Sánchez-Montañés, “A theory of information processing for adaptive systems: Inspiration from biology, formal analysis and application to artificial systems,” Ph.D. dissertation, Universidad Autónoma de Madrid, Jun. 2003.
[9] J. C. Principe, Information Theoretic Learning: Rényi's Entropy and Kernel Perspectives, ser. Information Science and Statistics. New York, NY: Springer, 2010.
[10] T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed. Hoboken, NJ: Wiley-Interscience, 2006.