Publication · Preprint · 2018

Short-term Memory of Deep RNN

Gallicchio, Claudio
Open Access · English
  • Published: 02 Feb 2018
Abstract
The extension of deep learning towards temporal data processing is gaining increasing research interest. In this paper we investigate the properties of the state dynamics developed at successive levels of deep recurrent neural networks (RNNs) in terms of short-term memory abilities. Our results reveal insights that shed light on the nature of layering as a factor of RNN design. Notably, higher layers in a hierarchically organized RNN architecture turn out to be inherently biased towards longer memory spans, even prior to training of the recurrent connections. Moreover, within the Reservoir Computing framework, our analysis also points out the ...
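The kind of analysis the abstract describes can be illustrated with a small experiment. Below is a minimal sketch (not the paper's code) of measuring Jaeger's short-term memory capacity (MC) at successive layers of a deep, untrained echo-state-style RNN: each random recurrent layer is driven by the states of the layer below, a linear least-squares readout is fit to reconstruct delayed inputs, and MC sums the squared correlations over the delays. All sizes, scalings, and the delay range here are illustrative assumptions, not values from the paper.

# Minimal sketch: short-term memory capacity (MC) of successive layers
# of a deep, untrained reservoir-style RNN. Layer sizes, spectral radius,
# input scaling, and delay range are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def reservoir(n_in, n_units, spectral_radius=0.9, input_scaling=1.0):
    # Random, untrained recurrent layer (echo-state style).
    W_in = rng.uniform(-input_scaling, input_scaling, (n_units, n_in))
    W = rng.uniform(-1.0, 1.0, (n_units, n_units))
    # Rescale the recurrent matrix to the desired spectral radius.
    W *= spectral_radius / np.abs(np.linalg.eigvals(W)).max()
    return W_in, W

def run(W_in, W, inputs):
    # Drive the layer with an input sequence; return the state sequence.
    states = np.zeros((len(inputs), W.shape[0]))
    x = np.zeros(W.shape[0])
    for t, u in enumerate(inputs):
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states[t] = x
    return states

def memory_capacity(states, u, max_delay=40, washout=100):
    # MC = sum over delays k of the squared correlation between u(t-k)
    # and its best linear reconstruction from the layer's states.
    X = states[washout:]
    mc = 0.0
    for k in range(1, max_delay + 1):
        target = u[washout - k: len(u) - k]
        w = np.linalg.lstsq(X, target, rcond=None)[0]
        mc += np.corrcoef(X @ w, target)[0, 1] ** 2
    return mc

# i.i.d. uniform input signal, as in standard MC experiments.
T = 4000
u = rng.uniform(-0.8, 0.8, T)

# Stack layers: each layer is driven by the states of the previous one.
n_layers, n_units = 4, 100
layer_input = u[:, None]
for layer in range(n_layers):
    W_in, W = reservoir(layer_input.shape[1], n_units)
    states = run(W_in, W, layer_input)
    print(f"layer {layer + 1}: MC ~ {memory_capacity(states, u):.2f}")
    layer_input = states  # next layer reads this layer's states

Under a setup of this kind, one can compare how MC, and the profile of reconstruction quality across delays, changes with depth, which is the quantity the paper's layering analysis revolves around.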
Subjects
free text keywords: Computer Science - Learning, Computer Science - Artificial Intelligence, Mathematics - Dynamical Systems, Statistics - Machine Learning