Preprint · Conference object · 2020

Sparsity in Reservoir Computing Neural Networks

Claudio Gallicchio
Open Access · English
  • Published: 11 Sep 2020
Abstract
Reservoir Computing (RC) is a well-known strategy for designing Recurrent Neural Networks characterized by strikingly efficient training. The crucial aspect of RC is to properly instantiate the hidden recurrent layer, which serves as the system's dynamical memory. In this respect, the common recipe is to create a pool of randomly and sparsely connected recurrent neurons. While the role of sparsity in the design of RC systems has been debated in the literature, it is nowadays understood mainly as a way to enhance computational efficiency by exploiting sparse matrix operations. In this paper, we empirically investigate the role of sparsity in RC network design un...
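To make the recipe mentioned in the abstract concrete, below is a minimal sketch of the standard echo state reservoir construction (in the spirit of refs. [9] and [14]): a sparse random recurrent weight matrix, rescaled to a target spectral radius, driving an untrained state update. All names and hyperparameter values here (n_units, connectivity, spectral_radius) are illustrative assumptions, not values taken from this paper.

    import numpy as np
    from scipy import sparse
    from scipy.sparse.linalg import eigs

    def init_sparse_reservoir(n_units=500, connectivity=0.01,
                              spectral_radius=0.9, seed=0):
        """Sparse random recurrent weights, rescaled so the largest
        eigenvalue magnitude equals `spectral_radius` (echo state recipe)."""
        rng = np.random.default_rng(seed)
        # Keep each connection with probability `connectivity`;
        # surviving weights are drawn uniformly from [-1, 1].
        W = sparse.random(n_units, n_units, density=connectivity,
                          random_state=rng,
                          data_rvs=lambda size: rng.uniform(-1.0, 1.0, size)).tocsr()
        rho = abs(eigs(W, k=1, return_eigenvectors=False)[0])
        return W * (spectral_radius / rho)

    def run_reservoir(W, W_in, inputs):
        """Collect states of x(t) = tanh(W_in u(t) + W x(t-1)) over a sequence."""
        x = np.zeros(W.shape[0])
        states = []
        for u in inputs:
            # The sparse matvec W @ x is where sparsity pays off computationally.
            x = np.tanh(W_in @ u + W @ x)
            states.append(x)
        return np.array(states)

    # Illustrative usage: a 1-dimensional input sequence of length 200.
    rng = np.random.default_rng(1)
    W = init_sparse_reservoir()
    W_in = rng.uniform(-0.1, 0.1, (500, 1))
    states = run_reservoir(W, W_in, rng.uniform(-1.0, 1.0, (200, 1)))

In a typical RC pipeline, the collected states would then feed a linear readout (e.g., fitted by ridge regression), which is the only trained component of the network.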
Subjects
ACM Computing Classification System: Mathematics of Computing / Numerical Analysis
free text keywords: Computer Science - Machine Learning, Computer Science - Neural and Evolutionary Computing, Mathematics - Dynamical Systems, Statistics - Machine Learning, Reservoir Computing, Echo State Networks, Short-term Memory, Sparse Recurrent Neural Networks, Sparse matrix, Computer science, RC circuit, Short-term memory, Computation, Recurrent neural network, Theoretical computer science, Reservoir computing, Artificial neural network
Funded by
EC | TEACHING
Project: TEACHING (A computing toolkit for building efficient autonomous applications leveraging humanistic intelligence)
  • Funder: European Commission (EC)
  • Project Code: 871385
  • Funding stream: H2020 | RIA
Validated by funder
References (showing 15 of 28)

[1] J. F. Kolen and S. C. Kremer, A field guide to dynamical recurrent networks. John Wiley & Sons, 2001.

[2] N. Laptev, J. Yosinski, L. E. Li, and S. Smyl, “Time-series extreme event forecasting with neural networks at Uber,” in International Conference on Machine Learning, vol. 34, 2017, pp. 1-5.

[3] I. Sutskever, O. Vinyals, and Q. V. Le, “Sequence to sequence learning with neural networks,” in Advances in Neural Information Processing Systems, 2014, pp. 3104-3112.

[4] A. Graves, A.-r. Mohamed, and G. Hinton, “Speech recognition with deep recurrent neural networks,” in 2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2013, pp. 6645-6649.

[5] R. Nallapati, B. Zhou, C. Gulcehre, B. Xiang et al., “Abstractive text summarization using sequence-to-sequence RNNs and beyond,” arXiv preprint arXiv:1602.06023, 2016.

[6] S. Narang, E. Elsen, G. Diamos, and S. Sengupta, “Exploring sparsity in recurrent neural networks,” in ICLR, 2017. arXiv preprint arXiv:1704.05119.

[7] G. Bellec, D. Kappel, W. Maass, and R. Legenstein, “Deep rewiring: Training very sparse deep networks,” in ICLR, 2018. arXiv preprint arXiv:1711.05136.

[8] A. Litwin-Kumar, K. D. Harris, R. Axel, H. Sompolinsky, and L. Abbott, “Optimal degrees of synaptic connectivity,” Neuron, vol. 93, no. 5, pp. 1153-1164, 2017.

[9] M. Lukoševičius and H. Jaeger, “Reservoir computing approaches to recurrent neural network training,” Computer Science Review, vol. 3, no. 3, pp. 127-149, 2009.

[10] B. Schrauwen, D. Verstraeten, and J. Van Campenhout, “An overview of reservoir computing: theory, applications and implementations,” in Proceedings of the 15th European Symposium on Artificial Neural Networks (ESANN), 2007, pp. 471-482.

[11] H. Jaeger and H. Haas, “Harnessing nonlinearity: Predicting chaotic systems and saving energy in wireless communication,” Science, vol. 304, no. 5667, pp. 78-80, 2004.

[12] L. Larger, M. C. Soriano, D. Brunner, L. Appeltant, J. M. Gutiérrez, L. Pesquera, C. R. Mirasso, and I. Fischer, “Photonic information processing beyond Turing: an optoelectronic implementation of reservoir computing,” Optics Express, vol. 20, no. 3, pp. 3241-3249, 2012.

[13] D. Bacciu, P. Barsocchi, S. Chessa, C. Gallicchio, and A. Micheli, “An experimental characterization of reservoir computing in ambient assisted living applications,” Neural Computing and Applications, vol. 24, no. 6, pp. 1451-1464, 2014.

[14] H. Jaeger, “The echo state approach to analysing and training recurrent neural networks - with an erratum note,” German National Research Center for Information Technology GMD Technical Report, Bonn, Germany, vol. 148, no. 34, p. 13, 2001.

[15] Y. Xue, L. Yang, and S. Haykin, “Decoupled echo state networks with lateral inhibition,” Neural Networks, vol. 20, no. 3, pp. 365-376, 2007.
