# Sparsity in Reservoir Computing Neural Networks

- Published: 11 Sep 2020


- University of Pisa, Italy

- Funder: European Commission (EC)
- Project Code: 871385
- Funding stream: H2020 | RIA


[1] J. F. Kolen and S. C. Kremer, A field guide to dynamical recurrent networks. John Wiley & Sons, 2001.

[2] N. Laptev, J. Yosinski, L. E. Li, and S. Smyl, “Time-series extreme event forecasting with neural networks at Uber,” in International Conference on Machine Learning, vol. 34, 2017, pp. 1-5.

[3] I. Sutskever, O. Vinyals, and Q. V. Le, “Sequence to sequence learning with neural networks,” in Advances in Neural Information Processing Systems, 2014, pp. 3104-3112.

[4] A. Graves, A.-r. Mohamed, and G. Hinton, “Speech recognition with deep recurrent neural networks,” in 2013 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, 2013, pp. 6645-6649.

[5] R. Nallapati, B. Zhou, C. Gulcehre, B. Xiang et al., “Abstractive text summarization using sequence-to-sequence RNNs and beyond,” arXiv preprint arXiv:1602.06023, 2016.

[6] S. Narang, E. Elsen, G. Diamos, and S. Sengupta, “Exploring sparsity in recurrent neural networks,” ICLR 2017. arXiv preprint arXiv:1704.05119, 2017.

[7] G. Bellec, D. Kappel, W. Maass, and R. Legenstein, “Deep rewiring: Training very sparse deep networks,” ICLR 2018. arXiv preprint arXiv:1711.05136, 2018.

[8] A. Litwin-Kumar, K. D. Harris, R. Axel, H. Sompolinsky, and L. Abbott, “Optimal degrees of synaptic connectivity,” Neuron, vol. 93, no. 5, pp. 1153-1164, 2017.

[9] M. Lukoševičius and H. Jaeger, “Reservoir computing approaches to recurrent neural network training,” Computer Science Review, vol. 3, no. 3, pp. 127-149, 2009.

[10] B. Schrauwen, D. Verstraeten, and J. Van Campenhout, “An overview of reservoir computing: theory, applications and implementations,” in Proceedings of the 15th European Symposium on Artificial Neural Networks, 2007, pp. 471-482.

[11] H. Jaeger and H. Haas, “Harnessing nonlinearity: Predicting chaotic systems and saving energy in wireless communication,” Science, vol. 304, no. 5667, pp. 78-80, 2004.

[12] L. Larger, M. C. Soriano, D. Brunner, L. Appeltant, J. M. Gutiérrez, L. Pesquera, C. R. Mirasso, and I. Fischer, “Photonic information processing beyond Turing: an optoelectronic implementation of reservoir computing,” Optics Express, vol. 20, no. 3, pp. 3241-3249, 2012.

[13] D. Bacciu, P. Barsocchi, S. Chessa, C. Gallicchio, and A. Micheli, “An experimental characterization of reservoir computing in ambient assisted living applications,” Neural Computing and Applications, vol. 24, no. 6, pp. 1451-1464, 2014.

[14] H. Jaeger, “The echo state approach to analysing and training recurrent neural networks (with an erratum note),” Bonn, Germany: German National Research Center for Information Technology, GMD Technical Report, vol. 148, no. 34, p. 13, 2001.

[15] Y. Xue, L. Yang, and S. Haykin, “Decoupled echo state networks with lateral inhibition,” Neural Networks, vol. 20, no. 3, pp. 365-376, 2007.
