handle: 11568/1072554
Reservoir Computing (RC) is a well-known strategy for designing Recurrent Neural Networks, characterized by strikingly efficient training. The crucial aspect of RC is to properly instantiate the hidden recurrent layer that serves as the system's dynamical memory. In this respect, the common recipe is to create a pool of randomly and sparsely connected recurrent neurons. While sparsity in the design of RC systems has been debated in the literature, it is nowadays understood mainly as a way to improve computational efficiency by exploiting sparse matrix operations. In this paper, we empirically investigate the role of sparsity in RC network design from the perspective of the richness of the developed temporal representations. We analyze sparsity both in the recurrent connections and in the connections from the input to the reservoir. Our results indicate that sparsity, particularly in the input-reservoir connections, plays a major role in developing internal temporal representations with a longer short-term memory of past inputs and a higher dimensionality.
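To make the standard recipe concrete, below is a minimal sketch of a sparse Echo State Network reservoir in Python. It is illustrative only: the reservoir size, connection densities, spectral radius, and input scaling are assumed placeholder values, not the paper's experimental settings.

```python
# Minimal sketch of an Echo State Network reservoir with sparse
# recurrent and input-to-reservoir connections (illustrative values,
# not the paper's exact configuration).
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import eigs

rng = np.random.default_rng(0)

n_inputs, n_reservoir = 1, 500
recurrent_density = 0.05   # fraction of nonzero recurrent weights (assumed)
input_density = 0.05       # fraction of nonzero input weights (assumed)
spectral_radius = 0.9      # target spectral radius of W (assumed)
input_scaling = 1.0        # scaling of W_in (assumed)

# Sparse recurrent matrix W, rescaled to the target spectral radius.
W = sparse.random(n_reservoir, n_reservoir, density=recurrent_density,
                  random_state=0,
                  data_rvs=lambda n: rng.uniform(-1, 1, n)).tocsr()
lam = eigs(W, k=1, return_eigenvectors=False)  # largest-magnitude eigenvalue
W = W * (spectral_radius / abs(lam[0]))

# Sparse input-to-reservoir matrix W_in.
W_in = sparse.random(n_reservoir, n_inputs, density=input_density,
                     random_state=1,
                     data_rvs=lambda n: rng.uniform(-1, 1, n)).tocsr()
W_in = W_in * input_scaling

def run_reservoir(inputs):
    """Drive the reservoir with an input sequence; return all states."""
    x = np.zeros(n_reservoir)
    states = []
    for u in inputs:
        # Standard (leak-free) ESN state update: x(t) = tanh(W_in u(t) + W x(t-1))
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x)
    return np.array(states)

states = run_reservoir(rng.uniform(-1, 1, size=200))
```

Varying `input_density` and `recurrent_density` in a sketch like this is the kind of manipulation the paper studies when relating sparsity to the short-term memory and dimensionality of the reservoir states.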
This paper is currently under review
Subjects: FOS: Computer and information sciences; FOS: Mathematics; Computer Science - Machine Learning (cs.LG); Computer Science - Neural and Evolutionary Computing (cs.NE); Statistics - Machine Learning (stat.ML); Mathematics - Dynamical Systems (math.DS)
Keywords: Echo State Networks; Reservoir Computing; Short-term Memory; Sparse Recurrent Neural Networks
| indicator | description | value |
| --- | --- | --- |
| citations | This is an alternative to the "Influence" indicator, which also reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | 11 |
| popularity | This indicator reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network. | Top 10% |
| influence | This indicator reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Average |
| impulse | This indicator reflects the initial momentum of an article directly after its publication, based on the underlying citation network. | Top 10% |
| views | Views provided by UsageCounts | 11 |
| downloads | Downloads provided by UsageCounts | 25 |
