
This paper is concerned with exponential convergence for a class of high-order recurrent neural networks (HRNNs) with continuously distributed delays in the leakage terms. Without assuming boundedness of the activation functions, sufficient conditions are derived to ensure that all solutions of the networks converge exponentially to the zero point, using the Lyapunov functional method and differential inequality techniques; these conditions correct some recent results of Chen and Yang (Neural Comput Appl. doi: 10.1007/s00521-012-1172-2 , 2012). Moreover, we propose a new approach to proving the exponential convergence of HRNNs with continuously distributed leakage delays.
