
This paper considers exponential convergence for a class of high-order recurrent neural networks (HRNNs) with continuously distributed delays in the leakage terms (i.e., "leakage delays"). Without assuming boundedness of the activation functions, some sufficient conditions are derived to ensure that all solutions of this system converge exponentially to the zero point, using the Lyapunov functional method and differential inequality techniques; these conditions are new and complement previously known results. In particular, we propose a new approach to prove the exponential convergence of HRNNs with continuously distributed leakage delays. Moreover, an example is given to show the effectiveness of the proposed method and results.
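For orientation, a representative form of such a system is sketched below; this is only an illustrative template of HRNNs with distributed leakage delays, and the symbols $c_i$, $h_i$, $a_{ij}$, $b_{ijl}$, $g_j$, $\tau_{ij}$, $\sigma_{ijl}$, $\nu_{ijl}$, and $I_i$ are assumptions, not necessarily the notation used in the paper:

\[
x_i'(t) = -c_i \int_0^{\infty} h_i(s)\, x_i(t-s)\, ds
  + \sum_{j=1}^{n} a_{ij}\, g_j\!\big(x_j(t-\tau_{ij}(t))\big)
  + \sum_{j=1}^{n}\sum_{l=1}^{n} b_{ijl}\, g_j\!\big(x_j(t-\sigma_{ijl}(t))\big)\, g_l\!\big(x_l(t-\nu_{ijl}(t))\big)
  + I_i(t), \qquad i = 1,\dots,n.
\]

Here the integral term is the leakage term with a continuously distributed delay kernel $h_i$, and the second-order product terms are what make the network "high-order".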
