This paper investigates the global asymptotic stability (GAS) and global exponential stability (GES) of a class of continuous-time recurrent neural networks. First, we introduce a necessary and sufficient condition for the existence and uniqueness of the equilibrium of neural networks with Lipschitz continuous activation functions. Next, we present two sufficient conditions that ascertain the GAS of neural networks with globally Lipschitz continuous and monotone nondecreasing activation functions. We then give two GES conditions for neural networks whose activation functions may not be monotone nondecreasing. We also provide a Lyapunov diagonal stability condition, without the nonsingularity requirement on the connection weight matrices, that ascertains the GES of neural networks with globally Lipschitz continuous and monotone nondecreasing activation functions. This Lyapunov diagonal stability condition generalizes and unifies many of the existing GAS and GES results. Moreover, two higher estimates of the exponential convergence rate are derived.
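The abstract does not state the network model explicitly; a standard continuous-time recurrent network of the kind such stability results typically address (the symbols $D$, $W$, $g$, $u$, and $L_i$ below are illustrative assumptions, not taken from the paper) can be written as

```latex
% Assumed model: additive continuous-time recurrent neural network
\dot{x}(t) = -D\,x(t) + W\,g\bigl(x(t)\bigr) + u,
\qquad x(t) \in \mathbb{R}^n,
```

where $D$ is a positive diagonal matrix of self-decay rates, $W$ is the connection weight matrix, $u$ is a constant input, and $g = (g_1,\dots,g_n)^{\mathsf T}$ applies componentwise. The two activation-function hypotheses named in the abstract then read, for each component $g_i$,

```latex
% Global Lipschitz continuity (constant L_i > 0):
|g_i(a) - g_i(b)| \le L_i\,|a - b| \quad \forall\, a, b \in \mathbb{R},
% Monotone nondecreasing:
\bigl(g_i(a) - g_i(b)\bigr)(a - b) \ge 0 \quad \forall\, a, b \in \mathbb{R}.
```

Sigmoidal activations such as $\tanh$ satisfy both conditions; the GES results described as not requiring monotonicity would drop only the second inequality.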
- Number of pages: 8
- Journal: IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications
- Publication status: Published - Sep 2002
- Global asymptotic stability
- Global exponential stability
- Global Lipschitz
- Recurrent neural networks