Global stability of a class of continuous-time recurrent neural networks

Sanqing Hu, Jun Wang

Research output: Contribution to journal › Article › peer-review

88 Citations (Scopus)

Abstract

This paper investigates global asymptotic stability (GAS) and global exponential stability (GES) of a class of continuous-time recurrent neural networks. First, we introduce a necessary and sufficient condition for the existence and uniqueness of an equilibrium of the neural networks with Lipschitz continuous activation functions. Next, we present two sufficient conditions to ascertain the GAS of the neural networks with globally Lipschitz continuous and monotone nondecreasing activation functions. We then give two GES conditions for the neural networks whose activation functions may not be monotone nondecreasing. We also provide a Lyapunov diagonal stability condition, without the nonsingularity requirement on the connection weight matrices, to ascertain the GES of the neural networks with globally Lipschitz continuous and monotone nondecreasing activation functions. This Lyapunov diagonal stability condition generalizes and unifies many of the existing GAS and GES results. Moreover, two higher estimates of the exponential convergence rate are obtained.
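As a rough illustration of the setting the abstract describes, the sketch below simulates the standard additive recurrent network model commonly studied in this literature, dx/dt = -Dx + Wg(x) + u, with a globally Lipschitz activation. The specific matrices, the tanh activation, and the diagonal-dominance check are assumptions for illustration only, not the paper's conditions: with D = I and a Lipschitz-1 activation, ||W|| < 1 is a simple, well-known sufficient condition for a unique globally exponentially stable equilibrium, so trajectories from different initial states converge to the same point.

```python
import numpy as np

# Hypothetical instance of the additive recurrent network model
#   dx/dt = -D x + W g(x) + u
# with D positive diagonal, W the connection weight matrix, and
# g = tanh (globally Lipschitz with constant 1).

def simulate(x0, D, W, u, dt=1e-3, steps=40000):
    """Forward-Euler integration of the network dynamics."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x += dt * (-D @ x + W @ np.tanh(x) + u)
    return x

D = np.diag([1.0, 1.0])
# Illustrative weights: the infinity norm of W is 0.5 < 1, so with D = I a
# standard contraction argument gives a unique, globally exponentially
# stable equilibrium (convergence rate at least 1 - ||W|| = 0.5).
W = np.array([[0.2, -0.3],
              [0.1,  0.4]])
u = np.array([0.5, -0.2])

# Trajectories from two widely separated initial states meet at the
# same equilibrium, consistent with global exponential stability.
xa = simulate([ 5.0, -5.0], D, W, u)
xb = simulate([-3.0,  8.0], D, W, u)
print(np.allclose(xa, xb, atol=1e-6))
```

Here the Euler step size and horizon are chosen so that the exponential contraction has driven the two trajectories together to well below the tolerance.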

Original language: English
Pages (from-to): 1334-1341
Number of pages: 8
Journal: IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications
Volume: 49
Issue number: 9
DOIs
Publication status: Published - Sep 2002
Externally published: Yes

Keywords

  • Continuous
  • Continuous-time
  • Global asymptotic stability
  • Global exponential stability
  • Global Lipschitz
  • Recurrent neural networks
