Global asymptotic stability and global exponential stability of continuous-time recurrent neural networks

Sanqing Hu, Jun Wang

Research output: Contribution to journal › Article › peer-review

64 Citations (Scopus)

Abstract

This note presents new results on the global asymptotic stability (GAS) and global exponential stability (GES) of a general class of continuous-time recurrent neural networks with Lipschitz continuous and monotone nondecreasing activation functions. We first give three sufficient conditions for the GAS of neural networks. These testable sufficient conditions differ from and improve upon existing ones. We then extend an existing GAS result to a GES one, and also extend existing GES results to more general cases with less restrictive connection weight matrices and/or partially Lipschitz activation functions.
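The model class studied here can be illustrated with the standard additive continuous-time RNN, dx/dt = -a x + W g(x) + u, where g is a Lipschitz continuous, monotone nondecreasing activation (e.g. tanh). The sketch below is an assumption for illustration only: the paper's exact model and stability conditions are not reproduced here. It uses a forward-Euler integration and a weight matrix with small enough gain that trajectories from different initial states converge to one equilibrium, the qualitative behavior that GAS/GES conditions guarantee.

```python
import numpy as np

def simulate(x0, W, u, a=1.0, g=np.tanh, dt=0.01, steps=5000):
    """Forward-Euler integration of the additive CT-RNN
    dx/dt = -a*x + W @ g(x) + u  (illustrative model, not the
    paper's exact formulation)."""
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(steps):
        x = x + dt * (-a * x + W @ g(x) + u)
        traj.append(x.copy())
    return np.array(traj)

# A weight matrix whose gain is well below the self-decay a=1;
# with a 1-Lipschitz activation this makes the dynamics contracting,
# a simple (conservative) sufficient condition for a unique,
# globally attracting equilibrium.
W = np.array([[0.2, -0.3],
              [0.1, 0.25]])
u = np.array([0.5, -0.4])

xa = simulate([5.0, -5.0], W, u)
xb = simulate([-3.0, 4.0], W, u)

# Trajectories from very different initial states end up together.
print(np.linalg.norm(xa[-1] - xb[-1]))
```

Note that this small-gain check is only one crude sufficient condition; the note's contribution is precisely to give sharper, testable conditions that allow less restrictive connection weight matrices.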

Original language: English
Pages (from-to): 802-807
Number of pages: 6
Journal: IEEE Transactions on Automatic Control
Volume: 47
Issue number: 5
DOIs: Yes
Publication status: Published - May 2002
Externally published: Yes

Keywords

  • Global asymptotic (exponential) stability
  • Lipschitz continuous
  • Recurrent neural networks
