Improved conditions for global exponential stability of recurrent neural networks with time-varying delays

Zhigang Zeng, Jun Wang

Research output: Contribution to journal › Article › peer-review

151 Citations (Scopus)

Abstract

This paper presents new theoretical results on the global exponential stability of recurrent neural networks with bounded activation functions and time-varying delays. The stability conditions depend on the external inputs, connection weights, and time delays of the recurrent neural networks. Using these results, the global exponential stability of recurrent neural networks can be established, and the location of the equilibrium point can be estimated. As typical representatives, the Hopfield neural network (HNN) and the cellular neural network (CNN) are examined in detail.
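Conditions of this kind in the delayed neural network literature are often stated in M-matrix form: roughly, if the self-feedback strengths dominate the combined instantaneous and delayed connection weights (scaled by the activation functions' Lipschitz constants), global exponential stability follows. The sketch below illustrates a generic test of that type; it is not the paper's exact criterion, and the matrices `D`, `A`, `B`, `L` are hypothetical example data, assuming a network of the form x'(t) = -Dx(t) + Af(x(t)) + Bf(x(t - τ(t))) + u.

```python
import numpy as np

def is_nonsingular_m_matrix(M, tol=1e-10):
    """Check whether M is a nonsingular M-matrix.

    Uses one standard characterization: M is a Z-matrix
    (non-positive off-diagonal entries) and all of its
    leading principal minors are positive.
    """
    off_diag = M - np.diag(np.diag(M))
    if np.any(off_diag > tol):  # not a Z-matrix
        return False
    n = M.shape[0]
    return all(np.linalg.det(M[:k, :k]) > tol for k in range(1, n + 1))

# Hypothetical two-neuron example (illustrative values only):
# D: self-feedback rates, A: instantaneous weights,
# B: delayed weights, L: activation Lipschitz constants.
D = np.diag([2.0, 2.5])
A = np.array([[0.3, -0.2],
              [0.1,  0.4]])
B = np.array([[ 0.2, 0.1],
              [-0.1, 0.3]])
L = np.diag([1.0, 1.0])

# Generic M-matrix-type sufficient condition: D - (|A| + |B|) L
# being a nonsingular M-matrix implies global exponential stability.
test_matrix = D - (np.abs(A) + np.abs(B)) @ L
print(is_nonsingular_m_matrix(test_matrix))  # prints True for these values
```

For the example values, `test_matrix` equals [[1.5, -0.3], [-0.2, 1.8]], whose leading principal minors (1.5 and 2.64) are both positive, so the test passes; delay-independent criteria of this shape are what M-matrix conditions like those in the paper's keywords typically yield.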

Original language: English
Pages (from-to): 623-635
Number of pages: 13
Journal: IEEE Transactions on Neural Networks
Volume: 17
Issue number: 3
DOIs
Publication status: Published - May 2006
Externally published: Yes

Keywords

  • External inputs
  • M-matrix
  • Neural networks (NNs)
  • Stability
  • Time-varying delay

