An additive diagonal-stability condition for absolute exponential stability of a general class of neural networks

Xue Bin Liang, Jun Wang

Research output: Contribution to journal › Article › peer-review

49 Citations (Scopus)

Abstract

This paper presents new results on the absolute exponential stability (AEST) of neural networks with a general class of partially Lipschitz continuous and monotone increasing activation functions, under the mild condition that the interconnection matrix T of the network system is additively diagonally stable; i.e., for any positive diagonal matrix D_1, there exists a positive diagonal matrix D_2 such that D_2(T - D_1) + (T - D_1)^T D_2 is negative definite. This result means that neural networks with additively diagonally stable interconnection matrices are guaranteed to be globally exponentially stable for any neuron activation functions in the above class, any constant input vectors, and any other network parameters. The additively diagonally stable interconnection matrices include diagonally semistable matrices and H-matrices with nonpositive diagonal elements as special cases. The obtained AEST result substantially extends the existing results in the literature on absolute stability (ABST) of neural networks. The additive diagonal stability condition is shown to be necessary and sufficient for AEST of neural networks with two neurons. A summary and discussion of the known results on ABST and AEST of neural networks are also given.
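For readers who want to experiment with the condition numerically, the following sketch (not from the paper; the example matrix T, the diagonal vectors d1 and d2, and the function names are all hypothetical) checks whether a given pair of positive diagonal matrices D_1, D_2 makes D_2(T - D_1) + (T - D_1)^T D_2 negative definite:

    import numpy as np

    def is_negative_definite(M, tol=1e-10):
        # D2(T - D1) + (T - D1)^T D2 is symmetric by construction when D2 is
        # diagonal; symmetrizing here only guards against round-off.
        S = (M + M.T) / 2.0
        return bool(np.all(np.linalg.eigvalsh(S) < -tol))

    def additive_stability_check(T, d1, d2):
        """Check whether D2(T - D1) + (T - D1)^T D2 is negative definite
        for D1 = diag(d1) and D2 = diag(d2), both assumed positive."""
        D1 = np.diag(d1)
        D2 = np.diag(d2)
        A = T - D1
        return is_negative_definite(D2 @ A + A.T @ D2)

    # Hypothetical 2x2 example, not taken from the paper.
    T = np.array([[-1.0, 0.5],
                  [0.2, -1.5]])
    d1 = np.array([0.3, 0.7])   # some positive diagonal D1
    d2 = np.array([1.0, 1.0])   # candidate positive diagonal D2
    print(additive_stability_check(T, d1, d2))   # True for this instance

Note that additive diagonal stability requires such a D_2 to exist for every positive diagonal D_1, so a single successful check only verifies one instance; establishing the property in general relies on the matrix-theoretic criteria discussed in the paper, such as diagonal semistability or the H-matrix condition.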

Original language: English
Pages (from-to): 1308-1317
Number of pages: 10
Journal: IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications
Volume: 48
Issue number: 11
Publication status: Published - Nov 2001
Externally published: Yes

Keywords

  • Absolute exponential stability
  • Additive diagonal stability
  • Diagonal semistability
  • Global exponential stability
  • H-matrix
  • Neural networks
  • Partial Lipschitz continuity

