Absolute exponential stability of neural networks with a general class of activation functions

Xue Bin Liang, Jun Wang

Research output: Contribution to journal › Article › peer-review

63 Citations (Scopus)

Abstract

This brief investigates the absolute exponential stability (AEST) of neural networks with a general class of partially Lipschitz continuous (defined in Section II) and monotone increasing activation functions. The main result is that if the interconnection matrix T of the network system is such that -T is an H-matrix with nonnegative diagonal elements, then the neural network system is absolutely exponentially stable (AEST); i.e., the network system is globally exponentially stable (GES) for any activation function in the above class, any constant input vector, and any other network parameters. The obtained AEST result extends the existing absolute stability (ABST) results for neural networks with special classes of activation functions in the literature.
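The stability criterion in the abstract is numerically checkable. A minimal sketch of such a check is given below, assuming the standard definition that a square matrix is an H-matrix when its comparison matrix (absolute values on the diagonal, negated absolute values off the diagonal) is a nonsingular M-matrix; the matrix `T` used here is a hypothetical example, not one from the paper.

```python
import numpy as np

def is_h_matrix_with_nonneg_diag(A, tol=1e-10):
    """Check whether A is an H-matrix with nonnegative diagonal elements.

    Applied to A = -T, this tests the sufficient AEST condition stated
    in the abstract. Uses the characterization: the comparison matrix M
    is a nonsingular M-matrix iff, writing M = s*I - B with B >= 0
    entrywise, the spectral radius of B is strictly less than s.
    """
    A = np.asarray(A, dtype=float)
    if np.any(np.diag(A) < -tol):
        return False  # diagonal elements must be nonnegative
    # Build the comparison matrix of A.
    M = -np.abs(A)
    np.fill_diagonal(M, np.abs(np.diag(A)))
    # Split M = s*I - B with B nonnegative.
    s = M.diagonal().max()
    B = s * np.eye(A.shape[0]) - M
    # Nonsingular M-matrix iff spectral radius of B < s.
    return np.max(np.abs(np.linalg.eigvals(B))) < s - tol

# Hypothetical interconnection matrix T; -T is strictly diagonally
# dominant with positive diagonal, hence an H-matrix.
T = np.array([[-2.0, 0.5],
              [0.3, -1.5]])
print(is_h_matrix_with_nonneg_diag(-T))  # -> True
```

Under the paper's theorem, a `True` result for `-T` guarantees GES for every activation function in the admissible class, independently of the input vector and the remaining network parameters.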

Original language: English
Pages (from-to): 1258-1263
Number of pages: 6
Journal: IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications
Volume: 47
Issue number: 8
DOIs
Publication status: Published - Aug 2000
Externally published: Yes
