Global exponential periodicity and global exponential stability of a class of recurrent neural networks with various activation functions and time-varying delays

Boshan Chen, Jun Wang

Research output: Contribution to journal › Article › peer-review

50 Citations (Scopus)

Abstract

The paper presents theoretical results on the global exponential periodicity and global exponential stability of a class of recurrent neural networks with various general activation functions and time-varying delays. The general activation functions include monotone nondecreasing functions, globally Lipschitz continuous and monotone nondecreasing functions, semi-Lipschitz continuous mixed monotone functions, and Lipschitz continuous functions. For each class of activation functions, testable algebraic criteria for ascertaining the global exponential periodicity and global exponential stability of the recurrent neural networks are derived by using the comparison principle and the theory of monotone operators. Furthermore, the rate of exponential convergence and bounds on the attractive domains of periodic oscillations or equilibrium points are estimated. The convergence analysis based on these generalized activation functions widens the scope of neural network model design. In addition, the new analytical method enriches the toolbox for the qualitative analysis of neural networks.
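The model class studied in this line of work is typically a delayed recurrent network of the form x'(t) = -Dx(t) + W f(x(t)) + W_d f(x(t - τ(t))) + u(t). The sketch below is an illustrative simulation only, not the paper's method: the specific weights, decay rate, delay, and tanh activation (which is globally Lipschitz and monotone nondecreasing, one of the classes covered) are hypothetical choices made small enough that standard diagonal-dominance conditions for global exponential stability plausibly hold; it shows two trajectories from different initial histories converging exponentially toward the same equilibrium.

```python
import numpy as np

def simulate_delayed_rnn(x0, T=30.0, dt=0.01):
    """Euler simulation of a 2-neuron recurrent network with a constant
    delay (illustrative parameters, not taken from the paper)."""
    d = 1.0                        # self-decay rate (matrix D = d*I)
    W = np.array([[0.2, -0.1],     # instantaneous connection weights
                  [0.1,  0.2]])
    Wd = np.array([[0.1,  0.05],   # delayed connection weights
                   [-0.05, 0.1]])
    u = np.array([0.5, -0.3])      # constant external input
    tau = 1.0                      # delay (constant here for simplicity;
                                   # the paper allows time-varying delays)
    f = np.tanh                    # Lipschitz, monotone nondecreasing activation

    n_steps = int(T / dt)
    n_delay = int(tau / dt)
    # Constant initial history on [-tau, 0], as is standard for delay systems.
    hist = [np.array(x0, dtype=float)] * (n_delay + 1)
    for _ in range(n_steps):
        x = hist[-1]
        x_del = hist[-1 - n_delay]
        dx = -d * x + W @ f(x) + Wd @ f(x_del) + u
        hist.append(x + dt * dx)
    return np.array(hist)

# Two different initial conditions; under exponential stability their
# trajectories should contract toward a unique equilibrium.
traj1 = simulate_delayed_rnn([2.0, -2.0])
traj2 = simulate_delayed_rnn([-1.0, 1.5])
gap = np.linalg.norm(traj1[-1] - traj2[-1])
```

With these small weights the decay term dominates, so `gap` shrinks toward zero; with a periodic input u(t) instead of a constant one, the same contraction argument yields convergence to a periodic orbit, which is the periodicity side of the paper's results.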

Original language: English
Pages (from-to): 1067-1080
Number of pages: 14
Journal: Neural Networks
Volume: 20
Issue number: 10
DOIs
Publication status: Published - Dec 2007
Externally published: Yes

Keywords

  • Comparison principle
  • Global exponential stability
  • Mixed monotone operator
  • Periodic oscillation
  • Recurrent neural network

