Complete stability of delayed recurrent neural networks with Gaussian activation functions

Peng Liu, Zhigang Zeng, Jun Wang

Research output: Contribution to journal › Article › peer-review

31 Citations (Scopus)

Abstract

This paper addresses the complete stability of delayed recurrent neural networks with Gaussian activation functions. By means of the geometrical properties of the Gaussian function and the algebraic properties of nonsingular M-matrices, sufficient conditions are obtained ensuring that an n-neuron neural network has exactly 3^k equilibrium points for some 0 ≤ k ≤ n, among which 2^k are locally exponentially stable and the remaining 3^k − 2^k are unstable. Moreover, it is shown that every state trajectory converges to one of the equilibrium points; i.e., the neural networks are completely stable. The derived conditions can be easily tested. Finally, a numerical example is given to illustrate the theoretical results.
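As illustrative context only (not taken from the paper), the sketch below simulates a one-neuron delayed recurrent network of the general form x'(t) = −d·x(t) + a·f(x(t)) + b·f(x(t−τ)) + u with a Gaussian activation f(x) = exp(−(x−c)²/s²), and shows trajectories settling to an equilibrium. The single-neuron reduction, the constant delay, and all parameter values are hypothetical choices for demonstration; the paper's results concern n-neuron networks with time-varying delays and M-matrix-based conditions.

```python
# Minimal sketch (assumed model form and parameters, for illustration only):
# simulate  x'(t) = -d*x(t) + a*f(x(t)) + b*f(x(t - tau)) + u,
# with a Gaussian activation f, and watch trajectories converge.
import numpy as np

def gaussian(x, c=0.0, s=1.0):
    """Gaussian activation: bounded, non-monotonic, peaked at x = c."""
    return np.exp(-((x - c) ** 2) / s ** 2)

def simulate(x_hist, d=1.0, a=0.5, b=0.3, u=0.1, tau=1.0, dt=0.01, T=30.0):
    """Forward-Euler integration of the delayed ODE.

    x_hist : constant initial history on [-tau, 0].
    Returns the trajectory sampled every dt.
    """
    n_delay = int(round(tau / dt))           # delay expressed in steps
    n_steps = int(round(T / dt))
    x = np.full(n_delay + n_steps + 1, x_hist, dtype=float)
    for k in range(n_delay, n_delay + n_steps):
        x_now, x_del = x[k], x[k - n_delay]
        dx = -d * x_now + a * gaussian(x_now) + b * gaussian(x_del) + u
        x[k + 1] = x_now + dt * dx
    return x[n_delay:]

if __name__ == "__main__":
    # With parameters admitting multiple equilibria, different initial
    # histories may settle to different stable equilibria (multistability).
    for x0 in (-2.0, 0.0, 2.0):
        traj = simulate(x0)
        print(f"initial history {x0:+.1f} -> final state {traj[-1]:+.4f}")
```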

Original language: English
Pages (from-to): 21-32
Number of pages: 12
Journal: Neural Networks
Volume: 85
DOIs
Publication status: Published - 1 Jan 2017
Externally published: Yes

Keywords

  • Complete stability
  • Gaussian functions
  • Recurrent neural networks
  • Time-varying delays
