Robustness analysis of global exponential stability of recurrent neural networks in the presence of time delays and random disturbances

Yi Shen, Jun Wang

Research output: Contribution to journal › Article › peer-review

119 Citations (Scopus)

Abstract

In recent years, the global stability of recurrent neural networks (RNNs) has been investigated extensively. It is well known that time delays and external disturbances can derail the stability of RNNs. In this paper, we analyze the robustness of the global stability of RNNs subject to time delays and random disturbances. Given a globally exponentially stable RNN, the problem addressed here is how much time delay and noise the network can withstand while remaining globally exponentially stable. The upper bounds on the time delay and noise intensity under which the RNN sustains global exponential stability are characterized by transcendental equations. Moreover, we prove theoretically that, for any globally exponentially stable RNN, if the additive noise and time delay are smaller than these derived upper bounds, then the perturbed RNN is guaranteed to remain globally exponentially stable. Three numerical examples are provided to substantiate the theoretical results.
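The abstract notes that the admissible delay and noise bounds are characterized by transcendental equations, which generally have no closed-form solution and must be solved numerically. As a minimal sketch of that step, the snippet below solves a *hypothetical* transcendental bound equation of the form k·τ·exp(β·τ) = 1 by bisection; the constants `k` and `beta` are illustrative stand-ins, not the quantities derived in the paper.

```python
import math

def bisect(f, lo, hi, tol=1e-10):
    """Standard bisection root finder; assumes f(lo) and f(hi) differ in sign."""
    flo = f(lo)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if flo * f(mid) <= 0.0:
            hi = mid          # root lies in [lo, mid]
        else:
            lo, flo = mid, f(mid)  # root lies in [mid, hi]
    return 0.5 * (lo + hi)

# Hypothetical transcendental bound equation: k * tau * exp(beta * tau) = 1.
# k and beta are illustrative constants standing in for quantities that, in
# an analysis of this kind, would come from the network's Lipschitz constants
# and exponential decay rate.
k, beta = 2.0, 1.5
f = lambda tau: k * tau * math.exp(beta * tau) - 1.0

# Maximal admissible delay for this toy equation: f(0) < 0 and f(1) > 0,
# so a sign change (and hence a root) exists in (0, 1).
tau_bar = bisect(f, 0.0, 1.0)
print(f"delay bound tau_bar ≈ {tau_bar:.6f}")
```

Any delay below `tau_bar` would then, in this toy setting, satisfy the bound; the same numerical approach applies to whatever transcendental equation the analysis actually produces.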

Original language: English
Article number: 6104229
Pages (from-to): 87-96
Number of pages: 10
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 23
Issue number: 1
DOIs
Publication status: Published - 2012
Externally published: Yes

Keywords

  • Global exponential stability
  • random disturbances
  • recurrent neural networks
  • robustness
  • time delays
