Admissible Delay Upper Bounds for Global Asymptotic Stability of Neural Networks with Time-Varying Delays

Xian-Ming Zhang, Qing-Long Han, Jun Wang

Research output: Contribution to journal › Article › peer-review

123 Citations (Scopus)

Abstract

This paper is concerned with the global asymptotic stability of a neural network with a time-varying delay, where the delay function is differentiable and uniformly bounded, with its derivative bounded from above. First, a general reciprocally convex inequality is presented by introducing slack vectors of flexible dimensions. This inequality provides a tighter bound, in the form of a convex combination, than some existing ones. Second, by constructing a proper Lyapunov-Krasovskii functional, the global asymptotic stability of the neural network is analyzed for two types of time-varying delays, depending on whether or not the lower bound of the delay derivative is known. Third, noticing that the sufficient stability conditions obtained from estimating the derivative of the Lyapunov-Krasovskii functional are affine in both the delay function and its derivative, the allowable delay sets can be refined to produce less conservative stability criteria for the neural network under study. Finally, two numerical examples are given to substantiate the effectiveness of the proposed method.
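The abstract's first contribution builds on the reciprocally convex combination idea. As a hedged illustration only, the NumPy sketch below numerically checks the baseline form of that inequality (the paper's generalized version with flexible-dimension slack vectors is not reproduced here): if the block matrix [[R1, S], [S^T, R2]] is positive semidefinite, then diag(R1/α, R2/(1−α)) dominates it for every α in (0, 1). All matrices are randomly generated assumptions, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# Randomly generated positive definite matrices R1, R2 (illustrative assumptions).
A = rng.standard_normal((n, n)); R1 = A @ A.T + n * np.eye(n)
B = rng.standard_normal((n, n)); R2 = B @ B.T + n * np.eye(n)
S = 0.1 * rng.standard_normal((n, n))  # small coupling so the block matrix stays PSD

# Hypothesis of the lemma: the coupled block matrix is positive semidefinite.
block = np.block([[R1, S], [S.T, R2]])
assert np.linalg.eigvalsh(block).min() >= 0

# Conclusion: diag(R1/alpha, R2/(1-alpha)) - block is PSD for all alpha in (0, 1).
for alpha in np.linspace(0.05, 0.95, 19):
    lhs = np.block([[R1 / alpha, np.zeros((n, n))],
                    [np.zeros((n, n)), R2 / (1 - alpha)]])
    diff = lhs - block
    assert np.linalg.eigvalsh(diff).min() >= -1e-9
print("reciprocally convex bound verified for all sampled alpha")
```

The check follows from a congruence transformation: the difference equals T · block · T with T = diag(√((1−α)/α) I, −√(α/(1−α)) I), so positive semidefiniteness of the block matrix carries over for every α.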

Original language: English
Article number: 8294050
Pages (from-to): 5319-5329
Number of pages: 11
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 29
Issue number: 11
DOIs
Publication status: Published - Nov 2018
Externally published: Yes

Keywords

  • Admissible delay upper bounds
  • global asymptotic stability
  • neural networks
  • time-varying delay
