Global exponential dissipativity and stabilization of memristor-based recurrent neural networks with time-varying delays

Zhenyuan Guo, Jun Wang, Zheng Yan

Research output: Contribution to journal › Article › peer-review

176 Citations (Scopus)

Abstract

This paper addresses the global exponential dissipativity of memristor-based recurrent neural networks with time-varying delays. By constructing proper Lyapunov functionals and using M-matrix theory and the LaSalle invariance principle, the sets of global exponential dissipativity are characterized parametrically. It is proven herein that there are 2^(2n^2-n) equilibria for an n-neuron memristor-based neural network and that they are located in the derived globally attractive sets. It is also shown that memristor-based recurrent neural networks with time-varying delays can be stabilized at the origin of the state space by a linear state feedback control law with appropriate gains. Finally, two numerical examples are discussed in detail to illustrate the characteristics of the results.
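As a rough illustration of the setting described in the abstract, the following Python sketch simulates a small memristor-based recurrent neural network with a time-varying delay under a linear state feedback control u = -Kx. The network size, the switching rule (weights toggling on |x_i| <= 1), the weight values, the delay profile, and the gain matrix K are all illustrative assumptions rather than the paper's exact model or parameters; the gains are simply chosen large enough that a basic diagonal-dominance-type sufficient condition for convergence to the origin holds.

```python
import numpy as np

# Minimal simulation sketch (assumed model, not the paper's exact one) of a
# 2-neuron memristor-based recurrent neural network with a time-varying delay
# and a linear state feedback control u = -K x.
n = 2
dt = 1e-3
T = 10.0
tau_max = 1.0                      # upper bound of the time-varying delay

D = np.diag([1.0, 1.0])            # self-feedback (leakage) rates
# Each connection weight takes one of two values depending on the state
# (memristance switching); here the switch is on |x_i| <= 1 (assumed rule).
A_low  = np.array([[ 2.0, -0.1], [-5.0,  3.0]])
A_high = np.array([[ 1.8, -0.1], [-4.8,  2.8]])
B_low  = np.array([[-1.5, -0.1], [-0.2, -2.5]])
B_high = np.array([[-1.6, -0.1], [-0.3, -2.6]])

# Illustrative gains: d_i + k_i exceeds the row sums of |a_ij| + |b_ij|,
# a simple sufficient condition for stabilization at the origin.
K = np.diag([8.0, 12.0])

f = np.tanh                                   # bounded activation function
tau = lambda t: 0.5 + 0.5 * np.sin(t) ** 2    # time-varying delay in [0.5, 1]

steps = int(T / dt)
hist = int(tau_max / dt)
x = np.zeros((steps + hist, n))
x[:hist] = np.array([1.5, -2.0])   # constant initial history on [-tau_max, 0]

for k in range(hist, steps + hist):
    t = (k - hist) * dt
    xk = x[k - 1]
    kd = max(k - 1 - int(tau(t) / dt), 0)  # index of the delayed state
    xd = x[kd]

    # State-dependent (memristive) weight selection, row-wise on |x_i|.
    switch = (np.abs(xk) <= 1.0)[:, None]
    A = np.where(switch, A_low, A_high)
    B = np.where(switch, B_low, B_high)

    u = -K @ xk                    # linear state feedback control law
    dx = -D @ xk + A @ f(xk) + B @ f(xd) + u
    x[k] = xk + dt * dx            # explicit Euler step

print("final state:", x[-1])       # near the origin when K is stabilizing
```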

Original language: English
Pages (from-to): 158-172
Number of pages: 15
Journal: Neural Networks
Volume: 48
DOIs
Publication status: Published - Dec 2013
Externally published: Yes

Keywords

  • Global exponential dissipativity
  • Memristor
  • Neurodynamics
  • Recurrent neural networks
  • Stabilization
