A one-layer recurrent neural network for constrained nonsmooth optimization

Qingshan Liu, Jun Wang

Research output: Contribution to journal › Article › peer-review

115 Citations (Scopus)

Abstract

This paper presents a novel one-layer recurrent neural network, modeled by means of a differential inclusion, for solving nonsmooth optimization problems, in which the number of neurons in the proposed neural network equals the number of decision variables of the optimization problem. Compared with existing neural networks for nonsmooth optimization, the global convexity condition on the objective function and constraints is relaxed, allowing both to be nonconvex. It is proven that the state variables of the proposed neural network converge to optimal solutions if a single design parameter in the model exceeds a derived lower bound. Numerical examples with simulation results substantiate the effectiveness and illustrate the characteristics of the proposed neural network.
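The abstract does not spell out the network dynamics. As a minimal sketch, the Python snippet below assumes a penalty-based subgradient flow dx/dt ∈ -(∂f(x) + σ∂p(x)), a common realization of a one-layer network for nonsmooth optimization, where σ stands in for the single design parameter mentioned above and p is an exact penalty on the constraints. The example problem, the function names, and all numerical values are illustrative assumptions, not taken from the paper.

import numpy as np

# Hypothetical example problem (for illustration only):
# minimize f(x) = |x1 - 2| + |x2 + 1| subject to x1 + x2 = 1.

def subgrad_f(x):
    # A subgradient of the nonsmooth objective f(x) = |x1 - 2| + |x2 + 1|.
    # np.sign(0) = 0 is a valid element of the subdifferential of |.| at 0.
    return np.sign(x - np.array([2.0, -1.0]))

def subgrad_penalty(x):
    # A subgradient of the exact penalty p(x) = |x1 + x2 - 1|
    # for the equality constraint x1 + x2 = 1.
    return np.sign(x[0] + x[1] - 1.0) * np.ones(2)

def simulate(x0, sigma=5.0, dt=1e-3, steps=20_000):
    # Forward-Euler discretization of the assumed subgradient flow
    #   dx/dt in -(subgrad_f(x) + sigma * subgrad_penalty(x)).
    # The state vector has one component per decision variable,
    # mirroring the one-neuron-per-variable structure of the network.
    # sigma plays the role of the design parameter that must exceed
    # a problem-dependent lower bound for the penalty to be exact.
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x -= dt * (subgrad_f(x) + sigma * subgrad_penalty(x))
    return x

print(simulate([0.0, 0.0]))  # settles near (2, -1), which is feasible

In this sketch, once σ is large enough, the trajectory first reaches the feasible set and then slides along it toward a minimizer; with forward Euler the state chatters in a neighborhood of the solution whose size shrinks with the step dt.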

Original language: English
Article number: 5759759
Pages (from-to): 1323-1333
Number of pages: 11
Journal: IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Volume: 41
Issue number: 5
DOIs
Publication status: Published - Oct 2011
Externally published: Yes

Keywords

  • convergence
  • differential inclusion
  • Lyapunov function
  • nonsmooth optimization
  • recurrent neural networks

