A one-layer projection neural network for nonsmooth optimization subject to linear equalities and bound constraints

Qingshan Liu, Jun Wang

Research output: Contribution to journal › Article › peer-review

169 Citations (Scopus)

Abstract

This paper presents a one-layer projection neural network for solving nonsmooth optimization problems with generalized convex objective functions subject to linear equality and bound constraints. The proposed neural network is designed based on two projection operators: one for the linear equality constraints and one for the bound constraints. The objective function in the optimization problem can be any nonsmooth function that is not restricted to be convex globally, but is required to be convex (pseudoconvex) on a set defined by the constraints. Compared with existing recurrent neural networks for nonsmooth optimization, the proposed model does not have any design parameter, making it more convenient to design and implement. It is proved that the output variables of the proposed neural network are globally convergent to the optimal solutions, provided that the objective function is at least pseudoconvex. Simulation results of numerical examples are discussed to demonstrate the effectiveness and characteristics of the proposed neural network.
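The abstract describes a network built from two projections (one onto the linear equality constraints, one onto the bound constraints) that drives the state toward a minimizer of a possibly nonsmooth objective. The paper's exact one-layer dynamics are given in the article itself; the following is only a minimal discrete-time sketch of the same ingredients, using a projected subgradient iteration on a small illustrative problem (the objective, bounds, and step size here are assumptions for illustration, not taken from the paper):

```python
import numpy as np

# Illustrative sketch (not the paper's exact dynamics):
#   minimize f(x) = |x1 - 2| + |x2|
#   subject to x1 + x2 = 1 and 0 <= x <= 1.5,
# iterated as x_{k+1} = P_Omega(x_k - step * g_k), g_k a subgradient of f.

lo, hi = 0.0, 1.5  # bound constraints

def project(y):
    # Exact projection onto {x : sum(x) = 1, lo <= x <= hi}: the
    # projection has the form clip(y - lam, lo, hi), and we find the
    # multiplier lam by bisection so that the equality sum(x) = 1 holds.
    a, b = y.min() - hi, y.max() - lo  # lam bracket
    for _ in range(60):
        lam = 0.5 * (a + b)
        if np.clip(y - lam, lo, hi).sum() > 1.0:
            a = lam
        else:
            b = lam
    return np.clip(y - 0.5 * (a + b), lo, hi)

def subgrad(x):
    # One subgradient of the nonsmooth objective |x1 - 2| + |x2|
    return np.array([np.sign(x[0] - 2.0), np.sign(x[1])])

x = project(np.array([0.3, 0.3]))  # feasible starting point
for _ in range(2000):
    x = project(x - 0.01 * subgrad(x))

print(np.round(x, 2))  # approaches the constrained minimizer [1. 0.]
```

Composing a clip with an affine projection does not in general project onto the intersection of the two sets, which is why this sketch computes the joint projection directly via the multiplier search; the paper's one-layer model handles the two constraint types with separate projection operators inside a single dynamical system.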

Original language: English
Pages (from-to): 812-824
Number of pages: 13
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 24
Issue number: 5
Publication status: Published - 2013
Externally published: Yes

Keywords

  • Differential inclusion
  • Global convergence
  • Lyapunov function
  • Nonsmooth optimization
  • Projection neural network
