A novel recurrent neural network for solving nonlinear optimization problems with inequality constraints

Youshen Xia, Gang Feng, Jun Wang

Research output: Contribution to journal › Article › peer-review

155 Citations (Scopus)

Abstract

This paper presents a novel recurrent neural network for solving nonlinear optimization problems with inequality constraints. Under the condition that the Hessian matrix of the associated Lagrangian function is positive semidefinite, it is shown that the proposed neural network is stable at a Karush-Kuhn-Tucker point in the sense of Lyapunov and that its output trajectory is globally convergent to a minimum solution. Compared with a variety of existing projection neural networks, including their extensions and modifications, for solving such nonlinearly constrained optimization problems, it is shown that the proposed neural network can solve both constrained convex optimization problems and a class of constrained nonconvex optimization problems, with no restriction on the initial point. Simulation results demonstrate the effectiveness of the proposed neural network in solving nonlinearly constrained optimization problems.
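The abstract does not give the network's exact dynamics, but the general idea behind this family of methods can be illustrated with a minimal sketch: a standard projection-type recurrent network (not the paper's specific model) whose state evolves by an ODE whose equilibria are exactly the KKT points of the constrained problem. The example problem, step size, and iteration count below are illustrative assumptions.

```python
import numpy as np

# Illustrative problem (not from the paper):
#   minimize  f(x) = (x1 - 1)^2 + (x2 - 2)^2
#   subject to g(x) = x1 + x2 - 2 <= 0
# KKT solution: x* = (0.5, 1.5), multiplier lam* = 1.
#
# Generic projection-type dynamics (Euler-discretized):
#   dx/dt   = -(grad f(x) + lam * grad g(x))
#   dlam/dt = max(0, lam + g(x)) - lam
# At an equilibrium, dx/dt = 0 gives stationarity of the Lagrangian,
# and dlam/dt = 0 gives g(x) <= 0, lam >= 0, lam * g(x) = 0,
# i.e. the equilibrium is a KKT point.

def grad_f(x):
    return np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] - 2.0)])

def g(x):
    return x[0] + x[1] - 2.0

def grad_g(x):
    return np.array([1.0, 1.0])

def simulate(x0, lam0, dt=0.01, steps=20000):
    """Euler-integrate the network dynamics from (x0, lam0)."""
    x, lam = np.asarray(x0, dtype=float), float(lam0)
    for _ in range(steps):
        x = x + dt * (-(grad_f(x) + lam * grad_g(x)))
        lam = lam + dt * (max(0.0, lam + g(x)) - lam)
    return x, lam

# Start far from the optimum; the trajectory settles at the KKT point.
x_star, lam_star = simulate([3.0, -1.0], 0.0)
print(x_star, lam_star)  # approximately [0.5, 1.5] and 1.0
```

For this convex example the trajectory converges from an arbitrary starting point, which mirrors the abstract's claim that no restriction on the initial point is needed; the paper's contribution is a specific model that extends such guarantees to a class of nonconvex problems.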

Original language: English
Pages (from-to): 1340-1353
Number of pages: 14
Journal: IEEE Transactions on Neural Networks
Volume: 19
Issue number: 8
DOIs
Publication status: Published - 2008
Externally published: Yes

Keywords

  • Global convergence
  • Nonconvex programming
  • Nonlinear inequality constraints
  • Nonsmooth analysis
  • Recurrent neural network

