A recurrent neural network for nonlinear convex optimization subject to nonlinear inequality constraints

Youshen Xia, Jun Wang

Research output: Contribution to journal › Article › peer-review

149 Citations (Scopus)

Abstract

This paper presents a novel recurrent neural network for solving nonlinear convex programming problems subject to nonlinear inequality constraints. Under the condition that the objective function is convex and all constraint functions are strictly convex, or that the objective function is strictly convex and the constraint functions are convex, the proposed neural network is proved to be stable in the sense of Lyapunov and globally convergent to an exact optimal solution. Compared with existing neural networks for solving such nonlinear optimization problems, the proposed neural network has two major advantages. One is that it can solve convex programming problems with general convex inequality constraints. The other is that it does not require a Lipschitz condition on the objective and constraint functions. Simulation results are given to further illustrate the global convergence and performance of the proposed neural network for constrained nonlinear optimization.
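The abstract describes a recurrent neural network whose continuous-time dynamics drive the state toward a KKT point of a convex program min f(x) subject to g(x) ≤ 0. The sketch below is not the authors' exact network; it simulates a generic projection-type primal-dual dynamics of this kind by Euler integration on a small, assumed example problem (the objective, constraint, step size, and iteration count are all illustrative assumptions).

```python
# Minimal sketch (assumed dynamics, not the paper's exact model):
#   dx/dt = -( grad_f(x) + J_g(x)^T * max(y + g(x), 0) )
#   dy/dt =  max(y + g(x), 0) - y
# Equilibria of this ODE satisfy the KKT conditions of  min f(x) s.t. g(x) <= 0.
import numpy as np

def grad_f(x):
    # Gradient of an assumed strictly convex objective f(x) = x1^2 + x2^2
    return 2.0 * x

def g(x):
    # Assumed convex inequality constraint g(x) = 1 - x1 - x2 <= 0
    return np.array([1.0 - x[0] - x[1]])

def jac_g(x):
    # Jacobian of g with respect to x
    return np.array([[-1.0, -1.0]])

def simulate(x0, y0, dt=1e-3, steps=20000):
    """Forward-Euler integration of the primal-dual dynamics."""
    x, y = np.array(x0, dtype=float), np.array(y0, dtype=float)
    for _ in range(steps):
        z = np.maximum(y + g(x), 0.0)           # projected multiplier estimate
        dx = -(grad_f(x) + jac_g(x).T @ z)      # state (decision variable) dynamics
        dy = z - y                               # multiplier dynamics
        x, y = x + dt * dx, y + dt * dy
    return x, y

if __name__ == "__main__":
    x_star, y_star = simulate(x0=[2.0, -1.0], y0=[0.0])
    print("approximate minimizer:", x_star)      # near (0.5, 0.5) for this example
```

For this assumed problem, the trajectory settles at x ≈ (0.5, 0.5) with multiplier y ≈ 1, which is the KKT point of minimizing x1² + x2² subject to x1 + x2 ≥ 1; convergence from an arbitrary initial point is what the paper's Lyapunov analysis establishes for its network.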

Original language: English
Pages (from-to): 1385-1394
Number of pages: 10
Journal: IEEE Transactions on Circuits and Systems I: Regular Papers
Volume: 51
Issue number: 7
DOIs
Publication status: Published - 2004
Externally published: Yes
