A one-layer dual recurrent neural network with a Heaviside step activation function for linear programming with its linear assignment application

Qingshan Liu, Jun Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

10 Citations (Scopus)

Abstract

This paper presents a one-layer recurrent neural network for solving linear programming problems. The proposed neural network is guaranteed to converge globally in finite time to an optimal solution under a mild condition on a derived lower bound of a single gain parameter. The number of neurons in the network equals the number of decision variables of the dual optimization problem. Compared with existing neural networks for linear programming, the proposed network has salient features such as finite-time convergence and lower model complexity. Specifically, the proposed network is tailored to the linear assignment problem, with simulation results demonstrating its effectiveness and characteristics.
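The abstract does not state the network's exact dynamics, but the idea it describes — a single layer of neurons, one per dual variable, driven through a Heaviside step activation with a single gain parameter — can be illustrated with an assumed exact-penalty subgradient flow on the dual LP. The following sketch is a hypothetical reconstruction, not the paper's model: for the primal `min cᵀx s.t. Ax = b, x ≥ 0` with dual `max bᵀy s.t. Aᵀy ≤ c`, it integrates `dy/dt = b − σ·A·h(Aᵀy − c)`, where `h` is the elementwise Heaviside step and `σ` is the gain.

```python
import numpy as np

def heaviside(v):
    # Elementwise Heaviside step activation: 1 where v > 0, else 0.
    return (v > 0).astype(float)

def dual_network(A, b, c, sigma=5.0, dt=1e-3, steps=5000):
    """Forward-Euler simulation of an assumed one-layer dual dynamics
    (a penalty-based subgradient flow, NOT the paper's exact model):
        dy/dt = b - sigma * A @ heaviside(A^T y - c)
    The state y has one neuron per dual variable, as in the abstract."""
    y = np.zeros(A.shape[0])
    for _ in range(steps):
        y += dt * (b - sigma * A @ heaviside(A.T @ y - c))
    return y

# Toy LP:  min x1 + 2*x2  s.t.  x1 + x2 = 1, x >= 0   (optimal value 1).
# Its dual: max y  s.t.  y <= 1, y <= 2, so y* = 1.
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
c = np.array([1.0, 2.0])

y = dual_network(A, b, c)
# With a sufficiently large gain sigma, the trajectory chatters in a small
# neighborhood of the dual optimum y* = 1 (a sliding-mode-like regime).
```

The gain condition mentioned in the abstract is mirrored here by the requirement that `sigma` exceed a problem-dependent lower bound; with too small a gain the penalty term cannot hold the state on the dual feasible set.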

Original language: English
Title of host publication: Artificial Neural Networks and Machine Learning, ICANN 2011 - 21st International Conference on Artificial Neural Networks, Proceedings
Pages: 253-260
Number of pages: 8
Edition: PART 2
Publication status: Published - 2011
Externally published: Yes
Event: 21st International Conference on Artificial Neural Networks, ICANN 2011 - Espoo, Finland
Duration: 14 Jun 2011 - 17 Jun 2011

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 2
Volume: 6792 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 21st International Conference on Artificial Neural Networks, ICANN 2011
Country/Territory: Finland
City: Espoo
Period: 14/06/11 - 17/06/11

Keywords

  • global convergence in finite time
  • linear assignment problem
  • linear programming
  • recurrent neural networks

