A one-layer recurrent neural network with a unipolar hard-limiting activation function for k-winners-take-all operation

Qingshan Liu, Jun Wang

Research output: Chapter in Book/Report/Conference proceeding, Conference contribution, peer-review

2 Citations (Scopus)

Abstract

This paper presents a one-layer recurrent neural network with a unipolar hard-limiting activation function for k-winners-take-all (kWTA) operation. The kWTA operation is first converted into an equivalent quadratic programming problem, and a one-layer recurrent neural network is then constructed to solve it. The neural network is guaranteed to perform the kWTA operation in real time. The stability and convergence of the neural network are proven using Lyapunov and nonsmooth analysis methods.
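The abstract does not reproduce the network model or the equivalent quadratic program, so the sketch below only illustrates the kWTA operation itself and the role of a unipolar hard-limiting (Heaviside) activation. The function names, the scalar-threshold dynamics, and all parameter values are assumptions made for illustration; they are not the network proposed in the paper. (In the kWTA literature, a commonly used equivalent QP is: minimize (1/2)x^T x - v^T x subject to sum_i x_i = k and 0 <= x_i <= 1.)

```python
import numpy as np

def kwta_sort(v, k):
    """Reference k-winners-take-all: 1 for the k largest inputs, 0 otherwise.
    Assumes the k-th and (k+1)-th largest inputs are distinct."""
    x = np.zeros_like(v, dtype=float)
    x[np.argsort(v)[-k:]] = 1.0
    return x

def kwta_threshold_dynamics(v, k, epsilon=1e-2, dt=1e-3, steps=5000):
    """Illustrative single-state sketch (NOT the paper's model): a scalar
    threshold y is adjusted until exactly k components of v + y pass a
    unipolar hard-limiting (Heaviside) activation."""
    y = 0.0
    for _ in range(steps):
        x = (v + y > 0).astype(float)        # unipolar hard-limiting outputs
        y += (dt / epsilon) * (k - x.sum())  # move threshold toward k winners
    return (v + y > 0).astype(float)

v = np.array([0.3, -1.2, 0.9, 0.1, 0.7])
print(kwta_sort(v, 2))                 # [0. 0. 1. 0. 1.]
print(kwta_threshold_dynamics(v, 2))   # should match the reference output
```

For this toy dynamics to settle, the effective step dt/epsilon must be small relative to the gap between the k-th and (k+1)-th largest inputs; otherwise the threshold can oscillate around its equilibrium.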

Original language: English
Title of host publication: The 2007 International Joint Conference on Neural Networks, IJCNN 2007 Conference Proceedings
Pages: 84-89
Number of pages: 6
DOIs
Publication status: Published - 2007
Externally published: Yes
Event: 2007 International Joint Conference on Neural Networks, IJCNN 2007 - Orlando, FL, United States
Duration: 12 Aug 2007 - 17 Aug 2007

Publication series

Name: IEEE International Conference on Neural Networks - Conference Proceedings
ISSN (Print): 1098-7576

Conference

Conference: 2007 International Joint Conference on Neural Networks, IJCNN 2007
Country/Territory: United States
City: Orlando, FL
Period: 12/08/07 - 17/08/07
