A neurodynamic optimization approach to supervised feature selection via fractional programming

Yadi Wang, Xiaoping Li, Jun Wang

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)


Feature selection is an important issue in machine learning and data mining. Most existing feature selection methods are greedy in nature and thus prone to sub-optimality. Although some global feature selection methods based on unsupervised redundancy minimization can improve clustering performance, their efficacy for classification may be limited. In this paper, a neurodynamics-based holistic feature selection approach is proposed via feature redundancy minimization and relevance maximization. An information-theoretic similarity coefficient matrix is defined based on multi-information and entropy to measure feature redundancy with respect to class labels. Supervised feature selection is formulated as a fractional programming problem based on the similarity coefficients. A neurodynamic approach based on two one-layer recurrent neural networks is developed for solving the formulated feature selection problem. Experimental results with eight benchmark datasets are discussed to demonstrate the global convergence of the neural networks and the superiority of the proposed neurodynamic approach to several existing feature selection methods in terms of classification accuracy, precision, recall, and F-measure.
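To make the relevance-versus-redundancy trade-off concrete, here is a minimal sketch in NumPy. It is not the paper's formulation (the authors use multi-information, entropy-based similarity coefficients, and recurrent neural networks to solve the fractional program); it only illustrates the general idea of a fractional objective that rewards feature–label relevance and penalizes feature–feature redundancy, both estimated with discrete mutual information. The function names `mutual_information` and `fractional_score` are hypothetical.

```python
import numpy as np

def mutual_information(a, b):
    """Discrete mutual information I(a; b) in nats, estimated from counts."""
    a_vals, a_idx = np.unique(a, return_inverse=True)
    b_vals, b_idx = np.unique(b, return_inverse=True)
    joint = np.zeros((len(a_vals), len(b_vals)))
    np.add.at(joint, (a_idx, b_idx), 1)        # contingency table
    joint /= joint.sum()                        # joint probabilities
    pa = joint.sum(axis=1, keepdims=True)       # marginal of a
    pb = joint.sum(axis=0, keepdims=True)       # marginal of b
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (pa @ pb)[nz])).sum())

def fractional_score(X, y, selected, eps=1e-12):
    """Relevance/redundancy ratio for a 0-1 feature-selection vector.

    This simple ratio stands in for the paper's fractional objective;
    the actual similarity coefficients there are more elaborate.
    """
    idx = np.flatnonzero(selected)
    relevance = sum(mutual_information(X[:, j], y) for j in idx)
    redundancy = sum(mutual_information(X[:, i], X[:, j])
                     for i in idx for j in idx if i < j)
    return relevance / (eps + redundancy)       # eps guards the empty case
```

On a toy dataset where one feature duplicates another, the ratio favors the subset pairing the relevant feature with an independent one over the subset containing both copies, which is the behavior redundancy minimization is meant to produce.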

Original language: English
Pages (from-to): 194-206
Number of pages: 13
Journal: Neural Networks
Publication status: Published - Apr 2021
Externally published: Yes


Keywords

  • Feature selection
  • Fractional programming
  • Information-theoretic measures
  • Neurodynamic optimization


