A Support Vector Machine with a Hybrid Kernel and Minimal Vapnik-Chervonenkis Dimension

Ying Tan, Jun Wang

Research output: Contribution to journal › Article › peer-review

118 Citations (Scopus)

Abstract

This paper presents a mechanism for training support vector machines (SVMs) with a hybrid kernel and minimal Vapnik-Chervonenkis (VC) dimension. After describing the VC dimension of sets of separating hyperplanes in the high-dimensional feature space produced by a kernel-induced mapping from the input space, we propose an optimization criterion for designing SVMs by minimizing the upper bound of the VC dimension. This method realizes structural risk minimization and utilizes a flexible kernel function so that superior generalization on test data can be obtained. To obtain such a flexible kernel function, we develop a hybrid kernel function, together with a sufficient condition for it to be an admissible Mercer kernel, based on common Mercer kernels (polynomial, radial basis function, two-layer neural network, etc.). The nonnegative combination coefficients and parameters of the hybrid kernel are determined subject to the minimal upper bound of the VC dimension of the learning machine. Experimental results illustrate the proposed method and show that the SVM with the hybrid kernel outperforms SVMs with any single common kernel in terms of generalization power.
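
As a concrete illustration (not the paper's implementation), the sketch below shows the core construction the abstract describes: a nonnegative combination of admissible Mercer kernels is itself an admissible Mercer kernel, so it can be plugged directly into an SVM. The combination weights `beta` and the kernel parameters `degree` and `gamma` here are illustrative placeholders; in the paper they are chosen by minimizing the upper bound on the VC dimension rather than fixed by hand.

```python
# Minimal sketch of a hybrid Mercer kernel used with an SVM.
# Assumption: weights and kernel parameters are fixed for illustration;
# the paper instead selects them to minimize a VC-dimension upper bound.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import polynomial_kernel, rbf_kernel
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

def hybrid_kernel(X, Y, beta=(0.6, 0.4), degree=3, gamma=0.5):
    """Nonnegative combination of Mercer kernels (itself a Mercer kernel)."""
    b1, b2 = beta  # combination coefficients, constrained to be nonnegative
    return (b1 * polynomial_kernel(X, Y, degree=degree)
            + b2 * rbf_kernel(X, Y, gamma=gamma))

# Toy data to exercise the kernel end to end.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# scikit-learn accepts a callable kernel that returns the Gram matrix.
clf = SVC(kernel=hybrid_kernel).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```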

Original language: English
Pages (from-to): 385-395
Number of pages: 11
Journal: IEEE Transactions on Knowledge and Data Engineering
Volume: 16
Issue number: 4
DOIs
Publication status: Published - Apr 2004
Externally published: Yes

Keywords

  • Hybrid kernel function
  • Hyperplane
  • Structural risk minimization
  • Support vector machines
  • VC dimension
