Generalized alpha-beta divergences and their application to robust nonnegative matrix factorization

Andrzej Cichocki, Sergio Cruces, Shun-ichi Amari

Research output: Contribution to journal › Article › peer-review

138 Citations (Scopus)


We propose a class of multiplicative algorithms for Nonnegative Matrix Factorization (NMF) which are robust with respect to noise and outliers. To achieve this, we formulate a new family of generalized divergences, referred to as the Alpha-Beta-divergences (AB-divergences), which are parameterized by two tuning parameters, alpha and beta, and smoothly connect the fundamental Alpha-, Beta- and Gamma-divergences. By adjusting these tuning parameters, we show that a wide range of standard and new divergences can be obtained. The corresponding learning algorithms for NMF are shown to integrate and generalize many existing ones, including the Lee-Seung, ISRA (Image Space Reconstruction Algorithm), EMML (Expectation Maximization Maximum Likelihood), Alpha-NMF, and Beta-NMF algorithms. Owing to the additional degrees of freedom in tuning the parameters, the proposed family of AB-multiplicative NMF algorithms is shown to improve robustness with respect to noise and outliers. The analysis illuminates the links between the AB-divergence and other divergences, especially the Gamma- and Itakura-Saito divergences.
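As an illustration of the two-parameter family described above, the following is a minimal NumPy sketch of the AB-divergence in its generic case (alpha, beta, and alpha + beta all nonzero); the singular cases (e.g. the generalized Kullback-Leibler and Itakura-Saito limits) are obtained by continuity in the paper and are omitted here for brevity.

```python
import numpy as np

def ab_divergence(P, Q, alpha, beta):
    """Generic-case AB-divergence between nonnegative arrays P and Q.

    Assumes alpha != 0, beta != 0 and alpha + beta != 0; the remaining
    parameter combinations are defined as limits in the paper and are
    not handled by this sketch.
    """
    P = np.asarray(P, dtype=float)
    Q = np.asarray(Q, dtype=float)
    s = alpha + beta
    # Elementwise generic-case expression, summed over all entries.
    return -np.sum(
        P**alpha * Q**beta
        - (alpha / s) * P**s
        - (beta / s) * Q**s
    ) / (alpha * beta)
```

For alpha = beta = 1 this reduces to half the squared Euclidean distance, consistent with the standard NMF least-squares cost, and the divergence vanishes when P equals Q.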

Original language: English
Pages (from-to): 134-170
Number of pages: 37
Issue number: 1
Publication status: Published - Jan 2011
Externally published: Yes


  • Alpha-divergences
  • Beta-divergences
  • Gamma-divergences
  • Extended Itakura-Saito-like divergences
  • Generalized divergences
  • Generalized Kullback-Leibler divergence
  • Nonnegative matrix factorization (NMF)
  • Robust multiplicative NMF algorithms
  • Similarity measures


