Regularized alternating least squares algorithms for non-negative matrix/tensor factorization

Andrzej Cichocki, Rafal Zdunek

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

50 Citations (Scopus)

Abstract

Nonnegative Matrix and Tensor Factorization (NMF/NTF) and Sparse Component Analysis (SCA) have already found many potential applications, especially in multi-way Blind Source Separation (BSS), multi-dimensional data analysis, model reduction, and sparse signal/image representations. In this paper we propose a family of modified Regularized Alternating Least Squares (RALS) algorithms for NMF/NTF. By incorporating regularization and penalty terms into the weighted Frobenius norm, we are able to achieve sparse and/or smooth representations of the desired solution and to alleviate the problem of getting stuck in local minima. We implemented the RALS algorithms in our NMFLAB/NTFLAB Matlab Toolboxes and compared them with standard NMF algorithms. The proposed algorithms are characterized by improved efficiency and convergence properties, especially for large-scale problems.
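The core idea described in the abstract can be sketched in a few lines: alternately solve two ridge-regularized least-squares subproblems for the factors of Y ≈ AX and project each solution onto the nonnegative orthant. The following NumPy sketch is an illustration of this generic regularized-ALS scheme, not the authors' NMFLAB/NTFLAB implementation; the function name, the single Tikhonov penalty per factor, and all parameter values are assumptions for illustration.

```python
import numpy as np

def rals_nmf(Y, rank, n_iter=200, alpha=1e-3, beta=1e-3, eps=1e-9, seed=0):
    """Sketch of regularized ALS for NMF: find nonnegative A, X with Y ~ A @ X.

    alpha, beta are Tikhonov (ridge) penalties on A and X; the paper's
    framework also admits sparsity/smoothness penalties, omitted here.
    """
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    A = rng.random((m, rank))
    X = rng.random((rank, n))
    I = np.eye(rank)
    for _ in range(n_iter):
        # Regularized LS update for X: (A^T A + beta I) X = A^T Y,
        # then project onto the nonnegative orthant.
        X = np.maximum(np.linalg.solve(A.T @ A + beta * I, A.T @ Y), eps)
        # Symmetric update for A via the transposed problem.
        A = np.maximum(np.linalg.solve(X @ X.T + alpha * I, X @ Y.T).T, eps)
    return A, X
```

Each subproblem only requires solving a small rank × rank linear system, which is why ALS-type updates scale well to large matrices compared with elementwise multiplicative rules.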

Original language: English
Title of host publication: Advances in Neural Networks - ISNN 2007 - 4th International Symposium on Neural Networks, ISNN 2007, Proceedings
Publisher: Springer Verlag
Pages: 793-802
Number of pages: 10
Edition: PART 3
ISBN (Print): 9783540723943
DOIs
Publication status: Published - 2007
Externally published: Yes
Event: 4th International Symposium on Neural Networks, ISNN 2007 - Nanjing, China
Duration: 3 Jun 2007 - 7 Jun 2007

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 3
Volume: 4493 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 4th International Symposium on Neural Networks, ISNN 2007
Country/Territory: China
City: Nanjing
Period: 3/06/07 - 7/06/07
