Sparse super symmetric tensor factorization

Andrzej Cichocki, Marko Jankovic, Rafal Zdunek, Shun Ichi Amari

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review



In this paper we derive and discuss a wide class of algorithms for 3D Super-symmetric Nonnegative Tensor Factorization (SNTF), or nonnegative symmetric PARAFAC, and, as a special case, Symmetric Nonnegative Matrix Factorization (SNMF). These factorizations have many potential applications, including multi-way clustering, feature extraction, multi-sensory or multi-dimensional data analysis, and nonnegative neural sparse coding. The main advantage of the derived algorithms is their relatively low complexity and, in the case of the multiplicative algorithms, the possibility of a straightforward extension to the factorization of L-order tensors thanks to a convenient symmetry property. We also propose the use of a wide class of cost functions, such as the squared Euclidean distance, the Kullback-Leibler I-divergence, the Alpha-divergence, and the Beta-divergence. Preliminary experimental results confirm the validity and good performance of some of these algorithms, especially when the data have sparse representations.
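To make the SNMF special case concrete, the sketch below factorizes a nonnegative symmetric matrix as X ≈ AAᵀ using a damped multiplicative update for the squared-Euclidean cost. This is the standard symmetric-NMF heuristic (the damping factor 0.5 and all variable names here are illustrative assumptions), not necessarily the exact algorithm derived in the paper.

```python
import numpy as np

def snmf_multiplicative(X, rank, n_iter=1000, beta=0.5, eps=1e-9, seed=0):
    """Symmetric NMF: find nonnegative A with X ~= A @ A.T.

    Uses the damped multiplicative update
        A <- A * ((1 - beta) + beta * (X A) / (A A^T A)),
    a common heuristic for the squared-Euclidean cost. The damping
    beta = 0.5 is an assumption, not the paper's tuned value.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    A = rng.random((n, rank)) + eps          # nonnegative random init
    for _ in range(n_iter):
        XA = X @ A
        AAtA = A @ (A.T @ A)                 # gradient denominator term
        A *= (1.0 - beta) + beta * XA / (AAtA + eps)
    return A

# Usage: recover a synthetic rank-2 symmetric nonnegative matrix.
rng = np.random.default_rng(1)
B = rng.random((6, 2))
X = B @ B.T                                  # ground-truth X = B B^T
A = snmf_multiplicative(X, rank=2)
err = np.linalg.norm(X - A @ A.T) / np.linalg.norm(X)
```

Because the update multiplies A by a nonnegative factor, nonnegativity of the iterates is preserved automatically, which is the property that makes multiplicative schemes attractive for the higher-order symmetric case as well.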

Original language: English
Title of host publication: Neural Information Processing - 14th International Conference, ICONIP 2007, Revised Selected Papers
Number of pages: 10
Edition: Part 1
Publication status: Published - 2008
Externally published: Yes
Event: 14th International Conference on Neural Information Processing, ICONIP 2007 - Kitakyushu, Japan
Duration: 13 Nov 2007 - 16 Nov 2007

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: Part 1
Volume: 4984 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 14th International Conference on Neural Information Processing, ICONIP 2007


