Advances in PARAFAC using parallel block decomposition

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

4 Citations (Scopus)


Parallel factor analysis (PARAFAC) is a multi-way decomposition method that finds hidden factors in raw tensor data, with many potential applications in neuroscience, bioinformatics, chemometrics, etc. [1,2]. The Alternating Least Squares (ALS) algorithm can explain the raw tensor by a small number of rank-one tensors with high fitness. However, for large-scale data, the need to compute Khatri-Rao products of long factors and to multiply large matrices means that existing algorithms incur high computational cost and large memory requirements. Decomposition of large-scale tensors therefore remains a challenging problem for PARAFAC. In this paper, we propose a new algorithm, based on ALS, that computes Hadamard products and small matrices instead of Khatri-Rao products. The new algorithm can process extremely large-scale tensors with billions of entries in parallel. Extensive experiments confirm the validity and high performance of the developed algorithm in comparison with other well-known algorithms.
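The abstract does not give the authors' block-parallel algorithm in detail, but the Hadamard-product idea it alludes to can be illustrated with a minimal sketch of standard 3-way PARAFAC-ALS: each least-squares update inverts only a small R × R Hadamard product of Gram matrices, e.g. (BᵀB) ∗ (CᵀC), rather than the Gram matrix of the long Khatri-Rao product (C ⊙ B). The NumPy code below is a hedged illustration of that standard trick, not the paper's method; all function names are ours.

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker (Khatri-Rao) product of (m, R) and (n, R) -> (m*n, R)."""
    return (U[:, None, :] * V[None, :, :]).reshape(-1, U.shape[1])

def parafac_als(X, R, n_iter=100, seed=0):
    """Plain 3-way PARAFAC via Alternating Least Squares.

    Each update solves the normal equations with a small R x R matrix
    built from Hadamard products of Gram matrices, e.g. (B.T @ B) * (C.T @ C),
    instead of forming and inverting the Gram matrix of the long
    Khatri-Rao product -- the memory-saving idea the abstract refers to.
    """
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((I, R))
    B = rng.standard_normal((J, R))
    C = rng.standard_normal((K, R))
    # Mode-n unfoldings; column ordering matches khatri_rao's row ordering.
    X1 = X.reshape(I, J * K)
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)
    for _ in range(n_iter):
        A = X1 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X2 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X3 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C
```

For a tensor of exact rank R, the fit should be near-perfect; the Khatri-Rao products above are still formed explicitly for the MTTKRP step, which is precisely the remaining bottleneck the paper's block decomposition targets.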

Original language: English
Title of host publication: Neural Information Processing - 16th International Conference, ICONIP 2009, Proceedings
Number of pages: 8
Edition: PART 1
Publication status: Published - 2009
Externally published: Yes
Event: 16th International Conference on Neural Information Processing, ICONIP 2009 - Bangkok, Thailand
Duration: 1 Dec 2009 - 5 Dec 2009

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 1
Volume: 5863 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 16th International Conference on Neural Information Processing, ICONIP 2009

