Efficient Nonnegative Tucker Decompositions: Algorithms and Uniqueness

Guoxu Zhou, Andrzej Cichocki, Qibin Zhao, Shengli Xie

Research output: Contribution to journal › Article › peer-review

64 Citations (Scopus)


Nonnegative Tucker decomposition (NTD) is a powerful tool for extracting nonnegative, parts-based, and physically meaningful latent components from high-dimensional tensor data while preserving the natural multilinear structure of the data. However, because the data tensor often has many modes and is large scale, existing NTD algorithms suffer from very high computational complexity in terms of both storage and computation time, which has been a major obstacle to practical applications of NTD. To overcome these drawbacks, we show how low (multilinear) rank approximation (LRA) of tensors significantly simplifies the computation of the gradients of the cost function, and on this basis we develop a family of efficient first-order NTD algorithms. Besides dramatically reducing the storage complexity and running time, the new algorithms are flexible and robust to noise, because any well-established LRA approach can be applied. We also show how nonnegativity, combined with sparsity, substantially improves the uniqueness property and partially alleviates the curse of dimensionality of Tucker decompositions. Simulation results on synthetic and real-world data confirm the validity and high efficiency of the proposed NTD algorithms.
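The LRA-accelerated first-order algorithms are developed in the full paper; for orientation, the sketch below shows what a plain NTD looks like, computed with classical multiplicative updates in NumPy. This is a minimal illustration and not the paper's accelerated method; all function names are assumptions of this sketch.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def multi_mode_dot(T, mats, transpose=False):
    """Multiply tensor T by one matrix per mode (a sequence of n-mode products)."""
    for mode, M in enumerate(mats):
        Mm = M.T if transpose else M
        T = np.moveaxis(np.tensordot(Mm, T, axes=(1, mode)), 0, mode)
    return T

def tucker_reconstruct(G, factors):
    """Rebuild the full tensor from the core G and the factor matrices."""
    return multi_mode_dot(G, factors)

def ntd_mu(Y, ranks, n_iter=200, eps=1e-12, seed=0):
    """Nonnegative Tucker decomposition via multiplicative updates.

    Returns a nonnegative core G of shape `ranks` and one nonnegative
    factor matrix per mode. Updates keep all entries >= 0 by construction.
    """
    rng = np.random.default_rng(seed)
    N = Y.ndim
    factors = [rng.random((Y.shape[n], ranks[n])) for n in range(N)]
    G = rng.random(ranks)
    for _ in range(n_iter):
        for n in range(N):
            # Z = G multiplied by every factor except the n-th, unfolded at mode n,
            # so that Y_(n) is approximated by factors[n] @ Z.
            mats = [np.eye(ranks[n]) if k == n else factors[k] for k in range(N)]
            Z = unfold(multi_mode_dot(G, mats), n)
            num = unfold(Y, n) @ Z.T
            den = factors[n] @ (Z @ Z.T) + eps
            factors[n] *= num / den
        # Core update: the numerator projects Y onto the factors; the
        # denominator contracts G with the Gram matrices A_n^T A_n.
        num = multi_mode_dot(Y, factors, transpose=True)
        den = multi_mode_dot(G, [A.T @ A for A in factors]) + eps
        G *= num / den
    return G, factors
```

Fitting a planted rank-(3, 3, 2) nonnegative model with matching ranks drives the relative reconstruction error down quickly; the abstract's point is that for large tensors the unfoldings of Y make these gradient terms expensive, which is exactly what replacing Y by a low multilinear rank approximation avoids.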

Original language: English
Article number: 7265046
Pages (from-to): 4990-5003
Number of pages: 14
Journal: IEEE Transactions on Image Processing
Issue number: 12
Publication status: Published - Dec 2015
Externally published: Yes


  • Dimensionality reduction
  • Nonnegative alternating least squares
  • Tucker decompositions


