Fast nonnegative tensor factorization based on accelerated proximal gradient and low-rank approximation

Yu Zhang, Guoxu Zhou, Qibin Zhao, Andrzej Cichocki, Xingyu Wang

    Research output: Contribution to journal › Article › peer-review

    44 Citations (Scopus)

    Abstract

    Nonnegative tensor factorization (NTF) has been widely applied in the analysis of high-dimensional nonnegative tensor data. However, most existing algorithms converge slowly because of the nonnegativity constraint, which severely limits their practical application. In this study, we propose a new algorithm called FastNTF_APG that speeds up NTF by combining the accelerated proximal gradient method with low-rank approximation. Experimental results demonstrate that FastNTF_APG achieves significantly higher computational efficiency than state-of-the-art NTF algorithms.
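    The sketch below is not the authors' FastNTF_APG implementation; it only illustrates the accelerated proximal gradient (FISTA-style) idea named in the abstract, applied to the nonnegative least-squares subproblem that arises when one CP factor matrix is updated while the others are fixed. The function name `apg_nonneg_ls`, the iteration count, and the toy data are assumptions for illustration.

```python
# Illustrative sketch only (not the paper's FastNTF_APG): accelerated proximal
# gradient for min_{A >= 0} 0.5 * ||X - A @ B.T||_F^2, the per-factor
# nonnegative least-squares subproblem in a CP/PARAFAC update. The projection
# onto the nonnegative orthant serves as the proximal step.
import numpy as np

def apg_nonneg_ls(X, B, A0, n_iters=50):
    """X: (I, J) data matrix, B: (J, R) fixed factor, A0: (I, R) initial guess."""
    BtB = B.T @ B                       # Gram matrix of the fixed factor
    XB = X @ B                          # precomputed data term
    L = np.linalg.norm(BtB, 2)          # Lipschitz constant of the gradient
    A = A0.copy()
    Y = A0.copy()                       # extrapolated point
    t = 1.0
    for _ in range(n_iters):
        grad = Y @ BtB - XB             # gradient of the smooth term at Y
        A_new = np.maximum(Y - grad / L, 0.0)            # proximal (projection) step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t ** 2)) / 2.0
        Y = A_new + ((t - 1.0) / t_new) * (A_new - A)    # Nesterov extrapolation
        A, t = A_new, t_new
    return A

# Toy usage: recover a nonnegative factor from synthetic data.
rng = np.random.default_rng(0)
A_true = rng.random((30, 5))
B = rng.random((40, 5))
X = A_true @ B.T
A_est = apg_nonneg_ls(X, B, rng.random((30, 5)))
print("relative error:", np.linalg.norm(A_est - A_true) / np.linalg.norm(A_true))
```

    In an alternating scheme, a routine like this would be applied to each mode in turn; the low-rank approximation component of the paper, which compresses the tensor before the factor updates, is not reproduced here.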

    Original language: English
    Pages (from-to): 148-154
    Number of pages: 7
    Journal: Neurocomputing
    Volume: 198
    DOIs
    Publication status: Published - 19 Jul 2016

    Keywords

    • Accelerated proximal gradient
    • CP (PARAFAC) decompositions
    • Low-rank approximation
    • Nonnegative tensor factorization
