Fundamental tensor operations for large-scale data analysis using tensor network formats

Namgil Lee, Andrzej Cichocki

    Research output: Contribution to journal › Article › peer-review

    28 Citations (SciVal)


    We discuss extended definitions of linear and multilinear operations such as the Kronecker, Hadamard, and contracted products, and establish links between them for tensor calculus. We then introduce effective low-rank tensor approximation techniques, including the Candecomp/Parafac, Tucker, and tensor train (TT) decompositions, with a number of mathematical and graphical representations. We also provide a brief review of the mathematical properties of the TT decomposition as a low-rank approximation technique. With the aim of breaking the curse of dimensionality in large-scale numerical analysis, we describe basic operations on large-scale vectors, matrices, and high-order tensors represented by the TT decomposition. The proposed representations can be used to describe numerical methods based on the TT decomposition for solving large-scale optimization problems such as systems of linear equations and symmetric eigenvalue problems.
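    To make the TT idea mentioned in the abstract concrete, the following is a minimal sketch of the standard TT-SVD procedure, which compresses a d-way tensor into a chain of three-way cores via sequential truncated SVDs. The function names and the NumPy-based implementation are illustrative, not taken from the paper, and the sketch omits the adaptive rank/accuracy control a practical implementation would use.

    ```python
    import numpy as np

    def tt_svd(tensor, max_rank):
        """Decompose a d-way tensor into TT cores via sequential truncated SVDs.

        Illustrative sketch of the TT-SVD idea; ranks are simply capped at
        max_rank instead of being chosen from a target accuracy.
        """
        dims = tensor.shape
        d = len(dims)
        cores = []
        r_prev = 1
        # Unfold the tensor so the first mode (times the previous rank) is rows.
        mat = tensor.reshape(r_prev * dims[0], -1)
        for k in range(d - 1):
            U, S, Vt = np.linalg.svd(mat, full_matrices=False)
            r = min(max_rank, len(S))
            # k-th TT core: shape (r_{k-1}, n_k, r_k).
            cores.append(U[:, :r].reshape(r_prev, dims[k], r))
            # Carry the remainder forward and refold for the next mode.
            mat = (np.diag(S[:r]) @ Vt[:r]).reshape(r * dims[k + 1], -1)
            r_prev = r
        cores.append(mat.reshape(r_prev, dims[-1], 1))
        return cores

    def tt_to_full(cores):
        """Contract the TT cores back into a full tensor."""
        out = cores[0]  # shape (1, n_1, r_1)
        for core in cores[1:]:
            out = np.tensordot(out, core, axes=([-1], [0]))
        return out.squeeze(axis=(0, -1))
    ```

    With max_rank large enough that no singular values are discarded, the reconstruction is exact; truncating max_rank yields the low-rank approximation whose properties the paper reviews.
    
    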

    Original language: English
    Pages (from-to): 921-960
    Number of pages: 40
    Journal: Multidimensional Systems and Signal Processing
    Issue number: 3
    Publication status: Published - 1 Jul 2018


    • Big data
    • Contracted product
    • Generalized Tucker model
    • Matrix product operator
    • Matrix product state
    • Multilinear operator
    • Strong Kronecker product
    • Tensor calculus
    • Tensor networks
    • Tensor train


