Regularized computation of approximate pseudoinverse of large matrices using low-rank tensor train decompositions

Namgil Lee, Andrzej Cichocki

    Research output: Contribution to journal › Article › peer-review

    17 Citations (Scopus)

    Abstract

    We propose a new method for low-rank approximation of Moore-Penrose pseudoinverses of large-scale matrices using tensor networks. The computed pseudoinverses can be useful for solving or preconditioning large-scale overdetermined or underdetermined systems of linear equations. The computation is performed efficiently and stably based on the modified alternating least squares (ALS) scheme using low-rank tensor train (TT) decompositions and tensor network contractions. The formulated large-scale optimization problem is reduced to sequential smaller-scale problems to which any standard and stable algorithm can be applied. A regularization technique is incorporated in order to alleviate ill-posedness and obtain robust low-rank approximations. Numerical simulation results illustrate that the regularized pseudoinverses of a wide class of nonsquare or nonsymmetric matrices admit good approximate low-rank TT representations. Moreover, we demonstrate that the computational cost of the proposed method is only logarithmic in the matrix size, provided that the TT ranks of the data matrix and of its approximate pseudoinverse are bounded. It is illustrated that a strongly nonsymmetric convection-diffusion problem can be solved efficiently by using the preconditioners computed by the proposed method.
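
    The abstract describes the approach only at a high level. Below is a minimal NumPy/SciPy sketch of the two ingredients it highlights, a Tikhonov-regularized pseudoinverse and its use as a preconditioner for a strongly nonsymmetric linear system, worked on a small dense matrix. It is not the paper's TT-based algorithm: the dense SVD here stands in for the low-rank TT approximation, and the grid size n, convection coefficient c, and regularization parameter lam are illustrative assumptions.

    import numpy as np
    from scipy.sparse import diags
    from scipy.sparse.linalg import LinearOperator, gmres

    def regularized_pinv(A, lam=1e-10):
        # Tikhonov-regularized pseudoinverse via the SVD filter factors
        # sigma / (sigma**2 + lam); a dense stand-in for a low-rank
        # TT approximation of the pseudoinverse.
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        f = s / (s**2 + lam)          # damped inversion of the singular values
        return (Vt.T * f) @ U.T

    # Toy 1-D convection-diffusion operator -u'' + c*u' on a uniform grid;
    # the first-order term makes the matrix strongly nonsymmetric.
    n, c = 200, 50.0
    h = 1.0 / (n + 1)
    A = diags([-1/h**2 - c/(2*h), 2/h**2, -1/h**2 + c/(2*h)],
              [-1, 0, 1], shape=(n, n)).toarray()
    b = np.ones(n)

    # Use the regularized approximate inverse as a preconditioner for GMRES.
    P = regularized_pinv(A)
    M = LinearOperator((n, n), matvec=lambda x: P @ x)

    iters = {"plain": 0, "precond": 0}
    def make_counter(key):
        def cb(_):
            iters[key] += 1
        return cb

    x0, _ = gmres(A, b, maxiter=1000, callback=make_counter("plain"))
    x1, _ = gmres(A, b, M=M, maxiter=1000, callback=make_counter("precond"))
    print("GMRES iterations without / with preconditioner:",
          iters["plain"], "/", iters["precond"])
    print("residual norms:",
          np.linalg.norm(A @ x0 - b), np.linalg.norm(A @ x1 - b))

    With a preconditioner this close to the exact inverse, GMRES typically converges in one or two iterations; in the paper a comparable effect is obtained at logarithmic cost because both the matrix and its approximate pseudoinverse are kept in low-rank TT format.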

    Original language: English
    Pages (from-to): 598-623
    Number of pages: 26
    Journal: SIAM Journal on Matrix Analysis and Applications
    Volume: 37
    Issue number: 2
    Publication status: Published - 2016

    Keywords

    • Alternating least squares
    • Curse of dimensionality
    • Density matrix renormalization group
    • Low-rank tensor approximation
    • Matrix product operators
    • Matrix product states
    • Preconditioning
    • Solving of huge systems of linear equations
