Alternating least squares as moving subspace correction

Ivan V. Oseledets, Maxim V. Rakhuba, André Uschmajew

    Research output: Contribution to journal › Article › peer-review

    5 Citations (Scopus)

    Abstract

    In this note we take a new look at the local convergence of alternating optimization methods for low-rank matrices and tensors. Our abstract interpretation as sequential optimization on moving subspaces yields insightful reformulations of some known convergence conditions that focus on the interplay between the contractivity of classical multiplicative Schwarz methods with overlapping subspaces and the curvature of low-rank matrix and tensor manifolds. While the verification of the abstract conditions in concrete scenarios remains open in most cases, we are able to provide an alternative and conceptually simple derivation of the asymptotic convergence rate of the two-sided block power method of numerical algebra for computing the dominant singular subspaces of a rectangular matrix. This method is equivalent to an alternating least squares method applied to a distance function. The theoretical results are illustrated and validated by numerical experiments.
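
    The abstract refers to the two-sided block power method for computing dominant singular subspaces and its equivalence to an alternating least squares iteration. As a rough illustration of that iteration (a minimal NumPy sketch under our own naming, not the authors' code), one can alternate orthogonalized updates of the left and right subspace bases:

    import numpy as np

    def two_sided_block_power(A, k, iters=100, seed=0):
        # Hypothetical sketch: two-sided block power / ALS-style iteration for the
        # dominant rank-k left and right singular subspaces of A.
        rng = np.random.default_rng(seed)
        n = A.shape[1]
        V, _ = np.linalg.qr(rng.standard_normal((n, k)))  # random orthonormal start
        for _ in range(iters):
            U, _ = np.linalg.qr(A @ V)    # fix V, update left basis U
            V, _ = np.linalg.qr(A.T @ U)  # fix U, update right basis V
        return U, V

    if __name__ == "__main__":
        A = np.random.default_rng(1).standard_normal((50, 30))
        U, V = two_sided_block_power(A, k=3)
        # Leading singular values of the compressed matrix approach those of A.
        print(np.linalg.svd(U.T @ A @ V, compute_uv=False))
        print(np.linalg.svd(A, compute_uv=False)[:3])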

    Original language: English
    Pages (from-to): 3459-3479
    Number of pages: 21
    Journal: SIAM Journal on Numerical Analysis
    Volume: 56
    Issue number: 6
    DOI
    Publication status: Published - 2018

    Keywords

    • ALS
    • Local convergence
    • Low-rank approximation
    • Nonlinear Gauss-Seidel method
