Generalizing the column-row matrix decomposition to multi-way arrays

Cesar F. Caiafa, Andrzej Cichocki

Research output: Contribution to journal › Article › peer-review

55 Citations (Scopus)

Abstract

In this paper, we provide two generalizations of the CUR matrix decomposition Y = CUR (also known as the pseudo-skeleton approximation method [1]) to the case of N-way arrays (tensors). These generalizations, which we call Fiber Sampling Tensor Decompositions of types 1 and 2 (FSTD1 and FSTD2), provide explicit formulas for the parameters of a rank-(R, R, ..., R) Tucker representation (the core tensor of size R × R × ⋯ × R and the matrix factors of sizes I_n × R, n = 1, 2, ..., N) based only on some selected entries of the original tensor. FSTD1 uses P^(N-1) (P ≥ R) n-mode fibers of the original tensor, while FSTD2 uses exactly R fibers in each mode as matrix factors, as suggested by the existence theorem provided in Oseledets et al. (2008) [2], with a core tensor defined in terms of the entries of a subtensor of size R × R × ⋯ × R. For N = 2 our results reduce to the already known CUR matrix decomposition, where the core matrix is defined as the inverse of the intersection submatrix, i.e. U = W^(-1). Additionally, we provide an adaptive-type algorithm for the selection of proper fibers in the FSTD1 model, which is useful for large-scale applications. Several numerical results are presented showing the performance of our FSTD1 Adaptive Algorithm compared to two recently proposed approximation methods for 3-way tensors.
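To make the N = 2 base case concrete, the following is a minimal NumPy sketch (not code from the paper) of the pseudo-skeleton / CUR approximation Y ≈ C U R with core U = W^(-1), where W is the intersection of the selected columns and rows; the random rank-R test matrix and the index sets chosen here are illustrative assumptions, not the paper's fiber-selection strategy.

```python
import numpy as np

rng = np.random.default_rng(0)
R = 5

# Illustrative rank-R test matrix, so the skeleton approximation can be exact.
Y = rng.standard_normal((60, R)) @ rng.standard_normal((R, 40))

# Illustrative choice of R column and R row indices (in practice these would
# come from a sampling/selection procedure such as the adaptive one in FSTD1).
cols = np.arange(R)
rows = np.arange(R)

C = Y[:, cols]              # selected columns
R_mat = Y[rows, :]          # selected rows
W = Y[np.ix_(rows, cols)]   # intersection submatrix of the chosen rows/columns

U = np.linalg.inv(W)        # core matrix U = W^(-1); assumes W is nonsingular
Y_hat = C @ U @ R_mat       # CUR (pseudo-skeleton) reconstruction

# Relative error is ~0 here because Y has exact rank R.
print(np.linalg.norm(Y - Y_hat) / np.linalg.norm(Y))
```

For a matrix of exact rank R with well-chosen rows and columns this reconstruction is exact; FSTD1 and FSTD2 extend this idea to N-way arrays by replacing the selected columns and rows with selected n-mode fibers and the intersection submatrix with an intersection subtensor.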

Original language: English
Pages (from-to): 557-573
Number of pages: 17
Journal: Linear Algebra and Its Applications
Volume: 433
Issue number: 3
DOIs
Publication status: Published - 1 Sep 2010
Externally published: Yes

Keywords

  • Large-scale problems
  • Low-rank approximation
  • N-way array
  • Pseudo-skeleton (CUR) matrix decomposition
  • Tucker tensor decomposition
