## Abstract

We propose a new method for low-rank approximation of Moore-Penrose pseudoinverses of large-scale matrices using tensor networks. The computed pseudoinverses can be useful for solving or preconditioning large-scale overdetermined or underdetermined systems of linear equations. The computation is performed efficiently and stably by a modified alternating least squares scheme that uses low-rank tensor train (TT) decompositions and tensor network contractions. The formulated large-scale optimization problem is reduced to sequential smaller-scale problems to which any standard, stable algorithm can be applied. A regularization technique is incorporated to alleviate ill-posedness and obtain robust low-rank approximations. Numerical simulation results illustrate that the regularized pseudoinverses of a wide class of nonsquare or nonsymmetric matrices admit good approximate low-rank TT representations. Moreover, we demonstrate that the computational cost of the proposed method is only logarithmic in the matrix size, provided that the TT ranks of the data matrix and its approximate pseudoinverse are bounded. Finally, we illustrate that a strongly nonsymmetric convection-diffusion problem can be solved efficiently using preconditioners computed by the proposed method.
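The regularization idea mentioned in the abstract can be illustrated on a small dense matrix. This is a minimal NumPy sketch of a Tikhonov-regularized pseudoinverse, not the paper's TT-based algorithm; the function name `regularized_pinv` and the damping parameter `lam` are introduced here for illustration only:

```python
import numpy as np

def regularized_pinv(A, lam=1e-8):
    """Tikhonov-regularized pseudoinverse of a nonsquare matrix A:
    (A^T A + lam*I)^{-1} A^T, which tends to the Moore-Penrose
    pseudoinverse as lam -> 0. The damping lam alleviates
    ill-posedness for nearly rank-deficient A."""
    m, n = A.shape
    if m >= n:
        # Overdetermined case: solve with the n x n Gram matrix A^T A.
        return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T)
    else:
        # Underdetermined case: use the m x m Gram matrix A A^T instead.
        return A.T @ np.linalg.solve(A @ A.T + lam * np.eye(m), np.eye(m))

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))        # overdetermined example
P = regularized_pinv(A, lam=1e-10)
print(np.allclose(P, np.linalg.pinv(A), atol=1e-6))
```

For a well-conditioned matrix and small `lam`, the result matches `np.linalg.pinv`; the paper's contribution is computing such regularized pseudoinverses in low-rank TT format so the cost scales logarithmically in the matrix size.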

| Original language | English |
|---|---|
| Pages (from-to) | 598-623 |
| Number of pages | 26 |
| Journal | SIAM Journal on Matrix Analysis and Applications |
| Volume | 37 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - 2016 |

## Keywords

- Alternating least squares
- Curse of dimensionality
- Density matrix renormalization group
- Low-rank tensor approximation
- Matrix product operators
- Matrix product states
- Preconditioning
- Solving huge systems of linear equations