Tensor networks for dimensionality reduction, big data and deep learning

    Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

    11 Citations (Scopus)


    Large-scale multidimensional data are often available as multiway arrays or higher-order tensors, which can be approximately represented in distributed form via low-rank tensor decompositions and tensor networks. Our particular emphasis is on elucidating that, by virtue of the underlying low-rank approximations, tensor networks have the ability to reduce dimensionality and alleviate the curse of dimensionality in a number of applied areas, especially in large-scale optimization problems and deep learning. We briefly review and provide links between low-rank tensor network decompositions and deep neural networks. We elucidate, through graphical illustrations, that by virtue of low-rank tensor approximations and sophisticated contractions of core tensors, tensor networks have the ability to perform distributed computations on otherwise prohibitively large volumes of data/parameters. Our focus is on the Hierarchical Tucker and tensor train (TT) decompositions, and on MERA tensor networks in specific applications.
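    The dimensionality reduction described above can be illustrated with the tensor train (TT) decomposition: a d-way array is factored into a chain of small 3-way cores, so storage scales linearly in d rather than exponentially. The sketch below, assuming a simple sequential-SVD construction (TT-SVD style) with a fixed maximum rank, is illustrative only; the function names `tt_svd` and `tt_to_full` are hypothetical, not from the chapter.

    ```python
    import numpy as np

    def tt_svd(tensor, max_rank):
        """Sketch of TT-SVD: split a d-way array into a chain of 3-way cores.

        At each step the remainder is reshaped into a matrix, its SVD is
        truncated to `max_rank`, and the left factor becomes the next core
        of shape (r_prev, n_k, r_k).
        """
        dims = tensor.shape
        d = len(dims)
        cores = []
        remainder = tensor
        r_prev = 1
        for k in range(d - 1):
            mat = remainder.reshape(r_prev * dims[k], -1)
            u, s, vt = np.linalg.svd(mat, full_matrices=False)
            r = min(max_rank, s.size)
            cores.append(u[:, :r].reshape(r_prev, dims[k], r))
            remainder = s[:r, None] * vt[:r]   # carry the rest forward
            r_prev = r
        cores.append(remainder.reshape(r_prev, dims[-1], 1))
        return cores

    def tt_to_full(cores):
        """Contract the TT cores back into the full d-way array."""
        full = cores[0]
        for core in cores[1:]:
            full = np.tensordot(full, core, axes=([-1], [0]))
        return full[0, ..., 0]   # drop the boundary ranks of size 1
    ```

    For a tensor with exact low TT rank (e.g. a rank-1 outer product of vectors), the truncated cores reconstruct the original array exactly, while storing far fewer parameters than the full array when d is large.
    
    
    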

    Original language: English
    Title of host publication: Studies in Computational Intelligence
    Publisher: Springer Verlag
    Number of pages: 47
    Publication status: Published - 2018

    Publication series

    Name: Studies in Computational Intelligence
    ISSN (Print): 1860-949X

