Bucket renormalization for approximate inference

Sungsoo Ahn, Michael Chertkov, Adrian Weller, Jinwoo Shin

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    1 Citation (Scopus)

    Abstract

    Probabilistic graphical models are a key tool in machine learning applications. Computing the partition function, i.e., the normalizing constant, is a fundamental task of statistical inference but it is generally computationally intractable, leading to extensive study of approximation methods. Iterative variational methods are a popular and successful family of approaches. However, even state-of-the-art variational methods can return poor results or fail to converge on difficult instances. In this paper, we instead consider computing the partition function via sequential summation over variables. We develop robust approximate algorithms by combining ideas from mini-bucket elimination with tensor network and renormalization group methods from statistical physics. The resulting "convergence-free" methods show good empirical performance on both synthetic and real-world benchmark models, even for difficult instances.
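
    For context, the sketch below illustrates the "sequential summation over variables" baseline the abstract refers to: exact bucket (variable) elimination for the partition function of a small toy model. It is not the paper's bucket renormalization algorithm, and the chain model, factor tables, and helper names are illustrative assumptions.

    # A minimal sketch of sequential summation over variables (exact bucket /
    # variable elimination) on a toy pairwise model. This is the baseline idea,
    # NOT the paper's bucket renormalization; the chain model, factor tables,
    # and helper names are illustrative assumptions.
    import numpy as np
    from itertools import product

    rng = np.random.default_rng(0)
    edges = [(0, 1), (1, 2), (2, 3)]  # binary chain x0 - x1 - x2 - x3
    psi = {e: rng.uniform(0.5, 2.0, size=(2, 2)) for e in edges}

    # (1) Brute force: Z = sum over all 2^4 assignments of the product of factors.
    Z_brute = sum(
        np.prod([psi[(i, j)][x[i], x[j]] for (i, j) in edges])
        for x in product([0, 1], repeat=4)
    )

    # (2) Bucket elimination: for each variable, multiply the factors that contain
    # it (its "bucket") and sum the variable out, producing a new smaller factor.
    def multiply(f1, f2):
        (v1, t1), (v2, t2) = f1, f2
        out = tuple(sorted(set(v1) | set(v2)))
        sub = lambda vs: ''.join(chr(ord('a') + v) for v in vs)
        return out, np.einsum(f"{sub(v1)},{sub(v2)}->{sub(out)}", t1, t2)

    def sum_out(factor, var):
        vs, t = factor
        return tuple(v for v in vs if v != var), t.sum(axis=vs.index(var))

    factors = [(e, psi[e]) for e in edges]
    for var in range(4):
        bucket = [f for f in factors if var in f[0]]
        factors = [f for f in factors if var not in f[0]]
        if bucket:
            merged = bucket[0]
            for f in bucket[1:]:
                merged = multiply(merged, f)
            factors.append(sum_out(merged, var))

    # Only scalar (variable-free) factors remain; their product is Z.
    Z_elim = np.prod([t for _, t in factors])
    assert np.isclose(Z_brute, Z_elim)

    On trees this elimination is cheap, but on loopy graphs the merged bucket factors can grow exponentially; as the abstract indicates, the paper's methods borrow low-rank, renormalization-group-style ideas from tensor networks to keep such bucket tensors tractable.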

    Original language: English
    Title of host publication: 35th International Conference on Machine Learning, ICML 2018
    Editors: Andreas Krause, Jennifer Dy
    Publisher: International Machine Learning Society (IMLS)
    Pages: 183-193
    Number of pages: 11
    ISBN (Electronic): 9781510867963
    Publication status: Published - 2018
    Event: 35th International Conference on Machine Learning, ICML 2018 - Stockholm, Sweden
    Duration: 10 Jul 2018 – 15 Jul 2018

    Publication series

    Name: 35th International Conference on Machine Learning, ICML 2018
    Volume: 1

    Conference

    Conference: 35th International Conference on Machine Learning, ICML 2018
    Country/Territory: Sweden
    City: Stockholm
    Period: 10/07/18 – 15/07/18
