Gauging variational inference

Sungsoo Ahn, Michael Chertkov, Jinwoo Shin

    Research output: Contribution to journal › Conference article › peer-review

    Abstract

    Computing the partition function is the most important statistical inference task arising in applications of Graphical Models (GMs). Since it is computationally intractable, approximate methods have been used in practice, where mean-field (MF) and belief propagation (BP) are arguably the most popular and successful approaches of a variational type. In this paper, we propose two new variational schemes, coined Gauged-MF (G-MF) and Gauged-BP (G-BP), improving MF and BP, respectively. Both provide lower bounds for the partition function by utilizing the so-called gauge transformation, which modifies the factors of a GM while keeping the partition function invariant. Moreover, we prove that both G-MF and G-BP are exact for GMs with a single loop of a special structure, even though the bare MF and BP perform badly in this case. Our extensive experiments confirm that the proposed algorithms outperform and generalize MF and BP.
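    The sketch below is illustrative only and is not the paper's G-MF/G-BP: it assumes a hypothetical tiny pairwise binary model on a single loop (the topology for which the paper proves exactness) and compares the exact, brute-force partition function with a plain mean-field lower bound obtained from the Gibbs inequality. The paper's gauged schemes improve on this kind of baseline bound by additionally optimizing over partition-function-preserving gauge transformations of the factors.

    ```python
    import itertools
    import numpy as np

    # Hypothetical example: a tiny pairwise binary GM on a single loop,
    #   p(x) = (1/Z) * exp( sum_{(i,j)} J_ij x_i x_j + sum_i h_i x_i ),  x_i in {-1, +1}.
    # This is NOT the paper's Forney-style formulation; it only illustrates
    # "partition function" and "mean-field lower bound".
    rng = np.random.default_rng(0)
    n = 6
    edges = [(i, (i + 1) % n) for i in range(n)]
    J = {e: rng.normal(scale=0.5) for e in edges}
    h = rng.normal(scale=0.3, size=n)

    def log_weight(x):
        """Unnormalized log-probability of a configuration x in {-1,+1}^n."""
        return sum(J[(i, j)] * x[i] * x[j] for (i, j) in edges) + float(h @ np.array(x))

    # Exact partition function by brute-force enumeration (feasible only for tiny n).
    logZ = np.log(sum(np.exp(log_weight(x))
                      for x in itertools.product([-1, 1], repeat=n)))

    # Plain (ungauged) mean-field lower bound:  log Z >= E_q[log weight] + H(q)
    # for a fully factorized q.  Coordinate-ascent fixed-point updates on m_i = E_q[x_i].
    m = np.zeros(n)
    for _ in range(200):
        for i in range(n):
            field = h[i] + sum(J[e] * m[j] for e in edges if i in e
                               for j in e if j != i)
            m[i] = np.tanh(field)

    p_plus = (m + 1) / 2                      # q_i(x_i = +1)
    entropy = -np.sum(p_plus * np.log(p_plus + 1e-12)
                      + (1 - p_plus) * np.log(1 - p_plus + 1e-12))
    exp_log_weight = sum(J[(i, j)] * m[i] * m[j] for (i, j) in edges) + float(h @ m)
    mf_bound = exp_log_weight + entropy

    print(f"log Z (exact, brute force) = {logZ:.4f}")
    print(f"mean-field lower bound     = {mf_bound:.4f}")   # always <= log Z
    ```

    The gap between the two printed numbers is what G-MF and G-BP are designed to shrink while preserving the lower-bound guarantee.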

    Original language: English
    Pages (from-to): 2882-2891
    Number of pages: 10
    Journal: Advances in Neural Information Processing Systems
    Volume: 2017-December
    Publication status: Published - 2017
    Event: 31st Annual Conference on Neural Information Processing Systems, NIPS 2017 - Long Beach, United States
    Duration: 4 Dec 2017 - 9 Dec 2017
