Gauging variational inference

Sungsoo Ahn, Michael Chertkov, Jinwoo Shin

Research output: Contribution to journal › Article › peer-review

Abstract

Computing the partition function is one of the most important statistical inference tasks arising in applications of graphical models (GMs). Since it is computationally intractable, approximate methods are used in practice, among which mean-field (MF) and belief propagation (BP) are arguably the most popular and successful approaches of a variational type. In this paper, we propose two new variational schemes, coined Gauged-MF (G-MF) and Gauged-BP (G-BP), improving MF and BP, respectively. Both provide lower bounds for the partition function by utilizing the so-called gauge transformation, which modifies the factors of a GM while keeping the partition function invariant. Moreover, we prove that both G-MF and G-BP are exact for GMs consisting of a single loop of a special structure, even though bare MF and BP perform badly in this case. Our extensive experiments confirm that the proposed algorithms outperform and generalize MF and BP.
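The invariance property underlying both schemes can be illustrated on a toy model. The sketch below (a hypothetical example, not code from the paper) builds a single-loop, Forney-style GM with three factors and a binary variable on each edge, then applies a gauge transformation on one edge: acting with an invertible matrix G on one factor's edge index and with inv(G)^T on the neighboring factor's index inserts the identity into the contraction, so the partition function Z is unchanged.

```python
from itertools import product

# Hypothetical toy model: three factors a, b, c on a single loop, with a
# binary variable on each edge e1=(a,b), e2=(b,c), e3=(c,a).
A = [[0.9, 0.2], [0.4, 0.7]]   # f_a(x_e1, x_e3)
B = [[0.3, 0.8], [0.6, 0.1]]   # f_b(x_e1, x_e2)
C = [[0.5, 0.5], [0.2, 0.9]]   # f_c(x_e2, x_e3)

def partition_function(A, B, C):
    # Z = sum over all edge variables of the product of factors.
    return sum(A[x1][x3] * B[x1][x2] * C[x2][x3]
               for x1, x2, x3 in product((0, 1), repeat=3))

def matmul(M, N):
    # 2x2 matrix product.
    return [[sum(M[i][k] * N[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv_transpose(M):
    # Inverse transpose of a 2x2 matrix.
    (a, b), (c, d) = M
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    return [[inv[j][i] for j in range(2)] for i in range(2)]

Z = partition_function(A, B, C)

# Gauge transformation on edge e1: apply G along f_a's e1 index and
# inv(G)^T along f_b's e1 index. Summing over e1 then inserts
# G^T @ inv(G)^T = I, leaving Z invariant.
G = [[1.0, 0.5], [0.3, 2.0]]   # any invertible matrix works
A_gauged = matmul(G, A)
B_gauged = matmul(inv_transpose(G), B)

Z_gauged = partition_function(A_gauged, B_gauged, C)
assert abs(Z - Z_gauged) < 1e-9  # partition function is unchanged
```

G-MF and G-BP search over such gauge choices to tighten the variational lower bound; the sketch only verifies the invariance that makes that search well-defined.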

Original language: English
Article number: 124015
Journal: Journal of Statistical Mechanics: Theory and Experiment
Volume: 2019
Issue number: 12
DOIs
Publication status: Published - 20 Dec 2019
Externally published: Yes

Keywords

  • machine learning

