Block Belief Propagation for Parameter Learning in Markov Random Fields

Authors

  • You Lu Virginia Polytechnic Institute and State University
  • Zhiyuan Liu University of Colorado Boulder
  • Bert Huang Virginia Polytechnic Institute and State University

DOI:

https://doi.org/10.1609/aaai.v33i01.33014448

Abstract

Traditional learning methods for training Markov random fields require performing inference over all variables to compute the likelihood gradient, so their iteration complexity scales with the size of the graphical model. In this paper, we propose block belief propagation learning (BBPL), which uses block-coordinate updates of approximate marginals to compute approximate gradients, removing the need to run inference over the entire graphical model. Thus, the iteration complexity of BBPL does not scale with the size of the graphs. We prove that the method converges to the same solution as that obtained by using full inference per iteration, despite these approximations, and we empirically demonstrate its scalability improvements over standard training methods.
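To make the block-coordinate idea in the abstract concrete, the sketch below is a minimal, illustrative reconstruction rather than the authors' implementation: on a toy pairwise binary chain MRF with one "agreement" weight per edge, each training iteration refreshes belief propagation messages on only one block of edges, reuses the stale messages elsewhere, and takes an approximate likelihood-gradient step. The chain model, feature choice, two-block schedule, and all names (`update_messages_on_block`, `pairwise_marginal`, etc.) are assumptions made for illustration.

```python
# Toy sketch of block-coordinate BP training (not the authors' code).
import numpy as np

rng = np.random.default_rng(0)
n = 8                                   # chain of n binary variables
edges = [(i, i + 1) for i in range(n - 1)]
theta = np.zeros(len(edges))            # one "agreement" weight per edge

# Synthetic fully observed data; empirical agreement frequency per edge.
data = rng.integers(0, 2, size=(200, n))
emp = np.array([np.mean(data[:, i] == data[:, j]) for i, j in edges])

# Messages m[(s, t)][x_t]: message from variable s to neighbor t.
msgs = {(i, j): np.ones(2) / 2 for i, j in edges}
msgs.update({(j, i): np.ones(2) / 2 for i, j in edges})

def edge_potential(w):
    # psi(x_i, x_j) = exp(w * 1{x_i == x_j}); symmetric 2x2 table.
    return np.exp(w * np.eye(2))

def incoming_product(s, exclude):
    # Product of messages into s from all neighbors except `exclude`.
    prod = np.ones(2)
    for u in (s - 1, s + 1):
        if 0 <= u < n and u != exclude:
            prod *= msgs[(u, s)]
    return prod

def update_messages_on_block(block):
    # Block-coordinate BP: only messages on edges in `block` are refreshed;
    # messages on all other edges stay stale.
    for k in block:
        i, j = edges[k]
        psi = edge_potential(theta[k])
        for s, t in [(i, j), (j, i)]:
            m = psi @ incoming_product(s, exclude=t)   # psi is symmetric
            msgs[(s, t)] = m / m.sum()

def pairwise_marginal(k):
    # Approximate pairwise belief from the current (possibly stale) messages.
    i, j = edges[k]
    b = edge_potential(theta[k]) \
        * np.outer(incoming_product(i, exclude=j), incoming_product(j, exclude=i))
    return b / b.sum()

# Training loop: each iteration touches only one block of edges.
half = len(edges) // 2
blocks = [list(range(half)), list(range(half, len(edges)))]
lr = 0.5
for it in range(200):
    update_messages_on_block(blocks[it % len(blocks)])
    # Approximate gradient: empirical minus model agreement probabilities.
    model = np.array([np.trace(pairwise_marginal(k)) for k in range(len(edges))])
    theta += lr * (emp - model)

print("learned edge weights:", np.round(theta, 2))
```

The key design point this illustrates is that the expensive step per iteration, message passing, touches only a block of the graph, while the gradient is still formed from beliefs over all edges (some of them stale); the paper's contribution is showing that such updates still converge to the same solution as full inference per iteration.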

Published

2019-07-17

How to Cite

Lu, Y., Liu, Z., & Huang, B. (2019). Block Belief Propagation for Parameter Learning in Markov Random Fields. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 4448-4455. https://doi.org/10.1609/aaai.v33i01.33014448

Section

AAAI Technical Track: Machine Learning