Machine Learning — Graphical Model Exact Inference (Variable Elimination, Belief Propagation, Junction Tree)

HMM
Replacing the joint probability with a product of Bayesian network factors

Variable elimination

  1. Multiply all factors Φ containing a variable, say b.
  2. Marginalize b out to obtain a new intermediate factor τ.
  3. Replace the factors Φ with the new factor τ.
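The three steps above can be sketched in a few lines of Python. The factor encoding here (a tuple of variable names plus a value table over binary variables) is a toy representation of my own, not from the article:

```python
import itertools

# A factor is (vars, table): a tuple of variable names and a dict mapping
# value tuples to nonnegative reals. All variables are binary for simplicity.

def multiply(f1, f2):
    """Pointwise product of two factors over the union of their variables."""
    v1, t1 = f1
    v2, t2 = f2
    vars_out = tuple(dict.fromkeys(v1 + v2))  # order-preserving union
    table = {}
    for assign in itertools.product([0, 1], repeat=len(vars_out)):
        a = dict(zip(vars_out, assign))
        table[assign] = t1[tuple(a[v] for v in v1)] * t2[tuple(a[v] for v in v2)]
    return vars_out, table

def marginalize(f, var):
    """Sum a variable out of a factor, producing the intermediate factor tau."""
    vs, t = f
    i = vs.index(var)
    table = {}
    for assign, val in t.items():
        key = assign[:i] + assign[i + 1:]
        table[key] = table.get(key, 0.0) + val
    return vs[:i] + vs[i + 1:], table

def eliminate(factors, var):
    """One VE step: multiply all factors mentioning `var`, sum it out,
    and replace those factors with the new factor tau."""
    touching = [f for f in factors if var in f[0]]
    rest = [f for f in factors if var not in f[0]]
    prod = touching[0]
    for f in touching[1:]:
        prod = multiply(prod, f)
    return rest + [marginalize(prod, var)]
```

For example, with P(a, b) = P(a) P(b | a), eliminating a leaves a single factor over b that is exactly the marginal P(b).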
Sum-product inference
  • Min-neighbors: choose the variable with the fewest dependent variables, i.e. the vertex with the fewest neighbors in the current graph.
  • Min-weight: choose the variable that yields the smallest merged factor.
  • Min-fill: minimize the number of fill edges (discussed later) that need to be added to the graph when eliminating a variable.
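The greedy min-neighbors heuristic above can be sketched as follows; the adjacency-dict encoding and the alphabetical tie-break are choices of mine:

```python
# Greedy min-neighbors elimination ordering on an undirected (moralized) graph.
# `graph` is a dict mapping each vertex to the set of its neighbors.

def min_neighbors_order(graph):
    g = {v: set(ns) for v, ns in graph.items()}  # working copy
    order = []
    while g:
        # pick the vertex with the fewest neighbors; break ties by name
        v = min(g, key=lambda x: (len(g[x]), x))
        order.append(v)
        # eliminating v connects its remaining neighbors (fill edges)
        for a in g[v]:
            for b in g[v]:
                if a != b:
                    g[a].add(b)
        for n in g[v]:
            g[n].discard(v)
        del g[v]
    return order
```

Swapping the key function for one that counts fill edges would give the min-fill heuristic instead.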

Message-passing (belief propagation)
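A minimal sum-product implementation on a chain can make the message updates concrete. Binary variables and symmetric pairwise potentials are simplifying assumptions of mine, as are the function names:

```python
# Sum-product message passing on a chain x_0 - x_1 - ... - x_{n-1} of binary
# variables. phi[i] is the unary potential of node i; psi[i] is the pairwise
# potential on edge (i, i+1), assumed symmetric so it can be reused backwards.

def send(phi_i, psi, incoming):
    """Message from node i to a neighbor: sum out x_i."""
    return [sum(phi_i[xi] * incoming[xi] * psi[xi][xj] for xi in (0, 1))
            for xj in (0, 1)]

def chain_marginals(phi, psi):
    n = len(phi)
    fwd = [[1.0, 1.0]]                 # message into node 0 from the left
    for i in range(n - 1):
        fwd.append(send(phi[i], psi[i], fwd[-1]))
    bwd = [[1.0, 1.0]]                 # message into node n-1 from the right
    for i in range(n - 1, 0, -1):
        bwd.append(send(phi[i], psi[i - 1], bwd[-1]))
    bwd.reverse()
    # belief at node i: unary potential times both incoming messages
    marginals = []
    for i in range(n):
        b = [phi[i][x] * fwd[i][x] * bwd[i][x] for x in (0, 1)]
        z = sum(b)
        marginals.append([v / z for v in b])
    return marginals
```

On a tree the same idea applies with one message per directed edge; two sweeps suffice to obtain every node's marginal.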

Junction tree

  • There exists exactly one path between each pair of clusters,
  • Every clique in G must be contained in some cluster, and
  • For any pair of clusters containing node i, every cluster on the path between these two clusters must also contain node i (the running intersection property).
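The third condition is easy to check mechanically. A sketch, where the dict/edge-list encoding and the cluster names are my own:

```python
from collections import deque

def running_intersection(clusters, edges):
    """clusters: dict name -> set of variables; edges: (name, name) pairs of the
    cluster tree. True iff every variable's clusters form a connected subtree."""
    adj = {c: set() for c in clusters}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    for v in set().union(*clusters.values()):
        holding = {c for c in clusters if v in clusters[c]}
        # BFS restricted to clusters that contain v
        start = next(iter(holding))
        seen, queue = {start}, deque([start])
        while queue:
            c = queue.popleft()
            for n in adj[c]:
                if n in holding and n not in seen:
                    seen.add(n)
                    queue.append(n)
        if seen != holding:
            return False
    return True
```

For the chain of clusters {a,b} — {b,c} — {c,d} the property holds; moving b into the last cluster instead would break it, because the middle cluster no longer carries b along the path.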
Ising model
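As a concrete toy instance, here is the unnormalized Ising distribution with brute-force exact marginals for comparison; the potentials h and J and the edge list are illustrative values of my own, not from the article:

```python
import itertools
import math

# Toy Ising model: p(x) ∝ exp(Σ_i h_i·x_i + Σ_(i,j) J·x_i·x_j), x_i ∈ {-1, +1}.

def unnormalized_prob(x, h, J, edges):
    score = sum(h[i] * x[i] for i in range(len(x)))
    score += sum(J * x[i] * x[j] for i, j in edges)
    return math.exp(score)

def exact_marginal(i, h, J, edges, n):
    """P(x_i = +1) by brute-force enumeration (exponential in n, so tiny n only).
    Variable elimination computes the same quantity without full enumeration."""
    z = zi = 0.0
    for x in itertools.product([-1, 1], repeat=n):
        p = unnormalized_prob(x, h, J, edges)
        z += p
        if x[i] == 1:
            zi += p
    return zi / z
```

With a zero field and zero coupling the marginal is exactly 0.5; a positive field on a neighbor plus a positive (attractive) coupling pushes it above 0.5.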

Maximum a posteriori — MAP inference

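MAP inference swaps the sum in sum-product for a max: max-product on a chain is the Viterbi algorithm. A minimal sketch with toy binary potentials (the encoding mirrors the sum-product example and is an assumption of mine):

```python
# Max-product (Viterbi) on a binary chain: finds the single most likely joint
# assignment rather than per-node marginals. phi[i] is the unary potential of
# node i; psi[i] is the pairwise potential on edge (i, i+1).

def map_chain(phi, psi):
    n = len(phi)
    msg = [list(phi[0])]   # msg[i][x] = best score of any prefix ending x_i = x
    back = []              # back[i][x] = argmax over x_i given x_{i+1} = x
    for i in range(1, n):
        cur, arg = [], []
        for xj in (0, 1):
            scores = [msg[-1][xi] * psi[i - 1][xi][xj] for xi in (0, 1)]
            best = 0 if scores[0] >= scores[1] else 1
            cur.append(phi[i][xj] * scores[best])
            arg.append(best)
        msg.append(cur)
        back.append(arg)
    # backtrack from the best final state
    x = [0] * n
    x[-1] = 0 if msg[-1][0] >= msg[-1][1] else 1
    for i in range(n - 2, -1, -1):
        x[i] = back[i][x[i + 1]]
    return x
```

Replacing the max and the backpointers with a sum recovers the sum-product recursion, which is why the two are usually presented together.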

Exact solutions & Approximation methods

  • Variable elimination algorithm
  • Message-passing algorithm (belief propagation — sum-product inference for marginal distribution or max-product inference for MAP)
  • The junction tree algorithm
  • Loopy belief propagation
  • Sampling methods
  • Variational inference
