Machine Learning — Graphical Model

Consider a simple model of three binary variables:

  1. whether the sprinkler is automatically on or off,
  2. whether it is raining, and
  3. whether the grass is wet.
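As a minimal sketch of this sprinkler–rain–grass model, the joint probability factorizes along the network's edges as P(R)·P(S|R)·P(W|S, R). The numbers in the conditional probability tables below are assumptions for illustration, not values from the article:

```python
from itertools import product

# Illustrative CPTs (the numbers are assumptions, not from the article):
# P(Rain), P(Sprinkler | Rain), P(WetGrass | Sprinkler, Rain).
P_R = {True: 0.2, False: 0.8}
P_S_given_R = {True: {True: 0.01, False: 0.99},   # sprinkler rarely on in rain
               False: {True: 0.4, False: 0.6}}
P_W_given_SR = {(True, True): 0.99, (True, False): 0.9,
                (False, True): 0.8, (False, False): 0.0}

def joint(r, s, w):
    """P(R=r, S=s, W=w) factorized along the network's edges."""
    p_w = P_W_given_SR[(s, r)]
    return P_R[r] * P_S_given_R[r][s] * (p_w if w else 1 - p_w)

# The joint distribution sums to 1 over all 2^3 assignments.
total = sum(joint(r, s, w) for r, s, w in product([True, False], repeat=3))
print(round(total, 10))  # → 1.0
```

Because each factor is a proper conditional distribution, the product automatically normalizes over the eight joint assignments.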
Given such a model, we can answer several types of queries:

  • The likelihood of the observations, P(E) — the chance of the lab results (evidence E).
  • The marginal probability, P(X₁), P(X₁, X₃), etc. — the chance of having Alzheimer's when you are 70.
  • The conditional probability (a posterior belief based on evidence), P(Y|E) — the chance of having Alzheimer's given that your parent has it.
  • Maximum a posteriori (MAP), arg max P(X, E) — the most likely disease given the lab results.
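All four query types can be read off a joint distribution directly. Here is a hedged sketch on a toy joint over a binary Disease variable D and a binary lab-result variable E (the numbers are made up for illustration):

```python
from itertools import product

# A toy joint P(D, E) over Disease and Evidence (lab result), both binary.
# The probabilities are illustrative assumptions.
p = {(d, e): 0.0 for d, e in product([0, 1], repeat=2)}
p[(1, 1)] = 0.08   # disease, positive test
p[(1, 0)] = 0.02   # disease, negative test
p[(0, 1)] = 0.09   # false positive
p[(0, 0)] = 0.81   # healthy, negative test

# Likelihood of the observation: P(E=1), summing out D.
p_e1 = sum(p[(d, 1)] for d in (0, 1))          # 0.17

# Marginal: P(D=1), summing out E.
p_d1 = sum(p[(1, e)] for e in (0, 1))          # 0.10

# Conditional (posterior): P(D=1 | E=1) = P(D=1, E=1) / P(E=1).
posterior = p[(1, 1)] / p_e1                   # ≈ 0.47

# MAP: the most likely disease state given E=1.
map_d = max((0, 1), key=lambda d: p[(d, 1)])   # 0 here (0.09 > 0.08)
```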

Joint probability

  • The marginal probability p(X), obtained by summing the joint over the other variables.
  • The conditional probability, solved with Bayes' Theorem and marginal probabilities.
  • MAP inference (maximum a posteriori) — for example, using Bayes' Theorem to find the most likely values for a set of variables given the observations.
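The three operations above can be sketched on an arbitrary joint table stored as an array; the distribution here is randomly generated purely to have something to sum over:

```python
import numpy as np

# A joint distribution over three binary variables X1, X2, X3 stored as a
# 2x2x2 array (the values are illustrative, not from the article).
rng = np.random.default_rng(0)
joint = rng.random((2, 2, 2))
joint /= joint.sum()                      # normalize into a distribution

# Marginal p(X1): sum out X2 and X3.
p_x1 = joint.sum(axis=(1, 2))

# Conditional p(X1 | X3=1) via Bayes: p(X1, X3=1) / p(X3=1).
p_x1_x3 = joint.sum(axis=1)               # p(X1, X3), summing out X2
p_x1_given_x3 = p_x1_x3[:, 1] / p_x1_x3[:, 1].sum()

# MAP over (X1, X2) given X3=1: the most likely joint assignment.
map_idx = np.unravel_index(joint[:, :, 1].argmax(), (2, 2))
```

Every query reduces to sums and ratios over the joint table — which is exactly why storing the full joint becomes infeasible as the number of variables grows, and why graphical models factorize it instead.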

Bayesian network (BN)


Independence

  • Finding a perfect map is not always possible in a BN.
  • A perfect map may not be unique.
  • Two graphs have the same skeleton if they are identical once we ignore the direction of the edges.
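The skeleton check can be sketched by comparing edge sets with direction stripped. The classic example: the three chains over A, B, C are different DAGs but share the skeleton A – B – C (and, as chains/forks, encode the same independences):

```python
# A DAG as a set of directed edges (parent, child). Two graphs share a
# skeleton if their edge sets match once direction is ignored.
def skeleton(edges):
    return {frozenset(e) for e in edges}

g1 = {("A", "B"), ("B", "C")}   # A → B → C
g2 = {("B", "A"), ("B", "C")}   # A ← B → C
g3 = {("B", "A"), ("C", "B")}   # A ← B ← C

print(skeleton(g1) == skeleton(g2) == skeleton(g3))  # → True
```

Note the converse caution: the collider A → B ← C has this same skeleton yet encodes different independences, which is why skeleton equality alone does not make two BNs equivalent.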

BN examples


Markov random fields

Restricted Boltzmann Machine (RBM).
  • Computing the partition function Z is NP-hard in general, so approximations are usually required.
  • It is hard to use as a generative model: we cannot easily generate data/samples from it.
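To see why Z is the bottleneck, here is a brute-force sketch for a toy RBM energy E(v, h) = −vᵀWh − bᵀv − cᵀh with 3 visible and 2 hidden binary units (the weights are assumptions for illustration). Z must sum exp(−E) over every joint state — 2³·2² terms here, but exponentially many in realistic models:

```python
from itertools import product
import math

# Toy RBM parameters (illustrative assumptions, not from the article).
W = [[0.5, -0.2], [0.1, 0.3], [-0.4, 0.2]]
b = [0.1, -0.1, 0.0]
c = [0.2, -0.3]

def energy(v, h):
    """E(v, h) = -v·W·h - b·v - c·h for binary vectors v, h."""
    return -(sum(v[i] * W[i][j] * h[j] for i in range(3) for j in range(2))
             + sum(b[i] * v[i] for i in range(3))
             + sum(c[j] * h[j] for j in range(2)))

# The partition function enumerates ALL 2^3 * 2^2 joint states; this brute
# force is only feasible for toy sizes, which is why Z is intractable in
# general and approximations are needed.
Z = sum(math.exp(-energy(v, h))
        for v in product([0, 1], repeat=3)
        for h in product([0, 1], repeat=2))

def prob(v, h):
    return math.exp(-energy(v, h)) / Z
```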

Conditional Random Fields

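As a minimal sketch of the CRF idea, a linear-chain CRF scores a whole label sequence y given the input x and normalizes only over label sequences, P(y|x) = exp(score(y, x)) / Z(x). The emission and transition scores below are assumed numbers for illustration (in practice they come from feature functions of x):

```python
from itertools import product
import math

# Linear-chain CRF sketch (scores are illustrative assumptions):
# score(y | x) = sum_t emit[t][y_t] + sum_t trans[y_{t-1}][y_t].
LABELS = [0, 1]
emit = [[1.0, 0.2], [0.3, 0.8], [0.5, 0.5]]   # per-position label scores from x
trans = [[0.4, 0.1], [0.2, 0.6]]              # label-to-label transition scores

def score(y):
    s = sum(emit[t][y[t]] for t in range(len(y)))
    s += sum(trans[y[t - 1]][y[t]] for t in range(1, len(y)))
    return s

# Z(x) normalizes over every label sequence (brute force for 3 positions;
# forward-backward does this efficiently in real implementations).
Z = sum(math.exp(score(y)) for y in product(LABELS, repeat=3))

def p(y):
    """Conditional probability of a label sequence given the input."""
    return math.exp(score(y)) / Z
```

Unlike the MRF above, the normalizer depends on the observed input x, which is what makes the model conditional (discriminative) rather than a joint model over inputs and labels.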
