A Markov random field or undirected graphical model is a Probabilistic Graphical Model over a collection of random variables $X = (X_1, \dots, X_n)$ defined by an undirected graph $G = (V, E)$. Rather than modeling correlation directly via local conditional dependencies as in a Bayesian Network, one can more generally model functional relationships between random variables in terms of the factorization of the joint distribution of $X$ given by

$$p(x) = \frac{1}{Z} \prod_{C \in \mathcal{C}} \psi_C(x_C)$$

where

  • $\mathcal{C}$ are the maximal cliques (fully-connected subsets of nodes) of $G$
  • $\psi_C$ are clique potentials or factors
  • $Z = \sum_x \prod_{C \in \mathcal{C}} \psi_C(x_C)$ is the Partition Function or normalizing constant to ensure $p$ is a normalized probability distribution

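The factorization above can be made concrete by brute-force enumeration over a small model. The graph (three binary nodes with maximal cliques $\{0,1\}$ and $\{1,2\}$) and the potential values below are purely hypothetical:

```python
import itertools

# Hypothetical 3-node binary MRF with maximal cliques {0, 1} and {1, 2}.
# Clique potentials are arbitrary non-negative functions (values made up):
psi_01 = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}
psi_12 = {(0, 0): 1.0, (0, 1): 2.0, (1, 0): 2.0, (1, 1): 1.0}

def unnormalized(x):
    """Product of clique potentials for a full assignment x = (x0, x1, x2)."""
    return psi_01[(x[0], x[1])] * psi_12[(x[1], x[2])]

# Partition function: sum of the unnormalized measure over all assignments.
Z = sum(unnormalized(x) for x in itertools.product([0, 1], repeat=3))

def p(x):
    """Normalized joint: p(x) = (1/Z) * prod_C psi_C(x_C)."""
    return unnormalized(x) / Z

# The probabilities sum to one by construction of Z.
total = sum(p(x) for x in itertools.product([0, 1], repeat=3))
```

Note that the potentials, unlike the conditional distributions of a Bayesian Network, need not sum to one themselves; the division by $Z$ does all the normalizing.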
Conditional independence of variables occurs when there is an observed variable on every path between them, so that any variable is conditionally independent of the rest of the graph given its neighbors. Formally, by defining the factorization in terms of cliques, the Markov random field implements the Markov property as a local independence assumption:

$$p(x_i \mid x_{V \setminus \{i\}}) = p(x_i \mid x_{\mathcal{N}(i)})$$

where $\mathcal{N}(i)$ denotes the neighbors of node $i$ in $G$. Note:

  • Not all clique potentials lead to a well-defined (finite) partition function
  • $Z$ may be intractable to compute and typically requires approximations
  • $p$ factorizes (as above) if either:
    1. $p(x) > 0$ for all $x$: every factor is strictly positive (the Hammersley–Clifford theorem)
    2. $p$ is equivalent to a Bayesian Network
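The intractability of $Z$ comes from the exponential number of joint assignments, but graph structure can often be exploited. A sketch on a hypothetical binary chain, comparing brute-force summation ($O(2^n)$) with variable elimination ($O(n)$ matrix-vector products):

```python
import itertools

# Binary chain of n nodes with the same (hypothetical) pairwise potential
# on every edge: psi[x_i][x_{i+1}].
psi = [[2.0, 1.0], [1.0, 3.0]]

def Z_brute(n):
    """Partition function by summing over all 2^n assignments."""
    total = 0.0
    for x in itertools.product([0, 1], repeat=n):
        w = 1.0
        for i in range(n - 1):
            w *= psi[x[i]][x[i + 1]]
        total += w
    return total

def Z_elim(n):
    """Same quantity via variable elimination along the chain."""
    m = [1.0, 1.0]  # message: partial sums over the eliminated prefix
    for _ in range(n - 1):
        m = [sum(m[a] * psi[a][b] for a in (0, 1)) for b in (0, 1)]
    return sum(m)
```

For general (loopy) graphs no such linear-time elimination order exists, which is why approximations such as sampling or variational methods are used in practice.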

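The local Markov property stated earlier can be verified numerically on a small chain $0 - 1 - 2$: node $0$'s only neighbor is $1$, so $p(x_0 \mid x_1, x_2) = p(x_0 \mid x_1)$. The potentials below are hypothetical, and the check exploits the fact that $Z$ cancels in conditionals:

```python
# Chain MRF 0 - 1 - 2 with hypothetical clique potentials.
psi_a = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}  # clique {0, 1}
psi_b = {(0, 0): 1.0, (0, 1): 2.0, (1, 0): 2.0, (1, 1): 1.0}  # clique {1, 2}

def measure(x0, x1, x2):
    """Unnormalized joint measure (Z cancels in any conditional)."""
    return psi_a[(x0, x1)] * psi_b[(x1, x2)]

def conditional_x0(x0, x1, x2):
    """p(x0 | x1, x2) by normalizing over x0 only."""
    denom = sum(measure(a, x1, x2) for a in (0, 1))
    return measure(x0, x1, x2) / denom

# Conditioning on the neighbor x1 renders x0 independent of x2:
for x1 in (0, 1):
    assert abs(conditional_x0(1, x1, 0) - conditional_x0(1, x1, 1)) < 1e-12
```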
Example: The graph corresponds to the following factorization:

A Markov network can be converted into a directed graphical model by placing an ordering on the vertices; this conversion loses the Conditional Independence relations for any loop configuration that does not contain chords (shortcut paths). However, the conversion typically goes in the other direction.
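That usual direction, Bayesian Network to Markov network, is moralization: co-parents of each node are connected ("married") and edge directions are dropped. A minimal sketch, using a hypothetical v-structure A → C ← B:

```python
# Hypothetical Bayesian Network given as a child -> parents map: A -> C <- B.
parents = {"A": [], "B": [], "C": ["A", "B"]}

def moralize(parents):
    """Moral graph of a BN: undirected parent-child edges plus 'married' co-parents."""
    edges = set()
    for child, ps in parents.items():
        for p in ps:
            edges.add(frozenset((p, child)))   # keep each edge, undirected
        for i, p in enumerate(ps):             # connect every pair of co-parents
            for q in ps[i + 1:]:
                edges.add(frozenset((p, q)))
    return edges

moral_edges = moralize(parents)
```

The moral graph gains the edge A–B, so the marginal independence of A and B encoded by the v-structure is no longer representable — the same kind of Conditional Independence loss noted above for the reverse conversion.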