#BayesianMachineLearning #ProbabilityTheory #MachineLearning #Statistics
A Probabilistic Graphical Model is a quadruple $(V, E, \mathbf{X}, P)$: a set of nodes $V$, a set of edges $E$, a collection of random variables $\mathbf{X} = \{X_v : v \in V\}$ indexed by the nodes, and a joint distribution $P$ over $\mathbf{X}$ that factorizes according to the graph structure.
Edges represent the direct interactions or correlations between random variables that are to be explicitly modeled; missing edges imply only indirect interactions, which are not explicitly modeled. A given factorization of the full joint distribution corresponds to a set of conditional independencies, which make exact inference more efficient via Belief Propagation algorithms.
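A minimal sketch of how a factorization encodes conditional independencies, assuming made-up conditional probability tables for a binary chain $a \to b \to c$; it verifies by enumeration that the chain factorization implies $a \perp c \mid b$:

```python
import itertools

import numpy as np

# Made-up conditional probability tables for a binary chain a -> b -> c.
p_a = np.array([0.6, 0.4])                    # p(a)
p_b_given_a = np.array([[0.7, 0.3],           # p(b | a), rows indexed by a
                        [0.2, 0.8]])
p_c_given_b = np.array([[0.9, 0.1],           # p(c | b), rows indexed by b
                        [0.4, 0.6]])

# The factorization p(a, b, c) = p(a) p(b|a) p(c|b) defines the full joint.
joint = np.zeros((2, 2, 2))
for a, b, c in itertools.product(range(2), repeat=3):
    joint[a, b, c] = p_a[a] * p_b_given_a[a, b] * p_c_given_b[b, c]

# The implied conditional independence: a ⊥ c | b, i.e.
# p(a, c | b) = p(a | b) p(c | b) for every value of b.
for b in range(2):
    p_acb = joint[:, b, :]                            # p(a, B=b, c)
    p_ac_given_b = p_acb / p_acb.sum()                # p(a, c | b)
    p_a_b = p_ac_given_b.sum(axis=1, keepdims=True)   # p(a | b)
    p_c_b = p_ac_given_b.sum(axis=0, keepdims=True)   # p(c | b)
    assert np.allclose(p_ac_given_b, p_a_b * p_c_b)
print("verified: a ⊥ c | b under the chain factorization")
```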
Key components:
- Deterministic parameters and deterministic variables
- Latent (unobserved) random variables
- Observed random variables

In diagrams, observed variables are drawn shaded and unobserved variables are drawn open.

Examples:
- Directed graph (Bayesian Network/belief network)
The joint distribution for the linear Gaussian model for Bayesian linear regression factorizes as $p(\mathbf{t}, \mathbf{w}) = p(\mathbf{w}) \prod_{n=1}^{N} p(t_n \mid \mathbf{w})$, where the weight vector $\mathbf{w}$ is a parent of every observed target $t_n$.
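A minimal sketch of ancestral sampling from this joint, assuming a zero-mean Gaussian prior $p(\mathbf{w}) = \mathcal{N}(\mathbf{0}, \alpha^{-1}\mathbf{I})$ and Gaussian noise on each target, $p(t_n \mid \mathbf{w}) = \mathcal{N}(\mathbf{w}^\top \boldsymbol{\phi}(x_n), \beta^{-1})$; the basis functions and precisions $\alpha$, $\beta$ are made-up choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up design: N inputs with polynomial basis functions phi(x_n);
# prior precision alpha and noise precision beta are arbitrary choices.
N, alpha, beta = 20, 2.0, 25.0
x = np.linspace(0.0, 1.0, N)
Phi = np.stack([np.ones(N), x, x**2], axis=1)   # phi(x_n) = (1, x_n, x_n^2)

# Ancestral sampling follows the arrows of the directed graph: parents first.
w = rng.normal(0.0, 1.0 / np.sqrt(alpha), size=Phi.shape[1])  # w ~ N(0, alpha^{-1} I)
t = rng.normal(Phi @ w, 1.0 / np.sqrt(beta))                  # t_n ~ N(w^T phi(x_n), beta^{-1})
print("sampled weights:", w)
```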
- Undirected graph (Markov Random Field)
Easy to infer dependencies between variables. For example, modeling the pixel value $x_i$ jointly with its four neighboring pixels $x_j$, $j \in \mathrm{ne}(i)$, as in the sketch below:
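A sketch of such a pairwise MRF as an unnormalized log-density over a binary image, assuming Ising-style potentials $\psi(x_i, x_j) = \exp(J\, x_i x_j)$ on each 4-neighbour grid edge; the coupling $J$ is a made-up value:

```python
import numpy as np

# Ising-style pairwise MRF on a binary image x in {-1, +1}^(H x W):
# one potential psi(x_i, x_j) = exp(J * x_i * x_j) per 4-neighbour edge.
J = 0.8   # made-up coupling strength

def log_potential(x):
    """Unnormalized log p(x): sum of log psi over all grid edges."""
    # Sum interactions with right and down neighbours so each edge counts once.
    return J * (np.sum(x[:, :-1] * x[:, 1:]) + np.sum(x[:-1, :] * x[1:, :]))

x = np.sign(np.random.default_rng(1).normal(size=(4, 4)))   # random +/-1 image
print(log_potential(x))
```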
- Factor graph
Expresses the factorization explicitly, for both undirected graphs and Bayesian Networks: new factor nodes are introduced to index functions of the variable nodes they connect. For example, the following Markov Random Field over pixels (as before, but with more pixel interactions) is ambiguous:
It is unclear whether the interactions between pixels $x_i$ and $x_j$ are direct or indirect, i.e. mediated by their respective interactions with a third pixel $x_k$. We can include more information by explicitly modeling the interactions via factors $f_a(x_i, x_j)$, $f_b(x_j, x_k)$ and $f_c(x_i, x_k)$, which encodes the particular factorization $p(x_i, x_j, x_k) \propto f_a(x_i, x_j)\, f_b(x_j, x_k)\, f_c(x_i, x_k)$.
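To make the distinction concrete, here is a sketch with hypothetical random potentials: both factor graphs touch the same variables with the same pairwise connectivity, yet the pairwise and triple-factor readings encode different joint distributions:

```python
import numpy as np

rng = np.random.default_rng(2)

# A factor graph is an explicit list of factors and the variables each touches.
# Hypothetical random potentials over three binary pixels x_i, x_j, x_k
# (axes 0, 1, 2 of the joint table, in that order).
pairwise = {(0, 1): rng.random((2, 2)),      # f_a(x_i, x_j)
            (1, 2): rng.random((2, 2)),      # f_b(x_j, x_k)
            (0, 2): rng.random((2, 2))}      # f_c(x_i, x_k)
triple = {(0, 1, 2): rng.random((2, 2, 2))}  # one factor over all three pixels

def joint(factors):
    """Multiply all factors into the unnormalized joint table over (x_i, x_j, x_k)."""
    p = np.ones((2, 2, 2))
    for axes, table in factors.items():
        shape = [2 if ax in axes else 1 for ax in range(3)]
        p = p * table.reshape(shape)         # broadcast over untouched variables
    return p

# Same variables, same connectivity, different factorizations => different joints.
print(np.allclose(joint(pairwise), joint(triple)))   # False in general
```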