MachineLearning
Real-world signals are generated by underlying stochastic processes and can therefore be modeled with parametric random processes. While in most situations the Martingale assumption (that the expected observation depends only on some collection of past states) is reasonable, it becomes computationally intractable when the correlation length of the observations is large. These long-range dependencies can be modeled more tractably by introducing latent variables that depend only on the previous latent state, i.e. the latent variables form a Markov process.
A pair of stochastic processes $(X_n, Y_n)_{n \ge 0}$ is called a Hidden Markov Model (HMM) if:
- $(X_n)_{n \ge 0}$ is a Markov process and is not observed (it is hidden),
- each observation depends only on the current hidden state, i.e. $P(Y_n \mid X_{0:n}, Y_{0:n-1}) = P(Y_n \mid X_n)$.
Assuming a discrete-time, discrete-space model (i.e. with $X_n \in \{1, \dots, K\}$ and $Y_n \in \{1, \dots, M\}$), the observation model is given by the conditional distribution (or emission probability) $P(Y_n = y \mid X_n = x)$, with joint distribution:
$$P(X_{1:T}, Y_{1:T}) = P(X_1) \prod_{n=2}^{T} P(X_n \mid X_{n-1}) \prod_{n=1}^{T} P(Y_n \mid X_n).$$
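As a quick illustration of this factorization, here is a minimal Python sketch of a two-state, two-symbol HMM; the parameter names `pi`, `A`, `B` and their values are hypothetical choices for this example, not taken from the text. It samples a state/observation path and evaluates the joint probability term by term.

```python
# Minimal sketch of a discrete HMM with made-up parameters (two hidden states,
# two observation symbols); illustrates the joint-distribution factorization.
import numpy as np

rng = np.random.default_rng(0)

pi = np.array([0.6, 0.4])             # initial distribution P(X_1 = i)
A = np.array([[0.7, 0.3],             # transitions A[i, j] = P(X_{n+1} = j | X_n = i)
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],             # emissions B[i, y] = P(Y_n = y | X_n = i)
              [0.2, 0.8]])

def sample(T):
    """Draw a hidden path X_{1:T} and observations Y_{1:T}."""
    x = [rng.choice(2, p=pi)]
    for _ in range(T - 1):
        x.append(rng.choice(2, p=A[x[-1]]))
    y = [rng.choice(2, p=B[xi]) for xi in x]
    return np.array(x), np.array(y)

def joint_prob(x, y):
    """P(X_{1:T} = x, Y_{1:T} = y) via the Markov/emission factorization."""
    p = pi[x[0]] * B[x[0], y[0]]
    for n in range(1, len(x)):
        p *= A[x[n - 1], x[n]] * B[x[n], y[n]]
    return p

x, y = sample(5)
print("states:", x, "observations:", y, "joint probability:", joint_prob(x, y))
```

Note that `joint_prob` multiplies exactly one transition factor per step and one emission factor per observation, mirroring the product form of the joint distribution above.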