Markov operators generalize the transition matrix of discrete-time Markov chains to an uncountable state space for general Markov processes. Given a measurable space $(E, \mathcal{E})$, a Markov operator is induced by a Markov (or transition) kernel $p \colon E \times \mathcal{E} \to [0, 1]$ where:

  1. for each $x \in E$, the map $A \mapsto p(x, A)$ is a probability measure on $(E, \mathcal{E})$
  2. for each $A \in \mathcal{E}$, the map $x \mapsto p(x, A)$ is a measurable function

$p(x, A)$ gives the probability of starting at $x$ and ending in the set $A$.
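For concreteness, one standard example (an illustration, not part of the original text) is the Gaussian random-walk kernel on $E = \mathbb{R}$ with the Borel $\sigma$-algebra:

$$p(x, A) = \int_A \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(y - x)^2 / (2\sigma^2)}\, \mathrm{d}y.$$

For each fixed $x$ this is a probability measure in $A$, and for each fixed $A$ it is a measurable function of $x$, so both conditions above are satisfied.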

The induced Markov operator is the integral operator $T$:

$$(Tf)(x) = \int_E f(y)\, p(x, \mathrm{d}y).$$
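The sketch below shows how $T$ acts on a test function numerically, using the Gaussian kernel from the example above. It is a minimal illustration under assumed names (`kernel_density`, `markov_operator`, `SIGMA` are hypothetical), not an implementation from the original text.

```python
import numpy as np
from scipy.integrate import quad

# Assumed example: Gaussian random-walk kernel p(x, dy) with density N(y; x, sigma^2) dy.
SIGMA = 1.0

def kernel_density(x, y, sigma=SIGMA):
    """Transition density of p(x, dy): a Gaussian centred at x."""
    return np.exp(-(y - x) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

def markov_operator(f, x):
    """(T f)(x) = integral over E of f(y) p(x, dy), by numerical quadrature."""
    value, _ = quad(lambda y: f(y) * kernel_density(x, y), -np.inf, np.inf)
    return value

if __name__ == "__main__":
    # A bounded measurable test function; E[cos(x + Z)] = cos(x) * exp(-sigma^2 / 2).
    print(markov_operator(np.cos, 0.0))           # ~ 0.6065
    # T maps the constant function 1 to 1, since each p(x, .) is a probability measure.
    print(markov_operator(lambda y: 1.0, 2.0))    # ~ 1.0
```

The second print illustrates a defining property of Markov operators: they preserve constants because the kernel integrates to one in its second argument.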