From Wikipedia, the free encyclopedia
In mathematics, a Markov information source, or simply, a Markov source, is an information source whose underlying dynamics are given by a stationary finite Markov chain.
Formal definition
An information source is a sequence of random variables ranging over a finite alphabet Γ and having a stationary distribution. A Markov information source is then a (stationary) Markov chain M, together with a function f that maps states S in the Markov chain to letters in the alphabet Γ.
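The definition above can be sketched in code. In this minimal sketch, the chain, its transition probabilities, and the state-to-letter map f are all illustrative assumptions, not taken from the article:

```python
import random

# A finite Markov chain over states {0, 1, 2}, given by a row-stochastic
# transition matrix, together with a function f mapping states to letters
# of the alphabet {"a", "b"}. All numbers here are illustrative.
P = [
    [0.5, 0.5, 0.0],   # transition probabilities out of state 0
    [0.0, 0.5, 0.5],   # out of state 1
    [0.5, 0.0, 0.5],   # out of state 2
]
f = {0: "a", 1: "b", 2: "a"}

def emit(n, state=0, rng=None):
    """Run the chain for n steps, outputting the letter f(state) at each step."""
    rng = rng or random.Random(0)  # fixed seed for a reproducible sketch
    out = []
    for _ in range(n):
        out.append(f[state])
        state = rng.choices(range(len(P)), weights=P[state])[0]
    return "".join(out)
```

The observer sees only the emitted letters, not the underlying states; since two states map to "a" here, the state sequence is in general not recoverable from the output alone.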
A unifilar Markov source is a Markov source for which the values f(s) are distinct whenever each of the states s is reachable, in one step, from a common prior state. Unifilar sources are notable in that many of their properties are far more easily analyzed than those of the general case.
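The unifilar condition can be checked directly from the transition matrix and the state-to-letter map. The chain and map below are illustrative assumptions; the check itself just restates the definition:

```python
# Unifilar check: from every state, the one-step successors must all map
# to *distinct* letters. P is a row-stochastic transition matrix and f a
# state-to-letter map; both are illustrative examples.
P = [
    [0.5, 0.5, 0.0],
    [0.0, 0.5, 0.5],
    [0.5, 0.0, 0.5],
]
f = {0: "a", 1: "b", 2: "a"}

def is_unifilar(P, f):
    for row in P:
        successors = [s for s, p in enumerate(row) if p > 0]
        letters = [f[s] for s in successors]
        if len(letters) != len(set(letters)):
            return False   # two one-step successors share a letter
    return True
```

For this example the source is not unifilar: state 2 can step to states 0 and 2, which both emit "a". Relabeling state 2 to a third letter would make it unifilar.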
Applications
Markov sources are commonly used in communication theory, as a model of a transmitter. Markov sources also occur in natural language processing, where they are used to represent hidden meaning in a text. Given the output of a Markov source, whose underlying Markov chain is unknown, the task of solving for the underlying chain is undertaken by the techniques of hidden Markov models, such as the Viterbi algorithm.
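As a hedged sketch of that recovery task: given the letters emitted by a hidden chain, the Viterbi algorithm finds the most probable underlying state sequence. The two-state chain and emission probabilities below are illustrative assumptions, not from the article (a general hidden Markov model allows probabilistic emissions, of which a deterministic state-to-letter map is a special case):

```python
import math

# Illustrative two-state hidden chain: state 0 mostly emits "a",
# state 1 mostly emits "b". All probabilities are assumptions.
states = [0, 1]
start = [0.5, 0.5]                       # initial state distribution
trans = [[0.7, 0.3], [0.4, 0.6]]         # transition probabilities
emit_p = [{"a": 0.9, "b": 0.1},          # P(letter | state)
          {"a": 0.2, "b": 0.8}]

def viterbi(obs):
    """Return the most probable state path for the observed letters."""
    # log-probability of the best path ending in each state, first letter
    score = [math.log(start[s]) + math.log(emit_p[s][obs[0]]) for s in states]
    back = []                            # backpointers, one list per step
    for letter in obs[1:]:
        prev = score[:]
        step, score = [], []
        for s in states:
            best = max(states, key=lambda r: prev[r] + math.log(trans[r][s]))
            score.append(prev[best] + math.log(trans[best][s])
                         + math.log(emit_p[s][letter]))
            step.append(best)
        back.append(step)
    # trace the best path backwards from the best final state
    path = [max(states, key=lambda s: score[s])]
    for step in reversed(back):
        path.append(step[path[-1]])
    return list(reversed(path))
```

For instance, the observation "aaaa" is best explained by staying in state 0 throughout, so `viterbi("aaaa")` returns `[0, 0, 0, 0]`.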