Meaning of Markov chain

Definition of Markov chain

(noun) a Markov process in which the time parameter takes discrete values; the next state depends only on the current state, not on the earlier history (the Markov property)
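
To make the definition concrete, here is a minimal sketch of a discrete-time Markov chain in Python, using a hypothetical two-state weather example; the state names and transition probabilities are made up for illustration, not taken from any source.

    import random

    # Hypothetical two-state chain: the states and probabilities are illustrative only.
    states = ["sunny", "rainy"]
    # transition[i][j] = probability of moving from state i to state j in one time step
    transition = [
        [0.8, 0.2],   # from "sunny"
        [0.4, 0.6],   # from "rainy"
    ]

    def simulate(start, steps, seed=0):
        """Walk the chain for a given number of discrete time steps."""
        random.seed(seed)
        current = start
        path = [states[current]]
        for _ in range(steps):
            # Markov property: the next state depends only on the current state.
            current = random.choices([0, 1], weights=transition[current])[0]
            path.append(states[current])
        return path

    print(simulate(start=0, steps=10))

Running the script prints one possible sequence of states, e.g. a list of "sunny"/"rainy" values; the time parameter is the step index, which is what makes this a Markov chain rather than a continuous-time Markov process.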
