Markov chain

noun

1 Markov chain

A Markov process in which the time parameter takes discrete values.

synonym: Markoff chain.

Dutch: Markovketen, Markov-keten
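As a minimal illustration of the definition above, the sketch below simulates a discrete-time Markov chain with two states; the next state depends only on the current state, and time advances in whole steps. The states and transition probabilities are made up for illustration and are not part of the entry.

```python
import random

# Illustrative transition probabilities (an assumption, not from the entry):
# each row lists (next_state, probability) pairs summing to 1.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Draw the next state using only the current state's transition row."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def walk(state, n):
    """Return the chain's states over n discrete time steps."""
    path = [state]
    for _ in range(n):
        state = step(state)
        path.append(state)
    return path

print(walk("sunny", 5))
```

The "memorylessness" is visible in `step`: it consults only the current state, never the earlier history, which is exactly what makes the process a Markov chain.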
