- Markov chain as a noun:
Markov chain
noun
1 Markov chain
A Markov process whose parameter takes discrete time values.
synonym: Markoff chain.
Dutch: Markovketen, Markov-keten
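A minimal sketch of the definition above: a Markov chain advances in discrete time steps, with the next state depending only on the current one. The two-state transition matrix `P` here is an illustrative assumption, not part of the entry.

```python
import random

# Hypothetical transition matrix for a two-state chain (states 0 and 1).
# P[i][j] is the probability of moving from state i to state j;
# each row sums to 1.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def simulate(steps, start=0, seed=42):
    """Run the chain for a discrete number of time steps."""
    random.seed(seed)
    state, path = start, [start]
    for _ in range(steps):
        # Next state depends only on the current state (Markov property).
        state = 0 if random.random() < P[state][0] else 1
        path.append(state)
    return path

path = simulate(10)
```

Because time is discrete, the trajectory is just a finite sequence of states, one per step.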