Markoff chain
noun
1. A Markov process whose parameter takes discrete time values.
synonym: Markov chain.
Dutch: Markovketen, Markov-keten
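The defining property above (the process evolves over discrete time steps, and the next state depends only on the current one) can be illustrated with a minimal sketch; the state names and transition probabilities here are invented for illustration:

```python
import random

# Transition probabilities for a hypothetical two-state Markov chain.
# Each row gives P(next state | current state).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Draw the next state; it depends only on the current state."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Run the chain for n discrete time steps from a start state."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1], rng))
    return states
```

Because each transition consults only the current state, the sequence produced by `simulate` is a Markov chain in the sense of the definition: the time parameter is the discrete step index.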