We do not currently know of any antonyms for Markov process.
The noun Markov process is defined as:
- Any stochastic process for which the conditional probability distribution of future states depends only on the current state (and not on past states).
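This defining condition is commonly called the Markov property. As an illustrative sketch (not part of the dictionary entry itself), for a discrete-time process $X_0, X_1, X_2, \dots$ it can be written as:

```latex
% The Markov property for a discrete-time stochastic process X_0, X_1, ...
% Illustrative formulation: a discrete time index and discrete states are assumed.
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0)
  = P(X_{n+1} = x \mid X_n = x_n)
```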