Markov Chain
- System moves between a finite set of states with fixed probabilities for each transition.
- The next state depends only on the current state (the “memoryless” property).
- You can compute multi-step transition probabilities by applying the fixed transition probabilities over successive steps.
Definition
A Markov chain is a mathematical system that undergoes transitions from one state to another according to probabilistic rules. Its defining characteristic is that the probability of each next state depends only on the current state, not on how the system arrived there; this is known as the “memoryless” property.
Explanation
In a Markov chain, the probabilities of moving from the current state to each possible next state are fixed and known. Starting from a given state, those transition probabilities determine the likelihood of the system being in each state after one step. The probability of being in each state after several steps can be calculated by applying the fixed transition probabilities repeatedly, one step at a time (equivalently, by raising the transition matrix to the corresponding power).
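To make the multi-step calculation concrete, here is a minimal sketch in Python (NumPy is an assumption, and every number in the 3×3 transition matrix is made up for illustration). It computes the distribution over states after n steps by raising the transition matrix to the n-th power.

```python
import numpy as np

# Illustrative 3-state transition matrix: rows are the current state,
# columns are the next state, and each row sums to 1.
# All entries are made up for this sketch.
P = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.1, 0.4, 0.5],
])

start = np.array([1.0, 0.0, 0.0])  # begin in state 0 with certainty

# Distribution after n steps: apply the fixed transition probabilities
# repeatedly, i.e. multiply by P n times (P raised to the n-th power).
n = 5
dist = start @ np.linalg.matrix_power(P, n)
print(dist)  # probability of being in each of the three states after 5 steps
```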
Examples
Weather model
Three states: sunny, cloudy, and rainy. The probability of transitioning from one state to another is fixed and known. For instance, the probability of transitioning from sunny to cloudy may be 0.3, while the probability of transitioning from cloudy to rainy may be 0.7.
If we start in the sunny state, these probabilities give the likelihood of each state after one step: the probability of moving to cloudy is 0.3, the probability of moving to rainy is 0, and the remaining probability, 0.7, is the chance of staying sunny (each state’s outgoing probabilities must sum to 1). Applying the transition probabilities repeatedly gives the overall probability of being in each state after any number of steps, as sketched in the code below.
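A sketch of the weather example (again assuming NumPy): only the sunny→cloudy, sunny→rainy, and cloudy→rainy probabilities come from the text above; the remaining matrix entries are assumptions chosen so that each row sums to 1.

```python
import numpy as np

states = ["sunny", "cloudy", "rainy"]

# Only sunny->cloudy (0.3), sunny->rainy (0.0), and cloudy->rainy (0.7)
# come from the example; the other entries are illustrative assumptions.
P = np.array([
    [0.7, 0.3, 0.0],   # from sunny
    [0.2, 0.1, 0.7],   # from cloudy (0.2 and 0.1 are assumed)
    [0.3, 0.3, 0.4],   # from rainy (entire row assumed)
])

sunny = np.array([1.0, 0.0, 0.0])  # start in the sunny state
for n in (1, 2, 7):
    dist = sunny @ np.linalg.matrix_power(P, n)
    print(n, dict(zip(states, dist.round(3))))
```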
Stock market model
Three states: bullish, neutral, and bearish. The probability of transitioning from one state to another is fixed and known. For instance, the probability of transitioning from bullish to neutral may be 0.5, while the probability of transitioning from neutral to bearish may be 0.2.
If we start in the bullish state, the probability of moving to neutral is 0.5 and the probability of moving to bearish is 0.1, leaving a 0.4 probability of remaining bullish. As before, applying the transition probabilities repeatedly gives the distribution over states after any number of steps; the sketch below instead simulates a single sample path through the chain.
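A complementary sketch for the stock market example, again assuming NumPy: rather than computing the exact distribution, it simulates one sample path, drawing each next state from the current state’s row alone, which is exactly the memoryless property. Only the bullish→neutral, bullish→bearish, and neutral→bearish probabilities come from the text; the rest of the matrix is assumed.

```python
import numpy as np

rng = np.random.default_rng(0)
states = ["bullish", "neutral", "bearish"]

# bullish->neutral (0.5), bullish->bearish (0.1), and neutral->bearish (0.2)
# come from the example; all other entries are illustrative assumptions.
P = np.array([
    [0.4, 0.5, 0.1],   # from bullish
    [0.3, 0.5, 0.2],   # from neutral (0.3 and 0.5 are assumed)
    [0.2, 0.3, 0.5],   # from bearish (entire row assumed)
])

# Simulate one sample path: each next state is drawn using only the
# current state's row of probabilities, regardless of earlier history.
state = 0  # start bullish
path = [states[state]]
for _ in range(10):
    state = rng.choice(3, p=P[state])
    path.append(states[state])
print(" -> ".join(path))
```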
Use cases
Markov chains allow prediction of a system’s future state based on its current state and the fixed transition probabilities, which can be useful for making decisions or planning strategies in various real-world situations.
Related terms
- “memoryless” property