## Absorbing Markov Chains

An absorbing Markov chain is a type of Markov chain with one or more “absorbing” states, also known as “sink” states, that the system cannot leave once it enters them. Formally, a state is absorbing when it transitions to itself with probability 1; once the system reaches such a state, it remains there indefinitely and never returns to any of the non-absorbing (transient) states.

For example, consider a simple weather model where the states are “sunny”, “cloudy”, and “rainy”. If the system starts in the “sunny” state, it can either stay sunny or become cloudy with some probability. From “cloudy”, it can either stay cloudy or become rainy. Once it becomes rainy, it remains rainy and never returns to the sunny or cloudy states. In this case, the “rainy” state is the absorbing state.
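The weather example above can be written down as a transition matrix. The specific probabilities below are hypothetical (the text only says the transitions happen “with a certain probability”); the key structural point is that the “rainy” row puts all of its probability on itself:

```python
import numpy as np

# Hypothetical transition probabilities for the weather example.
# Rows and columns are ordered: sunny, cloudy, rainy.
# Each row sums to 1 (it is a probability distribution over next states).
P = np.array([
    [0.6, 0.4, 0.0],  # sunny  -> stays sunny or turns cloudy
    [0.0, 0.7, 0.3],  # cloudy -> stays cloudy or turns rainy
    [0.0, 0.0, 1.0],  # rainy  -> absorbing: stays rainy forever
])

def is_absorbing(P, i):
    # A state is absorbing exactly when its row puts probability 1 on itself.
    return P[i, i] == 1.0

print([is_absorbing(P, i) for i in range(3)])
```

Only the last state satisfies the absorbing condition, matching the description: once the chain enters “rainy”, every future step stays there.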

Another example is a model of customer behavior in a store. The states are “shopping”, “deciding”, and “purchased”. If a customer starts in the “shopping” state, they can either continue shopping or move to the “deciding” state with a certain probability. If they move to the “deciding” state, they can either continue deciding or move to the “purchased” state with a certain probability. Once they enter the “purchased” state, they will not return to the “shopping” or “deciding” states, and will remain in the “purchased” state indefinitely. In this case, the “purchased” state is the absorbing state.

Absorbing Markov chains are useful for modeling systems where some outcome is irreversible or impossible to escape once it is reached. From such a model one can compute the probability of eventually reaching each absorbing state and the expected number of steps until absorption.
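These quantities can be computed in closed form with the standard fundamental-matrix approach: writing the transition matrix in block form with `Q` (transient-to-transient) and `R` (transient-to-absorbing), the fundamental matrix is `N = (I - Q)^-1`, the absorption probabilities are `B = N R`, and the expected steps to absorption are `t = N 1`. The sketch below applies this to the customer example, with hypothetical probabilities filled in for illustration:

```python
import numpy as np

# Hypothetical transition probabilities for the customer example.
# Transient states (in order): shopping, deciding.
# Absorbing state: purchased.
Q = np.array([
    [0.5, 0.5],   # shopping -> keeps shopping / moves to deciding
    [0.0, 0.6],   # deciding -> keeps deciding (remainder goes to purchased)
])
R = np.array([
    [0.0],        # shopping -> purchased directly (assumed impossible here)
    [0.4],        # deciding -> purchased
])

# Fundamental matrix: N[i, j] is the expected number of visits to
# transient state j when starting from transient state i.
N = np.linalg.inv(np.eye(2) - Q)

# Probability of eventually being absorbed, from each transient state.
B = N @ R

# Expected number of steps before absorption, from each transient state.
t = N @ np.ones((2, 1))

print(B.ravel())  # absorption probabilities
print(t.ravel())  # expected steps from shopping and from deciding
```

With these particular numbers, absorption is certain from both transient states (both entries of `B` are 1), and a customer starting in “shopping” takes 4.5 steps on average to reach “purchased”, versus 2.5 steps starting from “deciding”.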