Markov Chain
Introduction
A Markov chain is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. A countably infinite sequence, in which the chain changes state at discrete time steps, gives a discrete-time Markov chain. Markov chains are a fairly common and relatively simple way to statistically model random processes. They have been used in many different domains, ranging from text generation to financial modeling, and they model probabilities using only the information that can be encoded in the current state. A system transitions from one state to another semi-randomly, or stochastically. Each state has a certain probability of transitioning to every other state, so whenever the system is in a state and is about to transition, a Markov chain can predict the outcome from pre-existing probability data.
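In symbols, if X(0), X(1), X(2), ... denote the successive states of the chain, this "memoryless" property can be written as:

P(X(t+1) | X(t), X(t-1), ..., X(0)) = P(X(t+1) | X(t))

That is, the next state depends only on the current state, not on the whole history of states before it.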
This lab-work report was completed by myself under the supervision of Nuruzzaman Faruqui, Lecturer at City University, Bangladesh. From this course, we acquired a better understanding of the functionality of AI and learned how AI is making our daily lives easier. This is the best Artificial Intelligence course in Bangladesh.
Problem Statement
To construct a Markov chain, we need a transition model that specifies the probability distribution of the next event based on the possible values of the current event.
Imagine that there are two possible states for the weather: sunny or rainy. We can always directly observe the current weather state, and it is guaranteed to be one of these two states.
Given this transition model, let's now build a Markov chain from it.
In the following figure, the probability of tomorrow being sunny given that today is sunny is 0.8. This is reasonable, because it is more likely than not that a sunny day will follow a sunny day. However, if it is rainy today, the probability of rain tomorrow is 0.7, since rainy days are more likely to follow each other. Using this transition model, it is possible to sample from the Markov chain, and we implement the model in Python below.
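Written out as a table, the transition model looks as follows (the 0.2 and 0.3 entries are implied by the stated values, since the probabilities in each row must sum to 1):

Today    P(sunny tomorrow)    P(rainy tomorrow)
Sunny    0.8                  0.2
Rainy    0.3                  0.7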
Code Commentary:
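The code listing itself is not reproduced in this write-up, so the following is a minimal Python sketch of how the model can be sampled. The transition probabilities are the ones stated above; the names transition_model and sample_chain and the choice of 50 sampled days are illustrative assumptions, not necessarily the exact code used in the lab.

import random

# Transition model: P(tomorrow's weather | today's weather),
# using the probabilities discussed above.
transition_model = {
    "sun": {"sun": 0.8, "rain": 0.2},
    "rain": {"sun": 0.3, "rain": 0.7},
}

def sample_chain(start, days, model):
    """Sample a sequence of weather states from the Markov chain."""
    state = start
    sequence = [state]
    for _ in range(days - 1):
        options = list(model[state].keys())
        weights = list(model[state].values())
        # Pick tomorrow's state according to today's transition probabilities.
        state = random.choices(options, weights=weights, k=1)[0]
        sequence.append(state)
    return sequence

if __name__ == "__main__":
    chain = sample_chain(start="sun", days=50, model=transition_model)
    print(chain)

Because the sampling is random, each run produces a different 50-day sequence; random.choices simply draws the next state in proportion to the current state's row of transition probabilities.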
Result
When we executed the code, we got the following output.
Conclusion
From the above discussion of the wide range of uses of Markov chains, we should now be able to implement them easily in any language of our choice. There are also many more advanced properties of Markov chains and Markov processes to dive into. We can apply this model to problems such as predicting traffic flows, communication networks, genetic analysis, and many more.