The Markov chain is an important mathematical concept that is useful not only in the study of statistics in college, but also in real-life problems and situations. A systems engineer will find this concept quite useful in his career, along with the study and analysis of discrete dynamical systems, which model real-life situations. This web page will give you a detailed and accurate description and understanding of the Markov chain.

     Dynamical System

     A dynamical system is a system that changes in time. A discrete dynamical system is a system that changes at discrete points in time. A dynamical system model gives us a good, but not completely accurate, description of how real-world quantities change over time. Dynamical systems model quantities such as the population size of a species in an ecosystem, interest on loans or savings accounts, the price of an economic good, the number of people contracting a disease, the amount of pollutant in a lake or river, or the amount of a drug in a person's bloodstream. The state of a dynamical system is its value at a given time period. A first-order discrete dynamical system is a system in which the current state depends on the previous state; likewise, a second-order dynamical system is a system in which the current state depends on the previous two states. A first-order discrete dynamical system has the form a(n+1)=f(a(n)).

     First-order discrete dynamical systems may be linear, affine, or nonlinear. Linear first-order discrete dynamical systems have the form a(n+1)=ra(n), n=0,1,2,...; a dynamical system is linear if the graph of its function, y=f(x), is a straight line through the origin. Affine first-order discrete dynamical systems have the form a(n+1)=ra(n)+b, n=0,1,2,...; a dynamical system is affine if the graph of its function, y=f(x), is a straight line whose y-intercept is not equal to 0. If the graph of the function of the dynamical system, y=f(x), is not a straight line, then the dynamical system is nonlinear.

     Dynamical systems sometimes have equilibrium states, which are states at which the dynamical system does not change. The equilibrium state of a linear first-order discrete dynamical system is 0 (when r is not equal to 1). The equilibrium state of an affine first-order discrete dynamical system is a=b/(1-r), provided r is not equal to 1. If r=1 and b=0, then every number is an equilibrium state; if r=1 and b is not equal to 0, then there is no equilibrium state.
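
     As an illustration, the following minimal Python sketch iterates an affine first-order system a(n+1)=ra(n)+b and compares the trajectory with the equilibrium state b/(1-r). The values of r, b, and the initial state are arbitrary choices made up for this example, not taken from the text.

    # Iterate the affine first-order system a(n+1) = r*a(n) + b and compare
    # the trajectory with the equilibrium state b/(1 - r).
    # The values of r, b, and a0 below are arbitrary choices for illustration.
    r, b, a0 = 0.5, 10.0, 100.0

    def iterate(a0, r, b, steps):
        """Return the states a(0), a(1), ..., a(steps) of the affine system."""
        states = [a0]
        for _ in range(steps):
            states.append(r * states[-1] + b)
        return states

    states = iterate(a0, r, b, steps=20)
    equilibrium = b / (1 - r)  # defined because r is not equal to 1

    print("first few states:", states[:5])        # 100.0, 60.0, 40.0, 30.0, 25.0
    print("state after 20 steps:", states[-1])    # very close to 20.0
    print("equilibrium b/(1-r):", equilibrium)    # 20.0
    # Because |r| < 1, the states approach the equilibrium state 20.0.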

     Markov Chain

A Markov process is a stochastic process satisfying the so-called 'Markov property': the next state of the process depends only on its current state, not on the states that came before it. When choosing the lengths of the Markov chains used in an algorithm such as simulated annealing, many techniques take into account ideas based on theoretical considerations while others are more arbitrary in their construction. In general, Markov chain lengths based on theoretical considerations are no better or worse than those based on more arbitrary, simple-to-implement ideas. When implementing the algorithm, a good policy is to start simply and only try something of a more complicated nature if satisfactory results do not arise. It is also prudent to view the simulated annealing algorithm as a method to be optimised rather than getting tangled up in what it all means. Nevertheless, since the problem under consideration, and especially its magnitude, will probably influence the appropriate chain length, it is desirable to link the length of the Markov chains to the size of the problem in some manner.
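
To make the Markov property concrete, here is a minimal Python sketch of a two-state Markov chain; the states and transition probabilities are made up for illustration. At every step the next state is drawn using only the current state.

    import random

    # Hypothetical two-state chain: state 0 and state 1.
    # transition[i][j] is the probability of moving from state i to state j.
    transition = [
        [0.9, 0.1],   # from state 0: stay with probability 0.9, move with 0.1
        [0.5, 0.5],   # from state 1: move with probability 0.5, stay with 0.5
    ]

    def simulate(transition, start, length):
        """Simulate a Markov chain of the given length from the start state.
        The next state depends only on the current state (Markov property)."""
        state = start
        chain = [state]
        for _ in range(length - 1):
            state = random.choices([0, 1], weights=transition[state])[0]
            chain.append(state)
        return chain

    print(simulate(transition, start=0, length=20))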

     Example of a Markov Chain