Statistical & Financial Consulting by Stanford PhD


A stochastic process X(t) is a Markov chain if

1. the set of possible values of X(t) is finite or countable (a discrete state space);

2. given the present, the future is independent of the past: for any times t_1 < ... < t_{n-1} < t_n and values x_1, ..., x_{n-1}, x_n,

P(X(t_n) = x_n | X(t_1) = x_1, ..., X(t_{n-1}) = x_{n-1}) = P(X(t_n) = x_n | X(t_{n-1}) = x_{n-1}).
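The Markov property means all transition behavior is captured by a matrix P whose entry P[i][j] is the one-step probability of moving from state i to state j, and n-step probabilities follow by matrix powers (the Chapman-Kolmogorov equations). A minimal sketch in Python; the three-state matrix below is illustrative, not taken from the text:

```python
# Illustrative 3-state transition matrix: P[i][j] = P(next state = j | current state = i).
# Each row sums to 1 (a probability distribution over next states).
P = [
    [0.7, 0.2, 0.1],   # transitions out of state 0
    [0.3, 0.4, 0.3],   # transitions out of state 1
    [0.2, 0.3, 0.5],   # transitions out of state 2
]

def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Because the future depends on the past only through the present state,
# the two-step transition probabilities are simply the matrix square of P.
P2 = matmul(P, P)
print(P2[0][1])   # probability of being in state 1 two steps after starting in state 0
```

Each row of P2 is again a probability distribution, so the rows of any matrix power of P sum to 1.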

The possible values of X(t) are also called the states of the chain. A Markov chain is called:

- *irreducible* if it can get from any state to any other state in one or more steps;
- *aperiodic* if 1 is the greatest common divisor of all possible numbers of steps in which the chain can return to the same state;
- *positive recurrent* if the expected time it takes to return to any state is finite;
- *finite* if the number of states is finite;
- *infinite* if the number of states is infinite;
- *ergodic* if it is aperiodic and positive recurrent (in practice this means that, if the chain is irreducible, it converges to the same limiting distribution no matter where it starts).
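The convergence claim for ergodic chains can be checked numerically: iterating the distribution forward from two different starting states yields the same limit. A sketch using an illustrative two-state chain with P = [[1-a, a], [b, 1-b]], whose stationary distribution is known in closed form to be (b/(a+b), a/(a+b)):

```python
# Illustrative irreducible, aperiodic (hence ergodic) two-state chain.
a, b = 0.3, 0.1
P = [[1 - a, a], [b, 1 - b]]

def step(dist, P):
    """One step of the chain: multiply the row vector of state
    probabilities by the transition matrix."""
    return [sum(dist[i] * P[i][j] for i in range(len(dist)))
            for j in range(len(P))]

def limit(dist, P, n=200):
    """Iterate the distribution forward n steps (power iteration)."""
    for _ in range(n):
        dist = step(dist, P)
    return dist

# Start the chain in state 0 with certainty, then in state 1 with certainty.
from_state_0 = limit([1.0, 0.0], P)
from_state_1 = limit([0.0, 1.0], P)
print(from_state_0, from_state_1)   # both approach (b/(a+b), a/(a+b)) = (0.25, 0.75)
```

Both starting points converge to the same limiting distribution, as ergodicity of an irreducible chain predicts; for a periodic chain (e.g. one that alternates deterministically between two states) this iteration would oscillate instead of converging.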

