Stochastic Processes
Table of Contents
1. Markov Chains
- Stationary distribution as a high power of the Markov transition matrix
- Representation as a transition matrix
- Probability of being in state \(x\) given starting in state \(y\)
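The first item above can be illustrated numerically: raising the transition matrix to a high power makes every row converge to the stationary distribution. A minimal sketch with NumPy, using a hypothetical 2-state chain chosen only for illustration:

```python
import numpy as np

# Hypothetical 2-state transition matrix (each row sums to 1).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# For a high power of P, every row approaches the stationary
# distribution pi, which satisfies pi = pi @ P.
P_high = np.linalg.matrix_power(P, 50)
pi = P_high[0]

print(pi)  # each row of P_high is (approximately) the stationary distribution
```

Row \(y\) of \(P^n\) also gives the probability of being in state \(x\) after \(n\) steps given a start in state \(y\), which is the third item above.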
1.1. Definitions
- Transient vs Recurrent States
- "Closed set"
2. Stationary distribution/measure
3. Convergence
- Convergence of values vs. convergence in probability
- "Almost sure convergence"
- Bounded Convergence
- Dominated Convergence
4. Renewals
- Die Example
5. Poisson Processes
- Exponential variable: time between arrivals
- Poisson variable: number of arrivals
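The two bullets above are two views of the same process: if interarrival times are i.i.d. exponential with rate \(\lambda\), then the number of arrivals in \([0, T]\) is Poisson with mean \(\lambda T\). A simulation sketch (rate and horizon are illustrative choices, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
rate, T, trials = 2.0, 10.0, 20000  # illustrative parameters

# Count arrivals in [0, T] by accumulating exponential interarrival times.
counts = []
for _ in range(trials):
    t, n = 0.0, 0
    while True:
        t += rng.exponential(1 / rate)  # waiting time to the next arrival
        if t > T:
            break
        n += 1
    counts.append(n)

# A Poisson(rate * T) count has mean and variance both equal to
# rate * T = 20, which the sample statistics should approximate.
print(np.mean(counts), np.var(counts))
```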
6. Useful Identities
Law of Total Expectation and Law of Total Probability
Law of total variance: \(\operatorname{var}(Y) = E(\operatorname{var}(Y \mid X)) + \operatorname{var}(E(Y \mid X))\)
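The variance identity above can be checked by simulation. A sketch under an assumed toy model (my choice, not from the notes): \(X\) uniform on \(\{1,\dots,6\}\) like a die roll, and \(Y \mid X \sim \mathcal{N}(X, 1)\), so \(\operatorname{var}(Y \mid X) = 1\) and \(E(Y \mid X) = X\).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Toy model: X ~ Uniform{1..6}, Y | X ~ Normal(X, 1).
X = rng.integers(1, 7, size=n)
Y = rng.normal(loc=X, scale=1.0)

# Left side: var(Y). Right side: E(var(Y|X)) + var(E(Y|X)) = 1 + var(X).
lhs = np.var(Y)
rhs = 1.0 + np.var(X.astype(float))
print(lhs, rhs)  # the two sides agree up to sampling error
```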