Book contents
- Frontmatter
- Contents
- Preface
- Introduction
- PART ONE PROBABILITY IN ACTION
- PART TWO ESSENTIALS OF PROBABILITY
- 7 Foundations of probability theory
- 8 Conditional probability and Bayes
- 9 Basic rules for discrete random variables
- 10 Continuous random variables
- 11 Jointly distributed random variables
- 12 Multivariate normal distribution
- 13 Conditioning by random variables
- 14 Generating functions
- 15 Discrete-time Markov chains
- 16 Continuous-time Markov chains
- Appendix: Counting methods and ex
- Recommended reading
- Answers to odd-numbered problems
- Bibliography
- Index
15 - Discrete-time Markov chains
Published online by Cambridge University Press: 05 August 2012
Summary
In previous chapters we dealt with sequences of independent random variables. Many random systems evolving in time, however, involve sequences of dependent random variables: think of the outside temperature on successive days, or the price of IBM stock at the end of successive trading days. Many such systems have the property that the current state alone contains enough information to determine the probability distribution of the next state. A probability model with this feature is called a Markov chain. The concepts of state and state transition are at the heart of Markov chain analysis, and thinking in terms of these concepts is very useful for analyzing many practical problems in applied probability.
Markov chains are named after the Russian mathematician Andrey Markov (1856–1922), who first developed this probability model in order to analyze the alternation of vowels and consonants in Pushkin's poem “Eugene Onegin.” His work helped to launch the modern theory of stochastic processes (a stochastic process is a collection of random variables, indexed by an ordered time variable). The characteristic property of a Markov chain is that its memory goes back only to the most recent state: knowledge of the current state alone is sufficient to describe the future development of the process. A Markov model is the simplest model for random systems evolving in time whose successive states are not independent.
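The Markov property described above — that the next state is sampled from a distribution depending only on the current state — can be illustrated with a short simulation. This is a minimal sketch, not material from the chapter: the two "weather" states and their transition probabilities are invented purely for illustration.

```python
import random

# Hypothetical two-state weather chain; transition[s][t] is the
# probability P(next state = t | current state = s). Each row sums to 1.
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current, rng):
    """Sample the next state using only the current state (Markov property)."""
    u = rng.random()
    cumulative = 0.0
    for state, p in transition[current].items():
        cumulative += p
        if u < cumulative:
            return state
    return state  # guard against floating-point rounding at u close to 1

def simulate(start, steps, seed=42):
    """Generate a sample path of the chain: start state plus `steps` transitions."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(steps):
        path.append(next_state(path[-1], rng))
    return path

print(simulate("sunny", 5))
```

Note that `next_state` looks only at `path[-1]`; no earlier history is consulted, which is exactly the "memory goes back only to the most recent state" property.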
- Type: Chapter
- Information: Understanding Probability, pp. 459–506
- Publisher: Cambridge University Press
- Print publication year: 2012