Decay rates and cutoff for convergence and hitting times of Markov chains with countably infinite state space

Published online by Cambridge University Press:  01 July 2016

Servet Martínez*
Affiliation:
Universidad de Chile
Bernard Ycart**
Affiliation:
Université René Descartes–Paris V
* Postal address: Centro Modelamiento Matemático, Universidad de Chile, UMR 2071-CNRS, Casilla 170/3, Santiago, Chile. Email address: smartine@dim.uchile.cl
** Postal address: Math–Info, 45 rue des Saints-Pères, 75270 Paris Cedex 06, France.

Abstract

For a positive recurrent continuous-time Markov chain on a countable state space, we compare the access time to equilibrium with the hitting time of a particular state. For monotone processes, the exponential decay rates of these two times are ordered. When the process starts far from equilibrium, a cutoff phenomenon occurs for both at the same instant, in the sense that the access time to equilibrium and the hitting time of a fixed state are each equivalent to the expectation of the latter. For Markov chains on trees, that expectation can be computed explicitly. The results are illustrated on the M/M/∞ queue.
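As an order-of-magnitude illustration of the last sentence (a heuristic sketch, not part of the paper's statements, assuming the standard M/M/∞ parametrization with arrival rate λ and per-customer service rate μ): if the queue starts with n customers, each initial customer departs after an independent exponential time of rate μ, so the last of them leaves after the maximum of n i.i.d. Exp(μ) variables E_1, ..., E_n, whose expectation is

\[
\mathbb{E}\Bigl[\max_{1 \le k \le n} E_k\Bigr] \;=\; \frac{1}{\mu}\sum_{k=1}^{n}\frac{1}{k} \;\sim\; \frac{\log n}{\mu}, \qquad n \to \infty.
\]

This logarithmic timescale is the kind of hitting-time expectation around which, according to the abstract, both the access time to equilibrium and the hitting time of a fixed state concentrate when the chain starts far from equilibrium.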

Type
General Applied Probability
Copyright
Copyright © Applied Probability Trust 2001 
