
A Zero-One Result for the Least Squares Estimator

Published online by Cambridge University Press:  18 October 2010

Donald W. K. Andrews (Yale University)

Abstract

The least squares estimator for the linear regression model is shown to converge to the true parameter vector either with probability one or with probability zero. In the latter case, it either converges with probability one to a point other than the true parameter, or it diverges with probability one. These results hold under weak conditions on the dependent and regressor variables, and no additional conditions are placed on the errors. The dependent and regressor variables are assumed to be weakly dependent, in particular, strong mixing. The regressors may be fixed or random and must exhibit a certain degree of independent variability; no further assumptions are needed. The model allows the number of regressors to increase without bound as the sample size increases. The proof proceeds by extending Kolmogorov's zero-one law for independent random variables to strong mixing random variables.
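The dichotomy described above can be illustrated numerically. The following sketch (not from the paper; all parameter values and the AR(1) regressor design are hypothetical choices) simulates a linear regression whose regressor is a strong mixing process, then computes the least squares estimate at increasing sample sizes. Under this design the estimator lands in the probability-one branch of the dichotomy and settles toward the true parameter vector:

```python
import numpy as np

rng = np.random.default_rng(0)

beta0 = np.array([1.0, -0.5])  # hypothetical true parameter vector
T = 20000

# AR(1) regressor: a simple example of a strong mixing process
x1 = np.empty(T)
x1[0] = rng.standard_normal()
for t in range(1, T):
    x1[t] = 0.5 * x1[t - 1] + rng.standard_normal()

X = np.column_stack([np.ones(T), x1])  # intercept plus AR(1) regressor
e = rng.standard_normal(T)             # errors (the paper imposes no extra conditions on them)
y = X @ beta0 + e

# Least squares estimates along an expanding sample: the sequence of
# estimates stabilizes near beta0, consistent with almost sure convergence
for n in (100, 1000, 20000):
    b, *_ = np.linalg.lstsq(X[:n], y[:n], rcond=None)
    print(n, b)
```

The independent variability required of the regressors is supplied here by the innovations of the AR(1) process; a degenerate regressor sequence (e.g., one collinear with the intercept) would instead place the estimator in the probability-zero branch.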

Type: Research Article

Copyright © Cambridge University Press 1985


References

1. Anderson, T. W., & Taylor, J. B. (1979). Strong consistency of least squares estimates in dynamic models. Annals of Statistics, 7, 484–489.
2. Andrews, D. W. K. (1984). Non-strong mixing autoregressive processes. Journal of Applied Probability, 21, 930–934.
3. Andrews, D. W. K. (1985). A nearly independent, but non-strong mixing, triangular array. Journal of Applied Probability (forthcoming).
4. Bártfai, P., & Révész, P. (1967). On a zero-one law. Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete, 7, 43–47.
5. Billingsley, P. (1979). Probability and Measure. New York: J. Wiley & Sons.
6. Chanda, K. C. (1974). Strong mixing properties of linear stochastic processes. Journal of Applied Probability, 11, 401–408.
7. Chen, G. J., Lai, T. L., & Wei, C. Z. (1981). Convergence systems and strong consistency of least squares estimates in regression models. Journal of Multivariate Analysis, 11, 319–333.
8. Christopeit, N., & Helmes, K. (1979). A convergence theorem for random linear combinations of independent normal random variables. Annals of Statistics, 7, 795–800.
9. Drygas, H. (1976). Weak and strong consistency of the least squares estimators in regression models. Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete, 34, 119–127.
10. Eicker, F. (1979). Consistent parameter estimation in mixed autoregressions and in general linear stochastic regressions. Dortmund University (manuscript).
11. Huber, P. J. (1973). Robust regression: Asymptotics, conjectures, and Monte Carlo. Annals of Statistics, 1, 799–821.
12. Huber, P. J. (1981). Robust Statistics. New York: J. Wiley & Sons.
13. Ibragimov, I. A., & Linnik, Yu. V. (1971). Independent and Stationary Sequences of Random Variables. Groningen, The Netherlands: Wolters-Noordhoff.
14. Iosifescu, M., & Theodorescu, R. (1969). Random Processes and Learning. New York: Springer-Verlag.
15. Kolmogorov, A. N. (1933). Grundbegriffe der Wahrscheinlichkeitsrechnung. Berlin: Springer.
16. Kolmogorov, A. N., & Rozanov, Yu. A. (1960). On strong mixing conditions for stationary Gaussian processes. Theory of Probability and Its Applications, 5, 204–208.
17. Lai, T. L., Robbins, H., & Wei, C. Z. (1978). Strong consistency of least squares estimates in multiple regression. Proceedings of the National Academy of Sciences USA, 75, 3034–3036.
18. Lai, T. L., Robbins, H., & Wei, C. Z. (1979). Strong consistency of least squares estimates in multiple regression II. Journal of Multivariate Analysis, 9, 343–361.
19. Nelson, P. I. (1980). A note on strong consistency of least squares estimators in regression models with martingale difference errors. Annals of Statistics, 8, 1057–1064.
20. Phillips, P. C. B. (1984). The exact distribution of LIML: I. International Economic Review, 25, 249–261.
21. Robinson, P. M. (1978). On consistency in time series analysis. Annals of Statistics, 6, 215–223.
22. Yohai, V. J., & Maronna, R. A. (1979). Asymptotic behavior of M-estimators for the linear model. Annals of Statistics, 7, 258–268.