Book contents
- Frontmatter
- Dedication
- Contents
- Preface
- Notation
- 1 Introduction
- 2 Stochastic Convergence
- 3 Delta Method
- 4 Moment Estimators
- 5 M- and Z-Estimators
- 6 Contiguity
- 7 Local Asymptotic Normality
- 8 Efficiency of Estimators
- 9 Limits of Experiments
- 10 Bayes Procedures
- 11 Projections
- 12 U-Statistics
- 13 Rank, Sign, and Permutation Statistics
- 14 Relative Efficiency of Tests
- 15 Efficiency of Tests
- 16 Likelihood Ratio Tests
- 17 Chi-Square Tests
- 18 Stochastic Convergence in Metric Spaces
- 19 Empirical Processes
- 20 Functional Delta Method
- 21 Quantiles and Order Statistics
- 22 L-Statistics
- 23 Bootstrap
- 24 Nonparametric Density Estimation
- 25 Semiparametric Models
- References
- Index
19 - Empirical Processes
Published online by Cambridge University Press: 05 June 2012
Summary
The empirical distribution of a random sample is the uniform discrete measure on the observations. In this chapter, we study the convergence of this measure and in particular the convergence of the corresponding distribution function. This leads to laws of large numbers and central limit theorems that are uniform in classes of functions. We also discuss a number of applications of these results.
Empirical Distribution Functions
Let $X_1, \ldots, X_n$ be a random sample from a distribution function $F$ on the real line. The empirical distribution function is defined as
$$\mathbb{F}_n(t) = \frac{1}{n} \sum_{i=1}^n 1\{X_i \le t\}.$$
It is the natural estimator for the underlying distribution $F$ if this is completely unknown. Because $n\mathbb{F}_n(t)$ is binomially distributed with mean $nF(t)$, this estimator is unbiased. By the law of large numbers it is also consistent,
$$\mathbb{F}_n(t) \overset{\mathrm{as}}{\to} F(t), \qquad \text{for every } t.$$
By the central limit theorem it is asymptotically normal,
$$\sqrt{n}\bigl(\mathbb{F}_n(t) - F(t)\bigr) \rightsquigarrow N\bigl(0, F(t)(1 - F(t))\bigr).$$
In this chapter we improve on these results by considering $\mathbb{F}_n$ as a random function, rather than as a real-valued estimator for each $t$ separately. This is of interest on its own account, but also provides a useful starting tool for the asymptotic analysis of other statistics, such as quantiles, rank statistics, or trimmed means.
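The step-function $\mathbb{F}_n$ above is easy to compute directly. The following sketch (plain Python; the uniform sample and the names `ecdf` and `F_n` are illustrative, not from the text) evaluates $\mathbb{F}_n(t)$ as the fraction of observations at or below $t$:

```python
import bisect
import random

def ecdf(sample):
    """Return the empirical distribution function F_n of a sample.

    F_n(t) = (1/n) * #{i : X_i <= t}, i.e. the fraction of observations
    at or below t.
    """
    xs = sorted(sample)
    n = len(xs)

    def F_n(t):
        # bisect_right counts how many sorted observations are <= t
        return bisect.bisect_right(xs, t) / n

    return F_n

# Illustrative sample from the uniform(0, 1) law, where F(t) = t on [0, 1];
# pointwise, F_n(t) should then be close to t for moderate n.
random.seed(0)
sample = [random.random() for _ in range(1000)]
F_n = ecdf(sample)
print(F_n(0.5))
```

By construction $\mathbb{F}_n$ is a right-continuous step function that jumps by $1/n$ at each order statistic, which is what the sorted-array-plus-binary-search representation exploits.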
The Glivenko-Cantelli theorem extends the law of large numbers and gives uniform convergence. The uniform distance
$$\|\mathbb{F}_n - F\|_\infty = \sup_t \bigl|\mathbb{F}_n(t) - F(t)\bigr|$$
is known as the Kolmogorov-Smirnov statistic.
19.1 Theorem (Glivenko-Cantelli). If $X_1, X_2, \ldots$ are i.i.d. random variables with distribution function $F$, then $\|\mathbb{F}_n - F\|_\infty \overset{\mathrm{as}}{\to} 0$.
Proof. By the strong law of large numbers, both $\mathbb{F}_n(t) \overset{\mathrm{as}}{\to} F(t)$ and $\mathbb{F}_n(t-) \overset{\mathrm{as}}{\to} F(t-)$ for every $t$. Given a fixed $\varepsilon > 0$, there exists a partition $-\infty = t_0 < t_1 < \cdots < t_k = \infty$ such that $F(t_i-) - F(t_{i-1}) < \varepsilon$ for every $i$. (Points at which $F$ jumps more than $\varepsilon$ are points of the partition.) Now, for $t_{i-1} \le t < t_i$,
$$\mathbb{F}_n(t) - F(t) \le \mathbb{F}_n(t_i-) - F(t_{i-1}) \le \mathbb{F}_n(t_i-) - F(t_i-) + \varepsilon,$$
$$\mathbb{F}_n(t) - F(t) \ge \mathbb{F}_n(t_{i-1}) - F(t_i-) \ge \mathbb{F}_n(t_{i-1}) - F(t_{i-1}) - \varepsilon.$$
The convergence of $\mathbb{F}_n(t)$ and $\mathbb{F}_n(t-)$ for every fixed $t$ is certainly uniform for $t$ in the finite set $\{t_1, \ldots, t_{k-1}\}$. Conclude that $\limsup \|\mathbb{F}_n - F\|_\infty \le \varepsilon$, almost surely. This is true for every $\varepsilon > 0$, and hence the limit superior is zero.
- Asymptotic Statistics, pp. 265-290. Publisher: Cambridge University Press. Print publication year: 1998.