Book contents
- Frontmatter
- Contents
- Chapter dependencies
- Preface
- 1 Introduction to probability
- 2 Introduction to discrete random variables
- 3 More about discrete random variables
- 4 Continuous random variables
- 5 Cumulative distribution functions and their applications
- 6 Statistics
- 7 Bivariate random variables
- 8 Introduction to random vectors
- 9 Gaussian random vectors
- 10 Introduction to random processes
- 11 Advanced concepts in random processes
- 12 Introduction to Markov chains
- 13 Mean convergence and applications
- 14 Other modes of convergence
- 15 Self similarity and long-range dependence
- Bibliography
- Index
6 - Statistics
Published online by Cambridge University Press: 05 June 2012
Summary
As we have seen, most problems in probability textbooks start out with random variables having a given probability mass function or density. In the real world, however, problems start out with a finite amount of data, X1, X2, …, Xn, about which the physical situation may tell us very little. We are still interested in computing probabilities, but we first have to find the pmf or density with which to do the calculations. Sometimes the physical situation determines the form of the pmf or density up to a few unknown parameters. For example, the number of alpha particles given off by a radioactive sample is Poisson(λ), but we need to estimate λ from measured data. In other situations, we may have no information about the pmf or density; in this case, we collect data and look at histograms to suggest possibilities. In this chapter, we not only look at parameter estimators and histograms, but also try to quantify how confident we are that our estimate or density choice is a good one.
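To make the alpha-particle example concrete, here is a minimal Python sketch (an illustration, not the book's code) of estimating the Poisson rate λ from count data. The data are simulated from an assumed rate of 4.2 so the estimate can be checked against the truth; with real measurements, only the last two lines would change.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Stand-in for measured data: the number of alpha-particle emissions
# recorded in each of 1000 observation intervals, simulated here from
# a Poisson source with a known rate so the estimate can be checked.
true_lam = 4.2          # assumed value, for illustration only
counts = rng.poisson(true_lam, size=1000)

# For a Poisson(lambda) model, the sample mean of the counts is the
# natural estimator of lambda (it is unbiased, and it is also the MLE).
lam_hat = counts.mean()
print(f"estimated lambda = {lam_hat:.3f} (true value {true_lam})")
```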
Section 6.1 introduces the sample mean and sample variance as unbiased estimators of the true mean and variance. The concept of strong consistency is introduced and used to show that estimators based on the sample mean and sample variance inherit strong consistency. Section 6.2 introduces histograms and the chi-squared statistic for testing the goodness-of-fit of a hypothesized pmf or density to a histogram.
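The following sketch illustrates both ideas from the chapter on the same simulated data: the sample mean Mn = (1/n) Σ Xi and the unbiased sample variance Sn² = (1/(n−1)) Σ (Xi − Mn)², followed by a chi-squared goodness-of-fit test of a hypothesized Poisson pmf against a histogram of the data. The bin layout (lumping the tail into one bin) and the use of scipy.stats.chisquare are choices made for this example, not prescriptions from the book.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)
data = rng.poisson(4.2, size=1000)   # stand-in for measured data
n = len(data)

# Sample mean M_n and unbiased sample variance S_n^2 (note ddof=1
# gives the 1/(n-1) normalization).
m = data.mean()
s2 = data.var(ddof=1)
print(f"sample mean = {m:.3f}, sample variance = {s2:.3f}")

# Histogram of the counts in bins 0, 1, ..., 9, with everything
# >= 10 lumped into a final bin so no expected count is too small.
edges = list(range(11)) + [np.inf]
observed, _ = np.histogram(data, bins=edges)

# Expected bin counts under the hypothesized Poisson(m) pmf, with
# the remaining tail mass assigned to the last bin.
k = np.arange(10)
pmf = stats.poisson.pmf(k, m)
expected = n * np.append(pmf, 1.0 - pmf.sum())

# Chi-squared statistic; ddof=1 reduces the degrees of freedom by
# one because the parameter lambda was estimated from the data.
chi2, p = stats.chisquare(observed, expected, ddof=1)
print(f"chi-squared = {chi2:.2f}, p-value = {p:.3f}")
```

A large p-value here means the histogram is consistent with the hypothesized pmf; a small one suggests the model should be rejected.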
- Type: Chapter
- Publisher: Cambridge University Press
- Print publication year: 2006