Book contents
- Frontmatter
- Contents
- List of statistical and mathematical tables
- Preface
- PART I BASIC NUMERICAL TECHNIQUES
- PART II BASIC STATISTICAL TECHNIQUES
- 8 Probability, statistical distributions and moments
- 9 The normal and related distributions
- 10 The common discrete distributions
- 11 The Pearson system of probability-density functions
- 12 Hypothesis testing
- 13 Point and interval estimation
- 14 Some special statistical techniques
- PART III THE METHOD OF LEAST SQUARES
- Appendix
- References
- Author index
- Subject index
13 - Point and interval estimation
Published online by Cambridge University Press: 18 December 2009
Summary
The opening sections of this chapter summarise the general principles of point estimation, the method of maximum likelihood and confidence intervals. These methods are then applied to parameter estimation problems for the common discrete distributions (sections 13.5–13.11), means and linear combinations of means from normal populations (sections 13.12–13.18), variances and ratios of variances of normal populations (sections 13.19–13.24), parameter estimation problems for the log-normal distribution (section 13.25), and the correlation coefficient of a bivariate normal distribution (section 13.26). A method of obtaining confidence limits for a distribution function is described in section 13.27.
Point estimation
Consider a random sample of n observations x1, …, xn. The distributional form of the population is known (for example, normal), but one or more of its parameters is unknown (for example, either μ or σ², or both, unknown). The problem is to estimate the unknown parameter(s).
The function of the observations which we choose to estimate a parameter is known as an estimator, and the numerical value obtained from it using a particular set of data is called an estimate.
Often several different functions will suggest themselves as estimators (for example, the sample mean and the sample median can both be used to estimate the mean of a normal population) and we need to decide which to use. The following criteria are used by statisticians:
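As a minimal sketch of the situation described above, the following computes both candidate estimators from one simulated sample; the population values mu, sigma and the sample size n are illustrative assumptions, not taken from the text.

```python
import random
import statistics

# Illustrative values (not from the text): a normal population with
# centre mu and spread sigma, sampled n times.
random.seed(1)
mu, sigma, n = 50.0, 10.0, 25

sample = [random.gauss(mu, sigma) for _ in range(n)]

# Two different estimators of the same parameter mu:
mean_estimate = statistics.mean(sample)      # the sample mean
median_estimate = statistics.median(sample)  # the sample median

print(mean_estimate, median_estimate)
```

Both functions of the data estimate the same parameter; the chapter's criteria (unbiasedness, and for the normal mean, the smaller variance of the sample mean) are what decide between them.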
The estimator should be unbiased, so that its expectation is equal to the true value of the parameter. Thus, on average, the estimate obtained is equal to the underlying parameter.
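The "on average" in the unbiasedness criterion can be checked by simulation: averaging an estimator over many repeated samples should recover the true parameter. The sketch below (with illustrative, assumed values) contrasts the sample variance with divisor n − 1, which is unbiased for σ², against the divisor-n version, whose expectation falls short by the factor (n − 1)/n.

```python
import random

# Illustrative values (not from the text).
random.seed(2)
mu, sigma, n, trials = 0.0, 3.0, 5, 20000
true_var = sigma ** 2  # sigma^2 = 9.0

sum_unbiased = 0.0
sum_biased = 0.0
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    ss = sum((x - xbar) ** 2 for x in xs)
    sum_unbiased += ss / (n - 1)  # divisor n-1: unbiased for sigma^2
    sum_biased += ss / n          # divisor n: biased low by (n-1)/n

# Expectations: 9.0 for the first average, 7.2 for the second.
print(sum_unbiased / trials, sum_biased / trials)
```

The averaged divisor-(n − 1) estimate settles near σ² = 9, while the divisor-n estimate settles near 9 × 4/5 = 7.2, illustrating what a biased estimator looks like "on average".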
- A Handbook of Numerical and Statistical Techniques: With Examples Mainly from the Life Sciences, pp. 210–235. Publisher: Cambridge University Press. Print publication year: 1977.