6 - Sufficiency and completeness
Published online by Cambridge University Press: 06 July 2010
Summary
This chapter is concerned primarily with point estimation of a parameter θ. For many parametric problems, in particular problems involving exponential families, all the information about θ contained in a random variable X can be summarised by a function T = T(X), called a sufficient statistic. The implication is that any reasonable estimator of θ will be a function of T(X). However, there are many possible sufficient statistics, and we would like to use the one which summarises the information as efficiently as possible. This is called the minimal sufficient statistic, which is essentially unique. Completeness is a technical property of a sufficient statistic. A sufficient statistic which is also complete must be minimal sufficient (the Lehmann–Scheffé Theorem). Another feature of a complete sufficient statistic T is that, if some function of T is an unbiased estimator of θ, then it is the unique unbiased estimator which is a function of a sufficient statistic. The final section of the chapter demonstrates that, when the loss function is convex (including, in particular, squared error loss), there is a best unbiased estimator which is a function of the sufficient statistic, and that, if the sufficient statistic is also complete, this estimator is unique. In the case of squared error loss this is equivalent to the celebrated Rao–Blackwell Theorem on the existence of minimum variance unbiased estimators.
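The Rao–Blackwell idea mentioned above can be illustrated by a small simulation. The sketch below is not from the chapter; it assumes a Bernoulli(p) sample of size n, where T = ΣXᵢ is sufficient for p. Starting from the crude unbiased estimator X₁ (the first observation alone), conditioning on T gives E[X₁ | T] = T/n, the sample mean, which is again unbiased but has much smaller variance:

```python
import random
import statistics

random.seed(0)
p, n, reps = 0.3, 20, 5000

crude, rb = [], []
for _ in range(reps):
    # one Bernoulli(p) sample of size n
    x = [1 if random.random() < p else 0 for _ in range(n)]
    crude.append(x[0])       # crude unbiased estimator: the first observation
    rb.append(sum(x) / n)    # Rao-Blackwellised estimator E[X1 | T] = T/n

# Both estimators are unbiased for p; conditioning on the sufficient
# statistic T never increases the variance (here it shrinks it by 1/n).
print(statistics.pvariance(crude), statistics.pvariance(rb))
```

In this setting the variance drops from p(1 − p) for the crude estimator to p(1 − p)/n for the Rao–Blackwellised one; since T is also complete here, T/n is in fact the unique minimum variance unbiased estimator.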
- Essentials of Statistical Inference, pp. 90–97. Cambridge University Press. Print publication year: 2005.