6 - Model formulation
Published online by Cambridge University Press: 07 September 2011
Summary
More formal methods of statistical analysis are based on a probability model for the data. This represents in idealized form the main features of the variability encountered and possibly also summarizes the data-generating process. Such models contain parameters, some of which encapsulate the research questions of concern. The main aspects of probability models are reviewed and simple examples are given.
Preliminaries
Simple methods of graphical and tabular analysis are of great value. They are essential in the preliminary checking of data quality and in some cases may lead to clear and convincing explanations. They play a role too in presenting the conclusions even of quite complex analyses. In many contexts it is desirable that the conclusions of an analysis can be regarded, in part at least, as summary descriptions of the data as well as interpretable in terms of a probability model.
Nevertheless, careful analysis often hinges on the use of an explicit probability model for the data. Such models have a number of aspects:
they may encapsulate research questions and hypotheses in compact and clear form via parameters of interest, or they may specify a simple structure, deviations from which can be isolated and studied in detail;
they provide a way of specifying the uncertainty in conclusions;
they formalize the discounting of features that are in a sense accidents of the specific dataset under analysis;
[…]
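The second aspect listed above, specifying the uncertainty in conclusions, can be illustrated with a minimal sketch. The following example assumes the simplest of probability models: observations drawn independently from a normal distribution whose unknown mean is the parameter of interest. The data here are simulated purely for illustration; they stand in for any dataset to which such a model might be fitted.

```python
import math
import random

# Illustrative data only: the probability model posits that each observation
# is an independent draw from a normal distribution with unknown mean mu
# (the parameter of interest) and unknown standard deviation.
random.seed(1)
data = [10.0 + random.gauss(0, 2.0) for _ in range(50)]

n = len(data)
mean = sum(data) / n                            # point estimate of mu
var = sum((x - mean) ** 2 for x in data) / (n - 1)
se = math.sqrt(var / n)                         # standard error of the mean

# An approximate 95% confidence interval expresses, within the model,
# the uncertainty attached to the conclusion about mu.
ci = (mean - 1.96 * se, mean + 1.96 * se)
print(f"estimate of mu: {mean:.2f}, 95% interval: ({ci[0]:.2f}, {ci[1]:.2f})")
```

The interval is a statement made *within* the assumed model; its interpretation depends on the model being an adequate idealization of the variability in the data, which is why model formulation and checking matter.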
Chapter in *Principles of Applied Statistics*, pp. 90–117. Publisher: Cambridge University Press. Print publication year: 2011.