Book contents
- Frontmatter
- Contents
- Preface
- Acknowledgments
- 1 Introduction
- Part I Fundamental concepts
- Part II Code verification
- Part III Solution verification
- Part IV Model validation and prediction
- 10 Model validation fundamentals
- 11 Design and execution of validation experiments
- 12 Model accuracy assessment
- 13 Predictive capability
- Part V Planning, management, and implementation issues
- Appendix Programming practices
- Index
- Plate Section
- References
13 - Predictive capability
from Part IV - Model validation and prediction
Published online by Cambridge University Press: 05 March 2013
Summary
This chapter synthesizes the key results from the previous chapters and shows how they combine to form modern predictive capability in scientific computing. Unlike all the other chapters, this one does not stress the theme of assessment. Instead, we discuss the fundamental steps in conducting a nondeterministic analysis of a system of interest, and through this discussion we show how verification and validation (V&V) contribute directly to predictive capability.
The previously covered material and the new material are organized into six procedural steps for making a prediction:
1. identify all relevant sources of uncertainty,
2. characterize each source of uncertainty,
3. estimate numerical solution error in the system response quantities of interest,
4. estimate uncertainty in the system response quantities of interest,
5. conduct model updating,
6. conduct sensitivity analysis.
All of these steps, except step 3, are widely practiced in nondeterministic simulations and risk analysis. Step 3 is not commonly addressed, for three reasons. First, in many simulations the numerical solution error is assumed to be small compared to the other contributors to uncertainty; sometimes this assumption is quantitatively justified, and sometimes it is simply posited with little or no evidence. Second, in some computationally intensive simulations it is understood that the numerical solution error is important, and possibly even dominant, but it is argued that various modeling parameters can be adjusted to compensate for the numerical error. If the application of interest is sufficiently similar to the conditions for which experimental data are available, it is claimed that the adjustable parameters can be tuned to match the existing data and thereby yield reasonable predictions. Third, even when the numerical error is estimated and found not to be small relative to the other uncertainties, there are no generally accepted procedures for including its effect on the system response quantities (SRQs) of interest.
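As a rough illustration of how steps 2 through 4 interact, the sketch below propagates one uncertain input through a toy model by Monte Carlo sampling and then folds a separately estimated numerical solution error into the reported uncertainty band. Everything here is hypothetical: the model, the normal input characterization, and the numerical-error value (standing in for, say, a grid-convergence estimate) are assumptions for illustration, not procedures prescribed by the chapter.

```python
import random
import statistics

def model(k):
    """Toy stand-in for a real simulation: maps an input parameter
    (e.g., a conductivity) to one system response quantity (SRQ)."""
    return 100.0 / k

random.seed(1)

# Step 2 (assumed characterization): the uncertain input is taken
# to be normally distributed with mean 5.0 and std 0.5.
inputs = [random.gauss(5.0, 0.5) for _ in range(10_000)]

# Step 4: estimate uncertainty in the SRQ by propagating the samples.
srq = [model(k) for k in inputs]
mean = statistics.fmean(srq)
std = statistics.stdev(srq)

# Step 3 (the commonly skipped one): include an estimated numerical
# solution error, here simply widening the sampling-based band rather
# than ignoring the error. The value 0.4 is an assumed estimate.
numerical_error = 0.4
lower = mean - 2 * std - numerical_error
upper = mean + 2 * std + numerical_error

print(f"SRQ mean = {mean:.2f}, std = {std:.2f}")
print(f"band including numerical error: [{lower:.2f}, {upper:.2f}]")
```

The widening of the interval is only one of several ways the numerical error could be treated; as the chapter notes, there is no generally accepted procedure for combining it with the other uncertainties.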
Type: Chapter
Book: Verification and Validation in Scientific Computing, pp. 555-670
Publisher: Cambridge University Press
Print publication year: 2010