
15 - Maturity assessment of modeling and simulation

from Part V - Planning, management, and implementation issues

Published online by Cambridge University Press:  05 March 2013

Christopher J. Roy, Virginia Polytechnic Institute and State University

Summary

In Chapter 1, Introduction, we briefly discussed how credibility is built in modeling and simulation (M&S). The four elements mentioned in that chapter were: the quality of the analysts conducting the analysis, the quality of the physics modeling, verification and validation activities, and uncertainty quantification and sensitivity analysis. The latter three are technical elements that can be assessed for completeness or maturity. Maturity assessment is important to the staff conducting the M&S effort, but it is critically important for project managers and decision makers who use computational results as an element in their decision making. It is also important for internal or external review committees asked to judge the credibility and soundness of computational analyses. This chapter reviews methods that have been developed for assessing similar activities, and then presents a newly developed technique reported in Oberkampf et al. (2007), from which this chapter is drawn in large part.

Survey of maturity assessment procedures

Over the last decade, a number of researchers have investigated how to measure the maturity and credibility of software and hardware development processes and products. Probably the best-known procedure for measuring the maturity of software product development and business processes is the Capability Maturity Model Integration (CMMI). The CMMI is a successor to the Capability Maturity Model (CMM). Development of the CMM was initiated in 1987 to improve software quality. For an extensive discussion of the framework and methods for the CMMI, see West (2004); Ahern et al. (2005); Garcia and Turner (2006); and Chrissis et al. (2007).
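The maturity models surveyed in this chapter share a common structure: a small set of assessed attributes, each scored on an ordinal scale, with the results reported as a table. A minimal sketch of such a maturity table follows, using the six attributes of the Predictive Capability Maturity Model of Oberkampf et al. (2007) and a 0–3 level scale; the `summarize` helper and its min/mean aggregation are illustrative assumptions, not part of any of the models discussed here.

```python
# Sketch of a PCMM-style maturity table (attribute names follow
# Oberkampf et al., 2007; the summary rule is a hypothetical choice).
ATTRIBUTES = [
    "Representation and geometric fidelity",
    "Physics and material model fidelity",
    "Code verification",
    "Solution verification",
    "Model validation",
    "Uncertainty quantification and sensitivity analysis",
]

LEVELS = range(4)  # ordinal maturity levels 0 (lowest) through 3 (highest)

def summarize(scores: dict) -> dict:
    """Check that every attribute has a valid level, then report the
    minimum (the limiting element) and the mean maturity."""
    missing = [a for a in ATTRIBUTES if a not in scores]
    if missing:
        raise ValueError(f"unscored attributes: {missing}")
    invalid = [a for a, s in scores.items() if s not in LEVELS]
    if invalid:
        raise ValueError(f"scores must be in 0-3: {invalid}")
    values = [scores[a] for a in ATTRIBUTES]
    return {"minimum": min(values), "mean": sum(values) / len(values)}

# Hypothetical assessment of one analysis:
example = dict(zip(ATTRIBUTES, [2, 1, 3, 2, 1, 0]))
print(summarize(example))  # -> {'minimum': 0, 'mean': 1.5}
```

Reporting the minimum alongside the mean reflects a point made throughout this chapter: an aggregate score can hide a single immature element, and it is often the weakest element that limits the credibility of the whole analysis.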

Publisher: Cambridge University Press. Print publication year: 2010.


References

Ahern, D. M., Clouse, A., and Turner, R. (2005). CMMI Distilled: a Practical Introduction to Integrated Process Improvement. 2nd edn., Boston, MA, Addison-Wesley.
Balci, O. (2004). Quality assessment, verification, and validation of modeling and simulation applications. 2004 Winter Simulation Conference, 122–129.
Balci, O., Adams, R. J., Myers, D. S., and Nance, R. E. (2002). A collaborative evaluation environment for credibility assessment of modeling and simulation applications. 2002 Winter Simulation Conference, 214–220.
Bertch, W. J., Zang, T. A., and Steele, M. J. (2008). Development of NASA's models and simulations standard. 2008 Spring Simulation Interoperability Workshop, Paper No. 08S-SIW-037, Providence, RI, Simulation Interoperability Standards Organization.
Blattnig, S. R., Green, L. L., Luckring, J. M., Morrison, J. H., Tripathi, R. K., and Zang, T. A. (2008). Towards a credibility assessment of models and simulations. 49th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference, AIAA 2008–2156, Schaumburg, IL, American Institute of Aeronautics and Astronautics.
Chrissis, M. B., Konrad, M., and Shrum, S. (2007). CMMI: Guidelines for Process Integration and Product Improvement. 2nd edn., Boston, MA, Addison-Wesley.
Clay, R. L., Marburger, S. J., Shneider, M. S., and Trucano, T. G. (2007). Modeling and Simulation Technology Readiness Levels. SAND2007–0570, Albuquerque, NM, Sandia National Laboratories.
DoD (2005). Technology Readiness Assessment (TRA) Deskbook. Washington, DC, Department of Defense.
GAO (1999). Best Practices: Better Management of Technology Development Can Improve Weapon System Outcomes. GAO/NSIAD-99–162, Washington, DC, U.S. General Accounting Office.
Garcia, S. and Turner, R. (2006). CMMI Survival Guide: Just Enough Process Improvement, Boston, MA, Addison-Wesley.
Green, L. L., Blattnig, S. R., Luckring, J. M., and Tripathi, R. K. (2008). An uncertainty structure matrix for models and simulations. 49th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference, AIAA 2008–2154, Schaumburg, IL, American Institute of Aeronautics and Astronautics.
Harmon, S. Y. and Youngblood, S. M. (2003). A proposed model for simulation validation process maturity. Simulation Interoperability Workshop, Paper No. 03S-SIW-127, Orlando, FL, Simulation Interoperability Standards Organization.
Harmon, S. Y. and Youngblood, S. M. (2005). A proposed model for simulation validation process maturity. The Journal of Defense Modeling and Simulation. 2(4), 179–190.
Helton, J. C., Johnson, J. D., Sallaberry, C. J., and Storlie, C. B. (2006). Survey of sampling-based methods for uncertainty and sensitivity analysis. Reliability Engineering and System Safety. 91(10–11), 1175–1209.
IEEE (1989). IEEE Standard Glossary of Modeling and Simulation Terminology. Std 610.3–1989, New York, IEEE.
Mankins, J. C. (1995). Technology Readiness Levels. Washington, DC, National Aeronautics and Space Administration.
NASA (2006). Interim Technical Standard for Models and Simulations. NASA-STD-(I)-7009, Washington, DC, National Aeronautics and Space Administration.
NASA (2008). Standard for Models and Simulations. NASA-STD-7009, Washington, DC, National Aeronautics and Space Administration.
Oberkampf, W. L., Pilch, M., and Trucano, T. G. (2007). Predictive Capability Maturity Model for Computational Modeling and Simulation. SAND2007–5948, Albuquerque, NM, Sandia National Laboratories.
Oberkampf, W. L. and Trucano, T. G. (2008). Verification and validation benchmarks. Nuclear Engineering and Design. 238(3), 716–743.
Pilch, M., Trucano, T. G., Peercy, D. E., Hodges, A. L., and Froehlich, G. K. (2004). Concepts for Stockpile Computing (OUO). SAND2004–2479 (Restricted Distribution, Official Use Only), Albuquerque, NM, Sandia National Laboratories.
Pilch, M., Trucano, T. G., and Helton, J. C. (2006). Ideas Underlying Quantification of Margins and Uncertainties (QMU): a White Paper. SAND2006–5001, Albuquerque, NM, Sandia National Laboratories.
Saltelli, A., Ratto, M., Andres, T., Campolongo, F., Cariboni, J., Gatelli, D., Saisana, M., and Tarantola, S. (2008). Global Sensitivity Analysis: the Primer, Hoboken, NJ, Wiley.
SEI (2006). Software Engineering Institute: Capability Maturity Model Integration.
Smith, J. (2004). An Alternative to Technology Readiness Levels for Non-Developmental Item (NDI) Software. CMU/SEI-2004-TR-013, ESC-TR-2004–013, Pittsburgh, PA, Carnegie Mellon, Software Engineering Institute.
Steele, M. J. (2008). Dimensions of credibility in models and simulations. International Simulation Multi-Conference, Paper No. 08E-SIW-076, Edinburgh, Scotland, Simulation Interoperability Standards Organization/The Society for Modeling and Simulation.
Wang, R. Y. and Strong, D. M. (1996). Beyond accuracy: what data quality means to data consumers. Journal of Management Information Systems. 12(4), 5–34.
West, M. (2004). Real Process Improvement Using the CMMI, Boca Raton, FL, CRC Press.
