Notes on Chapter 8: Verification, Validation, and Uncertainty Quantification by George Em Karniadakis. Karniadakis provides the motivation for the topic right off:
In time-dependent systems, uncertainty increases with time, hence rendering simulation results based on deterministic models erroneous. In engineering systems, uncertainties are present at the component, subsystem, and complete system levels; therefore, they are coupled and are governed by disparate spatial and temporal scales or correlations.
Verification is the process of determining that a model implementation accurately represents the developer's conceptual description of the model and the solution to the model. Hence, by verification we ensure that the algorithms have been implemented correctly and that the numerical solution approaches the exact solution of the particular mathematical model, typically a partial differential equation (PDE). The exact solution is rarely known for real systems, so “fabricated” solutions for simpler systems are typically employed in the verification process. Validation, on the other hand, is the process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model. Hence, validation determines how accurate the results of a mathematical model are when compared to the physical phenomenon simulated, so it involves comparison of simulation results with experimental data. In other words, verification asks “Are the equations solved correctly?” whereas validation asks “Are the right equations solved?” Or as stated in Roache (1998), “verification deals with mathematics; validation deals with physics.”
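The “fabricated solution” idea (often called the method of manufactured solutions) is easy to demonstrate. Below is a minimal sketch of my own, not from the chapter: pick u(x) = sin(πx) as the exact solution of −u″ = f on [0,1], derive the forcing f analytically, and check that a second-order finite-difference solver converges at the expected rate as the grid is refined. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

def solve_poisson(n):
    """Solve -u'' = f on [0,1] with u(0)=u(1)=0, using a second-order
    central difference scheme on n interior points."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    f = np.pi**2 * np.sin(np.pi * x)   # manufactured forcing for u = sin(pi x)
    # Tridiagonal system: (-u[i-1] + 2 u[i] - u[i+1]) / h^2 = f[i]
    A = (np.diag(2.0 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    return x, np.linalg.solve(A, f)

def max_error(n):
    x, u = solve_poisson(n)
    return np.max(np.abs(u - np.sin(np.pi * x)))  # exact solution is known

# Observed order of accuracy from two grid levels: p = log2(e_h / e_{h/2})
e1, e2 = max_error(49), max_error(99)   # h = 1/50 and h = 1/100
p = np.log2(e1 / e2)
print(f"errors: {e1:.2e}, {e2:.2e}; observed order of accuracy: {p:.2f}")
```

For a correctly implemented second-order code the observed order p should approach 2 under grid refinement; a mismatch flags an implementation error even though no physical data is involved, which is exactly the “mathematics, not physics” character of verification.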
He addresses the constant problem of validation succinctly:
Validation is not always feasible (e.g., in astronomy or in certain nanotechnology applications), and it is, in general, very costly because it requires data from many carefully conducted experiments.
Getting decision makers to pay for this experimentation or testing is especially problematic when they were initially sold on using modeling and simulation as a way to avoid testing.
After this the chapter goes into an unnecessary digression on inductive reasoning. An unfortunate common thread I’ve noticed in many of the V&V reports I’ve read is that they seem to think Karl Popper had the last word on scientific induction! I think the V&V community would profit greatly by studying Jaynes’s theoretically sound pragmatism. They would quickly recognize that the ’problems’ they perceive in scientific induction are little more than misunderstandings of probability theory as logic.
The chapter gets back on track with the discussion of types of error in simulations:
Uncertainty quantification in simulating physical systems is a much more complex subject; it includes the aforementioned numerical uncertainty, but often its main component is due to physical uncertainty. Numerical uncertainty includes, in addition to spatial and temporal discretization errors, errors in solvers (e.g., incomplete iterations, loss of orthogonality), geometric discretization (e.g., linear segments), artificial boundary conditions (e.g., infinite domains), and others. Physical uncertainty includes errors due to imprecise or unknown material properties (e.g., viscosity, permeability, modulus of elasticity, etc.), boundary and initial conditions, random geometric roughness, equations of state, constitutive laws, statistical potentials, and others. Numerical uncertainty is very important and many scientific journals have established standard guidelines for how to document this type of uncertainty, especially in computational engineering (AIAA 1998).
The examples given for effects of ’uncertainty propagation’ are interesting. The first is a direct numerical simulation (DNS) of turbulent flow over a circular cylinder. In this resolved simulation, the high wave numbers (smallest scales) are accurately captured, but there is disagreement at the low wave numbers (largest scales). This somewhat counter-intuitive result occurs because the small scales are insensitive to experimental uncertainties in boundary and initial conditions, while the large scales of motion are not.
The section on methods for dealing with modeling uncertain inputs is sparse on details. Passing mention is made of Monte Carlo and quasi-Monte Carlo methods, sensitivity-based methods, and Bayesian methods.
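For concreteness, here is a minimal Monte Carlo propagation sketch (my own illustration, not from the chapter): a toy cantilever-beam deflection model in which the modulus of elasticity, one of the physically uncertain material properties the chapter mentions, is sampled from an assumed lognormal distribution and pushed through the model to get output statistics. All parameter values and the distribution choice are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def tip_deflection(E, P=1000.0, L=2.0, I=1e-6):
    """Cantilever tip deflection: delta = P L^3 / (3 E I).
    E is the uncertain input; load P, length L, and second moment of
    area I are held fixed (illustrative values)."""
    return P * L**3 / (3.0 * E * I)

# Physical uncertainty: ~5% lognormal scatter around a nominal
# modulus of 200 GPa (an assumed distribution, for illustration only).
E_samples = rng.lognormal(mean=np.log(200e9), sigma=0.05, size=100_000)

# Propagate the input uncertainty through the model.
d = tip_deflection(E_samples)
print(f"mean deflection: {d.mean()*1e3:.3f} mm, "
      f"std: {d.std()*1e3:.3f} mm, "
      f"95th percentile: {np.percentile(d, 95)*1e3:.3f} mm")
```

The output is a distribution of deflections rather than a single number, which is the whole point: the decision maker sees spread and tail percentiles, not just a nominal prediction. Quasi-Monte Carlo and sensitivity-based methods attack the same propagation problem with fewer model evaluations.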
The section on ’Certification / Accreditation’ is interesting. Karniadakis recommends designing experiments for validation based on the specific use or application rather than based on a particular code. This point deserves some emphasis. It is an often-voiced desire from decision makers to have a repository of validated codes that they can access to support their various and sundry efforts. This is an unrealistic desire. A code cannot be validated as such; only a particular use of a code can be validated. In most decisions that engineering simulation supports, the use is novel (research and new product development), therefore the validated model will be developed concurrently with (in the case of product development) or as a result of (in the case of research) the broader effort in question.
The suggested hierarchical validation framework is similar to the ’test-driven development’ methodologies in software engineering and the ’knowledge-driven product development’ championed in the GAO’s reports on government acquisition efforts: small component (unit) tests, followed by system integration tests, and then full complex-system tests. When the details of ’model validation’ are understood, it is clear that rather than replacing testing, simulation truly serves to organize test designs and optimize test efforts.
The conclusions are explicit (emphasis mine):
The NSF SBES report (Oden et al. 2006) stresses the need for new developments in V&V and UQ in order to increase the reliability and utility of the simulation methods at a profound level in the future. A report on European computational science (ESF 2007) concludes that “without validation, computational data are not credible, and hence, are useless.” The aforementioned National Research Council report (2008) on integrated computational materials engineering (ICME) states that, “Sensitivity studies, understanding of real world uncertainties and experimental validation are key to gaining acceptance for and value from ICME tools that are less than 100 percent accurate.” A clear recommendation was reached by a recent study on Applied Mathematics by the U.S. Department of Energy (Brown 2008) to “significantly advance the theory and tools for quantifying the effects of uncertainty and numerical simulation error on predictions using complex models and when fitting complex models to observations.”
Roache, P.J. 1998. Verification and validation in computational science and engineering. Albuquerque: Hermosa Publishers.
AIAA. 1998. Guide for the verification and validation of computational fluid dynamics simulations. AIAA-G-077-1998. Reston, VA: AIAA.
Oden, J.T., T. Belytschko, T.J.R. Hughes, C. Johnson, D. Keyes, A. Laub, L. Petzold, D. Srolovitz, and S. Yip. 2006. Revolutionizing engineering science through simulation: A report of the National Science Foundation blue ribbon panel on simulation-based engineering science. Arlington: National Science Foundation. Available online.
European Computational Science Forum of the European Science Foundation (ESF). 2007. The Forward Look Initiative. European computational science: The Lincei Initiative: From computers to scientific excellence. Information available online.
Brown, D.L. (chair). 2008. Applied mathematics at the U.S. Department of Energy: Past, present and a view to the future. May 2008.
Concepts of Model Verification and Validation has a glossary that defines most of the relevant terms.