Tuesday, January 26, 2010

VV&UQ for Nuclear Energy

I found an interesting slide in a DOE pitch about simulation in support of nuclear energy development.

Their "top priority recommendation" is about Verification, Validation and Uncertainty Quantification:
• Our top priority recommendation is that a V&V and UQ program for nuclear systems’ simulation take a two-pronged approach.
     • First, focus on research into the critical issues and challenges.
     • Second, a concurrent study using the V&V and UQ process to analyze a number of critical, integrated physics applications would provide a problem focus and address the issues of coupled multi-scale physics and UQ.

If you care about having useful models, then you care about VV&UQ. Otherwise, a model just provides the colorful marketing material you use to garner more funding. Larzelere goes on to say,
It is Important to Note That Advanced Modeling and Simulation Does Not Replace the Need for Theory or Experiments
As an aside, I thought the graphic comparing the manpower and FLOPS requirements of the two simulation domains was interesting too.

1 comment:

  1. The Center for Radiative Shock Hydrodynamics 2010 project summary report has some interesting bits on assessment of predictive capability:

    Our overarching project goal is to develop a simulator – the CRASH code – that can predict radiative shock behavior in an unexplored region of the experimental input space – the elliptical tube – after being assessed in a different region of input space that has been explored by experiments. Our unique intended contribution is to be the first academic team to use statistical assessment of predictive capability to systematically guide improvements in simulations and improvement in experiments so as to produce new predictions of improved accuracy, and to demonstrate this improvement by experiment. CRASH employs both sensitivity studies, to assess which aspects of the physical system are important and which are not, and predictive model construction, to assess the probability distribution functions of both physical parameters and experimental outputs.
    [...]
    Our code system is designed to minimize tuning parameters, and in fact we relegate all such tuning to our preprocessor, used to compute initial parameters Y_HP. In doing so, we expect to explore the uncertainty generated by uncertainty in Y_HP. Further, while the system η does also depend on numerical parameters N, our strategy regarding these is to confirm that the mesh is sufficiently resolved, for example, so that the uncertainty generated in the output Y_S by N is smaller than the experimental uncertainty. These latter two points are worthy of note: a goal of CRASH is, to the largest extent possible, to make predictions without tuning and without treating numerical choices as either physics or tuning.
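
    The sensitivity studies mentioned above are variance-based in spirit: rank the uncertain inputs by how much of the output variance each one explains. Here is a minimal sketch of first-order Sobol indices via the pick-freeze Monte Carlo estimator; the three-input toy response stands in for an actual CRASH output, so the model and the numbers are purely hypothetical:

        import numpy as np

        # First-order Sobol indices by the pick-freeze Monte Carlo
        # estimator (Saltelli et al.). f() is a hypothetical stand-in
        # for a single scalar simulator output, not the CRASH code.
        def f(x):
            return x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 0] * x[:, 2]

        rng = np.random.default_rng(42)
        n, d = 100_000, 3
        A = rng.uniform(size=(n, d))     # two independent input samples
        B = rng.uniform(size=(n, d))

        yA, yB = f(A), f(B)
        var_y = np.concatenate([yA, yB]).var()

        for i in range(d):
            ABi = A.copy()
            ABi[:, i] = B[:, i]          # swap in the i-th column of B
            # fraction of output variance explained by input i alone
            S_i = np.mean(yB * (f(ABi) - yA)) / var_y
            print(f"S_{i} = {S_i:.3f}")

    Inputs with negligible indices are the ones a study like this can justifiably treat as unimportant.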


    Contrast this concern for the physical fidelity and grid independence of the results with the state of the practice discussed in this thread.
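
    On the grid-independence side, the usual discipline is a three-grid refinement study: estimate the observed order of accuracy, Richardson-extrapolate toward the exact value, and report a Grid Convergence Index as a numerical error band. A sketch, assuming a constant refinement ratio and monotone convergence (the solution values below are made up):

        import numpy as np

        # Three-grid convergence study: Richardson extrapolation and
        # Roache's Grid Convergence Index. f1, f2, f3 are a scalar
        # output on fine, medium, and coarse grids (hypothetical).
        f1, f2, f3 = 0.9713, 0.9688, 0.9602   # fine -> coarse
        r = 2.0                               # grid refinement ratio

        # observed order of accuracy from the three solutions
        p = np.log((f3 - f2) / (f2 - f1)) / np.log(r)

        # Richardson-extrapolated (estimated exact) value
        f_exact = f1 + (f1 - f2) / (r**p - 1.0)

        # GCI on the fine grid, with the usual factor of safety
        Fs = 1.25
        gci_fine = Fs * abs((f1 - f2) / f1) / (r**p - 1.0)

        print(f"observed order p   = {p:.2f}")
        print(f"extrapolated value = {f_exact:.4f}")
        print(f"GCI (fine grid)    = {100 * gci_fine:.2f}%")

    If that GCI is small next to the experimental error bars, you have met the report's criterion that the uncertainty contributed by N stay below the experimental uncertainty.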

    Also of note, they are using a Bayesian approach to generate posterior distributions of their quantities of interest based on their model and experimental data.
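
    That Bayesian machinery can be caricatured in a few lines: a random-walk Metropolis chain sampling the posterior of a single physical parameter given noisy data. The forward model, prior bounds, and noise level here are all invented for illustration; the actual CRASH analysis uses far more elaborate statistical models (emulators, discrepancy terms, and so on):

        import numpy as np

        # Random-walk Metropolis sampler for the posterior of one
        # physical parameter theta, given noisy "experimental" data
        # and a cheap forward model. Everything here is hypothetical.
        rng = np.random.default_rng(0)

        def model(theta, t):
            return theta * np.sqrt(t)          # toy forward model

        t_obs = np.linspace(0.5, 3.0, 8)
        sigma = 0.05                           # assumed noise level
        y_obs = model(1.3, t_obs) + rng.normal(0.0, sigma, t_obs.size)

        def log_post(theta):
            if not (0.0 < theta < 5.0):        # uniform prior on (0, 5)
                return -np.inf
            r = y_obs - model(theta, t_obs)
            return -0.5 * np.sum((r / sigma) ** 2)

        theta, lp = 1.0, log_post(1.0)
        samples = []
        for _ in range(20_000):
            prop = theta + rng.normal(0.0, 0.05)
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
                theta, lp = prop, lp_prop
            samples.append(theta)

        post = np.array(samples[5_000:])       # discard burn-in
        print(f"posterior mean  = {post.mean():.3f}")
        print(f"posterior stdev = {post.std():.3f}")

    The resulting posterior spread is exactly the kind of probability distribution function over physical parameters that the report describes.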
