Saturday, June 23, 2012

Notre Dame V&V Workshop Notes

Last October I had the opportunity to attend a V&V workshop at Notre Dame. In a previous post I said I'd put up my notes once the slides were available. This post contains my notes from the workshop. Most of the presenters' slides are available through links in the program.

There are a couple highlights from the workshop that I'll mention before dumping the chronological notes.

James Kamm gave a really great presentation on a variety of exact solutions for the 1-D Euler equations. He covered the well-known shock tube solutions that you'd find in a good text on Riemann solvers, plus a whole lot more. Thomas Zang presented work on a NASA standard for verification and validation that grew out of the fatal Columbia mishap. The focus is not so much prescribing what a user of modeling and simulation will do to accomplish V&V, but requiring that whatever is done is clearly documented. If nothing is done, then the documentation just requires a clear statement that nothing was done for that aspect of verification, validation, or uncertainty quantification. I like this approach because it's impossible for a standards writer to know every problem well enough to prescribe the right approach, but requiring someone to come out and put in writing "nothing was done" often means they'll go do at least something that's appropriate for their particular problem.
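For anyone who wants to play along at home, here is a toy sketch of the textbook exact solution for the Sod shock tube (the pressure-function iteration you'd find in Toro); it's my own minimal implementation of the standard method, not anything from Kamm's slides.

```python
# A toy implementation of the textbook exact Riemann solver for the 1-D
# Euler equations (Toro's pressure-function iteration), applied to Sod's
# shock tube. The bracket and tolerance choices below are mine.
import numpy as np
from scipy.optimize import brentq

gamma = 1.4

def f_wave(pstar, rho, p):
    """Velocity change across the wave connecting state (rho, p) to pstar."""
    a = np.sqrt(gamma * p / rho)  # sound speed
    if pstar > p:  # shock branch
        A = 2.0 / ((gamma + 1.0) * rho)
        B = (gamma - 1.0) / (gamma + 1.0) * p
        return (pstar - p) * np.sqrt(A / (pstar + B))
    # rarefaction branch
    return (2.0 * a / (gamma - 1.0)) * ((pstar / p) ** ((gamma - 1.0) / (2.0 * gamma)) - 1.0)

# Sod's initial left/right states: (rho, u, p)
rhoL, uL, pL = 1.0, 0.0, 1.0
rhoR, uR, pR = 0.125, 0.0, 0.1

# star-region pressure is the root of f_L(p) + f_R(p) + (uR - uL) = 0
pstar = brentq(lambda p: f_wave(p, rhoL, pL) + f_wave(p, rhoR, pR) + (uR - uL),
               1e-8, 10.0)
ustar = 0.5 * (uL + uR) + 0.5 * (f_wave(pstar, rhoR, pR) - f_wave(pstar, rhoL, pL))
print(f"p* = {pstar:.5f}, u* = {ustar:.5f}")
```

Running it prints p* = 0.30313 and u* = 0.92745, the familiar star-state values tabulated in Toro.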

I think that in the area of validation I'm philosophically closest to Bob Moser, who seems to be a good Bayesian (slides here). Bill Oberkampf (who, along with Chris Roy, recently wrote a V&V book) did some pretty unconvincing hand-waving to avoid biting the bullet and taking a Bayesian approach to validation, which he (and plenty of other folks at the workshop) views as too subjective. I had a more recent chance to talk with Chris Roy about their proposed area validation metric (which is in some ASME standards), and the ad hoc, subjective nature of the multiplier applied to their distribution location shifts seems a lot more treacherous to me than specifying a prior. The fact that they use frequentist distributional arguments to justify a non-distributional fudge factor (which changes based on how the analyst feels about the consequences of the decision; sometimes it's 2, but for really important decisions maybe you should use 3) doesn't help them make the case that they are successfully avoiding "too much subjectivity". Of course, subjectivity is unavoidable in decision making. There are two options: the subjective parts of decision support can be explicitly addressed in a coherent fashion, or they can be pretended away by an expanding multitude of ad-hoceries.
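To make the objection concrete, here's a rough sketch of the area validation metric as I understand it from the Ferson/Oberkampf line of work: the area between the model's CDF and the empirical CDF of the sparse experimental observations, which then gets scaled by the very multiplier I'm complaining about. The toy data, variable names, and the final loop are mine, not anything from the ASME documents.

```python
# A rough sketch of the area validation metric as I understand it: the area
# between the model's CDF and the empirical CDF of the experimental data.
# The toy data and the multiplier loop are illustrative, not from any standard.
import numpy as np

def area_metric(model_samples, exp_samples):
    """Area between the two empirical CDFs over their pooled support."""
    xs = np.sort(np.concatenate([model_samples, exp_samples]))
    Fm = np.searchsorted(np.sort(model_samples), xs, side="right") / len(model_samples)
    Fe = np.searchsorted(np.sort(exp_samples), xs, side="right") / len(exp_samples)
    # both CDFs are step functions, constant between consecutive pooled points,
    # so the rectangle rule below integrates |Fm - Fe| exactly
    return np.sum(np.abs(Fm[:-1] - Fe[:-1]) * np.diff(xs))

rng = np.random.default_rng(0)
model = rng.normal(10.0, 1.0, 2000)  # plenty of cheap simulation samples
exp = rng.normal(10.5, 1.2, 30)      # sparse, expensive experimental observations

d = area_metric(model, exp)
for k in (1, 2, 3):  # the subjective multiplier in question
    print(f"k = {k}: shift the predicted distribution by +/- {k * d:.3f}")
```

Nothing in the distributional machinery tells you whether k should be 2 or 3; that choice is exactly as subjective as a prior, just less honest about it.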

I appreciated the way Patrick Roache wrapped up the workshop: “decisions will continue to be made on the basis of expert opinion and circumstantial evidence, but Bill [Oberkampf] and I just don’t think that deserves the dignity of the term validation.” In product development we’ll often be faced with acting to accept risk based on unvalidated predictions; in fact, that could be one operational definition of experimentation. Since subjectivity is inescapable, I resort to pragmatism: what is useful? It is not useful to say “validated models are good” or “unvalidated models are bad”. It is more useful to recognize validation activities as signals to the decision maker about how much risk they are accepting when they act on the basis of simulations and precious little else.

Tuesday, May 22, 2012

Second Falcon 9/Dragon Launch

Successful launch of the Falcon 9/Dragon to the International Space Station for COTS demo flight 2.


Lots of coverage on Parabolic Arc and Nuit Blanche.

Tuesday, April 3, 2012

Highly Replicable Research

There is a really good post by Titus Brown (by way of PlanetScipy) on replicable research.
So what did we do to make this paper extra super replicable?
If you go to the paper Web site, you'll find:
  • a link to the paper itself, in preprint form, stored at the arXiv site;
  • a tutorial for running the software on a Linux machine hosted in the Amazon cloud;
  • a git repository for the software itself (hosted on github);
  • a git repository for the LaTeX paper and analysis scripts (also hosted on github), including an ipython notebook for generating the figures (more about that in my next blog post);
  • instructions on how to start up an EC2 cloud instance, install the software and paper pipeline, and build most of the analyses and all of the figures from scratch;
  • the data necessary to run the pipeline;
  • some of the output data discussed in the paper.
(Whew, it makes me a little tired just to type all that...)
What this means is that you can regenerate substantial amounts (but not all) of the data and analyses underlying the paper from scratch, all on your own, on a machine that you can rent for something like 50 cents an hour. (It'll cost you about $4 -- 8 hours of CPU -- to re-run everything, plus some incidental costs for things like downloads.)

I really think it is a neat use of the Amazon elastic compute cloud.

Saturday, March 31, 2012

Mathematical Foundations of V&V Pre-pub NAS Report

A while back I mentioned an interesting-looking study on the Mathematical Foundations of Verification, Validation, and Uncertainty Quantification. I was just alerted to the pre-publication version available on the National Academies Press site.

Section 2.10 presents a case study for applying VV&UQ methods to climate models. The introductory paragraph of that section reads,

The previous discussion noted that uncertainty is pervasive in models of real-world phenomena, and climate models are no exception. In this case study, the committee is not judging the validity or results of any of the existing climate models, nor is it minimizing the successes of climate modeling. The intent is only to discuss how VVUQ methods in these models can be used to improve the reliability of the predictions that they yield and provide a much more complete picture of this crucial scientific arena.

As noted in the front-matter, since this is a pre-print it is still subject to editorial revision.

Friday, March 30, 2012

Empirical Imperatives

From Climate Resistance:
There is a belief that you can simply read imperatives from ‘the evidence’, and to organise society accordingly, as if instructed by mother nature herself. And worse still, there is reluctance on behalf of many engaged in the debate to recognise that this very technocratic, naturalistic and bureaucratic way of looking at the world reflects very much a broader tendency in contemporary politics. To point any of these problems out is to ‘deny the science’. ‘Science’, then, is a gun to the head.
Shrinking the Sceptics

Sunday, February 26, 2012

SpaceX Dragon Panorama

SpaceX has a neat internal panorama up of their Dragon space capsule.
SpaceX Dragon capsule, internal iso/orthogrid panels and grid stiffened structure, forward/port view

Sunday, February 12, 2012

Sears-Haack Body for Mini-Estes

We have a little company here in Dayton that does print on demand (Fabbr) with Makerbots. They specialize in printing RepRap kits, but I think I'm going to see if they can print me a little rocket to use with Estes mini-motors.
The 1/4 and 1/2 A motors are 13 mm in diameter and 44 mm long.

The first thing you need to print a part with these hobby printers is an stl file. I followed a somewhat tortuous route to generating one.
First I made a little python script to find the minimum-volume Sears-Haack body that would fit a 13x44 mm cylinder. The bold black curve in the plot is the minimum-volume body; it happens to have a length of exactly twice the motor length.
As you can see in the script, I also dumped an svg file of that curve. This is easily imported into Blender. Then the svg curve must be converted into a Mesh, and the Spin method applied to generate the body of revolution.
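In case the linked script ever rots, here's a minimal sketch of the same idea: sweep the body length, size the maximum radius so the body just contains the motor (assumed centered in the body), and dump the half-profile as an SVG for Blender. The centering assumption, file name, and sampling choices are mine.

```python
# A minimal sketch of the script described above (the real one was linked in
# the post): find the minimum-volume Sears-Haack body containing a 13 mm x
# 44 mm motor, assumed centered, and write the profile curve as an SVG.
import numpy as np

rc, lc = 13.0 / 2.0, 44.0  # motor radius and length, mm

def min_body(L):
    """Smallest Sears-Haack body of length L that contains the motor."""
    # r(x) = Rmax * (4*(x/L)*(1 - x/L))**0.75; the binding constraint is at
    # the motor ends, x = (L - lc)/2, so size Rmax to touch the corner there
    xi = (L - lc) / (2.0 * L)
    Rmax = rc / (4.0 * xi * (1.0 - xi)) ** 0.75
    volume = 3.0 * np.pi**2 * Rmax**2 * L / 16.0
    return volume, Rmax

lengths = np.linspace(lc + 1.0, 4.0 * lc, 2000)
vols = np.array([min_body(L)[0] for L in lengths])
Lopt = lengths[vols.argmin()]
_, Rmax = min_body(Lopt)
print(f"L_opt = {Lopt:.1f} mm (~2x motor length), Rmax = {Rmax:.2f} mm")

# write the half-profile as a crude SVG polyline (mm mapped to user units);
# the real script may have written a path element instead
x = np.linspace(0.0, Lopt, 200)
r = Rmax * (4.0 * (x / Lopt) * (1.0 - x / Lopt)) ** 0.75
pts = " ".join(f"{px:.2f},{pr:.2f}" for px, pr in zip(x, r))
with open("sears_haack.svg", "w") as f:
    f.write(f'<svg xmlns="http://www.w3.org/2000/svg" '
            f'viewBox="0 0 {Lopt:.0f} {1.1 * Rmax:.0f}">'
            f'<polyline fill="none" stroke="black" points="{pts}"/></svg>')
```

The numerical sweep reproduces what the calculus says: the minimum-volume body comes out at exactly twice the motor length, with a maximum radius of about 8.1 mm.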
I played with the number of steps to get a mesh that looked like it had surface faces with near unit aspect ratio (not that it really matters, but old habits die hard).

Now I should be able to add some fins and export an stl from Blender for my rapid prototyping friends to play with. The design goal for this rocket will be to have positive static margin with the motor in the rocket, but neutral or negative static margin once the ejection charge pops it out the back (that way it does a tumble recovery).

Saturday, February 11, 2012

OpenFOAM Now with Fedora RPMs

Well, I was pretty excited that OpenFOAM was acquired by SGI, and that they created a foundation to hold the copyrights for the project. That is good news for building a healthy open source community around the software. It looks like I jumped the gun with the install-from-source option I detailed back in November; if only I'd been a little more patient, I could have installed from RPMs. The new release has lots of interesting additions. We live in exciting times for open source CFD.

Thursday, January 19, 2012

McCain's Hangar Queen on Trend

Looks like McCain's Hangar Queen is on Norm Augustine's trend.
Entire Defense budget to buy one airplane
See the video, linked from Make. Related post on Roger Pielke's site.