Monday, January 11, 2010

A Computational Physics Quality Control Checklist?

gmcrews has a nice write-up about software quality and process control, with some commentary touching on climate modeling software in particular. One thing that struck me after reading it is that a lot of the process control machinery works really well for manufacturing physical artifacts, where you make the same thing over and over, but software tends to be different each time you approach a new problem. Statistical process control may not translate to the coding world very directly.

I just read a book review of The Checklist Manifesto (being a former flight test engineer, I hold checklists near and dear) and found an interesting passage that touches on this concern:

...three different kinds of problems in the world: the simple, the complicated, and the complex. Simple problems, they [Zimmerman and Glouberman] note, are ones like baking a cake from a mix. There is a recipe. Sometimes there are a few basic techniques to learn. But once these are mastered, following the recipe brings a high likelihood of success.

Complicated problems are ones like sending a rocket to the moon. They can sometimes be broken down into a series of simple problems. But there is no straightforward recipe. Success frequently requires multiple people, often multiple teams, and specialized expertise. Unanticipated difficulties are frequent. Timing and coordination become serious concerns.

Complex problems are ones like raising a child. Once you learn how to send a rocket to the moon, you can repeat the process with other rockets and perfect it. One rocket is like another rocket. But not so with raising a child, the professors point out. Every child is unique. Although raising one child may provide experience, it does not guarantee success with the next child. Expertise is valuable but most certainly not sufficient. Indeed, the next child may require an entirely different approach from the previous one. And this brings up another feature of complex problems: their outcomes remain highly uncertain. Yet we all know that it is possible to raise a child well. It’s complex, that’s all.

So is software development more like producing and operating a rocket, or more like raising a child? Taken broadly, I think it is more like child rearing, but it is certainly something that could benefit from checklists in narrow domains. If you are developing computational physics codes, there are published best practices and methodologies for verification, validation and uncertainty quantification. Good software carpentry, like a unit test suite and solid version control, should also be part of the 'process' of ensuring software quality. The end product of a quality assurance effort in computational physics should be (approaching a checklist here?) a report that documents the version control methods used, the coverage (and successful completion) of the unit test suite, the results of formal verification studies, and the inferences drawn from validation testing about the range of input parameters over which useful predictive accuracy can be expected.
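To make one of those checklist lines concrete, here is a minimal sketch of the kind of check a formal verification study might boil down to: an observed order-of-accuracy test. Everything here is a hypothetical stand-in (the error norms, the refinement ratio, the expected order); the point is only the shape of the check, not any particular code's numbers.

```python
# Sketch of an order-of-accuracy verification check (hypothetical numbers).
# Given discretization error norms measured on successively refined grids,
# the observed order p is estimated as log(e_coarse / e_fine) / log(r),
# where r is the grid refinement ratio. A verification test asserts that
# p is close to the scheme's formal order of accuracy.

import math


def observed_order(error_coarse, error_fine, refinement_ratio=2.0):
    """Estimate the observed order of accuracy from two grid levels."""
    return math.log(error_coarse / error_fine) / math.log(refinement_ratio)


def check_order_of_accuracy(errors, expected_order, refinement_ratio=2.0, tol=0.2):
    """Check that each successive refinement recovers the expected order.

    `errors` lists discretization error norms, coarsest grid first, with
    each grid refined by `refinement_ratio` relative to the previous one.
    """
    orders = []
    for coarse, fine in zip(errors, errors[1:]):
        p = observed_order(coarse, fine, refinement_ratio)
        orders.append(p)
        assert abs(p - expected_order) <= tol, (
            f"observed order {p:.2f} outside {expected_order} +/- {tol}"
        )
    return orders


if __name__ == "__main__":
    # Made-up error norms that a second-order scheme might produce on grids
    # refined by a factor of 2 (errors shrink roughly 4x per refinement).
    sample_errors = [1.0e-2, 2.6e-3, 6.4e-4]
    print(check_order_of_accuracy(sample_errors, expected_order=2.0))
```

In a real suite the error norms would come from running the code against an exact or manufactured solution at several resolutions, and the assertion would live in the unit test framework, so the 'verification' line on the checklist is backed by something executable rather than a one-off study.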

Got your own 'checklist' ideas? Put them in the comments!
