Saturday, July 28, 2012

Mixed Effects for Fusion

It seems like the folks at Prometheus Fusion Perfection have matured their equipment and operations to the point of being able to run on a fairly repeatable basis. Right now they are working on a Symmetry test of their polywell. I posted a link to an example design/analysis using R that would allow a basic response surface for the device to be estimated (script embedded below the fold). It uses AlgDesign for the blocking by shot, and lme4 to fit a model that includes a random effect to account for shot-to-shot variation.
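For anyone who wants the gist without clicking through, here is a minimal sketch of that kind of blocked design and mixed-effects fit in R. The factor names, levels, and block sizes below are placeholders, not the actual polywell settings:

    library(AlgDesign)  # optBlock() groups runs into blocks (shots)
    library(lme4)       # lmer() fits the shot-level random effect

    # Candidate settings for two hypothetical coded factors
    cand <- gen.factorial(levels = 3, nVars = 2, varNames = c("x1", "x2"))

    # D-optimal quadratic design blocked into 3 shots of 3 runs each
    blk <- optBlock(~ quad(x1, x2), withinData = cand, blocksizes = rep(3, 3))
    des <- blk$design                   # rows of $design are grouped by block
    des$shot <- factor(rep(1:3, each = 3))

    # With a measured response added as des$y, fit the response surface with
    # a random intercept per shot to absorb shot-to-shot variation:
    # fit <- lmer(y ~ x1 + x2 + I(x1^2) + I(x2^2) + x1:x2 + (1 | shot),
    #             data = des)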

Monday, July 23, 2012

Convergence for Falkner-Skan Solutions

About 6 months ago Dan Hughes sent me a link to an interesting paper on "chaotic" behavior in the trajectory of iterates in a numerical Falkner-Skan solution. It struck me that the novel results reported in that paper were an artifact of the numerical method, and had little to do with any "chaotic" physics that might be going on in boundary layers or other systems that might be well described by this equation. This is similar to the point I made in the Fun with Filip post: the choice of numerical method matters. Do not rush to judgment about problems until you have brought the most appropriate methods to bear.

There are some things about the paper that are not novel, and others that seem to be nonsense. It is well known that this equation can have multiple solutions at given parameter values (non-uniqueness); see White. There is also the odd claim that "the flow starts to create shock waves in the medium [above the critical wedge angle], which is a representation of chaotic behavior in the flow field." Weak solutions (solutions with discontinuities/shocks) and chaotic dynamics are two different things.

The authors use the fact that their chosen method fails to converge when two solutions are possible as evidence of chaotic dynamics. Perhaps the iterates really do exhibit chaos, but this is purely an artifact of the method: there is no physical time in this problem, only the pseudo-time of the iterative scheme. A different approach will give different "dynamics," and with a proper choice of method you can get convergence (spectral, even!) to any of the multiple solutions, depending on the initial condition you give your iterative scheme.

They also introduce a parameter, \(\eta_{\infty}\), for the finite value of the independent variable at "infinity" (i.e. the domain is truncated). There is nothing wrong with this (it is a commonly used approach for this problem), but it is not a good idea to solve for this parameter along with the shear at the wall in the Newton iteration. A more careful approach, mapping the boundary point "to infinity" as the grid resolution is increased (following one of Boyd's suggested mappings), removes the need to solve for this parameter and gives spectral convergence for this problem even in the presence of non-uniqueness and the not-uncommon vexation of a boundary condition defined at infinity (all of external aerodynamics has this helpful feature).
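To make the point concrete, here is a minimal shooting sketch in R (using the deSolve package, with a hypothetical fixed truncation at \(\eta_{\infty} = 10\)). This is not the mapped spectral approach described above; it just shows that the starting guess handed to the root-finder is what selects the solution:

    library(deSolve)  # ode() initial value integrator

    # Falkner-Skan: f''' + f f'' + beta (1 - f'^2) = 0,
    # with f(0) = f'(0) = 0 and f'(eta_inf) -> 1; state y = (f, f', f'')
    fs_rhs <- function(eta, y, beta) {
      list(c(y[2], y[3], -y[1] * y[3] - beta * (1 - y[2]^2)))
    }

    # Shooting residual in the wall-shear guess s = f''(0),
    # on the domain truncated at a fixed eta_inf
    residual <- function(s, beta, eta_inf = 10) {
      sol <- ode(y = c(0, 0, s), times = c(0, eta_inf),
                 func = fs_rhs, parms = beta)
      sol[nrow(sol), 3] - 1  # column 3 holds f'
    }

    # Blasius case (beta = 0) recovers f''(0) ~ 0.4696; for beta < 0 the
    # residual has more than one root, and the bracket you hand uniroot()
    # decides which member of the non-unique solution family you converge to
    uniroot(residual, c(0.2, 0.8), beta = 0)$root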

Sunday, July 22, 2012

VV&UQ for Historic Masonry Structures

What a neat application of verification, validation and uncertainty quantification (VV&UQ) methods! The paper is Uncertainty quantification in model verification and validation as applied to large scale historic masonry monuments.

Abstract: This publication focuses on the Verification and Validation (V&V) of numerical models for establishing confidence in model predictions, and demonstrates the complete process through a case study application completed on the Washington National Cathedral masonry vaults. The goal herein is to understand where modeling errors and uncertainty originate from, and obtain model predictions that are statistically consistent with their respective measurements. The approach presented in this manuscript is comprehensive, as it considers all major sources of errors and uncertainty that originate from numerical solutions of differential equations (numerical uncertainty), imprecise model input parameter values (parameter uncertainty), incomplete definitions of underlying physics due to assumptions and idealizations (bias error) and variability in measurements (experimental uncertainty). The experimental evidence necessary for reducing the uncertainty in model predictions is obtained through in situ vibration measurements conducted on the masonry vaults of Washington National Cathedral. By deploying the prescribed method, uncertainty in model predictions is reduced by approximately two thirds.

Highlights:
  • Developed a finite element model of Washington National Cathedral masonry vaults.
  • Carried out code and solution verification to address numerical uncertainties.
  • Conducted in situ vibration experiments to identify modal parameters of the vaults.
  • Calibrated and validated model to mitigate parameter uncertainty and systematic bias.
  • Demonstrated a two thirds reduction in the prediction uncertainty through V&V.
Keywords: Gothic Cathedral; Modal analysis; Finite element modeling; Model updating; Bayesian inference; Uncertainty quantification

I haven't read the full text yet, but it looks like a coherent (Bayesian) and pragmatic approach to the problem.

Saturday, July 21, 2012

Rocket Risk

As reported on Parabolic Arc, NASA awarded SpaceX a contract to launch one of their science payloads. The topic of NASA's assessment of launch service provider risk naturally came up. NASA has published guidelines on payload value and risk ratings.

If we assume that each launch has the same probability of success, then these are simple risk calculations to make, e.g. see these slides. The posterior for the probability of success, \(\theta\), is
\[ \theta \mid r, n \sim \mathrm{Beta}(\alpha + r,\, \beta + n - r) \]
where \(r\) is the number of successes, \(n\) is the number of trials, and \(\alpha\) and \(\beta\) are the parameters of the Beta prior. What values should we choose for the prior? I like \(\alpha=\beta=1\); you could probably make a case for anything consistent with \(\alpha+\beta-2=0\). Many people say that risk = probability × consequence. I don't know what the consequences are in this case, and under that approach NASA's chart doesn't make any sense (you could have a low risk with a high probability of failing to launch an inconsequential payload), so I'll stick to the probabilities of launch success and leave worrying about the consequences to others.
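In R the update and its summaries are one-liners (the launch record below is made up for illustration):

    # Beta(1, 1) prior, i.e. alpha = beta = 1 (uniform on [0, 1])
    a <- 1; b <- 1
    r <- 8; n <- 10  # hypothetical record: 8 successes in 10 launches

    # Posterior is Beta(a + r, b + n - r); its mean is also the posterior
    # predictive probability that the next launch succeeds
    (a + r) / (a + b + n)                    # 0.75
    qbeta(c(0.05, 0.95), a + r, b + n - r)   # 90% credible interval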

Since NASA specifies a number of consecutive successes, there is already an indication that assuming the trials are independent and identically distributed (i.i.d.) is unrealistic. If your expensive rocket blows up, you usually do your best to find the cause of the failure and fix it. That way, on the next launch, your rocket has a higher probability of success than it previously did.
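A toy simulation (all numbers made up) of that reliability-growth effect, where the pooled i.i.d. estimate lags the rocket's current reliability:

    set.seed(1)
    p <- 0.7                 # starting success probability
    success <- logical(20)
    for (i in 1:20) {
      success[i] <- runif(1) < p
      if (!success[i]) p <- min(p + 0.1, 0.95)  # failures get fixed
    }

    # The pooled i.i.d. posterior mean averages over the whole record,
    # understating the current probability of success p
    (1 + sum(success)) / (2 + 20)
    p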

Saturday, July 7, 2012

DARPA's Silver Birds

NextBigFuture carries the story of DARPA's continuing Integrated Hypersonics program.


Eugen Sänger, father of the hypersonic boost-glide global bomber concept, may well have been prescient when he wrote, "Nevertheless, my silver birds will fly!"

Wednesday, July 4, 2012

Experimental Designs with Orthogonal Basis

In comments on the Fun with Filip post I mentioned that the basis should be taken into consideration for experimental design. This script generates a couple of different sample sets for estimating a sixth-order, two-dimensional response surface.
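In the same spirit, here is a hypothetical sketch (not the embedded script itself) that compares how well two candidate point sets condition the model matrix built from a tensor-product Legendre basis:

    # Legendre polynomials up to degree d by the three-term recurrence
    legendre <- function(x, d = 6) {
      P <- matrix(0, length(x), d + 1)
      P[, 1] <- 1
      P[, 2] <- x
      for (n in 2:d)
        P[, n + 1] <- ((2 * n - 1) * x * P[, n] - (n - 1) * P[, n - 1]) / n
      P
    }

    # Model matrix for a 2-D surface, total degree <= 6, on a tensor grid
    design_matrix <- function(pts) {
      g <- expand.grid(x1 = pts, x2 = pts)
      P1 <- legendre(g$x1); P2 <- legendre(g$x2)
      keep <- which(outer(0:6, 0:6, "+") <= 6, arr.ind = TRUE)
      sapply(seq_len(nrow(keep)),
             function(k) P1[, keep[k, 1]] * P2[, keep[k, 2]])
    }

    # Condition numbers for two candidate sample sets (7 points per axis)
    kappa(design_matrix(cos(pi * (2 * (1:7) - 1) / 14)), exact = TRUE)  # Chebyshev
    kappa(design_matrix(seq(-1, 1, length.out = 7)), exact = TRUE)      # equispaced

A better-conditioned model matrix makes the least-squares coefficient estimates less sensitive to noise in the measured response, which is the sense in which the basis matters for the design.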

Monday, July 2, 2012

HIFiRE 2 Videos

Lug camera video of the HIFiRE (Hypersonic International Flight Research and Experimentation) flight 2 launch:


Multiple high-speed views:


Additional coverage on Parabolic Arc.