## Saturday, February 23, 2013

### 3D Printing is a Hot Political Topic

3D printing, or additive fabrication, is a hot topic, and no longer just among engineers, fabricators, and hobbyists. It seems to be becoming part of the political symbolism wrapped up in economic recovery and development. As I commented on the Made in Dayton blog, the National Academies Press recently released a report, Building the Ohio Innovation Economy, that places significant emphasis on additive fabrication, and the President mentioned the National Additive Manufacturing Innovation Institute, located in Youngstown, in his State of the Union address. There is also the Midwest Pilot for connecting High Performance Computing (HPC) resources to small and medium manufacturing concerns. As we saw in this topology optimization post, there is considerable need for scalable computational approaches to fully realize the promise of additive manufacturing. Closer to home, your Dayton hackerspace is playing with a Printrbot.

## Sunday, February 17, 2013

### Dayton Masonic Temple Photogrammetry

*Bundler-PMVS2 dense point cloud visualization in MeshLab*
I wanted to experiment with some free photogrammetry software, and the Masonic Temple in Dayton is a nice target of opportunity. There are some free (as in beer) options, but I wanted something free (as in freedom) that I could run on my own machines rather than in a software-as-a-service cloud. The basic workflow is demonstrated in this post by Andrew Hazelden on some aerial photographs; he gets fairly impressive results using only free software.

## Wednesday, February 13, 2013

### Efficiently Directing the Work

The good Colonel knew it in 1918:

> no man can efficiently direct work about which he knows nothing
>
> --Col Thurman H. Bane

and we can rediscover it nearly a century later:

> Item 19 of the checklist stresses the importance of placing experienced, domain-knowledgeable managers in key program positions. The committee has observed that many of the truly extraordinary development programs of the past, such as Apollo, the Manhattan Project, the early imaging satellite programs, the U-2, the fleet ballistic missile system, and nuclear submarines, were managed by relatively small (and often immature) agencies with few established processes and controls. In that environment, dedicated managers driven by urgent missions accomplished feats that often seem incredible today.
>
> The committee believes that the accumulation of processes and controls over the years—well meant, of course—has stifled domain-based judgment that is necessary for timely success. Formal SE processes should be tailored to the application. But they cannot replace domain expertise. In connection with item 19, the committee recommends that the Air Force place great emphasis on putting seasoned, domain-knowledgeable personnel in key positions—particularly the program manager, the chief system engineer, and the person in charge of “requirements”—and then empower them to tailor standardized processes and procedures as they feel is necessary.
>
> [...]
>
> While the systems engineering process is, broadly, reusable, it depends on having domain experts who are aware of what has gone wrong (and right) in the past recognize the potential to repeat the successes under new circumstances and avoid repeating the errors.

--*Pre-Milestone A and Early-Phase Systems Engineering: A Retrospective Review and Benefits for Future Air Force Acquisition*

When I read that last part it reminded me of something Herbert Mason said at a talk he gave recently at the NMUSAF: "History makes you smart, heritage makes you proud."

## Tuesday, February 12, 2013

### Environmental Decisions in the Face of Uncertainty

A new report from the National Academies Press on decision making under uncertainty.

Description: The U.S. Environmental Protection Agency (EPA) is one of several federal agencies responsible for protecting Americans against significant risks to human health and the environment. As part of that mission, EPA estimates the nature, magnitude, and likelihood of risks to human health and the environment; identifies the potential regulatory actions that will mitigate those risks and protect public health and the environment; and uses that information to decide on appropriate regulatory action. Uncertainties, both qualitative and quantitative, in the data and analyses on which these decisions are based enter into the process at each step. As a result, the informed identification and use of the uncertainties inherent in the process is an essential feature of environmental decision making.

## Saturday, February 2, 2013

### No Interactions? OFAT is still a Bad Idea

Suppose you are trying to estimate the effect that 6 factors have on a response, and you know that none of the factors influence the effect of the others, so that a simple model like this
$$Y = b_1 X_1 + b_2 X_2 + b_3 X_3 + b_4 X_4 + b_5 X_5 + b_6 X_6 \qquad (1)$$

is the perfect choice. How should you get the data you need to estimate the $b_i$'s? You may be tempted to design a test to estimate each of these factors by changing one factor at a time (OFAT). There are no interaction terms (e.g. $b_7 X_1 X_4$) in equation 1. So there's no need to perform any runs that change several of the $X$'s at once, right? Wrong.
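A quick simulation makes the point (this is my own sketch, not from the post; the coefficient values and noise level are made up for illustration). With factors coded to $\pm 1$, an OFAT plan needs a baseline run plus one run per factor (7 runs), and each $b_i$ is estimated from the difference of just two noisy observations, giving variance $\sigma^2/2$. An orthogonal $2^{6-3}$ fractional factorial uses 8 runs, but every run contributes to every estimate, so the variance drops to $\sigma^2/8$, four times better for only one extra run, even though the model has no interactions at all:

```python
import numpy as np

rng = np.random.default_rng(0)
b_true = np.array([1.5, -2.0, 0.5, 3.0, -1.0, 0.8])  # hypothetical coefficients
sigma = 1.0       # noise standard deviation (assumed)
n_rep = 20000     # simulated replications of each experiment

def respond(X):
    """Noisy response from the additive model in equation (1)."""
    return X @ b_true + rng.normal(0.0, sigma, size=X.shape[0])

# OFAT: baseline with all factors at -1, then flip one factor to +1 per run (7 runs)
X_ofat = -np.ones((7, 6))
for i in range(6):
    X_ofat[i + 1, i] = 1.0

# 2^(6-3) fractional factorial (8 runs): full factorial in A, B, C,
# with generators D = AB, E = AC, F = BC; all six columns are orthogonal
base = np.array([[a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)], float)
X_ff = np.column_stack([base[:, 0], base[:, 1], base[:, 2],
                        base[:, 0] * base[:, 1],
                        base[:, 0] * base[:, 2],
                        base[:, 1] * base[:, 2]])

est_ofat = np.empty((n_rep, 6))
est_ff = np.empty((n_rep, 6))
for k in range(n_rep):
    y = respond(X_ofat)
    est_ofat[k] = (y[1:] - y[0]) / 2.0   # each effect from only two runs
    y = respond(X_ff)
    est_ff[k] = X_ff.T @ y / 8.0         # orthogonal least-squares estimate

print("OFAT estimate variance     (theory sigma^2/2 = 0.5):  ",
      est_ofat.var(axis=0).mean())
print("Factorial estimate variance (theory sigma^2/8 = 0.125):",
      est_ff.var(axis=0).mean())
```

The factorial design's advantage is pure hidden replication: every one of the 8 runs carries information about every coefficient, whereas OFAT wastes most of each run on factors held fixed.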