Saturday, May 28, 2011

Why ALPAL

Really it's "why use symbolic math programs to develop number-crunching software," but these points are from the ALPAL docs (my emphasis).
  • First, physics code could be generated and modified to incorporate new physics and features at a much greater speed than occurs with present codes. Such a language would directly foster improvements in the physics models employed, as well as the numerical methods used to approximate these models. Two reasons point out the need to experiment with numerical methods for a given set of integro-differential equations. First, as a direct consequence of the paucity of mathematical theorems that constructively characterize partial differential equations (PDEs) in general, there is no means of determining what approximation technique will most faithfully capture the solution. Second, in almost all cases, analysis of an approximation technique is limited to idealized linear problems, so the stability and convergence properties of applying a given approximation technique to an actual set of PDEs are unknown. Thus, the optimal algorithm for solving any given PDE or set of PDEs is not known, and any tool to aid the computational physicist must be capable of dealing with many different numerical algorithms and methods.
  • Second, much of the drudgery and concomitant errors in constructing simulation codes would be eliminated by such a language. Large amounts of uninteresting algebra are associated with both the development of appropriate physical models and the discrete versions of these models on a computer. If the algebraic work involved could be largely automated, computational physicists could spend a great deal more time doing the physics they were trained to do.
  • Third, a more natural way of describing the physics model is possible with such a language. Errors in the modeling and numerical approximation process would become more obvious, thereby reducing the number of errors in the simulation code.
  • Fourth, it becomes possible to automate the computation of Jacobian matrices for both linear and nonlinear problems. Not only would this automatic calculation greatly reduce the number of algebraic errors committed for those cases that solve linear systems of equations (with or without a nonlinear solver), but it would make possible a large number of implicit techniques for situations where it has just not been feasible to use them in the past. (A minimal symbolic sketch of this idea follows the list.)
  • Fifth, such a language could fulfill the need to optimize the very expensive computation that goes on in simulation codes. While a competent scientist can do a good job of optimizing a simple simulation code, the complexity of this task for large simulation codes is beyond the capabilities of even the most skilled scientist. Moreover, optimizing a code is a mundane task that once again does not reward the scientist in his primary pursuit. Optimization becomes an even more critical issue on non-scalar computer architectures such as the CRAY-XMP or an ultracomputer. A high-level view of how to vectorize and/or parallelize a given algorithm (or meta-algorithm) on one of these supercomputers is crucial so the scientist can substantially improve the cost-effectiveness of a simulation on that computer. In principle, a language like ALPAL can provide such a view.
  • Sixth, since the cost of developing simulation codes is great, especially for new machine architectures, this language could be used to dramatically cut development costs. This is true whether a simulation code is being created for a new machine or whether it is being ported from a previously used computer. Experience with vector computers over the past decade at LLNL has taught this lesson well. More recently, some simulation codes have been ported to the CRAY-XMP, with use being made of its distributed computing capability. This porting to a distributed computer has required an even larger investment of manpower. These large manpower development costs will be repeated many times because a great variety of parallel computers are now appearing, and because parallel computing is a far greater technical challenge than even vector computing.
  • Seventh, more complete and coherent documentation can be developed with such a language. In fact, a journal-style specification of a code can provide many details that are not generally provided in a journal article about the code. The journal-style specification is expressly designed for readability, whereas the text of a traditional simulation code is not. This is so because it is impossible to express high-level mathematical concepts such as derivatives and integrals together with their numerical approximations in Fortran or any other conventional high-level language.
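
The fourth point is the easiest to make concrete with a modern symbolic package. The sketch below is not ALPAL and does not reproduce its input language; it just uses SymPy on a made-up two-equation nonlinear residual (the symbols and equations are purely illustrative) to form the Jacobian a Newton-type implicit solver needs and to emit Fortran for one entry, which is exactly the sort of error-prone algebra the list argues should be automated.

import sympy as sp

# Unknowns of a small, made-up nonlinear system F(u, v) = 0.
u, v = sp.symbols('u v')
F = sp.Matrix([u**2 + sp.sin(v) - 1,
               sp.exp(u) * v - 2])

# Symbolic Jacobian dF/d(u, v), the matrix a Newton-type implicit solver
# needs at every iteration; done by hand, this is where sign and index
# errors creep in.
J = F.jacobian([u, v])
print(J)

# Emit Fortran source for one Jacobian entry, the kind of target code a
# code generator would write out for the simulation.
print(sp.fcode(J[0, 0], standard=95))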

Saturday, May 14, 2011

International Journal for Uncertainty Quantification Latex Template

I had a bit of trouble getting the Latex document class, ij4uq.cls, for the new International Journal for Uncertainty Quantification to work on my install of Fedora 14, so I figured I'd share my recipe for making it work. The editor and journal support staff were very helpful with quick responses and useful pointers.

First, you need to download the IJ4UQ Latex Template and unzip it in a convenient location.

[jstults@grafton ij4uq]$ unzip IJ4UQ-Latex-Template.zip
This creates a directory with the ij4uq.cls file you need to make documents for the journal. Instead of the standard article class you'll have something like
\documentclass[review,article]{ij4uq}
at the top of the Latex document. The zip file also contains author instructions, a latex template document, a readme, and instructions on installing the Palatino font, which is required by the journal style.

The reason I had so much trouble getting this style to work is that Fedora still uses TexLive 2007. There is an effort underway to update things to TexLive 2010 for Fedora 15. Until that becomes standard, you have to turn on a development repository and install texlive from it, unless you want to really break your distro by installing the styles by hand; the sane way is to use the package management system as much as possible.

Second, you need to turn on the development repo and update (or, if you don't already have texlive installed, install it).

[root@grafton ij4uq] rpm -i http://jnovy.fedorapeople.org/texlive/2010/packages.f14/texlive-release.noarch.rpm
[root@grafton ij4uq] yum clean all && yum update
Warning: this is a fairly significant download. Be prepared to go do something else while this crunches.

Third, try to compile the template document. It will likely break because you don't have all of the required *.sty files. Since you are running the sweet new development version of texlive, you can specify these to yum as they appear in the Latex error messages. This saves you from needing to track down which Fedora package provides the particular style file you are missing. This one-liner should get you most of the way there.

[root@grafton ij4uq] yum install 'tex(arial.sty)' 'tex(sectsty.sty)' 'tex(appendix.sty)' 'tex(changebar.sty)' 'tex(nicefrac.sty)' 'tex(lineno.sty)' 'tex(overpic.sty)' 'tex(stfloats.sty)' 'tex(textfit.sty)'

Fourth, you need to fix the unfree floatflt.sty, which isn't packaged, by installing it by hand from CTAN.

[root@grafton ij4uq] mkdir -p /usr/share/texmf-texlive/tex/latex/floatflt
[root@grafton floatflt] cd /usr/share/texmf-texlive/tex/latex/floatflt
[root@grafton floatflt] rm -f floatflt.* float*.tex
[root@grafton floatflt] wget http://mirror.ctan.org/macros/latex/contrib/floatflt/floatflt.ins
[root@grafton floatflt] wget http://mirror.ctan.org/macros/latex/contrib/floatflt/floatflt.dtx
[root@grafton floatflt] latex floatflt.ins
[root@grafton floatflt] texhash

Fifth, you need to update the font maps for the new fonts.

[root@grafton ij4uq] updmap-sys --enable Map /usr/share/texlive/texmf-dist/fonts/map/dvips/palatino/upl.map

That's it. Now the ij4uq.cls template document that comes with the zip file should compile on your (only slightly broken) Fedora 14.

Sunday, May 8, 2011

Storms of Our Grandfathers

Are we "rolling 13s" and getting thousand year storms every year?

NOAA April 2011 Precipitation Anomaly

The contour plots below are taken from Theory of the hydraulic jump and backwater curves. These studies of historical storm records were used to inform design decisions for the Miami Valley Conservancy District's retarding basins and channel improvements following the 1913 floods. My previous post has pictures of the hydraulic jump below Huffman Dam in operation.

My question to Dr Curry about what value high-fidelity (read: relatively expensive to run and analyze) climate simulations have for decision makers was motivated by reading up on infrastructure projects like the retarding basins and channel improvements in the Miami Valley. I think it would be interesting to take a look at a historical project like this that included rudimentary analysis of climate (weather event frequency and magnitude) in its design, and say, "here's how it would be informed differently using modern tools."

The design philosophy taken by the engineers working for the Miami Valley Conservancy District was to design for the worst possible case (historical records from Europe were also considered since they went back further and more reliably) plus roughly twenty percent margin due to the inherent uncertainty in estimating the worst possible case.

If it were necessary to depend wholly on the records of storms which have occurred in the United States, it might be thought possible for moderately great storms to occur over a period of a few hundred years, and then to find, as an exception, a storm three or four times as great. Theoretically that is very improbable, simply because water vapor in sufficient quantities cannot be transported from the ocean or gulf fast and long enough to cause such exceptional storms. As stated in chapter XI, however, records were collected of the stages of rivers in Europe for long periods of time, and these furnish fairly conclusive proof that such great exceptional storms actually do not occur. On the Danube at Vienna, for instance, we have records since about the year 1000 A.D.; fairly accurate records are available for stages of floods in the Tiber at Rome for more than 2,000 years; and records have been made of floods on the Seine at Paris for a long period of years.
Relation of Great Storms to Maximum Possible
After making the extensive investigation of storms in the eastern United States, it is believed that the March, 1913, flood is one of the great floods of centuries in the Miami Valley. In the course of three or four hundred years, however, a flood 15 or 20 per cent greater may occur. We do not believe a flood will ever occur which is more than 20 or 25 per cent in excess of that of March 1913. There is a factor of ignorance, however, against which we must provide, and the only way to do this is arbitrarily to increase the size of the maximum flood to be provided for. If longer records were available a closer estimate could be made, but in planning works on which the protection of the Miami Valley depends, it is necessary to go beyond human judgment. This has been done on all the other phases of the design, and we believe it would not be good engineering practice to stop at our judgment on this phase. We must be able to say that the engineering works are absolutely safe in every respect. For this reason provision is made for a flood nearly 40 per cent greater than that of March 1913. This is 15 or 20 per cent in excess of what is believed to be the greatest possible flood that will ever occur.
Reasons for Choosing as a Basis for Design a Flood 40% Greater than that of March 1913
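
The compounding behind that "nearly 40 per cent" is straightforward: take the greatest storm thought physically possible as 20 percent above March 1913, then stack the roughly 15 percent "factor of ignorance" on top of it (these particular factors are just my reading of the quote above):

\[ Q_{\text{design}} \approx 1.20 \times 1.15 \times Q_{1913} \approx 1.38\, Q_{1913}, \]

which is the design flood nearly 40 percent greater than the 1913 event.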
Would modern tools cut the design margin due to reduced uncertainty, or would they indicate that the project is now under-designed due to projected climate change? The latter seems unlikely considering that the magnitude of the purported effects has been repeatedly shown to be smaller than we can reliably detect given the length of our data record. Would there be any practically significant changes to the decisions and designs? If your system already has sufficient margin for projected changes in weather-event magnitude, do projected changes in frequency matter?

Friday, May 6, 2011

Technocrats and Philosopher Kings can Save our Impotent Polity

Wow, really awesome article on Climate Resistance, Trust Me, I Speak for Science. I liked these parts from the concluding paragraphs especially. I think you'll notice the parallels to my posts, The Social Ethic and Appeals for Technocracy and No Fluid Dynamicist Kings in Flight-Test.
This metaphysical confusion runs throughout Mooney’s argument. For Mooney, ‘ideology’ is some insidious, toxic force, the antithesis to ‘truth’ itself. The thrust of his argument is that we need particular scientific institutions to ameliorate this intrinsic weakness of human nature. And as such, these institutions deserve elevated status above the reach of those prone to ideology. Otherwise, we would tend towards creationism, to MMR-scares, to climate-change denial. In other words, our flawed minds would create a catastrophe, and it is this possibility of catastrophe that seemingly legitimises the elevated position of scientific institutions. Mooney reinvents Plato’s city state administrated by Philosopher Kings, the main differences being that Mooney conceives of a global polity, and the wisdom of the Guardians only produces the possibility of mere survival, not even a better way of life. To bring this back to the matter of trust, Mooney doesn’t trust humans. Their minds are flawed. Their ambitions and ideas are mere fictions. The institutions they create are accordingly founded on false premises, which, instituted and acted upon, will cause disaster. Even when humans are exposed to ‘the truth’, it is, on Mooney’s view, absorbed into the poisonous, ideological programmes of partisans: liars and cheats who distort it. But without a disaster looming, this instance of a politics of fear would collapse.
He simply can’t make a popular argument for his political idea, and so turns to ‘science’ to identify the necessity of such a programme — i.e. the crisis — and to identify reasons why conventional democratic processes cannot realise it...
It's always a good day when you can throw a little Plato into the mix ; - )

Wednesday, May 4, 2011

Fooling Yourself is Easy

Common problems from an interesting set of slides:

  • Confounding in experimental design
  • Mixing up the sample labels
  • Mixing up the group labels
  • Incomplete documentation
"Unfortunately, we suspect, The most simple mistakes are common."

AAAS, Feb 19: pursuing reproducibility audio / slides

Things we look for:

  • Data
  • Provenance
  • Code
  • Descriptions of nonscriptable steps
  • Descriptions of Planned Design, if Used

Dayton Flood Control Infrastructure at Work

Photos of the flood control contrivances in and around Dayton. All of these were taken on 3 May 2011. But first, a little history and engineering detail so you'll have a better appreciation of the pictures.

Arthur E. Morgan came to Dayton after the 1913 flood to design a flood control system to protect the entire Miami Valley. One element of this system was a dry dam—a dam that held water only during a flood and released the water at a rate that the downstream riverbed could carry. The problem was that the speed of the water through the dam made it powerful and destructive. To solve that problem, Morgan went with Col. Edward Deeds to his farm in Moraine where they built models in his swimming pool. They developed the hydraulic jump, which sends water through a series of baffles and steps, and then finally into a low wall that forces the water back onto itself, dissipating its own energy. This process of turning water onto itself is the hydraulic jump. From there, the water flows downstream calmly. This technology is still used in hydrological engineering throughout the world.

From Dayton Inventors River Walk

Here's a graphical depiction of a hydraulic jump.

A basic 1-D analysis follows the figure (clicking the image should take you to the free Google e-book).
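
For reference, the standard 1-D analysis of a jump in a wide rectangular channel (which, as best I can tell, is the same momentum-balance argument the figure presents) relates the downstream depth to the upstream depth through the upstream Froude number, and gives the head lost across the jump:

\[ Fr_1 = \frac{V_1}{\sqrt{g\,y_1}}, \qquad \frac{y_2}{y_1} = \frac{1}{2}\left(\sqrt{1 + 8\,Fr_1^{\,2}} - 1\right), \qquad \Delta E = \frac{(y_2 - y_1)^3}{4\,y_1 y_2} \]

Here \(y_1\) and \(V_1\) are the depth and velocity of the fast, shallow flow entering the jump, \(y_2\) is the depth of the slower, deeper flow downstream, and \(\Delta E\) is the head handed over to turbulence.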

Something that's kind of neat is that this system of flood control was designed in the days when "computers" were people (often women), not machines:

These engineers were certainly sure of themselves:
The bottom line is the important part. The hydraulic jump works by increasing the rate of turbulent kinetic energy production, which leads rather quickly (immediately if you assume equilibrium turbulence) to an increased rate of turbulent kinetic energy dissipation at the bottom of the energy cascade. The destructive capability (momentum) of the water is greatly reduced in exchange for raising its temperature ever so slightly.
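
As a rough check on "ever so slightly," suppose a meter of head is dissipated across the jump; dumping that energy into the water's heat capacity gives

\[ \Delta T \approx \frac{g\,\Delta E}{c_p} \approx \frac{9.81\ \mathrm{m/s^2} \times 1\ \mathrm{m}}{4186\ \mathrm{J/(kg\,K)}} \approx 0.002\ \mathrm{K}, \]

a couple of thousandths of a degree per meter of head lost.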

The following figure shows the cuts that had to be made for the outlet channels and hydraulic jump pools (note Huffman Dam in the center).

And this one shows an aerial shot of the Huffman Dam just after completion.

These views from the top of the Huffman Dam show the turbulence at the end of the outlet channels due to the hydraulic jump.

Huffman Dam Outlet Channels
Huffman Dam Hydraulic Jump Pool Turbulence
View From the Top of Huffman Dam

These types of momentum dissipation mechanisms are also used throughout the city. The submerged dams take momentum out of the four streams that come together in the Dayton city limits: Miami, Mad and Stillwater Rivers and Wolf Creek.

Low Dam North-West of Downtown Dayton
Low Dam downstream of Dayton Canoe Club
There's talk of replacing these with something more water-sport (canoe / kayak) friendly.

Tuesday, May 3, 2011

Spring 2011 UAV News

Interesting stuff from today's AIAA news brief.

Light Manned Aircraft May Be Cheaper Option Than UAVs For Some.

Aerospace Daily and Defense Report (5/2, Fulgham) reported that because some countries cannot afford to maintain a force of UAVs for long periods, "a cheaper option is light, Predator-sized, manned aircraft equipped with sensors and weapons designed for the UAV market." Some of these include trainers that are "re-invented as light attack aircraft." Examples cited in the article include the Hawker Beechcraft/Lockheed-Martin AT-6B. According to the article, "given that the aircraft was designed for student-pilot abuse that's similar to the rigors of carrier landings, there appear to be a lot of operational options" such as "irregular warfare, homeland defense and civil support."

"Beast of Kandahar" Employed In Bin Laden Hunt.

Justin Hyde at Jalopnik (5/3) writes how the Lockheed Martin RQ-170 Sentinel UAV known as the "Beast of Kandahar" was used in the operation that ended in bin Laden's death. The drone "was likely the eyes and ears of the operation, streaming live feeds back to command centers."

Satellite Images Show Location.

Space (5/3) reports Digital Globe released archival satellite images taken in January of the area where Osama bin Laden was found and killed. The company "located the probable compound using coordinates and physical descriptions through open sources."