Tuesday, December 28, 2010
Tuesday, November 16, 2010
Tuesday, November 2, 2010
My favorite was Casper the Friendly Ghost. It was great to see all the folks out and about in Grafton Hill enjoying the brisk evening air and beautiful display.
Wednesday, October 20, 2010
We seem not to be able to use the machine, which we all believe is a very powerful tool for manipulating and transforming information, to do our own tasks in this very field. We have compilers, assemblers, monitors, etc. for others, and yet when I examine what the typical software person does, I am often appalled at how little he uses the machine in his own work.
Build productivity enhancing tools of broad applicability for the expert user so that efficient, special purpose PDE codes can be built reliably and quickly, rather than attempt to second guess the expert and build general purpose PDE codes (black box systems) of doubtful efficiency and reliability.
- Take as input a PDE description, along with boundary and initial condition definitions
- Discretize the PDE
- Analyze the result (e.g. for stability)
- Calculate the Jacobian (needed for Newton methods in implicit time-integration or non-linear boundary value problems (BVPs))
- Generate code
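The Jacobian step above can be sketched numerically. This is a minimal Python illustration (mine, not part of the Maxima/FORTRAN toolchain described here); the residual function F and all names are invented for the example:

```python
def numerical_jacobian(F, x, h=1e-7):
    """Forward-difference Jacobian of F: R^n -> R^n at the point x."""
    n = len(x)
    F0 = F(x)
    J = [[0.0] * n for _ in range(n)]
    for j in range(n):
        xp = list(x)
        xp[j] += h              # perturb the j-th unknown
        Fj = F(xp)
        for i in range(n):
            J[i][j] = (Fj[i] - F0[i]) / h
    return J

# Example residual: F(x, y) = (x^2 + y - 2, x + y^2 - 2)
F = lambda v: [v[0]**2 + v[1] - 2.0, v[0] + v[1]**2 - 2.0]
print(numerical_jacobian(F, [1.0, 1.0]))  # approx [[2, 1], [1, 2]]
```

In a Newton iteration this matrix (or its analytically generated counterpart) is what gets factored at each step; the symbolic approach discussed in this post generates the exact Jacobian instead of approximating it.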
- Manipulate the set of partial differential equations to cast them into a form that is amenable to numerical solution. For vector PDEs, this might include vector differential calculus operations and reexpression in scalar (component) form, and the application of a linearization approximation for non-linear PDEs.
- Discretize the time and space domain, and transform the partial differential operators in the PDEs into finite difference operators. This transforms the partial differential equations into a set of algebraic equations. A multitude of possible transformations for the differential operators are possible and the boundary conditions for the PDEs also must be appropriately handled. The resulting difference equations must be analyzed to see if they form an accurate and numerically stable approximation of the original equation set. For real world problems, this analysis is usually difficult and often intractable.
- After choosing a solution algorithm from numerical linear algebra, the finite difference equations and boundary conditions are coded in a programming language such as FORTRAN.
- The numerical algorithm is then integrated with code for file manipulations, operating system interactions, graphics output, etc. forming a complete computer program.
- The production program is then executed, and its output is analyzed, either in the form of numerical listings or computer-generated graphics.
With continuing advances in computer technology, the last step in this process has become easier. For a given class of problems, answers can be calculated more quickly and economically. More importantly, harder problems which require more computational resources can be solved. But the first four steps have not yet benefited from advances in computer performance; in fact, they are aggravated by it.
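As a concrete illustration of the discretization step in the workflow above, the standard second-order central difference can be checked numerically. A small Python sketch (mine, not from the toolchain described here), using sin as a test function:

```python
import math

def second_central_diff(f, x, h):
    """Central difference approximation of f''(x), accurate to O(h^2)."""
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / h**2

# The exact second derivative of sin is -sin; because the scheme is
# second order, halving h should cut the error by roughly a factor of four.
x = 1.0
exact = -math.sin(x)
for h in (0.1, 0.05, 0.025):
    err = abs(second_central_diff(math.sin, x, h) - exact)
    print(h, err)
```

This kind of convergence check is the numerical analogue of the accuracy analysis mentioned in step two; the symbolic tools discussed below derive the same truncation-error behavior analytically.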
Taken together with the software described in other chapters, these tools allow the user to quickly generate a FORTRAN code, run numerical experiments, and discard the code without remorse if the numerical results are unsatisfactory.
curvi : [xi, eta, zeta]; /* the curvilinear coordinates */
indep : [x, y, z]; /* the independent variables */
nn : length(indep);
eqn : sum(diff(sigma * diff(f, indep[i]), indep[i]), i, 1, nn);
J : zeromatrix(3,3); /* initialize J, matching the definition of K below */
for j : 1 thru 3 do (
    for i : 1 thru 3 do (
        J[i,j] : 'diff(indep[j], curvi[i])
    )
);
K : zeromatrix(3,3);
for j : 1 thru 3 do (
    for i : 1 thru 3 do (
        K[i,j] : diff(curvi[j], indep[i])
    )
);
grid_trans_subs : matrixmap("=", K, invert(J));
/* making substitutions from a list is easier than from a matrix */
grid_trans_sublis : flatten(makelist(grid_trans_subs[i],i,1,3));
/* Evaluation took 0.0510 seconds (0.0553 elapsed) using 265.164 KB. */
trans_eqn_factor : factor(trans_eqn) $
/* Evaluation took 2.4486 seconds (2.5040 elapsed) using 48.777 MB. */
- Define dependencies between independent and dependent (and possibly computational coordinates)
- Associate a list of indices with the coordinates
- Define rules that transform differential terms into difference terms with the appropriate index shifts and constant multipliers, corresponding to the coordinate with respect to which the derivative is taken and the selected finite difference expression
- Apply the rules to the PDE to give a finite difference equation (FDE)
- Use Maxima’s simplification and factoring capabilities to simplify the FDE
- Output the FDE in FORTRAN format and wrap with subroutine boilerplate using text processing macros
defrule(deriv_subst_1, 'diff(fmatch,xmatch,1), diff_1(fmatch,xmatch)),
defrule(deriv_subst_2, 'diff(fmatch,xmatch,2), diff_2(fmatch,xmatch)),
mat_expr : apply1(mat_expr, deriv_subst_1),
mat_expr : apply1(mat_expr, deriv_subst_2),
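The defrule/apply1 idiom can be mimicked outside Maxima. Here is a toy Python rewrite pass over a tuple-based expression tree (all of it illustrative, none of it from the original code) showing how a 'diff node becomes a difference node:

```python
def rewrite(expr, rule):
    """Apply `rule` bottom-up to every node of a tuple expression tree."""
    if isinstance(expr, tuple):
        expr = tuple(rewrite(e, rule) for e in expr)
    return rule(expr)

def deriv_to_difference(node):
    """Rewrite ('diff', f, x, 1|2) into a central-difference node."""
    if isinstance(node, tuple) and node[0] == 'diff':
        _, f, x, order = node
        if order == 1:
            return ('diff_1', f, x)   # (f[i+1] - f[i-1]) / (2 h)
        if order == 2:
            return ('diff_2', f, x)   # (f[i+1] - 2 f[i] + f[i-1]) / h^2
    return node

# A 2-D Laplacian-style PDE turns into a finite difference equation:
pde = ('plus', ('diff', 'u', 'x', 2), ('diff', 'u', 'y', 2))
fde = rewrite(pde, deriv_to_difference)
print(fde)  # ('plus', ('diff_2', 'u', 'x'), ('diff_2', 'u', 'y'))
```

Maxima's pattern matcher does the same job with far more generality (match variables, simplification, factoring), which is exactly why it is the right tool for this workflow.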
- change variables: convert an arbitrary second order differential equation in nn variables to an arbitrary coordinate frame in the variables xi[i]
- atomic notation for derivatives
- primitive atomic notation
- introduce differences of unknowns
- primitive differences
- scheme and difference: collect the coefficients of the differences and calculate the stencil of the solver and the coordinate transformations
- write the FORTRAN code
Thursday, October 14, 2010
|Henri-Julien Félix Rousseau - Les Pêcheurs à la ligne|
[1908 - 1909]
This is another case where the hardware preceded full understanding of the physics. The Wrights found in their bicycle, and subsequent wind tunnel, tests that the published empirical constants in the theoretical lift equations were pretty far off.
Just because the tower is cool (and GIMP is cool as well):
Monday, September 27, 2010
The constraints imposed by the planetary ecosystem require continuous adjustment and permanent adaptation. Predictive skills are of secondary importance.
and thought of this
There is no security on this earth; there is only opportunity.
General Douglas MacArthur
Tennekes concludes with
From my background in turbulence I look forward with grim anticipation to the day that climate models will run with a horizontal resolution of less than a kilometer. The horrible predictability problems of turbulent flows then will descend on climate science with a vengeance.
I have his book; it is rather good.
Friday, September 24, 2010
Tuesday, August 3, 2010
This was a guest post over on Pielke's site.
Dr Pielke's Honest Broker concepts resonate with me because of practical decision support experiences I've had, and this post is an attempt to share some of those from a realm pretty far removed from the geosciences. All the views and opinions expressed are my own and in no way represent the position or policy of the US Air Force, Department of Defense or US Government. I am writing as a simple student of good decision making. My background is not climate science. I am an Aeronautical Engineer with a background in computational fluid dynamics, flight test and weapons development. I got interested in the discussions of climate policy because the intersection of computational physics and decision making under uncertainty is an interesting one no matter what the subject area. The discussion in this area is much more public than the ones I'm accustomed to, so it makes a great target of opportunity. The decision support concepts Dr Pielke discusses make so much sense to me now, but I can see how hard they are for technical folks to grasp because I used to be a very linear thinker when I was a young engineer right out of school.
My journeyman's education in decision support came when I got the chance to lead a small team doing Live Fire Test and Evaluation for the Air Force (you may not be familiar with LFT&E; it is a requirement that grew out of the Army's gaming of the Bradley fighting vehicle tests in the 1980s, a situation fairly accurately lampooned in the movie "Pentagon Wars"). The competing values of the different stakeholders (folks appointed by Congress to ensure sufficient realistic testing compared to folks at the service level doing product development) were really an eye-opening education for a technical nerd like me. I initially thought, "if only everyone can agree on the facts, the proper course of action will be clear". How naive I was! Thankfully, the very experienced fellows working for me didn't mind training up a rash, newly-minted, young Captain.
It's tough for some technical specialists (engineers/scientists) to recognize worthy objectives their field of study doesn't encompass. The reaction I see from the more technically oriented folks like Tobis (see how he struggles) reminds me a lot of the reaction that engineers in product development offices would have to the role of my little Live Fire office. A difficulty we often encountered was the LFT&E oversight folks wanted to accomplish testing that didn't have direct payoff to narrower product development goals that concerned the engineers. "What those people want to do is wasteful and stupid!" This parallels the recent sand berm example. The preferred explanation from the technician's perspective is that the other guy is bat-shit crazy, and his views should be ridiculed and de-legitimized. The truth is usually closer to the other guy having different objectives that aren't contained within the realm of the technician's expertise. In fact, the other person is probably being quite rational, given their priors, utility function and state of knowledge.
In my little Live Fire Office we had lots of discussion about what to call the role we did, and how to best explain it to the program managers. I wish I had heard of Dr Pielke's book back then, because "Honest Broker" would have been an apt description for much of the role. We acted as a broker between the folks in the Pentagon with the mandate from congress for sufficient, realistic testing, and the Air Force level program office with the mandate for product development. The value we brought (as we saw it), was that we were separate from the direct program office chain of command (so we weren't advocates for their position), but we understood the technical details of the particular system, and we also understood the differing values of the folks in the Pentagon (which the folks in the program office loved to refuse to acknowledge as legitimate, sound familiar?). That position turns out to be a tough sell (program managers get offended if you seem to imply they are dishonest), so I can empathize with the virulent reaction Dr Pielke gets on applying the Honest Broker concepts to climate policy decision support. People love to take offense over their honor. That's a difficult snare to avoid while you try to make clear that, while there's nothing dishonest about advocacy, there remains significant value in honest brokering. Maybe Honest Broker wouldn't be the best title to assume though. The first reaction out of a tight-fisted program manager would likely be "I'm honest, why do I need you?"
One of the reasons my little office existed was because of some "lessons learned" from the Tri-Service Standoff Missile debacle (all good things in defense acquisition must grow out of historical buffoonery). The broader Air Force leadership realized that it was counterproductive to have product development engineers and program managers constantly trying to de-legitimize the different values that the oversight stake-holders brought (the differences springing largely from different appetites for risk and priors for deception) by wrangling over largely inconsequential, technical nits (like tree rings in the Climate Wars). The wiser approach was to maintain an expertise whose sole job was to recognize and understand the legitimate concerns of the oversight folks and incorporate those into a decision that meets the service's constraints as quickly and efficiently as possible. Rather than wasting time arguing, product development folks could focus on product development.
The other area where I've seen this dynamic play out is in making flight test decisions. In that case though, the values of all the stake-holders tend to align more closely, so the separation between technical expertise and decision making is less contentious (Dr Pielke's Tornado analogy). In contrast to the climate realm where it's argued that science compels because we're in the Tornado mode, the flight-test engineers understand that the boss is taking personal responsibility for putting lives at risk based on their analysis. They tend to be respectful of their crucial, but limited, role in the broader risk management process. Computational fluid dynamics can't tell us if it's worth risking the life of an air crew to collect that flight test data. In that case there is no confusion about who is king, and over what questions the technical expert must "pass over in silence."
Monday, June 14, 2010
View Ohio Aerospace Hub Roadwork Development Grant in a larger map
I'm sure this will be welcome by the retailers on Brown St (now that they've successfully shuffled off their vagrant problem).
Friday, June 4, 2010
From my perspective (as someone who's sat in the hot seat conducting flight tests), the really impressive thing with the SpaceX operation was their ability to light the engines, auto-abort, and turn a new countdown/launch at the end of their range time. There was clearly a whole lot of work in the design phase leading up to the impressive execution today that made saving the mission possible. Nice.
Monday, May 10, 2010
Sunday, April 18, 2010
There were seven resulting top ideas in the entrepreneur interest category. Six of these consisted primarily of websites. Five of those websites were about mentoring for young entrepreneurs by established ones, entrepreneur support groups, information clearinghouses or some combination thereof. Consensus building is certainly brutal in seeking out the lowest common denominator. How to argue against something as innocuous and pervasive as a website in our networked age? And what were the big success stories out of last year's summit? Weeding and painting and, you guessed it, a web resource. Remember, these groups met independently at the beginning of the summit. No significant prior communication between sub-group members other than mingling at vendor booths while surfing swag.
How does what I observed in the summit breakout sessions look in light of established thoughts on creativity (or lack thereof) in groups? There are two superficially competing views of the group creative process. There is the whole-is-more-than-the-sum-of-parts school popularized by Stephen Covey. In opposition, is the creative-acts-are-individual-acts school. William Whyte's The Organization Man provides an extensive denial of useful creative genius in groups, and a call for renewed focus on the dignity and efficacy of the individual contribution. The former I'll call the Synergy School, the latter the Solitary School.
Here's part of Whyte's criticism of the consensus building group,
I have been citing the decision-making group, and it can be argued that these defects of order do not apply to information-exchanging groups. It is true that meeting with those of common interests can be tremendously stimulating and suggest to the individuals fresh ways of going about their own work. But stimulus is not discovery; it is not the act of creation. Those who recognize this limitation do not confuse the functions and, not expecting too much, profit from the meeting of minds.
Others, however, are not so wise, and fast becoming a fixture of organization life is the meeting self-consciously dedicated to creating ideas. It is a fraud. Much of such high-pressure creation -- cooking with gas, creating out loud, spitballing, and so forth -- is all very provocative, but if it is stimulating, it is stimulating much like alcohol. After the glow of such a session has worn off, the residue of ideas usually turns out to be a refreshed common denominator that everybody is relieved to agree upon -- and if there is a new idea, you usually find that it came from a capital of ideas already thought out -- by individuals -- and perhaps held in escrow until someone sensed an opportune moment for its introduction.
The scientific conference exemplifies Whyte's informational exchange meeting. No one attends to make decisions (or vote with dots); the attendees are looking to share their work and learn about their colleagues' work. These meetings are an important part of modern scientific progress. This year's updayton meeting did have information exchange components, which I'll get to later.
The Synergy School might argue that the updayton breakout sessions can provide an opportunity for synergistic collaboration, where alternative solutions emerge that are better than any of the individual solutions brought by group members. The Synergy School's three levels of communication are
- The lowest level of communication coming out of low trust situations is characterized by defensiveness, protectiveness, and legalistic language which covers all the bases and spells out qualifiers and escape clauses in the event things go sour.
- The middle level of communication is respectful communication -- where fairly mature people communicate.
- The highest level of communication is synergistic (win/win) communication.
In support of the Solitary School's idea about the capital of individual ideas, the winning project from the entrepreneur interest category was the one option that wasn't a website. The young man whose idea formed the core of this project said, "this is something I've been writing about for years". Something he was clearly passionate about, something that he expended his individual creative effort to flesh out beforehand on his own, and subsequently pitch to the group. The other options presented by the members of the group were relentlessly mashed into web-sameness by the gentle actions of the facilitators and the listless shrugs of individual acquiescence from well-meaning group members searching for common ground. When a thoughtful member of the breakout session asked the only really important question, "how do you create an innovator?", his question was met with more shrugs around the room, followed quickly by redirection from the facilitators. Clearly that question cannot be packaged into a public relations project.
What about the skills sessions? Surely these have redeeming aspects, the Solitary School would appreciate these as information exchange, and the Synergy School might appreciate them as 'sharpening the saw'. The most interesting aspect of the panel discussions was the incipient frustration I observed in some of David Gasper's comments. Roughly, "there are so many great resources for entrepreneurs in the Dayton region. Why don't we have more entrepreneurs!? Dayton needs more entrepreneurs." Some of the resources mentioned by the panelists were Dayton SCORE, EntrepenuerOhio and Dayton Business Resource Center. As Theresa Gasper observed, "People seem to want the information PUSHED to them, but then feel overwhelmed with all the information coming at them. No one seems to want to PULL the information – meaning, many don't want to search for the info." This is consistent with the majority of "needs" identified in the entrepreneur breakout sessions. These folks are looking for checklists, guarantees of stability and someone to tell them what to do. In fact, one participant in my session thought that the biggest barrier to entry for entrepreneurs was the lack of the safety net offered by nationalized health-care! If you were to ask me what is the opposite of the entrepreneurial spirit, I could not have come up with a better answer. Probably the opposite of the definitions the panel members gave of entrepreneur too:
- someone who has put something of value to them at risk
- someone with significant "skin in the game"
In her 2006 book, Generation Me, Twenge notes that self-esteem in children began rising sharply around 1980, and hasn’t stopped since. By 1999, according to one survey, 91 percent of teens described themselves as responsible, 74 percent as physically attractive, and 79 percent as very intelligent. (More than 40 percent of teens also expected that they would be earning $75,000 a year or more by age 30; the median salary made by a 30-year-old was $27,000 that year.)
Twenge attributes the shift to broad changes in parenting styles and teaching methods, in response to the growing belief that children should always feel good about themselves, no matter what. As the years have passed, efforts to boost self-esteem—and to decouple it from performance—have become widespread.
These efforts have succeeded in making today’s youth more confident and individualistic. But that may not benefit them in adulthood, particularly in this economic environment.
Twenge writes that “self-esteem without basis encourages laziness rather than hard work,” and that “the ability to persevere and keep going” is “a much better predictor of life outcomes than self-esteem.” She worries that many young people might be inclined to simply give up in this job market. “You’d think if people are more individualistic, they’d be more independent,” she told me. “But it’s not really true. There’s an element of entitlement—they expect people to figure things out for them.”
Please don't misunderstand my criticisms of this updayton process (or cooperation in general). I am in agreement with both Covey and Whyte that our biggest challenges require innovative cooperation to solve.
The winner of the 'best swag contest' was MetroParks with their D-ring key fob:
Wednesday, April 14, 2010
Tuesday, April 13, 2010
Thursday, March 25, 2010
Interesting how DC is such an outlier in these graphs; it's good to be king.
Here's one that's just interesting, not necessarily Dayton, Ohio-centric (you can drag the labels around if it starts out too cluttered):
Rumors of the death of US manufacturing seem greatly exaggerated.
Wednesday, March 17, 2010
Zen Uncertainty: Attempts to understand uncertainty are mere illusions; there is only suffering.
Should we give up? No, there's plenty we can do to make the suffering more bearable. Lo and Mueller give an uncertainty taxonomy of five levels in their 'Physics Envy' paper:
-- WARNING: Physics Envy May Be Hazardous To Your Wealth!
- Complete Certainty: the idealized deterministic world
- Risk without Uncertainty: an honest casino
- Fully Reducible Uncertainty: the odds in the honest casino are not posted, we have to learn them from limited experience
- Partially Reducible Uncertainty: we're not quite sure which game at the casino we're playing so we have to learn that as well as the odds based on limited experience
- Irreducible Uncertainty: we're not even sure if we're in the casino, we might be outside splashing around in the fountain...
Section 2 of the paper provides a nice historical overview of the early work of Paul A. Samuelson, who single-handedly brought statistical mechanics to the economists, and they have never been the same since. Samuelson acknowledged the deep connection between his work and physics:
Perhaps most relevant of all for the genesis of Foundations, Edwin Bidwell Wilson (1879–1964) was at Harvard. Wilson was the great Willard Gibbs’s last (and, essentially only) protege at Yale. He was a mathematician, a mathematical physicist, a mathematical statistician, a mathematical economist, a polymath who had done first-class work in many fields of the natural and social sciences. I was perhaps his only disciple . . . I was vaccinated early to understand that economics and physics could share the same formal mathematical theorems (Euler’s theorem on homogeneous functions, Weierstrass’s theorems on constrained maxima, Jacobi determinant identities underlying Le Chatelier reactions, etc.), while still not resting on the same empirical foundations and certainties.
Related to this theme, there's an interesting recent article over on the Mobjectivist site about using ideas from physics to model income distributions.
Lo and Mueller propose to operationalize their uncertainty taxonomy with a 2-D checklist (table). The levels provide the columns across the top, and there is a row for each business component of the activity being evaluated, here's their description:
The idea of an uncertainty checklist is straightforward: it is organized as a table whose columns correspond to the five levels of uncertainty of Section 3, and whose rows correspond to all the business components of the activity under consideration. Each entry consists of all aspects of that business component falling into the particular level of uncertainty, and ideally, the individuals and policies responsible for addressing their proper execution and potential failings.
This seems like an idea that could be adapted and combined with best practices for model validation (and checklist sorts of approaches) in helping to define what sorts of uncertainties we are operating under when we make decisions using science-based decision support products.
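The checklist is easy to prototype as a table. A hypothetical Python sketch (the component names and entries are invented for illustration, not taken from Lo and Mueller):

```python
# Columns: the five uncertainty levels; rows: business components.
LEVELS = [
    "complete certainty",
    "risk without uncertainty",
    "fully reducible",
    "partially reducible",
    "irreducible",
]

# Hypothetical entries for a hypothetical activity.
checklist = {
    "trading strategy": {
        "fully reducible": "parameters estimated from 10 years of data",
        "partially reducible": "regime changes in market structure",
    },
    "risk management": {
        "irreducible": "unprecedented policy interventions",
    },
}

def entries_at(level):
    """All components with an entry at the given uncertainty level."""
    assert level in LEVELS, "unknown uncertainty level"
    return {c: rows[level] for c, rows in checklist.items() if level in rows}

print(entries_at("partially reducible"))
```

Slicing the table by column like this is what makes the checklist useful for decision support: it shows at a glance which parts of the activity sit at which level of uncertainty, and therefore which tools (statistics, learning, judgment) are appropriate.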
Their final paragraph echoes Lindzen's sentiments about climate science:
While physicists have historically been inspired by mathematical elegance and driven by pure logic, they also rely on the ongoing dialogue between theoretical ideals and experimental evidence. This rational, incremental, and sometimes painstaking debate between idealized quantitative models and harsh empirical realities has led to many breakthroughs in physics, and provides a clear guide for the role and limitations of quantitative methods in financial markets, and the future of finance.
-- WARNING: Physics Envy May Be Hazardous To Your Wealth!