Tuesday, December 28, 2010

Dayton Diode

Looks like Dayton has a hackerspace now: Dayton Diode Space. Tell your favorite south-western Ohio hacker!

Tuesday, November 16, 2010

TSA Screening, Terminal 2, SAN, Nov. 13, 2010

"If you touch my junk, I'll have you arrested."

TSA's security theater is an emerging threat to our Constitutional Republic.

Tuesday, November 2, 2010

Grafton Hill Pumpkin Display

Thanks to Judy Chaffin for making a great pumpkin display happen.

My favorite was Casper the Friendly Ghost. It was great to see all the folks out and about in Grafton Hill enjoying the brisk evening air and beautiful display.

Wednesday, October 20, 2010

NALPAL: Not A Livermore Physics Applications Language

Background and Motivation

The problem I am interested in: leverage high-level symbol manipulation capabilities in a computer algebra system (CAS) to generate compile-able code for number crunching. More specifically, I am interested in generating routines for numerical partial differential equation (PDE) solvers. This is not a new idea; significant work was accomplished starting in the early 1980s [1, 2, 3], continued into the late ’80s [4, 5, 6], and early ’90s [7]. However, those capabilities have not been consolidated and formally incorporated into my open-source CAS of choice, Maxima (though there is fairly recent work implementing similar ideas for Mathematica [8, 9]). The early work [1] quotes Hamming’s Turing Award speech [10] for motivation:
We seem not to be able to use the machine, which we all believe is a very powerful tool for manipulating and transforming information, to do our own tasks in this very field. We have compilers, assemblers, monitors, etc. for others, and yet when I examine what the typical software person does, I am often appalled at how little he uses the machine in his own work.
I had been thinking about this problem for a while because of my interest in the method of manufactured solutions (MMS) for code verification. I thought things were really going to take off when I came across Cook’s 1990 technical report describing code generation for PDE solvers using A Livermore Physics Applications Language (ALPAL), which was built on top of DOE-MACSYMA. ALPAL aimed to be a general purpose domain specific language for turning symbolic descriptions of PDEs into code. Here was a solution to the problem, already developed! Of course, it couldn’t be that easy. I asked the Maxima mailing list if anyone knew what became of this effort: ALPAL question to Maxima list, and ended up answering my own question by getting a hold of some of the old ALPAL devs: ALPAL answer to the list. Unfortunately, there is a long history behind the divergence between different versions of MACSYMA (and different groups of developers) that militates against this old code ever working with the fork that became Maxima (should an archive of the source ever actually turn up; update: it did, see comment) [11, 12, 13].
As you may have guessed from this post’s title, and the difficulties described in the previous paragraph, I’ve decided to pursue a more toolbox approach rather than (re)implementing a domain specific language and associated decision logic. The working title for my toolbox of utilities will be NALPAL, both as a nod to the valuable historical work in this area, and an indication of the path not taken.
Wirth categorizes three main approaches to systems for numerical PDE solution: black-box systems, subroutine packages, and code generation aids [1]. Black-box systems are intended for the novice user, and tend to be constrained to very specific problem types or incorporate a great deal of decision logic. Subroutine packages provide reusable building blocks, but require more knowledge and coding on the part of the user, and tend to remain problem specific. Code generation aids require the user to be an expert in numerical analysis, and tend not to automate any of the analytical work that precedes the code generation step.
Wirth’s suggested approach is a Unix-like toolbox approach where standard subroutine libraries are used when it makes sense to do so, and utilities for doing custom code generation are written in the CAS environment. The alternative is to create a full language, and incorporate the semantics and decision logic to automate the entire process for the novice user. The work of Wirth, Steinberg and Roache is a good example of the former approach, and Cook’s work on ALPAL is an example of the latter (though his early work [3] is more along the lines of the toolbox approach, going so far as to say “the viewpoint taken is that one should be able to start with integro-differential equations and end up with optimal FORTRAN code segments. The key in achieving this goal was to tackle only the specific algebraic problem at hand, and not to attempt to provide tools to cover every conceivable numerical scheme.”). The toolbox approach keeps more of the work for the problem solver in set-up/tear-down/customization, while the language approach loads more of the burden on to the domain specific language developer. Wirth’s second objective sums up well the approach that I think makes the most sense (emphasis original):
Build productivity enhancing tools of broad applicability for the expert user so that efficient, special purpose PDE codes can be built reliably and quickly, rather than attempt to second guess the expert and build general purpose PDE codes (black box systems) of doubtful efficiency and reliability.
There are still useful things to learn from ALPAL even if we have chosen the less ambitious road, since the problem being solved is the same. A basic description of the ALPAL use-case [7]:
  1. Take as input a PDE description, along with boundary and initial condition definitions
  2. Discretize the PDE
  3. Analyze the result (e.g. for stability)
  4. Calculate the Jacobian (needed for Newton methods in implicit time-integration or non-linear boundary value problem (BVP)s)
  5. Generate code
Even if we don’t have a full-featured language, the user of our set of utilities will still be interested in accomplishing largely these same steps. In fact, Wirth’s early work gives these steps (reproduced here verbatim) [1]:
  1. Manipulate the set of partial differential equations to cast them into a form that is amenable to numerical solution. For vector PDEs, this might include vector differential calculus operations and reexpression in scalar (component) form, and the application of a linearization approximation for non-linear PDEs.
  2. Discretize the time and space domain, and transform the partial differential operators in the PDEs into finite difference operators. This transforms the partial differential equations into a set of algebraic equations. A multitude of possible transformations for the differential operators are possible and the boundary conditions for the PDEs also must be appropriately handled. The resulting difference equations must be analyzed to see if they form an accurate and numerically stable approximation of the original equation set. For real world problems, this analysis is usually difficult and often intractable.
  3. After choosing a solution algorithm from numerical linear algebra, the finite difference equations and boundary conditions are coded in a programing language such as FORTRAN.
  4. The numerical algorithm is then integrated with code for file manipulations, operating system interactions, graphics output, etc. forming a complete computer program.
  5. The production program is then executed, and its output is analyzed, either in the form of numerical listings or computer-generated graphics.
Wirth goes on to say (emphasis added),
With continuing advances in computer technology, the last step in this process has become easier. For a given class of problems, answers can be calculated more quickly and economically. More importantly, harder problems which require more computational resources can be solved. But the first four steps have not yet benefited from advances in computer performance; in fact, they are aggravated by it.
Also, this additional bit of motivation from Chapter 5,
Taken together with the software described in other chapters, these tools allow the user to quickly generate a FORTRAN code, run numerical experiments, and discard the code without remorse if the numerical results are unsatisfactory.
This is similar in sentiment to the idea of numerical throw away code.

PDE Code Gen Recipes

With the problem background and motivation set, the rest of this post will focus on pulling out useful Maxima recipes demonstrated by folks who have generated FORTRAN from MACSYMA with intent to inflict grievous arithmurgical damage on hapless PDEs.
Much work is done in most of these articles from the 1980s to avoid array references and break up expressions by hand to reduce the computational cost in MACSYMA. Also, MACSYMA’s knowledge of the chain rule is not used for calculating the transformations because of an explosion in the number of terms [5]; rather, an identity for the derivative of a matrix inverse is used. A lot of this effort seems unnecessary today because speed and memory have improved so much. However, the basic approach and the variables used to define the problem are still relevant (compare with this example of Burgers’ equation on a curvilinear grid):
  dep : [f, sigma]; /* the dependent variables */
  curvi : [xi, eta, zeta]; /* the curvilinear coordinates */
  indep : [x, y, z]; /* the independent variables */
  depends(curvi, indep);
  depends(dep, curvi);
  nn : length(indep);
  eqn : sum(diff(sigma * diff(f, indep[i]), indep[i]), i, 1, nn);
The result is rather large, but it illustrates Maxima’s knowledge of the chain rule. Of course, for an arbitrary grid it is generally easiest to compute ∂x/∂ξ rather than ∂ξ/∂x, so you need to make substitutions based on the inverse of the Jacobian of the transformation. In Maxima we might do something like
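  /* J[i,j] will hold the noun form 'diff(x_j, xi_i), the Jacobian of the grid transformation */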
  J : zeromatrix(3,3);
  for j : 1 thru 3 do (
    for i : 1 thru 3 do (
      J[i,j] : 'diff(indep[j],curvi[i])
    )
  );
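  /* K[i,j] holds diff(xi_j, x_i); mathematically K = invert(J), which is the substitution built below */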
  K : zeromatrix(3,3);
  for j : 1 thru 3 do (
    for i : 1 thru 3 do (
      K[i,j] : diff(curvi[j], indep[i])
    )
  );
  grid_trans_subs : matrixmap("=", K, invert(J));
  /* making substitutions from a list is easier than from a matrix */
  grid_trans_sublis : flatten(makelist(grid_trans_subs[i],i,1,3));
which gives us a list of nine equations we can use to make substitutions so that all our derivatives are with respect to the computational coordinates.
  trans_eqn : subst(grid_trans_sublis, eqn) $
  /* Evaluation took 0.0510 seconds (0.0553 elapsed) using 265.164 KB. */
  trans_eqn_factor : factor(trans_eqn) $
  /* Evaluation took 2.4486 seconds (2.5040 elapsed) using 48.777 MB. */
Factoring the result starts getting on up there in compute time and memory, but it is still not intractable, or even that uncomfortable, for an interactive session. Of course we still haven’t made any difference expression substitutions, and that will expand the number of terms even further. The assumption that the grid is arbitrary is a decision point that would have to be dealt with in a black-box style implementation. Best to leave it to the (hopefully) knowledgeable user.
As an aside, linearization is another example of an assumption that is probably best left to the user. Wirth assumes that a linearization is required to solve nonlinear PDEs, whereas Cook provides for calculating the system Jacobians needed for Newton methods (of course Wirth does note that this is just one approach that could be used, and the tools are flexible enough to be extended to other methods). Using the Jacobian is a more modern approach made practical by the successful development of Newton-Krylov methods. It is impossible to predict the future development of numerical methods that will change the choices that need to be made in discretizing PDEs (though that doesn’t prevent speculation [14]), so a flexible toolbox approach is again indicated.
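As a small illustration of the mechanics, computing those Jacobians is something Maxima handles directly. Here is a minimal sketch (my own illustration, not Cook's routine; the residual R, the unknown u, and the spacing dx are made-up names):
  /* a nonlinear discrete residual at node i: a second difference plus a u^2 source term */
  R : (u[i+1] - 2*u[i] + u[i-1]) / dx^2 + u[i]^2;
  /* differentiate with respect to the neighboring unknowns to get the
     row of the Jacobian matrix associated with node i */
  jac_row : makelist(diff(R, u[i+k]), k, -1, 1);
  /* => [1/dx^2, 2*u[i] - 2/dx^2, 1/dx^2] */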
Once a PDE is defined, substitutions of finite differences for partial derivatives must be made to create the discrete approximation. Wirth uses a function called DISCRETIZE which uses the dependencies list (created by a call to depends) to substitute indexed variables for the independent variables. Then the partial derivatives of the indexed variables are replaced by finite difference operators. The substitution of difference expressions is controlled using Maxima’s pattern matching rules. The basic recipe is
  1. Define dependencies between independent and dependent (and possibly computational coordinates)
  2. Associate a list of indices with the coordinates
  3. Define rules that transform differential terms into difference terms with the appropriate index shifts and constant multipliers corresponding to the coordinate which the derivative is with respect to and the selected finite difference expression
  4. Apply the rules to the PDE to give a finite difference equation (FDE)
  5. Use Maxima’s simplification and factoring capabilities to simplify the FDE
  6. Output the FDE in FORTRAN format and wrap with subroutine boilerplate using text processing macros
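As a point of reference, the whole pipeline can be faked for a simple problem with nothing fancier than subst and the built-in fortran function; the pattern rules earn their keep once the equations and the differencing choices get more complicated. A minimal sketch for linear advection, where a plain substitution list stands in for the difference rules (the names u, U, c, dt and dx, and the differencing choices, are mine):
  depends(u, [t, x]);
  pde : diff(u, t) = c * diff(u, x);
  /* forward difference in time, central in space; the indexed unknown
     U[n,i] stands in for u(t[n], x[i]) */
  fde : subst(['diff(u, t, 1) = (U[n+1,i] - U[n,i]) / dt,
               'diff(u, x, 1) = (U[n,i+1] - U[n,i-1]) / (2*dx),
               u = U[n,i]],
              pde);
  /* solve for the new time level and emit a FORTRAN assignment */
  update : first(solve(fde, U[n+1,i]));
  fortran(update);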
The boilerplate is a set of standard templates for a particular solution procedure. The example Wirth uses is an alternating direction implicit (ADI) scheme. A more modern example might be a Newton-Krylov scheme. Wirth describes an environment, Fast FORTRAN Programing (FFP), whose goal is to move computational physics out of a “cottage industry” state. He describes it thusly, “The system consists of two major components: a subroutine library and a command language for building, compiling and running codes.” Based on my experience, that sounds a whole lot like Scipy, which is built on top of Python, and packages together many of the classic scientific computing libraries.
Pattern matching in Maxima is relatively straightforward. For instance, say I have a function that I’d like to use to calculate my derivatives (such as one based on the discrete cosine transform (DCT)); I could declare patterns that replace derivatives with function calls.
  matchdeclare([fmatch,xmatch,nmatch],all),
  defrule(deriv_subst_1, 'diff(fmatch,xmatch,1), diff_1(fmatch,xmatch)),
  defrule(deriv_subst_2, 'diff(fmatch,xmatch,2), diff_2(fmatch,xmatch)),
  mat_expr : apply1(mat_expr, deriv_subst_1),
  mat_expr : apply1(mat_expr, deriv_subst_2),
This would result in all of the first and second derivatives in mat_expr being replaced by function calls diff_1 and diff_2 respectively.
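For example, with the rules above already defined, applying them to a small test expression (the expression and the symbol k are mine, just to show the rules firing):
  depends(w, x);
  test_expr : diff(w, x, 2) + k^2 * diff(w, x) + w;
  apply1(test_expr, deriv_subst_1, deriv_subst_2);
  /* => diff_2(w, x) + k^2*diff_1(w, x) + w */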
The initial set-up steps Cook uses in [3] are roughly the same as those used by Wirth, with the addition of a constant list. The focus is more on what to do with the discrete expressions once they are generated. The basic recipe is
  1. reduce_cnst()
  2. gcfac()
  3. optimize()
  4. fortran()
The constant list allows reduce_cnst to pull constants out of loops, and optimize pulls out common sub-expressions to speed things up. optimize returns a Maxima block, which is similar to a Fortran subroutine. In fact, turning blocks into subroutines in other languages is a common problem which has been addressed on the Maxima mailing list (e.g. block2c, see also cform), and as Dan Stanger points out, this is the purpose of the GENTRAN package. GENTRAN is not yet successfully ported to work with Maxima [13] (though there is a gentran directory that ships in the Maxima source tarball if you want to take a look).
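Of those four steps, optimize and fortran are stock Maxima; gcfac comes from the scifac share package, and reduce_cnst was part of Cook's own tooling. A tiny made-up example of the last two steps:
  expr : (a + b)^2 * sin(a + b) + cos(a + b) / (a + b);
  optimize(expr);
  /* => something like block([%1], %1 : b + a, sin(%1)*%1^2 + cos(%1)/%1),
        with the common subexpression computed once */
  fortran(f = expr);
  /* prints roughly: f = (b+a)**2*sin(b+a)+cos(b+a)/(b+a) */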
The only reason Cook went to the Lisp level in [3] was to reduce the cost of the factor routines (which is less of a concern now), and to make optimize work with lists of expressions (which it does now). One of the examples cites a need for an outrageous 500 million words of computer memory for one calculation, but that’s roughly half of what’s in the old desktop computer I bought used about a year ago (for about $100). The computing power available to the average amateur has come a long way since 1982.
Both Wirth and Cook assume that an arbitrary orthogonal coordinate system (like spherical, polar, cylindrical, Cartesian, etc.) will be defined. A slightly more modern approach is presented by Roache et al. [456]. They assume an arbitrary curvilinear coordinate system, which may not have analytical scale factors and metrics (i.e. they include the numerical grid generation task as part of the problem statement / solution procedure).
The approach demonstrated in [5] focuses on transforming a set of PDEs to arbitrary curvilinear coordinates described by Laplace’s equation. This approach couples the PDE solution and grid generation methods together. Modern approaches in the engineering world generally assume that the grid will be generated by a separate tool (the fully coupled approach can still be useful for adaptive or moving grids), though solving problems on a single, canonical grid seems to still be common in the sciences. The MACSYMA routines presented there are
  • change(): change variables; convert an arbitrary second order differential equation in nn variables to an arbitrary coordinate frame in the variables xi[i]
  • notate: atomic notation for derivatives
  • notation(exp,vari): primitive atomic notation
  • scheme(): introduce differences of unknowns
  • difference(u,f,exp): primitive differences; scheme and difference collect the coefficients of the differences and calculate the stencil of the solver and coordinate transformations
  • myFORTRAN(): write the FORTRAN code
A lot of this extra effort is avoidable now, because it is tractable to use Maxima’s knowledge of the chain rule, and the built-in indexed variables and pattern matching facilities.

Conclusion

After digging in to the old literature on generating code from MACSYMA, and trying out a few things with current (v5.20.1) Maxima, it seems like all the pieces you need to generate PDE solving code already ship with Maxima (not touched on in this post were the vector calculus and tensor capabilities that Maxima has as well). I kind of already knew this since I'd been generating solver code in an ad hoc sort of way already. Perhaps there is a place for building up a pattern matching rules database, and maybe boilerplate templates. That seems to be the area that the modern efforts are focused on (e.g. Scinapse claims that its rules encoding PDE solution knowledge make up half its 120 kloc and are the fastest growing part of the code-base [9]). The recipes presented here seem more like documentation, or a tutorial item, rather than a possible new Maxima package.

References

[1]   Wirth, M. C., On the Automation of Computational Physics, Ph.D. thesis, University of California, Davis, 1980.
[2]   Wirth, M. C., “Automatic generation of finite difference equations and fourier stability analyses,” SYMSAC ’81: Proceedings of the fourth ACM symposium on Symbolic and algebraic computation, ACM, New York, NY, USA, 1981, pp. 73–78.
[3]   Cook, G. O., Development of a Magnetohydrodynamic Code for Axisymmetric, High-β Plasmas with Complex Magnetic Fields, Ph.D. thesis, Brigham Young University, December 1982.
[4]   Florence, M., Steinberg, S., and Roache, P., “Generating subroutine codes with MACSYMA,” Mathematical and Computer Modelling, Vol. 11, 1988, pp. 1107 – 1111.
[5]   Steinberg, S. and Roache, P. J., “Symbolic manipulation and computational fluid dynamics,” Journal of Computational Physics, Vol. 57, No. 2, 1985, pp. 251 – 284.
[6]   Steinberg, S. and Roache, P., “Using VAXIMA to Write FORTRAN Code,” Applications of Computer Algebra, edited by R. Pavelle, Kluwer Academic Publishers, August 1984, pp. 74–94.
[7]   Cook, G. O., “Construction of large-scale simulation codes using ALPAL (A Livermore Physics Applications Language),” Technical Report UCRL-102469, Lawrence Livermore National Labs, October 1990.
[8]   Husa, S., Hinder, I., and Lechner, C., “Kranc: a Mathematica package to generate numerical codes for tensorial evolution equations,” Computer Physics Communications, Vol. 174, No. 12, 2006, pp. 983 – 1004.
[9]   Akers, R. L., Kant, E., Randall, C. J., Steinberg, S., and Young, R. L., Enabling Technologies for Computational Science, Vol. 548 of The Springer International Series in Engineering and Computer Science, chap. SCINAPSE: A problem solving environment for partial differential equations, Springer, 2000, pp. 109–122.
[10]   Hamming, R. W., “One Man’s View of Computer Science,” Journal of the Association for Computing Machinery, Vol. 16, No. 1, January 1969, pp. 3–12.
[11]   Cook, G. O., electronic mail communication, August 2010.
[12]   Fateman, R. J., electronic mail communication, August 2010.
[13]   Stanger, D., electronic mail communication, August 2010.
[14]   Houstis, E. N. and Rice, J. R., “Future problem solving environments for computational science,” Mathematics and Computers in Simulation, Vol. 54, No. 4-5, 2000, pp. 243 – 257.

Thursday, October 14, 2010

A little bit of Dayton in Paris

Found this in the Musée de l'Orangerie (whose main attraction is the Water Lilies).
Henri-Julien Félix Rousseau - Les Pêcheurs à la ligne
[1908 - 1909]
Looks like an early version of the Wrights' glider with the fixed double rudder. The French Aero Club at the time was trying to replicate the Wrights' early gliding results. The brothers made a trip to France in 1908 to show off the Flyer (powered, single steerable rudder), which really cemented their fame (previously the Europeans had been broadly and aggressively dismissive of their claims).
This is another case where the hardware preceded full understanding of the physics. The Wrights found in their bicycle and subsequent wind tunnel tests that the published empirical constants in the theoretical lift equations were pretty far off.


Just because the tower is cool (and GIMP is cool as well):

Monday, September 27, 2010

Only Opportunity

Happened upon this
The constraints imposed by the planetary ecosystem require continuous adjustment and permanent adaptation. Predictive skills are of secondary importance.
Hendrik Tennekes
 and thought of this
There is no security on this earth; there is only opportunity.
General Douglas MacArthur
Tennekes concludes with
From my background in turbulence I look forward with grim anticipation to the day that climate models will run with a horizontal resolution of less than a kilometer. The horrible predictability problems of turbulent flows then will descend on climate science with a vengeance.
I have his book; it is rather good.

Friday, September 24, 2010

Goodbye Blue Sky

Look mummy, there's an airplane up in the sky
over Rotary Park in Beavercreek.

Click this one to watch the prop go 'round:

Tuesday, August 3, 2010

No Fluid Dynamicist Kings in Flight-Test

This was a guest post over on Pielke's site.

Dr Pielke's Honest Broker concepts resonate with me because of practical decision support experiences I've had, and this post is an attempt to share some of those from a realm pretty far removed from the geosciences. All the views and opinions expressed are my own and in no way represent the position or policy of the US Air Force, Department of Defense or US Government. I am writing as a simple student of good decision making. My background is not climate science. I am an Aeronautical Engineer with a background in computational fluid dynamics, flight test and weapons development. I got interested in the discussions of climate policy because the intersection of computational physics and decision making under uncertainty is an interesting one no matter what the subject area. The discussion in this area is much more public than the ones I'm accustomed to, so it makes a great target of opportunity. The decision support concepts Dr Pielke discusses make so much sense to me now, but I can see how hard they are for technical folks to grasp because I used to be a very linear thinker when I was a young engineer right out of school.

My journeyman's education in decision support came when I got the chance to lead a small team doing Live Fire Test and Evaluation for the Air Force (you may not be familiar with LFT&E; it is a requirement that grew out of the Army gaming the testing of the Bradley fighting vehicle in the 1980s, a situation that was fairly accurately lampooned in the movie "Pentagon Wars"). The competing values of the different stakeholders (folks appointed by Congress to ensure sufficient realistic testing compared to folks at the service level doing product development) were really an eye-opening education for a technical nerd like me. I initially thought, "if only everyone can agree on the facts, the proper course of action will be clear". How naive I was! Thankfully, the very experienced fellows working for me didn't mind training up a rash, newly-minted, young Captain.

It's tough for some technical specialists (engineers/scientists) to recognize worthy objectives their field of study doesn't encompass. The reaction I see from the more technically oriented folks like Tobis (see how he struggles) reminds me a lot of the reaction that engineers in product development offices would have to the role of my little Live Fire office. A difficulty we often encountered was the LFT&E oversight folks wanted to accomplish testing that didn't have direct payoff to narrower product development goals that concerned the engineers. "What those people want to do is wasteful and stupid!" This parallels the recent sand berm example. The preferred explanation from the technician's perspective is that the other guy is bat-shit crazy, and his views should be ridiculed and de-legitimized. The truth is usually closer to the other guy having different objectives that aren't contained within the realm of the technician's expertise. In fact, the other person is probably being quite rational, given their priors, utility function and state of knowledge.

In my little Live Fire Office we had lots of discussion about what to call the role we did, and how to best explain it to the program managers. I wish I had heard of Dr Pielke's book back then, because "Honest Broker" would have been an apt description for much of the role. We acted as a broker between the folks in the Pentagon with the mandate from congress for sufficient, realistic testing, and the Air Force level program office with the mandate for product development. The value we brought (as we saw it), was that we were separate from the direct program office chain of command (so we weren't advocates for their position), but we understood the technical details of the particular system, and we also understood the differing values of the folks in the Pentagon (which the folks in the program office loved to refuse to acknowledge as legitimate, sound familiar?). That position turns out to be a tough sell (program managers get offended if you seem to imply they are dishonest), so I can empathize with the virulent reaction Dr Pielke gets on applying the Honest Broker concepts to climate policy decision support. People love to take offense over their honor. That's a difficult snare to avoid while you try to make clear that, while there's nothing dishonest about advocacy, there remains significant value in honest brokering. Maybe Honest Broker wouldn't be the best title to assume though. The first reaction out of a tight-fisted program manager would likely be "I'm honest, why do I need you?"

One of the reasons my little office existed was the "lessons learned" from the Tri-Service Standoff Missile debacle (all good things in defense acquisition must grow out of historical buffoonery). The broader Air Force leadership realized that it was counterproductive to have product development engineers and program managers constantly trying to de-legitimize the different values that the oversight stake-holders brought (the differences springing largely from different appetites for risk and priors for deception) by wrangling over largely inconsequential, technical nits (like tree rings in the Climate Wars). The wiser approach was to maintain an expertise whose sole job was to recognize and understand the legitimate concerns of the oversight folks and incorporate those into a decision that meets the service's constraints as quickly and efficiently as possible. Rather than wasting time arguing, product development folks could focus on product development.

The other area where I've seen this dynamic play out is in making flight test decisions. In that case though, the values of all the stake-holders tend to align more closely, so the separation between technical expertise and decision making is less contentious (Dr Pielke's Tornado analogy). In contrast to the climate realm where it's argued that science compels because we're in the Tornado mode, the flight-test engineers understand that the boss is taking personal responsibility for putting lives at risk based on their analysis. They tend to be respectful of their crucial, but limited, role in the broader risk management process. Computational fluid dynamics can't tell us if it's worth risking the life of an air crew to collect that flight test data. In that case there is no confusion about who is king, and over what questions the technical expert must "pass over in silence."

Monday, June 14, 2010

Ohio Aerospace Hub Road Project

Here's a map of the recently announced pork road development project to support our innovation hub:
View Ohio Aerospace Hub Roadwork Development Grant in a larger map

I'm sure this will be welcomed by the retailers on Brown St (now that they've successfully shuffled off their vagrant problem).

Friday, June 4, 2010

Falcon 9: Lift Off!

Falcon 9 launch successful on first test flight.  Wow!
First Stage In Flight

Successful Stage Separation
Second Stage Burn
Yeah, yeah, it's just a test.  But the cost of the entire development program for Falcon 9 (~$335M) is less than the cost of a single test flight for Ares I-X (~$445M)...

From my perspective (as someone who's sat in the hot seat conducting flight tests), the really impressive thing with the SpaceX operation was their ability to light the engines, auto-abort, and turn around a new countdown and launch before the end of their range time.  There was clearly a whole lot of work in the design phase leading up to the impressive execution today that made saving the mission possible.  Nice.

Sunday, April 18, 2010

updayton Summit: Struggles of an Organization Town

I recently attended the updayton summit.  I'm a newcomer to Dayton and was curious about this unique-sounding gathering.  My first impression on hearing the words Young Creatives (YCs) Summit was "what a pretentious sounding group".  I went with a list of preconceived questions (culled from this set of notes) that I wanted to try and answer through observation of how the summit was conducted and how the participants evolved their involvement.  My initial idea was to look for evidence supporting either a synergistic group process or emergent individual creativity at the summit.  I note my impressions and preconceptions upfront so that you'll understand my observations are not disinterested, though I tried to be 'minimally involved' and objective (with only limited success, the participants and facilitators were really nice and their optimism was infectious).  If you think I missed something significant please point it out in the comments. 
The updayton summit's main goal is to come up with annual projects that serve to excite YCs and will in-turn help Dayton retain recent college graduates.  The means used to achieve this goal were facilitated consensus generation and voting.  Attendees were split up by self-selected interest categories, and then further into several sub-groups.  My interest category was entrepreneur.  These sub-groups met independently at the beginning of the summit in breakout sessions to generate ideas and vote on their 'top two' options for projects.  After that, the results of all the sub-groups were collected by summit staff.  The attendees then went to panel discussions in their interest category, and then all met together in a Town Hall to vote on the final projects for the coming year. 

There were seven resulting top ideas in the entrepreneur interest category.  Six of these consisted primarily of websites.  Five of those websites were about mentoring for young entrepreneurs by established ones, entrepreneur support groups, information clearinghouses or some combination thereof.  Consensus building is certainly brutal in seeking out the lowest common denominator.  How to argue against something as innocuous and pervasive as a website in our networked age?  And what were the big success stories out of last year's summit?  Weeding and painting and, you guessed it, a web resource.  Remember, these groups met independently at the beginning of the summit.  No significant prior communication between sub-group members other than mingling at vendor booths while surfing swag. 

How does what I observed in the summit breakout sessions look in light of established thoughts on creativity (or lack thereof) in groups?  There are two superficially competing views of the group creative process.  There is the whole-is-more-than-the-sum-of-parts school popularized by Stephen Covey.  In opposition is the creative-acts-are-individual-acts school.  William Whyte's The Organization Man provides an extensive denial of useful creative genius in groups, and a call for renewed focus on the dignity and efficacy of the individual contribution.  The former I'll call the Synergy School, the latter the Solitary School.

Here's part of Whyte's criticism of the consensus building group,
Think for a moment of the way you behave in a committee meeting.  In your capacity as group member you feel a strong impulse to seek common ground with the others.  Not just out of timidity but out of respect for the sense of the meeting you tend to soft-pedal that which would go against the grain.  And that, unfortunately, can include unorthodox ideas.  A really new idea affronts current agreement -- it wouldn't be a new idea if it didn't -- and the group, impelled as it is to agreement, is instinctively hostile to that which is divisive.  With wise leadership it can offset this bias, but the essential urge will still be to unity, to consensus.  After an idea matures -- after people learn to live with it -- the group may approve it, but that is after the fact and it is an act of acquiescence rather than creation.

I have been citing the decision-making group, and it can be argued that these defects of order do not apply to information-exchanging groups.  It is true that meeting with those of common interests can be tremendously stimulating and suggest to the individuals fresh ways of going about their own work.  But stimulus is not discovery; it is not the act of creation.  Those who recognize this limitation do not confuse the functions and, not expecting too much, profit from the meeting of minds.


Others, however, are not so wise, and fast becoming a fixture of organization life is the meeting self-consciously dedicated to creating ideas.  It is a fraud.  Much of such high-pressure creation -- cooking with gas, creating out loud, spitballing, and so forth -- is all very provocative, but if it is stimulating, it is stimulating much like alcohol.  After the glow of such a session has worn off, the residue of ideas usually turns out to be a refreshed common denominator that everybody is relieved to agree upon -- and if there is a new idea, you usually find that it came from a capital of ideas already thought out -- by individuals -- and perhaps held in escrow until someone sensed an opportune moment for its introduction.
Togetherness

The scientific conference exemplifies Whyte's informational exchange meeting.  No one attends to make decisions (or vote with dots); the attendees are looking to share their work and learn about their colleagues' work.  These meetings are an important part of modern scientific progress.  This year's updayton meeting did have information exchange components, which I'll get to later.

The Synergy School might argue that the updayton breakout sessions can provide an opportunity for synergistic collaboration, where alternative solutions emerge that are better than any of the individual solutions brought by group members.  The Synergy School's three levels of communication are
  1. The lowest level of communication coming out of low trust situations is characterized by defensiveness, protectiveness, and legalistic language which covers all the bases and spells out qualifiers and escape clauses in the event things go sour.
  2. The middle level of communication is respectful communication -- where fairly mature people communicate.
  3. The highest level of communication is synergistic (win/win) communication.

However, the acknowledged goal of the updayton summit is 'stimulating like alcohol' and 'engagement' rather than synergy.  So, while I went looking for evidence of high-level cooperative action I should have paid closer attention to the marketing materials and lowered my expectations accordingly.  There was no effort at establishing group trust (we didn't even introduce ourselves at the start of the breakout).  We jumped right in to the scripted consensus process.  Low-trust communication among mature professionals leading to compromise (consensus) is the best we can hope for from events like these, and, unsurprisingly, that's exactly what we got.  This naturally raises the question: why bother?  If all we can reasonably hope for is second tier communication then why invest the effort?  The gist I get from a closer look at the promotional material is 'to get buy-in for the projects which will excite YCs to stay'.  Which, in a moment of cynicism, might strike one as rather manipulative.  Instead of manufacturing radiators, now we're manufacturing community-spiritedness.  We might not be able to offer you gainful employment, but you can volunteer to weed our sidewalks!

In support of the Solitary School's idea about the capital of individual ideas, the winning project from the entrepreneur interest category was the one option that wasn't a website.  The young man whose idea formed the core of this project said, "this is something I've been writing about for years".  Something he was clearly passionate about, something that he expended his individual creative effort to flesh out beforehand on his own, and subsequently pitch to the group.  The other options presented by the members of the group were relentlessly mashed into web-sameness by the gentle actions of the facilitators and the listless shrugs of individual acquiescence from well-meaning group members searching for common ground.  When a thoughtful member of the breakout session asked the only really important question, "how do you create an innovator?", his question was met with more shrugs around the room followed quickly by redirection from the facilitators.  Clearly that question cannot be packaged into a public relations project.

What about the skills sessions?  Surely these have redeeming aspects: the Solitary School would appreciate them as information exchange, and the Synergy School might appreciate them as 'sharpening the saw'.  The most interesting aspect of the panel discussions was the incipient frustration I observed in some of David Gasper's comments.  Roughly, "there are so many great resources for entrepreneurs in the Dayton region.  Why don't we have more entrepreneurs!?  Dayton needs more entrepreneurs."  Some of the resources mentioned by the panelists were Dayton SCORE, EntrepenuerOhio and Dayton Business Resource Center.  As Theresa Gasper observed, "People seem to want the information PUSHED to them, but then feel overwhelmed with all the information coming at them. No one seems to want to PULL the information – meaning, many don't want to search for the info."  This is consistent with the majority of "needs" identified in the entrepreneur breakout sessions.  These folks are looking for checklists, guarantees of stability and someone to tell them what to do.  In fact, one participant in my session thought that the biggest barrier to entry for entrepreneurs was the lack of the safety net offered by nationalized health-care!  If you were to ask me what the opposite of the entrepreneurial spirit is, I could not come up with a better answer.  Probably the opposite of the definitions the panel members gave of entrepreneur too:

  • some one who has put something of value to them at risk
  • some one with significant "skin in the game"
Dayton already has a tough time with entrepreneurial thinking because of its recent history as a factory town (far removed from the celebrated, early-industrial "great men").  In his article on a New Era of Joblessness, Don Peck identifies psychological work that points to an additional generational component contributing to this dearth of entrepreneurs,
Many of today’s young adults seem temperamentally unprepared for the circumstances in which they now find themselves. Jean Twenge, an associate professor of psychology at San Diego State University, has carefully compared the attitudes of today’s young adults to those of previous generations when they were the same age. Using national survey data, she’s found that to an unprecedented degree, people who graduated from high school in the 2000s dislike the idea of work for work’s sake, and expect jobs and career to be tailored to their interests and lifestyle. Yet they also have much higher material expectations than previous generations, and believe financial success is extremely important. “There’s this idea that, ‘Yeah, I don’t want to work, but I’m still going to get all the stuff I want,’” Twenge told me. “It’s a generation in which every kid has been told, ‘You can be anything you want. You’re special.’”

In her 2006 book, Generation Me, Twenge notes that self-esteem in children began rising sharply around 1980, and hasn’t stopped since. By 1999, according to one survey, 91 percent of teens described themselves as responsible, 74 percent as physically attractive, and 79 percent as very intelligent. (More than 40 percent of teens also expected that they would be earning $75,000 a year or more by age 30; the median salary made by a 30-year-old was $27,000 that year.) Twenge attributes the shift to broad changes in parenting styles and teaching methods, in response to the growing belief that children should always feel good about themselves, no matter what. As the years have passed, efforts to boost self-esteem—and to decouple it from performance—have become widespread.

These efforts have succeeded in making today’s youth more confident and individualistic. But that may not benefit them in adulthood, particularly in this economic environment. Twenge writes that “self-esteem without basis encourages laziness rather than hard work,” and that “the ability to persevere and keep going” is “a much better predictor of life outcomes than self-esteem.” She worries that many young people might be inclined to simply give up in this job market. “You’d think if people are more individualistic, they’d be more independent,” she told me. “But it’s not really true. There’s an element of entitlement—they expect people to figure things out for them.”

Seeking 'solutions' which enable this emerging neurosis, rather than healing it, is probably not the answer to a more dynamic Dayton. 


Please don't misunderstand my criticisms of this updayton process (or cooperation in general).  I am in agreement with both Covey and Whyte that our biggest challenges require innovative cooperation to solve. 
Our most important work, the problems we hope to solve or the opportunities we hope to realize require working and collaborating with other people in a high-trust, synergistic way...
Interdependence

Let me admit that I have been talking principally about the adverse aspects of the group.  I would not wish to argue for a destructive recalcitrance, nor do I wish to undervalue the real progress we have made in co-operative effort.  But to emphasize, in these times, the virtues of the group is to be supererogatory.  Universal organization training, as I will take up in the following chapters, is now available for everybody, and it so effectively emphasizes the group spirit that there is little danger that inductees will be subverted into rebelliousness.
Over and above the overt praise for the pressures of the group, the very ease, the democratic atmosphere in which organization life is now conducted makes it all the harder for the individual to justify to himself a departure from its norm.  It would be a mistake to confuse individualism with antagonism, but the burdens of free thought are steep enough that we should not saddle ourselves with a guilty conscience as well.
However, what Dayton lacks is not more resources from government, more focus on community, more committee meetings, or trendy bohemian culture to attract jobless hipsters.  In fact, if the attendance of the updayton summit is any indication, Dayton has no lack of optimistic joiners.  But coddling these agreeable, cooperative, and risk-averse Organization Volk is not the answer if what you are seeking is a flowering of 1000 new entrepreneurs in Dayton.  As Whyte argues, we lack a recognition that "[t]he central ideal -- that the individual, rather than society, must be the paramount end [...] is as vital and as applicable today as ever".  Lower the barriers to entry (taxes / zoning / regulation / government-subsidized competitors), and the passionate individuals uninterested in paternalism will exploit the opportunities that emerge to deliver for Dayton's future.
[
The winner of the 'best swag contest' was MetroParks with their D-ring key fob:
Yes. My keys are now a' swingan'...
]

Wednesday, April 14, 2010

Explosively Formed Projectiles: An Impact of Climate Change

This new ad campaign is quite terrible. Fear-mongering with future catastrophes is not enough. If we don't pass climate legislation, then it's like we're Killing American Troops.
The part about more powerful improvised explosive devices (IEDs) being used in Iraq, and explosively formed projectiles (EFPs) likely being imported from Iran, is accurate. The problem is that you don't have to have precision manufacturing to make pretty darn good devices. In fact, simply formed copper plates and modest amounts of explosive do just fine. So will passing that climate legislation to 'cure our addiction to foreign oil' save anyone from a terrorist's road-side device? Nope. Soldiers will still be in harm's way. They'll continue to drive down those same roads, but now they have the additional distinction of appearing as props in climate-politics theater.

Tuesday, April 13, 2010

2010 Young Creatives Summit: Get the Shirt

The 2010 Young Creatives Summit is imminent here in the shining Gem City!
You may forget all the folks you meet during the networking and breakout sessions since there'll be plenty of beverages at the after-party, but you'll always have the shirt!

Get yours while there's still time.

Thursday, March 25, 2010

Ohio Personal Income

The Dayton Business Journal has a recent article about incomes in Ohio falling less than the national averages. Here are some interactive graphs from Google public data on the topic:

Interesting how DC is such an outlier in these graphs; it's good to be king.

Here's one that's just interesting, not necessarily Dayton- or Ohio-centric (you can drag the labels around if it starts out too cluttered):
Rumors of the death of US manufacturing seem greatly exaggerated.