Friday, July 17, 2009

Steele's Hill-Grafton Hill Historic District

The "Statement of Significance (in one paragraph)" for the Grafton Hill historic district taken from the National Register of Historic Places nomination form:
The GRAFTON HILL HISTORIC DISTRICT is significant for both historical and architectural reasons. Architecturally, this relatively small area constitutes an excellent grouping of high style residences which date roughly from the 1880s to the early twentieth century and which display a remarkable degree of integrity. A number of them are among the finest of their style in Dayton. Of those neighborhoods within the city limits, the district's Queen Annes, Jacobethans, and Craftsman houses are unrivaled in scale and detail. Historically, the district is a reminder of Dayton's earliest suburban development, of the movement of population from the center of the city outward, north of the Miami River. The proposed district is part of the much larger area generally known as Dayton View and represents part of the first successful development of Dayton across the river. A Dayton View Historic District nomination was submitted to and accepted by the National Register earlier this year but included only that neighborhood west of Salem Avenue, a major thoroughfare which bisects North Dayton. Grafton Hill is located east of Salem, and while similar to its western counterpart in development pattern and architectural style, has always been considered a separate entity even though its development was concurrent with that of Dayton View west of Salem. Before this time, no successful expansion of Dayton, north across the river, had taken place. This expansion was secured after Dayton's 1913 Flood since Dayton View and Grafton Hill are on some of the highest ground in the city. Both before and after the Flood, this residential area was created by and for a generation of rising professionals and businessmen who were the leaders of their rapidly growing industrial city. Its houses reflect in their architectural variety the tastes of the latter nineteenth and early twentieth centuries.
Thus, the district remains today as a physical link to an important segment of Dayton's development history.

Here's a map with the approximate boundaries of the district. The houses listed in the historic nomination form are marked. The early, distinguished owners are noted in the place description in the map (click the marker). The Dayton Art Institute and Masonic Temple are marked in green.

View Grafton Hill: NPS Inventory in a larger map
The nomination form goes on to describe the two large structures at the south end of the district.
The early residents of this district continued to be among the entrepreneurs of the Dayton business and professional class. The district's dominance was further enhanced by the construction of the Dayton Art Institute (DAI) and the Masonic Temple on the section of the hill from which downtown Dayton can be viewed. These two structures were built in the district because of the social and economic prominence of the district and its inhabitants as well as the prime "viewability" of the location. Of equal importance is the fact that Mrs. Carnell, the Dayton benefactress responsible for the building of the DAI, and Mr. Charles Underwood, president of the Masonic Temple, were residents of the district at the time of the buildings' construction.

Here's the short Wikipedia write-up for the district.

Monday, July 13, 2009

Derivative of Unequally Spaced Points

I posted a while back on solving Burgers' equation on a moving grid. The same coordinate transformation techniques applied there can be used to calculate the derivative of unequally spaced data points. Just as in the moving grid application, the grid transformation takes the points in the physical space to the computational space. Once in computational space, the sky's the limit!

One way to do this would be just a simple, low-order finite difference for the first derivative, maybe using an average of two one-sided differences (which reduces to a central difference when the points are equally spaced). We can easily do better than that, though, since there are so many great, optimized transforms available in the FFTW library. For smooth functions (no discontinuities), spectral methods give results accurate down to machine precision with more than about 20 points.

The cost of using a grid transformation is that we have to calculate two derivatives and then multiply them together, and the cost of a spectral method is O(n*log(n)) instead of O(n) as in a finite difference approximation. In one dimension things are pretty easy; the chain rule shows us the two derivatives we need to calculate:
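Written out (reconstructing the equation that appeared as an image in the original post), with the physical coordinate x mapped to the computational coordinate ξ:

```latex
\frac{dy}{dx} = \frac{dy}{d\xi}\,\frac{d\xi}{dx}
```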

It is not straightforward to directly calculate the additional derivative, so we calculate it by inverting the derivative of the physical coordinate with respect to the computational one:
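Again writing out the equation shown as an image in the original post:

```latex
\frac{d\xi}{dx} = \left(\frac{dx}{d\xi}\right)^{-1}
```

In the code below this inversion is just the elementwise division dydx = dydxi / dxdxi.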

Using the discrete cosine transform from FFTW (Chebyshev pseudospectral method) is straightforward with F2Py and Python:

import numpy as np
# cs is the author's F2Py-wrapped module providing dct_diff()

n = 17 # number of data points
a = 0.0 # left side of the interval
b = 3 * np.pi # right side of the interval
alpha = 0.25 # parameter to scale the perturbation
dx = (b - a) / float(n) # this is the equally spaced delta x
half_interval = (b - a) / 2.0 # so the dct diff is properly normalized
n_perturb = 3 # number of perturbed sets to try

dx_perturb = alpha * dx * (np.random.random_sample((n, n_perturb)) - 0.5)

x = np.linspace(a, b, n)
dxdxi_perturb = np.zeros((n, n_perturb), dtype=float)
dydxi_perturb = np.zeros((n, n_perturb), dtype=float)
y_perturb = np.zeros((n, n_perturb), dtype=float)
x_perturb = np.zeros((n, n_perturb), dtype=float)

for i in range(n_perturb):
    x_perturb[:, i] = x + dx_perturb[:, i] # add in the random fluctuations

y = np.cos(x)
y_perturb = np.cos(x_perturb)
dxdxi = cs.dct_diff(x, half_interval)
dy_analytical = -np.sin(x)
dydxi = cs.dct_diff(y, half_interval)
# derivatives with respect to the computational coordinate:
for i in range(n_perturb):
    dxdxi_perturb[:, i] = cs.dct_diff(x_perturb[:, i], half_interval)
    dydxi_perturb[:, i] = cs.dct_diff(y_perturb[:, i], half_interval)

# now actually calculate the derivatives we're after:
dydx = dydxi / dxdxi
dydx_perturb = dydxi_perturb / dxdxi_perturb

The derivative of the physical coordinate with respect to the computational coordinate for the equally spaced case and a couple of cases with some random perturbations added:

The actual derivative that we're after:

This example shows the flexibility of the approach, but it also indicates the importance of smooth changes in grid spacing for accurate derivative approximations. Abrupt changes in grid spacing translate into noise in the derivative.

Sunday, July 12, 2009

Integrity of Technical Analysis

Along the lines of my earlier CFD Integrity post, Eric Weinstein's talk at a recent conference on economic crises addresses some of the problems that someone committed to scientific integrity has in dealing with decision makers. His main topic is a more rigorous connection between mathematicians / physicists and economists / decision makers. The connection, of course, is the similar mathematical tools both groups use, and the improved tools economists could gain by working more closely with physicists.

One of the problems that he talks about early on is that those highly committed to scientific integrity never even get invited to the table. The goals of decision makers, in an economic crisis for example, may not be focused on discovering the truth, but on enhancing "confidence" in the market. In these instances the normal devotion to transparency and brutal honesty is undesirable to the decision makers. Brutal honesty is rarely conducive to mollifying constituencies (at least in the short-term).

This is our challenge: to tell the truth to power with love. We have to maintain the commitment to scientific integrity described by Feynman when talking to the decision makers (laymen). But we have to do what it takes to be at the table. Fighting the stereotype of the arrogant, unapproachable scientist / engineer is part of the solution as much as improved technical tools.

In commenting on the benefits of closer connection of biological ideas with economics Weinstein states that lying actors don't necessarily get removed from the market auto-magically by some invisible hand: "It is the case that maybe all truth that we generate is initially constructed only as camouflage for the few lies we really want to tell." The illustration he uses comes from wasp pseudocopulation.

As he wraps up: "The barbarity of the market is unquestionable," echoes of the Law for Wolves.

Saturday, July 11, 2009

New Pizza for Historic Dayton

UpDayton wants to "discover what young people need in a region and how Dayton may be lacking." Well, it may be missing a big GM factory and a Fortune 500 company, but since New York Pizzeria Restaurant opened on E. Fifth, it certainly isn't missing some very good Stromboli delivered right to your door. The location is right on the corner of Fifth and LaBelle, and is a nice addition to the St. Anne's Hill Historic District. Now if you live in St. Anne's you can walk to Bomberger Park or a nice little pizzeria.

View New York Pizzaria Delivery Radius in a larger map

Here's the St. Anne's Hill Blog write-up. Their hours and phone number as reported in the Dayton Daily News:
Hours are 11 a.m. to 10 p.m. Monday through Thursday, 11 a.m. to 11 p.m. Friday and Saturday, and noon to 9 p.m. Sunday. For more information, call (937) 222-0321.

The pizza is good, but I highly recommend the Stromboli. I think most of the economic problems that Dayton is experiencing could be ameliorated if the local government would implement a progressive Stromboli stimulus package and first time pizza-buyers tax incentives.

Tuesday, July 7, 2009

JASSM bootstrap reliability

The Joint Air-to-Surface Stand-off Missile has been in the news lately. It's the same old story that's been going on ever since it was "fielded". A few missiles fail to function properly in testing, the Air Force says Lockheed-Martin needs to bring up the reliability or the program will be terminated. Lockheed-Martin agrees that more work needs to be done ("keep the money flowing") to improve the reliability, and they'll work diligently with their Air Force team members to serve the warfighter.

That's military-industrial complex business as usual, not that interesting. The neat part is that there's enough information in most of those press releases and news articles to do some resampling statistics on the reliability of the missile. So, in the interest of supporting "an alert and knowledgeable citizenry" (see the Eisenhower video at the bottom of this post) here are some bootstraps on JASSM reliability.

It is really easy to do in Octave using the empirical_rnd() function.

nboot = 10000; % number of bootstrap resamples
n_09 = 19; % number of tests, based on stated 79% success rate
f_09 = 4; % number of failures
nbins_09 = 14; % histogram bins for plotting
t_09 = ones(n_09, 1);
t_09(1:f_09) = 0;
boot_09 = empirical_rnd(t_09, n_09, nboot); % really easy to do bootstraps
reliability_09 = sum(boot_09, 1) / n_09;

The sample size n_09 and number of failures f_09 are based on the recent Reuters story:
Four JASSM missiles tested in November, January and February did not detonate on impact or had other problems, raising fresh questions about the program. But the missile still had a reliability rate of 79 percent, and was on track to reach the 90 percent goal, the Air Force said. --Reuters, 6 Jul 2009

So, according to the recent reports JASSM should work about four times out of five (the AF wants it to work nine times out of ten). Here's the "4 out of 5" reliability distribution (based on the bootstrap shown above) with 19 samples:

Back in 2004 JASSM had a claimed reliability of 76%:
We have had 29 launches of JASSM and we have a 76 percent success rate. --Judy Stokely, deputy of acquisition at the Air Armament Center

Which means it should work three times out of four. Here's a "3 out of 4" missile's reliability distribution with 29 samples:

The sampling distributions are too broad to resolve a change as small as the difference between 0.76 reliability and 0.79 reliability, so statistically it's the same missile now that it was back in 2004. In both distributions the desired 0.9 reliability lies out on the tail, i.e. you can't claim it's a "9 out of 10" missile with much credence.
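To see how much the two sampling distributions overlap, here's a quick Python sketch (my example, separate from the Octave script) that bootstraps both samples and checks how often a 2009 resample actually looks worse than a 2004 resample:

```python
import numpy as np

# Bootstrap the 2004 (22 of 29 successes, ~76%) and 2009 (15 of 19,
# ~79%) samples and compare the resampled reliabilities.
rng = np.random.default_rng(1)
nboot = 10000

t_04 = np.r_[np.zeros(7), np.ones(22)]   # 2004 test record
t_09 = np.r_[np.zeros(4), np.ones(15)]   # 2009 test record

rel_04 = rng.choice(t_04, size=(nboot, t_04.size), replace=True).mean(axis=1)
rel_09 = rng.choice(t_09, size=(nboot, t_09.size), replace=True).mean(axis=1)

# Fraction of resamples where 2009 looks strictly *worse* than 2004;
# a value near 0.5 means the data can't distinguish the two missiles.
p_worse = (rel_09 < rel_04).mean()
```

With these sample sizes the fraction comes out well away from zero, which is the "statistically the same missile" conclusion in numbers.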

A more interesting question is what sort of sampling distribution would you get from nine successes and one failure (the desired nine out of ten missile)?

With a sample size of only ten it would be pretty hard to tell the difference between a "9 out of 10" missile and a "4 out of 5" missile. Based on the press releases apparently a "4 out of 5" missile is unacceptable, but what level of confidence will the Air Force place on knowing that they have 0.9 reliability?
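A sketch of that last point (again my own Python example, not the post's Octave code): bootstrap a hypothetical 9-successes-in-10-shots sample and see how often a resample looks no better than the "4 out of 5" missile:

```python
import numpy as np

# Bootstrap a hypothetical "9 out of 10" test record and count how
# often a resampled reliability is at or below 0.8.
rng = np.random.default_rng(2)
nboot = 10000

t_goal = np.r_[np.zeros(1), np.ones(9)]  # 9 successes, 1 failure
rel = rng.choice(t_goal, size=(nboot, t_goal.size), replace=True).mean(axis=1)

p_at_or_below_80 = (rel <= 0.8).mean()
```

Roughly a quarter of the resamples from a genuine 0.9-reliability missile look like a "4 out of 5" missile or worse, which is why ten shots can't settle the question.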

Eisenhower on the military industrial complex:

Here's the Octave file with the code for doing the bootstraps.

Also, thanks to Michael J.T. O'Kelly's bootstrap.py for showing how easy it is to resample with replacement from an array in Python using SciPy.
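For reference, the core operation is a one-liner in NumPy; this is a modern equivalent of the resampling idea, not bootstrap.py's actual code:

```python
import numpy as np

# Resample with replacement from an array: the heart of any bootstrap.
data = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
rng = np.random.default_rng(3)
resample = rng.choice(data, size=data.size, replace=True)
```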