Friday, December 31, 2004

Leaving Las Vegas

When we were kids, an evening on the couch looking at the family slides was good for a hoot. You got to see yourself or your siblings or other relatives caught in grimacing facial gestures or bent over picnic tables. The occasional upside-down or sideways slide in the tray only added to the festive atmosphere.

OK, we were dull.

But one of the memorable chunks of my parents' slide collection was the set of pictures they took in Las Vegas when they lived there in the mid-50s. Those were the years when Sin City was still more than half a sleepy desert town, just beginning to flush full of mob money and sex and glitz. My dad was stationed in the Army near there. He was one of those GIs you see in the old "Dawn of the Atomic Age" newsreels. Their officers handed them sunglasses and said, "Here, watch this nuclear bomb go off." Somehow, he still had enough sperm left after the irradiation to produce three children.

The slides I remembered would make a tremendous historical resource. You'd see the mobsters' cars parked by the side of the main street, and a local high school band parading down the middle of it. When I went through the slide boxes recently, I kept an eye out for those. But I found very few shots from my parents' days in the West. Perhaps they'll still turn up, but when I asked my mom and dad, the best guess was that they got moldy and my dad threw them away.

Here are a few that I did find. Click on the picture for a bigger version.



My mom in the car, somewhere on the California coast, probably near Monterey.



The house they lived in in Las Vegas.



This is one of the few Vegas pictures I found in the box of slides. The Sands, back in the day.



Zoomed in on the entertainment billing at the Sands that day. Anybody have a clue who "Pages Bray" is? My parents used to just read the name and laugh.

Acquired Taste

Seeing family over Christmas pricked me with guilt over my long-neglected role as Clan Archivist. I've been given boxes of old photos and slides to arrange, catalogue, and store for posterity. Why posterity would give a damn about us I've yet to discover.

Most of the photographs I've seen many times over. But the slides are a surprise. Partly because my father stopped developing pictures as slides in the early 1970s, and partly because it's been 20 years since the last working slide projector in the family went where the woodbine twineth.

Recently I learned how to work my Epson scanner to make jpegs from slides, and that's what I spent the last week doing.

Among the revelations that sprang from the slide boxes is the design sensibility of my grandparents on my mother's side. I remember them well, but I had forgotten what they used to do to their houses. And now, as it pops up on my screen in lurid colors, I miss them. I look on their works and tremble with a combination of chagrin and delight. These people who made me had the utterly American combination of bombastic bad taste and the heart, and the cash, to indulge it.

I loved them dearly: they had big hearts and taught me many lessons. Her parents were immigrants; his grandparents were. They came from the streets, literally -- boarding houses and reform schools of Philadelphia. My Nana was orphaned as a child; my Grandpop worked himself up through the Navy Yard. They eloped to Elkton, Md., when she was 16.

By the 1930s, they had made it. And here's what they made of it. Click on these for larger versions, if you can take it.



This actually is a detail of a picture of my Nana and my Aunt Edna, her sister, circa 1950. I zoomed in on this table lamp and decoration.



Here's my mom and me in a bedroom in my grandparents' home in 1961. Note the padded headboard. And nobody should be allowed to do that to wallpaper.



That's little me, celebrating Christmas, 1962, seemingly oblivious to the murderous psychedelic red of the chair and the wallpaper. Acid trips couldn't hold a candle to my grandparents' house.



More wallpaper desecration. Here's a peek into their dining room, cropped from a picture of my Nana gazing up at the "money tree" (just visible on the right) that their neighbors gave them when they were set to move to Florida. It says something about your decorating when your neighbors get up a collection and give you money to leave town.



With the help of the "money tree" my grandparents retired to Boca Raton, Florida, in 1963 and got rid of all their hideous old furniture, replacing it with hideous new furniture. Somehow the big, loud old stuff had a kitschy warmth. But this .... They decorated the whole house in Louis-the-someteenth. Louis the Prima, I think. Here's the front of their living room. Note the textured paint on the ceiling: a huge spiral of scallops that personifies, more than anything else I've ever seen, the delightful experience of "bedspins."

Those two big-ass chairs? I have them now. They effortlessly combine tackiness and indestructibility. They're Bassett, I think; a solid make. Since I recently got married, Amy and I have an excess of stuff. I recently e-mailed my sister to see if there was anything she lacked in the way of home furnishings. Her response was, "I'm not taking those chairs!"



Here's the back of the living room, looking out toward the screened-in porch where we ate breakfast. Note the green shag carpet. Note Cardinal Richelieu's color TV set.



The couch. The wall. The mural. The horror. I don't know where they found this guy, but when my grandparents bought this house, they had some painter come in and cover selected wall space with some sort of vaguely Mediterranean scenes, all in one color of blue. It's the kind of decor you might see in a seafood restaurant that was routinely closed down for botulism.

Stingy? Think Again

Simmins tots up the private donations -- everyone from Bill Gates to Linkin Park -- and finds something like $127 million from U.S. citizens and government.

Don't forget, the people of the United States are the most generous people in the world. We don't expect our Government to spend our money for us. We take care of that ourselves.

Of course, as he adds in an update,

There are some things that only the United States Government can do. No amount of money sent to the Red Cross or the United Nations can dispatch a carrier battle group or seven water purification ships to the region.

The aid is going to get there. Sad to say, it will come too late for hundreds of thousands. Food, water, clothes already are within tantalizing reach of the worst-hit areas, but there's no next place to take them, no system to distribute them.

Thursday, December 30, 2004

Deep Impact

Between 1980 and 1999, scientists zeroed in on the Chicxulub crater in Yucatan as the 200-mile-wide entry wound in the case of the dead dinosaurs. (There are, however, doubts). An asteroid or comet slammed into the earth 65 million years ago and so disrupted the climate that it wiped out 70 percent of the species then living.

Since then, we've looked to the skies for The Threat. "Dinosaur" is a metaphor for the laggard, stupid qualities in the human race. The Police sang a song warning that we're walking in their footsteps. Hollywood movies like "Deep Impact" show the human species going the way of the triceratops, dispatched by the same bullet. Often a tsunami figures in this. Given the proportion of ocean to land surface on the planet, after all, a meteorite has a better-than-70-percent chance of hitting water.

And now, while we're looking up for death from the skies, Mother Nature pulls the rug out from under us. Calamity comes from beneath our feet.

The East Coast of the U.S., where I live, doesn't seem to be in danger of that kind of earthquake-borne tsunami. We're on the trailing edge of the continental plate, and the Atlantic is geologically a quiet place. But if you want a glimpse of what a tsunami might mean in this part of the world, there is a place to find it.



This map shows one of the largest craters ever identified on the Earth. The dark pink ring is the crater itself; the blue blob around it is the debris field of what was ejected out of it by the blast. It's not visible now; like Chicxulub it's buried under later layers of rock and sediment. But its story is told in a book called "Chesapeake Invader," by C. Wylie Poag, the geologist who pieced together the clues and found the crater.

One of the important first clues was the identification of the smaller "Tom's Canyon crater," which he found off the coast of New Jersey. Like the Chesapeake impact, it happened about 35 million years ago, and the two meteorites probably are related, perhaps parts of a larger body that fell apart before it struck the Earth. In imagining the effect of such an asteroid hit, Poag writes about the "super tsunami" it would spawn.

In theory, any large meteorite impact into the ocean would initiate a series of gigantic tsunami waves. The maximum estimated height of the waves at the impact site would equal the depth of the water there. The microfossil evidence indicated that the late Eocene water depth at impact was around 1,500-2,000 feet. This meant that the super tsunami would have started out being a quarter to half a mile high, but wave heights would gradually degrade as the tsunami moved away from the impact site. When the waves moved into shallow water near the shoreline, however, they would begin to feel bottom; this would slow down the wave fronts and cause their crests to rise again to forty times their open-ocean height. At the shoreline, then, the waves would have been hundreds, possibly even thousands, of feet high.

The waves would have hit the mid-Atlantic coast within minutes. "There the enormous hydraulic power of the churning waters would easily have stripped unconsolidated sedimentary beds from the floor of the continental shelf and from the coastal plains of Virginia, all the way to the Appalachian foothills. The powerful backwash of the waves would transport the scoured-out fragments back into the sea. Eventually, within a few hours or days, the seaward backrush would have piled most of the rock debris into a thick deposit" on the shallow seafloor.
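The arithmetic in that passage is easy to check. Here's a quick back-of-the-envelope sketch in Python; the water depths and the roughly 40-fold shoaling factor come from Poag's quote, but the degraded open-ocean wave heights are illustrative guesses of mine, not his figures:

    # Back-of-the-envelope sketch of the wave arithmetic in Poag's passage.
    # Depths and the ~40x shoaling factor are from the quote; the open-ocean
    # heights below are illustrative assumptions, not Poag's numbers.
    FEET_PER_MILE = 5280

    for depth_ft in (1500, 2000):
        print(f"Initial wave at impact site: {depth_ft} ft "
              f"(~{depth_ft / FEET_PER_MILE:.2f} miles high)")

    SHOALING_FACTOR = 40  # crests rise ~40x their open-ocean height near shore

    for open_ocean_ft in (10, 25, 50):  # assumed degraded heights, illustrative only
        shore_ft = open_ocean_ft * SHOALING_FACTOR
        print(f"{open_ocean_ft} ft in the open ocean -> ~{shore_ft} ft at the shoreline")

Run it and you get the quote's quarter-to-half-mile starting height, and shoreline waves in the hundreds to thousands of feet.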



This map, from Poag's book, shows the amount of debris piled up beneath the Atlantic coast from that tsunami backwash.

And that's just from the Tom's Canyon crater, not the much larger Chesapeake one. That bigger blast would have fried every living thing within 600 miles, though, so the tsunamis from it, however awful, would be something of an afterthought.

The climate was warmer then, the seas were higher and the East Coast shoreline was further inland. But the idea of a wave crashing and foaming inland as far as Washington, D.C., or Philadelphia is still an awesome vision.

Here's a good site to read more about the Chesapeake Bay meteorite.

Poag's book is not as good as it could have been. The writing is awkward and repetitive. He rightly focuses the narrative on the detective work and the unfolding of evidence that led him and others to find the Chesapeake crater. And he alludes to the scientific pecking order and turf jealousy that made him triple-check his results before he announced them. This part of the tale could have been given more depth, without much more text.

There's a fascinating, but unexplored, subtext to meteorite hunting, since it brings together astronomy and geology, which are as different as the Elves and the Dwarves in Tolkien. The boys who sit on the hills gazing out at the heavens take one path into science, and it's different from the path taken by the boys who clamber into caves and come home muddy at dark.

Poag and his partners are the men and women in work boots digging for oil and fresh water to pump up out of the rocks. But here they find the fallen star that the astronomers dream about. Ad aspera per astra.

Wednesday, December 29, 2004

Queensberry for Eggheads

Stephen Howe, an academic at Oxford, advertises his credentials to write about the Western debate over the Israeli-Palestinian conflict by boasting that his own writings on it irritate people on both sides.

That's a star-worthy resumé item. In his latest piece in Open Democracy, Howe takes a walk around the vicious academic disputes about Israel and the Palestinians, at Columbia U. in America and in various British institutions.

There is a good rule of thumb for social arguments, now applicable to almost any subject and circumstance. It goes simply: whoever first mentions the Nazis loses the argument.

Aye. And he offers this list of ground rules for civil discourse on heated topics:

  • Comparisons between countries or political processes are supposed to be precision tools, not bludgeons wielded against the nearest enemy. They should actually tell us something useful about one or both the things being compared.

  • It is in principle illegitimate to bring your opponent’s ethnic, national or religious background into the argument – unless of course he/she has explicitly and deliberately done so first.

  • Try to work on the assumption that your opponent is acting in good faith, unless he/she absolutely forces you to believe otherwise.

  • Don’t pretend to be offended or intimidated, when all that’s really happened is that your views have been challenged.

  • There is nothing morally praiseworthy about being simple–minded.

  • And above all – don’t mention the Nazis!

Tuesday, December 28, 2004

Monkey's Uncle

Modern science is religion-blind; it doesn't denigrate faith, it simply takes no cognizance of faith. It sets out neither to prove nor to disprove theology. Scientists debate and argue but they don't appeal to God or claim he loves limestone more than clay. There is no one chemistry book for Catholics, another for Hindus, another for Jews.

[In modern times the German and Russian scientific communities temporarily seceded, and for a time there was "German science" and "Soviet science," but that was forced on them by a warping pressure of authoritarian politics and cults of national or ideological identity.]

The religion-blindness is real, if the science is real. It's not like scientists simply put God in the next room and do their work aware that He is only a few steps away, so that they always somehow try to sculpt their facts and results into something not inconsistent with His Scripture.

When I see atheists and agnostics embrace creationism or Intelligent Design, in any sort of numbers, and defend it as passionately and persistently as born-agains do, then I'll begin to take it seriously as science. When some biologist or paleontologist who has never even heard of the Bible reads an Intelligent Design script and says, "That fits the facts better than evolution, and it explains them more coherently," then I'll pay attention.

59,000

Probably by the time I hit "publish post" the casualty toll will have risen again.

The tsunami pictures on the wire today are, if possible, even more heartbreaking than those of yesterday. The shock of the survivors begins to melt into raw grief, and the bulldozers begin to punch through to the worst-hit areas, with photographers behind them. Meanwhile the sea casually tosses back on beaches what it raped from them Sunday.

This is a worldwide calamity. The destruction spread across a ring of nations in Asia, but the victims include a Czech supermodel (injured, boyfriend missing), Lord Richard Attenborough's granddaughter (dead), and perhaps thousands of tourists from Europe, Australia, and North America. There are pictures of little tow-headed children from Sweden or Germany walking around Thai hospitals looking for their lost parents.

The Bombay author Suketu Mehta took note of the same thing in his op-ed article in this morning's Wall Street Journal, which, alas, I can only find behind a subscription wall. He admitted to taking the frankly cynical view that the "white" faces among the victims and survivors would involve the West in a level of compassion and urgency that it otherwise would lack.

I wish I could tell him he was wrong.

Intelligent Design

The blogosphere is debating the hot topic of Intelligent Design, which is the old Creationism dressed up in a 2004 wardrobe.

One of the important new books on this topic is "Why Intelligent Design Fails: A Scientific Critique of the New Creationism," by a pair of academic scientists. If you can't plunk down $40 for this just yet, there's a good overview of it here.

Needless to say, in this one I'm firmly on the side of Darwin's modern heirs. Prevailing science ought to be questioned, and probed for flaws and contradictions. But some people seem to hope that, if they just attack science in subtle enough terms, it will all just fall down and then we'll all go "back to the Bible" for a science textbook.

There's a fair degree of straw man in the usual creationist argument. Nobody reputable in the world of biology thinks complex systems in living things "just happened by accident." And the creationists know well that the simplistic argument has an advantage in a public forum. Flat-earthers used to be judged winners in public debates in the 19th century, even by people who knew better. All they had to do was say, "look around you; you can see it's flat." It's an appeal to common sense that can be refuted, but only by a long explanation involving mathematics.

As someone quoted in one of the Dover Township School Board stories said, "it only takes 5 seconds for a baby to throw up on your sweater; it can take hours to get it clean again."

Monday, December 27, 2004

Guess Whose Fault It Is?

You had to know it was coming. People who want to find a way to blame America for everything are going to show how that earthquake that knocked Sumatra off its foundation is the fault of Shrubbie McChimpler. Perhaps because they're a day ahead in the news cycle, the Australian tin foil hat brigade got there first.

It is a sad and grim reminder of how vulnerable we are to the force of nature. A pity our army is busy fighting America's immoral war when they should be providing assistance to the affected areas.

Thankfully, a blogger who goes by A.E. Brain struck back with a post that should stand as a pre-emptive fisking of all such stupidity.

We - and by that I mean those baby-eating bloodthirsty barbarians in the Australian military - have plans for dealing with natural disasters. We - and by that I mean us Evil Warmongering Boffins that support the military - even develop simulations and models to help the guys in uniform plan what to do. Unlike some SMH readers, we don't have a direct line to God, so we don't know when and where such catastrophes will occur. The same resources that could support an armoured infantry company operating round Mosul would also be useful for relieving natural disasters, and more importantly, there are plans so to use them. We can walk and chew gum at the same time, provided we don't over-commit ourselves. That's why we have so few troops in Iraq, and resisted the strong pressure from the USA pre-war to commit more in the post-war phase. The US understood this, and didn't make a fuss about us keeping a Strategic Reserve.

More importantly, we don't just write Idiotarian letters to the SMH decrying terrorism, we do something about it. We also don't just write factually-challenged letters to the SMH about the "force of nature", we do something about that too.

We do what we can, reflexive and limited immediate aid first, but we also figure out what's needed, think and research before acting. You save more lives that way, even if the wilfully ignorant of the chattering classes get into a lather because of it.

Like the War on Terror, we're all in this together. In cases like this, we don't worry about what stupid and insulting things various Malaysian government bigwigs have said about us recently, nor even whether today's victims in Aceh were slaughtering Christians and burning down Churches last week. When Mother Nature throws a tantrum, we save 'em all, and let God sort them out.

Tsunami Help

Another Link

Milblogger Mudville Gazette has posted a wrap-up, with links, to the various sites of soldiers, army medics, military chaplains, and others who were at the scene of the Mosul suicide attack and who blogged about it.

This is the key link in the process that will turn the blog world from a bubbling sea of opinion and observation to a pipeline of information that will supplement, or bypass, the main media outlets. What MG does here is what reporters and editors do at newspapers and TV stations: get the witnesses together, arrange them and their stories, and present the result to the public. The public doesn't have to go out in search of witnesses, individually, to learn what they saw and experienced and thought about it.

In each case, someone sitting in a room thousands of miles from the event is tying together the threads of the story of what happened in that time and place.

The difference is, in the case of what MG did here, about 10 percent of it is him linking them up, and about 90 percent of it is them talking. The framework is his, but the voice is theirs.

In a newspaper article, those proportions likely would be reversed. The framework, the selection of quotes (and pruning within them) and the overarching voice of the piece would be from the anonymous editor or rewrite man in a room thousands of miles from the event.

Which do you think the public cares more about? Which do you think informs it better?

Front Page News

Check out today's front pages from newspapers around the nation. The Indian Ocean tragedy is the major story, of course. Of 160 U.S. front pages presented there, 95 illustrate the tragedy with this picture: the woman holding her head in grief with the debris in the background. They made a good choice; the picture conveys something of both the material scale of the devastation and the human measure of suffering.

Dozens of pictures were available last night, when these pages were being put together. Some showed scenes of damage and debris, but frankly they were difficult to distinguish from hurricane photos. Some showed water in the streets, but that didn't convey the power of what evidently happened.

But check out the Anniston, Alabama, "Star." It's the very first paper on the top row (they are presented alphabetically by state, and alphabetically within each state). The "Star" chose one of the many pictures on the AP wire last night that showed people grieving over their dead children and relatives. As I scrolled through the Leaf desk, those pictures told of the tragedy in a way none of the others did. They were raw and real, and the number of bodies piled up told the story like nothing else. But we don't put pictures of dead bodies in newspapers. That's a double taboo broken by the "Star," because one of the dead bodies in the background is a woman with a breast hanging out. (You probably can't see that on the blow-up on this site, but I saw this photo on the AP wire last night.)

The only other paper on the Newseum's site that ran the full photo was the Santa Rosa, Calif., "Press-Democrat."

Anniston is a very good paper, considering the size of the community it serves. I was told by people who know such things that it's one of the handful of smaller papers that the New York Times likes to hire from (this was in the 1980s; might not be true now). I once was offered a job there as a business writer, so I can tell you it's a thoughtful operation.

They went with a very graphic dead-bodies photo, in color, outside.

Did they make the right call? Does it matter that probably none of the dead people in this photo, or their families, will ever read the Anniston "Star"? Would they run this picture if the victims were from Mobile, not India?

The Great Falls "Tribune" in Montana (bottom row, second page) ran the same photo that Anniston and Santa Rosa used, but cropped out three-fourths of it -- including all the dead -- and just printed the wailing, grieving survivor woman. Better choice?

A few other papers also used dead body pictures, sometimes as a smaller, secondary element in their page one story: The LA "Daily News" ran a powerful picture of a man holding a dead child. The Philadelphia "Inquirer" ran a photo of people pulling a dead woman's body from the sea. The Austin "American-Statesman" ran the same photo.

The Akron "Beacon Journal" ran a small and relatively sanitized picture of bodies lined up in a morgue. The Detroit "Free Press" showed a corpse in a big picture, but only the feet were visible.

Of the dozens of heartbreaking images I saw last night on the wire, a few stuck with me and probably will for a long time. They rarely made it into the newspapers I see on this site. One shot, of a mother grieving over two beautiful little girls, is in today's Knoxville "News-Sentinel." That's about it.

But perhaps the most poignant photo in the whole presentation of page ones on the Newseum site is a one-column picture in both the San Jose "Mercury News" and the Sacramento "Bee" (both on the first page of the site). The entire photo shows a couple grieving over their dead child on a beach in Cuddalore, India. It seems to be a version of this Reuters photo. The editors cropped it down to just the face of the father, wrung with anguish, and the hand of the dead child, which he holds close to his face.

War and Pacifism

Part of the reason Americans have a hard time talking sensibly to one another about the war in Iraq is that we lack the necessary vocabulary. What do I call someone who is a liberal Democrat, a lifelong opponent of imperialism, deeply mistrustful of the Bush administration and its corporate connections, yet who supported the invasion of Iraq to topple Saddam because it would end the suffering of Iraqis? To call such a person "pro-war," as I sometimes have done, feels false. Some readers have called me on that, and they're right. But what's the correct term?

One of the gifts my wonderful wife got me for the holidays this year (and, yes, in our case, they really are best described as "the holidays") was a book that's been on my Amazon wish list for a while: "Semi-Detached Idealists: The British Peace Movement and International Relations, 1854-1945," by Martin Ceadel.

The book was more than 30 years in the making. In it, I find a frame of reference for attitudes toward the Iraq war. That is, in describing the peace movement in a different time and place, Ceadel uses a vocabulary that can be applied to the current situation -- excluding outright traitors, profiteers, or people whose primary motivation in this is something other than the question of justice and war.

He describes two types of peace advocates, and three brands of people opposed to them. In the extreme wing are the militarists, who "glorify conquest on the grounds that it advances civilization."

What Ceadel calls (and, being British, spells) the defencists "hold that the incidence of war is minimized when all countries reject aggression and maintain national defences strong enough to deter others from attempting it." This is the si vis pacem, para bellum view, which probably has been the popular view among politicians and citizens in Britain and the U.S. over the long run for the past 200 years.

Between these two are the crusaders, who "believe that the first use of military force is sometimes necessary to achieve justice and thereby create the conditions for lasting peace." It's pretty clear which of these three classifications accurately describes the "neo-cons," and the other architects and supporters of the current U.S. effort in Iraq.

Crusading is an awkward word for this idea, though, because it conjures up specific images of the Christian Crusades of the Middle Ages. Those mass movements can be understood on several levels, many of which have nothing to do with the modern concept being described. Worse, "crusader" has particular unpleasant associations for non-Christians.

Yet it feels like the right word to Ceadel and other specialists on this topic, and it probably felt that way to President Bush when he used it as America began to change its foreign policies after 9/11. He was mocked for that, but in fact it is just the word many academic political scientists use among themselves. Bush and Bin Laden both at different times have called America's work in Iraq a "crusade." However darkly and differently, both are right.

Crusading is nothing new; it was expounded in the 1790s both by the liberal Tom Paine (who wanted Revolutionary France to conquer Britain and reform it) and the conservative Edmund Burke (who wanted Britain to conquer Revolutionary France and eradicate its bloody and dangerous regime). Throughout the 19th century, it took the lead from time to time in the British popular mind, usually expressed as a yearning to aid revolutionaries and freedom fighters against repressive empires. Lord Byron was crusading when he went to help the Greeks in their fight for freedom. The crusading spirit ran high again when the Hungarians (1840s) and the Italians (1850s) rose against the Austrians, and the Bulgarians (1870s) rebelled against the Turks.

Ceadel's topic is Britain, but he looks beyond it for comparisons. In his opening chapter, he outlines a rough prediction model to discover what is likely to be the dominant view of war and peace in a nation, based on its relative security and on the liberality of its institutions. Germany, with high insecurity, feudal structure, and historically illiberal religions, tended toward militarism. America is at the opposite pole, Ceadel writes, and crusading is one of its natural states.

The United States, however, was too secure and too liberal for the goal of the peace movement: its geo-strategic security was so great as to allow it to ignore the balance of power; and its model liberal constitution and puritan tradition, unchecked by a feudal elite, also contributed to a self-righteous approach to the international system which oscillated between the desire to take charge of it and the desire to wash its hands of it. American peace sentiment thus leaked away into crusading in moments of confidence and into isolationism in moments of disappointment. [p.22]

What unites the militarists, crusaders, and defencists, and opposes them to the peace movement, is that they do not believe the ultimate goal of abolishing war is a practical one. In the case of the militarists, neither do they think it desirable.

The peace movement adherents also come in more than one flavor, and their agendas sometimes clash:

There is a group, at times a majority, that "argues that the abolition of war will be achieved only by improving the structure either of the international system or of its constituent states and that until this has been achieved defensive military force may be needed to protect these reforms." To label them, Ceadel turns to A.J.P. Taylor's awkward term pacificist. It's only one short syllable removed from "pacifist," and the eye is at risk of reading the longer word as the shorter one, a problem Ceadel acknowledges by always printing pacificist in italics.

Whatever you call it, this group, while more practical than absolute pacifists, easily can get tangled in contradictions. Reading about it, I thought of the Quaker pacifists in the North who urged Lincoln to wade through gore to crush the Southern rebellion. Ceadel points to the way out of this ethical briar patch:

Only when pacificists have managed to link their support for war to the promotion of an evidently eirenic reform, such as the creation of a league of nations, have they been able to make their predicament as pro-war members of the peace movement seem less paradoxical.

Finally, the outright pacifist "believes that war can immediately and unconditionally be repudiated." This view has been, at all times, that of "a small but dedicated minority."

***

For centuries, in the Western Christian mind, war was an evil but unavoidable fact of life in a Fallen world. Peace movements in Britain began to take shape only gradually, after the 1730s. They were children of the British Enlightenment, with its remarkable marriage of evangelical Christian values and rational humanist ones.

The idea that war could -- and should -- be banished from human experience began to express itself politically in the 1790s, Ceadel writes, when Britain fought France. Interestingly, the first real pacifist movement arose when Britain went to war with Robespierre's tyranny, one of the worst in modern European history, which slaughtered its citizens and threatened its neighbors and gave the world's languages the fresh-minted word "terrorism."


The Next Four Years


But what should be the administration's Middle East project for the next four years? Post-Saddam Iraq is not a failure--as long as roughly 80 percent of Iraq's population is moving towards democratic governance, we're not failing. But it is certainly an awful mess. Clerical Iran, the bête noire of every administration since 1979, is advancing its nuclear-weapons programs and playing a favorite Middle Eastern parlor game--divide-and-frustrate the Westerners (the Europeans have enthusiastically abetted Ali Akbar Hashemi Rafsanjani, the clerical regime's major-domo and its most accomplished realpolitician). And even though Osama bin Laden's al Qaeda has so far failed to strike the United States again--a more severe visa policy towards Middle Eastern Muslim males has all by itself made tactical planning and operations inside the United States enormously difficult--Islamic holy-warriorism remains a ferocious menace. Muslim Americans have shown themselves highly resistant to violent Islamic extremism--if they had been as susceptible to bin Ladenism as European Muslims have been, we would likely have seen numerous attacks since 9/11 inside the United States. Young Muslim men could still, however, get infected by the ever-vibrant militancy coming from abroad. As long as bin Ladenism brews in the Middle East, the successful penetration of America's defenses remains an ever-terrifying possibility.

The Struggle for the Middle East, by Reuel Marc Gerecht.

Sunday, December 26, 2004

What would Kleisthenes do?

This is not a serious proposal, just a brain-tickler.

The Founders looked to classical models when they built the American political system. Sometimes I wonder what would happen if we continued that policy.

More and more Americans live in so-called "landslide counties", where one party or the other holds a thumping majority. Meanwhile Congressional districts are increasingly gerrymandered by the two parties to ensure that they will be "safe" for the incumbents.

Partly as a result of this, the political values systems of the core "blue" areas of the American map seem to be disconnected from the "red" ones. The people gather into the modern political equivalent of tribes, insulated in self-constructed media cocoons, and national politics become increasingly acerbic, emotional, and vulnerable to demagogues.

Ancient Athens had a similar problem, once upon a time. The Peisistratidai entrenched their tyrant dynasty by exploiting the regional conflicts between the tribes, with their individual religious customs, and the mutual hostility and differing values of the people of the coast, the interior, and the mountains.

The late 6th century B.C.E. statesman Kleisthenes forced a solution to this problem, and for this he is called the founder of Athenian democracy. His main achievement in government was redrawing the tribal map, which was something like the ancient equivalent of a congressional redistricting. But he did it in a way that amounted to a complete social reorganization of the Athenian state.

He set the number of tribes at 10, and he gave each a name and identity based on a god-hero. Then he divided the entire Athenian territory into electoral districts, and assigned all the citizens of each district to one or another of the tribes. But he set this up so that the chunks of turf of any tribe did not adjoin, and so that each of the tribes had a section in each of the three regions -- coast, inland, mountains.

This defused any chance of tribal barons building up a power base, and it forced the people to work together across geographical lines and local identifications.

A man's tribe and deme were hereditary, and he kept them even if he moved. The tribes were artificial ones at first, but they gelled because of the custom of having holidays and observances in common, and because men of one tribe now fought together in the same regiment (as Americans generally did, up through the Civil War), and because the Boule, the Athenian council or parliament, was to consist of 50 from each tribe -- elected by all the citizens of that tribe.

The system succeeded so well that it outlasted Athenian democracy itself, persisting into Roman times. It succeeded so well that modern historians have a difficult time reconstructing what came before it.

The American Founders had an abhorrence of Athens, as an unbalanced state and a case of democracy run amok. But we are more Athenian now than they would have liked, so Athenian models perhaps are proper ones for us.

So, imagine this: dissolve the current Congressional districts, and apportion them by population, but dispersed in at least three divisions, spread over varying regions of the country. Make one district out of, say, North Philadelphia, Key West, and the eastern third of Wyoming. The people in those places have to nominate and elect one Representative. To do that, they have to get to know and understand one another, to compromise and communicate.
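If you want to see the mechanics of that in miniature, here's a toy sketch in Python. The first piece of each district comes from the places named above; the other place names are invented for illustration, and a real scheme would also have to balance district populations, which this toy ignores:

    import random

    # Toy sketch of the Kleisthenes-style districting idea described above:
    # each composite district takes one non-adjoining piece from each of
    # three regions, the way each Athenian tribe got a section of coast,
    # inland, and mountains. Place names are illustrative, not a proposal.
    regions = {
        "urban Northeast": ["North Philadelphia", "Camden", "Newark"],
        "Southern coast":  ["Key West", "Gulfport", "Savannah"],
        "Mountain West":   ["eastern Wyoming", "western Montana", "rural Idaho"],
    }

    random.seed(2004)
    shuffled = {region: random.sample(places, len(places))
                for region, places in regions.items()}

    # Zip the shuffled lists together: one piece per region per district.
    for i, district in enumerate(zip(*shuffled.values()), start=1):
        print(f"District {i}: " + " + ".join(district))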

And before you dismiss it as mere trifling, pause to savor the image of Ted Kennedy campaigning in Kansas, or Tom DeLay trying to win voters in Maine.


Friday, December 24, 2004

Let Me Get This Straight ...

The story is horrific. But do you think maybe the terrorists in this case chose the wrong tactic to call attention to their cause?

Assailants claiming to be members of a revolutionary group opposed to the death penalty ambushed a bus filled with people bringing home Christmas gifts and killed at least 28 people ....

Jumbo Shrimp

Milblogger Rich at Beef Always Wins got his dinner dished up yesterday by the Secretary of Defense. Cool! If you think these visits are just photo ops, read on. It seems you'll get a better sense of the interactions from the soldiers than you would by relying on the AP reporting.

Come to think of it, I think I used to get my lunch dished out by Rummy, too. Either that, or the cafeteria ladies at Ardmore Junior High School bore a striking resemblance.

And Merry Mithras, Too

To friend and foe alike:

zhu dajia shengdan ji xinnian kuaile!

Which is Mandarin for "(I) wish everyone a merry Christmas and a happy new year."

Actually, what it says literally is "wish big-house holy-birth and [< 'come up to'] new-year pleased-joy."

[Or so says Marc Miyake in his fascinating linguistics-oriented blog]

Putting the "You" Back in "Yule"

Just when you're wondering why we need to keep reading Europeans, you remember that, when they come close enough to see us as more than their evil caricatures, they can put it all in proper perspective:

But, above all, the annual fuss about taking Christ out of Christmas misses the central point about the holiday season in America. This time of year captures, perhaps better than any other, the defining characteristic of Americans in the modern world — their lack of cynicism and scepticism, their enduring hope and faith in themselves, their country and even the world around them.

...

At other times, I can’t quite take all this American idealism and sentimentality. It is just a bit too much at odds with a complex world. As the country’s critics never tire of observing, it can lead to a little too much certainty and self-belief and a deficit of doubt and acknowledgement of error.

But, at this time of year, a bit of simple faith, a bit of uncynical joy and a bit of human hope induced by that unfathomable miracle that happened a couple of thousand years ago, is right on the mark.

Joyous Solstice to you all. I had a good one. Tomorrow -- actually later this morning -- we're getting married. It's a big secret, so don't tell anyone.

Despite what you read, by the way, Yule is not the name of the Anglo-Saxon winter holiday.

Thursday, December 23, 2004

The Poison, The Banned.

As blogs "grow up," they are evolving an ethic. It will be a knotty process, but in some cases we can learn from the experience of MSM journalists.

One question that came across the bow here at "Done With Mirrors" this week is about corrections. I had written a convoluted sentence in one post. It was just a bad sentence that I should have pulled apart and re-packed. In it, I used a phrase that could have been interpreted as, "I support everything up to, and including, X," when what I meant was, "I support everything up to, but not including, X."

That was called to my attention by a commenter. I went back in and changed the text. Did I do wrong?

I work in newsrooms. Corrections in the newspapers where I have worked are serious business. You want to put out the fire you've accidentally set, and squelch the false information before it spreads.

Errors are caught at every step of the process -- copy editors, proofreaders, even pressmen catch them. And you fix them. But some get through and make it onto the printing press.

Every night, some editor sticks around till 2 a.m. or so and gives the first run another look-through as it rolls off the press. If he catches an error, he'll open up the page on his computer, fix it, and substitute the corrected plate when the press stops for a roll change. If it's egregious enough, he can "stop the presses" and make the fix at once, even though this likely will make the paper late.

Sometimes, though, even this last line of defense fails, and the error doesn't come to our attention until the next day. In that case, you run a print correction in the next available edition. But in that case, some damage has been done.

The fear of errors is well-founded. I remember working for a newspaper that, in one article, confused the names of a local police chief and his brother, a trash hauler who was charged with illegal dumping. Though a correction was printed as soon as possible, the story with the wrong information was archived, as are all stories, and later reporters digging for background on the police chief sometimes would put into their stories the false information that he had been charged with a crime. Even attaching the correction to the clip of the original story in the archives didn't entirely prevent this. It was a nightmare, an error that just wouldn't die. And frankly, if that chief had had a good lawyer, he could have been the proud owner of a newspaper.

So intense is the desire to avoid making such mistakes, among honest journalists, that the newspapers I have worked for have a policy of "not repeating the error in the correction." In other words, if I write that someone lived at 1313 Mockingbird Lane, and it turns out he lived at 3131 Mockingbird Lane, when I come to write the correction for that, I'd write, "So-and-so lives at 3131 Mockingbird Lane. The address was given incorrectly in yesterday's edition." The wrong information has become taboo.

The Internet seems to present the same situation. People turn to it for research, and they find what is there. If someone googles "1313 Mockingbird Lane," they would find the incorrect name I listed for it, if my article was online. Even if I repeat the error in the correction (as probably should be done in this case, because of how Internet searches work), the searcher might see that later correction, or he might not.

Yet there's another side to this: When it comes to people's opinions, in many cases I do want to know what they've said, before they discovered they were wrong and patched it up.

When I used to run an editorial page, I was sorry I couldn't somehow print the letters exactly as they came to me, without the spelling and grammar standardized. They all look the same in newsprint, but one might have been done on company letterhead and the next in purple crayon.

Most blogs are a mix of opinion and fact. Does it matter to you that I wrote an ambiguous sentence? Does it matter that my fingers slipped and I left the "l" out of "public"? Does it matter that I might have dropped a negative I meant to use, and thus reversed the meaning of what I said? Simply tired, or a Freudian slip? You decide. Is hitting the "publish" button on a post a moment of no return, or can you go back in and make a fix five minutes later?

Any thought-out policy on corrections will have to balance the blogger's need to correct mistakes with the reader's right to detect tergiversation.

***

Anyway, the person who happened to call my attention to the error in this case is a blogger who is banned from this site, and that brings up another ethical issue. He earned that ban long ago, on a previous blog. He's a Chomskyite of the Inner Circle -- that is, one close enough to ask questions of and receive direct answers from The Master -- or so he tells me, and I have no reason to believe otherwise. He appears to have imbibed the nastiness of his Master along with his world-view.

[If you want a sample of what I'm sparing you from, you can find it here, in the site of this blogger who was kind enough to link to one of my posts, and paid for it by having Mr. Chomskyite track back the link and dump his stored-up flames in this fellow's "comments" section, along with the ubiquitous multiple links back to his own site.]

None of that is why he's banned. He's actually the kind of opponent who's useful if you're on the other side from him, because his tone is so odious that he can't help but alienate reasonable people. Voltaire said it best: "I have never made but one prayer to God, and a very short one: 'O Lord, make my enemies ridiculous.' And God granted it." [Letter to M. Damilaville, May 16, 1767.] My friend José, a gentleman from Spain whose views on many topics are closer to my Chomskyite commenter's than to mine, pegged him exactly: "There are two kinds of fascists: fascists and anti-fascists."

No, he's banned for:

1. Spoofing -- posting under names of real people who use the blog, as though the words he wrote were theirs.

2. Making threats against other commenters.

Those seem to me to be valid enough reasons for banning, and a good beginning of a list of ban-worthy offenses. But there could have been others. How about repeated posting of irrelevant comments that amount to little more than links back to his own site? Every blogger who begins to draw traffic probably recognizes this type: every successful system attracts parasites. But is it right to ban based on that?

How about a continuous stream of insults aimed at other commenters, which scares people who aren't spoiling for a fight away from commenting? Is habitual uncivil discourse cause for banning? Yeah, I know, it's a rough-and-tumble world, and free speech means you get your ass handed to you sometimes, but does the most caustic person in the room have to automatically set the tone for everyone?

It seems to me that opening a blog is like opening a tavern or a museum or some other semi-public space. You have an obligation to make sure that space stays as decent as you want it to, and not let it deteriorate into a monkey house, unless you want a monkey house.

A blog without comments is basically just a Web site, but a blog without some decency standard after a month will invariably look like a clogged toilet. Enforcing standards for comments is not restricting someone's speech. Comments on a blog aren't like letters to the editor. Letters to the editor are asymmetrical warfare. In this corner, Joe Six-Pack; in that corner, people who buy ink by the barrel.

But anyone can start a blog. (And eventually probably will.)

[UPDATE: 12/23, fixed typo in graph 19]

Wednesday, December 22, 2004

Netherlands 911


The Dutch minister for immigration and integration is Rita Verdonk, a woman, as it happens. In late November she went to the town of Soesterberg to speak about "Dutch values." There she was introduced to an imam named Ahmad Salam. He refused to shake her hand.

In the hours after [filmmaker Theo] van Gogh's death, Verdonk had given a speech that had drawn fire from a representative of the radical, Antwerp-based Arab-European League, who likened her to Hitler. ("All she was missing," he said, "was the little moustache.") But that wasn't what bothered Salam.

"I cannot shake hands with a woman," the imam explained.

"Well, then," Verdonk replied, "we have plenty to talk about."

From Holland Daze: The Dutch rethink multiculturalism, by Christopher Caldwell, in "The Weekly Standard."

Shadows of the West

"Fundamentalism Begins at Home" by Josie Appleton is part book review, part interview-with-author. I haven't read the book in question, just this walk-around piece. That doesn't meet the minimum requirement for pontification, but the topic is an urgent one and the author's argument seems worth considering.

After 9/11 the Koran became a bestseller in the West, as readers scoured the text for phrases that might explain the hijackers' actions. Some argued that violence is inherent in Islam; others said that Islam means peace. The 'understanding Islam' industry boomed, with debates, books and pamphlets professing to unearth the mysterious depths of Islamic culture, politics and history.

In Globalised Islam: The Search for a New Ummah, the French sociologist Olivier Roy criticises this 'confused' and 'sterile' debate. 'It is based on an essentialist view', he tells me, 'the idea that Islam is this or that. But you can find anything in Islam. The problem is not what is in the Koran, but what people think is in the Koran.' His concern is to look at the lived reality of Islam, rather than its canonical or historical background. For example, in the book he argues that the idea that Islamic suicide attacks are an attempt to win virgins in paradise is 'not very helpful. Why should Muslims have discovered only in 1983 that suicide attacks are a good way to enter paradise?'

Good points. The Quran, read selectively, can be used to justify all sorts of behaviors, some of them violently contradicting one another. In that, it's not unlike the Christian Gospels or the Old Testament. The question is, why do some people choose to read it in the most violent and destructive way possible?

Roy's attempt to answer this took him not deep into the mosques of the Middle East and the Wahhabist madrasas, but into the immigrant enclaves of Europe and America. "New-style Islam," Appleton writes, "... is strongest among Muslim immigrants living in Western cities. In fact, far from fundamentalist Islam being a Middle Eastern import into the West, it is increasingly the other way around."

She explains elsewhere in the article:

Most of the 9/11 ringleaders were 'born again' Muslims, who went to secular schools, had spent time in the West, and had cut themselves off from their families and communities. Judging by the documents they left behind, they had invented a bizarre set of religious prescriptions for themselves - instructions for the attacks included to 'wear tight socks' and 'blow your breath on yourself and on your belongings.' Such nihilistic violence cannot be understood in conventional religious or political terms - instead, it seems to be an individual's demonstration of the strength of their faith.

So far, this makes sense. The reflexive apologists for terrorism say it is a tragic but inevitable outgrowth of rage and despair. If the Americans and the Israelis didn't persist in picking on Muslims/Arabs, so this thinking goes, the Muslims/Arabs wouldn't lash back so violently.

But that runs into two problems: other peoples have been, and continue to be, degraded and dominated yet they do not form suicide bomber cults. And the 9/11 attackers almost to a man (and in many cases the Palestinian suicide bombers) came from relatively successful middle class families and were themselves educated and had the promise of successful career paths.

Instead, the suggestion being made in Roy's book seems to be that there is a serious "dislocation" that comes about when men and women plunge deeply into a religion while disconnected from its social context. V.S. Naipaul, too, has noted this characteristic of Islam, though mostly in examining non-Arab Muslim peoples. The source of the disconnection is different, but the result seems to be the same:

The disturbance for societies is immense, and even after a thousand years can remain unresolved; the turning away has to be done again and again. People develop fantasies about who and what they are; and in the Islam of converted countries there is an element of neurosis and nihilism. These countries can be easily set on the boil.

Karen Armstrong, too, in her own way has linked the rise of modernism and the triumph of secularized Western culture with the rise of fundamentalism in the three great monotheistic faiths.

God-stuff is potent. Religion is the insanity that keeps us sane; it is humanity's way of confronting darkness, death, and chaos -- in the world and in human hearts -- without falling into them. That part of religion is only safely practiced inside the firm walls of a community. And the great religions have grown up within communities -- whether Greek polytheism, Judaism, or Arab Islam. To step outside the humanizing web of community, but to practice intensely the religion, is to risk a Dionysian possession that makes what is most horrible into a holy virtue. E.R. Dodds described this 50 years ago in "The Greeks and the Irrational." In our lifetimes, David Koresh lived it out and died in it. So, it seems, did the 9/11 killers.

Near her conclusion, Appleton writes:

The new breeds of Islam are really just the shadows cast by the changing shapes of the West. Today, with the old political frameworks gone, the West is unable to furnish the ideologies to go along with the process of Westernisation. Islam is reached for as an age-old gel, to hold things together in a dislocated world. Iran is modernising in reality - the age of marriage is on the rise, as are female literacy rates - but in ideology it is going backwards, with the lowering of the legal marriage age to nine. Educated, well-off young men, with degrees and laptops, imagine that their box-cutters are the equivalent of seventh-century swords.

And this is where I am wary. It is possible that the violent reaction among some modern Muslims is in part a reaction to conditions in the West, and to the world-dominance of Western culture. That seems to me worth exploring.

But it is easy to take one step too many in that direction and say, "this is all the fault of the West." And now you're into Chomsky-land, and you've lost all decency and perspective. I'll have to go find Roy's book to see if he can walk that tightrope.


Tuesday, December 21, 2004

Speak of the Devil

Speaking of God, most of you probably know Voltaire's quip about the necessity of inventing Him if He didn't exist. The director's cut has some interesting context:

"I want my lawyer, my tailor, my servants, even my wife to believe in God, because it means that I shall be cheated and robbed and cuckolded less often. ... If God did not exist, it would be necessary to invent him."

The Great Big Coffee-House

Hugh Hewitt and Jeff Jarvis are talking about religion. Are secular elites bullying people of faith in America, or is the supposed secular coup a fiction in a nation that remains essentially Christian?

I'm following the conversation with interest. I tend to agree with Jarvis. There's a deep hostility to evangelical Christianity in many places in America (I work in one of them), but most such people have no design to eradicate Christianity. They just want to keep it at arm's length. Fair enough.

On the other hand, I do agree with Hewitt that there's a deep unfairness, and a destructive power, in the double standard that says it's OK for private persons and commercial media to spout despicable depictions of evangelical Christianity that rightly would be execrated if they were aimed at other faiths (or unbelief).

If you close one eye and look at America, you can see basic ideas of right and wrong -- a bulwark of a robust, free people -- eroding under attack from reckless and embittered relativists. Close the other eye and you see just as clearly blind fundamentalist morality on the march, threatening to enshrine an un-democratic Old Testament creed as the higher law.

Open both eyes, then, and see what Madison saw. The tension, the negotiation, the struggle for consensus and a common view, is the unscripted balancing act that keeps America safe.

But the best thing about this Hewitt-Jarvis discussion is what it isn't. So far, the focus isn't on government and religion. That's in there, but the posts are largely about We the People and our faiths and our joint ownership of the nation. I like that. Too often talk of religion in America presumes Americans are passive little leaves buffeted by the whims of a few fundamentalists in the corridors of Congress, or a few secularists on the 9th Circuit Court.

We're citizens, not subjects. Use common sense and talk out the problem, the way Hewitt and Jarvis are doing. If you wouldn't like someone to talk that way about your religion (or your philosophy), don't talk that way about theirs. That doesn't preclude honest debate, a good-natured joke, or hard questioning of other people's moral foundations. But it does take the wind out of bitter invective and a lot of pointless bickering.

If you don't want to bring Hawaiian volcano worshippers into your kid's classroom, don't try to bring the preacher into it, either. The rest is details. My suggestions? Put the manger scene on private property, not the lawn of the town hall. Let the middle school students and their parents organize the Christmas caroling. The more often the people, and not the courts or the government, work things out, the better the solutions will be.

George Washington, the practical plantation-manager among the learned Founders, often spoke about the political importance of religion. He did so in his "Farewell Address" (based on a draft by Hamilton), where he named it along with education and public credit as things productive of "public felicity." He was not talking about government-sponsored religion. He was talking about the people and their faiths. Plural. When it came to the government, Washington was no less a separationist than Madison and Jefferson. He had had first-hand experience with the problem -- or rather the twin problems -- of the people failing to accommodate one another's beliefs and the government's heavy-handed impositions.

As commander in chief during the Revolution, Washington outlawed New England regiments' "Pope's Day" buffoonery because it offended his Catholic soldiers. Politically correct? He had a war to win and he needed everyone. In 1777 he opposed a congressional plan to appoint brigade chaplains in the Continental Army. "Among many other weighty objections to the Measure," he wrote to John Hancock (then president of Congress), "it has been suggested, that it has a tendency to introduce religious disputes into the Army, which above all things should be avoided, and in many instances would compel men to a mode of Worship which they do not profess."

***

American courts overturn official public school prayer and city-sponsored manger scenes. They outrun popular opinion in many cases, but it's not the judges' fault. Separating the strands of civic religion from state religion has grown difficult because government has seeped into fields of American life that were, in Washington's or de Tocqueville's days, purely private or communal.

That, not judicial activism, caused the seeming retreat of religion from the civic realm in the United States in the late 20th century. The courts are just following where the government goes (usually with the invitation of the people), and making sure the government keeps to its constitutional rules.

Here's a story you may have seen in the news over the past few years. It unfolded in my hometown of West Chester, Pennsylvania. Because of a lawsuit, a 50-by-29-inch bronze Ten Commandments plaque on the courthouse has been "disappeared" behind a slab of blank plastic of exactly the same size.

I see that courthouse, and the fight about the place of religion on its wall, as a miniature model of the national debate: a tension between two ways of seeing America. On the one hand, America is a group of people ruled by a government that each of us wants to see as sharing our values but essentially impartial. (That itself is a tension.)

At the same time, Americans are a set of very opinionated people sharing the same public space. The small-town Pennsylvania courthouse has two historical identities. It is the seat of the legal system. But it also was the people's public parlor. Courts met there four times a year, for sessions lasting a few weeks. The rest of the time, the courthouse was everyone's property. West Chester was a town big enough to have a thriving social and political life, but too small to have a large meeting hall of its own. The courthouse served that purpose.

West Chester's main Baptist and Presbyterian churches grew from preaching done in the courthouse in the 1820s. The Chester County Horticultural Society showed off its prize vegetables in the grand jury room. Abolitionists met there, and anti-abolitionist mobs attacked them. Political parties nominated their candidates there. Private recruiters banged the drum there to rally men for the Civil War regiments.

Over the years, the citizens paid to beautify and improve the building -- or persuaded the commissioners to tax the rest of the county to do it. They rebuilt the courthouse in 1846, then expanded it, then expanded it again. They added a sundial and a clock; and in 1869 they erected a fountain in front with one spigot for people, one for horses, and a little trough at the bottom to dispense drinks for dogs.

None of this was accomplished without a tempest of public indignation. One person's philanthropic gesture is the ugliest thing another person has ever seen. And even when two agree on the ends, they rarely think alike on how it should be done. West Chester town meetings -- held in the courthouse, of course -- to resolve public issues were tumultuous: "Everything looked harmonious but the very first resolve brought down the whole house in a storm of opposition," one account from the 1850s reads. "Nobody was in favor of anything, and everybody was opposed to everything."

During 134 years of building, and rebuilding, and expanding the courthouse, nobody proposed posting a Ten Commandments plaque. The plaque arrived in 1920. It belongs to the civic face of the courthouse's identity, not its legal side. A committee of citizens, headed by a Bible class teacher, bought it. It was put up with the consent of the county commissioners, but the idea wasn't theirs.

It was erected at a time of great fear among the Protestant majority in middle America, which felt threatened by the tide of immigration from southern and Eastern Europe and the nebulous menace of atheistic Bolshevism. Darwin's teachings seemed to derail traditional Bible-based morality. None of this was explicit in the erection of the plaque. But the Ku Klux Klan played on these fears and claimed hundreds of members in West Chester in those years. There were intense fundamentalist revivals in the borough churches, under huge banners printed with: "Christ For West Chester: West Chester for Christ."

U.S. District Judge Stewart Dalzell, in his decision ordering the plaque taken down, wrote about separation of church and state. He wrote about the mistake government makes when it appears to endorse one religion over others.

What he might have talked about as well was the ever-shifting, never-ending American public discussion of our religions. The idea that atheists (or Hindus, or Buddhists, or wiccans) could come to that building seeking justice, but meet effrontery at the door, simply was not part of the public discourse in West Chester in 1920. It is so now.

New Favorite Thing (update)

Save Iran

Liberals Against Terrorism cites a Committee on the Present Danger policy report on Iran that has some recommendations for a U.S. approach to Tehran. LAT thinks it's a good list, and so do I:

  • Offer to reopen our embassy in Tehran
  • Step up cultural, academic and professional exchanges
  • Authorize American non-governmental organizations (NGOs) to operate within Iran
  • Arrange for young Iranian activists to attend civic campaign seminars in the U.S. and elsewhere
  • Engage agencies such as the CIA, FBI, and Drug Enforcement Administration with their Iranian counterparts on issues such as drugs and terrorism
  • Build a legal case against Khamenei and his associates for their financing of terrorists and human rights violations in order to build pressure for them to "return to the mosque" or face a possible international tribunal
  • Use "smart" sanctions to target assets of Khamenei and his associates
  • Provide up to $10 million a year to fund independent satellite television stations now broadcasting from the U.S. to Iran.

It embraces much of what eventually worked for us in Eastern Europe during the Cold War: part Reagan, part Willy Brandt. As Praktike at LAT writes, "It goes beyond the simplistic engage/isolate debate, and tries to do both simultaneously -- by isolating Khamenei and his circle while engaging the Iranian people and working to separate the security forces from the regime."

"Pave Iran" isn't viable (it never was). The Iraq project puts our army in their front yard, but it also weakens our military hand against Iran; the mullahs know they now can cause real problems for America for the cost of a suicide bomber's bus fare over the border. This will require the hard power-soft power rope-a-dope.


My New Favorite Thing

in the Whole World is this parody:

If Hemingway had written "The Night Before Christmas" (envisioned by James Thurber).

There was a long silence. I could hear the children moving again.

“Is Saint Nicholas asleep?” asked the children.

“No,” mamma said. “Be quiet.”

“What the hell would he be asleep tonight for?” I asked.

“He might be,” the children said.

“He isn’t,” I said.

“Let’s try to sleep,” said mamma.

This has been around forever, but somehow I missed it. And I count myself a devotee of the old "New Yorker." To lacuna is human. [Hat tip, Sullivan]
The Diplomad serves up a top 10 list of "allies we admire and appreciate," and I couldn't agree more. I won't steal all their thunder, but among the picks are a few places I like to beat the drum for:

Australia: The Aussies are always there when it counts. This is the only country in the world that has fought alongside the USA in every war over the past century. We don't even mind too much when they beat us out for some Olympic medals. We consider them Americans with accents; or is it the other way round?

Denmark: Always on the right side of every intra-European argument. Plus it's the only entry as a country in the Holocaust Museum's wall of honor -- the Danish royal family during WWII was one of the great class acts of all time.

Singapore: Dynamic business center and sensible, pragmatic foreign policy, with leaders who openly rejoiced over the US victory in the Cold War.

Italy: Berlusconi is one of the most colorful and gutsy leaders in the world. The Diplomads loved it when he insulted that leftist German European Parliament guy, comparing him to Sgt. Schultz in "Hogan's Heroes."

And their endorsement of Israel contains what could be a national motto: Pisses off the rest of the world even more than the United States.

Monday, December 20, 2004

Left Behind

Most of my adult life I've been on what is called, in our degraded political language, "the left." Now, I don't know what I am.

I believe in protecting the environment from rapacious exploitation, and I believe in an economy that encourages people who work hard and play fair, but with safety nets and protection for the little guys against the unscrupulous corporate predators among the honest business people.

I own a home in an urban neighborhood of mixed ethnicity. Every time I see another farm or wooded copse chewed up for a housing development, I grit my teeth. I led editorial battles to save farmland and woods from suburban sprawl. A $50 million bond issue to save open space in one county passed, probably, in part, because of editorials I wrote about it.

I've spent hours and dollars working to keep religious fundamentalists from taking over local school boards (a much more important job than simply bashing on Jerry Falwell). I've advocated for minorities and sick Vietnam veterans. I sought to vote for statesmen who would offer a generous foreign policy that shared America's good fortune with the world. I was in favor of Enlightenment virtues and freedoms in opposition to fundamentalist strictures and darkness, peaceful solutions over violent ones.

A "liberal" is someone who believes that change can be good, especially when it is guided by a spirit of free inquiry and a firm sense of what is right and it aims to increase human freedom and give people more opportunities to guide their own lives. A liberal believes people are basically good, and they can, and want to, make their lives better. It's a faith enshrined in Bobby Kennedy's quote (nicked from G.B. Shaw) about "seeing things that never were" and saying, "why not?"

A "liberal" is someone who believes the Enlightenment values enshrined in the constitutions of the Western democracies are true human values, not merely cultural artifacts. The West has no gift from god, and our citizens are not better than those of other lands, but I love my country because it has set up these principles as our collective guide and have committed ourselves to live by them, when right, and be corrected by them, when wrong.

I grew up thinking that, and I identified myself as a liberal.

What I saw as the opposition was ... well, everything opposite to this. It was many things: Hidebound religious orthodoxy, knee-jerk refusal to think and apply one's mind to political and social problems, insistence that any change only would make things worse. These attitudes often huddled under the label "conservative."

Like a lot of people raised in my generation, I was mistrustful of U.S. military power and of selfish nationalism. Like a lot of people, I recited the litany of "stupid American" stories and jokes. In those days, I regarded America as almost God-like in its invulnerability. Thus I naturally had a root-for-the-underdog identification with any people or group I saw as a victim of U.S. power. Like you'd slap a bad kid for kicking a dog: the slap won't hurt the child, but the kick could kill the dog.

Then I saw the reeking ruins in New York City. 3,000 dead -- people just like me, who probably told the same jokes and held the same views. Why dead? Because they were Americans. The edifice of the country shook, and it made me realize: this place is mortal, like any nation. Like the moment you realize that, someday, your parents are going to die, it changes you.

When I look at America, for all its flaws, against its enemies and all their purposes, I know which I prefer, which side gets my whole support. And when I look at the way the rest of the world reacted to us -- telling us we deserved it, still more frightened of us than of anything else, a world where a hot-selling book in France right now is called "50 Good Reasons to Hate Americans" -- I see the fruits of unrestrained America-bashing as clearly as I saw them in the ruins in New York when my son and I went up there a few weeks after the attacks.

Killing the Americans didn't start on 9/11. It is at least as old as the Palestinian hijackings of the '80s, when Americans were routinely singled out on international flights and beaten to death. It's a result of resentment of American power, you say? Very well: the Germans in the 1930s started killing the Jews not because they felt the Jews were weak, but because they were terrified of the supposed power the Jews had in the world.

I'm one of those who believes America is at war, and ought to behave like it, since Sept. 11. And after much studying and soul-searching, I came to the conclusion that the world probably, and Iraqis definitely, would be better off if the U.S. used its military might for once to remove a corrupt fascist who had been occasionally useful to us. He was our mess, largely, so it was our job to clean him out.

It strikes me as a decision a principled man could possibly make. But it doesn't strike my liberal friends that way. I understand their vexation, but it seems they can see only venality or psychopathia in people like me. And having once stood on the other side from them, and seen them in that perspective, I can't imagine going back to their camp (not that they are inviting me back).

I spent much of the '80s and '90s in active, public disputation with "the right." When I thought of "them" I pictured zealous, pious, ignorant, self-assured demagogues of crusading ideologies, inflexible mean men clad in expensive suits and cheap ethics.

Yet, as a small-town newspaper editor, the people I dealt with on the "right," with three or four odious exceptions, were fine and decent. The head of the local anti-abortion group was a soft-spoken young widowed mother of two. A school-prayer advocate was a cheerfully avuncular man who always asked about my son and would as gladly sit in my office and chat about the things we agreed on -- such as the genius of George Washington -- as the ones we didn't. The ex-mayor, a hardcore law-and-order cop, used to regale me with stories of law enforcement in the old days. I welcomed visits and phone calls from them.

I still hate SUVs and corporate malfeasance, executives who cheat retirees out of their hard-earned savings and foul the waters. I still think police should be held to a high standard in exchange for the power we grant them. I'm still a friend of freedoms and Enlightenment values, and an ally of whoever embraces them, in whatever place or culture. I reject the notion of school prayer as a panacea for society's ills. I think abortion is tragic, but a necessary evil. I applaud the idea of gay marriage, and would gladly leave it to the states to decide whether it should be so. I also think states should decide whether marijuana should be legal to buy, sell, own, and smoke. I think the government has no business censoring what we see on TV or do in our bedrooms.

In other words, I still disagree with my old enemies. But on more and more issues, I've come down on the side opposite my former friends. And I find myself in political opposition to many people and organizations I once supported.

On the whole, my old adversaries never forgot that their opponents were human beings. And thus they never stopped being human themselves. I wish I could say the same of the humanists around me today.

***

Possibly, all this is no deep matter. The evolution of a radical young man to a conservative middle-aged one is among the oldest of stories. Yet I feel neither "conservative" nor evolved. I still believe I'm upholding the values of my liberal youth, albeit in a different form. And like the aftermath of a divorce, I can't help re-examining my history on the left to look for incipient signs of a break-up.

"Gun control" is one such issue. I've never owned or fired a gun. I once held in my hand my great-uncle's .22 revolver from his days as a Pennsylvania Railroad conductor, but I'm not sure that gun would even fire. I've been to a shooting range once, to cover a police contest for a newspaper.

My grandfather on my father's side was an avid hunter, as were other men on that side in the late 1800s. I have photos and illustrations of them with rifles in hand. But that never got passed down to my dad, perhaps because his own father died before he had the chance. Probably it wouldn't have mattered. In our suburban existence, nobody talked about guns. It wasn't a gun culture.

So I came of age associating firearms with Christian enthusiasm, flag-waving patriotism, fondness for the military, and other irrational fixations of the right-wing loonies in this country.

I was of the "why would you need an AK-47 to hunt a deer" school of gun control. But back in the '80s I read the Village Voice, and back there among the naughty personal ads they ran Nat Hentoff's column. I read him regularly. And here was this Jewish intellectual from the city, with no more of gun culture in him than I had, teaching me to think of the Constitution, and especially the Bill of Rights, as a whole.

My commitment to freedom of speech was solid; anything this side of "shouting fire in a crowded theater," I endorsed. So, I set myself the task of devising an argument against the Second Amendment that wouldn't also involve, and constrict, the First.

I couldn't do it, of course. They are of a piece. Would you say that the framers of the Bill of Rights never imagined the destructive power of modern weaponry? Then neither did they imagine the reach and scope of the modern media -- visual as well as printed, and all the more powerful for its pretense of impartiality. Was their commitment to an armed citizenry based on an antiquated military model of a minuteman national army? Then so was their commitment to a free press based on a political system where newspapers served as the principal organs of party communications, something that hasn't been true in America since 1880 or so.

You don't need an AK-47 to shoot a white-tail deer, but neither do you need to dunk a crucifix in a piss-pot to make art. Guns kill people -- when people use them for that purpose. So do words. Or were we never serious about that bit about the pen being mightier than the sword?

So I gave up, and learned to accept the idea that some people grow up with guns and they're not survivalist freaks and they're no real danger to me. The gun problem in America -- and it is real -- is largely associated with urban crime. But until you can invent one set of rules for the black inner city, and another for the deer-hunting backwoods counties, you'll not solve it. The ever-clever Ed Rendell discovered the difficulty of that as mayor of Philadelphia. No state illustrates the dilemma better than Pennsylvania.

Later I got to know people in the South who had grown up in Atlanta suburbs that looked much like mine on the Main Line, but who had been taught to use and handle firearms, and used them for pleasure. And I actually envied them their Sunday afternoons blasting plastic milk bottles in the back yard. It sounded like fun. As for whether it would ever be a useful skill, as opposed to a pastime, that question got answered when my Marietta, Ga., friend ended up working in post-war Iraq.

I've still never owned or fired a gun. Perhaps I never will. By now, for me, it would be an affectation or a dilettante experience. But I've made my peace with that strain of the American right.

***

In my youth, during the Cold War, "left" and "right" generally stood for "communist" and "anti-communist." But this was a false dichotomy and I got an early education in that.

Twice, in the late 1970s, when I was a teen-ager, I lived in West Berlin and spent some time across the wall in East Germany. It was the most "conservative" place I have ever been. Nothing changed. Ever. No one experimented. It lacked color, even on a sunny day; no discos, no pool halls. The neon decadence of the Ku-damm in West Berlin might have been on another planet, not just across the wall. In the company of other students, I took a tour of historic sites in the East -- Potsdam, Frederick the Great's palaces. Our tour guide was an employee of the state. No doubt she was chosen particularly to lead this cluster of young Americans. Perhaps the bureaucrats thought they had picked someone to convince us of the virtues of the People's Republic.

A few of us, including our American teacher guide, spent a lot of time up at the front of the bus between stops, chatting with her. She was a matronly woman, to all appearances good-natured and honest. We probed her about life in the DDR. She said she would never want to live anywhere else. It suited her just fine. In upholding the virtues of her system, she said something I'll always remember: "when my children go out of the house, I don't have to worry about where they are."

At one of the palaces on this tour, we happened to pass a line of Hungarian students of about our own age (guided by their own government-supplied minder). They practically broke through the velvet ropes to get to us and pepper us with questions about life in America. They scrawled down addresses and pressed them on us. By the time our respective guides had herded us all on, we on the U.S. side got a clear impression of their restlessness and their hunger for a way of life we took for granted.

This was odd because, back in the U.S., all the anti-com-ya-nists I knew were grumps and blue-hairs who saw the Beatles and blue jeans as evidences of socialist corruption, and all the self-professed communists were layabout bohemians with "Che" buttons on their ratty army surplus jackets. It was easy to see which of them would have found life better in the Worker's Paradise of East Germany.

I didn't see at the time how much of the "liberal" view was simply an anti-American one. Many of the people advocating it didn't really care about Marxism-Leninism, except insofar as the idea of their advocating it pissed off their parents. Many of them also didn't really care about North Vietnamese or South Africans, except insofar as those people were shaking their fists at the company daddy runs.

Communism never attracted me, I'm glad to say. I skipped Marx and read Rousseau, Kropotkin, Godwin, Paine, Gandhi, Paul Goodman, that sort of thing. I decided I was an anarchist, or at least that description came closest to what I felt. I embraced the romanticism and somehow overlooked the silliness of it. You can do that when you're 18 and there's not a shooting war on.

In Europe, I also met Kurds. I met them in taverns and hostels in Nuremberg, because, for some reason, the small town of Fürth, near there, was a center for black market passports. They were refugees who had escaped ahead of Saddam's death squads after the U.S. had pulled its support from them. This was the moment Iraq shifted from Soviet satellite to U.S. client in containing the Ayatollah. These Kurds weren't bitter toward Americans. They understood war and politics and betrayal. They wanted to come to the U.S., too, to bide their time and live the life.

When I read about Kurdistan today, I wonder if any of the young men I met in Nuremberg in 1979 survived and are now among the leaders of that reborn land. I was on their side instinctively in 1979; I'm on their side now. An indigenous non-Christian tribal people, victims of decades of official repression, fascist attempts to eradicate their culture and literally wipe them off the face of the earth. Brutally murdered with the complicity -- at least -- of the U.S. government. This ought to be a no-brainer for a true "liberal."

But instead the liberals I know have no interest at all in the Kurds, because the Kurds made the unforgivable mistake of liberating themselves with the help of American military power. That makes them the bad guys, because the only indigenous people a modern liberal approves are those that burn American flags.

Sunday, Christopher Hitchens (in the NYT Book Review) pointed out that the true, best heir of the 1960s youth revolution is Vaclav Havel. Unlike the Western hippies, his revolution -- wrapped in blue jeans and non-violence and rock music -- really did overthrow a repressive, dour authoritarian state. Yet the heirs of the '60s in the West have little use for him. They cling to Castro.

In bidding farewell to the left, I find myself in interesting company. Among them is author and columnist Ron Rosenbaum, who wrote in his farewell letter:

Goodbye to a culture of blindness that tolerates, as part of "peace marches," women wearing suicide-bomber belts as bikinis. (See the accompanying photo of the "peace" march in Madrid. "Peace" somehow doesn’t exclude blowing up Jewish children.)

Goodbye to the brilliant thinkers of the Left who believe it’s the very height of wit to make fun of George W. Bush’s intelligence—thereby establishing, of course, how very, very smart they are. Mr. Bush may not be the sharpest knife in the drawer (I think he’s more ill-informed and lazy than dumb). But they are guilty of a historical stupidity on a far greater scale, in their blind spot about Marxist genocides. It’s a failure of self-knowledge and intellectual responsibility that far outweighs Bush’s, because they’re supposed to be so very smart.

Goodbye to paralysis by moral equivalence: Remind me again, was it John Ashcroft or Fidel Castro who put H.I.V. sufferers in concentration camps?

Goodbye to the deluded and pathetic sophistry of postmodernists of the Left, who believe their unreadable, jargon-clotted theory-sophistry somehow helps liberate the wretched of the earth. If they really believe in serving the cause of liberation, why don’t they quit their evil-capitalist-subsidized jobs and go teach literacy in a Third World starved for the insights of Foucault?

Goodbye to people who have demonstrated that what terror means to them is the terror of ever having to admit they were wrong, the terror of allowing the hideous facts of history to impinge upon their insulated ideology.

Goodbye to all those who have evidently adopted as their own, a version of the simpering motto of the movie Love Story. Remember "Love means never having to say you’re sorry"?

Goodbye to all that.


Purple Prose

When I was a Little Kid, growing up in West Chester, Pa., in the late 1960s, my parents used to get two daily newspapers. That's a story in itself (nobody does that nowadays). They got the big city Philadelphia Inquirer in the morning and the hometown Daily Local News (affectionately known as "Daily Lacka News") in the afternoon.

One day, I picked up the "Local" from their coffee table and saw a sight that made my 10-year-old eyes get big. Contractors had been tearing down the old landmark Mansion House hotel in town to make way for a bank and a parking garage. The job was about half finished, and the picture in the paper showed the jagged profile of the half-demolished building. And someone at the newspaper had pasted into the background of the picture Godzilla and King Kong, duking it out.

I've never written to him to confirm this, but I'm convinced this was the work of a young assistant city editor at the paper named Dave Barry.

He later moved on to more prominent, though no less sophomoric, occupations. And at one point in my journalism career I, too, was assistant city editor at the Daily Local News. I heard my share of Dave Barry stories from older co-workers who remembered him. He once irritated a very irritable reporter to the point that the reporter threw a typewriter at Dave's head, and was fired -- for damaging company property.

Did Dave Barry survive the '80s? I'm still not sure. He was a riot back then, and his column was a must-read. But then you caught on to his bag of tricks, which leaned heavily on the "misdirection play" sentence. You know, give all the cliche clues of being about to say one thing and then say something perverse. "I would never say anything critical about so august an institution as the New York Times until I was sure I'd never get a job there." That sort of thing.

But he got some laughs out of me again today with this piece, his formula for national reconciliation in the wake of the ugly election:

For example, a delegation from Texas could go to California and show the Californians how to do some traditional Texas thing such as castrate a bull using only your teeth, and then the Californians could show the Texans how to rearrange their football stadiums in accordance with the principles of "feng shui" (for openers, both goalposts should be at the west end of the field).

Or maybe New York and Kentucky could have a college-style "mixer," featuring special "crossover" hors d'oeuvres such as bagels topped with squirrel parts.

His conclusion (and the Samoa reference derives from what comes above): "Remember that no matter where we live - be it in a red state, or a blue state, or a Samoan state - we are all Americans inside. If we cut ourselves, we will all bleed the same color; and then, as Americans, we will sue somebody."

Can we at least agree that we all need a laugh at ourselves collectively, and this is pretty funny?

Sunday, December 19, 2004

Washington's Crossing

I bought David Hackett Fischer's "Washington's Crossing" to read about a battle that figures in local lore in the part of the world where I grew up, but I ended up reading one of the most penetrating books I've yet found on military history, history in general, and what Thomas Jefferson called "human events."

When historians delve into military details, the result sometimes ends up being what James McPherson called "more and more about less and less." Not here. Fischer tells the military story in rich and human detail. British soldiers and especially their Hessian coalitionists emerge from the shadows and numbers and stand forth as full-fleshed people. But the book is more than regiments marching. Along the way, Fischer helps explain such puzzles as the fit of slave-owning into Washington's ideas of liberty, or the tension between Yankee town meeting democracy and Southern aristocracy in the colonial army.

And he does his best to revive something we've largely forgotten: What made the American Revolution so different.

Several times in reading Fischer's prose I recalled Michael Moore's taunt that the Iraqi "insurgents" are not terrorists, but "minutemen." In other words, they are the moral equivalent of the American revolutionaries, and Moore predicted they would win and deserved to win, just as the American revolutionaries did. Typically, Moore left it at that and never bothered to back up his assertion.

But it's not hard to see what he meant -- to the degree that he meant anything but to be a rankling nuisance. Like all Moore's deceptions, there's a dusting of truth on it. The indigenous revolt against the superpower army from abroad faces the same range of challenges, the same tactical choices. The insurgents inevitably will make some of the same choices, in any generation.

But Moore's comparison is superficial. The Iraqi insurgents are like the American Revolutionaries in the same way the death pilots of 9/11 were like the airline pilots they stabbed to death to commandeer the planes. Fischer's concluding chapter explains why:

In 1776, American leaders believed that it was not enough to win the war. They also had to win in a way that was consistent with the values of their society and the principles of their cause. One of their greatest achievements in the winter campaign of 1776-77 was to manage the war in a manner that was true to the expanding humanitarian ideals of the American Revolution. ... In Congress and the army, American leaders resolved that the War of Independence would be conducted with a respect for human rights, even of the enemy. This idea grew stronger during the campaign of 1776-77, not weaker as is commonly the case in war.

It had been a year of disasters. The British routed the Continental army from Long Island, then captured New York City along with many prisoners. The redcoats next pushed George Washington back through New Jersey, waging an increasingly savage campaign not just against the Continental army but against the whole "Levelling, underbred, Artfull, Race of people" they found in America.

Yet early in 1777, John Adams wrote to his wife, "I know of no policy, God is my witness, but this -- Piety, Humanity and Honesty are the best Policy. Blasphemy, Cruelty and Villainy have prevailed and may again. But they won't prevail against America, in this Contest, because I find the more of them are employed, the less they succeed."

What they fought for colored how they fought. And here, too, the comparison with modern Iraq is instructive. The American revolutionaries had woven into their flag not just stars and stripes, but ideals of liberty, whether it was the learned political theorizing of Madison, the commercial common sense of Franklin, the town meeting democracy of New England soldiers, or the stoic self-discipline of Washington. Educated or ignorant, they built their cause around this quality, learned from their experiences as British citizens, and it informed their decisions on the battlefield.

Not all American leaders agreed. Others in Adams's generation believed, as do many in our own time, that America should serve its own national self-interest, defined in terms of wealth and power, and seek it by any means. But most men of the American Enlightenment shared John Adams's way of thinking. In the critical period of 1776 and 1777, leaders of both the Continental army and the Congress adopted the policy of humanity. That choice was reinforced when they learned that some British leaders decided to act differently. Every report of wounded soldiers refused quarter, of starving captives mistreated in the prison hulks at New York, and of the plunder and rapine in New Jersey persuaded leaders in Congress and the army to go a different way, as an act of principle and enlightened self-interest.

There were no Geneva Conventions in the mid-18th century, but every soldier and officer understood the customs of war, which were binding on their sense of honor as warriors. A wounded or cornered enemy could ask "quarter" from the other side, and there were standards for accepting it, or rejecting it. Plundering was universal, but if a house was occupied, and the owners did not resist, the proper plunderer always left the family enough to live on, and he did not take personal items.

There was no international bureaucracy to threaten a violator with a lengthy trial in The Hague, of course, but his own officers could order him summarily shot, which does count as a sort of deterrent. Or the bad behavior could invite like reprisals from the other side. Officers of the two armies in the Revolution traded hot charges across the lines when the system broke down.

Americans, unlike the British, generally extended the right of quarter to their enemies, even as they reacted with indignation at the British slaughter of wounded and helpless Continental soldiers. After the Battle of Princeton, Washington put a trusted officer in charge of the 211 captured privates with these instructions: "Treat them with humanity, and Let them have no reason to Complain of our Copying the brutal example of the British army in their Treatment of our unfortunate brethren. ... Provide everything necessary for them on the road." Hessian prisoners were so well treated that, once they had got over the shock of it, they could be sent from one holding place to the next without an armed escort. After the war, almost a quarter of the Hessians remained in America. Their names still dotted the phone book in Chester County, Pa., when I was growing up there.

Any large army is going to have in its ranks men whose better natures will unhinge in the stress of war. Horror and brutality will happen every time an army marches to battle, as sure as innocent civilians will be killed. If you can't accept that, better to be a thoroughgoing pacifist. At least it's an honest position. Better than pretending you didn't know. The job of a nation and its leaders, military and civilian, is to ensure the horrors are as few as possible, and the war crimes are exceptions.

The fact that there were many exceptions to the American ideal of 1776 -- especially in the case of loyalist legions and runaway slaves -- does not change the essential fact that the American leaders attempted not just to win, but to fight a war they could look back on with pride, and that would be a fitting birth to the nation they sought to make. And they largely succeeded. "The moral choices in the War of Independence," Fischer writes, "enlarged the meaning of the American Revolution."

The Iraqi insurgents, too, have their ideals: a terrorized and repressed people, rule by the gun and the knife, Ba'athist fascism and Islamist fanaticism. They, too, make their moral choices based on their ideals. Does anyone, even Michael Moore, imagine that their "victory," should that nightmare come, would be followed by a replay of Philadelphia, 1787?

As Fischer writes in his concluding paragraph:

[American soldiers and civilians in 1776] set a high example, and we have much to learn from them. Much recent historical writing has served us ill in that respect. In the late twentieth century, too many scholars tried to make the American past into a record of crime and folly. Too many writers have told us that we are captives of our darker selves and helpless victims of our history. It isn't so, and never was. The story of Washington's Crossing tells us that Americans in an earlier generation were capable of acting in a higher spirit -- and so are we.


***

Some of Fischer's best work is crammed into the 8-point type back in the appendices. There he gives a brief, broad-brushed, but insightful tour of "Washington Crossing the Delaware" -- both Emanuel Leutze's famous 1851 painting and the crossing incident itself -- as seen through the evolving eyes of Americans in one generation after another. As you might expect, there's a pendulum effect, with each generation to some extent reacting against the view of the one before, but at one point comes a perfect storm of negative convergence. Guess when that happened?

After discussing the "debunking" mood of popular history writers around the time of the U.S. bicentennial, Fischer turns his attention to that generation of academics. Their view of U.S. history still matters, because these men and women are dominant forces in academe and because their bile has informed many Americans now politically active.

A similar mood spread among a troubled generation of academic historians who were born in the baby boom (ca. 1941-57). They came of age in the late sixties and early seventies, when a youth revolution was bright with the promise of a new age. It was a revolution that failed in the era of Vietnam, Watergate, burning cities, and blighted hopes. A conservative revival followed. Republicans moved to the right, liberal Democrats shifted toward the center, and many on the left sought sanctuary in American universities as internal exiles from a society that turned away from them.

In the 1980s some of these internal exiles rejected all politics. Others increasingly called themselves American Marxists and predicted the coming collapse of capitalism. Then came the unexpected collapse of the Soviet Union instead, and the failure of Marxism throughout the world. It was a double disaster for the American left. The result was an angry generation of academic iconoclasts, disillusioned by the failure of radical movements, alienated from American institutions, and filled with cultural despair. When the light of their revolution failed, some of them could see nothing but darkness.

More than a few became historians. Some ex-Marxists became historical relativists who beat their dialectical swords into epistemological ploughshares, and rejected ideals of objective and empirical inquiry. They judged other works mainly by ideological standards of political incorrectness such as racism, sexism, and elitism. Any work with a positive tone about the United States was condemned as "triumphalism." Their writings expressed intense hostility to American institutions and alienation from the main lines of American history.

As his artistic exemplar of this period, Fischer chooses "George Washington Crossing the Delaware" by the artist Peter Saul, of the faculty of the University of Texas at Austin. "In vivid, clashing, Day-Glo acrylic colors, it shows a river crossing that has been reduced to chaos. Washington, his horse, and his men (all in tie wigs) tumble out of the boat into the river while American and British soldiers fire at each other in a battle on the ice. The values of Emanuel Leutze's painting are inverted as completely as the capsized boat."

***

Same medium, inverted values. Fallujah, during Abu Musab al-Zarqawi's reign, was no Philadelphia. On the whole, I'd rather be in Philadelphia.
