Tuesday, November 29, 2005

Speaking of Keepers

Mudville Gazette does yeoman work in telling the forgotten story of the war on terrorism that was being waged long before most of us woke up to the fact. This pairs up nicely with Marc's recent post (below).

There's a ton of stuff in here I had forgotten about, or maybe never knew. Among the jaw-droppers is this:

"In America, we have a figure from history from 1897 named Teddy Roosevelt. He was a wealthy man, who grew up in a privileged situation and who fought on the front lines. He put together his own men - hand chose them - and went to battle. You are like the Middle East version of Teddy Roosevelt."

... which was addressed to Osama bin Laden by ABC News' John Miller.

Tides and Times, Part II

American Future's Marc Schulman is back above ground after his second archaeological expedition into the morgue of the New York Times, to track the Gray Lady's editorial coverage of Iraq over a decade and a half. The second installment covers the period from G.W. Bush's election to the start of the war that overthrew Saddam.

It's well worth a read. Even Marc is surprised by what he found:

If the New York Times deserves to be vilified, it’s for its editorial stance and reporting after — not before — the U.S. invasion of Iraq. Memory can play tricks, and I’m forced to admit to this surprising conclusion, which follows from my review of every Iraq-related editorial published between Bush’s inauguration and the invasion’s start.

That review reveals no personal or institutional animus towards President Bush; in fact, on a number of occasions, the editors praised his efforts and policies. Only after war was staring them in the face did the editors hurl invectives in Bush’s direction. The Times was convinced that Iraq was a serious, though not imminent, threat and was willing to countenance the use of force. It was biased, but the bias was in favor of multilateralism, not against a Republican president. Furthermore, the Times didn’t reserve its criticisms for Bush. France was a frequent target for the editors’ verbal assaults; there’s not a single instance of praise for the French.

This appraisal does not mean that I’ve changed my mind and come around to the Times’ way of thinking. As before, I continue to believe that Bush’s decision to attack Saddam’s Iraq without Security Council approval was correct. My major criticism of the Times’ posture is that, despite condemning France’s intransigence, the editors implicitly concluded that having the U.S. bow to the French was preferable to acting without the approval of the Security Council. The Times was wrong to place multilateralism above the removal of a threat that it clearly and consistently recognized.


Fear of serious injury cannot alone justify suppression of free speech and assembly. Men feared witches and burnt women. It is the function of speech to free men from the bondage of irrational fears. -- Louis D. Brandeis, "Whitney v. California"

If there's one American virtue that has been unabashedly embraced by the permanently dissident domestic left, it's freedom of speech. It is the one thread of the national fabric they cling to as indispensable to their purpose, whether that is to live as eternal critics, or to genuinely bring down the society in the name of some utopian other.

Especially when that right to speak out is the right to slam and insult religious bigots.

Blasphemy is the word that the majority hisses into the ears of the few. Each church has accused nearly every other of being a blasphemer. The Catholics called Martin Luther a blasphemer and Martin Luther called Copernicus a blasphemer. Pious ignorance always regards intelligence as a kind of blasphemy. Some of the greatest men of the world, some of the best, have been put to death for blasphemy. After every argument of the church has been answered, has been refuted, then the church cries, "Blasphemy!" Blasphemy is what an old mistake says of a newly discovered truth. Blasphemy is the bulwark of religious prejudice. Blasphemy is the breastplate of the heartless. The Infinite cannot be blasphemed. -- Robert G. Ingersoll, from his summation in the 1887 case of a young man brought up for trial under an old New Jersey statute against blasphemy. Ingersoll argued the case for free. The jury found the defendant guilty; he was fined $25 plus court costs of $75, which Ingersoll paid.

The "Free Speech Movement" was a centerpiece of the anti-Vietnam War counterculture of the 1960s.

All free people, whatever political stripes we wear, ought to be proud of this Western ideal. Anyone who cares to stand up for what he thinks will, one day, find himself with only the parchment of the First Amendment between himself and a mob of trouble. The more minoritarian you are, the more often those days come.

Except I just did a search, including archives, at DailyKos for various combinations of "Theo van Gogh" and got nothing. Zero, zip, nada.

Van Gogh was murdered in the early morning of Tuesday November 2, 2004, in Amsterdam in front of the Amsterdam East borough office (stadsdeelkantoor) on the corner of the Linnaeusstraat and Tweede Oosterparkstraat streets. He was shot with eight bullets from a HS2000 (a handgun produced in 2000 in Croatia) and died on the spot. His throat was slit, and he was then stabbed in the chest. Two knives were left implanted in his torso, one pinning a five-page note to his body. The note threatened Western governments, Jews and Hirsi Ali (who went into hiding). The note also contains references to the ideologies of the Egyptian organization Takfir wal-Hijra. [Wikipedia entry]

So when did statements like this:

Standing up for free speech in the face of religious fanaticism should be automatic for anybody who understands the classical liberal principles upon which Western society was built.

become the exclusive province of the American right?



Oliver Kamm has a nice across-the-pond tribute to an American politician whose world-view was out of step with his times, but whose ideas seem more timely today than ever.

Senator Henry ‘Scoop’ Jackson was the best President America never had. He twice sought the Democratic presidential nomination, in 1972 and 1976, but proved unacceptable to his party owing to his views on national security.

After visiting Buchenwald shortly after its liberation, Jackson became an advocate for an interventionist foreign policy to challenge totalitarianism. But far from being a stereotypical proponent of state power, he was a disinterested advocate of human rights, believing that the spread of liberty was the key to Western security. Against the realpolitik of Henry Kissinger, he carried a Congressional amendment tying trade concessions to the Soviet Union to freedom of emigration. He opposed aiding apartheid South African-backed rebels in Angola. On economics, he was a New Dealer, urging what now appears a remarkable degree of state intervention. He was an early conservationist, and a supporter of civil rights. Personally, he was a man of unostentatious philanthropy. Politically, he exemplified the belief that strong defence was a bipartisan cause as well as a mainstay of liberal principle.

We Had to Do Something

Some of my anti-war friends argue that the American people were solidly behind the troops, and behind the president, for the invasion of Afghanistan that toppled the Taliban in 2001. But, they say, Iraq squandered all that unity to no purpose.

But I think we'd be at roughly the same point now in domestic politics, with or without the Iraq invasion. The anti-war movement starts slowly, but grinds away relentlessly. If Iraq had not given it the fodder it has fed on, media coverage of Afghanistan would have sufficed. And the erosion of support would have been accomplished, even if Saddam still sat in Baghdad cursing America, and his son still squatted in his rape-palaces on the Euphrates.

But I'll leave the "to no purpose" part aside for now.

It is true that popular American opposition to an attack in Afghanistan was much lower (10 or 12 percent), at the outset, than popular opposition to the attack on Saddam's Iraq, which stood at about 27 percent right before the war.

However, people worldwide were as opposed to a U.S. invasion of Afghanistan as they were to an attack on Saddam. Gallup International surveyed 37 countries in late September 2001 and found support for a U.S. attack on the country harboring the 9/11 terrorists only in the U.S., Israel, and India. Thumping majorities registered against the Afghanistan campaign in the United Kingdom (75%), France (67%), Panama (80%), Mexico (94%) and so on. One of the crucial contributors to the erosion of support for the Iraq war -- international opinion, the squandering of the good will of the world -- was in place before the first boots hit the desert from Kuwait.

Other key contributors would have fallen in place nicely. The outcry over Guantanamo mainly involves fighters captured in Afghanistan. The dismay over prisoner abuses in Iraq actually overshadows worse reports of abuse that have filtered out from Afghanistan. Substitute "Bagram" for "Abu Ghraib" and you can run the same headlines. The miscalled "insurgents" who flocked to Iraq from across the Muslim world after the U.S. invasion would have found their way as easily to Afghanistan.

With just a little time, the initial 12 percent-to-27 percent gap easily would have been overcome.

Here's an early dose of the poison, which was ready to circulate before anyone dreamed of a U.S. overthrow of Saddam. I clipped this editorial column from the San Francisco "Chronicle" of Oct. 17, 2001 -- little more than a month after Sept. 11, and when the Afghanistan campaign had just begun.

AS LONG AS WE STILL HAVE IT, I'm going to make the most of the First Amendment: What we are doing in, above, and to Afghanistan is short-sighted, counterproductive and immoral. That I am among a mere 6 or 10 percent of Americans (depending on the poll) who feel this way hurts my heart.

The amount of nonthink, or flat-out denial, that is required to support Operation Enduring Freedom is painful to contemplate. Sending thousands of kids -- "our brave men and women in uniform" -- to risk their lives for it is unbearable.

We Americans have never been known for critical thought and analysis. Context and historical perspective rank low on our national priorities list, somewhere below foreign language skills but above gas conservation. Add to that our deliberate myopia and chronic impatience, and you have the U.S. military trashing big chunks of Kabul, Kandahar and Mazar-I-Sharif in pursuit of a cave-dwelling, mass murderer and his worldwide band of suicidal disciples.

Damn the advice from seasoned experts on terrorism and the Middle East; full speed ahead with the cruise missiles.

After all, we had to do something.

That phrase. It has been uttered so many times since Sept. 11, I expect to see it printed on our currency any day now. People who call themselves pacifists, people who admit that they are uneasy with the destruction we are raining down on Afghanistan -- people who can't see how this frenzy of B-1's is actually going to get Osama bin Laden -- offer up the phrase as if it were a bona fide moral escape clause: We had to do something.

Lord, yes. We'd waited more than three weeks before we started dropping bombs. Such restraint. Why don't we at least cut the b.s., and own up to exactly what it is we are doing?

First, does the phrase "collateral damage" sound familiar? When Persian Gulf War veteran Timothy McVeigh used it to describe the 168 children and adults he murdered in the Oklahoma City bombing, we took it as proof of his evilness, as the justification that we needed to execute him.

What is it proof of when U.S. generals use it to describe the Afghan civilians that our bombs already have killed? How about the untold numbers who will die from hunger or disease on their way to refugee camps that can't take them?

Likely, because McVeigh shocked us with the term, "collateral damage" seems to have given way to a new euphemism. As Defense Secretary Donald Rumsfeld put it last week:

"There is no question but that when one is engaged militarily that there are going to be unintended loss of life." Lest anyone think him cold, Rumsfeld added, "And there's no question but that I and anyone involved regrets the unintended loss of life."

When U.S. civilians are killed, it's a travesty. When the dead are from someplace else -- especially a backward, poverty-stricken country such as Afghanistan -- it's regrettable.

Second, let's be honest about the blowback, the truly lethal, political time bombs that we plant with every payload among millions of mainstream Muslims in the Middle East and Asia. George W. Bush can insist that "the United States is a friend to Islam." How many regrettable losses of life do reasonable Muslims tolerate before they begin to doubt our friendship?

Without a doubt, after Sept. 11, we did have to do something, something that takes time, deep and true coalition-building and patient cunning. Instead, we've chosen to play into a mass murderer's hands and prove that our reverence for human life starts diminishing at America's borders.

Start from there. Leaven that into the American media every day. Repeat the application for four years. See if you don't end up with the current ugliness PLUS Saddam still laughing. It doesn't mean the abuses and the mistakes of the Afghan war are justified or excusable. But stop telling me everything would have been cool with these anti-war patriots if we'd just stopped short of Iraq.


People in Your Neighborhood

Mr. Jihadi is a person in your neighborhood, if you live in these Israeli villages. A fascinating, and chilling, photo essay from the Lebanese border with Israel, by Michael J. Totten. Good fences make good neighbors, but these fences don't look nearly good enough.

Monday, November 28, 2005

The Fisk Factor

Robert Fisk is a generation older than me, but he's nothing like the older generation of American journalists I trained under in this business -- with an unfinished novel manuscript in one drawer, a bottle of hooch in the other, a green eyeshade on their brows and somewhere on their anatomy a tattoo acquired in some Navy port under circumstances now, regrettably, hazy.

Fisk is the Middle East correspondent for the "Independent," beloved by the anti-war and anti-American lefts. He has a Ph.D. in political science from Trinity College. I doubt he owns an eyeshade or a tattoo.

He does, however, have a book. The Great War for Civilisation is a "survey of the Middle East in our time," according to the glowing review in the "Guardian," the "Independent's" spiritual sister on the British left.

Fisk is a hero to many contemporary journalists, especially to those for whom a distaste for Israel and America ranks in the top three chosen topics of conversation. He's like George Galloway with a byline. The difference is, Galloway is a politician. Fisk is a reporter. His articles do not appear in the opinion section. They appear on the front page. Yet he openly scorns the idea of journalistic objectivity.

Fisk doesn't believe in the concept, calling it a specious idea that, as practiced by American reporters, produces dull and predictable writing weighed down by obfuscating comments from official government sources.

"It's our job (as journalists) to challenge the centers of power, and to describe with our own vividness the tragedies and injustice and viciousness of the world, and to try and name the bad guys," he says in one interview. The "Guardian" review puts it like this:

His philosophy is "to challenge authority - all authority - especially so when governments and politicians take us to war". He quotes with approval the Israeli journalist Amira Hass: "There is a misconception that journalists can be objective ... What journalism is really about is to monitor power and the centres of power."

If it is proverbially a newspaper's duty to "print the news and raise hell," Fisk abuses the second half of the equation without earning it through the first. People who read him to learn what is going on in a particular place and time soon will throw the "Independent" aside in disgust, realizing that for their investment of time they've learned a good deal about what is going on inside Mr. Fisk's head and very little about what is going on outside it.

His factual errors are so common, and his conclusions from them so distorted, that he has joined the exclusive club of men like Quisling and Benedict Arnold, who have seen their names become common words -- a fisking is a point-by-point destruction of an argument that is built on a sandpile of flawed details.

Even the "Guardian," in its glowing reception of Fisk's book, feels compelled to close the piece with a long laundry list of its errors -- and these were merely the ones that happened to cross the consciousness of the reader who was assigned to write the review:

The book contains a deplorable number of mistakes. Some are amusing: my favourite is when King Hussein's stallion unexpectedly "reared up on her hind legs". Christ was born in Bethlehem, not Jerusalem. Napoleon's army did not burn Moscow, the Russians did. French: meurt means dies, not blooms. Russian: goodbye is do svidanya, not dos vidanya. Farsi: laleh means tulip, not rose. Arabic: catastrophe is nakba not nakhba (which means elite), and many more.

Other mistakes undermine the reader's confidence. Muhammad's nephew Ali was murdered in the 7th century, not the 8th century. Baghdad was never an Ummayad city. The Hashemites are not a Gulf tribe but a Hijaz tribe, as far as you can get from the Gulf and still be in Arabia. The US forward base for the Kuwait war, Dhahran, is not "scarcely 400 miles" from Medina and the Muslim holy places, it is about 700 miles. Britain during the Palestine mandate did not support a Jewish state. The 1939 white paper on Palestine did not "abandon Balfour's promise" (and he was not "Lord Balfour" when he made it). The Iraq revolution of 1958 was not Baathist. Britain did not pour military hardware into Saddam's Iraq for 15 years, or call for an uprising against Saddam in 1991. These last two "mistakes" occasion lengthy Philippics against British policy; others may deserve them, we do not.

So which do you prefer? Journalists who concentrate on getting the facts right, or those who concentrate on "challenging authority?" Do you want your newshawks to be telling you the way the world is, or the way they think it ought to be? Do you want to rely on the newspaper written by the reporters who know how you should vote, but don't know a stallion from a mare?

You might as well stop to ask all this, because the evolution is underway, whether you asked for it or not. And this time you don't have to rely on an anecdote from my newsroom as the most direct evidence. Just consider the awful coverage of Hurricane Katrina.

From a journalistic point of view, the root causes of the bogus reports were largely the same: The communication breakdown without and especially within New Orleans created an information vacuum in which wild oral rumor thrived. Reporters failed to exercise enough skepticism in passing along secondhand testimony from victims (who often just parroted what they picked up from the rumor mill), and they were far too eager to broadcast as fact apocalyptic statements from government officials ... without factoring in discounts for incompetence and ulterior motives.

...[T]ruth became a casualty, news organizations that were patting their own backs in early September were publishing protracted mea culpas by the end of the month, and the reputation of a great American city has been, at least to some degree, unfairly tarnished.

Yet Dan Rather called the Katrina coverage "one of television news' finest moments," because, in a phrase worthy of Robert Fisk, the networks "were willing to speak truth to power." But it seems that, like Fisk, they were willing to be gulled by the authority figures who suited their ideologies for the sake of hurling verbal grenades at the ones they despised.

One consequence of all this activist journalism is that the media, more than ever, becomes part of the story. When it claims to merely observe, the media ignores Werner Heisenberg's observation about physics: By observing you change what you observe -- which is as true of current events as it is of particle accelerators. And the more activist the media, the more true it is. Just look at how al-Jazeera's reporting changed U.S. policy in Fallujah, or how the media coverage changed the situation in New Orleans:

... The information vacuum in the Superdome was especially dangerous. Cell phones didn’t work, the arena’s public address system wouldn’t run on generator power, and the law enforcement on hand was reduced to talking to the 20,000 evacuees using bullhorns and a lot of legwork. “A lot of them had AM radios, and they would listen to news reports that talked about the dead bodies at the Superdome, and the murders in the bathrooms of the Superdome, and the babies being raped at the Superdome,” [Maj. Ed] Bush [public affairs officer for the Louisiana Air National Guard] says, “and it would create terrible panic. I would have to try and convince them that no, it wasn’t happening.”

The reports of rampant lawlessness, especially the persistent urban legend of shooting at helicopters, definitely delayed some emergency and law enforcement responses. Reports abounded, from places like Andover, Massachusetts, of localities refusing to send their firefighters because of “people shooting at helicopters.” The National Guard refused to approach the Convention Center until September 2, 100 hours after the hurricane, because “we waited until we had enough force in place to do an overwhelming force,” Lt. Gen. H. Steven Blum told reporters on September 3.

“One of my good friends, Col. Jacques Thibodeaux, led that security effort,” Bush says. “They said, ‘Jacques, you gotta get down here and sweep this thing.’ He said he was braced for anything. And he encountered nothing — other than a whole lot of people clapping and cheering and so glad that they were here.”

There never was a golden age of American media, from an ethical point of view. Total objectivity is impossible -- Fisk is right that far. Only a god can see like that. But, like any moral virtue, it is meant as a goal, a steadfast purpose, a lodestar. Something you reach for, not a place where you claim to stand. You set your feet toward it, and it keeps you on the right path.

There never was a golden age of American media. The newsies always embodied, more or less, the blind spots and passions of the wider society. But they did great work, and they wrote stories that did more than just eat away at truth like an acid. They could tell you a battle story, for instance, that included the grueling grunt work, the heroic moments, the tragedy of the carnage -- and the big picture.

The purely negative reporter, a Fisk, obsessed with tearing down the powers that be and setting the world right to his vision, will never give you that final quality -- what was bought at the price of the sacrifice. For him, that's all a lie unless it serves his utopian ideal.

Some of the most effective writing done online in recent years has been parodies of modern media styles that work by applying them to historical news events. There's even a video -- Fahrenheit 1861.

If you want to read the sour wine of new journalism in the old bottles of history, consider V.D. Hanson's imagined editorial by the modern "New York Times" in the wake of D-Day:

The unfortunate slaughter of the last month and the present quagmire in the hedgerows are the unfortunate wages of a certain American arrogance — that we can always simply go where we wish, count on locals to admire us, and see the world in terms of black and white, of “good” Americans and “bad” Germans. As we saw last month, simplistic logic leads to careless planning that in turn results in thousands of dead and wounded Americans on a stormy beach and the survivors huddled a few miles away in a hostile countryside that shows no desire to be “liberated.”

Which could be Fisk, again, except with him it would be a front-page leader, not an editorial.

Or this piece, printed by Marc at "American Future," imagining the modern media covering the Battle of Midway:

Midway Island, perhaps the most vital U.S. outpost, was pummeled by Japanese Naval aviators. The defending U.S. forces, consisting primarily of antique Buffalo fighters, were completely wiped out while the Japanese attackers suffered few, if any, losses. In a nearby naval confrontation, the Japanese successfully attacked the Yorktown, which was later sunk by a Japanese submarine. A destroyer lashed to the Yorktown was also sunk. American forces claim to have sunk four Japanese carriers and the cruiser Mogami but those claims were vehemently denied by the Emperor’s spokesman. The American carriers lost an entire squadron of torpedo planes when they failed to link up with fighter escorts. The dive bombers had fighter escort even though they weren’t engaged by enemy fighters. The War Department refused to answer when asked why the fighters were assigned to the wrong attack groups.

All of it is exactly right. These same facts built up, in another hand, into Walter Lord's "Incredible Victory." But here they sit stacked up with the perspective that puts emphasis only on the failures and shortcomings of Western hegemonists, on "challenging the centers of power," if you will.

Does America need a class of secular priests, a la Fisk? Perhaps Europe, which has abandoned its traditional faiths, needed to convert its journalists into pious inquisitors. But America still has flourishing seminaries, and if our journalism programs start turning out Jeremiahs, who will give us the straight dope on the news?


Sunday, November 27, 2005

New Rules of War

[P]retending that the very specific charges that our troops used illegal chemical weapons against unarmed and innocent civilians is really all about the larger appropriateness of the battle for Fallujah is tantamount to accusing an innocent man of rape simply to raise the larger issue of sexual assault ....

Thank goodness for John Cole. Tough on stupid, no matter where it grows.

Here are three passages from the same page of Bing West's book on the battles for Fallujah. All refer to the spring 2004 offensive, the one that was called off (from the top, for political reasons) before it could finish the job:

Mortar attacks were common, day and night. Sometimes the shells dropped in with disturbing accuracy; other times they missed by a city block. Whenever a Cobra gunship flew over the city, it attracted a fusillade of machine gun fire and RPG rockets, a few detonating in the air, most exploding on roofs and streets.


The same message was broadcast from most minarets:

"America is bringing in Jews from Israel and stealing Iraq's oil. Women, take your children into the streets to aid the holy warriors. Bring them food, water and weapons. Do not fear death. It is your duty to protect Islam."

After a few nights, when asked what the imams were yelling, the translators, bored by the repetition, simply said, "Stealing oil, bringing in Jews, protect Islam. The usual stuff. Same old, same old."


LtCol Olson and Capt Zembiec watched through binoculars as boys about ten years old lugged mortar shells across a road. On the roof with them were a Delta Force sniper with a .50 caliber rifle and a Marine corporal with the standard .308 sniper rifle. They sat in separate sandbagged shelters, peering out through mouse holes. Zembiec called them "cooperative carnivores." They waited all day, hoping that a grown-up insurgent would grow impatient and walk out to take one of the mortar shells from a boy. None did.

Ladies and gentlemen, there is a description of your enemy.

West is not writing a book of polemics. He doesn't feel compelled to justify the U.S. to a world that dislikes it. He's writing about a battle in progress. He's writing about "our" troops, and in his case, he means it. While al-Jazeera, embedded with the Fallujah insurgents, was unabashedly the "Arab network," the American networks were busy being citizens of the world. For Americans, where was the home-team media coverage of the Iraq war, unless you dug deep online for the free-lancers? As a result, most of us never got a clear view of our side, or who we were fighting.

In Fallujah we were fighting Al-Qaida in Iraq, alongside a gaggle of small-time Islamist groups, swelled by thousands of part-time jihadis, and potentially tens of thousands of angry citizens, each with an AK-47 in the house, who would join the fight if they thought the insurgents were winning, and melt back into being sullen civilians if the Marines surged.

This is the new face of war in a world where no nation yet -- with the possible exception of China -- dares face the United States in an open battlefield. As recently as the 1930s, the world contained 8 or 10 serious military powers that were capable of waging war against one another. The days when Canada and Italy were world-class military powers are over.

Yet the old rules of war, which still are invoked, evolved in that world. Those rules were devised by decent and fair men, and they were meant to protect innocent civilians and soldiers who had been rendered hors de combat. They were meant to apply to armies that fought in the open, in uniform, and to civilians who cowered in the cities and prayed the battles would pass them by.

As late as 1945, they still were in play. As the 12th Army Group pushed through Franconia and approached picturesque little German cities, the civilian leaders often sent out secret negotiators to arrange to surrender the town before it was stormed. Usually the German military had pulled back already, but it was a risky business. If they returned and found "collaborators," they tried them on the spot and hanged them.

If the American generals agreed, the town would open its gates to them. Many a gem of a medieval city was spared this way. But sometimes a few SS fanatics or Hitler Youth would sneak back into the town, hide in the attics, and open fire as the Americans approached, then scurry out. The U.S. troops invariably pulled back then, and they called up the artillery and the bombers to flatten the town before moving on. The deal was off. How would Fallujah have fared under those rules?

The rules were not meant to bind the hands of American power, so that extremist ideologies, if they took refuge in populous cities, could always defeat the United States. The rules were not meant to level the playing field among combatants. They did not take into account the mass electronic media, and the difference between democracies, where a handful of media images can shake the national foundation, and tyrannies, where pictures of abused prisoners can easily be made to disappear before they are seen, along with anyone who owns them.

The U.S. Marines don't deliberately kill women and boys, even those actively aiding the enemy, because their warrior code forbids it. Yet any time a battle rages in a city, they will die. Who chose that battlefield? Who brought the non-combatants into the fight? Which side took infinitely greater care to avoid killing them?

It is absurd that the world now believes Americans used "chemical weapons" on Fallujah, thanks to the braying of people for whom all the evil in the world is summed up in the seven-letter word "America." Yet only the handful who will read "No True Glory" will know about the two snipers, the trained killers at the peak of their art, the modern-day Ajax and Teucer, watching the deadly weapons go up and wound their comrades all day long and never pulling the trigger.

After Vietnam, anti-war zealots spit on returning soldiers. Thanks to the Internet, today they don't have to wait for the troops to return to start spitting.


Saturday, November 26, 2005

But Did You Listen to Me? No!

So New York City is shocked -- shocked! -- to discover its early past was rich in slavery.

Hmph. I could have told them that. I've been trying to tell them that for years, in fact. But sometimes you have to whack people with a 2-by-4.

As a result, New York soon had the largest colonial slave population north of Maryland. From about 2,000 in 1698, the number of the colony's black slaves swelled to more than 9,000 adults by 1746 and 13,000 by 1756. Between 1732 and 1754, black slaves accounted for more than 35 percent of the total immigration through the port of New York. And that doesn't count the many illegal cargoes of Africans unloaded all along the convoluted coast of Long Island to avoid the tariff duties on slaves.

In 1756, slaves made up about 25 percent of the populations of Kings, Queens, Richmond, New York, and Westchester counties. Slaveholding concentrated in New York City, where by 1691 competition from slave labor had driven white porters out of the market houses and where by 1737 free coopers were complaining of "great numbers of Negroes" working in their trade.

The slave trade became a cornerstone of the New York economy. As with Boston and Newport, profits of the great slave traders, or of smaller merchants who specialized in small lots of skilled or seasoned slaves, radiated through a network of port agents, lawyers, clerks, scriveners, dockworkers, sailmakers, and carpenters.

The Dutch legacy left its mark on New York slavery, even after the British occupation. The British at first handled slaves in New York on the same relatively humane terms the Dutch had set. The population already was racially mixed, and slavery in New York at first was passed down not exactly by race, but by matrilineal inheritance: the child of a male slave and a free woman was free, the child of a female slave and a free man was a slave.

By the 18th century, through this policy, New York had numerous visibly white persons held as slaves. But after 1682, as the number of slaves rose (in many places more rapidly than the white population), fears of insurrection mounted, restrictions were applied, and public controls began to be enacted. In that year it became illegal for more than four slaves to meet together on their own time; in 1702 the number was reduced to three, and to ensure enforcement each town was required to appoint a "Negro Whipper" to flog violators. In a place where slaves were dispersed in ones and twos among city households, this law, if enforced, would have effectively prohibited slaves from social or family life.

Local ordinances restricted times or distance of travel. Slave runaways were tracked down rigorously, and ones bound for French Canada were especially feared, as they might carry information about the condition and defenses of the colony. The penalty for this was death. Slaves did run off, especially young men, but they tended to gravitate to New York City, rather than Canada. There many of them sought to escape the colony by taking passage on ships, whose captains often were not overly scrupulous about the backgrounds of their sailors.

So much easier, in so many cases, for people outside the American South to believe that slavery, and racism, and moral responsibility for a collective past, only exist somewhere else, the better to wash their hands of the whole matter. It's called scapegoating, and it never is anything but self-deception.

Friday, November 25, 2005

'Twas Ever Thus

Every generation has its Homer. That is, each has a translator of the great writings of antiquity who touches the reader like the work of a living genius. The older generation now alive will go down to the grave clutching Richmond Lattimore's "Iliad" -- "The finest translation of Homer ever made into the English language." For me, Homer's English voice will be that of Robert Fagles.

You cannot compare them to pick a winner, except in terms of fidelity to the Greek, which is not the point. A generation passed between Lattimore and Fagles. You need a contemporary translator to tease the story up out of the depths of time, like a clever old pike in a pond, to bring it dazzling and alive into the light of the present.

No work is as difficult and as thankless as translation. When a character speaks in a certain accent that ancient listeners would recognize at once as "Spartan" (by its flattened vowels, much as a Scottish accent sounds to English ears), they instantly click into certain expectations about him or her. The ancient author can communicate a whole cultural identity in a single word. How do you convey that in modern terms, in a world where nobody knows Spartans from Trojenz?

It's also true that old translations can be living classics in their own right. Keats reached back past Pope's "Odyssey" to George Chapman's robust Elizabethan translation. And in the 20th century Ezra Pound was delighted by the stunning, almost medieval Scots-English version of the Aeneid by Gavin Douglas, a work as dark, cold, and brine-soaked as a Scottish seacoast.

Now there's a new history of the Peloponnesian War, ripe for our times, by Victor Davis Hanson, fair-haired historian of the neo-cons and bête noire of the Bush-bashers.

His work is not at all a translation of Thucydides. Yet no history of the war can be written without walking in the Greek exile's footsteps. Hanson's book is called A War Like No Other: How the Athenians and Spartans Fought the Peloponnesian War.

But as this review makes clear, the book is also a statement that the clash of the Greeks was, at the same time, a war like a great many others, even if we've forgotten that:

Hanson’s Peloponnesian War lacks heroes. Pitched battle takes second place to skirmishes, raids, night attacks, and terror campaigns. We hear of cruelty, butchery, and cannibalism: This is not the glory that was Greece, but a Mediterranean chamber of horrors. The reader thinks of such disillusioned World War II novels as Catch-22 or Slaughterhouse Five or Malaparte’s neglected masterpiece, The Skin.

This may seem like strange territory for Hanson, who is not only a classical scholar but a journalist. He is a neoconservative advocate of an aggressive American foreign policy that is not afraid of using force in the interests of spreading freedom and democracy. But the paradox is more apparent than real. Hanson is too shrewd a student of warfare to imagine that “the romance of a good nineteenth-century fight,” as he puts it, is likely to be on offer today. In Iraq and Afghanistan, in every airport and in the London Underground, and in hundreds of unseen alleyways and mountain passes, we are locked in the dubious battles of a dirty war. Hanson’s account of fear, fire, and terror (to cite some of his chapter titles) is a tale for the times. Yet if there is a mournful quality to Hanson’s meditation, the book is no lament: like Stan Getz playing Burt Bacharach, Hanson makes art out of loss. How sad that war has lost its mythic appeal, now, let’s get on with it; that is how we might read the book’s message.

Stan Getz? Yeesh, what an image.

In interviews, Hanson is more explicit about the parallels he sees:

Everything we have seen in the present global war — slaughtering schoolchildren in Beslan; murdering diplomats; taking hostages; lopping limbs; targeted assassinations; roadside killing; spreading democracy through arms — had identical counterparts in the Peloponnesian War. That is not surprising when Thucydides reminds us that the nature of man does not change, and thus war is eternal, its face merely evolving with new technology that masks, but does not alter its essence.

More importantly, Athens' tragedy reminds us of our dilemma that often wealth, leisure, sophistication, and, yes, cynicism, are the wages of successful democracy and vibrant economies, breeding both a sort of smugness and an arrogance. And for all Thucydides' chronicle of Athenian lapses, in the last analysis, rightly or wrongly, he attributes much of Athens' defeat to infighting back at home, and a hypercritical populace, egged on by demagogues that time and again turned on their own.

So the war is also a timely reminder about the strengths — and lethal propensities — of democracies at war. And we should remember that when we hear some of the internecine hysteria voiced here at home — whether over a flushed Koran or George Bush's flight suit — when 160,000 Americans are risking their lives to ensure that 50 million can continue to vote.

This is How We Do It

If they had Pulitzers for blogs, this would get one. Marc at "American Future" dives into the archives and lays it out: What the New York Times said editorially about Saddam Hussein, WMD, and the use of American force in Iraq, dating back to 1993. It's not an assassination of the Gray Lady, but a perspective on editorial policy that will allow those of us who care about such things to discover if there's method to the journalism, or mere political trimming.

Except for a brief period during 1994, The Times’ editorial position was distinctly hawkish during the Clinton presidency. At no time did the Times express any doubts regarding the credibility of intelligence information pertaining to WMD. Throughout this period, the paper’s editors insisted on an aggressive UN-directed inspection regime, which was their preferred means to disarm Saddam’s Iraq. They frequently made note of Saddam’s efforts to thwart the inspectors, and insisted that Iraq must fully cooperate before the sanctions implemented at the end of the Gulf War should be lifted. The Times’ objective was the elimination of Iraq’s WMD, not regime change. Bringing democracy to Iraq was not a topic in its editorials.

Notwithstanding their preference for inspections, the editors did not shy away from advocating the use of air strikes – including unilateral American air strikes – if the obstacles constructed by Saddam made it impossible for the U.N.’s inspectors to fulfill their missions. The Times endorsed every U.S. military operation ordered by Clinton. None of the editorials insisted that the U.S. must obtain Security Council approval before undertaking a military action, nor did they require that military operations – unilateral or multilateral – be authorized by new Security Council resolutions.

When the editors criticized the Clinton administration, it was for being too dovish, not too hawkish. They leveled similar criticisms at the U.N. Security Council. China, Russia and especially France were taken to task for giving priority to their commercial interests over coming to grips with the threat posed by Iraq’s WMD.

The single exception to the Times’ hawkish stance stemmed from Iraq’s November 1993 decision to cooperate with the UN arms inspectors. In an editorial dated August 1, 1994, it was stated that Iraq was “now close to meeting the Security Council’s requirement that it destroy its stocks of biological, chemical, and nuclear weapons and accept long-term international monitoring.” For France, Russia, and China, this was sufficient for lifting the oil sanctions. The U.S. and Britain "opposed any acknowledgement of progress," and the Clinton administration, "which insists on retaining sanctions as long as Mr. Hussein remains in power, has been reduced to strained reinterpretations of the cease-fire resolution’s clear language ..." The editors sided with France, Russia, and China. It wouldn’t be long, however, before the Times would be disabused of the notion that Saddam had changed his colors.

She's Baaaaack

Did you know Cindy Sheehan is back in Crawford, Texas?

Don't worry; you will soon. The media machine is ginning up again. There are 31 pictures on the wire photo desk tonight with her name in the caption and 40 stories have moved since Thanksgiving.

What's amusing -- well, one of many things -- is that the AP slugs on her story in the first round of the media circus were "PEACEMOM." AP always keeps the same slugs on a story in a cycle; Supreme Court stories always are SCOTUS. Even if the story changes in nature, the slug generally stays the same for the sake of editors finding it again. If two children don't come home from school it will be MISSINGKIDS or something, even as the case evolves into a murder, a trial, a conviction.

So the AP dubbed this woman "Peace Mom." This time around, "Peace Mom" isn't in the slug -- it's in the headlines. 'Peace Mom' Cindy Sheehan returns to Texas for war protest. It was the Associated Press that gave her that name; now they're quoting it, unattributed, in their headlines. They're working overtime to nail down the public perception on this.

I've said it before: It's a very inaccurate description of who she is and what she wants.

Some people who devote their waking hours to whipping up their own indignation over the peccadilloes -- some not so minor -- of the White House wonder why some of the rest of us would rather tone that down while there are wars to be won, but will take aim at the Sheehans and Joe Wilsons and Michael Moores.

Think of it like this. Every protest march, every labor strike, draws a lot of people who come for the right reason -- to express an honest and sincere dissent. And every such action also draws people who will be there to stir up chaos, avenge a grudge, wave the demagogue banner.

If you care about the process, if you care about the dissent, the right to express such feelings even when they do potentially terrible damage, it's just as important to call out the phonies and the anarchists. And since the dissenters in the current situation often seem too distracted to undertake the work, the rest of us will gladly cover for them. Take it, Willie:

Parnell came down the road, he said to a cheering man:
Ireland shall get her freedom and you still break stone


Kit Marlowe, Meet Mr. Bowdler

Remember when it was only Victorian prudes who screwed with the classics?

IT WAS the surprise hit of the autumn season, selling out for its entire run and inspiring rave reviews. But now the producers of Tamburlaine the Great have come under fire for censoring Christopher Marlowe’s 1580s masterpiece to avoid upsetting Muslims.

Audiences at the Barbican in London did not see the Koran being burnt, as Marlowe intended, because David Farr, who directed and adapted the classic play, feared that it would inflame passions in the light of the London bombings.

Simon Reade, artistic director of the Bristol Old Vic, said that if they had not altered the original it “would have unnecessarily raised the hackles of a significant proportion of one of the world’s great religions”.

The burning of the Koran was “smoothed over”, he said, so that it became just the destruction of “a load of books” relating to any culture or religion. That made it more powerful, they claimed.

Members of the audience also reported that key references to Muhammad had been dropped, particularly in the passage where Tamburlaine says that he is “not worthy to be worshipped”. In the original Marlowe writes that Muhammad “remains in hell”.

Thursday, November 24, 2005

College Essay

Keep reading. It just gets funnier.

Follow the Leader

Neo-Neocon has thoughts on leaders.

I'm interested in why we (and I include myself here) are somewhat averse to the very word "leader." One of the commenters on SC&A touched on what I consider the heart of the answer, and that is that leaders require followers. Or at least we think they do, in the common understanding of the word "leader."

Now, who in America wants to be a follower? Practically no one. Individualism was built into this country from the start, and the distaste for a leader in that sense is not limited to the left--it's very strong on the right, too. The idea of "leader" is too close to royalty on the one hand and to dictatorship on the other.

She runs through the litany of modern leaders, from Hitler and Stalin to Big Brother, who have left the word in bad odor. But she notes that the aversion seems particularly strong among Baby Boomers (she's one, I'm another). And maybe the aversion involves more than a few bad men.

Raised by parents who had renounced some of the authority of their own parents; encouraged by our numbers, prosperity, and the press to take our adolescent rebellion to extremes; many of us have taken the charge "Question Authority" to heart. Some never stop questioning it and rebelling against it, often just for the sake of rebelling.

The institutions they most distrust or flee from -- the military, churches -- are the most hierarchical.

She looks at what makes leaders inspiring, but circles back again to the difficulty of being a follower.

If one has a real reason to admire what a person in a position of leadership has done, it isn't as hard to be a follower when necessary, or to trust that things are in generally capable hands.

Like me, she has made the intellectual break with the reflexive leftish liberalism of her past. The emotional break is a much harder matter.

She's right, of course, that the tension between individualism and authority runs through the American story. It's fascinating to watch the dissenters and non-conformists of old England arrive on this shore and instantly become authoritarians obsessed with rooting out dissent in New England.

But the remarkable thing about the '60s generation -- and I think NNC would agree with me on this -- is the thoroughness of its rejection of leadership, of the very idea of leaders, as outmoded, an impediment to the Aquarian Age. It goes beyond the usual American suspicion. (Even in their most anti-establishment phases, the Puritans and the Quakers had leaders and knew how to follow them.)

This is the generation that produced both Bush and Clinton. It produced most of the people now in leadership positions in the media and academe. Even in many American places where the flower-power '60s gained no foothold, the faith in leadership shuddered.

"Following" is now felt as sheep-like, submissive, stupid. Yet it's one of the oldest and deepest-rooted human behaviors. And it can be a fully alert and active state.

I tend to see things in an evolutionary perspective. Human behaviors are not quite hard-wired, but certainly they've been reinforced by three million years or so of independent existence as a species. Most of that experience was lived under dramatically different circumstances than the way we live now. And you have to remember the forest primeval to make sense of a lot of what we do now.

Why do we keep stuffing our faces with sugar and fat even though it kills us? Because for 98 percent of human history, the intense craving for sugars and fat was a boost to survival. Why do modern men act out rituals of war and hunting in their sports? Might as well ask why we have mouths full of teeth suited to tearing raw meat.

The interrelation of leaders, followers, and survival is at the heart of one of the great stories of the ancient world: the Anabasis. An army of 10,000 Greeks reluctantly had agreed to go fight for a Persian prince who wanted to claim the throne of his empire. They followed him into the heart of what is now Iraq, and there fought a great battle against his rival half-brother. The Greeks swept the field and won a victory, but their Persian prince, Cyros, was killed in the fight.

It was an awkward situation all around. The Greeks were surrounded by enemies, far from home, with limited supplies of food. The Persian king desperately wanted to get rid of this powerful army in the middle of his realm. Yet as a unit they were too powerful to beat in battle. The Persians lured the chief Greek generals to a negotiation, and treacherously slaughtered them. The Persians reckoned that, without their leaders, the Greeks were doomed.

It's a testimony to the genius of ancient Greece, and the habits of democracy, that the lost brigade replaced its head. But they did so knowing the urgency of the situation, and that unless they had strong leaders, with committed followers, they all were as good as dead. The junior officers got together. A man named Xenophon, "an Athenian, who was neither general nor captain nor private," rallied them. Leadership, he told the others, was essential.

“All these soldiers have their eyes on you. If they see you are discouraged, they all will be cowards; but if you show that you are making preparations against the enemy, and if you call on them, you may be sure they will follow you and try to imitate you. ... Now, first of all, I think you will do the whole army a great service if you take care at once to appoint captains and officers in the place of those who have been lost. For it is true one may say universally that without commanders nothing good or useful could ever be done; good discipline always saves, but disorder has destroyed many."

Their leader in battle had been Clearchos. Now he was dead, killed by their enemies.

"... You see that the enemy dared not make war upon us until they had seized our leaders. They believed that while the commanders were there, and while we obeyed, we were able to defeat them; when they took our commanders, they thought we should be destroyed by anarchy and disorder. Very well: the commanders must be much more careful than before, and the commanded must be more obedient than before: and if anyone disobeys, we must vote that each and all of you must help the commander to punish. So the enemy will find themselves mightily mistaken: for this day they will see ten thousand Clearchoses instead of one, who will never permit anyone to be a coward."

I watched Clinton's supporters fail to stand up for him when the waters rose. Rather than fight for a leader, they retreated into the comfortable woods inhabited by those who refuse to follow. "I'm indifferent to the fate of the leaders of my own faction," they seemed to say; "My only passion for leaders is to loathe the leaders of the opposition."

There's a pure negativity in that that is frankly revolting. People who only know what they're against, and who never are "for" anything once the going gets tough, can't advance the human race or do very much good at all. People who live in permanent states of authority-questioning, as NNC notes, are useful once in a while, but generally they are a drag on the human race.

There's an old Anglo-Saxon poem -- it survives only in a fragment -- that illuminates the reciprocal relation between leaders and followers. At the Battle of Maldon, when the leader of the Anglo-Saxon warriors is cut down in the fight against the vikings, the poet names and execrates the few who then ran away. One is Godric:

He leapt upon the mount of the steed which had once been his lord's,
on those trappings of which he was not fit,
he and with his brothers both galloped away,
Godwine and Godwig not caring for battle,
but turned away from this battlefield and to the forest fled,
seeking a place of safety and to protect their lives ...

But many more men stay and fight on. In fact, they now are fighting to either victory or death. Which is the proper warrior code response to a leader's fall. A young warrior named Ælfwine exhorts them:

"Remember the speeches which we had often at mead spoken,
that we on the bench had loudly uttered vows,
warriors in the hall, concerning bitter strife:
Now may we prove who is truly valiant!
I am willing that my royal descent be made known to all men,
that I was of Mercian blood greatly kindred;
my grandfather was named Ealhelm,
a wise alderman and very prosperous.
Not shall me these people's liegeman reproach
that I of this army am willing to depart from,
a homeland seek, now that my lord lies slain
and hewn down in battle. Mine is that sorrow greatest:
he was both my kinsman and my lord."

[From a superb translation by Douglas B. Killings]

That's cowardice and bravery. Godric and those who ran failed in their roles. But what NNC is talking about is a generation that seems to have opted out of the relationship entirely. The fact that there were no obvious viking raiding parties or Persian emperors in '60s America made this easier, I suppose.

The leader-follower dynamic is deep-rooted in us. Like anything that has roots down to the level of instinct, it is dangerous to ignore. To think you can simply walk away from it in a quest for a higher consciousness is a risky business. Don't be in such a hurry to cut off something until you know what's going to replace it. The energy, the need, are going to push through, whether you ignore them or not.

And if human experience has evolved some system to guide and contain the instincts, chances are our ancestors devised it after trying most other things and finding they worked less well. This is the heart of philosophical conservatism. If it looks evil, be sure it's not really a lesser-of-evils before you reject it and tear it up.

In American history, it seems to me, the generation that shares most in common with the nonconformists of the 1960s is the one that came of age in the 1840s. Like the youth movement of the Vietnam era, this was not a nationwide phenomenon, but rather it concentrated in a pocket -- in this case, New England.

At its core were radical individualists -- the Transcendentalists, "who believed that the human mind could through its own capacities transcend its inheritance in human history" [Lewis P. Simpson]. They chased truth to ground in the "self's ultimate power of intuitive perception." They combined an outward utopianism and the need to feel free of any authority but one's own inner voices.

Having rejected leaders of the conventional culture, the Transcendentalists, like their 1960s counterparts, were susceptible to cults, both religious and secular. They sought out gurus and they made pilgrimages -- to lecture halls, if not to rock shows.

Having freed themselves from conventional loyalties, the Transcendentalists easily swerved into authoritarian ways. They refused the taint of compromise. Thoreau turned his able pen to writing a hagiography for the original American terrorist, John Brown.

Thoreau was, like many modern utopian individualists, a totalitarian at heart. "We talk about a representative government; but what a monster of a government is that where the noblest faculties of the mind, and the whole heart, are not represented!"

Their anthem was written to the march of soldiers' boots on the cobblestones of Boston, heading off to wreak destruction and spread the scripture of New England to the Southern heathens.

I have read a fiery gospel writ in burnish'd rows of steel,
"As ye deal with my contemners, So with you my grace shall deal;"
Let the Hero, born of woman, crush the serpent with his heel
Since God is marching on.

Neo-Neocon, too, comes around to this generation in her post.

When Lincoln was assassinated, the grieving poet Walt Whitman wrote the famous poem "O Captain! My Captain!" It's a lament for a leader fallen when the prize is so close at hand, a cri de coeur on behalf of a nation bereft.

I'm not sure it would be possible anymore for this sort of metaphor and emotion about a leader to be expressed--or perhaps even felt--about a President. The feeling is composed of many things, but one of them is love.

I needed a long time to appreciate Whitman. Too often his poems read like an auto parts catalogue; the long lines lie there inert, like a python that ate too much on a cold day. When I recall some thought from his poems and want to quote it, I turn to the book and find it straggling along over a page and a half, without beginning or end.

Whitman's love for Lincoln, almost embarrassingly passionate, perhaps is yet another expression of Whitman's omnisexuality. But either way, it discovers and draws up a very old thing: The dear love of a leader can be a quality that doesn't dead-end at a Stalin or a Hitler.

Of the few Anglo-Saxon poems that have come down to us, two are laments by outcast men -- that is, men who for one reason or another have lost their "captains," their lords, and the protection of their chieftains and the band of men who surrounded them in loyalty. Evidently this leaderless condition filled the Anglo-Saxon man with dread.

In one of the poems, the narrator dreams that he again is in the presence of his leader, only to awaken to the double bitterness of feeling the loss anew:

ðonne sorg ond slæp somod ætgædre
earmne anhogan oft gebindað.
þinceð him on mode þæt he his mondryhten
clyppe ond cysse, ond on cneo lecge
honda ond heafod, swa he hwilum ær
in geardagum giefstolas breac.
ðonne onwæcneð eft wineleas guma,
sare æfter swæsne. Sorg bið geniwad,

Which, in a rather flaccid translation, comes out:

Even in slumber his sorrow assaileth,
And, dreaming he claspeth his dear lord again,
Head on knee, hand on knee, loyally laying,
Pledging his liege as in days long past.
Then from his slumber he starts lonely hearted
. . .
The longing for loved one: his grief is renewed.

That's pure Whitman. Whitman, like Emerson, had his fervent disciples, who wanted to make a religion out of him. But he wouldn't let them forget the humility of being human. Which includes being a follower sometimes.

Emerson born 1803, the guru. Margaret Fuller, 1810; Thoreau 1817; Julia Ward Howe 1819. Whitman 1819. Whitman was perhaps the truest individualist in his ur-hippie generation, yet he was the only one, I think, who understood what a leader was and ought to be.

The antithesis of New England Transcendentalism was the American South. There, for many reasons, the leader-follower dynamic was firmly in place among the white population. In part because, like Xenophon's Greeks, they sensed the danger surrounding them. In part because they felt a heritage connection to the lowland Scots who were brothers to the men who fought at Maldon. They tended to rally to chieftains, even when they did not entirely like them.

Emerson found the South purely repulsive. He was among the many in New England who hated slavery not so much because they liked Africans and wanted them to be equal partakers of America but because they resented the Southern leadership class.

Whitman had none of that.

"I am very warmly disposed toward the South; I must admit that my instinct of friendship towards the South is almost more than I like to confess. I have very dear friends there -- sacred, precious memories; the people there should be considered, even deferred to, instead of browbeaten. I feel sore, I feel some pain, almost indignation, when I think that yesterday keeps the old brutal idea of subjugation on top.

"I would be the last to confuse moral values -- to imagine the South impeccable. I don't condone the South, where it has gone wrong -- its Negro slavery, I don't condone that -- far from it -- I hate it. I have always said so, South and North; but there is another spirit dormant there which it must be the purpose of our civilization to bring forth; it cannot, it must not, be killed."

Whitman, I have come to think, was the greater American.

Carnival of the Etymologies

[Special Thanksgiving edition]


If Thanksgiving had been a holiday we inherited from the Church, we might be preparing this week for *Gratimas.

The first Thanksgiving in America supposedly was held in October 1621 by Plymouth Colony Pilgrims in appreciation of assistance from members of the Massasoit tribe and celebration of the first harvest. But Thanksgiving Day as the name of a New England fall harvest holiday that grew from this event is not attested until 1674.

The generic use of thanksgiving for "the giving of thanks" goes back to about 1533; in the specific sense of "public celebration acknowledging divine favors" thanksgiving dates from 1632.

The noun thanks is attested from 1340, from the verb thank, which goes back to Old English þancian "to give thanks." This is a general Germanic word (cf. Old Saxon thancon, Old Norse þakka, Danish takke, Old Frisian thankia, Middle Dutch and German danken).

The reconstructed Proto-Germanic root is *thankojan, from the noun *thankoz meaning "gratitude," but also, and probably originally, "thought." The sense evolution likely is from "thought" to "good thought" to "gratitude." The related Old English noun þonc underwent just this evolution in historical times; it originally meant "thought," but by c.1000 it had come to mean "good thoughts, gratitude."

The whole group thus is from the same root as think, from the Proto-Indo-European base *tong- "to think, feel."

Ever wonder what they call a turkey in Turkey? It's a hindi. This is complicated, so hold on to your hats.

The first use of the word turkey to refer to a bird in English is from 1541, but the reference was not to the North American bird. Rather, it was to the "guinea fowl" (Numida meleagris), an African fowl imported from Madagascar via Turkey, by Near East traders known as turkey merchants, because they traded with the Turks. This was an important trade route for England in those years, and the only conduit for exotic goods to enter the island until a trade route was established through Russia. England came late to New World exploration, and the French and Spanish -- with whom England was as often as not at war -- monopolized it. This explains why the English devoted so much time to searching for a Northwest Passage (to get to Asia without crossing Spanish sea lanes) and how the big North American bird (Meleagris gallopavo) came to be called a turkey.

It was domesticated by the Aztecs, and introduced to Spain by returning Conquistadors (1523) and thence to wider Europe. But because of wars that interrupted trading between the major European Christian powers, it arrived in most places beyond Spain by way of North Africa (then under Ottoman rule) and Turkey.

The New World bird itself reputedly reached England by 1524 (when Henry VIII is said to have dined on it at court). The word turkey was first applied to it in English circa 1555, either because it was identified with or treated as a species of the guinea fowl that already had that name, or because it was imported from Turkey, or a bit of both.

The Turkish name for it, hindi, literally means "Indian," and probably was picked up by the Turks from French dinde, contracted from poulet d'inde, literally "chicken from India." The French, at least, recognized that this big bird came from the New World. But they gave it an incorrect name based on the common misconception that the New World was eastern Asia.

The meat soon became popular fare, and by 1575, turkey was becoming the usual main course at an English Christmas. The turkeys raised by the Pilgrims in Plymouth Colony probably were stock brought from England.

Pilgrim is Middle English pilegrim (c.1200), a word borrowed from Old French pelegrin. Its original sense in English was the general religious one of "one who travels to holy shrines as a religious duty," and the meaning of the Latin source of the French word, peregrinus, is "foreigner." It comes from the adverb peregre "from abroad," a compound of per- "beyond" and agri, the locative case of ager "country."

And this ager, in turn comes from Proto-Indo-European *agros "field," source of English acre (Old English æcer "tilled field, open land"), Greek agros (common in English compound words referring to farming), and Sanskrit ajras "plain, open country."

The change of the first -r- to an -l- in the Romance languages happened by a process linguists call dissimilation. It seems people don't like to have to pronounce the same consonant twice in a row in a word, especially if it's an -r- or an -l-, so they tend to either drop one or turn it into another letter. Since -r- and -l- sound something alike (think of the common confusion of Chinese and Japanese speakers of English), they tend to turn into one another through dissimilation.

Among the other modern English words that show evidence of this sort of dissimilation are purple (Latin purpura), marble (Old French marbre), turtle (Latin turtur), flair (Late Latin fragrare), laurel (Middle English lorrer, from Old French laurier), and marmalade (Latin melimelum, literally "sweet apple").

The pronunciation of colonel retains the dissimilated form the word had when it first was borrowed into English from French in the 16th century, but the English spelling was modified in Shakespeare's day to conform to the original Italian word.

The English religious sect who founded Plymouth Colony had called themselves Pilgrims from c.1630, in allusion to Heb. xi.13:

These all died in faith, not having received the promises, but having seen them afar off, and were persuaded of them, and embraced them, and confessed that they were strangers and pilgrims on the earth.

The most obvious difference between the Pilgrims and the Puritans (with whom they often are confused) is that the Puritans arrived in New England about a decade after the Pilgrims, came in larger numbers, and formed the backbone of the Massachusetts Bay Colony, which absorbed the Pilgrim settlements.

Theologically, the Puritans wanted to reform the Anglican church, while the Pilgrims had gone one step further and left it altogether to form their own sect. They shared an insistence on the authority of the revealed word. But the Pilgrims were more spiritual, the Puritans more intellectual, in their living faith.

It's the Puritans who got the bad reputation, on both sides of the Atlantic. Puritanism famously was defined by H.L. Mencken as "the haunting fear that someone, somewhere may be happy" [1920] and Thomas Babington Macaulay wrote in his "History of England" [1849] that, "The Puritan hated bear-baiting, not because it gave pain to the bear, but because it gave pleasure to the spectators."

The name Indian has been applied to the native inhabitants of the Americas since at least 1553, on the mistaken notion that America was the eastern end of Asia. More than 500 modern English phrases include Indian, most of them originating in the U.S. and most impugning honesty or intelligence.

Indian giver, for instance, first was attested in 1765, but it originally meant one who gives "a present for which an equivalent return is expected." [Thomas Hutchinson, "History of Massachusetts Bay"]. The meaning "one who gives a gift and then asks for it back" is first attested only in 1892.

The exact signification of the word in Indian summer "spell of warm weather after the first frost" (1778) is unclear. Perhaps such a spell was so called because it was first noted in regions inhabited by Indians, or because the Indians first described it to the Europeans. No evidence connects it with the color of fall leaves or a season of Indian attacks on settlements (two popular myths).

It is the American English version of British All-Hallows summer and French été de la Saint-Martin (feast day Nov. 11). Also colloquial in England was St. Luke's summer (or little summer), a period of warm weather occurring about St. Luke's day (Oct. 18).

Thanksgiving is the harvest celebration in our secular calendar. Harvest -- Old English hærfest -- is the original English word for "autumn." The borrowing of autumn and the evolution of fall gradually focused its meaning after the 14th century from "the time of gathering crops" to the action itself and the product of the action.

German retains the word in its older sense, as Herbst "autumn." The Proto-Indo-European root of all this is *kerp-, which, not surprisingly, means "to gather, pluck, harvest" (cf. Sanskrit krpana- "sword," krpani "shears;" Greek karpos "fruit," karpizomai "make harvest of;" Latin carpere "to cut, divide, pluck;" Lithuanian kerpu "cut;" and Middle Irish cerbaim "cut").

Among the traditional Thanksgiving foods is squash, which is a shortened borrowing from Algonquian (Narraganset) askutasquash, which literally means "the green things that may be eaten raw." The elements in the compound are askut "green, raw" and asquash "eaten," in which the -ash is a plural affix (as it also is in that scandalous crime against corn, succotash).

The corn that is a staple of Thanksgiving tables originally was Indian corn, but the adjective was dropped. It also was known at first as turkey corn or turkey wheat in England for the same reason the turkey first was so called (see above).

Corn by itself originally meant merely "grain." The word itself virtually is unchanged from Old English, and is recognizable in Proto-Germanic *kurnam, from the Proto-Indo-European base *ger- "to wear away" (source also of Old Church Slavonic zruno "grain" and Latin granum).

The sense of the Old English word was "grain with the seed still in" rather than a particular plant. It was locally understood to denote the leading crop of a district. Though the word is restricted to "corn on the cob" in America, it now usually means "wheat" in England and "oats" in Scotland and Ireland, while korn means "rye" in parts of Germany.

I'm blessed now to have a wife who is a witch in the kitchen and loves to cook. It wasn't like that when I was growing up. At home, in West Chester, in the '60s, the idea of "Thanksgiving delicacy" for my brother and me was the end piece of the cranberry sauce, which bore the imprint stamp of the canning company.

Cranberry (1647) is the American English adaptation of Low German kraanbere, which was the name of a type of berry (Vaccinium oxycoccos) that grew in Europe. In England, they were known as marshwhort or fenberries. In Germany, they were kraanberen, a compound that means "crane-berries," perhaps from a fancied resemblance between the plants' stamens and the beaks of cranes.

It was the name brought over by German and Dutch settlers that took hold in the New World, where a larger but recognizably related form of the berries (V. macrocarpum) was found growing. The North American berries, and the name, were introduced in England in the late 17th century.


Monday, November 21, 2005

A Ray of Hope

A ray of hope, that is, if you're optimistic. If you're not, a glance down the road not taken -- the one that should have been taken.

James A. Gavrilis writes on the Mayor of Ar Rutbah. Foreign Policy's summary lede is this:

Amid the chaos in Iraq, one company of U.S. Special Forces achieved what others have not: a functioning democracy. How? By relying on common sense, the trust of Iraqis, and recollections from Political Science 101. Now, their commander reveals the gritty reality about nation-building in Iraq, from the ground up.

The setting of the story is not a quirk, not a friendly, secular Kurdish village that already knows the ground rules of American-style democracy and is ready to take charge of itself. Ar Rutbah was a microcosm of what's so difficult about Iraq. The U.S. experience there reads like a metaphor for the whole country and the whole war:

As our long column of tan trucks rode down Iraq’s Business Highway 10 at 6 o’clock in the morning on April 9, 2003, I focused on my instincts and battle training, keeping an open mind and preparing for whatever lay ahead. After three weeks of intense firefights, the Fedayeen Saddam fighters had finally slithered away. The last thing I expected to do once we entered Ar Rutbah, a Sunni city of about 25,000 in the Anbar province near Jordan and Syria, was to begin postwar reconstruction. I had not planned or prepared for governing, nor had I received any guidance or assistance in how to do so. But then, nothing in war is expected.

More Like It

Once again, leave it to Jeff Goldstein to spell out, in bulleted, plain English, what the Administration seems to be trying to say:

Clearly, the important administration arguments are beginning to coalesce:

1) Criticism of the war is not by itself unpatriotic.

2) Similarly, answering anti-war critics is not challenging their patriotism.

3) But opportunistic and cynical anti-war critics who are trying to walk back their own votes and level spurious charges at the Administration (they lied to take us into war) are themselves lying.

4) These lies are hurting the country and the troops.

5) The burden of proof, in a post-9/11 world, was on Saddam Hussein to prove he’d disarmed; we could not wait for the threat to become imminent before acting.

6) The cause the troops are fighting for is just and right.

7) Iraq is moving toward freedom; and things on the ground are improving daily, regardless of what the MSM and prominent Dems would have us believe.

These points, taken together, form an easy, concise, and—most importantly—a factually correct counter-narrative to the Dem / MSM narrative that has preached confusion, failure, quagmire, American criminality (torture, WP), and the relentlessness of an insurgency whose battleground savvy and knowledge of the Arab world are thwarting the plans of our confused military leaders and civilian war commanders.

Agree or argue, but it's nice to see a case made clearly and directly, without being garbled by the messenger or the media. Ah, the power of a blog. I just hope he's right and that's what the administration, in fact, believes.

More Thoughts

About this.

I'm really interested in this, for the time being, from a very narrow perspective of how the journalism works. Imagine you could find an intelligent person who'd been in a clamshell for the last five years and feed him these two sets of quotes:

“People should feel comfortable about expressing their opinions about Iraq. ... I heard somebody say, `Well, maybe so-and-so is not patriotic because they disagree with my position.' I totally reject that thought."

side by side with:

"Listen, patriotic is apt to disagree with the president, it doesn’t bother me. What bothers me is when people are irresponsibly using their positions and playing politics."

And you told him to pick which one was "fiercely attacking" and which one was "abruptly conciliatory," or whatever. I can imagine he'd be puzzled by the choice.

I'm not cherry-picking quotes. Those are the ones the news wire stories are built on. One from last week, one from Sunday/Monday. I'm not going to the press conference transcripts on the White House Web site to see the totality of what was said. The quotes aren't incidental to the story -- they are the story. The AP is reporting on the "tone" of what the president is saying. And those are the quotes they chose to prove it.

Now, I don't watch TV, so maybe Bush was rolling his eyes or had his tongue in his cheek when he spoke last week. But AP didn't tell me that.

And maybe in the middle of the press conference last week a Rovian Rumpelstiltskin popped up in the Rose Garden flower bed and shrieked to the press corps, "we're pushing back." And this weekend he whispered, "we're toning down."

It's very possible that's behind the spin. White House press coverage is a bizarre business, at least it was in the '80s when I got a taste of it. If anything, from what I hear, it's gotten more twisted since then.

It would be like a big wedding ceremony, and a big reception afterward, in which two families exchange vows and toasts and dances. And all through it this group of reporters is pestering the bride with probing and leading questions about her groom's family. And finally, midway through the ceremony, they home in on something and the bride at one point says, "Well, his mom wanted me to wear the dress she got married in, but we tried it on and I'm just too small for it." And at that point the press crew clicks off its tape recorders and cameras because they all recognize they've got the story for the next day. And the headline reads, "Bride slams fat mother-in-law."

Sometimes in journalism you use your intimacy with your sources, and the off-the-record chats you have with them, to shape your coverage. For instance, I once was a cop reporter when a woman was found murdered in her home for no apparent reason. The police chief told me off the record that they were pretty sure her husband had done it and covered it up to make it look like a break-in. They were building the case against him, but it took a week or more before they formally charged him. Until then, neither the chief nor I could say in public that he was facing charges.

But based on what I knew but couldn't print, I wrote a different sort of story than I would have otherwise. It was "police continue to investigate the death of ______" rather than "killer on loose in city." If you do that often enough, you then can read other people's coverage and get a clue as to what they've been told off the record based on where they're steering the story.

But it seems to me, more and more, that the White House coverage we're getting is driven by something other than realities. Which isn't helping the national dialogue on Iraq.

Tone Deaf

Bush Tones Down Attack on Iraq War Critics

Oh, really?

BEIJING - After fiercely defending his Iraq policy across Asia, President Bush abruptly toned down his attack on war critics Sunday and said there was nothing unpatriotic about opposing his strategy.

"People should feel comfortable about expressing their opinions about Iraq," Bush said, three days after agreeing with Vice President Dick Cheney that the critics were "reprehensible."

I’m not really following every word out of the Prez's mouth, but this strikes me as yet another case of the media covering its ass by inventing a “shift.”

Hasn’t Bush been saying all along that criticism of his policies is patriotic if sincere, that the American people have a right and a duty to question and dissent, but that politicians who had turned against the war for political reasons were being disingenuous and sending the wrong message?

The quotes I saw from him late last week are like this:

Some Democrats who voted to authorize the use of force are now rewriting the past. They are playing politics with this issue. And they are sending mixed signals to our troops and the enemy. And that is irresponsible.

And this:

Listen, patriotic is apt to disagree with the president, it doesn’t bother me. What bothers me is when people are irresponsibly using their positions and playing politics.

That, as I understand it, was his position. You can play footsie over what they meant when they voted to "authorize," but unless you were a Democrat who voted to authorize the war, I don't see how you could read him as calling you "unpatriotic" for questioning the war.

The reporting of that seemed to ride right over the distinction he was drawing between politicians and people, however, and it came out in the headlines as “Bush slams war critics.”

Now that the editors have calmed down enough to notice the other half of what he’s been saying all along, the headlines say, “Bush changes tune.”

Cheney’s another story. But Cheney’s not Bush. I don’t mean to imply Bush’s distinction is perfect or right, or that dissent even by politicians is sending wrong messages, but I really am seeing a lot of media stuck on stupid here.

I mean, look at the lede:

After fiercely defending his Iraq policy across Asia, President Bush abruptly toned down his attack on war critics Sunday and said there was nothing unpatriotic about opposing his strategy.

Those aren’t opposing qualities. You can defend something and welcome opposition as patriotic. Just because you’re being opposed patriotically doesn’t mean you have to stop defending.

Besides, he said last week there was nothing unpatriotic about opposing his strategy. “Listen, patriotic is apt to disagree with the president, it doesn’t bother me.” So where’s the new in this?

Then when the AP goes for a quote to back it up, they have to get a Cheney word — “reprehensible.” Couldn’t they even find a Bush word as strong as that? That’s kind of reaching for it.

Be careful how you read these things. It’s not unusual for the media to cover itself for missing half a story by presenting that half when it realizes the mistake and spinning it as a new wrinkle.

Sunday, November 20, 2005

Apolitical Blues

It's becoming painfully obvious to me that I'm not political enough. Some of you are probably doing the "no shit, Sherlock" face right now. I realize it because when I write a post like this one, and post it at some of the places that have been kind enough to open their doors to me as a guest contributor, I invariably get responses that say, "This is a waste of time. What's your point?"

Because "point," to an awful lot of people, means "being solid red or blue, and making your best smash-and-trash attack on the other side." When they do respond, the commenters react as though I'm attempting to be polemical, and I just suck at it. So they fish out the leftist or rightist "point" they think they smell in my prose, and then go attack the straw-man version of it.

When I write something that says, "this person is a decent and honorable man, but he's wrong," or "this is a big, complex issue, here's an angle on it you might want to take into consideration," or "I'm still trying to figure out which of these two evils is the less" -- I get, "What's your point?"

They've been thinking so long and so hard in black and white they forget what colors look like. At some of those places I hardly even bother to post any more. When I do, I rarely bother to read the comments.

Here, it's been good. People who read this blog seem to get it and I'm grateful for it. Of course there's only a few dozen of you on a given day, if my stat counter works.

Thing is, I knew all along I wasn't political. Literature and history are my first loves. Where's politics? Way down the list, buried deep under stargazing and bellydancers and Anglo-Saxon mythology and British cheeses and Charlie Parker and the Philadelphia Flyers and paleoclimatology and about a dozen dozen other things. When I was young I called myself an anarchist (a philosophical position, not a fondness for throwing rocks) and I didn't vote till I was in my 30s and I still can't seem to vote for the same party two cycles in a row.

As a history writer, I had to understand politics and political thinking. I understand it, but I confess I understand it like a biologist might understand squids mating. It still baffles me how people like Madison and Calhoun and Wilson (Woody, not Joe) can be so high-minded and statesmanlike and at the same time roll around in the partisan mud. If there is a bump of politics, in the phrenologists' model, my skull lacks it.

I'm not going to bore you again with the "jolted into awareness by 9-11" story, but there's a plot twist in it that I sometimes overlook. In my blissfully a-political days I spent a lot of time on the message board of the scarlet leftist British newspaper The Guardian. I was only vaguely aware that it was leftist, and I didn't care; I was there for the literary conversation. There were a lot of smart folks there, talking about historical Christianity and Jane Austen vs. Emily Brontë (Emily by a pinfall in the 23rd minute) and the latest by W.G. Sebald.

When Sept. 11 hit, naturally, that was where I turned to grope for perspective. And the vulgar, gleeful high-fiving and hostility with which the British left greeted the attacks was a key part of my political awakening. The stench of that was part of what propelled me away from the pseudointellectual left as a place to ground a foundation for understanding what happened and what ought to be done.


Saturday, November 19, 2005

Realism and Idealism

When Michael Young writes about the Mideast, it's usually worth a read. This piece in Reason online is no exception. He says the U.S. drive to democratize the Arab world has lost its momentum in the last year. No news in that. But he says much has changed in that world since the process began, and that the U.S. policy split between realism and idealism is a false dichotomy. And America can and should pursue both.

The administration has always mixed interests and ideals, conciliation and confrontation, since no foreign policy can ever be conducted on the basis of one without the other. But [Gideon Rose, managing editor of Foreign Affairs magazine] was indirectly right in another sense: When it comes to facing the dilemma of advancing democracy against interests in the Arab world, the administration has been willing to hold up odiously manipulated elections as proof of progress—for example the recent presidential election in Egypt or municipal elections in Saudi Arabia. In other words, the administration, perhaps reluctantly, has intermittently fallen back on an old realist trick of insisting things are better, providing counterfeit evidence of this, and turning to more important items of business.

Yet now is as good a time as ever for the U.S. to make democratization a basis of its foreign policy doctrine in the Middle East. Many Arabs have no patience for Bush. But the center of gravity in the region has decisively shifted in the direction of advancing liberty, as recent events have eroded the legitimacy of Arab leaders: three relatively free elections this year in Iraq (one of them forthcoming in December), another relatively free election in Lebanon after the country saw an end to Syrian occupation, and growing discontent with the fossilocracies in other parts of the region, particularly Egypt or Tunisia, and with second-generation despotisms, such as those in Syria and Jordan. For an administration to ignore such changes and banish democracy to a secondary tier of priorities would display a striking lack of ambition and foresight.

A more obvious parallel question is whether the U.S. can even return to the cold realism that guided policy under the first Bush administration. As 9/11 showed, that approach posed a genuine national security threat, as disgruntled Arabs, associating Washington with their own domestic persecutors, retaliated against the U.S. Conversely, absolute, inflexible devotion to democracy at the expense of more practical consideration of interests is simply not sensible.

That leaves a third option: that the U.S. declare the spread of democracy a strategic interest (not an open-ended desire), one that must be advanced where and when possible, even if it is temporarily delayed by intervening objectives. Arab regimes should be pushed to take specific measures within specific timeframes to open up their societies, and the U.S. can tie this to other forms of bilateral cooperation. Finally, no administration should ever hail as progress what is patently an effort by dictatorships to sell it a defective bill of democratic goods.

Friday, November 18, 2005

Worth 1,000 Words


And here's the caption:

Iranian university students burn an Israeli flag in front of the Italian Embassy in Tehran, Iran, Tuesday, Nov. 15, 2005, during a demonstration commemorating the 5th anniversary of the death of Edoardo Agnelli, late heir of the Agnelli automotive dynasty, who was found dead near the Turin-Savona highway, northern Italy, Nov. 15, 2000. The demonstrators claim that Edoardo Agnelli did not commit suicide, as investigators established, but was killed because he allegedly converted to Islam, to make way for his brother Lapo Elkann, son of a Jewish father.

Actually, it looks to me like they're burning an American flag, but in Iran that's a meaningless distinction.

Anyone here ever heard of poor Edoardo Agnelli? Me either. Ever felt moved to troop out to the Italian embassy nearest your home, with flags and lighter fluid, and get yourself furiously, hatefully angry at the Joos because poor Edoardo is dead?

Me either. But they did. Look at the face on this guy:


Wild-eyed over something that happened in Italy five years ago to nobody he ever knew. Now tell me you still think it's OK if Iran gets the Bomb.