Tuesday, May 31, 2011

Photo of the Day: Glove Seat

I took the accompanying picture of this rather unusual bit of statuary this past Friday, as I was passing by midtown Manhattan's Duffy Square—an area now dubbed by the New York Daily News’ Mike Lupica “Bloomberg Beach,” in honor of all the lounge chairs and other types of furniture now placed in this pedestrian area once reserved for vehicles ready to run everyone over.

I want you all to know I had the devil of a time getting this shot off. Just as I was ready to snap, didn’t another tourist (or pair of them) come by and wait to have their picture taken in this most commodious bit of outdoor furniture. I thought I’d never get out of there...

Quote of the Day (David Mamet, on the Agony of Flopping)

“There’s nothing in the world more silent than the telephone the morning after everybody pans your play. It won’t ring from room service; your mother won’t be calling you. If the phone has not rung by 8 in the morning, you’re dead.”—Playwright David Mamet quoted in Andrew Goldman, “Always Be Changing: David Mamet Explains His Intellectual Shift to the Right. The Far Right,” The New York Times Magazine, May 29, 2011

Monday, May 30, 2011

This Day in Presidential History (Jackson Duels Over Horses, Wife)

May 30, 1806—Amid early morning fog, on the banks of the Red River in Logan County, Kentucky, Andrew Jackson—lawyer, planter, temporarily out of public life—engaged in a duel with Charles Dickinson, a Nashville attorney with a well-earned reputation as one of the deadliest shots in the state. Though Dickinson fired first and his aim was as true as ever, it wasn’t good enough—his opponent was only badly wounded.


The consequences of that day—the uproar over Jackson’s decision to continue the duel when he didn’t have to, along with the bullet that remained lodged inside him—would last for the remainder of Old Hickory’s life, and play no small part in creating his legend as a man not to be trifled with.

This is what happens when two men get excited over liquor, horses—and especially a woman.

Perceptions of Andrew Jackson have evolved dramatically over the last 70 years or so. As America rose to counter a foreign enemy (multiple enemies, in fact) in WWII, Hollywood contributed The Remarkable Andrew (1942), in which the spirit of the victor of New Orleans and seventh President (played by Brian Donlevy) comes to the aid of a wrongly accused small-town accountant (played by William Holden) who idolizes him.

Flash forward to 2010, nearly 40 years after America washed its hands of a conflict in Southeast Asia—and now amid two wars involving the Islamic world—when the Public Theater presented the Michael Friedman musical Bloody, Bloody Andrew Jackson, described by The New York Times as “likely to remain a true reflection of these United States for many years to come.”

There’s a world of difference in these two worldviews, also shaped, to no mean extent, by two generations’ diametrically opposed perceptions of Jackson’s treatment of slaves and Native Americans. But they are united on one point, something that the unfortunate Mr. Dickinson (did I mention that AJ’s shot killed him?) would readily agree with: Jackson would fight.

There are people in this world who are too cocky for their own good, and I’m afraid Dickinson was one of them. Put together youth (25 years old), good looks, a decent legal practice (a letter of recommendation by his teacher, John Marshall, didn’t hurt), an agreeable wife and the aforesaid deadly aim (one that had gotten him through a number of duels already), and Dickinson thought he could survive anything.

He should have known better than to mess with Jackson. Not yet 40 years old, Old Hickory was already inspiring stories about his ferocious temper, his ability to take a devastating blow, and his capacity to strike back.

A few incidents (recounted briefly but with brio) in Jon Meacham’s Pulitzer Prize-winning biography of the President, American Lion, will suffice:

* Recalling wrestling matches in their youth on the frontier, an acquaintance remarked: “I could throw him three times out of four, but he would never stay throwed.”

* Captured along with his brother Robert during the American Revolution, the 14-year-old Jackson was ordered by a British officer to clean his boots. Jackson refused, claiming that he was a prisoner of war and should be treated as such. The redcoat swung his sword, leaving scars on Jackson’s skull and fingers—and hatred for a foreign force that so mistreated Robert in captivity that he died shortly after release.

* While riding circuit in 1798 as a justice of the Tennessee Superior Court, Jackson came face to face with Russell Bean, indicted for “cutting off the ears of his infant child in a drunken frolic.” Bean had scared the local sheriff out of his wits, then, armed with a knife and pistols, proceeded to taunt a local posse sent to bring him to court. Jackson, however, made the astonished Bean drop his guns and surrender. It was not simply Jackson’s blunt order (“Now surrender, you infernal villain, this very instant, or I’ll blow you through”), but his look, Bean confessed later, that did the trick. When Bean looked into the eyes of the sheriff and townspeople, he said, he saw “No shoot”; when he looked into Jackson’s, he read “Shoot.” That meant, Bean concluded (in a phrase I can’t help but love), it was time for him to “sing small.”

Surely Dickinson knew all this. He had to—there weren’t many lawyers out on the frontier, and he and Jackson were often rivals at the bar. The wonder is not really that he got into trouble with Old Hickory, but that he didn’t do so sooner than he did.

Dickinson, you see, touched on the most sensitive point in Jackson’s life: the origin of his marriage to his dear wife Rachel. The two had wed as soon as her first husband, Lewis Robards, obtained a divorce, in 1791, or so they thought. But, it turned out, Robards had only filed for divorce, and his petition wasn’t granted for another two years. As soon as the pair learned this, they went through the ceremony again to formalize things, though they already thought of themselves as man and wife.

Nevertheless, this meant that, technically, Andrew and Rachel had been living together unmarried, in an adulterous, bigamous relationship, for two years.


Meacham writes that Jackson’s marital situation caused such a scandal during the 1828 Presidential campaign because the nation’s mores had grown more conservative than those of Jackson’s frontier environment of the early 1790s. But even by the turn of the century, on the frontier, tongues were wagging.

One of those wagging tongues was Dickinson’s. The first time Jackson heard of the talk, he requested an apology through Dickinson’s father-in-law, Col. Joseph Erwin. Jackson accepted Dickinson’s explanation—that he had only started talking after too much to drink, and hadn’t meant anything by it.

Over time, something of a myth has grown up around Jackson, to the effect that once you were on his bad side, not even God could get you off it. Not so: Seven years after the Dickinson imbroglio, Jackson became involved in another duel, with Thomas Hart Benton; received a bullet from that encounter; but later, in Washington, became an ally and diehard friend of Benton, by now a U.S. Senator from Missouri.

Unlike Benton, Dickinson would live neither to old age nor into Jackson’s good graces. With the matter over Rachel seemingly taken care of, a dispute arose between the two over their shared passion for horseracing. Dickinson, exasperated after a horse owned by Col. Erwin first forfeited, then lost outright, to one owned by Jackson, had one drink too many again—and again he made insinuations about Jackson and Rachel.

By this time, Jackson had also gotten wind of a rumor that some of Dickinson’s statements were about to make their way into print. Among Dickinson’s assertions: that Jackson was cowardly. Representatives from Jackson reached Dickinson before he could leave the state for Maryland (his birthplace and home of his ancestors) and demanded immediate satisfaction from the rash attorney. The meeting was set for May 30.

Jackson and his "second" for the duel were certain he couldn’t outdraw Dickinson, and determined that his best chance was somehow to withstand Dickinson’s shot before getting off one of his own. At this point, they hit upon a simple but (within the terms of the elaborate Code Duello formulated in Ireland in 1777) acceptable means of doing so: wear a loose coat that would provide a more elusive target for Dickinson.

All of this was unbeknownst to Dickinson, who was feeling even more confident (if that could be possible) on the way to the encounter, when, just for practice, he aimed at a string supporting an apple and cut the cord in two. Then, once he got to the site of the encounter and got the word to pace off, he turned almost immediately and fired.

The bullet scraped Jackson’s breastbone and broke some ribs—but more important, it didn’t touch his heart. Dickinson didn’t even know this much—all he could see was that Jackson stood as erect as ever. “Great God! Have I missed him?” Dickinson exclaimed, immediately sensing his own jeopardy—because now, by the Code Duello, Jackson could get off a clear shot at him.

Jackson pointed his pistol, aimed—and nothing happened: His pistol was at half-lock. At this point, by settled practice, Jackson, with his honor clearly established by his willingness not just to meet his opponent but to endure a shot from him, could have simply aimed at the trees, fired a worthless shot, called it a day, and let cooler heads smooth things over once again with his opponent.

But Jackson was reading the Code Duello very literally, and knew that an empty click technically didn’t count as a shot at all. This time when he aimed, he got his shot off, and it struck Dickinson in the abdomen. The latter was taken to a nearby house, where he bled to death from the wound.

Jackson and his second were 20 miles away when they noticed that Jackson was bleeding into his shoes. Some observers thought that Jackson didn’t seek immediate attention because he didn’t want Dickinson to have the satisfaction that he had struck him at all.

The Dickinson duel would return to haunt Jackson two decades later, when supporters of John Quincy Adams not only cited it as an example of the general’s intemperate nature, but also pointed to his decision to fire at Dickinson as cold-blooded murder. It didn’t matter: unlike the disputed election of 1824, which ended with Jackson losing in the House of Representatives despite winning the popular vote, this time he won walking away.

Dickinson was hardly the last person to underestimate Jackson’s fierce will. Nearly 30 years later, Jackson became the first President to suffer an assassination attempt, while he was leaving the Capitol as part of the funeral procession for a congressman. This time, by the standards of the day, the 67-year-old Jackson really was Old Hickory. But he reacted just the way he would have 30 years before.

After the assailant, named Richard Lawrence, attempted to fire at him, Jackson and the crowd rushed him. Lawrence, now with a second gun (and, by some accounts, close enough to touch Jackson’s coat), pulled the trigger again. Even with the different weapon, the gun misfired.

Now Jackson turned upon him, and Lawrence—like Bean and Dickinson before him—knew genuine fear, as Jackson, employing a trick he’d learned long ago on the frontier, swung his cane at the assailant’s stomach and brought him to heel.

Andrew Jackson was a hugely controversial figure, even for a position that seems to attract (or create) such types. His encounter with Dickinson illustrates one of the great faults of his personality: a propensity to enlarge a quarrel until it became an unnecessary matter of life or death.


It also demonstrates, as surely as his near-death encounter with Lawrence three decades later, one of his virtues: a physical fearlessness so absolute as to make men confident in themselves simply because they were associated with Old Hickory.

Quote of the Day (Abraham Lincoln, on Those Who “Have Borne the Battle”)

“With malice toward none, with charity for all, with firmness in the right as God gives us to see the right, let us strive on to finish the work we are in, to bind up the nation's wounds, to care for him who shall have borne the battle and for his widow and his orphan, to do all which may achieve and cherish a just and lasting peace among ourselves and with all nations.”—Abraham Lincoln, Second Inaugural Address, March 4, 1865


Abraham Lincoln was dead three years before the first Memorial Day was proclaimed by General John Logan, national commander of the Grand Army of the Republic, in May 1868, but had he lived he would have found something memorable to say upon the occasion.

After all, we have not only this brief, poignant reference to the fallen soldiers and their survivors in the Civil War, but also the magnificent Gettysburg Address, and this ringing closing sentence from his First Inaugural Address, on the binding ties between the living and the dead who made it possible for us here in the United States to celebrate our freedoms this weekend:

“The mystic chords of memory, stretching from every battlefield and patriot grave to every living heart and hearthstone all over this broad land, will yet swell the chorus of the Union, when again touched, as surely they will be, by the better angels of our nature.”

Sunday, May 29, 2011

This Day in Literary History (Party Girl Mystery Leads to O’Hara’s “Butterfield 8”)

May 29, 1931—Starr Faithfull had a name practically made for the tabloids, and what happened to her in the hours following this date—the last one on which she was seen alive—assured that she would remain there for weeks.

The 25-year-old beauty, who lived with her family in New York’s Greenwich Village, now went missing from an ocean liner off Long Beach, Long Island, only to turn up just after dawn on June 8—fingernails painted bright red, wearing an expensive fitted black-and-white dress, but face down, seaweed tangled in her hair, dead.

The condition of the body—badly bruised—would, by itself, have been enough to guarantee the interest of law enforcement. But it also turned out that Starr’s liver contained the barbiturate veronal, and that she had kept a diary in which her sexual history—including flings with numerous men—was exhaustively detailed.

With her bobbed hair, taste for liquor and disregard for sexual mores, she seemed to embody an F. Scott Fitzgerald flapper. But just a bit of probing showed that, in one crucial respect, she resembled less one of those "flaming youth" than Fitzgerald’s Nicole Diver in Tender Is the Night: both were victims of child molestation.

These facts, coupled with a slow news cycle, guaranteed that the Starr Faithfull case would hit not only the tabloids during the first half of the summer of 1931, but also even the pages of the good, gray New York Times. Four years later, John O’Hara would transform her life and sad end into a bitter dissection of sex and class in Prohibition-era Gotham, Butterfield 8. If the latter title rings a vague bell, it might be from the obituaries several weeks ago for Elizabeth Taylor, recounting that her first Oscar came for her performance in the 1960 adaptation of the O’Hara novel.

Gloria Wandrous, the protagonist of O’Hara’s only roman à clef, or tale based closely on a real-life incident (he normally used the general details of a person’s life while transforming the externals), is seduced as a child first by a friend of an uncle, then by a heavyset school principal.

The facts in the Starr Faithfull case were even more sensational. When reporters burst into the apartment Starr shared with her family on St. Luke’s Place, a few doors down from New York Mayor Jimmy Walker, they discovered in her diaries a reference to a cousin of her mother’s, identified as “AJP,” who had molested her with the aid of ether while she was a child.

“AJP” turned out to be Andrew J. Peters, who had quit his post as Assistant Secretary of the Treasury to run for Mayor of Boston. Little could anyone have imagined that this successful “reform” candidate, who won City Hall back from the clutches of Irish-American boss James Michael Curley, was, in his personal life, far viler than his opponent: a cold man who paid off Starr’s parents, downwardly mobile relations of his, to hush up the explosive scandal.

Part of the money was used to send the young girl on transatlantic voyages, where she developed a taste for high living. None of it, however, managed to still her mounting desperation. "I am playing a dangerous game," Starr wrote a friend shortly before she disappeared. "There is no telling where I'll land."

Just how she landed on Long Beach became the concern of Nassau County's District Attorney Elvin Newton Edwards. Over the next few weeks, the D.A.'s explanations for the event shifted like the sands on which the tragic young woman was found.

First he announced that she had been killed on the liner by two men (one a prominent politician), then taken out in a boat and tossed overboard; then that she was knocked unconscious aboard a boat, then thrown into the water. Then, Edwards announced, the politician was “practically eliminated” from consideration as a suspect. Later, his office turned its attention to “Chicago gangsters” and even questioned publisher Bennett Cerf. Finally, Edwards focused on suicide as the probable cause of death.

Many people, Edwards announced at one point in the investigation, were glad that Starr was dead. In any event, the police ended up destroying her potentially explosive diaries, nobody was ever charged, and the case remains unresolved to this day.

Surprisingly, outside of O'Hara, few writers have treated what was once one of New York's most sensational cases. Gloria Vanderbilt (who, as a child, would have known many of the adult high-society types who would have crossed paths with Starr) imagined what her missing diaries were like in the fictional The Memory Book of Starr Faithfull (1994). The one extensive nonfiction treatment of the case appears to be Jonathan Goldman's The Passing of Starr Faithfull.

Flashback, May 1911: Hubert Humphrey, Herald of “Sunshine of Human Rights,” Born

Hubert Horatio Humphrey, a legislative giant who achieved more in the U.S. Senate than a number of Presidents have in the Oval Office, especially by calling on Americans to “get out of the shadow of states’ rights and walk forthrightly into the bright sunshine of human rights,” was born on May 27, 1911, in South Dakota to a druggist and his wife. During the Depression, young Humphrey left college to help run the small family drugstore, and customers came to like the young man.

So far, the story sounds like the early trajectory of George Bailey in It’s a Wonderful Life, except that Humphrey was able to achieve many of his dreams—and, through his life’s work, helped more than a few others achieve theirs.

Friday’s New York Times included an excellent op-ed piece by Rick Perlstein on “America’s Forgotten Liberal.” “January was the 100th anniversary of Ronald Reagan’s birth, and the planet nearly stopped turning on its axis to recognize the occasion,” Perlstein writes (albeit slightly inaccurately: The Gipper’s centennial actually occurred on February 6, an error that the eagle-eyed copyeditors behind “All the News That’s Fit to Print” still haven’t gotten around to correcting). “Today is the 100th anniversary of Hubert H. Humphrey’s birth, and no one besides me seems to have noticed.”

Actually, I had been meaning to write about Humphrey’s centennial for a while now. Humphrey is the type of leader who, in ways both ideological and personal, has now largely faded, if not entirely disappeared, from the national political scene, for reasons beyond even what Perlstein astutely notes.

Unfortunately, the lack of attention to Humphrey says as much about many progressives’ historical amnesia as it does about Americans’ as a whole, perhaps even more so. At the height of his career, when the Oval Office was within the grasp of Lyndon Johnson’s Vice-President in the closing days of the tumultuous 1968 campaign, the American left, judging him a Johnny-come-lately to the effort to end the Vietnam War, boycotted the election in numbers large enough to assure the political resurrection of Richard Nixon—and with it, four more years of the war they abominated, not to mention Watergate.

Much campaign journalism, from the likes of Theodore H. White and Hunter Thompson, examined the Minnesotan on the hustings in the 1968 contest and the three other Presidential races he mounted, but he has attracted little of the scholarly interest that Reagan has. You might be able to count on the fingers of one hand all the biographies that focus on him alone.

But if you want a vivid idea of what Humphrey achieved—and of the isolation he had to endure, first from the right, then from the left, to gain it—there are two other accounts you can read in which, though he is not the writers’ sole focus, he plays an important role. One is the “Orator of the Dawn” chapter in Master of the Senate, the third volume of Robert A. Caro’s The Years of Lyndon Johnson. The other is Robert Mann’s excellent history of the 16-year effort to enact meaningful civil-rights legislation, The Walls of Jericho: Lyndon Johnson, Hubert Humphrey, Richard Russell and the Struggle for Civil Rights.

The Caro biography lays out the painful price that Southern colleagues exacted from Humphrey, after his 1948 election to the Senate, for his memorable advocacy of a civil-rights platform at the Democratic Convention that year. That segregationist bloc, led by Richard Russell of Georgia, consistently isolated this most gregarious of politicians, helping to assure that most of his legislative initiatives went nowhere.

Over time, Johnson, as Majority Leader of the Senate, managed to bring Humphrey in from the cold by relentlessly preaching the necessity of compromise, enabling the Minnesotan to forge relationships with the Southern bloc, enough so that he could begin to pass most of his non-civil-rights bills. By 1964, LBJ—now ensconced in the White House—used Humphrey—now Senate Majority Whip—as his indispensable floor manager for the Civil Rights Act of 1964.

Humphrey’s partnership with LBJ led to his greatest political triumphs, as well as the deepest disappointments of his long public career. The Vice-Presidency, a thankless office to begin with, became for Humphrey the graveyard for his own longstanding Presidential ambitions, as LBJ insisted on repeated public support for the Vietnam War that was at variance with his longtime colleague’s private views. “Dump the Hump” began to be hurled at him by anti-war protesters.

When Humphrey finally broke with the President on the issue with a month left before polling, the far left judged it too little, too late. (Like Achilles sulking in his tent, Eugene McCarthy didn’t get around to endorsing his state colleague in the Senate as his party’s Presidential standard-bearer until a week before the election.)

The war sparked much of the left’s harshest criticism of Humphrey, but it wasn’t the only point on which they faulted him. As time went on, they regarded him with growing condescension. His call for a “politics of joy” was derided by those thrusting their fists into the air. Even the humble beginnings (watching his father “grinding his life away between unpaid bills and unpaid accounts” before finally having to sell the family home) that spurred his liberal fighting faith led to a mocking nickname: “Drugstore Liberal.”

Perlstein bemoans the passing of Humphrey’s brand of economic liberalism in favor of the free-trade, looser-regulation brand favored by Bill Clinton, but progressives are culpable in another manner for the passing of his type. From the 1970s through this decade, Democrats consistently marginalized pro-life Democrats, even those who, on every other conceivable measure of economic security, were well within the party’s liberal wing. The apotheosis of that effort came when Kathy Taylor, a member of Pro-Choice Republicans for Clinton, was able to stand before the 1992 Democratic National Convention, but not pro-life Governor Robert Casey of Pennsylvania.

One last point: Humphrey’s deep generosity of spirit, an attitude increasingly gone missing on Capitol Hill. According to Caro, LBJ regarded Humphrey’s willingness to let bygones be bygones, to forgive his adversaries, as a weakness. But it also led to two of the finest moments of his life.

Mann’s history recounts Humphrey’s dogged effort to make it to the 1971 funeral of Russell. Humphrey could simply have used the wicked storm that day as an excuse not to attend final services for the colleague who had called him “a damn fool” when he first came to the Senate. But Humphrey’s plane braved the storm, making him the only Senate colleague to witness Russell’s burial.

Seven years later, Humphrey’s sensitivity to others led him even to look out for his opponent in the 1968 campaign, Richard Nixon. Humphrey blamed the GOP for a back-channel effort that led South Vietnam’s Nguyen Van Thieu to reject LBJ’s peace plan in the closing days of the election and assure a Democratic defeat.

But in the months before he died of cancer, Humphrey—now lionized by his colleagues—reached out to Nixon, still in disgrace more than three years after being forced from the Presidency, to invite him to his funeral. Nixon showed up—the first time he had come back to Washington after his ignominious departure.

There are two quotes, I think, that describe the essence of Humphrey. The first explains what he fought for all his life: "The moral test of a government is how it treats those who are at the dawn of life, the children; those who are in the twilight of life, the aged; and those who are in the shadow of life, the sick, the needy, and the handicapped."

The second describes the art of compromise that made him anathema to many in his own time and that, in the age of screaming blogs and cable channels of all ideological stripes, would make him even more suspect in today’s environment: "If I believe in something, I will fight for it, with all I have. But I do not demand all or nothing. I would rather get something than nothing."

Quote of the Day (St. Philip Neri, on Going to Extremes)

“If you wish to go to extremes, let it be in sweetness, patience, humility and charity.” – Attributed to St. Philip Neri (1515-1595)

Saturday, May 28, 2011

Quote of the Day (Walker Percy, on His Hopes for Man and Himself)

“For what do I hope? Short-term goal: that man can survive himself long enough to explore the infinite potential of himself and the world around him. If he can last another fifty years, he might make it. Personal goal: to survive my own bad habits.”—Walker Percy, quoted by Robyn Leary, “Surviving His Own Bad Habits: A Previously Unpublished Interview with Walker Percy,” Doubletake Magazine, Winter 2000


The novelist Walker Percy (The Moviegoer, Lancelot) was born on this date in 1916.

Friday, May 27, 2011

Photo of the Day: Fleet Week

This Memorial Day weekend, as usual, midtown Manhattan will be crawling with men in uniform. Here are just a few of those I noticed on my way home from work today.

Quote of the Day (Edward Hirsch, on “My Heart and My Head”)

“I lived between my heart and my head,
like a married couple who can’t get along.”—Edward Hirsch, “Self-Portrait,” in The New Yorker, October 18, 2005

Thursday, May 26, 2011

Flashback, May 1911: Mahler, Venice Spark Mann’s “Death in Venice”

The place of the artist in society was even more on the mind of German novelist Thomas Mann than usual when he departed for Venice with wife Katia and brother Heinrich. Just before leaving, he’d read in the newspapers about the death of Europe’s most famous conductor, Gustav Mahler, whom he had met several times before (including a dress rehearsal for Mahler’s Eighth Symphony only the year before).


The memory of the composer, along with what Mann was about to see in Italy’s famed gondola city—misplaced luggage, a cholera epidemic, dismal weather, a heavily rouged, dissolute old man, and most of all, a youth whose good looks seemed like something out of Greek mythology—coalesced into Der Tod in Venedig (Death in Venice), a masterly portrait of an artist—an entire civilization—yielding, after decades of strenuous effort, to disintegration and death.

The plot of Mann’s work is wispy: A Munich novelist, a fiftyish widower left with all the time he wants to commit to his craft, is suddenly seized by a desire to travel. He journeys to the Lido, a resort across the lagoon from Venice, where he falls in love with a beautiful Polish boy whom he eyes from afar but never speaks to. That obsession is so strong as to lead the novelist to disregard well-founded rumors of a cholera outbreak in the city. Just as the boy and his family are due to leave the hotel, the novelist dies while watching him on the beach.

The novella hardly matched Mann’s Buddenbrooks or The Magic Mountain in size, but what it lacked in heft it more than made up in density. Nearly every sentence in Death in Venice feels heavy—with symbolism, with philosophy, with the author’s barely suppressed desires, with the weight of Western history.

Consider these three sentences, in the middle of the very first paragraph:

“It was a spring afternoon in that year of grace 19--, when Europe sat upon the anxious seat beneath a menace that hung over its head for months. [Author Gustav von Aschenbach] had sought the open soon after tea. He was overwrought by a morning of hard, nerve-taxing work, work which had not ceased to exact his uttermost in the way of sustained concentration, conscientiousness, and tact; and after the meal found himself powerless to check the onward sweep of the productive mechanism within him, that motus animi continuus in which, according to Cicero, eloquence resides.”

Let’s take this apart:

* The “menace that hung over its head for months” was the Second Moroccan Crisis (also known as the Agadir Crisis), an incident involving gunboat diplomacy which put Germany at odds with both France and Great Britain. Suddenly, the political order that Prince Metternich had fashioned nearly a century before at the Congress of Vienna had become hopelessly frayed, so that now the very thing he feared—a continental bloodbath involving a militaristic regime with Napoleonic instincts—had become all too thinkable. Europe was about to yield to its deepest, most troubling, atavistic instincts.

* The walk in “the open soon after tea” was Mann’s daily ramble, which he felt strengthened him for answering his correspondence at night.


* The “hard, nerve-taxing work” resembled Mann’s own, which sometimes left him so exhausted that he had to travel abroad for rest.


* “Powerless to check” is used ironically here—suggesting in this context the author’s creative powers, but coming to mean, by the conclusion of the story, irrationality and disease.

Mann particularly admired Leo Tolstoy for his early, realistic work, and you can see that influence in Death in Venice: no detail seems to have escaped him. In the late 1960s, more than a decade after the death of the Nobel laureate, a 68-year-old Polish count convinced Mann’s daughter that he had been the inspiration for the Polish youth, Tadzio. The count, who had only after many years come across the famous story in translation, marveled at the exactness with which Mann rendered his clothing, his family, even a fight with a companion.


Yet Tadzio and everyone else in this story fades in comparison with Aschenbach himself. Much of this three-dimensional character derives from multiple sources: philosophy, music, Mahler, and Mann himself.


Begin with his name: "Gustav" comes from the composer, as do the character's aquiline features. "Von," we learn from the first sentence of the book, came to him only at age 50--meaning the character was not born into the aristocracy but has been ennobled because of his literary reputation. "Aschenbach" means "stream of ashes," an allusion to the desire-driven death that the character is about to undergo.


An interesting question comes to the fore concerning Aschenbach's state of mind: Is he simply a believer in the ancient Greek ideal of love for a younger man--an ultimately chaste affection--or is something else at work here? I think Mann means us to see this as something darker.


The question, as with so much else surrounding this work, takes on additional dimensions because of what we now know about Mann. His diaries, made public a couple of decades after his death, reveal that he was a closeted homosexual--or, as he put it, in the grip of an "inversion." Marriage and the presence of four sons allowed him, to an extent, to sublimate these feelings, but they never left. The novelist knew he had to master them, or he would risk disgrace.


Aschenbach's fate can be seen, in part, as Mann's warning to himself about where his instincts might lead if given free rein. Upon his arrival in Venice, he recoils at the presence of a man with a rouged face and dyed hair. Before the end, however, in his attempt to secure the attention of the young Tadzio, he has submitted to the same cosmetic treatment. Furthermore, Tadzio's family, noticing the inordinate attention their son is receiving, takes steps to put distance between themselves and the novelist.


There is one other creative work out of Germany that echoes the novella’s predicament of a middle-aged man, long leading the life of the mind, who becomes undone by an unexpected obsession with an inappropriate object of desire. Yes, I mean the 1930 film that put Marlene Dietrich on the map, The Blue Angel—which was based on a 1905 novel, Professor Unrat, written by none other than Thomas’s older brother, Heinrich Mann.


Sigmund Freud's Civilization and Its Discontents would not be published until nearly two decades after Death in Venice, but Mann had already anticipated, in fictional form, the theories of the founder of psychoanalysis on the enormous restraints placed by civilization upon the primitive instincts for sex and death at the heart of individuals. Mann, who wrote about artists and intellectuals throughout his career in such works as Buddenbrooks, The Magic Mountain, and Doctor Faustus, saw these instincts particularly at work in creators such as himself.

Quote of the Day (Ralph Waldo Emerson, on Why “Imitation Is Suicide”)

“There is a time in every man's education when he arrives at the conviction that envy is ignorance; that imitation is suicide; that he must take himself for better, for worse, as his portion; that though the wide universe is full of good, no kernel of nourishing corn can come to him but through his toil bestowed on that plot of ground which is given to him to till. The power which resides in him is new in nature, and none but he knows what that is which he can do, nor does he know until he has tried…. We but half express ourselves, and are ashamed of that divine idea which each of us represents. It may be safely trusted as proportionate and of good issues, so it be faithfully imparted, but God will not have his work made manifest by cowards.”—Ralph Waldo Emerson, “Self-Reliance,” in Essays, First Series (1841)

Wednesday, May 25, 2011

Quote of the Day (JFK, on the Race to the Moon)

“I believe that this nation should commit itself to achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to the earth. No single space project in this period will be more impressive to mankind, or more important for the long-range exploration of space; and none will be so difficult or expensive to accomplish. We propose to accelerate the development of the appropriate lunar space craft. We propose to develop alternate liquid and solid fuel boosters, much larger than any now being developed, until certain which is superior. We propose additional funds for other engine development and for unmanned explorations--explorations which are particularly important for one purpose which this nation will never overlook: the survival of the man who first makes this daring flight. But in a very real sense, it will not be one man going to the moon--if we make this judgment affirmatively, it will be an entire nation. For all of us must work to put him there.”—President John F. Kennedy, “Special Message to the Congress on Urgent National Needs,” May 25, 1961

In 1992, historian Michael Hart published a book ranking the 100 most influential people in history. John F. Kennedy made it to #81--something that probably astonished American historians of the last few decades, who have tended to consider him a good but not great chief executive. (They would have been especially surprised that Abraham Lincoln, long at #1 or #2 in all the Presidential polls, well ahead of JFK, did not make the elite 100 people worldwide.)

Hart named JFK to his list not because of his civil-rights advocacy, or for steering the U.S. past nuclear confrontation with the Soviet Union, or for becoming the first non-Protestant to become President. No, JFK earned the ranking for pushing through the manned space program.

In an age of greater fiscal austerity and of fading memories of the Cold War, Kennedy’s call for a manned landing on the moon has lost considerable luster. Historian John Logsdon, whose new book on Kennedy and the space program was featured prominently in John Noble Wilford’s New York Times article on the anniversary of this event yesterday, conceded in the article that the “impact of Apollo on the space program has on balance been negative.” Even the liberals who formed the core of Kennedy’s political base came to feel, a decade after his rallying message to Congress, that the money might have been better spent on matters close to earth--notably, anti-poverty programs.

I’m somewhere between Professors Hart and Logsdon in how I consider Kennedy’s clarion call. On the one hand, the manned lunar landing undoubtedly did divert federal money from more cost-effective, unmanned space efforts, not to mention social programs. On the other hand, only a project so audacious--and one with underlying tensions with the Soviet Union very much in play--could have engaged the public’s attention so tenaciously.

In the end, I think, we celebrate the lunar landing--and Kennedy’s part in pushing for it--for reasons far removed from anything as mundane as scientific results. No, this might have been the last time, for several decades, that Americans would see that their country could a) conquer a challenge once it had been identified, or b) lead other nations in an area that, for more than a century, had been a strength: technological know-how.

Tuesday, May 24, 2011

This Day in Film History (“Thelma and Louise” Premieres as Cultural Flashpoint)

May 24, 1991—Fresh from a triumphant reception at the Cannes Film Festival, Thelma and Louise was released to U.S. theaters. Other films made more money that year, but few sparked more water-cooler debates or critical commentary than this movie that dramatized unilateral female punishment of male violence.

(In addition to debates and commentaries, the female “buddy” movie starring Susan Sarandon and Geena Davis also inspired parody, in the form of an episode of the TV dramedy Sisters the following year. In that episode, Swoosie Kurtz’s Alex, traumatized by a mugging, seeks out the services of a female self-defense instructor with definite ideas on how to render the male of the species harmless. Her name--wink, wink--is Thelma Louise.)

The central dilemma of Callie Khouri’s Oscar-winning Best Original Screenplay—the fateful consequences of waitress Louise’s decision to shoot (execute, really) Thelma’s would-be rapist—might seem the product of the late 20th century, impossible to imagine without the social and legal revolution following Susan Brownmiller’s Against Our Will.

But in many ways, the film is a throwback to films of the Thirties and Forties. I don’t mean simply the previously mentioned “buddy” film (at points, the farcical elements make the two female friends a distaff version of Crosby and Hope), nor the screwball comedy (the absurdities pile higher and higher the farther the women drive in their ’66 Thunderbird), nor even the more durable and adaptable “road” movie.

No, I’m talking about crime films early in the talkie era, years of desperation in which ordinary Americans, with no other recourse, took to a life of crime. Thelma, struggling to make sense of their experience, finally blurts it out to Louise: “You know, something’s, like, crossed over in me and I can’t go back, I mean I just couldn’t live.” If you want a good example of what I mean, catch sometime the 1937 Fritz Lang film You Only Live Once, starring Henry Fonda and Sylvia Sidney as doomed lovers forced to go on the run.

This type of movie, like the later Bonnie and Clyde, features escapes deep into the American Heartland. Danger bonds the two lovers ever more tightly together.

And so it is again, a half-century later, in this drama directed by Ridley Scott. Originally slated simply to serve as executive producer, the creative force behind Alien took the helm of the film himself when he realized that nobody else shared his vision that this material formed, in a way, an American epic.

In the patriarchal America of this film, every man is found wanting in some way, whether commitment-phobic (Louise’s eternal boyfriend, the musician Jimmy), infantile (Thelma’s husband Darryl, who likes his wife to stay quiet while he takes in football games on the TV), deceitful (Brad Pitt’s career-making role as easygoing bank robber J.D.), or exploitative and violent (Thelma’s attacker). The one sympathetic figure, Harvey Keitel’s law-enforcement officer, is impotent, at least as far as his power to save the women from jail time goes.

And so, Thelma and Louise can only depend on themselves. Before they drive off the cliff, in their own version of Butch and Sundance’s decision to go out shooting against the Latin American posse waiting for them, the two longtime friends declare their mutual love—something they haven’t been able to enjoy in an outside world deeply inimical to their elementary desires.

This past March, Vanity Fair published a 20-year retrospective of the film, “Ride of a Lifetime,” by Sheila Weller. Among the juicy tidbits of trivia in the article:

* Holly Hunter, Frances McDormand, Jodie Foster, Michelle Pfeiffer, Meryl Streep and Goldie Hawn were all considered for the two roles that went to Sarandon and Davis.

* Billy Baldwin beat out Pitt for two roles that year. He gave up his role in Thelma and Louise to take on the second of the two movies, Ron Howard’s Backdraft. When he saw what Pitt did with J.D.--and the film’s success--he was undoubtedly sorry for his decision.

* As he has continued to do with females the world over in the two decades since, Pitt had a discombobulating effect on Davis. She had performed perfectly well with a couple of actors who preceded him, but when he auditioned he was so cute that she kept flubbing her lines. Finally, as Scott and the casting director considered the choices, she couldn’t resist chiming in: “The blond one--duh!”

As I indicated earlier, Thelma and Louise, though it did well enough at the box office, with a gross nearly three times its estimated budget, was hardly the greatest blockbuster that summer (Robin Hood: Prince of Thieves and Terminator 2 grossed, respectively, five and ten times its U.S. total).


But it was a much greater cultural landmark than those other movies. Later in 1991, the Clarence Thomas confirmation hearings would take up the issue of sexual harassment--of how far a man could go with a woman in the workplace. But Thelma and Louise had already fired the first feminist shot across the bow.

Song Lyric of the Day (Bob Dylan, Sensing Time “Running Away”)

“Shadows are falling and I've been here all day
It's too hot to sleep, time is running away.”--Bob Dylan, “Not Dark Yet,” from his Time Out of Mind CD (1997)

He’s always been hard-headed about mortality (see “I’ve seen pretty people disappear like smoke,” from the Blood on the Tracks tune “Buckets of Rain”). But on Time Out of Mind, Bob Dylan displayed a bone-deep sense of his own mortality.

Contrary to popular belief, the whole album had been written before Dylan was treated at the hospital in May 1997 for histoplasmosis, a fungal infection that in his case produced a life-threatening swelling of the sac surrounding the heart. No matter: like T.S. Eliot in “The Love Song of J. Alfred Prufrock,” Dylan had “seen the eternal Footman hold my coat, and snicker,/And in short, I was afraid.”

Today, Dylan turns 70. Given the rock ‘n’ roll life he’s led and the constant grind of his never-ending tour (one that's badly frayed his voice) since the Eighties, there’s an element of luck in his survival, but also, maybe, an act of defiance against those who’ve counted him out over the years.

After all, as Alex Ross noted in “The Wanderer,” his 1999 New Yorker summary of Dylan's life to date: “If you look through what has been written about Bob Dylan in the past 30-odd years, you notice a desire for him to die off, so that his younger self can assume its mythic place.”

Never mind: focus on the songs--nearly 1,000 of them, in the course of his five-decade career. I think you’ll see they richly merit the tribute Bruce Springsteen paid his elder at the 1988 induction ceremony at the Rock and Roll Hall of Fame:

“He was a revolutionary, man, the way that Elvis freed your body, Bob freed your mind. And he showed us that just because the music was innately physical, it did not mean that it was anti-intellect. He had the vision and the talent to expand a pop song until it contained the whole world. He invented a new way a pop singer could sound. He broke through the limitations of what a recording artist could achieve, and he changed the face of rock and roll forever and ever.”

Monday, May 23, 2011

Quote of the Day (The Late Leader of Ireland, Balancing Theory and Practice As Only the Gaels Can)

“That’s fine in practice, but will it work in theory?”--Garret FitzGerald (1926-2011), onetime economist and former Premier of the Republic of Ireland, assessing the value of an idea, quoted in Alan Cowell, “Garret FitzGerald, Ex-Irish Premier, Dies at 85,” The New York Times, May 20, 2011

Sunday, May 22, 2011

This Day in Senate History (Sumner, Slavery Opponent, Attacked by Colleague in Chamber)

May 22, 1856—One of the first signs of the violent convulsions over slavery that would seize the nation in a few short years occurred on a small but horrifying scale right in the chamber of the United States Senate, as Charles Sumner of Massachusetts was mercilessly caned at his desk by a Southern member of the House of Representatives, in retaliation for a stinging attack the abolitionist Republican had made on a Senate colleague. The differing reactions of the two sections—the North horrified by the violence, the South lionizing its perpetrator—foretold the irreconcilable differences that would result in the Civil War.

The January shooting of Arizona Representative Gabrielle Giffords sparked much initial talk that an epidemic of incivility had directly provoked the attack. Subsequent revelations about assailant Jared Loughner’s state of mind led to a reconsideration of that suggestion.

In the case of the assault on Sumner, no such ambiguity existed. The assailant, Representative Preston Brooks of South Carolina, was not only clearly heard telling Sumner why he was about to beat him, but also justified his actions in a speech to colleagues afterward.


The train of events leading to this shocking incident began with the Senate’s passage of the Kansas-Nebraska Act in 1854. Settlers in the two new territories would have the opportunity to vote on whether they wanted their state slave or free.

But if the measure’s architect, Illinois Senator Stephen A. Douglas, hoped that this doctrine of popular sovereignty would settle at the ballot box the issue now roiling the nation, he was badly mistaken. Northern voters were incensed by the notion that the Missouri Compromise of 1820, which had kept slavery out of certain designated territories west of the Mississippi, had now been effectively superseded, so that it would be impossible to halt the spread of “the peculiar institution” and its threat to free white labor anywhere in the U.S.

Before long, violence had broken out in Kansas, as both sides in the slavery dispute rushed in to try to influence voting. That prompted Sumner to take to the Senate floor for two entire days to denounce “The Crime Against Kansas.”

That violation, declared the senator, had been aided and abetted right within the Senate chamber. One source was Douglas, whom Sumner characterized as a "noisome, squat, and nameless animal . . . not a proper model for an American senator." The “Little Giant” was annoyed enough to whisper to a colleague, "that damn fool will get himself killed by some other damn fool," but otherwise he was content to say and do nothing.

Would that another target of Sumner’s rhetoric had stayed similarly philosophical. In the same speech, Sumner launched into a withering tirade against South Carolina Senator Andrew P. Butler. The Southerner, he declared, had embraced a “harlot”--slavery.

That proved too much for Brooks, a relative of Butler. He was angered that Sumner had maligned not only an entire region, but also a politician too old and sickly to attend the speech.

Had Sumner been a Southerner, Brooks might have challenged him to a duel, where the Northerner would have had at least a chance of defending himself. But, since he was a Yankee, Brooks dismissed him as being beyond the code of chivalry. He’d have to handle him another way.

That “way” was physical violence. Colleagues might have sensed as much had they noticed Brooks' cane, one he had used during a duel he had fought in 1840. As Sumner sat at his desk, using his franking privilege to send out copies of his speech, Brooks entered. He then proceeded to strike the senator so viciously that the cane broke.


The differing reactions of the two sections epitomized overall opinions in the nation on the slavery issue. While the North denounced Brooks and members of the House tried unsuccessfully to oust him, Southerners sent him canes to replace the one he broke during the beating.


Sumner was forced to leave the Senate for three years to recuperate from his injuries, but he lived long enough to see the downfall of slavery. Ironically, Brooks died within the year, joining in death the man whose reputation he defended in the most violent way, Butler.

Theater Review: “Lombardi,” by Eric Simonson

I really would have kicked myself if I hadn’t seen Lombardi at Broadway's Circle in the Square before it closed this weekend. No, I couldn’t compare it to the classic dramas that the theater premiered in its early history (notably Eugene O’Neill’s Long Day’s Journey Into Night).

But, as a graduate of St. Cecilia’s High School in Englewood, N.J., where legendary Green Bay Packers coach Vince Lombardi got his start, I have been fascinated by his life story. Moreover, as an admirer of the source for Eric Simonson’s play, David Maraniss’ marvelous biography When Pride Still Mattered, I couldn’t help but wonder how the material of his life had been transformed.

By the time I was done sitting through the 90 minutes (without intermission) of this play, I could only liken myself to Packer opponents as they faced the team’s famed “power sweep” in their '60s glory days. I knew more or less what was coming, but the whole was so flawlessly executed that I was powerless to withstand it.

(Incidentally, to any other alumni from my high school who might read this: Yes, St. Cecilia’s is named several times in the course of the play. I managed to restrain myself each time it was mentioned.)

Simonson has chosen for the central situation of his play the 1965 season. The Pack have finished out of the running for the last two seasons, after having won two straight. For a ferociously driven coach who has been preaching the importance of winning since he first came to Green Bay in 1959, this situation is not only unacceptable but physically sickening.

Into the scene steps the (fictional) Look Magazine reporter Michael McCormick, appearing at the Lombardi home to write an in-depth profile of the coach. McCormick first tries to establish a relationship with a coach angry over being burned by another magazine writer the prior year, then has to fend off Lombardi’s heavy-handed attempt at news management.

McCormick is more than a stand-in for the audience in coming to understand the famous coach. He’s also a means for exploring one of the show’s themes: the problematic search for fathers. The reporter is following the same career path as his late father, though the two had trouble seeing eye to eye while the older man was alive.

In a sense, McCormick reenacts the same struggle with Lombardi that he had with his parent: the need to win approval while still establishing his own identity and independence. While that struggle is eternally relevant, it was particularly so during the “generation gap” of the 1960s.

Lombardi himself has his own struggles as a parent. His total concentration on his team makes him a de facto father figure for the players he treats with tough love, but at home he is never there enough for his son and daughter.

It’s a difficult position, having to serve, in effect, as a dramatic device, but as McCormick, Keith Nobbs acquitted himself well in making his character three-dimensional. He also brought the house down with what might be the funniest moment in a play filled with laughs: his dead-on impersonation of how Lombardi might have turned out if, instead of continuing his stalled quest for an NFL coaching job, he had accepted a quieter, more family-friendly job as a bank officer (“You want a loan? You think you deserve a loan?”).

I can think of other events in the life of Lombardi more inherently dramatic than the pivotal week dramatized here (the Packers’ classic “Ice Bowl” playoff game against the Dallas Cowboys comes instantly to mind). But those might have missed two of the cornerstone figures of the Packers, their backfield of “Thunder and Lightning” (fullback Jim Taylor and halfback Paul Hornung), both gone by late in the Lombardi regime.

Bill Dawes brought a welcome lightness of touch to Hornung, a player with a burning desire to score, on and off the field (in one scene, he chuckles that he has to leave, as he has a stewardess waiting for him).

But Chris Sullivan also delivered in an infinitely dicier role, as the hulking, monosyllabic Taylor. For much of the play, Simonson's script depicted the Hall of Fame running back as tough as a bull and as unthinking (he calls every male he meets “Roy” because he can’t remember their real names).


Yet three-quarters through, Sullivan made the most of his longest set of lines in the play, pouring out the back’s mixture of pride and anger in arguing unsuccessfully with Lombardi for a salary that would not only compensate him adequately for past accomplishments but also for the endless injuries and physical punishment he absorbed stoically for years.

Dan Lauria, most familiar to fans as Kevin Arnold’s father on the comedy The Wonder Years, brought out nearly every facet of this most complicated of human beings--in the words of McCormick, “the most imperfect perfect man I ever knew.” A former high-school football coach himself, he understood how to convey the combination of intelligence, charm and sheer bluster Lombardi needed to transform a team of losers into a squad with five NFL titles and two Super Bowl championships. The performance alternated, in reenacting the coach's legendary training-camp rants and exhortations, some of the most astonishing displays of lung power in this or any other Broadway season with displays of unconscious vulnerability. (Stress made Lombardi continually sick; colon cancer struck him down at only 57.)

Fully a match for Lauria was Judith Light, whose role as wife Marie Lombardi was a far cry from Who’s the Boss? Drawing herself up to her full height, filling the room with the most gin-soaked, smokiest voice this side of Elaine Stritch, she could have settled for the kind of tough, wisecracking woman that Eve Arden specialized in during the Forties. But she let you see the fragility that the ripostes couldn’t fully disguise—devotion to a man whose commitment to winning left his wife in the middle of nowhere, with precious little comfort except for the martinis that left her increasingly unsteady. Light fully deserved her recent Tony Award nomination.

I’ve never made it out either to the Pro Football Hall of Fame or the Packers’ Lambeau Field, but I felt as if I were approaching the next best thing in the mini-shrine to the Lombardi-era Packers erected in the theater’s lobby. It was filled with all kinds of collectible items, including the jerseys worn by Taylor, Hornung, quarterback Bart Starr and linebacker Ray Nitschke, footballs signed by the Packers, and Super Bowl rings (including the image accompanying this post).

Lombardi was hardly groundbreaking theater, but in its rich exploration of this most complicated of coaches, it was a solid and moving character study. Like the Maraniss biography, it kept in exquisite balance its view of a man consumed by a pursuit of perfection he knew he could never achieve.

Bonus Quote of the Day (J.R. Moehringer, on Vegas, Boom and Bust)

“Vegas is America. No matter what you read about Vegas, no matter where you read it, this assertion invariably pops up, as sure as a face card in the hole when the dealer’s showing an ace. Vegas is unlike any other American city, and yet Vegas is America? Paradoxical, yes, but true. And it’s never been more true than during these past few years. Vegas typified the American boom—best suite at the Palms: $40,000 a night—and Vegas now epitomizes the bust. If the boom was largely caused by the housing bubble, Vegas was bubble-icious. It should be no surprise, therefore, that the Vegas area leads the United States in foreclosures—five times the national rate—and ranks among the worst cities for unemployment. More than 14 percent of Las Vegans are without work, compared with the national rate of 9.5 percent.”—J.R. Moehringer, “Las Vegas: An American Paradox,” Smithsonian Magazine, October 2010


Many of my work colleagues, now out in Sin City for our annual convention, will relate to this piece—which, judging by a number of reader comments on Smithsonian’s Web site, has a lot of Las Vegas residents in a real snit.

As someone who has visited Las Vegas several times over the years, I found myself continually agreeing with J.R. Moehringer, a Pulitzer Prize-winning journalist who has gone on to write a fine memoir of his own (The Tender Bar) as well as to collaborate with Andre Agassi on his. But judge for yourself. I think you'll enjoy his dry wit and sharp eye for detail.

Quote of the Day (St. Therese, on Prayer as a “Cry of Gratitude and of Love”)

“Prayer is, for me, an outburst from the heart; it is a simple glance darted upwards to Heaven; it is a cry of gratitude and of love in the midst of trial as in the midst of joy! In a word, it is something exalted, supernatural, which dilates the soul and unites it to God. Sometimes when I find myself, spiritually, in dryness so great that I cannot produce a single good thought, I recite very slowly a Pater or an Ave Maria; these prayers alone console me, they suffice, they nourish my soul.”--St. Therese of Lisieux (1873-1897), Story of a Soul: The Autobiography of St. Therese of Lisieux

Saturday, May 21, 2011

Bonus Quote of the Day (A Rebel Recalls Richmond, the New Confederate Capital, 1861)

“The city was thoroughly jammed—its ordinary population of forty thousand swelled to three times that number by the sudden pressure. Of course, all the Government, with its thousand employés, had come on; and in addition, all the loose population along the railroad over which it had passed seemed to have clung to and been rolled into Richmond with it. Not only did this mania seize the wealthier and well-to-do classes, but the queerest costumes of the inland corners of Georgia and Tennessee disported themselves with perfect composure at hotels and on the streets. Besides, from ten to fifteen thousand troops were always collected, as a general rendezvous, before assignment to one of the important points—Norfolk, the Peninsula, or the Potomac lines. Although these were in camp out of town, their officers and men thronged the streets from daylight to dark, on business or pleasure bent; and the variety of uniforms—from the butternut of the Georgia private to the three stars of the flash colonel—broke the monotony of the streets pleasingly to the eye.

“Hotel accommodations in Richmond were always small and plain, and now they were all overflowing. The Spotswood, Exchange and American held beds at a high premium in the parlors, halls and even on the billiard-tables. All the lesser houses were equally packed, and crowds of guests stood hungrily round the dining-room doors at meal-times, watching and scrambling for vacated seats. It was a clear case of ‘devil take the hindmost,’ for their cuisine decreased in quantity and quality in exact ratio to augmentation of their custom.”-- Thomas C. DeLeon, Four Years in Rebel Capitals: An Inside View of Life in the Southern Confederacy, From Birth to Death (1890)

More often than not, this blog’s discussion of history focuses on a specific person or group. But, if you’re like me, sometimes you want to see a broader canvas—i.e., not the people who made the major decisions, but how ordinary people lived in extraordinary times. Social history, if you will.

The date that gave rise to this Bonus Quote of the Day—the day that Richmond, Va., was designated the capital of the Confederacy—provides just the kind of occasion needed for such a discussion.

I’ve been able to find a number of titles on the Internet by Thomas Cooper DeLeon, but far, far less about his life. What little I’ve pieced together comes from A Richmond Reader, 1733-1983, a fine anthology edited by Maurice Duke and Daniel P. Jordan, and David J. Eicher’s The Civil War in Books: An Analytical Bibliography.

DeLeon claimed to have composed the book “almost immediately” after the war, based on extensive notes. He’s been criticized by historians for his partisan tone and occasional mistakes, but—at least judged by this section on Richmond—this account lives. Any Hollywood scenarist hoping to write about the Confederacy would do well to consult this for local color.

According to Duke and Jordan, DeLeon (1839-1914) was a journalist, novelist and Confederate officer from an important South Carolina family. In the months surrounding the outbreak of the Civil War, he served as a government clerk in Washington before throwing in his lot with the Confederacy—first moving to Montgomery, site of the provisional capital, then to Richmond.

Reading DeLeon’s lively remembrance of Richmond after the new government was transferred from the Deep South reminded me of nothing so much as a more recent history/memoir written years later: Washington Goes to War, by David Brinkley. I’m not talking simply about the striking similarities between the two authors (Southern journalists in their early 20s at the time of the events they recount, in cities below the Mason-Dixon line).

No, what I have in mind is the energetic, jostling, disruptive scene—the exponential growth—experienced by a comparatively sleepy metropolis as it swells to previously unimagined importance—not just as a government center, but as the fulcrum of a military-industrial complex.

The homefront “war” Brinkley witnessed, as a young radio reporter, was World War II. Franklin Roosevelt’s desire to make the United States the “arsenal of democracy” will sound familiar after you’ve just read the paragraphs from DeLeon: soldiers everywhere you looked, mushrooming government bureaucracies, and no vacancies whatsoever to be found at the city’s hotels. (For a comic view of the latter, seek out sometime George Stevens’ wonderful 1943 romantic comedy The More the Merrier, starring Jean Arthur and Joel McCrea.) Central to this effort was the building of the Pentagon, the largest office building in the U.S. at that time.

Let’s stop for a moment to consider the issue of accommodations, because they played at least some role in Richmond’s designation as the new seat of government for the Confederacy. After only a few months in Montgomery, the representatives from the seceding states could see that the city was not for them, though the state of Alabama promised to set aside an entire tract of land for the federal district, as had been done with Washington.

Montgomery had nowhere near enough room or creature comforts for the Confederacy’s elected officials, let alone the soldiers who would have to defend it. Once the provisional government experienced the first days of a particularly muggy May, marked by especially pesky mosquitoes, it couldn’t wait to go elsewhere.

Of all the cities contending for Confederate capital, Richmond was best able to absorb an influx of people (though, as we’ve seen from DeLeon’s account, even in this instance it was strained to capacity).

And then there was the institution as vital for the Confederate war effort as the Pentagon would be for America’s in World War II: Tredegar Iron Works. Under owner Joseph R. Anderson, this private firm's foundries and machine shops—which had fabricated cannon and gun carriages for the U.S. Government—became “the Mother Arsenal of the Confederacy.” Of the 2,200 cannon produced by domestic sources for the Confederacy, Tredegar would account for half.

For all its advantages, however, Richmond possessed a true Achilles’ heel for the Confederacy: its proximity to Washington. “In selecting Virginia as their battleground, the rebels committed a crowning blunder,” The New York Times argued persuasively. “At Montgomery, its very remoteness would have secured to it a sort of immunity from punishment … but Virginia is not two days’ sail from the centres of population at the North.”

Only lackluster leadership in the Army of the Potomac prevented Richmond from being captured sooner. Even under the likes of Generals McClellan, Pope, Burnside and Hooker, though, Virginia was forced to endure constant invasions and battles, and Richmond held its breath, praying it could forestall the inevitable.

Four years later, as the city lay in smoke and ashes, it was crowded by a far different group: in the words of African-American minister Peter Randolph, they were liberated slaves, “running, leaping, and praising God that freedom had come at last.”

(The image accompanying this post, by the way, comes from Harper’s Weekly, Oct. 19, 1861. It shows an Alabama regiment marching through Richmond’s Capitol Square on their way to join General P.G.T. Beauregard.)

Photo of the Day: Guess It's Okay to Come Out Now, Right?

The evangelicals meant one thing by "rapture," but I thought of another today as I walked around Overpeck County Park in Leonia, N.J.



For my money, there could be a lot worse days for the world to end...

Quote of the Day (Andrei Sakharov, on Human Rights and Scientific Progress)

“International confidence, mutual understanding, disarmament, and international security are inconceivable without an open society with freedom of information, freedom of conscience, the right to publish, and the right to travel and choose the country in which one wishes to live. I am likewise convinced that freedom of conscience, together with the other civic rights, provides the basis for scientific progress and constitutes a guarantee that scientific advances will not be used to despoil mankind, providing the basis for economic and social progress, which in turn is a political guarantee for the possibility of an effective defense of social rights.”—Andrei Sakharov, “Peace, Progress, Human Rights” (Nobel Peace Prize Lecture), December 11, 1975


Andrei Sakharov, Soviet scientist turned dissident, was born on this date in 1921. Though dead for the last two decades, this towering figure continues to provide powerful witness, through the example of his life and work, to the cause of human rights worldwide.

Friday, May 20, 2011

This Day in Art History (1st Rockwell Graces “Saturday Evening Post” Cover)

May 20, 1916--The Baby Carriage marked Norman Rockwell's first appearance in the Saturday Evening Post--the beginning of a 50-year association between the magazine and artist that would see another 321 of his illustrations land on the cover of that publication.

The 22-year-old artist was enthralled at the prospect of reaching “two million subscribers and then their wives, sons, daughters, aunts, uncles, friends.” But for a long time, he believed that the disapproval of the Post’s legendary (and legendarily brusque) editor George Horace Lorimer would wreck his career before it had a chance to start. Only the constant prodding of a friend finally persuaded Rockwell to travel to Philadelphia and visit the offices of Curtis Publishing Co. to show his work to Lorimer.

To Rockwell’s surprise and delight, Lorimer accepted not only two of his finished paintings for covers (including The Baby Carriage) but also three sketches for future covers, paying him $75 for the work. Those works were immediate hits with the magazine’s subscribers, and over the years artist and magazine became permanently linked in the public mind.

(See my prior post on the last issue of the magazine.)

If you want to better understand the life and times of Rockwell, you can’t do better than visit the Norman Rockwell Museum in Stockbridge, Mass., as I have done a couple of times over the years. Combine it with a trip to the museum and studio of another Stockbridge artist, the sculptor Daniel Chester French, and you will receive a full immersion in Americana.

Rockwell, working in the heyday of mass commercial art, caught the man on the street in his everyday world—at work, at home, at play. The emotions he evoked—heroism, patriotism, sentimentality—often appear to be relics of a bygone era. Yet the sight of these works summons forth more than a glimmer of recognition; it also induces stabs of wistful nostalgia and dreams of a more innocent, more reverent time.

The Norman Rockwell Museum opened in 1993, the successor to a smaller museum on the town’s Main Street that even years ago could no longer accommodate the flood of tourists making their way here. Today, the institution seems to be succeeding in its campaign to nudge the art elite toward appreciating an illustrator who may well have been the country’s best-loved artist.

Like Walt Disney, Rockwell came to stand for a folksy artistic idiom, a nostalgia for an America fading in a postwar age made anxious by totalitarian threats from abroad and self-questioning about our own commitment to liberty from within.

Art historian Robert Rosenblum has saluted Rockwell’s “mesmerizing (and) diverse powers.” Much of that versatility derived from the fact that Rockwell came along at a time of exploding interest in particular art forms.

As Rockwell began his career, boosts in education, prosperity, and leisure time had widened the American middle class. Book and magazine publishing, advertising, and public relations firms sprang into life and became sophisticated in reaching this group. Rockwell was the master who knew how to appeal to these people in whatever form he put his hand to: Saturday Evening Post covers, book illustrations, Hallmark greeting cards, advertisements, calendars, catalogs, commemorative stamps, booklets, and murals.

The America that Rockwell depicted, part of a world he ruefully acknowledged as “falling apart,” was populated by the earnest Boy Scout, the towheaded scamp, the gentle family doctor who made house calls, the winking soda jerk, the praying grandmother, and the unassuming hero back from winning a war.

Rendered in all their wrinkles, broken noses, bent knees, missing front teeth, and protruding bellies, they appear even more striking in the museum’s easel paintings averaging four feet square than on the smaller but more familiar Saturday Evening Post covers. “I guess I am a storyteller,” he observed, “and although this may not be the highest form of art it is what I love to do.” His rural models, unlike more sophisticated urban types, would rather die than try to be like anyone else, he noted.

Midway through Rockwell’s career, during World War II, politicians and pundits, conscious of the nation’s new worldwide mission and burden, spoke of the “American Century” and “The Century of the Common Man,” sometimes practically in the same breath. The artist’s subjects represented the very epitome of that vision. Their geniality and the realistic style on which Rockwell relied for 60 years (he had long ago abandoned his “James Joyce-Gertrude Stein period,” he chuckled) put the artist out of favor with an art elite in thrall to the rebels and misfits who wanted not to paint the world but to reshape it.

By the end of his life, Rockwell’s American archetypes had expanded considerably. Now as much a symbol of his country as any of his characters, he was increasingly called on to paint the powerful and famous as well as the ordinary person: Dwight Eisenhower, Jack and Bobby Kennedy, John Wayne, Ann-Margret, Ted Williams, and the Apollo 11 astronauts, among others.

More important, the end of his long association with the Post, which had long stipulated that blacks and other minorities could never be depicted in other than menial jobs, unshackled him. Rockwell’s work in the Sixties and Seventies for Look, McCall's, and even the leftist Ramparts became less lily-white and more multicultural, while also giving vent to his growing interest in such issues as poverty and civil rights.

Rockwell’s legendary productivity, resulting in no small part from a reluctance to say no, led to such stress that more than once he had to be treated for depression, a condition often hidden as a kind of shame in his lifetime.

The hundreds of works on display at the museum testify to the means of escape from his inner turmoil and the source of his creativity: the wider community. This, after all, was a man who admitted to knowing just about everybody who passed on the street below as he looked out the window of his Stockbridge home, a man who delighted in serving on the local dance committee when he wasn’t painting.

In his small New England town, he could see the entire world through fresh, untired eyes. “Don’t artists have an obligation to humanity?” he asked one of his sons. “...Does an artist live in a world by himself? Is it his only obligation to express his own insides, or does he have an obligation to keep?”

“I just painted life the way I wanted it to be,” Rockwell once said. He was being far too modest.

Song Lyric of the Day (Marvin Gaye, on Love and War)

“You see, war is not the answer
For only love can conquer hate.”—Renaldo “Obie” Benson, Al Cleveland, and Marvin Gaye, “What’s Going On,” performed by Marvin Gaye on his LP What’s Going On (1971)


The Motown masterpiece What’s Going On, released 40 years ago today, emerged from personal and national cauldrons. For much of the prior year, Marvin Gaye slipped out of the music business, so distraught over the loss of dear friend and vocal partner Tammi Terrell that he not only thought of leaving the industry entirely, but even tried out at the Detroit Lions training camp in a midlife attempt to become a professional football player.

In addition, the return of Gaye’s brother from Vietnam made the singer question the value of hits such as “I Heard It Through the Grapevine” and “Too Busy Thinking About My Baby.”

When Gaye finally returned to the studio, he was brimming with ideas for the most ambitious and challenging album of his career. While the music would blend jazz and classical elements into the now-familiar brand of Motown soul, the lyrics would cover all the churning discontents of the America of his day: the war, inner city poverty and hopelessness, the generation gap, civil rights, and the environment.

A decade before, Gaye had supplied Motown Records with one of its first two albums, The Soulful Moods of Marvin Gaye. But now label founder Berry Gordy Jr., normally a shrewd judge of material, dug in his heels against releasing his star’s latest offering, believing it would go nowhere. Not even friend and label exec Smokey Robinson’s assurance that Gaye was serious about his threat to leave the company could sway him.

In the end, what changed Gordy’s mind was something far simpler and familiar: the certain smell of a hit.

Someone leaked to a local DJ the single “What’s Going On,” even while Gordy remained adamant against its release. The song--started by Al Cleveland and Four Tops member Renaldo Benson, after watching the violent crushing of a demonstration, then completed by Gaye--began to be requested continually at the station. With everyone soon asking where the album was, Gordy relented.

Gaye’s self-produced LP did more than hit #6 on the album charts and spawn three top 10 singles, the title cut, “Inner City Blues (Makes Me Wanna Holler)” and “Mercy Mercy Me (The Ecology)”; did more than influence Stevie Wonder, Curtis Mayfield and other artists within the Motown fold in gaining creative control of their work; or, on a personal level, did more than move Gaye, formerly known best for his romantic crooning, onto an entirely new artistic plane.

No, Gaye’s smooth tenor, his mastery of a three-octave range, was put in the service of creative reconciliation, of bringing understanding to a divided country and his own fractured heart. The self-mastery he temporarily achieved was not long lasting (see, for instance, my prior post on his violent, unnecessary death at the hands of his father).


But, standing mere inches from a microphone, he gave unforgettable voice to issues that, two generations later, have not faded in urgency.

Thursday, May 19, 2011

Song Lyric of the Day (Billy Joel, on “The Empire State Laid Low”)

“Seen the lights go out on Broadway
I saw the Empire State laid low.”—Billy Joel, “Miami 2017 (I've Seen The Lights Go Out On Broadway),” from his Turnstiles LP (1976)

This is one of those albums that I nearly wore out during my high-school years. Its inexplicable lack of mass appeal (stalling at #122 on the charts) left Billy Joel one album (The Stranger) away from the beginning of his truly significant commercial success.

But song for song, Turnstiles--released on this date 35 years ago--ranks very nearly at the top of his entire discography. While “New York State of Mind” probably has the greatest potential for becoming a standard, several other songs have been covered by other artists, including “Summer, Highland Falls” (Peter, Paul and Mary) and “Say Goodbye to Hollywood” (Ronnie Spector).

The singer-songwriter wrote this, the concluding song of the LP, after seeing the famous New York Daily News headline: “Ford to City: Drop Dead.” But, to an extent, it also reflected his own circumstances.

“Miami 2017 (I've Seen The Lights Go Out On Broadway)” is a song about desperation and defiance, two attitudes that Joel was getting to know all too well. He had been forced to fire the original producer of the album, James William Guercio (famous for his work with Chicago and Blood, Sweat and Tears) and take the producing chores into his own hands. Coming five years after twin career disasters (an onerous contract, a mastering error that left his first album, Cold Spring Harbor, a bit too fast and his voice a bit too high), this latest setback undoubtedly left him reeling.

In a way, this personal adversity made Joel bond even closer to the beleaguered New York area when he returned from the West Coast and formed a new backup band for himself. His rallying cry for the city is, in fact, expressed in entertainment terms: “They turned our power down/They drove us underground/But we went right on with the show.”

“Miami 2017” becomes an anthem of a metropolis standing tall, no matter what the disaster—bankruptcy, blackout, or bombing. Even at the song’s end (the year 2017, of course: a date that, astonishingly, now seems right around the corner from us), there is one voice--the singer’s--to, if nothing else, “keep the memory alive.”

Quote of the Day (Balzac, Seeing Through the Likes of The Sperminator Years Ahead of Time)

“Men are like that—they can resist sound argument and yield to a glance.”—Honore de Balzac, A Marriage Settlement (1835)

Wednesday, May 18, 2011

Quote of the Day (John Marshall Harlan, on the Supreme Court’s “Brutal” Assault on Civil Rights)

“In my opinion, the judgment this day rendered will, in time, prove to be quite as pernicious as the decision made by this tribunal in the Dred Scott Case. It was adjudged in that case that the descendants of Africans who were imported into this country and sold as slaves were not included nor intended to be included under the word 'citizens' in the Constitution, and could not claim any of the rights and privileges which that instrument provided for and secured to citizens of the United States;… The recent [Thirteenth Through Fifteenth] amendments of the Constitution, it was supposed, had eradicated these principles from our institutions. But it seems that we have yet, in some of the States, a dominant race -- a superior class of citizens, which assumes to regulate the enjoyment of civil rights, common to all citizens, upon the basis of race. The present decision, it may well be apprehended, will not only stimulate aggressions, more or less brutal and irritating, upon the admitted rights of colored citizens, but will encourage the belief that it is possible, by means of state enactments, to defeat the beneficent purposes which the people of the United States had in view when they adopted the recent amendments of the Constitution, by one of which the blacks of this country were made citizens of the United States and of the States in which they respectively reside, and whose privileges and immunities, as citizens, the States are forbidden to abridge. Sixty millions of whites are in no danger from the presence here of eight millions of blacks. The destinies of the two races in this country are indissolubly linked together, and the interests of both require that the common government of all shall not permit the seeds of race hate to be planted under the sanction of law. 
What can more certainly arouse race hate, what more certainly create and perpetuate a feeling of distrust between these races, than state enactments which, in fact, proceed on the ground that colored citizens are so inferior and degraded that they cannot be allowed to sit in public coaches occupied by white citizens. That, as all will admit, is the real meaning of such legislation as was enacted in Louisiana.”—Justice John Marshall Harlan, dissenting opinion in Plessy v. Ferguson (1896)


Another section of this opinion, delivered 115 years ago today by John Marshall Harlan, associate justice of the Supreme Court, is far better known to posterity: "our Constitution is color-blind and neither knows nor tolerates classes among citizens.'' But the section above is far more prophetic, and deserves to be better known, both for its deep understanding of the high court’s problematic history in preserving the rights of America’s most marginalized citizens and for its foresight about the possibilities for mischief in laying down dangerous new precedents that ignore the intentions underlying constitutional amendments.

At first glance, Harlan, the son of a Kentucky slaveowner, might be the last person one would expect to deliver one of the most ringing calls for racial equality from any American court. Yet he appears to have been one of the few jurists of that era comfortable with socializing with African-Americans--or, for that matter, Hispanics or Chinese.

Harlan is known to the great mass of American high school and college students (if he is known at all) for his lonely, courageous dissent from the Supreme Court’s 8-1 finding that segregation could be justified on “separate but equal” grounds. He deserves to be even better known. Apart from his decisions, he was, simply, quite a character. The last "tobacco chomping justice," in the words of colleague Oliver Wendell Holmes Jr., he loved bourbon, golf, baseball, and colorful clothing.

Harlan was appointed to the court by Rutherford B. Hayes in 1877, as the nation began its long, dark retreat from Reconstruction. But the court's unwillingness to preserve civil rights cannot be blamed on him. He rendered 34 years of distinguished service to the nation's highest court, and a half century later a more sympathetic set of justices would find his reasoning in Plessy compelling enough to begin overturning American apartheid.