Friday, March 25, 2011
“Babe, all I know is what I see
The couples cling and claw
And drown in love's debris.”—“That’s the Way I’ve Always Heard It Should Be,” lyrics by Jacob Brackman, music by Carly Simon, from the Carly Simon LP (1971)
“His cigarette glows in the dark.
The living room is still;
I walk by, no remark.”
Thursday, March 24, 2011
First she was the child star, then the adult beauty, then the star onscreen and off. But, in the end, she made the greatest public impact as friend and survivor.
Steel Magnolias was made over 20 years ago, but by that time Elizabeth Taylor had already endured nearly three decades of medical conditions that could have killed her. Her first Oscar, for Butterfield 8 (1960), was widely regarded at the time (including, evidently, by the actress herself) as a sympathy vote, coming only months after a near-fatal bout with pneumonia. Over the years she also contended with a bad back (courtesy of a fall from a horse, at age 12, while shooting National Velvet), respiratory problems, weight issues, alcohol and pill addiction, and, in the end, congestive heart failure.
Time, W.H. Auden wrote, “is indifferent in a week/To a beautiful physique.” As indicated by a rather nastily titled Doonesbury collection by Garry Trudeau from the early ‘80s--A Tad Overweight, But Violet Eyes to Die For--Taylor’s fame lasted longer than her looks.
But the fact is, amid the wreckage of so much of her life (eight marriages!), she did endure. If Lindsay Lohan wants a model of how to emerge on the other side of the worst that life can throw at you, she could do far worse than Taylor.
People should be remembered not simply for the sum of their worst moments, but for what they were at their best. In that light, I chose for the image accompanying this post a movie still that Taylor herself, I suspect, might have cherished. It comes from A Place in the Sun (1951), a film released when she was only 19, co-starring Montgomery Clift. Yes, it shows the looks that made women envious and men temporarily insane (critic James Agee, more than two decades her senior, admitted, after seeing her in National Velvet, that he was "choked with the peculiar sort of adoration I might have felt if we were both in the same grade of primary school"), but it conveys so much more.
Clift’s conflicted sexuality prevented the two from ever having a physical relationship, but Taylor’s deeply felt bond with her co-star was, in its way, as loving as any she enjoyed with her husbands. He died young, but he knew that she would always be there for him. You might be down on your luck, but as long as Taylor was around, you were rich in friends.
Maybe it was that ability to be there for someone else that lay behind her advocacy for AIDS victims (prominently including another closeted gay friend, Giant co-star Rock Hudson), including through the Elizabeth Taylor AIDS Foundation. And maybe it is because of this instinct that--years after the end of a career she increasingly regarded as superficial, and years after medical problems that would have killed anyone else--Taylor lives on, recalled instantly even by those who never saw her in her celluloid heyday.
Tuesday, March 22, 2011
“Why doesn’t the breeze delight me?
Why doesn’t the night invite me?”—"Spring Is Here,” music by Richard Rodgers, lyrics by Lorenz Hart, from the musical I Married an Angel (1938)
At least temporarily, I understand a bit of what Rodgers and Hart were expressing. While standing by the side of a duck pond in Demarest, N.J., over the weekend, taking this and other photos, I felt that the season was not quite there yet in my heart.
Maybe in a few more weeks—after that snow we’re supposed to get again tomorrow…
Monday, March 21, 2011
Jean Harrington (played by Barbara Stanwyck): “Do you really think so?”
Charles: “Yes, you have a definite nose.”
Jean: “I'm glad you like it. Do you like any of the rest of me?”—The Lady Eve (1941), written and directed by Preston Sturges
The Lady Eve, released 70 years ago today, furnished Barbara Stanwyck with one of her golden opportunities to do what she did best onscreen: demonstrate that she was forever deadlier than the male. Even the sharpest of studs becomes inadvertent prey for "Stanny," as in the unforgettable moment when she sidles down the stairs in Double Indemnity, making you sense immediately that Fred MacMurray’s seemingly wised-up insurance salesman is already toast.
Henry Fonda’s Charles Pike is such a poor dumb sap that, had his character been transposed to film noir, it would have been a case of cruel and unusual punishment. Instead, this shy, none-too-bright heir to an ale fortune provides Stanwyck with some of the great moments in the screwball comedy genre that Hollywood buffed to a high sheen in the Thirties and Forties.
Even before Jean’s nose becomes the topic of conversation, it’s her legs that get Charles’ attention: the card sharp trips the ale heir when they’re on an ocean liner. Later, annoyed by his rejection, she puts him through another scam: impersonating a grande dame.
Early on, she evokes chuckles when, sizing Charles up, she announces, “I need him like the axe needs the turkey.” By the film’s denouement, she’ll discover, to her surprise, that she needs him, period--the most unlikely and delightful of love stories.
One of the great crimes of Hollywood is that it never saw fit to present Stanwyck with an Academy Award while she was in competition, waiting until she was in her mid-70s before giving her one of those honorary Oscars that attempt to redress wrongs to aging former box-office idols while there’s still time. (The presenter that night was John Travolta, who admitted later to being stunned that he was standing next to a woman who had long been an idol of his family, which grew up in my hometown, Englewood, N.J.)
The year that The Lady Eve came out was one of the years she could have won. (Instead, the Academy awarded the Best Actress trophy to Joan Fontaine, for Suspicion.) She could easily have been nominated that year for The Lady Eve, or even for her tough-as-nails reporter in Frank Capra’s Meet John Doe. As it happened, her nomination was for a turn maybe even funnier than the one she had in The Lady Eve: the deliciously named nightclub singer Sugarpuss O’Shea in Ball of Fire.
In a wonderful centenary tribute published three years ago in The New Yorker, critic Anthony Lane wrote of the woman in the accompanying photo: “It was a face that launched a thousand inquisitions: the mouth too tight to be rosy, and a voice pitched for slang, all bite and huskiness. When I think of the glory days of American film, at its speediest and most velvety, I think of Barbara Stanwyck.”
Saturday, March 19, 2011
Declaring its principles immediately, the Provisional Congress of the Confederate States of America adopted a document that, in nearly all respects, was identical to the U.S. Constitution. The most significant differences, however, involved what the seven seceding states had in common: ownership of slaves, which received special protection.
The last two sentences are so unexceptional that, not too long ago, there would have been a “dog-bites-man” quality to them. But some events and commentary over the last year or so suggest, amazingly enough, that the place of slavery as the primary cause of the Civil War is still very much a live question.
(See, for instance, Virginia Governor Bob McDonnell’s statement that he didn’t include slavery in a proclamation of Confederate History Month because he didn’t feel that it was a “significant” factor for his state, or claims by the Sons of Confederate Veterans that slavery wasn’t the “sole” cause of the conflict.)
How, these people ask, could slavery cause the conflict when a majority of Southern whites did not own slaves, when several border states that (barely) stayed in the Union permitted slaveholding, and when so much of the North was every bit a match for the South in virulent racism?
All granted—and all beside the point.
In a prior post, I discussed how Abraham Lincoln’s Second Inaugural Address—most famous for its eloquent phrase, “with malice toward none, with charity for all”—logically identified the exact way in which slavery lay behind the origins of the war. It involved Southerners’ insistence that the institution be extended into new territories, and the resistance by Northerners to this idea. “All knew,” he said, that “somehow” slavery was the war’s cause.
Some Southern naysayers (the temptation is overwhelming to call them “slavery deniers”) would undoubtedly scoff at citing on this point Lincoln, the Confederacy’s greatest rhetorical foe. He would say such a thing, wouldn’t he? they might ask.
Contemporary Southern Voices on Slavery
But you don’t have to turn to a Northerner to see how the origins of the war were perceived in its own era. You can turn to sources impossible to turn away from—Southern ones.
I’m not even going to discuss here all the major events of the decade before Lincoln’s election that aggravated North-South relations: the admission of slave vs. free states, the Fugitive Slave Law, Harper’s Ferry, the rise of a Republican Party that opposed slavery’s expansion to the territories, a Democratic Party torn asunder in the election of 1860 by Southern belief that frontrunner Stephen Douglas would not guarantee the introduction of slavery into lands west of the Mississippi.
No, all we have to do is look at what those who mattered in the South—the men who made the laws, who governed, who ruled with the consent of the mass of whites—said or wrote about the place of slavery in their would-be new nation.
Nearly three decades ago, one of my college professors told me he felt textbooks were “a cultural menace to our society.” I think I know what he meant. Already by that point, they were being “dumbed down” so much--stripped of any unusual language or allusion that a reader might have to grasp from context--that, over time, they had become joyless to read and dead on arrival in one’s hands.
How much do high-school history courses encompass primary-document reading outside of texts? And are college history courses that much better in this regard?
I ask these questions because primary documents disclose so much. In the case of slavery as the primary cause of the Civil War, they settle the issue resoundingly.
Start with South Carolina, the hotbed of antebellum secessionism. Its secession declaration, released only a few weeks after Lincoln’s election, mentioned slavery no fewer than 18 times! (If the subject didn’t matter, why did South Carolina keep dwelling on it?)
Slavery Without the Euphemisms
Let’s turn now to the Confederate Constitution. True, some sections, dealing with the structure of the government, departed somewhat from the document molded in Philadelphia by the Founding Fathers. (For instance, the President was limited to a single six-year term, appropriations bills had to clear higher voting hurdles in Congress, and Cabinet members were given the right to speak in Congressional debates.) But you couldn’t tell many sections of the two documents apart if you placed them side by side.
Except, as I’ve written earlier, those concerning slavery.
Begin with the word itself. The creators of the founding document of the Confederacy had no qualms about using it—unlike many of their descendants in the 20th and 21st centuries, who have preferred the far more genteel “servants,” or the Founding Fathers, who couldn’t bring themselves to name the African-Americans laboring against their will at all.
Northern delegates to the Philadelphia convention in 1787, knowing that anti-slavery sentiment was percolating in their states after a momentous war fought in the name of liberty, feared that using the term would enrage many constituents. At the same time, they worried that the young nation, hemmed in by foreign powers, would not survive at all without the Southern states.
In contrast, Southern delegates wanted the power and influence of their states recognized as far as possible by counting everyone within their borders as part of the census. Yet there were limits to their unanimity. Many Southerners were already either hoping slavery would disappear (James Madison) or moving in that direction (George Washington).
So, in the end, the South yielded to the sensitivities of the North: only three-fifths of the slave population would count in censuses. Moreover, Southern delegates went along with the North’s euphemisms for slaves, i.e., “other persons,” “such persons” and persons “held to service or labour.”
Move forward more than 70 years, to Montgomery. Now representatives of the seceding states had no compunction about naming this group: “slaves.” In fact, “slave” or “slavery” appears 10 times in seven different clauses.
These clauses were rewritten to build firewalls around the rights of slaveowners, who in this document were guaranteed the right to take their property wherever they wanted in the states or territories. The delegates even adopted the hard line of Chief Justice Roger Taney, who ruled in his notorious Dred Scott decision that the peculiar institution could never be prohibited in any territory.
The "James Madison" of the Confederate Constitution
Again, there should be no surprise in any of this. One of the Montgomery delegates, Thomas R. R. Cobb of Georgia, regarded as the "James Madison" of the Confederate Constitution (the original manuscript is believed to be in his handwriting), had written the influential Inquiry into the Law of Negro Slavery in the United States of America (1858), which not only argued that slaves, dating back to Roman times, lacked any recognition as persons but gave short shrift to the extensive manumission that occurred in ancient times.

The other Confederate delegates were fully prepared to follow Cobb’s lead. As William C. Davis noted in Look Away! A History of the Confederate States of America, elected officials in the South formed an oligarchy, largely based on requirements that officeholders own considerable property. The key aspect of this wasn’t merely land but value. In the mountain areas of states such as Virginia, for instance, the land itself had little value for farming. Possession of slaves, then, constituted value. These officeholders were used to deference from not only slaves but even poor whites.
The most oligarchic Southern state was South Carolina, where not only did property-value qualifications rise with the office but even the state’s presidential electors were chosen by the legislature rather than by popular vote. It was also the state most gripped by secessionist fever. But elites were hardly confined to South Carolina: 49 of the 50 Montgomery delegates were slaveowners. Many of them regarded fellow officials as, in effect, members of a club. If ever there was a self-interested founding national document, it was the Confederate Constitution.
Slavery the "Cornerstone" of the Confederacy: Alexander Stephens
Finally, we have the word of Cobb’s fellow Georgian, Alexander H. Stephens. Bear in mind that Stephens was as far from a fire-eater as the delegates could get. A Whig Unionist who had become friendly with Abraham Lincoln while the two served in the House of Representatives in the late 1840s, he had only yielded after his state had voted in favor of secession in the winter of 1861. His election as Vice-President of the Confederacy was meant to signal to the wider world that the South had seceded with only the greatest reluctance.
Yet even Stephens—what passed for a “moderate” in Montgomery—expressed his admiration for the Confederate Constitution in the most radical terms. In a speech in Savannah, Ga., on March 21, only 10 days after the Provisional Congress had adopted the document, he observed that it corrected one of the major errors of Thomas Jefferson and other statesmen of his time: the belief that “the enslavement of the African was in violation of the laws of nature,” an idea resting on “the assumption of the equality of races.”
In contrast, the Confederacy, he noted, “is founded upon exactly the opposite idea; its corner stone rests upon the great truth that the negro is not equal to the white man, that slavery--subordination to the superior race--is his natural and normal condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”
The "Cornerstone" speech--which goes on to liken this "discovery" to the ideas of Galileo and Adam Smith--deserves recognition, along with James Henry Hammond's 1858 "Cotton is King" speech, as the summa of Southern self-delusion, an intellectual justification of pseudo-science by a self-interested elite that would bring untold carnage and grief in its wake to hundreds of thousands across the nation.
One Founding Document That Endured--And Another That Didn't
It took only 10 days for the Confederate Congress to debate its constitution, and only another two weeks before the required five states put the document into force by ratifying it. That was in marked contrast to the U.S. Constitution, which required four months of deliberation and nearly another year after that before it went into effect.
But then again, the document forged in bitter behind-the-scenes disputes on slavery--disputes reflected publicly in clauses reeking of ambiguity and embarrassment--endured far longer than the South’s version, which was passed with overwhelming consensus by an elite anxious to preserve its ancient privileges, without any ambivalence over its founding principle: not just racism but an all-encompassing moral and legal subjugation based on that belief system.
Friday, March 18, 2011
Bernard Malamud’s The Fixer influenced me as much as any novel I read in high school—both for its unobtrusive but effective style and its message about the indomitable spirit of the individual, even in the face of crushing injustice and circumstance. Many times, when things feel hopeless, I recall the above quote. It has, in a way, become a kind of personal credo, like the W.E. Henley poem "Invictus."
Malamud passed away writing at his desk 25 years ago on this date, only a day after telling editor and friend Robert Giroux that he expected to finish the last four chapters of his latest novel, The People, by the fall. It was an unjust twist of fate that he wasn’t able to complete it, but nothing like what befell his hero Yakov Bok in his Pulitzer Prize-winning novel.
The most unlikely of heroes, Bok leaves his village for Kiev in 1911 to make a better life for himself as a handyman. Instead, he finds that, even though he is a nonobservant Jew, he is accused of the ritual murder of a Christian child, in a Czarist Russia rocked by periodic pogroms over the age-old "blood libel" canard.
(Several weeks ago, Sarah Palin plunged into controversy by using the term “blood libel” to describe how liberals had tarred her with the extremism associated with the shooting of Rep. Gabrielle Giffords. Before she appears again on Fox News, she might want to read The Fixer and ask herself if her case even remotely compares to the one in this book—a situation based on an all-too-common reality of early 20th-century Europe, such as the real-life case of Menahem Mendel Beilis, a Ukrainian Jew whose sensational trial inspired Malamud's treatment here.)
In the scene from which the above quote is taken, a window of hope is opened for Bok, only to be cruelly closed again. He’s finally found, in Bibikov, the closest thing to decency among those examining his case—true, a man who’ll urge Bok’s prosecution for the “crime” of living in an area forbidden to Jews, but at least determined to drop the far more serious murder charge. "If the law does not protect you, it will not, in the end, protect me," Bibikov notes.
Almost immediately, that remark becomes unexpectedly true, as Bibikov is murdered. His death only adds to a string of troubles that make Bok a modern-day Job.
But, against all odds, the “fixer”—a proverbial “little man” standing against a giant, monstrous legal system—endures. By at least surviving until his trial, he stands a chance of disproving the murderous falsehood that would doom not only himself but all Jews.
What helps him go on? Who is to say that it isn’t the words of the proto-existentialist Bibikov, urging, decades before Camus, the necessity of action even when all seems hopeless?
For all their differences in station and outlook, Bibikov and Bok end up sharing something: an ability to transcend former beliefs and circumstances by committing, come what may, to doing the right thing. Bibikov, a bureaucrat in the service of an absolutist ruler, realizes that twisting the law will destroy his country as surely as it destroys any prisoner. And Bok, early on a self-described nonreligious, nonpolitical Jew, is fortified on the last page by this hard-earned understanding: “One thing I've learned... there's no such thing as an unpolitical man, especially a Jew.”
How did I, a parochial school student, come to empathize so powerfully with a victim of anti-Semitism? It might derive from a statement Malamud made late in life: “All men are Jews, except that they don’t know it.” At one and the same time, Malamud depicts both the particular details of the lives that made Jews the scapegoats of the 20th century and the universal instincts that made them irrevocably a part of humanity.
In his first published novel, The Natural (1952), Malamud invoked the mythic overtones associated with baseball, only, in the end, to debunk them. But in The Fixer, he endowed a common man with an almost mythic heroism.
At the height of his career, Malamud ranked with his contemporary Saul Bellow and younger colleague Philip Roth in a kind of triumvirate of great Jewish-American writers. In the quarter-century since his death, while their reputations stand high or have even risen, his stock has mysteriously declined. The Fixer demonstrates why a reassessment restoring him to his rightful honored place in American letters is long overdue.
“Give me the comforting glow of a wood fire
But take all your atomic poison power away.”—“Power,” by John and Johanna Hall, from the No Nukes LP (1980)
Tuesday, March 15, 2011
I have a funny feeling these two won’t be sending each other Christmas cards again anytime soon, don’t you?
Monday, March 14, 2011
Faithful Readers, you might have noticed that on Mondays, my “Quote of the Day” tends to be humorous. The liberal contingent among you might have assumed from the scary image accompanying today’s post that this time my circuits got crossed and I was running a Halloween-oriented quote. You’re undoubtedly angry with me, then, for not giving you sufficient warning to steer the kiddies away from this Internet horror show.
But once you get past that alarming image of Newt Gingrich, I’m certain you’ll agree that I am, in fact, adhering to my Funny Monday routine. If the above quote isn’t the funniest thing I’ve ever put out there for you—well, it’s got to be among the top five, anyway.
In her New York Times column on Saturday, Gail Collins put me on the scent of this preposterous speculation by the man who should be, by all rights, this year’s most unlikely Presidential candidate. It turned out that she could only hint at the half of it.
Throughout the 1990s, I groaned about the Clinton-Gingrich Era, an age of polarization led by two baby-boomer politicos who, in temperament if not party, had more than a little in common:
* both sought to weasel their way out of military commitments during the Vietnam War;
* both reshaped their party in their own image, leading the faithful back from the political wilderness;
* both possessed volcanic tempers;
* both possessed mighty high opinions of their potential (Clinton famously marketed himself in the 1992 Presidential election as an “agent of change,” while Gingrich, according to Bush I budget director Richard Darman, trashed negotiations with House Democrats in 1990 to further his own ambitions).
But I find this especially fascinating: Both men thought they could surmount any trouble over serial infidelities by holding to the novel notion that oral sex really wasn’t sex. (At the height of the Lewinsky imbroglio, I quizzed a couple of married friends about how their wives would react if they made a similar claim. Both agreed that they wouldn’t live to tell the tale afterward.)
A couple of years ago, I rejoiced. Slick Willie had been effectively neutered--blamed for Hillary’s loss of a Democratic nomination that was hers to lose, then sidelined while she ran the State Department. Meanwhile, I vividly recalled a friend telling me how, after his resignation as Speaker of the House following his own missteps, Gingrich had been spotted in New York, visiting a publisher--and nobody seemed to pay him any mind on the street.
The Clinton-Gingrich Era belonged to the ages, I thought.
Boy, was I wrong.
I attribute Gingrich’s desire to achieve the Oval Office in the face of embarrassing disclosures about his personal life--not to mention his most unusual justification for said walk on the wild side--to Clinton. The Comeback Kid’s relationships were so multiple, so out there, that they provided other politicos across the country with practically step-by-step instructions on how to make centers of government action also centers of personal action. The outcome of the Lewinsky scandal led more than one commentator to conclude that perhaps America was finally shedding its Puritanism and acting more like Europe in its attitude toward sins of the flesh.
In an interview with Esquire, Gingrich’s second wife, Marianne, discussed how her (now ex) husband had been called to a 1998 Oval Office meeting by Clinton, who told him, “You’re a lot like me.” What that meant exactly would soon become apparent with revelations of his own tomcatting, but even before then, Gingrich became unusually hesitant about resorting to his usual rhetorical flame-throwing approach. Clinton survived the 1998 midterm election very well indeed, but Gingrich didn’t. A dozen years later, it still must eat at him.
Now watch the wheels of Newt’s mind spin. Last year, he surely took note of Clinton’s justification to friend and historian Taylor Branch about how the Lewinsky affair transpired. (It occurred, according to the ex-President, after the death of Clinton’s mother, the Democrats’ loss of Congress in the ‘94 elections, and the widening Whitewater scandal. In this telling, the leader of the free world felt unexpectedly vulnerable when the intern came by during the Gingrich-engineered government shutdown of November 1995.)
That revelation--if that’s the right word for it--didn’t cause much of a stir, let alone merriment, when it was trotted out a year ago, Gingrich must have reasoned. “Why can’t I try it out with evangelical voters?” he surely thought. “And I can subliminally support it by constantly repeating the name of another divorced politician who went on to win the Presidency: Ronald Reagan.”
Gingrich--and Clinton--might have chosen another route in speaking about their errant ways. Hugh Grant pioneered this approach, and his account to Larry King (not to mention Jay Leno and other talk-show hosts) about his mad encounter with Divine Brown got him off the hook with the public: “I could accept some of the things that people have explained, 'stress,' 'pressure,' 'loneliness' -- that that was the reason. But that would be false. In the end you have to come clean and say 'I did something dishonorable, shabby and goatish.'"
Several months ago, describing Barack Obama’s 2008 campaign, Gingrich called the President “authentically dishonest.” He has also speculated that the President might be subject to impeachment for giving up the “don’t ask, don’t tell” stance concerning gays in the military.
The ex-Speaker does all of this at enormous peril to his own thin hopes for winning the high office he has craved for so long. Americans still want to feel comfortable with the character of the man they put in the nation’s highest office. If they have a choice between a still-married father of two and a man who requested divorces from two wives as they faced health crises--a man, moreover, with a hilarious explanation for past misconduct--then neither Gingrich nor a party that might just be silly enough to nominate someone who's repeatedly done something, in Grant's memorable phrase, "dishonorable, shabby and goatish," should be surprised at the outcome.
Sunday, March 13, 2011
Regali conspectus in auro nuper et ostro,
Migret in obscuras humili sermone tabernas:
Aut, dum vitat humum, nubes et inania captet.
“But then they did not wrong themselves so much,
To make a god, a hero, or a king,
(Stript of his golden crown, and purple robe)
Descend to a mechanic dialect;
Nor (to avoid such meanness) soaring high,
With empty sound, and airy notions fly.”—Horace, Ars Poetica, translated by Wentworth Dillon, Earl of Roscommon
I don’t know about you, Faithful Reader, but I’ve given up tracking the daily pronouncements and news surrounding Charlie Sheen. He’s not only put out of business the crew of his own show, but also late-night comics, prime-time entertainment journalists and bloggers such as myself who hoped to say something definitive that would not be superseded by each successive news cycle involving the (now former) star of Two and a Half Men.
Heck, he’s even trying to sideline the editors of Bartlett’s Familiar Quotations: the number of catchphrases he’s minting with each Tweet and TV appearance (“tiger blood,” “bi-winning,” etc.) has grown so ridiculously immense that he now requires not just a few pages, but an entire CD unto himself.
When did Sheen transform from a bratty, limited-talent son of a Hollywood star into the highest-paid actor on TV, hellbent on taking down his long-running show? In other words, when did this example of garden-variety Tinseltown megalomania become a tale of not-so-ordinary madness?
Sheen likens himself to “a total freakin’ rock star from Mars.” Indeed, in our current culture, it takes only a nanosecond to morph from a rock star to a rock god. (And doesn’t a god deserve “goddesses,” like the twentysomething women from the adult-entertainment industry in his pad?)
You have to go back a long way to find people who thought they had so many divine powers—the Roman emperors, to be exact. My guess is that Sheen knows only one phrase from the centuries of Roman domination of the world: Horace’s Nunc est bibendum (“Now we must drink”).
But the great poet of the Augustan Age, in Ars Poetica, has some things to say apropos of the descent of gods into the common muck.
Fundamentally, Sheen has to watch out. It’s not just because the world outside his hermetically sealed, “bi-winning” environment shows signs of tuning him out (even the witches of Salem became so offended by his use of “Vatican assassin warlock” that they performed a “magical intervention”).
It’s also because that same public is emitting increasing signs that, though it is willing to forgive the worst—and repeated—excesses of stars, it expects repentance. Mental illness only goes so far to excuse someone who endangers his life and his family members, then goes on a 24/7, ad hoc reality show, then sues the creative powers that tried to use tough love to save his life. The crowd, as Horace shrewdly observed, scorns performers who, in Roscommon’s rendering, “With empty sound, and airy notions fly.”
Far more talented actors than Sheen have come a cropper, especially for excesses eerily reminiscent of his. Had he opened his paper or turned on his TV this week, he might have seen someone in his corner, Oscar winner and past box-office star Mel Gibson, pleading guilty to a misdemeanor assault charge for battering the mother of his child, with a career grounded for the last five years after a drunk-driving incident that included an anti-Semitic rant as out-of-left-field as Sheen's own. (See last month’s Vanity Fair article on the roots of Gibson’s decline.)
But, if Sheen really wants a glimpse of his frightening future, he would do well to rent or catch on TCM the 1933 golden oldie, Dinner at Eight—and, in particular, concentrate on John Barrymore, in a role based on his persona and in a performance as emotionally naked and terrifying as any Sheen can ever hope to see.
“The Great Profile” was just a little more than a decade removed from his electrifying Broadway turn as Hamlet, but he was already headed straight for his sorry career finale—an inebriated has-been whose failing memory--and consequent need to improvise anything on the spot--led to the pathetic spectacle of audiences laughing at his expense.
In Dinner at Eight, the situation faced by Barrymore’s character Larry Renault should strike a chord of recognition in Sheen: a star in the grip of substance abuse, abandoned at last by a press agent exhausted from covering for his endless excesses. Before his lonely end, Renault/Barrymore looks in the mirror and finds only exhaustion and emptiness. Like Sheen, he finds himself, in Horace’s words, “Stript of his golden crown, and purple robe”—and the discovery shatters him.
Friday, March 11, 2011
He might not be as consequential (for better or worse) as his father William, the founder of modern conservatism. But for my money, Christopher Buckley is a thousand times more engaging as a writer. I’m not sure that there’s a better satirist writing today, and anytime I see one of his works in print—an article, say, or, increasingly over the last several years, one of his marvelous, laugh-out-loud novels (No Way to Treat a First Lady, Florence of Arabia)—I pounce, knowing that I’m in for something good.
The first item I bought on my Kindle more than a year ago, in fact, was a Buckley novella, a Kindle-only product for The Atlantic, Cynara. That turned out to be funny and, in the end, surprisingly moving.
Tuesday, March 8, 2011
Today is the 80th birthday of prolific New Yorker contributor John McPhee (in the image accompanying this post). Far be it from me to argue with the deliberations of those who select Pulitzer Prize winners, but over the last several decades, as the Princeton, N.J., resident has concentrated increasingly—almost obsessively—on the physical world (e.g., Basin and Range), I have tended to avoid his work. (I blame The New Yorker, which, in the last years of the William Shawn era, became so musty that it allowed favored writers to go on, often at interminable length, about just about anything—see, for instance, E.J. Kahn Jr.’s Staffs of Life, a book that grew out of his multi-part series on grain.)
But in his early days, when he profiled real human beings, he gave an extraordinarily vivid picture of their world. A Sense of Where You Are, for instance, remains, more than four decades after its appearance, the essential account for understanding why Bill Bradley made such a huge impression on the basketball world in college.
Likewise, The Headmaster masterfully describes Deerfield Academy’s benevolent despot, headmaster Frank Boyden, toward the end of his 66-year career at the Massachusetts school. This piece benefits more than a little from intimate familiarity with its subject (McPhee was a product of the school himself during Boyden’s long reign). If you want to know not just about the rise of one of this country’s major prep schools—more than that, what makes an institution-builder tick—then this is the book for you.
It’s astonishing to realize that, in his six decades with the school, Boyden--whose demeanor, according to McPhee, suggested “a small, grumpy Labrador”--not only kept no written rules but expelled only a half-dozen students altogether. Would that record be possible to maintain in today’s world of broken homes that damage young lives, the substance abuse to which teens are exposed--and litigators ready to pounce on the lack of any written record of school policies?
(By the way, film fans: the 1982 Diane Keaton-Albert Finney movie Shoot the Moon was based on a screenplay by Bo Goldman, a former Princeton classmate of McPhee’s. More than a decade after its premiere, McPhee’s ex-wife sued the filmmakers, alleging that the events onscreen depicted her marital strife as witnessed by Goldman while he was a guest of the couple. The case was settled out of court.)
Saturday, March 5, 2011
Look beyond that first sentence in the above quote, the one that the 20th century’s greatest phrasemaker turned into a Cold War soundbite. With good friend President Harry Truman in the audience, watching as Winston Churchill accepted an honorary degree from Westminster College, the former British Prime Minister—the man who had led his country in its hour of greatest peril—was now, after a dutiful acknowledgement of the massive contribution of Joseph Stalin to the Grand Alliance, somberly outlining the Soviet leader’s recent catalogue of electoral crime.
The 65th anniversary of the Iron Curtain speech comes the same week that the media reported on the death of Judith Coplon Socolov, a Justice Department analyst who lived nearly 60 years after her convictions in two espionage trials. (An appellate court judge tossed out the convictions because of FBI agents’ perjury concerning the wiretapping of Socolov and their failure to obtain search warrants, and, in 1967, the government decided not to pursue the case any longer. However, the judge affirmed that Ms. Socolov was guilty, a judgment confirmed by the 1995 disclosure of the VENONA decrypts--intercepted cables concerning Alger Hiss, Klaus Fuchs, Ted Hall, Julius and Ethel Rosenberg and other spies--that the government could not disclose at the trials.)
Over 30 years ago, Vivian Gornick’s The Romance of American Communism showed how, for a group of Old Leftists, from the 1930s through Nikita Khrushchev’s 1956 “secret speech” outlining the terror of Stalin, Soviet-style Marxism became a golden ideology. In Judith Socolov's case, however, “romance” took on a double meaning, as becomes clear in the lede of Sam Roberts’ New York Times obit:
All too many academic historians have allowed that argument to go unchallenged. It should be demolished on several points: