Wednesday, April 30, 2008

This Day in Education History (Columbia ’68 Protests)

April 30, 1968—With student protestors still occupying five campus buildings they’d taken over the week before, Columbia University President Grayson Kirk called in helmeted police, with predictably disastrous results—more than 700 arrests, hundreds injured, nearly 400 police brutality complaints, the eventual resignations of both Kirk and Provost David Truman, and a decade-long decline in the university’s finances.

When I arrived on campus 10 years after the fabled demonstrations, one of the events at Freshman Orientation was a discussion featuring two professors recalling the events of that spring. Even for a group more politically active and fascinated by history than many of our peers, the discussion was useful and, indeed, needed. The passage of a decade meant that many of us knew the tumultuous event had occurred but not why. I imagine that now, with another 30 years past, the demonstrations will seem as hazy to today’s students as the stories of Jack Kerouac and Allen Ginsberg at the West End in the 1940s seemed to us.

I count myself lucky to have heard the discussion at freshman orientation. The faculty members (one, I remember, was anthropology professor Robert F. Murphy, still funny and charismatic despite already being incapacitated by the paralysis that would eventually result in his death) had a real appreciation for the swirl of events and the players involved in the crisis, with few ideological axes to grind. The same, I’m afraid, cannot be said for the panel sponsored by the university this past weekend.

Columnist John Leo has griped that the absence of non-leftists (let alone conservatives) assured a panel dominated by the Students for a Democratic Society (S.D.S.) group that spearheaded the strike. Even this exclusion would not necessarily have been fatal—the ’78 discussion my class witnessed was from a liberal perspective and did not suffer from it, for the tone throughout was wry, with a sweeping view of this event in the larger context of the university’s history. I also don’t think it’s possible to have any meaningful reminiscence of the riot without the involvement of some of the event’s participants.

Still, the overall impression of the three-day panel discussion just concluded, as reported by the New York Times two days ago, was of participants trading (anti-)war stories, with the dominant tone veering sharply toward the most self-congratulatory, smarmy, unreflective gathering I can imagine this side of Oscar night.

“You tried to be true to it,” the writer-researcher Susan Kahn remembered of the student strike. “You became a person who tried to be true to it for 40 years, who in one way or another tried to make the world better.” Edward J. Hyman, a professor of psychology, fondly recalled the late Ted Gold, who was killed nearly two years later when a bomb he was making for the Weathermen radical group exploded in a Greenwich Village town house.

Hmm…One wishes a bit more could have been said about what Gold would have done that night if fate had not intervened—like set that bomb off at a dance in Fort Dix, N.J., killing soldiers who had done him and the Weathermen no harm and leaving survivors scarred for the rest of their lives.

I also read nothing in the post-mortems on the three-day symposium concerning an actual victim of the Columbia riot itself—Frank Gucciardi, a 34-year-old plainclothes police officer, who, while dealing with residual unrest the day after Kirk called in the cops, was jumped from behind by a protestor leaping from a second-story window. Gucciardi sustained a painful back injury that required three spinal operations over the next three years and eventually forced his early retirement on disability.

Was I just being cantankerous in middle age? I went to the school newspaper (yes, the same one I had labored on more than a generation ago), and found that “Commentariat” blogger Armin Rosen was squirming at the “single-minded and out of tone” temper of the proceedings, too.

Here, perhaps, some context about the eight days of turmoil four decades ago is in order. As Professor Murphy and his faculty colleague reminded us at orientation, the protest was not against the Vietnam War or any university officials’ part in its planning (indeed, both Kirk and Truman had come out against the war), but against the university’s decision to build a gym on public land in Morningside Park and against the school’s ties with a Pentagon research institute. (Not that the war had no role to play at all; as novelist Paul Auster, a participant in the protest, recalled in a Times op-ed piece last week, he, like many of the protestors that week, was “crazy with the poison of Vietnam in my lungs.”)

To be sure, the protestors were correct that the university needed to change its fraught relationship with its neighbors (the proposed gymnasium’s second entrance, on the lower, south side facing the mostly African-American community, smacked of “separate but equal”) and that its Pentagon ties should be ended. But the events of the next eight days took on an almost hallucinogenic quality all too in keeping with the times.

The protest was led by Mark Rudd, who had spent three weeks in Castro’s Cuba earlier that year and reflected the growing S.D.S. Marxist-Leninist bent. He had gained control of the local chapter of S.D.S. by spearheading the “Action Faction” of the group that clamored for direct confrontation against the Vietnam War. On April 23, he and his associate John Jacobs, after leading a protest group of 500 gathered at the sundial in the middle of the university’s central quadrangle, attempted to take over the administration building, Low Library. Repulsed by conservative counterdemonstrators—a group consisting mostly of athletes and fraternity members that many students and faculty members doubtless were astonished to discover even existed—Rudd and his group took over nearby Hamilton Hall.

Not for long, however: another group of African-American activists from Harlem then ordered the S.D.S. group out of Hamilton Hall and told them to get their own building. It’s hard not to agree with the sentiment voiced by one of the black activists, Ray Brown, that his group was fed up with “the 72 other tendencies of the New Left,” for the protestors were now feeding alike on the university administration’s remoteness from students (Kirk knew few of them) and on the growing intoxication of microphones and cameras being thrust in front of them.

Events had now morphed way out of control. Kirk and Truman were dealing with a hydra-headed force featuring student revolutionaries, an African-American contingent with ties to Harlem activists, counterdemonstrators, faculty members who tried to act as go-betweens, and New York City police, edgy and feeling undermanned for the task.

Rudd’s refusal to accept anything less than total amnesty for the protestors eventually moved the administration to call in the cops. The result is well-known, and the administration’s response was heavily criticized the next year by a commission headed by Archibald Cox.

The damage to the university in the next 10 years was severe, spelled out, in eye-opening detail (the kind sadly missing from this past week), by Barnard College Professor Robert A. McCaughey nine years ago. McCaughey’s analysis shows that a year before the disturbances, the university had only a small deficit. By the 1970-71 school year, following cleanup expenses, legal costs, falling tuition payments, and plummeting alumni donations, the university was running a deficit of $16.5 million out of a total budget of $170 million.

Now, 40 years later, this past weekend, current university president Lee Bollinger joked to panel participants, “I thought about making my office available to you all night.” Obviously, he did not appreciate some of the underlying paradoxes of his presence there that day.

To start with, he has strenuously pushed for the university’s expansion north into Manhattanville—a move that recalled predecessor Kirk’s arm’s-length relationship with the school’s Harlem neighbors. As a prominent First Amendment expert, did Bollinger adequately consider whether celebrating a takeover in which classes were disrupted and a college dean was forcibly detained really advanced the cause of academic freedom?

The Columbia student takeover would not have happened but for the administration’s total removal from any sense of what was happening in the lives of the students. At the same time, even though the protestors forced the university to abandon the proposed gym and end the Pentagon ties, the takeover severely undermined the authority of a liberal university administration made to look impotent. It contributed to the images of disorder that led to the eventual national counterrevolution that autumn: Richard Nixon’s election as President.

Quote of the Day (Keillor)

“When in doubt, look intelligent.”—Garrison Keillor

Tuesday, April 29, 2008

This Day in Music History (Duke Ellington’s Birth)


April 29, 1899—His first and middle names at birth in Washington, D.C., were “Edward Kennedy,” but while still a youngster he became known by the name that millions of fans the world over now recognize him by: Duke Ellington.

“Class tells,” one of John O’Hara’s characters remarks in Ten North Frederick. As often is the case with the novelist, the noun in that sentence has a layer of irony: of course it refers to the social distinctions that divide people, arbitrarily and maybe even unfairly, but it also signifies elegance, courtesy and simple human dignity. In that second sense, few embody the term as well as America’s premier jazz bandleader-composer.

“Beyond category,” Ellington’s highest compliment, could just as easily apply to his life and career. He let nothing stand in the way of his creation—and audiences’ appreciation—of the notes snatched from his fecund brain: not racial or national origins, not sexual preferences, not band members’ weaknesses or audiences’ fickleness, not even the small-mindedness of people who should have known better (like the Pulitzer Prize board that in 1965 overruled its own jury’s recommendation that he receive a special citation).

From the shimmy back herringbone suit that first got the young pianist noticed in Washington to the cool remove he maintained between lovers and himself, he was a natural aristocrat—taking after his father James, “a Chesterfieldian gentleman who wore gloves and spats,” in the words of the musician’s sister Ruth.

In 1923, the 24-year-old Ellington paid his way into the segregated section of Washington’s Howard Theatre to hear saxophonist Sidney Bechet—his first real exposure to New Orleans jazz. Five years later, with Ellington firmly established in Harlem, musicians, including white ones (like Bing Crosby), were coming to see him. By the end of his life, he had played all over the world, to audiences of all races and nationalities.

Ellington’s longtime musical alter ego was Billy Strayhorn. Ellington managed the delicate balancing act of according “Sweetpea” as much recognition as he could without thrusting him so firmly into the limelight that Strayhorn’s homosexuality would be exposed in that less tolerant era.

Harder than keeping Strayhorn’s secret was holding the Ellington big band together. In the band’s heyday, it involved not just creating tunes with solos where musicians like Johnny Hodges or Cootie Williams could shine, but also keeping Ben Webster away from the bottle enough so he could show up for recording dates and concerts, or maintaining emotional equilibrium in the South, where the band would constantly deal with segregated accommodations, or playing small halls when their fortunes cratered with the bebop vogue after World War II.

In a warmly affectionate tribute to the Duke on his 70th birthday, when Richard Nixon was throwing a state dinner in his honor at the White House, Ralph Ellison noted the special irony of the occasion: at the turn of the century, Ellington’s father had served as a butler there. Now the son was recognized officially as a kind of roving ambassador of perhaps the only uniquely American art form in the world: Jazz.

Something in Ellington’s easy grace must have even impressed Nixon, who, we know now, had few equals as a racist in the 20th-century White House. Five years later, on the same day he had to go before the American people and reveal the damaging Watergate transcripts, the President still found time to call the dying musician and wish him a happy birthday. It was an unexpected grace note in a President notably lacking in any, but it also testified to the example of the nonpareil musician and man who called it forth.

Ellington’s physical decline also brought out the best in a man as complex as Nixon but, for all his roughness, far more given to warmth and generosity: Frank Sinatra. Approached by a mutual friend of the composer’s to have Ellington’s doctors checked out, Ol’ Blue Eyes had the eminent heart specialist Michael DeBakey flown up, at his own expense, to New York. DeBakey quickly sized up the situation—Duke’s doctors had blown it, and his lymphoma was terminal.

Sinatra didn’t stop there, though. On Ellington’s 75th birthday, the singer (who seven years earlier had collaborated with the jazzman on Francis A. Sinatra and Edward K. Ellington) arranged to have the hall near Duke’s room lined with baskets of fruit and flowers, costing anywhere from $2,000 to $3,000.

I probably have more CDs of Ellington than of any other jazz musician, but three stand out: Anatomy of a Murder, Live at the 1956 Stratford Festival and Ellington at Newport: 1956. The soundtrack for the 1959 Otto Preminger courtroom classic, like the film itself, is alternately urbane, lush, ironic, and sensuous. The bandleader’s appearance at Canada’s premier theater venue features one tune certainly chosen for its appropriateness: “Hark! The Duke’s Trumpets.” The Newport recording preserves forever one of the landmark moments in jazz history: when the bandleader, undoubtedly inspired by a sexy blonde dancing in the audience, allowed tenor saxophonist Paul Gonsalves to cut loose on “Diminuendo in Blue” with a 27-chorus solo that brought the festival to electrifying life and earned the Duke a Time Magazine cover story and a new recording contract with Columbia.

Quote of the Day (Seneca)

“Men do not care how nobly they live, but only how long, although it is within the reach of every man to live nobly, but within no man's power to live long.”— Lucius Annaeus Seneca, ancient Roman senator
(For one of America’s natural noblemen, see today’s “This Day in Music History,” on Duke Ellington.)

Monday, April 28, 2008

This Day in Presidential History (Birth of James Monroe, "National Security President")


April 28, 1758—James Monroe—fifth President of the United States and, according to former Presidential candidate and Monroe biographer Gary Hart, our "first national-security President" —was born the second of five children of a small planter in Westmoreland County, in what was then the colony of Virginia.

Two and a half years ago, on the same trip to the piedmont section of the state in which I visited the mansions of Thomas Jefferson and James Madison, Monticello and Montpelier, I also stopped at what their protégé, Monroe, called his "cabin castle": Ash Lawn-Highland. I have to admit that, having just come from Jefferson's iconic hilltop home-turned temple of democracy, I was surprised, even let down, by Monroe's comparatively simple farmhouse.

Yet, as my eyes swept over Monroe's rolling 535 acres in the mountains of western Virginia, I was reminded that he was, after all, a plantation owner requiring many resources to sustain the lifestyle he desired as a man and required as a politician. Moreover, a longer look afforded some surprises.

Instead of one story high, which was how it looked from the north side, the house was, I soon noticed, tucked into a hillside that concealed a basement. Once past the modest exterior, I saw evidence of a man with a taste for sophisticated, even costly foreign furniture, usually French. Clearly, there was more to Monroe than a first glance suggested.

An Underestimated Chief Executive

For all the inevitable comparisons with Monticello and Montpelier, then, Ash Lawn-Highland should not be overlooked, any more than Monroe's political legacy should be. The last of the "Virginia Dynasty" that dominated the Presidency at the start of the American Republic, Monroe was, like other Presidents down to the present day, used to being "misunderestimated."

Nobody—but nobody—thought he was smarter than his fellow Democratic-Republicans, Jefferson and Madison. Even his Cabinet doubted him. His Secretary of War, John C. Calhoun, regarded him as "slow." His Secretary of the Treasury, William Crawford, found Monroe so indecisive that he once threatened his chief with a cane.

Yet Monroe, though still not placed among the great Presidents, has achieved greater stature among historians, even making the top 10 on occasion. The doctrine named for him has dominated American policy in the Western Hemisphere for nearly two centuries. He won two terms, the second in such near-unanimous fashion that his administration was dubbed, somewhat misleadingly, the "Era of Good Feelings."

Just as Monroe rebuilt the White House after its burning by the British in the War of 1812, so he led the United States to a new era of peace and prosperity. After two wars involving its mother country in its first 40 years, the United States would never experience a British invasion again — not unless you count a quartet of moptops singing on "The Ed Sullivan Show" in 1964.

Early in his career, Monroe would have been voted one of the least likely to be called a consensus-builder. His advocacy of the French Revolution while serving as America's envoy to the country led George Washington to recall him, and Monroe's self-justifying pamphlet upon his return further annoyed the President (who, incidentally, had been his commanding officer at the Battle of Trenton, where 18-year-old Monroe had been wounded). In December 1799, Monroe's election as Virginia's governor so enraged his old boss that Washington discussed it for an hour without taking off his snow-covered cloak – triggering the pneumonia that killed him a few days later.

Nor was Washington the only Federalist that Monroe angered during his time in Washington. Thoroughly convinced that Monroe was behind the exposure of his affair with Maria Reynolds, Alexander Hamilton challenged the Democratic-Republican to a duel. Only cooler heads prevented a Weehawken-style duel that, like Hamilton's a few years later with Aaron Burr, might have resulted in loss of a political career or a life. (As I noted earlier this year, the most likely culprit in the leak was an associate of the Virginia Dynasty, John James Beckley, who went on to become the first librarian of Congress.)

The Tales This House Could Tell

Like its longtime master, Ash Lawn-Highland has passed in and out and back into fashion. Dire financial straits forced Monroe to put the house up for sale only a year after he left the White House, and even to sue the government he once headed for reimbursement of expenses from a lifetime of public service.

In 1974, the home was bequeathed by its then-owner to Monroe's alma mater, the College of William and Mary. Income from admissions, shop sales, grants, and tax-deductible contributions goes not only toward maintenance of the house, but also to merit scholarship recipients at William and Mary.

Guides at Ash Lawn-Highland were quick to point out to me and the other visitors on this golden autumn afternoon that Monroe had more executive experience than any prior American President – U.S. senator, four-term Governor of Virginia, Minister to Great Britain, Spain, and France, Secretary of State and Secretary of War.

How could a man with such accomplishments be overlooked? This property is a good place to start for an answer.

Like so much else in Monroe's life, it was influenced by Jefferson – from the location (only two and a half miles from Jefferson's estate in Charlottesville, personally selected by the great man himself), to the gardeners who started the orchards, even to the placement of the kitchen underneath the house.

Having taught the young Monroe the law and taken him under his wing as the new republic formed, Jefferson could be forgiven for offering a little architectural advice, too.

Some of his other counsel was not as benign. Upon inspecting Monroe's slave quarters, Jefferson told his protégé that they were too good for their occupants: even white guests, he noted, barely enjoyed better accommodations than the "servants" (the preferred euphemism under the "peculiar institution"). Perhaps he wasn't blessed with Jefferson's often stunning farsightedness, but Monroe was also not plagued by the moral blindness that made the Sage of Monticello so maddening to posterity.

A Beloved But Aloof Beauty

The house interior also reflects another difference between the two: the presence of a wife. There is virtually no trace at Monticello of Martha Jefferson, who died when her husband was only halfway through his life. The sensibility of Elizabeth Kortright Monroe, on the other hand, comes through as strongly as that of her husband in virtually every room at Ash Lawn-Highland.

In the parlor, I scrutinized the portrait of Elizabeth as a young woman. You could see why Monroe was devoted to her throughout their married life, and why her death hastened his own a year later.

She retained much of her youthful beauty all the way into her fifties in the White House, with some observers even comparing her to a goddess. But that description also hints at unapproachability – a distinct disadvantage for a plantation owner and politician expected to lavish hospitality on an epic scale. Elizabeth was utterly without the genius for small talk that distinguished Dolley Madison, and she used her mounting illnesses to evade social engagements she dreaded.

“The Last of the Cocked Hats”

The "last of the cocked hats" (i.e., the last President to have fought in the American Revolution), James Monroe favored knee-buckled breeches and three-cornered Revolutionary hats, even during his Presidency. This identification with the republic's beginnings had become so complete that people figured it was natural that he would even die on July 4, 1831, just like Adams and Jefferson had done five years before.

Both old-fashioned and cosmopolitan, Monroe knew enough about the Old World's corruptions and convulsions that he would not allow it to regain lost traction in the New World. The doctrine that Secretary of State (and future President) John Quincy Adams formulated for him has served as the bedrock of American foreign policy ever since. The little "cabin-castle" in Virginia reminded Monroe every day of everything valuable in his country.

Quote of the Day (Thoreau)

“A feeble writer and without genius must have what he thinks a great theme, which we are already interested in through the accounts of others, but a genius—a Shakespeare, for instance—would make the history of his parish more interesting than another’s history of the world.”—Henry David Thoreau, The Heart of Thoreau’s Journals, edited by Odell Shepard
(This is my answer to those who would wonder why anyone would be interested in reading about one person’s memoir in one small community. Put another way, it is up to the individual writer to use what John O’Hara called his “special knowledge” to illuminate ways of life not known to others.)

Sunday, April 27, 2008

This Day in Baseball History (Ryan Breaks Johnson’s Ks Record)


April 27, 1983—With his strikeout of Montreal Expo pinch-hitter Brad Mills, Houston Astros pitcher Nolan Ryan not only broke one of the most venerable baseball records—Walter Johnson’s record of 3,508 career K’s—but gave Mets fans one more reason to curse their front office for trading the young Texan a dozen years before to the California Angels for Jim Fregosi.

A reaction typical of this famously lopsided trade comes from the fine new book Mets by the Numbers, by my friend and former co-worker Jon Springer and co-author Matt Silverman: “It cost them (the Mets) a pitcher who could have complemented Seaver-Koosman-Matlack in the ‘70s, and Gooden-Darling-Fernandez in the ‘80s, and maybe even Gooden-Cone-Fernandez in the early ‘90s…that Ryan kid sure pitched for a while.”

Ah…woulda, shoulda, coulda!

A Good Idea at the Time?

With all due respect to these estimable authors, I’m not sure that Ryan would have worked out quite as well for the Mets if they’d held onto him. Ryan didn’t have a problem with the ballpark—Shea is famous as a pitcher’s park. And he didn’t have a problem with his mentors—Rube Walker developed one of the premier starting rotations of the game, and Jerry Grote was one of the most astute backstops working in the game. With all his problems in mastering control, Ryan was the baseball counterpart to Lonnie Shelton, a young center hailed as the second coming of Willis Reed for the New York Knicks but utterly unable in his first two seasons to overcome his trouble with personal fouls.

No, what I think happened was simply this: New York—and especially its media—can be tough to the point of cruelty on players, especially still-developing pitchers. (Sandy Koufax, another famous young fireballer who started out with a New York club, did not develop his control until he headed west with the rest of the Dodgers—and even then, it took a couple of years.) Even Tom Seaver, a pitcher almost as famous for his maturity as for his mastery of pitches, requested a trade immediately after Daily News sportswriter Dick Young published a speculation-packed, fact-free column about The Franchise’s being goaded by his wife to press for a higher salary because he wasn’t making as much money as Ryan.

A country boy at heart, Ryan requested the trade because he felt out of sorts in the Big Apple. The Mets, who had had problems at third base for a long time, learned that Fregosi was available. A star with the Angels, Fregosi seemed like a good veteran anchor at the hot corner, the way Ray Knight would prove in the Mets’ 1986 championship season. It seemed like a good idea at the time. Little did they know that Fregosi was on the downside of his career.

Ryan, of course, went on to play into the early 1990s with the Angels, Astros and Texas Rangers, amassing 48 career records by the time he was through.

A Tough Competitor Flying Under the Radar

Ryan’s landmark achievement on this date illustrated two major factors about his career, in my opinion: 1) the way in which achievements that would have drawn massive attention elsewhere attracted relatively little notice for him, and 2) the determination and competitiveness that enabled him to keep going and notch one milestone after another long after other, lesser players had decided to call it a day.

Let’s start with the lack of attention. Only 19,309 people turned out at Montreal’s Olympic Stadium for the chance to see Ryan make history. Not only did baseball commissioner Bowie Kuhn not show up, but nobody else from his office did. (But then again, Bowie was a clueless jerk who couldn’t be bothered to show up for the game in which Hank Aaron surpassed Babe Ruth’s career home-run mark. He was, as one of his successors, Fay Vincent, rightly observed, utterly undeserving of election to Cooperstown this past year.) And this really got me—even the Astros’ front office couldn’t make it.

But Ryan had the mark within his grasp, and he’d pursue it come what may. And “come what may” was precisely what was happening to him now.

Fifteen strikeouts shy of the “Big Train’s” mark as the season opened, Ryan had only recently been treated for an inflamed prostate gland. His first start after coming back, against the Expos, had gone well, but he’d struggled in his second. Now, on this Wednesday afternoon, a blister on his right pitching hand would have to be drained, because it was slowing his fastball and making his curve miss.

But as he pulled into the middle innings, Ryan began to reach back for something extra. His fourth strikeout of the game, against pinch-hitter Tim Blackwell, tied him with Johnson. Then Mills came to the plate.

Ryan jumped out ahead of the batter with two quick strikes. A third pitch, a ball, brought the entire Astros bench to its feet—something noticed by Mills, who quickly realized that he might be about to become a footnote to history, like Tracy Stallard and Al Downing.

The realization was fatal. Ryan uncorked his second curveball in a row, a bit high, Mills thought. But unlike early in his career, when a similar close pitch might have been judged a ball by an umpire, this one was a called strike three. The Astros’ righthander was on his way to a 4-2 victory, and into the record books.

Would-Be Rivals, Then and Now

How long or even how firmly he’d be in the record books was anyone’s guess at that point. Steve Carlton would shortly trade places with Ryan in overtaking Johnson, and—with his relentless workout regimen—the 38-year-old southpaw seemed like a good bet to stick around for a while. Only the year before he’d won 23 games, and in 1983 he would notch another 275 strikeouts.

But Carlton’s time was drawing to a close. After the 1983 season, he would never surpass 200 strikeouts again, and his attempt to rekindle his magic with four different teams finally ended, after a stubborn, embarrassing run, in 1988.

And the Ryan Express, remarkably, just kept going, throwing his seventh and last no-hitter at age 44 in 1991. By the time he was done after the 1993 season, Johnson and Carlton were both in his rear-view mirror.

I doubt that Ryan’s career strikeout record of 5,714 will be surpassed anytime soon. Roger Clemens, now in second, has called it a day (and this time undoubtedly for good—he looked out of gas in his last half-season with the Yankees, and, of course, steroid accusations are dogging him now). Randy Johnson, in third place, has experienced health problems over the last few years that have limited his effectiveness. Just as important, with five-man rotations and middle relievers and closers being used far more extensively, there simply isn’t the opportunity for a pitcher to accumulate the innings necessary to gain strikeouts.

Quotes of the Day (Gospel of John, and St. John Vianney)

“I will not leave you orphans; I will come to you.”—John 14:18

“It is always springtime in the heart that loves God.”—St. John Vianney

(This year marks the centennial of the founding of the Catholic Guardian Society, which was formed to care for those leaving foster care and re-entering communities. A year and a half ago, that agency merged with the even older Catholic Home Bureau, America’s first foster home agency, to form the Catholic Guardian Society and Home Bureau (CGSHB).

It’s worthwhile to recall these non-profit groups and what they have done over the years to help at-risk young people, among the “least among you” that Christ urged his followers to help. The need for such groups is even greater in this age than ever before, as families come under assault from every direction. Please remember, then, that aiding these groups will bring the kind of “springtime” that Vianney spoke about to a young person who desperately needs it.)

Saturday, April 26, 2008

This Day in Education History ("A Nation at Risk")

April 26, 1983—In one of its periodic fits of hysteria about why its children weren't learning, the American public embraced a Presidential commission's report, A Nation at Risk, that might (or might not) have correctly diagnosed what was wrong and might (or might not) have demonstrated why this problem represented a full-blown crisis. But the report unquestionably saved a major new federal department from the budget ax.

Too bad the report couldn't have been issued closer to Thanksgiving, when it could have enabled the Department of Education to earn a Presidential reprieve along with at least one lucky member of a prominent species that otherwise ends up on everyone's carving table in the holidays.

Somebody help me out here—can you think of another Presidential administration besides Ronald Reagan's that used Presidential commissions so often and with so much publicity? The Greenspan Commission, for instance, gave Reagan a veneer of bipartisan cover that allowed him to make adjustments in Social Security without Democrats making the usual charges stick that he was going to leave golden-agers out in the streets. The Rogers Commission, investigating the explosion of Space Shuttle Challenger, made famous the phrase "O-ring" in its postmortem on that disaster.

The most famous of them all, the Tower Commission Report, explained how the Iran-contra affair could have taken place (because of Reagan's "management style," as laissez-faire, evidently, as his economic beliefs) in a way that spared the American public from conducting yet another impeachment investigation of a President (only this time it wouldn't have involved a shifty-eyed paranoid who obsessed about everything in the White House, but instead a genial grandpa who acted a bit out to lunch whenever he talked about trees causing more pollution than automobiles or whenever he forgot that Samuel Pierce was his own Secretary of Housing and Urban Development).

But the 18 members of the commission assigned to look into education might have made an even bigger mark, because just about all Americans are concerned about schools, or should be. If you're not directly concerned because you have a kid in school, you're probably paying attention to it anyway because a healthy educational system makes your community (and, hence, your real estate) a lot more attractive, while an unhealthy one eats up your tax dollars with virtually nothing to show for it.

Unlike other commissions that waste acres of national forests with policy pronouncements in the form of MEGO ("my eyes glaze over") prose and appendices, this one endeared itself instantly to harried editors (desperate for a headline) and their reporters (desperate to find someone who agreed with them without having to put words in their mouths), with a dramatic—I would argue, hyperbolic—opening:

"If an unfriendly foreign power had attempted to impose on America the mediocre educational performance that exists today, we might well have viewed it as an act of war. As it stands, we have allowed this to happen to ourselves."

A high-school teacher at the time who helped write the report, Jay Sommer, told USA Today recently, "In order to move a nation to make changes, you have to find some very incisive language."

Well, with language like what I just quoted, or the commission's warning that the educational system was being engulfed in "a rising tide of mediocrity," the report sure used that kind of apocalyptic lingo. It reminds me a bit of Senator Arthur Vandenberg's sage advice to the Truman Administration on how to win support for his plan to resist communism from a Congress still filled with isolationist mossbacks: "Scare the hell out of the American people."

The martial rhetoric also calls to mind the national handwringing that occurred after the Soviet Union launched Sputnik in 1957. For nearly a century and a half after Congress greeted John Quincy Adams like a skunk in a perfume factory for proposing a national astronomical observatory, our nation's capital wanted nothing, nothing, to do with funding science or math education. But the thought that we might be losing the space race put the fear of God into our nation's lawmakers, and like a dam breaking, money gushed out to teach kids about integers and molecules and planets in the form of the National Defense Education Act of 1958.

Not that nothing was wrong with our nation's schools 25 years later, and—let's give A Nation at Risk some credit here—the report documented a number of problems. Things like dropping SAT scores; the fact that only one-third of 17-year-olds could solve a math problem requiring several steps; the inability of four-fifths of teenagers to write a decent persuasive essay; and, most tragically, because of the quiet economic prison sentence they received, the illiteracy of millions of adults. The concentrated focus on these and similar facts undoubtedly led education historian Diane Ravitch to call A Nation at Risk "the most important education reform document of the 20th century."

All of this assumed suddenly larger proportions because of the backdrop of the times, however. Not only was the Soviet bear still growling in 1983, but now there was another menace: Japan, whose economic competitiveness had helped put out of work millions of Americans in smokestack industries during the recent recession. What did Japan have that we didn't? Smarter kids, for one, according to A Nation at Risk and other experts at the time. Okay, let's get smarter kids, came the American response.

Was there such a clear connection between smarter Japanese kids and smarter Japanese economic performance? After the Japanese economy tanked in the mid-1990s (taking with it foreign bestsellers like The Japan That Can Say No and American jeremiads like Michael Crichton's highly problematic techno-thriller Rising Sun), beleaguered American teachers weren't shy about disputing this notion. One might also argue that, relatively speaking, A Nation at Risk didn't highlight as much as it could have the decisive importance of stable families in ensuring the community vitality and individual psychological health necessary to sustain student performance.

The situation was not as clear-cut as the NEA or the '80s educational "reformers" claimed, though. In his memoir of his life in government, In History’s Shadow, John Connally bragged about something decidedly unconservative: getting the Texas legislature to fork over more money for education—or, at least, for the state's colleges and universities. His explanation was simple: a strong college system, particularly in the sciences, will attract more businesses, especially in high-tech areas.

Now, the lanky Texan had his share of character defects (after the longtime "Tory Democrat" joined the Nixon Administration as Treasury Secretary and even endorsed his Republican boss for a second term, wags in his home state claimed that at the Alamo he would have formed a committee called "Texans for Santa Anna"). But I believe he was onto something here. Certainly, Massachusetts' generous funding of its collegiate system has helped ameliorate the effects of a tax system that in other ways can be confiscatory.

But, no matter how flawed the education commission might have been about the link between education and competitiveness, it did at least one thing: Save the Education Department.

From day one, Reagan had been eyeing the department, one of the major innovations of his predecessor Jimmy Carter, much like the “meek little wives” of Raymond Chandler’s classic short story “Red Wind,” as they “feel the edge of the carving knife and study their husbands’ necks.”

Education Secretary Terrel Bell also seemed at that point just the way he pictured himself in the title of his memoir, The Thirteenth Man—i.e., the junior member of the Reagan cabinet and, hence, the low man on the totem pole. The report, however, gave him and his department a reason for existence, because it argued that there was indeed a federal role for education. Despite the strongest efforts of conservatives such as Ed Meese, the Education Department escaped its planned demise—and, in the hands of Bell's successor, William Bennett, even became something of a bully pulpit for those of his mindset.

I don't share conservatives' instinctive distrust of all government, but I wonder if liberals' audible sighs of relief—and even self-congratulation—after A Nation at Risk might not have been a bit premature. The increasing involvement of the federal government in education is not an unalloyed good.

Some years ago, a local dentist explained why he had become such a gadfly at board of education meetings (and, eventually, why he came to serve on one such board). Americans, he thought, had come to feel powerless to effect change—Washington was simply too big, too far away. Local government, and especially local education, was the one area where one person could still really make a difference and see the results.

Today, that's becoming increasingly hard to argue. Federal K-12 education spending has grown from $16 billion in 1980 to nearly $72 billion in 2007, and with increasing federal money has come increasing federal interference in how districts use the money.

One of my friends, in her days as a teacher, railed against the No Child Left Behind Act and its prime mover, President George W. Bush. Among the law's effects, she argued convincingly, was that it made educators teach to the test and that it unfairly penalized teachers who were stuck with the worst students.

Complaints about the legislation have been so universal that I have little doubt that, aside from the administration's disastrous performance during Hurricane Katrina, this piece of well-intended but misbegotten legislation will eventually be seen as the worst aspect of domestic policy during the Bush Administration. But it also makes one agree with Oscar Wilde: “When the gods wish to punish us, they answer our prayers.”


So liberals might ask themselves: Which do you prefer tangling with--a local board of education that you can defeat at the next election--one that at least knows you and your problems--or Dubya, who doesn't know you from Adam and might not care?

The defense rests.

Quote of the Day (Patchett)

"Receiving an education is a little bit like a garden snake swallowing a chicken egg: it's in you but it takes a while to digest."—Ann Patchett, What Now?
(For America’s sometimes-hysterical reactions toward perceived education deficits, see today’s “This Day in Education History.”)

Friday, April 25, 2008

This Day in Scientific History (DNA’s “Secret of Life” Explained)

April 25, 1953—Unlocking what they believed might be “the secret of life”—and beating several rivals to the punch with their discovery—James D. Watson and Francis Crick described the molecular structure of DNA, the hereditary material that forms individual human beings, in a landmark paper that appeared in the influential scientific journal Nature.

In the same issue of the journal were two other articles dealing with DNA – the acronym for deoxyribonucleic acid – whose co-authors included Maurice Wilkins and Rosalind Franklin. Wilkins and Franklin had been working at King’s College in London, while Watson and Crick were at the Cavendish Laboratory in Cambridge. These four, together with Linus Pauling, were locked in a struggle to figure out the nature of DNA.

In Larry Chinnock’s high school biology class at St. Cecilia’s High School in Englewood, N.J., I first learned about this particular phenomenon. After I figured out how to spell the damn thing (for one year only, figuring –until I had to write this blog post!—that the acronym DNA would do just fine on all occasions for the rest of my life), I squinted at the strange full-color diagram in our text showing this “double-helix.”

The text did a good enough job, I suppose, of explaining how DNA exerted itself on chromosomes. But I don’t think it gave us a really good idea of the brave new world opened up by this momentous discovery.

Nor—and this is something that might have appealed to a history aficionado like myself—did the book help us grasp how the struggle to understand DNA was another in a long line of scientific donnybrooks.

From Orville Wright quarreling with the Smithsonian Institution over the proper credit that should be given to him and brother Wilbur over the airplane, to Alexander Graham Bell’s fevered (and, some people think now, highly suspect) last-minute rush to gain a patent for the telephone, to Isaac Newton’s quarrels with…well, everybody, science is not really the disinterested, objective search for truth that the high priests of science would like us to believe.

Fifteen years after he and Crick published the fruits of their research, Watson wrote The Double Helix: A Personal Account of the Discovery of the Structure of DNA. Just as, the year before, Norman Podhoretz’s Making It had exposed the “dirty little secret” of New York intellectual life (ambition), Watson’s memoir spelled out a world that you and I never encountered in high school science classes: cutthroat competition among men (and, in this case, women) in white suits.

It was understood while these scientific investigations were occurring that Cambridge was ceding leadership in the race to identify DNA to King’s College—except that Watson and Crick weren’t engaged in these ethical niceties. Franklin, working under the direction of Wilkins, had produced a photograph of a DNA molecule that was shown without her permission by Wilkins to Watson, who admitted later, "My jaw fell open and my pulse began to race." At a glance, it opened up to them the helical nature of DNA, as well as its method of replication. They rushed their realization into print, beating not only Pauling (who didn’t have access to Franklin’s revelatory image) but Franklin herself.

Nine years after the epic discovery, Watson, Crick and Wilkins received Nobel Prizes for their work. Franklin received none, because four years earlier she had died of ovarian cancer at age 37.

Upon publication, The Double Helix was hailed for making scientific discovery thrilling and relatively jargon-free for the layman. But, in those days before consciousness-raising, Watson did something that would now be called politically incorrect and back then, by men of a certain stripe, would be considered ungallant: he dissed the deceased Franklin as bossy and frumpy.

In recent years, a couple of major biographies have reversed this cruel stereotype, revealing what a talented crystallographer Franklin was. Conversely, Watson’s reputation took a major hit when he was forced to resign from the Cold Spring Harbor Laboratory in Long Island, after stating that he was not optimistic about the future of Africa because the intelligence of Africans was lower than that of whites. The remark was a double nightmare for his institution not just because it exposed Watson’s individual racism but because it revived memories of the lab’s shameful past as the epicenter of the eugenics movement, America’s misbegotten embrace of scientific racism.

In a way, Watson’s fall from grace was poetic justice—not just because of isolated incautious remarks (which more than one politician has made in this electoral cycle, lest we forget), but because they were the latest example of a pattern of criticizing minorities and women, according to “Talking Points Memo” blogger Joshua Micah Marshall. Not only had Watson dumped all over Rosalind Franklin (and at a time when she wasn’t around to defend herself), but later sexist remarks led to a mass walkout by female members of a West Coast audience.

Evidently, the brashness of Watson’s youth has hardened into cantankerousness. He has had to learn the hard way what the rest of us could have told him much more easily: a Nobel Prize doesn’t immunize you from making stupid remarks.

Years earlier, upon realizing the momentous adventure upon which they were embarking, Crick remarked to Watson that they had discovered “the secret of life.” But as far as I’m concerned, baby-boomer bard James Taylor knew the real secret. Sadly, the now-octogenarian Watson does not appear to have learned it.

Quote of the Day (Swope)

“I cannot give you the formula for success, but I can give you the formula for failure—which is: Try to please everybody.”—U.S. journalist Herbert Bayard Swope, speech given on Dec. 20, 1950
(As the crisis in mainstream journalism proceeds, with readership down and layoffs up, it occurs to me that first principles are being forgotten. So here’s my two cents of free advice—can all the consultants and write from your gut. You may or may not “succeed,” as success is defined in this world, but at least you can look at yourself in the mirror. Besides, in time, if you act otherwise, people will be able to spot yours as the phony enterprise it is.)

Thursday, April 24, 2008

This Day in Irish History (Dublin’s Easter Rising)


April 24, 1916—A mistake-filled operation even before it started, Ireland’s Easter Rising should have ended the same way that the Fenian uprising and other rebellions had in the past—with the cause of independence no further advanced than before.

Instead, the British government, in a war against a German foe it regularly accused of ruthlessness, resorted to such a brutal crackdown that Irish opinion turned decisively from Home Rule to a separate republic.

In the years following Patrick Pearse’s Easter Monday declaration of the republic in front of an astonished, even amused, noontime crowd at Dublin’s General Post Office, patriotic myth-making obscured how much went wrong before and during the uprising by the ragtag band of rebels.

The long litany of errors can be fairly quickly gleaned from reading Tim Pat Coogan’s Eamon De Valera: The Man Who Was Ireland, Terry Golway’s For the Cause of Liberty: A Thousand Years of Ireland’s Heroes, and David Fitzpatrick’s essay in A Military History of Ireland, edited by Thomas Bartlett and Keith Jeffery:

* Sir Roger Casement, on a mission to secure arms from Germany, was captured when he stepped ashore from a submarine, just days before the planned revolt.
* The captain of the German boat carrying the arms shipment, the Aud, ordered the boat scuttled after it was intercepted by the British Navy, depriving the rebels of crucial arms and ammunition.

* Austin Stack, who was supposed to handle the arms landing, rushed to the barracks where Casement was supposed to be—only to get arrested himself.
* Eoin MacNeill, commanding the Irish Volunteers, issued an order calling off all maneuvers after learning of Casement’s capture—assuring that, when Patrick Pearse and other rebel leaders decided to go ahead with their plans anyway, they would only have 130 men to guard the approaches to the city rather than the 500 originally envisioned and needed. His countermanding of the uprising order also meant that the rebellion would be confined to central Dublin rather than spread across the country.
* British intelligence had intercepted telegrams between the rebels’ American fundraiser, John Devoy, the German Embassy in Washington, and Berlin, so they knew something was afoot.
* James Connolly, leading the Irish Citizen Army, was so hellbent on staging his own rebellion while the British remained at war with Germany that Pearse and his compatriots in the Irish Republican Brotherhood went along with the idea, figuring that the British would crack down on them anyway before their operations began.
* Connolly, a Marxist, argued—insanely—that Britain’s commitment to capitalism was so total that it would never fire on property in Dublin.
* The chief strategist for the rebels, poet Joseph Mary Plunkett, was fascinated by military history, but had no practical experience whatsoever with tactics or strategy. Moreover, he was in no physical shape to lead anything at the time, having undergone an operation for glandular tuberculosis only days before the uprising. In fact, the only rebel leader with any military experience at all was Connolly.
* The rebels never had a plan for taking Dublin Castle, Britain’s nerve center in the city.
* By establishing headquarters in the General Post Office, the rebels all but guaranteed collateral damage to surrounding civilians, shops, and housing—and a furious reaction by the populace when it became involved in the crossfire.
* Pearse led the rebellion in a country that was not just divided over independence, but at that point willing to wait for Home Rule.

The combined republican force of a little less than 1,800 stood no chance against the British force of 20,000. When the smoke cleared and Pearse surrendered five days later, 64 rebels had been killed in action versus 103 killed and 357 wounded for the British. Worse off were the civilians caught in the middle, as 300 ended up dead, hundreds more wounded, and central Dublin a wreck. (The photo accompanying this blog post shows the devastation at the GPO, the rebels’ headquarters.)

Predictably given the carnage that resulted, the civilians initially turned on the rebels who had instigated the rebellion. As the captured rebels were marched off to jail, onlookers jeered “Shoot the traitors!” and “Mad dogs!”

And then the British, ignoring the warning by Irish Parliamentary Party Leader John Redmond that mass executions “might be disastrous in the extreme,” proceeded to do just that. After martial law was declared, more than a hundred rebels were condemned to death.

Executions began on May 3, two or three at a time, often in grisly or arbitrary fashion. Willy Pearse’s principal offense was that he was Patrick’s brother. The TB-ridden Plunkett was married only minutes before he died. Polio-lamed Sean MacDiarmada was given no reprieve. Connolly, his leg shattered in the fighting, had to be carried on a stretcher and propped up in a chair to face the firing squad.

By this time, nine days later, Irish public opinion had swerved so decisively against the British that Prime Minister Herbert Asquith ordered an end to the executions. Not, however, before the occupying forces made two final mistakes. They decided only to imprison a former London clerk who had served as Plunkett’s aide-de-camp.

In another instance, they pondered the choice between executing Connolly and another prisoner. Sir John Maxwell, commander of British forces, asked Judge Evelyn Wylie about the latter, “Is he someone important?”

“No,” Wylie answered. “He is a schoolmaster who was taken at Boland’s Mill.”

Reprieves were granted to the former clerk and former schoolmaster, Michael Collins and Eamon de Valera. They became the Crown’s most implacable enemies in Ireland in the years ahead—and, in the end, tragically, their own as well.

Quote of the Day (Pearse)

“Life springs from death, and from the graves of patriot men and women spring live nations. The defenders of this realm have worked well in secret and in the open. They think that they have pacified Ireland. They think that they have purchased half of us, and intimidated the other half. They think that they have foreseen everything. They think that they have provided against everything; but the fools, the fools, the fools! They have left us our Fenian dead, and while Ireland holds these graves, Ireland unfree shall never be at peace."—Patrick Pearse, “Oration at the Grave of O’Donovan Rossa,” August 1, 1915
(For more on the live nation that sprang from the grave of Pearse and others, see today’s “This Day in Irish History” post.)

Wednesday, April 23, 2008

Quote of the Day (Bismarck)

“The main thing is to make history, not to write it." – German statesman Otto von Bismarck

Tuesday, April 22, 2008

This Day in Environmental History (Earth Day Marks Zenith of Gaylord Nelson's Vision)

April 22, 1970—The American environmental movement may have reached its apex in this first celebration of Earth Day.

Though the calendar and a more conservative President (Nixon) seemingly indicated that a new era had begun, the nation, plagued by the Vietnam War, operated under the assumption that it was still the 1960s, so the environmental observance was marked by one of the typical occurrences of that decade—teach-ins on pollution and environmental issues at 1,500 college and 10,000 high-school campuses.

Four decades of fascination with history have only hardened my Catholic belief in the notion that even the best of us are deeply fallible. One instance of this lies in the life and legacy of the founder of Earth Day, Gaylord Nelson (1916-2005), the late Wisconsin governor and senator and, in his later years, head of the Wilderness Society.

In a time of cynicism, when politicians spend more time appearing on CNN or Fox preaching to their partisan choirs or picking up so much PAC money that they’ve developed permanent stoops, I should want to honor an elected official like Nelson who investigated conditions affecting the lives of Americans and worked tirelessly and effectively on legislation that would change them for the better. But, for reasons that I’ll explain in a minute, I find that I can only manage two cheers for him.

First, though, what remains honorable and enduring in his record:

As governor of Wisconsin in 1961, Nelson pushed through the legislature an Outdoor Resources Action Program financed by a one-cent-per-pack cigarette tax to fund the state acquisition of parks and wetlands.


Two years later, his maiden speech in the Senate advocated a bill banning detergents from water supplies, calling on the country to awake to its environmental danger: "We cannot be blind to the growing crisis of our environment. Our soil, our water, and our air are becoming more polluted every day. Our most priceless natural resources—trees, lakes, rivers, wildlife habitats, scenic landscapes—are being destroyed."

Later in 1963, Nelson convinced President Kennedy to embark on a tour that would spotlight the environmental problem. Unfortunately, the media’s attention centered on another matter taking place the same day that JFK’s tour began: the Senate vote on his limited nuclear test-ban treaty with the U.S.S.R.

For the next half-dozen years, the senator cast about for another way to shine a spotlight on his biggest issue. In the summer of 1969, as protests against Vietnam heated up, he had a brainstorm: why not use teach-ins as a method of protesting environmental degradation?

Subsequently, Nelson wrote letters to all 50 of the nation’s governors and the mayors of its largest cities, asking them to issue Earth Day proclamations. The result was wildly successful, drawing 20 million participants that first year. Nearly a quarter century after the event, American Heritage Magazine described Earth Day as "one of the most remarkable happenings in the history of democracy."

Earth Day took off because of a grand convergence of events and attitudes: the growth of ecology as a rigorous scientific study; photos taken during the space program that underscored Earth’s fragility; the counterculture-led interest in health and returning to nature; and, I would argue, the
Santa Barbara oil spill of January 1969, in which 200,000 gallons of crude oil poured into the channel from a disabled oil rig, killing off birds and marine life.

Environmentalism had gone from the nation’s laboratories and aquariums to the American mainstream. President Nixon was only recognizing political reality when, in the same year as Earth Day, he signed the
National Environmental Policy Act and set up the Environmental Protection Agency.

One legislative triumph after another followed for Sen. Nelson, as he introduced bills that became part of the Clean Air Act, the Surface Mining and Reclamation Act, the Federal Environmental Pesticide Control Act, the Water Quality Act, and the National Lakes Preservation Act.

By almost any standard, those measures improved the quality of American life. I saw their impact, in a microcosmic way, when I took a boat tour on the Chicago River several years ago. Our guide noted that many buildings along the route were turned, oddly enough, away from the river, while others faced toward it. The reason was simple, he explained: Those buildings facing away from the waterway had been constructed before the Clean Water Act passed in 1972, while those looking on the river had been built afterward. In fact,
the stench from the river was so bad 40 years ago that the tour would not have even been possible then.

Over the past several years, I have wondered why the environmental movement lost so much of its impetus by the 1980s. If you’re a liberal, the explanation is clear: Ronald Reagan. 


To be sure, the environmental movement was hamstrung by such egregious administration appointees as James G. Watt, Rita Lavelle and Anne Gorsuch Burford, and some of Reagan’s decisions appeared to be petulant slaps at Jimmy Carter’s environmental efforts, such as the 1986 removal of the solar panels that Carter had ordered installed on the White House roof during the energy crisis of 1979.

But I’m afraid that at least some of the blame for the stalling of modern environmentalism should be directed to some of its founders.

It strikes me sometimes that nearly every reform movement attempts a metaphorical “bridge too far”—a moment when it makes a serious mistake and alienates those it is trying to help. In the case of environmentalism, that critical point was reached with the Chicken Little calls for
zero population growth.

In 1968, for instance, Paul Ehrlich forecast, in The Population Bomb, that the oceans would die by 1979. It didn’t come to pass, mostly because of Nobel Prize laureate
Norman Borlaug’s “Green Revolution,” which has kept starvation in check for millions in Third World countries for the last 40 years. What did come to pass, however, was predictable.

By the late ’70s, contraceptives had helped drive American fertility rates down from 3.5 children per woman to 1.7, below the replacement level, but the country was still experiencing growth. There was one major cause of this, several prominent environmentalists believed: immigration. Something, they thought, needed to be done to regulate it.

I’m afraid that Senator Nelson was one of those people. Early in his career, he had identified population control as one of the goals of the environmental movement. Years after he went down to defeat in the crushing Reagan landslide of 1980, he became blunter still in his message, dismissing charges that immigration controls represented “racism” or “nativism” in his book Beyond Earth Day:

“Never has an issue with such major consequences for this country been so ignored. Never before has there been such a significant failure by the president, Congress, and the political infrastructure to address such an important issue. We are faced with the most important challenge of our time -- the challenge of sustainability -- and we refuse to confront it. It is the biggest default in our history."

Not only could Nelson not be described as a fringe player in this portion of the environmental movement, but he was hardly alone. In 2003, a particularly cogent analysis of this segment of the movement appeared: Betsy Hartmann’s “Conserving Hatred: The Greening of Hate at Home and Abroad.” She pointed out that three members of the Sierra Club’s board of directors were key players in the anti-immigration lobby, and that John Tanton, a major organizer and funder of the anti-immigrant movement, had close ties to a number of racist hate groups. Christopher Hayes has outlined how Tanton founded the Federation for American Immigration Reform (FAIR), the nation’s oldest and most influential immigration restriction group, in 1979.

As the son and grandson of immigrants, I can’t view such developments without extreme dismay. The environmental movement cannot expect to make life better for all Americans if people such as Nelson continue to lend credibility and talking points to xenophobes like Lou Dobbs and Rush Limbaugh. It’s an especially ironic outcome, one that dims the legacy of Nelson, an otherwise proud exponent of Midwestern liberalism.

Quote of the Day (Udall)

“Over the long haul of life on this planet, it is the ecologists, and not the bookkeepers of business, who are the ultimate accountants.”—Stewart L. Udall, speech to the Congress on Optimum Population and Environment, June 9, 1970
(For a brief history of 1970, the zenith of American environmentalism, see today’s “This Day in Environmental History” post.)

Monday, April 21, 2008

This Day in Literary History (Noah Webster’s "American Dictionary")

April 21, 1828—After more than a quarter century of obsessive labor, Noah Webster – soldier, schoolteacher, spelling reformer, textbook author, newspaper and magazine editor, county judge, and copyright advocate – published An American Dictionary of the English Language. Like fellow Federalist John Marshall, “America’s Schoolmaster” produced a grand nationalist body of work that unified his country.

After the Revolutionary War (in which he not only served but played “Yankee Doodle” at the head of fellow Yale students for the benefit of his lifelong hero George Washington), Webster turned to teaching and was disheartened by the state of American education, particularly by schoolchildren’s use of English rather than American materials. In response he created A Grammatical Institute of the English Language (1783), a combined speller-grammar-reader. Benjamin Franklin was just one of many Americans to use the so-called “Blue-backed Speller”; he relied on it to teach his granddaughter how to read, spell, and pronounce words.

Like the somewhat younger James Fenimore Cooper, Webster possessed cranky political views that at times put him at odds with his countrymen, but he also took a British literary model and gave Americans a completely distinctive form of literature. Cooper (who, in later years, was frequently in court pursuing libel suits against Whig defamation) started out as an unsuccessful imitator of Jane Austen and Sir Walter Scott before his Leatherstocking Tales carved out a niche as the first Western series. Webster, a Federalist who often felt out of place in the Age of Jackson, took the British form of the dictionary and gave it a very American turn.

For a simple, uncorrupt people, Webster’s 70,000-word 1828 dictionary stripped language to its essentials: “Music” instead of “Musick,” for instance. More important, it included 12,000 words and from 30,000 to 40,000 definitions that had not appeared in any earlier dictionary. He completed a revision of his bestseller only a few days before his death in 1843.

I did not realize until researching this post that Webster also produced a translation of the Bible five years after his epic dictionary. Some scholars regard it as a missed opportunity, since, although he knew Latin and Greek, he mostly confined himself to correcting grammar in the King James Version and eliminating any passages “so offensive, especially to females, as to create a reluctance in young persons to attend Bible classes and schools, in which they are required to read passages which cannot be repeated without a blush.”

At a time when several translators were engaged in a Thomas Jefferson-like effort to expunge portions that did not conform to their less orthodox theologies, Webster had no interest in offering anything that would overturn traditional notions of Christianity. That, combined with his rather light editing of the King James version, meant that his version would not displace the two-centuries-old one.


But today, Webster’s translation is enjoying something of a vogue. Because it is now in the public domain, it can be downloaded for free.

TV Dialogue of the Day ("Get Smart")

Maxwell Smart: “Tell me, Hurrah. What made you decide to join KAOS?”
Otto Hurrah: “I am a creative producer and director bursting with new concepts. I was wasted in the movie business. KAOS had an opening for a Mastermind so I took it. Besides, my Agent recommended it.”
Max: “But KAOS is vicious, evil, and rotten.”
Otto Hurrah: “So is my Agent!”—Hollywood mogul Hurrah explains to clueless agent Maxwell Smart how he became an evildoer, in “A Man Called Smart,” in Season 2 of Get Smart
(Let’s see how the Steve Carell-Anne Hathaway film premiering in June measures up to the 1965-70 Mel Brooks sitcom—which, incidentally, has been
recently released on DVD).

Sunday, April 20, 2008

This Day in Canadian History (Pierre Trudeau Becomes Prime Minister)

April 20, 1968—"Canada's JFK," Pierre Elliott Trudeau, became Prime Minister, beginning 16 years (except for a nine-month interval) in which he would leave a far greater imprint on his country's institutions than his American counterpart had the opportunity to do in his "Thousand Days." In the process, he took a country that had been somewhat more conservative than the U.S. and transformed it into one a good deal more liberal.

If you're an American of a certain age, you're likely to remember Trudeau as a kind of character from a Moliere comedy, a middle-aged husband utterly unable to control a pretty wife only half his age. This, it turns out, is deceptive, for reasons going beyond the fact that a tragedy was being played out, unbeknownst to either the country or the principals themselves, offstage. (Margaret Trudeau’s misdiagnosed bipolar disorder was revealed a few years ago.)

That Trudeau should be known for far more than his difficult private life was brought home to me when I was visiting his country in late September 2000, when news of his death from prostate cancer sparked an outpouring of retrospectives. I realized then what a major impact he had on his country.

Differing Views of Trudeau’s Legacy
If you're an American liberal from the Michael Moore wing of the Democratic Party, Trudeau-influenced Canada is likely to seem everything America should be but isn't, a paradise of access to inexpensive medical care and liberated attitudes toward sexuality. If you're a conservative, the nation will appear more like either a piece of Scandinavian governmental flotsam found on North American shores or, more likely, Massachusetts moved north, enlarged and taken to its natural extreme.

Whatever your ideological inclinations, 40 years later it is impossible to imagine either Canada’s current institutions or its atmosphere without gauging his record as prime minister or, earlier, as minister of justice under predecessor Lester Pearson. Canadians were correct in naming Trudeau, in a Canadian Broadcasting Corp. poll, one of the “10 greatest Canadians.” (One of the few things liberals and conservatives can probably agree on is that a subject of one of my earlier posts, Wayne Gretzky, deserves his inclusion on the same list.)

As the comparison to JFK indicates, the 1960s were a big decade for political charisma. Everybody wanted it.

In the U.S., the Democrats spent the rest of the decade pining for the second coming of Kennedy, including enduring tragedies involving both his surviving brothers. Conservative Republicans looked to former movie B-lister Ronald Reagan as a new hope emerging from the West Coast who could bring them back from the Goldwater disaster. Even at the municipal level, the search for this quicksilver quality continued. Bostonians thought they’d found it in Kevin White, New Yorkers in John V. Lindsay. It was columnist Murray Kempton’s apt description of Lindsay that captured the special appeal of these men (and, for that matter, Barack Obama today): “He is fresh, and everyone else is tired.”

Trudeaumania
“Trudeaumania” was yet another manifestation of this phenomenon, even before he assumed power. Teenage girls squealed at the sight of him. Autograph hounds even chased him across the Parliament Hill grounds! The general election he called after winning the Liberal Party leadership not only secured a solid majority but cemented his image as the ascot- and cape-wearing, celebrity-dating style maven, as he conducted a legendary “kissing campaign.” (This latter tactic is the one element of the Trudeau style that American politicians of all stripes wouldn’t mind importing—which might be one reason why politics has in recent years been defined as “show business for ugly people.”)

The new Prime Minister was brilliant, and like other such people his tendency to demonstrate it all the time struck many as arrogant. In a photo from February 1968, showing him listening at a conference to an aggrieved speaker advocating independence for Quebec, Trudeau didn’t even attempt to hide his amused contempt.

At other times, this arrogance could be self-destructive and stupid. “Why should I sell your wheat?” he asked farmers from the West—a remark they would recall in delivering one drubbing after another to the Liberals in that region. His inability to hide his annoyance with American leaders needlessly strained relations. And he executed a pirouette once behind Queen Elizabeth II—something that even I, a committed anti-monarchist, regard as singularly immature, insulting and superfluous, since the British royals are essentially toothless.

Trudeau’s Two Greatest Legacies
Two achievements, one near the beginning of his time in office and one near its end, might constitute Trudeau’s greatest legacies: keeping his country united and providing it with a written constitution.
A bilingual Prime Minister, he served as a symbol for French speakers who wondered whether Quebec really had a place in an Anglo-dominated country. His ethnicity might also have given him credibility in that province when he reacted to the kidnappings of James Cross and Pierre Laporte by the Front de Libération du Québec, a terrorist organization dedicated to Quebec independence, with the invocation of the War Measures Act in October 1970.

The act, which provided the government with emergency powers in the event of “war, invasion or insurrection, real or apprehended,” strikes this American observer as reminiscent in intent of America’s Patriot Act. Its invocation was especially unusual in view of Trudeau’s ardent civil libertarianism as minister of justice.

While most of the public heartily endorsed Trudeau’s use of the measure at this time, it is possible in retrospect to ask if he had resorted to it too readily. Under its provisions, more than 450 people were arrested without charge, with most subsequently released with no charges pressed. Civil libertarians asked at the time if Trudeau had overreacted. I believe they have a legitimate point.

Those looking to Canada as a bastion of rights need to rethink how justified that reputation really is in light of Trudeau’s actions. Only two officials were threatened in the October Crisis (Cross was eventually released, Laporte murdered). In contrast, the 9/11 attacks in the U.S. resulted in the loss of nearly 3,000 lives, and the threat was far more apparent and wide-ranging.

Unlike George W. Bush, Trudeau escaped large-scale condemnation over time for his actions for several reasons, I think: his Justice Ministry record, his eloquence in pushing for “the Just Society,” and the ratcheting up of tensions by the Quebec separatist movement, which created a large pool of support for any actions he took to quell disturbances. But like Bush, his actions—and occasional overstepping—during a crisis left a precedent that future leaders could invoke under far less justifiable circumstances.

In 1980, Trudeau decisively faced down the threat of Quebec separatism again, winning a referendum on the issue. As part of his campaign, he vowed to provide province and country with a new constitutional order, one that would not depend on amending it in Britain. Two years later, Queen Elizabeth II assented to the Canadian Charter of Rights and Freedoms, entrenching a Bill of Rights into the Constitution of Canada, severing Canadian dependence on the British Parliament, and codifying federal-provincial relations.

Over the course of more than a decade, the record of any leader is bound to be mixed. Such was the case with Trudeau. With his stress on broad geopolitics, his attention to lifestyle issues (the state, he famously said, “has no business in the bedrooms of the nation”), and his sometimes merely fitful attention to bread-and-butter economic issues, he sounded at times like an American blue-state politico. And, as I’ve just noted, his handling of the October Crisis is not beyond criticism.

But he kept his nation whole and gave it a written body of laws and greater autonomy than it had known before. When he is written up in the history books—as surely he will be—those are the achievements that will be recalled and honored.

Quote of the Day (Powell)

“As I look ahead, I am filled with foreboding. Like the Roman, I seem to see ‘the River Tiber foaming with much blood’.” – British politician Enoch Powell (1912-1998), speaking out on this date in 1968 in Birmingham not only against proposed anti-discrimination legislation but against unchecked immigration from British Commonwealth countries to the U.K.

(The “Roman” that Powell alluded to was Virgil, who in the Aeneid wrote of the Sibyl’s prophecy of “wars, terrible wars, and the Tiber foaming with much blood.” For all his brilliance and maverick tendencies, Powell publicly displayed his racist views in
this speech, which resulted in him being kicked out of the Tories’ Shadow Cabinet. He later left the party altogether to become an Ulster Unionist.

Conservative Republicans who are restless about John McCain’s pro-immigration views might ask themselves if they want their party to share Powell’s fate in the political wilderness for years to come.)

Saturday, April 19, 2008

This Day in World War II History (Warsaw Ghetto Uprising Starts)


April 19, 1943 – With the Allies dithering, as they had for the last decade, on how to respond to the Jewish refugee situation, and with SS head Heinrich Himmler bent on liquidating their ghetto as a grotesque birthday “gift” for Adolf Hitler, Polish Jews began the first urban revolt against the Nazis in World War II. In the end, the odds against them were insurmountable, but the Warsaw Ghetto Uprising became an enduring legend in the resistance against evil.

On this date, as it had done for the last 10 years (and as it would do again, at least initially, 50 years later, when the breakup of Yugoslavia led to mass rapes, massacres and “ethnic cleansing”), the non-German, non-occupied West did nothing but wring its hands over the fate of Europe’s Jews. British and American delegates, meeting in Bermuda, decided after nearly another two weeks of deliberation that not much could be done.

Arguably, they remained wrong (bombing the trains taking Jewish prisoners to death camps might have been a start), but they were right in one respect: The time for really effective action would have been before Hitler re-started Germany’s war machine, even as he was systematically marginalizing Jews within weeks of taking power.

Warsaw Jews couldn’t wait for diplomats: They had literally been penned up in a ghetto within the city, encircled by 10-foot-high walls, by Nazi decree since November 1940. They couldn’t even communicate with anyone outside, since radios and telephone lines had been removed, mail censored, and incoming packages confiscated.

Imagine 400,000 people inside this enclosure, unable to buy food, permitted only half the usual rations; several families often crowded into one apartment; lack of fuel and medical supplies; sewage pipes freezing; human excrement being dumped into the streets. It was a prescription for a humanitarian disaster, which is what happened as typhus epidemics, beginning in the synagogues and buildings for the homeless, radically reduced the population.

And that was before the mid-1942 mass resettlement of residents began. Jewish leaders in the city believed that no more than 60,000 would be seized, and that resistance would only make matters worse. Unable to imagine the worst that evil could produce, they were shocked when more than five times that number were taken and sent off, most to Treblinka. The Nazis even made Jewish doctors choose which of their patients would be deported to concentration camps. The scope of the roundup led the city's Jewish leadership to a momentous decision: the next time, they'd resist.

Before long, they began organizing in self-defense groups, under the command of a 24-year-old named
Mordecai Anielewicz. I had never heard his name before I began researching this post. He deserves better, and as part of that effort I’ll write a bit about him here, along with including the picture that accompanies this post.

Coming from a poor family in a poor neighborhood, Anielewicz might have, under ordinary circumstances, continued the studies in Hebrew, history, sociology and economics that meant so much to him. But early in the war, he became active in the underground resistance against the Nazis, even returning from Lithuania to Warsaw to be at the heart of the struggle, organizing defiance with every lecture given, paper issued, cell group organized, and bunker constructed.

In January 1943, the Nazis surprised the resistance by taking another 6,500 people—but then the Germans were surprised in turn when a melee began and one of their officials was badly injured. Four days later, they stopped the roundup effort. Himmler vowed not to be humiliated again.

At 3 a.m. on April 19, with Passover about to begin, the Germans started what they expected would be short work of Warsaw’s last Jews. They had all the tools they could possibly hope for: 2,000 troops armed with overwhelming firepower, along with more than 7,000 security personnel.

The 600 to 700 members of the Jewish underground possessed only two or three light machine guns, a few thousand grenades, and a few hundred rifles, revolvers and pistols that the Polish resistance had managed, against the odds, to smuggle in to them. It would have to do.


Anielewicz and his confederates began organizing hit-and-run attacks on the Germans—targeting crossroad streets in the ghetto from rooftops and attics. They had no realistic hope of winning, the last surviving leader, 89-year-old Marek Edelman, has remembered—all they wanted to do was “protect the people in the ghetto, to extend their life by a day or two or five."


One of the German tanks entering the enclosure was driven off after being met with Molotov cocktails (one of the most effective improvised weapons of the underground). Weeks of fighting ensued, with the resistance battling insane conditions (heat, smoke, burning buildings, poison gas, manholes blown up to prevent escape through sewers) as well as the enemy. The Nazis were forced to take the ghetto one street at a time.


The beginning of the end came on May 8, when, somehow or other, the Nazis located the bunker where Anielewicz and four other underground leaders were stationed, at Mila Street Number 18 (giving rise to the Leon Uris novel of the event, Mila 18). The Nazis blew up all five exits to the bunker and hurled in poison gas. Rather than be taken prisoner, Anielewicz and his comrades committed suicide. Organized resistance was over within a week.


Several thousand Jews had died in the fires during the month-long struggle, and another 56,000 were captured, with at least half sent off to the death camps. However, Anielewicz and his fellow rebels had provided a clear example of what even badly armed (if ingenious) resistance could accomplish. In the postwar era, the Warsaw Ghetto Uprising formed, along with the ancient Jewish revolt at Masada, a rallying point for the Zionist leaders who brought the state of Israel into being.