Showing posts with label This Day in American History. Show all posts

Sunday, May 16, 2021

This Day in New York History (Seward Senate Speech Marks Him as Prime Anti-Slavery Foe)

March 11, 1850—In his maiden speech as U.S. Senator, the New York Whig William H. Seward denounced the Compromise of 1850, an omnibus legislative package that eased secessionist sentiment in the decade before the Civil War. 

The phrase he coined—“a higher law than the Constitution”—thrust him to the forefront of opposition to slavery, proved a stumbling block to his Presidential ambitions, and posed an enduring question about the relevance of faith- and morality-driven action in American politics.

I doubt if one out of a thousand people who pass the statue accompanying this entry has stopped for more than a couple of seconds to think about its subject. Now in the midst of Doris Kearns Goodwin’s collective biography of Abraham Lincoln’s wartime cabinet, Team of Rivals, I’ve come to appreciate, maybe for the first time, this opponent-turned-friend of the President: how much he contributed to the politics of his time, and even what he means to ours.

As governor of New York in the early 1840s, Seward promoted economic and educational policies meant to open greater opportunities to African-, German- and Irish-Americans who were already forming part of the Democratic coalition. Like John McCain today, he excited a horde of noisome nativists to insane frenzies—in Seward’s case, through a proposal to divert a part of public school funds to parochial schools where Catholic immigrants would not have to worry about Protestant proselytizing. 

(Take note, today’s Republicans: the path to success lies away from fear-mongering about immigrants and the dispossessed. Take note, today’s Democrats: at least some of your “wall of separation” rhetoric about church and state derives from very poisonous roots.)

After a brief hiatus out of office, Seward came back to win a Senate seat, just in time to face the most divisive question of his time: the suddenly real possibility that divisions over slavery could spell the end of the Union.

The Compromise of 1850 was meant to forestall these questions by not giving either North or South entirely what each wanted. The North would get admission of California, almost certain to be a free state, and an end to the slave trade within Washington, D.C. Two other provisions favored the South: the creation of two territories in the Southwest, New Mexico and Utah, with no restrictions on slavery; and strengthening of the Fugitive Slave Law of 1793.

The squall over the Compromise of 1850 is usually remembered as the last hurrah of the Senate’s “Great Triumvirate” of Henry Clay, John C. Calhoun, and Daniel Webster, all of whom would be out of the chamber they dominated, even dead, within three years. Webster’s three-hour March 7 oration in support of the package—an action that revolted his anti-slavery base and doomed his flickering Presidential chances but which also turned the tide toward enactment—was celebrated in John F. Kennedy’s Profiles in Courage, and was long memorized by generations of American schoolchildren.

However, the extensive legislative debate, I would argue, also brought to the fore a new generation of political leaders who dominated the antebellum and Civil War eras.
Jefferson Davis assumed the role of Calhoun (so ill that he could not read aloud his fiery speech opposing the bills) as spokesman for the South. Stephen A. Douglas, the “Little Giant” best remembered now for his debates later in the decade with Abraham Lincoln, acted, like Clay, as legislative magician by crafting the legislation and rounding up enough votes to ensure passage.

And Seward took on the part that Webster, in his zeal to preserve the Union, had relinquished: champion of “Liberty and Union, now and forever, one and inseparable.” Beginning in a low voice decidedly removed from the great tolling bell that was Webster’s, he soon transfixed the Massachusetts Senator and his colleagues.

He could not support the proposals, he said. Strengthening the Fugitive Slave Law was unworthy of “true Christians or real freemen”; not just the slave trade, but slavery itself should not be permitted in the District of Columbia; and he could not abide the introduction of slavery anywhere in the new territories.

Not only was the spirit of the Constitution incompatible with slavery, Seward claimed, but “there is a higher law than the Constitution, which regulates our authority over the domain, and devotes it to the same purposes. The territory is a part…of the common heritage of mankind, bestowed upon them by the Creator of the universe.”

Even though he lost this legislative battle, Seward became the foremost spokesman for free-soil forces in the Senate. But he had provoked so much fierce opposition in the South that his enemies began to approach his friends in number and vehemence.

Political mentor Thurlow Weed’s confession that Seward’s speech “sent me to bed with a heavy heart” proved all too prescient in 1860. With the Whig Party dead by then, Seward sought the Republican Party nomination for President. But his long public anti-slavery record had left such a trail of enemies that it opened the door to a relative dark-horse candidate: Abraham Lincoln.

Seward’s loyal and able service as Lincoln’s Secretary of State is how posterity fundamentally recalls him, and it’s certainly not a bad claim on our attention. But the “higher law” that this usually affable, conciliatory statesman invoked has, in one fashion or another, convulsed American politics throughout the republic’s history.

In one sense, Seward’s appeal derives from the concept of natural law that has found advocates from Thomas Aquinas to Thomas Jefferson. It also meshes with the theory of civil disobedience formulated by Henry David Thoreau only a year before the Seward speech and perfected as a political tool by Mahatma Gandhi and Martin Luther King Jr. in the 20th century.

But a “higher law” also has the potential of angering people who might not share one’s religious faith or even notions of morality. Applied to certain issues—Prohibition, abortion—it has polarized the American electorate for decades.

And yet, it would be a mistake to call, as some have done in this election year, for the marginalizing of moral calls to action in the political arena. Refusal to appeal to morality produces consequences in the populace that necessarily reflect the Darwinian atmosphere of politics. Does anyone think that the United States would have been a better nation without the backing that the civil rights movement received from African-American ministers or that the unionists gained from the Roman Catholic Church?

Sunday, June 1, 2008

This Day in American History (RFK-McCarthy Debate)


June 1, 1968—In the last debate before the climactic California primary, Robert F. Kennedy won the press war of expectations about the outcome with several unexpected thrusts—but a pro-forma statement in support of Israel clinched a viewer’s decision to assassinate him a few days later.

The sudden, ghastly end to RFK's life has tended to put out of focus the nature of his campaign in these final days. I think it’s useful to re-examine why he evoked so many passionate responses, both pro and con.

Bobby Kennedy, in All His Varieties

Of the three brothers who ran for President, Bobby fascinates me the most. With his quick wit, ironic distance, and automatic assumption that his private affairs would be kept out of public life, Jack seems like a Regency aristocrat. In the tributes that have poured in since the revelation of his grave medical condition, it’s now more recognized than ever that Teddy is one of the barons of Capitol Hill. His knowledge of where each vote can be found would have made him a success starting from the lowliest ward heeler’s job, even if he’d never been born into a political dynasty.

Bobby was something different entirely. Without his older and younger brothers' personas, he could slip into any role outside of his time and place that you can imagine and still somehow fit. His churchgoing piety and deep religious commitment could have pushed him into the priesthood (whether he'd be a liberation theologian or monsignor to a cardinal is another story). He could be the classic political boss in a big-city machine—cutting whatever deal he had to, ruthless when he needed to be—or, with his passion and fire for the oppressed, join a revolution.

Some of that quicksilver quality was seen in an incident I commented on previously: his powerful impromptu speech on the night of the murder of Dr. Martin Luther King Jr. "It feels safe to say that no one else in American public life would have quoted Aeschylus' Agamemnon to an angry black crowd on the day that King was killed," former Bill Clinton speechwriter Ted Widmer noted a week or so ago, in an article for the New York Observer.

Anxiety on the Eve of Debate

Bobby’s entrance into the race was so sudden that the campaign had little if any time to organize. Though he scored some successes (notably in Indiana), a loss in Oregon to Sen. Eugene McCarthy gave a severe pummeling to the Kennedy political operation’s reputation for being smoothly run. The candidate found himself badly needing a victory in the winner-take-all California primary in early June. He even reversed his previous position that he would not debate McCarthy unless Hubert H. Humphrey were invited, too.

In all but one way, Bobby’s political base in the primaries resembles Hillary Clinton’s today: principally in its heavy reliance on white working-class (read: ethnic Catholic) and Hispanic voters. Barack Obama has inherited McCarthy’s base of the more college-educated, secular progressive voter. Obama’s candidacy has prevailed for two reasons: a) the black vote has gravitated to him rather than Clinton, and b) while the proportion of union workers that Kennedy could rely on has decreased over time, the proportion of college students and African-Americans has grown.

Going into the debate, RFK and his staff did not have a lot of reason to be confident. While Bobby’s speeches were eloquent, they were staff-written and rehearsed. It was a miracle that he delivered such an impassioned address on the night of the King assassination, but it might simply have been a product of emotions (especially the anguish over his own brother’s death) and deep reading finally given an outlet.

Bobby didn’t shine as well in a debate. I was surprised to discover a couple of years ago that he had not performed so well in a May 15, 1967 debate with the new governor of California, Ronald Reagan. (Jimmy Carter and his handlers in 1980 would have done well to check the transcript of the RFK-Reagan debate to understand what a dangerous opponent they were facing.) When asked why Kennedy rejected repeated invitations to appear on his show “Firing Line,” William F. Buckley Jr. quipped, “Why does baloney reject the grinder?” McCarthy, on the other hand, had much more of a reputation for being articulate and witty.

The debate between the two Democrats was held on a Saturday night at K.G.O., a San Francisco station. ABC News broadcaster Frank Reynolds served as moderator, with journalists Bob Clark and Bill Lawrence the chief questioners.
According to Evan Thomas’ account, the exhausted candidate was handed “about two pounds” of briefing material the night before, only to fall asleep with his beloved dog Freckles by his side. That left only the day of the debate itself for preparation. RFK was determined to be as non-specific as possible about whether he had authorized FBI wiretaps of Dr. Martin Luther King Jr.

Outpointing McCarthy

Kennedy still did not shine in this static format (more like a joint press conference than a freewheeling exchange of views), but he did manage to underscore differences with McCarthy on policy matters rather than the reasons of ambition that the “Clean for Gene” volunteers so often accused him of harboring. In a point recently echoed faintly in the Clinton-Obama marathon, Kennedy—himself a Vietnam War opponent—took McCarthy to task for his willingness to negotiate an end to the war with the Viet Cong. Two more exchanges were even more controversial.

McCarthy’s suggestion for transporting African Americans out of the impoverished inner cities to areas in the suburbs where there were greater employment opportunities brought a response from Kennedy that at the time even deeply dismayed many on the candidate’s own staff: “You say you are going to take 10,000 black people and move them into Orange County?”

Reading this sentence in isolation, I was ready to accept the usual historical verdict: that RFK had gotten off a cheap shot, a blatant appeal to white-ethnic voters (what in the ‘70s was called the “Archie Bunker voter” and in the 1980s, only somewhat more kindly, the “Reagan Democrats”). Had it come from a Nixon or an Agnew rather than from someone with Kennedy's sterling civil rights record, the sentence would have been far more widely decried.


The sentences immediately following, however, provide a more understandable—and liberal—context: “You take them out where 40 percent of them don't have any jobs at all, that's what you are talking about. But if you are talking about hitting the problem in a major way, taking those people out, putting them in the suburbs where they can't afford the housing, where their children can't keep up with the schools, and where they don't have the schools for the jobs, it's just going to be catastrophic. . . . [W]e have to face the fact that a lot of these people are going to live here [in the ghettos] for another several decades. And they can't live under the conditions that they are living under at the present time."

McCarthy found himself on the defensive on a second point: Kennedy’s charge that he wanted to overcharge Israel for aircraft. In contrast, Kennedy promised more aircraft for Israel. According to Thomas, RFK himself believed that he had “pandered” to the Jewish vote with this promise, in an attempt to defuse the absurd charge that he was anti-Semitic. (In fact, from touring Israel in 1948, Kennedy had come away much impressed with the new nation's courage and determined that America had to help it survive.)

The Opening Salvo of Arab Terrorism

Kennedy’s promise of continued, even heightened, support for Israel had a far more fateful consequence outside of votes, however. His statement could not have reassured one Palestinian-American, who, that very day, had been so angered by a prior photograph of the senator wearing a yarmulke outside a synagogue that he had purchased a box of ammunition for his .22 caliber pistol.

Several observers noted that McCarthy was not as sharp as he could have been, allowing Kennedy to get by with a mediocre if gaffe-free performance. Four days later, Kennedy beat McCarthy by 4.5% in the California primary. He still had an uphill fight for the nomination against Humphrey, who had been methodically picking up votes in caucuses while his rivals engaged in fratricidal conflict that divided the anti-war left. But at least Kennedy now had the state he absolutely needed if he hoped to remain a viable candidate through the convention two months later in Chicago.

Nobody, least of all the candidate, could have guessed that the young Palestinian-American who had purchased the box of ammunition, Sirhan Sirhan, would be waiting for him in the Ambassador Hotel on the night of his triumph. Nor did anyone realize at the time that the resulting assassination, far from being senseless, was in fact one unaffiliated terrorist’s inauguration of more than 30 years of steadily encroaching thrusts against America and its institutions that would climax on September 11, 2001.

Saturday, May 31, 2008

This Day in American History (Seventeenth Amendment for Direct Election of Senators)

May 31, 1913—Following ratification by 36 of the 48 states then in the Union, Secretary of State William Jennings Bryan certified one of the most important measures of the Progressive Era: the Seventeenth Amendment to the Constitution, which provided for direct election of U.S. Senators.

The Founding Fathers crafted Article I, Section 3 of the Constitution to provide for election of U.S. Senators by state legislatures. By this means, they hoped to give the states a stake in the fledgling federal government. At the same time, they hoped that such legislators—elected at one remove from the people (the Founders were not crazy about direct democracy)—would cool the passions of the House of Representatives, whose members were directly elected by the populace. With each state given two votes in the Senate, all would be equal, neutralizing the advantage of the larger states in the Union at that point, such as New York and Virginia.

It all sounded fine in theory, and for a while it even seemed to work in practice. Alexis de Tocqueville’s Democracy in America even praised the Senate for its “monopoly of intelligence and talent.”

That might have seemed reasonable at the time, amid an age dominated by the Senate’s “Great Triumvirate” of Henry Clay, Daniel Webster and John C. Calhoun. But even in the instance of Calhoun (who would re-enter the Senate shortly, after a brief stint as Vice-President under Andrew Jackson), the reality was darker. He was elected by a legislature that was itself chosen largely by large property owners, with poor, unpropertied whites having hardly any voice and African-Americans, of course, in far direr straits. Moreover, by the end of his life, he represented a region—the South—whose influence, according to its share of the population at the time, should have been less than it possessed.

Though the first bill calling for direct Senate elections was introduced in 1826, the movement picked up momentum in the post-Civil War period. In my home state, New Jersey, a donnybrook erupted in 1866 over the election of John Stockton, when opponents charged that he’d been elected by a plurality rather than a majority of the legislature.

But worse was to follow. The Industrial Revolution created Gilded Age magnates who’d bribe an entire legislature without batting an eye. Between 1866 and 1906, the U.S. Senate was forced to deal with nine—count ‘em, nine—bribery cases involving state legislatures. And those were just the ones that were publicly known. The nation’s foremost deliberative body might as well have just hung out a “for sale” sign. In addition, vacancies were going unfilled for long periods—45 deadlocks occurred in 20 states from 1891 to 1905 alone.

Many Senators undoubtedly wanted to keep the means by which they had been elected in the first place. But the combined effects of an inability to conduct business without a Senate fully staffed and undiverted by investigations, a strain on their legendary comity (working on so many bribery cases would have caused a lot of bruised feelings), and media stories such as “The Treason of the Senate,” run by publisher (and aspiring politico) William Randolph Hearst, all brought a gradual change of minds.

The Populist Party made direct Senate elections a part of its platform, and within a decade Progressives such as Robert M. LaFollette, George Norris, and William Borah were pushing the idea. In 1911 Senator Joseph Bristow of Kansas introduced a resolution calling for a constitutional amendment. The idea was opposed in the Senate by regional blocs in both parties: Southern Democrats (who, despite the passage of additional Jim Crow legislation, were still insanely afraid of the influence of African-American voters in direct elections) and Republicans throughout New England, New York and Pennsylvania, who not only would lose valuable money from industrialists but would be suddenly exposed to the wrath of immigrant workingmen they had dissed. Eventually, however, the idea won out.

To me, the most interesting test of the new amendment came in 1916 in Massachusetts, involving Sen. Henry Cabot Lodge. Most American history students know him as the canny Senate leader whose opposition to America’s entry into the League of Nations doomed Woodrow Wilson’s dream of peace. He is also recalled, like his great friend Theodore Roosevelt, as an arch-imperialist and advocate for enhanced naval power; as the first (unofficial) Majority Leader of Senate Republicans in the early 1920s; and as the patriarch of a political dynasty.

But I think he deserves at least a corner in any Hall of Infamy for his xenophobia. In 1881, he termed Irish immigrants “undesirable” and “hard-drinking, idle, quarrelsome and disorderly.” Nearly two decades later, he introduced a bill for “The Restriction of Immigration.”

In three elections, this unregenerate bigot won through the good offices of the Massachusetts state legislature. Now, following passage of the very amendment he opposed, he would, for the first time, directly confront a populace that contained a large proportion that he had dissed as being out of the American mainstream.

That year, the Democratic standard-bearer was John F. Fitzgerald. Two years before, his re-election campaign as mayor of Boston had come to a sudden end when the challenger, James Michael Curley, announced an “educational” lecture for the voters: “Great Lovers: From Cleopatra to Tootles”—a not-so-veiled reference to a blond cigarette girl widely rumored to have had an affair with the married “Honey Fitz.” Fitzgerald would never win another election, and with his enemy Curley in the mayor’s office he could not even count on a united voting bloc.

But opposition to Lodge was fierce enough that Fitzgerald gave the previously comfortable incumbent the most difficult campaign of his career. When the votes were counted, the proud senator had withstood the challenge by only a narrow margin, signaling that the supremacy of the Boston Brahmin in Bay State politics was coming to an end—a fact that registered unmistakably in 1952 and 1962, when his grandson and great-grandson, Henry Cabot Lodge Jr. and George Cabot Lodge, lost Senate campaigns to grandsons of Honey Fitz, John Fitzgerald Kennedy and Edward Kennedy.

Sunday, April 13, 2008

This Day in American History (Birth of Thomas Jefferson, Bibliophile)

April 13, 1743—Thomas Jefferson, a lifelong bibliophile who sparked one revolution in America and countless others the world over by declaring that "all men are created equal," began a life of fundamental paradox when he was born in Shadwell, Va., on a plantation owned by his slaveholder father, Peter Jefferson.

"The only birthday I ever commemorate is that of our Independence, the Fourth of July," the author of the Declaration of Independence said while serving in the White House. In one of the great ironies of American history, both Jefferson and his longtime friend and former political rival, John Adams, died on that second "birthday" in 1826, each unaware that the other was about to pass away.

A further irony: Jefferson was not only unable to rid his nation of an institution he had castigated in his youth, but incapable of ending his own fatal dependence on it in operating the estate he had built, Monticello.

Jefferson's personality was so multidimensional, his achievements so multitudinous, his times so tumultuous, and his legacy so ambivalent that I'll undoubtedly return, time and time again, with your indulgence, faithful reader, to his life. But as a professional librarian and literary aficionado, I'd like to focus today on one aspect of his existence: Books.

I have visited Monticello several times, most recently two and a half years ago, when I was struck in a new way by how fundamental Jefferson's library was to him. It offers a particularly vivid way of illustrating John F. Kennedy's famous quip, when greeting an assemblage of 49 Nobel Prize winners in April 1962, that it was “probably the greatest concentration of talent and genius in this house except for those times when Thomas Jefferson dined alone."

The research of Jack McLaughlin, in Jefferson and Monticello: The Biography of a Builder, highlights just how staggering an achievement his library was for his time. In fact, there were three substantial libraries created by the Virginian in his lifetime: the first, inherited from his father, which was destroyed in the fire that consumed his birthplace Shadwell in 1770; a large general collection that he compiled over the next 45 years at Monticello; and a more specialized one, eventually numbering some 1,000 volumes, that he accumulated in his last decade.

The Monticello library, along with Jefferson's bedroom, the "Greenhouse," and the "Cabinet," takes one of his favorite architectural forms, the octagon. Freely confessing to the "malady of bibliomania," Jefferson admitted, "I cannot live without books."

Most titles here are not Jefferson's original copies, but his detailed lists made it easy for preservationists to reconstruct the collection. The books are not only in English, but also in French, Spanish, Italian, Greek and Old English.

Jefferson classified Massachusetts law books in the "foreign law" section of the library – a graphic illustration of his view of the original states and their place in the new republic. A high-backed red chair is a remnant from his term as Vice-President under Adams.

One reason why Jefferson could not free his slaves at his death was that he was entangled in $100,000 in debt—a fortune at the time—accumulated because he was as much a slave to his appetites—for the best furniture, best paintings and sculptures, and best food and wines (680 of the latter)—as his slaves were to him. Only one aspect of his collecting mania seems forgivable (or, at least, understandable) to me: his books. Even in this instance, however, as with so much else with Jefferson, certain aspects of their creation and dissemination are problematic.

Let’s start with how the books were shelved. McLaughlin notes that, breaking with the traditional practice of the time, Jefferson had them classified not by their size but by subject, because he didn’t want to waste any time locating them. The classification scheme he used, based on Francis Bacon’s system of knowledge, was eventually the same one adopted for the Library of Congress.

But how were those shelves created? You guessed it: through slave labor—more specifically, under the supervision of John Hemings, a talented artisan and the half-brother of Sally (need we say more about her?).

During Jefferson’s first term in office, the first Librarian of Congress, John James Beckley, was appointed at his behest – though, as I pointed out in an earlier post, the President did neither the office nor his country any favors by appointing to the job a political hack who had endeared himself to his Virginia patrons Jefferson, James Madison and James Monroe by leaking the news of Alexander Hamilton’s affair with Maria Reynolds.

Even the far more lasting service to the institution that Jefferson rendered – the bargain-basement sale of 6,000 volumes from his own Monticello collection to Congress, after British troops burned its library during the War of 1812—was not without its nettlesome aspects: He did so at a point in his life when he was already financially hard pressed.

As a Virginia grandee, Jefferson was too mired in the mores of his own planter class to understand that libraries could be the province of anyone else besides white males. But eventually, the desires to inform oneself and improve one's life that lie at the heart of libraries would lead African-Americans on a long and tortuous road to freedom. In many ways, access to books still represents the most powerful weapon they—or any of us—possess as we seek the full American promise of "life, liberty and the pursuit of happiness."

Wednesday, April 9, 2008

This Day in American History (Tax-Supported Libraries)

April 9, 1833—Peterborough, N.H., agreed to support its local library through taxation, making its town library the first in the world to be funded through this means.

All kinds of other events occurred on this date in other years, both major (Turkey’s declaration that Islam would not become its state religion) and minor (First Lady Lucy Hayes began the annual egg-rolling contest on the White House lawn). But let’s focus on the occurrence in this small New England community—which, incidentally, is home to the MacDowell writers’ colony, where Thornton Wilder wrote Our Town and, many say, was inspired by his walks here to create his immortal piece of Americana.

But the library event I celebrate not only is dear to my heart, given my profession, but also ended up being copied by larger communities elsewhere.

I guess, if you want to be technical about it, Peterborough is not the first community to do so—the year before, Boston had taken that honor, passing the enabling legislation that would create the Boston Public Library. But Peterborough’s opened first, so it receives squatter’s rights. And, in a way, it might be more unusual that such a small community made the effort so early on.

Even the innovation pioneered by Boston and Peterborough—free, taxpayer-supported public libraries—took a while to catch on. As late as 1887, New England accounted for 280 of the 424 taxpayer-supported libraries in the U.S., according to Joseph F. Kett’s The Pursuit of Knowledge Under Difficulties: From Self-Improvement to Adult Education in America, 1750-1990.

Subscription libraries, such as the Philadelphia Library Company, founded by Ben Franklin in 1731, were a useful start, but they suffered from one weakness: in economic downturns, members cut funding. Taxpayer funding left public libraries somewhat less vulnerable to these recessions (or even, at worst, depressions).

I write “somewhat” because, whenever a municipality hits a fiscal crisis, library budgets still invariably end up among the first on the chopping block. Maybe it’s a holdover from the old-maid, “Marian the Librarian” stereotype, and a mistaken belief that a bunch of little ol’ biddies wouldn’t protest.

Now, don’t get me wrong. I’m not one of those who bleat about the salaries and/or pensions made by police and fire officials. But I’m sure other areas of city and town budgets can be reduced easily—except that the officials overseeing them always seem beholden to some relative’s idiot offspring or other.

But libraries perform multiple functions that make their existence necessary to any modern civilization. To start with, they should be seen as, in a sense, a corollary to education. They provide students with the resources—i.e., books—they need to learn.

Second—and this might make more sense to politicians who, like Charles Dickens’ Thomas Gradgrind in Hard Times, want to “Teach these boys and girls nothing but facts”—information is the fuel of the global economy, and librarians know how to organize it, put it at people’s disposal fast, and recall or find it in a hurry.

I really have to laugh when I hear politicians argue that communities don’t need libraries because “the Internet has everything.” For one thing, I hate to disappoint you, guys, but it doesn’t. There’s a whole body of older material that is not on the Internet and may well never be, and many organizations make their materials available only to members rather than the general public.

Even more important, if you think of the Internet as one big library, it’s really a library in which the online public access catalogs (replacing those card catalogs) are gone and the books themselves dumped all over the floor in no particular order. Who’s going to find them?

But above all, what the good citizens of Peterborough knew 175 years ago is this: the information that libraries provide can be obtained as easily by the poor as by the rich. I don’t think it’s coincidental that Peterborough’s decision occurred in the Jacksonian Era, when ordinary Americans transformed the young republic from a patrician-dominated elite into the more egalitarian polity we know today.

Peterborough was recently named one of the “10 Coolest Small Towns” in America by Budget Travel Magazine, and for sure its two theater groups and the artistic spirit springing from the MacDowell Colony are responsible for much of this vibe. But as far as I’m concerned, that library gives it even more cachet.

Friday, April 4, 2008

This Day in American History (Assassination of Martin Luther King Jr.)


April 4, 1968—With a single round from his hunting rifle, James Earl Ray ended the life of civil-rights advocate Dr. Martin Luther King Jr. at 6 pm at the Lorraine Motel in Memphis, Tenn.

The assassination of the Nobel Peace Prize laureate also highlighted the multiple ironies in his relationships with his implacable foe, FBI Director J. Edgar Hoover, and his sometimes ambivalent ally, Senator Robert F. Kennedy.

The FBI director had never achieved his desire to undermine King’s moral leadership, and he soon found himself so on the defensive about why King’s murderer remained at large that he was forced to launch the largest manhunt in the agency’s history to catch Ray. (He was finally apprehended in London, two months and more than 3,000 agents later.)

An Obsessive Pursuit

Hoover’s investigation of King’s alleged Communist ties, along with the need to keep Southern Democrats on board to help pass the administration’s program after a razor-thin electoral victory, placed Bobby Kennedy in one uncomfortable position after another while he served as Attorney-General. He deeply frustrated both men, though his liberal tendencies placed him far closer ideologically to King than to Hoover.

The FBI’s investigation and harassment of King rank among the worst offenses in its history. Its relentless wiretapping campaign—which included sending a package of materials to his home in 1964, warning that there was “but one way out for you,” or his “filthy fraudulent self” would be exposed—is well-known, but less so is why King became the object of Hoover’s insane fixation.

One of the best accounts, it seems to me, of Hoover’s derangement on this matter comes courtesy of Secrecy and Power: The Life of J. Edgar Hoover, by Richard Gid Powers. The explanation here is all the more impressive for its attempt to be fair to Hoover.

According to Powers, Hoover was raised in a Washington, D.C. community of traditional values, with a “turn-of-the-century vision of America as a small community of like-minded neighbors, proud of their achievements, resentful of criticism, fiercely opposed to change.” While that vision made him a powerful opponent of both Nazi and Communist subversion, it also led him to believe in a segregated order in which elites held sway.

His racism was so entrenched that it manifested itself in ways that would appear comical if the real-life consequences weren’t so tragic. Hoover thought he was doing Attorney-General Kennedy a favor by doubling the number of black FBI agents, neglecting to consider that this still only raised the total to 10—and this at a time when the backlash against the rising civil-rights movement was at its most vicious and violent.

The FBI first noticed King in 1958, when it learned that the minister had been introduced to Benjamin Davis, a black Communist Party functionary. Three years later, King’s prominent role in the Freedom Riders movement led Hoover to order an investigation of him.

The tempo of this surveillance effort appears to have picked up in late 1962. King had charged that one reason why civil-rights workers in Albany, Ga., were receiving so little protection was that FBI offices in the South were populated with native Southerners who imbibed the mores of their communities and needed to stay friendly with the local police and forces of segregation.

As it happened, the Albany FBI contained four Northerners out of its five agents. One of Hoover’s lieutenants, Cartha DeLoach, left a message to this effect with King’s office, asking for a return call. King, whose strong suit was not organization, either never received the message or forgot about it. DeLoach and, ultimately, Hoover, interpreted this as dissing the agency.

More than 40 years earlier, Hoover had succeeded in destroying a prior charismatic black leader, Marcus Garvey, by gaining his conviction on mail fraud charges that led to his deportation. Hoover now hoped to derail another strong leader of the civil-rights movement.

The man Hoover saw as an ideal replacement for King was Samuel R. Pierce, Jr. Does the name sound vaguely familiar? It should—this was the man that Ronald Reagan later selected as his Secretary of Housing and Urban Development…a man that Reagan once mistook for someone else (hard to understand, since Pierce was the only black member of the Reagan cabinet)…a man whose department became embroiled in corruption and influence-peddling.

Kennedy and King

It was Robert Kennedy’s sad lot to run interference among the racist Hoover, an activist minister who wrote a polemic characteristically titled Why We Can’t Wait, and his own cautious older brother. Kennedy and King’s different backgrounds could lead to fundamental misunderstandings, even when they were on the same side.

At the height of the Freedom Riders campaign, King complained about the lack of protection for the activists. Kennedy’s attempt to liken their lot to that of Irish Catholics in Boston a century earlier did not impress King much. An annoyed Kennedy shot back that King was being ungrateful, since without the presence of federal marshals he and the rest of the group would be “as dead as Kelsey’s nuts.” The Attorney-General’s variation on a common, even more graphic Irish-American phrase (“as tight as Kelsey’s nuts”) just bewildered King, who remarked to aides, “Who is Kelsey?”

Hoover was not above exploiting the tensions. One of his wiretaps caught King watching JFK’s funeral on TV, making an explicit raunchy joke about the President and his widow. Hoover made sure the President’s brother heard about the incident immediately.

I remember the historian Michael Beschloss remarking on C-Span that, if you found Bobby Kennedy staring angrily at you in a meeting, you’d be well-advised to send out your resume the next morning. It was not for nothing that the President’s younger brother had a reputation for ruthlessness.

But, to use Lincoln’s phrase, the “better angels of our nature” also existed in Bobby Kennedy, and these increasingly led him to embrace the cause of civil rights as time went on, and even to put aside whatever resentment he may have felt toward King after learning the contents of the FBI’s tapes. The prime mover behind the civil-rights bill that JFK sent up to Capitol Hill before his death (the one that LBJ later shepherded into enactment), Bobby became an even more full-throated advocate of the cause as a Senator.

As he began his own Presidential campaign, Kennedy’s past differences with King over tactics had faded into the background, and he had a realistic chance of winning the minister’s endorsement at the time King was slain. The news jolted the candidate and his staff as they prepared for a campaign stop in an inner-city area of Indianapolis. Yet, despite fears for his safety on the part of his wife Ethel and local police (there was no Secret Service protection for candidates at that point), RFK decided he had to break the news about the tragedy in Memphis.

The result was one of the most extraordinary moments of his short campaign—one that, years later, people rightly cite as evidence of the enormous potential to move a nation that was snuffed out by Sirhan Sirhan’s bullet two months later.

“Wisdom Through the Awful Grace of God”


As Kennedy addressed the crowd on that cold night, about an hour after King’s death, he began softly, “I have bad news for you, for all our fellow citizens, and people who love peace all over the world. And that is that Martin Luther King was shot and killed tonight.”

The announcement set off gasps, sobbing and cries of “No!” from the crowd of approximately 1,000. But Kennedy plunged on. And now, the senator—so devastated by his brother’s assassination more than four years before that he had never publicly referred to it—bonded with an audience understandably angry over the night’s horrible events.

"For those of you who are black and are tempted to be filled with hatred and disgust at the injustice of such an act, against all white people, I can only say," Bobby continued, "that I feel in my own heart the same kind of feeling. I had a member of my own family killed, but he was killed by a white man. But we have to make an effort in the United States, we have to make an effort to understand, to go beyond these rather difficult times."

After JFK’s death, Bobby had sought solace in a copy of Edith Hamilton’s The Greek Way given him by the President’s widow, Jackie. Now, without worrying about whether he was speaking over the heads of many people that night who might not even have completed high school, he quoted from her translation of Aeschylus: “My favorite poet was Aeschylus. He wrote, 'In our sleep, pain which cannot forget falls drop by drop upon the heart until, in our own despair, against our will, comes wisdom through the awful grace of God.’”

Perhaps under the pressure of the moment, Kennedy made a slight error in reciting the passage: Hamilton had used the word “despite” instead of “despair” in her translation. But Kennedy’s mistranscription only strengthened the sense of existential pain that he was sharing that night with the audience, which now fell silent, paying rapt attention to his searing peroration:

"What we need in the United States is not division; what we need in the United States is not hatred; what we need in the United States is not violence or lawlessness; but love and wisdom and compassion toward one another, and a feeling of justice toward those who still suffer within our country, whether they be white or black.

"So I shall ask you tonight to return home, to say a prayer for the family of Martin Luther King, that's true, but most importantly to say a prayer for our country, which all of us love—a prayer for understanding and that compassion of which I spoke …

"Let us dedicate ourselves to what the Greeks wrote so many years ago: to tame the savageness of man and to make gentle the life of this world.

"Let us dedicate ourselves to that, and say a prayer for our country and for our people."

Nearly 40 years later, Time Magazine columnist Joe Klein aptly summarized why these words were so remarkable: they “stand as an example of the substance and music of politics in its grandest form and highest purpose—to heal, to educate, to lead.” Yet he also noted, correctly, why this short, simple, impromptu address marked a watershed in modern American politics: it represented “the last moments before American public life was overwhelmed by marketing professionals, consultants and pollsters who, with the flaccid acquiescence of the politicians, have robbed public life of much of its romance and vigor.”

Kennedy had risen to the demands of the occasion, showing that he was still young enough to adapt and learn. Hoover, sadly, was not. His survival in office for the last two decades (including, most conspicuously, under the Kennedys) had depended inordinately on what one government official, quoted anonymously in Curt Gentry’s J. Edgar Hoover: The Man and the Secrets, had referred to as “twelve drawers full of political cancer.” He would live another four years, increasingly oblivious to the lessons of tolerance and love learned by the two men he had tried unsuccessfully to put at odds with each other, Kennedy and King.

Tuesday, March 25, 2008

This Day in Labor History (Triangle Factory Fire)

March 25, 1911—In the worst workplace disaster in New York City history before 9/11, a half-hour-long fire broke out near closing time at the Triangle Waist Company in Greenwich Village, leaving 146 dead. The outrage provoked by the incident led to action on long-unheeded calls for better fire-protection measures, while launching a wave of social-welfare legislation and a generation of leaders who paved the way for the New Deal.

One of my memories of 9/11 is the photograph of the “Falling Man” hurtling toward death below to avoid being consumed by the fire raging inside the Twin Towers. Just such a sight—much less novel then—greeted many New Yorkers 97 years ago as they beheld one young woman after another jumping to her death out of the Asch Building near Washington Square.

All through elementary and secondary school, I heard nothing about this crucial event in American history. In fact, the first time I came across it was in the superlative chapter on Alfred E. Smith in Robert A. Caro’s biography of Robert Moses, The Power Broker.

I would hope that modern texts remedy this problem, but I doubt it—kids nowadays are lucky they can figure out in which century the Civil War occurred. In certain ways, however, I believe that March 25, 1911 should be committed to memory as surely as July 4, 1776.

Both dates, in their ways, marked a movement away from heavy-handed control by an elite and toward greater freedom—in one case, for white American males of property; in the later one, for the economically oppressed laborer, frequently female and foreign-born.

So, if I were to design a syllabus to teach this event, what would I choose?

Well, I’d start with So Others Might Live, a fine account of New York’s Bravest by journalist-historian Terry Golway. The section on the Triangle fire is short—only a half-dozen pages—but it gives an excellent précis of the conditions that led to the blaze and of the Fire Department’s helpless anger in combating it.

It also discusses an Irish-American Cassandra: department head Edward Croker, a chief as blunt as he was fearless. For his repeated warnings about high-rise office and factory buildings, Croker had to endure constant smearing by business interests as a mere nephew of past Tammany Hall boss Richard Croker—until events proved him right.

After Golway’s history, I’d assign David Von Drehle’s Triangle: The Fire That Changed America, for a deeper understanding of the background, events and people involved in that day. Though the extent of the tragedy was unusual, the labor conditions that made it inevitable were anything but. Harassment for petty rule violations had sparked a massive waist-union strike only the year before, and at the time of the fire, a hundred accidents occurred in American workplaces every day.

But the Triangle sweatshop, Gotham’s largest blouse-making operation, requires a Dickens to evoke. Its 500 or more workers, mostly young Jewish and Italian women, crouched over their machines. Only the walls and floors met the owners’ claim that the building was fireproof; the fabric and other materials on the factory floor represented potential kindling.

Worse, the operations on the upper floors lay just beyond the reach of fire department ladders, and the doors were locked because of fears of employee theft. When the rickety fire escape collapsed, then, it meant certain death for the fire’s victims, 123 of whom were women.

The New York Times won the Pulitzer Prize for its “Portraits in Grief” after 9/11. On a somewhat smaller scale, facing heavier odds because of the distance of the years, Von Drehle was able to compile his own version of this for the Triangle victims, by combing through countless news articles and a long-lost transcript of the trial of the factory owners (whose acquittal on manslaughter charges brought howls of execration on their heads).

Von Drehle’s account makes clear why the disaster was a landmark event in American immigrant and labor history, but it was also a watershed in American political and urban history. In particular, in the fallout from the tragedy, Tammany Hall—the same political machine that, only a decade before, successfully ran a mayoral candidate with the proud slogan, “To Hell With Reform”—at least partly redeemed its corrupt, largely inglorious history.

For this third phase of the event, Von Drehle should be read in combination with Caro. The indispensable man at the center of this phase was one of the great sphinxes of New York history, Tammany’s chieftain, Charles Murphy. Film buffs know Murphy in fictionalized form, as “Jim Gettys” in Orson Welles’ Citizen Kane, but in real life Murphy kept his own counsel as he shrewdly navigated the political shoals.

Now Murphy acted, giving the go-ahead to his Tammany lieutenants in the Albany state legislature, Al Smith and Robert F. Wagner, to investigate the blaze and what led up to it. Their work led to 25 workplace safety bills in 1912.

More important, that work helped stave off a socialist insurgency in the city (perhaps partly answering Werner Sombart’s famous question about why there was no socialism in the United States) and launched the careers of several illustrious figures: Smith, the governor whose tenure became a kind of laboratory for later New Deal legislation; Wagner, later a U.S. senator and proud patriarch of a line of politicians who figured in city history for nearly three-quarters of a century; and Frances Perkins, who later, as FDR’s Secretary of Labor, became the first woman to serve in the Cabinet.

Saturday, March 22, 2008

This Day in American History (End of Prohibition)

March 22, 1933—With the backing of new President Franklin Delano Roosevelt, Congress approved the Beer and Wine Revenue Act, anticipating full-scale repeal of Prohibition by legalizing sale of beer and wine with alcohol content of 3.2%. At the same time, the President—an enthusiastic if unskillful martini mixer himself—couldn't resist taxing the now freely-flowing beer and wine to fund his multitude of expensive New Deal programs.

(The Eighteenth Amendment instituting Prohibition would not be formally repealed until the Twenty-first Amendment, ratified that December.)

Why didn't more people complain then about how the government giveth and the government taketh away? Maybe they were so pleased not to be "dry" anymore that it didn't matter.

Or maybe it was a case of having something bigger on their minds, like how to find jobs in the middle of the worst Depression this country had ever seen. One-quarter of the nation out of work—now if that wasn't enough to drive a whole country to want a drink, I don't know what was.

In any case, the country celebrated with alacrity—probably none more so than the literati who made opposition to Prohibition practically a union card for admission into the ranks of the avant-garde. Leading the way was iconoclastic man of letters H.L. Mencken, who made sure that at midnight on April 6, when the new law went into effect, he was first at the bar of Baltimore's Rennert Hotel. The next day's issue of his paper, the Baltimore Sun, featured a photograph of its most famous columnist, the "High Priest of Brew," quaffing a nice cold one.

The groundswell against Prohibition represented a stunning turn of events from even a few years before. To be sure, the 18th Amendment to the Constitution and its enabling legislation, the Volstead Act—twin measures constituting what Herbert Hoover termed, in his most harrumphing style, a "great social and economic experiment, noble in motive and far reaching in purpose" —had provoked widespread circumvention of the law ever since it went into effect in 1920.

But as late as 1931, advocates of "Repeal," as the anti-Prohibition movement was called, expected that it would take another decade to achieve their goal. The sudden collapse of support for legislative attempts to enforce temperance, then, testified to the national revulsion against the hypocrisy and criminality (the Mafia got its big boost through bootlegging during the Roaring Twenties) engendered by Prohibition.

One of these Prohibition opponents was Columbia University President Nicholas Murray Butler. Andrew Sinclair's Prohibition: The Era of Excess quotes the Nobel Peace Prize winner and past Republican candidate for President in a typically stuffy moment: "My own feeling toward prohibition, is exactly the feeling which my parents and my grandparents had toward slavery. I look upon the Volstead Act precisely as they looked upon the Fugitive Slave Law. Like Abraham Lincoln, I shall obey these laws so long as they remain on the statute book; but, like Abraham Lincoln, I shall not rest until they are repealed."

(Let's leave aside the way Nick conflated opposing a racist, soul-destroying institution with sneaking around a misguided attempt at moral busybodyism, or with his almost comical injection of himself into the whole thing. No, if you ask me, Nick might have better spent his time by "not resting" until he had eliminated the quota system that drastically restricted Jewish faculty hires at his university. But that's a story for another day.)

Some years ago, a new kind of sport on college campuses was inspired by repeats of The Bob Newhart Show. Every time another character entered his apartment with the greeting "Hi, Bob!", some student would down a drink—presumably getting buzzed midway through one half-hour episode and well on his way to oblivion by the end of a TV Land marathon.

Imagine what some of these students could have done with the literature inspired by the Prohibition Era! Every time a character downs one, they drain one in response! Let’s not even consider Hemingway’s The Sun Also Rises, since that featured expatriate imbibers. We’ll stick closer to home—there’s more than enough material right there.

One contender for poet laureate of Prohibition might be Joseph Moncure March with his long poem The Wild Party (later adapted—twice—into a musical). Starting from its opening lines—"Queenie was a blonde and her age stood still,/And she danced twice a day in vaudeville"—it reads like a tabloid story come to life.

Lyricism so beautiful it makes you gasp at times also blinds readers of The Great Gatsby to the fact that, at heart, it is every bit as excessive and violent as March’s tale. Everybody knows that the eponymous “hero,” Jay Gatsby, is a hopeless romantic who decides that the best way to get the money to win his girl Daisy back is by becoming a bootlegger.

But it’s forgotten just how often the euphoric consumption of alcohol in the novel is followed by violence. Gatsby’s party begins with “yellow cocktail music” played by the orchestra and ends, several drunken hours later, with “women … now having fights with men said to be their husbands.” A party with Tom Buchanan’s mistress, Myrtle Wilson, ends, several drunken hours later, with Buchanan breaking Myrtle’s nose. The last meeting with Gatsby and Daisy starts in West Egg, moves to a party and more drinking in Manhattan, and ends—several drunken hours later—with Daisy accidentally running over Myrtle Wilson.

But even more than Fitzgerald, the writer most associated with the speakeasy culture might be John O’Hara. The two early novels that won him his reputation—Appointment in Samarra and Butterfield 8—might have been published after the end of Prohibition, but they are set in that period. Moreover, many of his later short stories set down, with almost documentary accuracy, what it was like to live through that era.

In the novella "Imagine Kissing Pete," part of his woefully underappreciated trilogy Sermons and Soda-Water (1960), O'Hara, speaking in the voice of alter ego James Malloy—like himself, a middle-aged writer forced by circumstance to stop drinking cold—could still write sternly that Prohibition bred "a cynical disregard for the law of the land" and "made liars of a hundred million men and cheaters of their children." He even traced its malign influence on "West Point cadets who cheated in examinations [and] the basketball players who connived with gamblers."

O’Hara is a far more powerful voice against Prohibition than Butler, all the more so for being so disillusioned. But now, a word on behalf of the Prohibitionists.

As counterproductive as their legislation was, at the time of its enactment there was no truly effective way to combat alcohol abuse, a scourge that had destroyed families. Women such as Carry Nation and Frances Willard had been the driving forces behind the anti-saloon movement, and it was no coincidence that the amendments for Prohibition and women’s suffrage represented virtually the last hurrah of Progressive legislation.

It would take Bill Wilson, founder and longtime head of Alcoholics Anonymous, to create an organization and method for attacking alcohol abuse. His idea, however, would not take shape until 1935—two years after the repeal of Prohibition.

Wednesday, March 12, 2008

This Day in American History


March 12, 1912—The first meeting of the Girl Guides—soon to be known as the Girl Scouts of the U.S.A.—was held at the home of Savannah hostess Juliette Gordon Low.

In early November 1999, on the same vacation when I toured Savannah and witnessed the filming of The Legend of Bagger Vance there, I visited the Juliette Gordon Low Birthplace, the first National Historic Landmark in one of my favorite cities. (That's it in this photo, taken from the Web site of the Girl Scouts.) Built in 1821 for James Moore Wayne, later an associate justice of the Supreme Court, the house, located at the corner of Bull Street and Oglethorpe Avenue, was purchased 10 years later by Low’s grandfather. It has been restored to reflect its look at roughly the time of Low’s marriage in 1885, just before significant alterations were made by her parents.

Appropriately enough for the week I was there, as well as for a city that has embraced its dark side in a major way since Midnight in the Garden of Good and Evil, I learned that Low was born on Halloween night in 1860.

A posthumously published piece of hers, “Memories of My Girlhood,” collected in Literary Savannah, related what happened when Union General William T. Sherman, a prewar friend of her mother’s, stopped by the house to ask if he could be of any use to her while his army occupied the city.

Young Juliette asked a Union soldier in Sherman’s party how he had lost his arm. Upon being told that it had been shot off by a rebel, Juliette answered naively, “I s’pose my father did it. He shot lots of Yankees.” (Luckily for Juliette, her mother hustled her out of the room.)

As I researched Low’s life, it occurred to me that she might have wished that her aim had been as good as her father’s – at least when it came to handling her wayward husband. A wedding photo upstairs shows the British groom, a handsome charmer named William Mackay Low, towering over his 5-ft.-1-inch bride.

This was not to be the only difference between the two. Already partly deaf in one ear, Juliette suffered further impairment when her doctor, removing a grain of rice that had lodged in her good ear after being thrown at the wedding, punctured the eardrum. The bride would eventually suffer continuing hearing loss that left her subject to periodic melancholia.

Despite all this, Juliette made the best of life in England, making all kinds of social rounds, but the marriage increasingly came to resemble a southern version of an Edith Wharton novel, with a wife trapped in an increasingly loveless marriage, without even a child to offer a semblance of solace.

The couple spent more and more time apart. By the time William died in 1905, they were not only separated, but he had left his mistress the bulk of his estate. Juliette contested the will and ended up with a large settlement.

Perhaps it was just as well that Juliette did not plunge even more deeply than she had into the English social whirl—all those country estates and afternoon teas would have tried the patience of this spirited woman with a blithe disregard for prevailing norms.

When she first got behind the wheel of an automobile in England, “Daisy” Low was reprimanded for driving on the right side of the road. (Protesting “But I’m an American!” didn’t help her cause.) By the time she learned to drive in the English manner, she had decided, after her husband's death, to return home, where the rules of the road were different—as she discovered when she crashed into someone’s brick wall. When her brother rushed to the scene, she informed him that she hadn’t told the owner of the house of the damage he had just suffered: “Oh, I didn’t want to bother him.”

In a straitlaced time, Low was not averse to shocking onlookers by standing on her head and exposing her bloomers. If she wasn’t fond of you, she’d seat you in the dining room behind a carving she had made of an animal’s rear end.

Low did come away with one significant relationship from her long European sojourn: a friendship with Sir Robert Baden-Powell, a retired general and founder of the Boy Scouts. The crusty general didn’t think much of the idea of girls’ scouting until his sister Agnes turned him around. The two then encouraged Low to establish a similar movement in the United States.

Previously desultory in her volunteer efforts, Low now had found her life’s work, demonstrating the truth of the George Eliot observation that "It is never too late to be what you might have been." She took to the task with aplomb, proving to be an irresistible force with her wit and charm: buttonholing family and friends for help, raising funds, and touring the nation on recruitment drives, disregarding social distinctions and physical disabilities alike as she turned her original group of 18 into the largest voluntary association of women and girls in America.

When Low died of breast cancer in 1927, she was buried next to her parents in the family plot in Laurel Grove Cemetery. At her request, she was buried in her green serge Scouts uniform. Upstairs, in her birthplace, a painting depicts her in khaki rather than the green color adopted later. (Khaki, it came to be felt, had military overtones that it was better to de-emphasize.)

By the time of my visit, the Girl Scouts’ uniform had been redesigned yet again, by Bill Blass. Somehow, the idea of a fashion maven putting his touches on clothing meant to be functional doesn’t sit well with me. I wonder what Juliette would have done. Maybe stand on her head and think on it?

Sunday, March 9, 2008

This Day in American History

March 9, 1892—Early in the morning, a white mob seized three African-American prisoners from a Memphis jail, took them to a rail yard and shot them, in retaliation for their alleged part in the wounding of three white sheriff’s deputies.

The murdered prisoners, owners of the black-owned People’s Grocery Company, had been friends of a 30-year-old former schoolteacher. But the young woman, Ida B. Wells, was now a journalist, and the death of her friends launched her on a decades-long crusade to document and destroy domestic terrorism against African-Americans in the Jim Crow era of southern segregation.

Lynchings, the particular focus of Wells’ work, were open murders of individuals often suspected of criminal activity, usually carried out spontaneously by mobs, and perpetrated publicly as a warning to others. The origins of the practice have been traced to Ireland and to colonial-era North and South Carolina.

At their worst, lynchings involved not just hangings and shootings, but also burning at the stake, maiming, dismemberment, castration, and other brutal methods of physical torture.

As heinous as the crimes themselves was their routine nature. Nearly five thousand Americans were lynched between 1882 and 1951, an average of more than one a week, according to the Yale-New Haven Teachers Institute. (That number included 1,200 whites who were murdered in the South for voting Republican or for being sympathetic to blacks.)

Though whites cited sexual crimes—overwhelmingly trumped-up charges—as justification for twisting the law for their own purposes, the lynching involving the People’s Grocery Company illustrated another motivation: blacks’ assertion of political or socioeconomic rights. The People’s Grocery had dared to compete against a white company that had previously enjoyed a monopoly on blacks’ business in an area on the edge of Memphis then known as “The Curve” (named for the arc made by streetcars).

Daily white-black arguments led to threats against the People’s Grocery. The owners’ plea for police protection was met with a that’s-not-in-our-jurisdiction response (the grocery stood just outside city limits) and a suggestion that they use guns to protect themselves. In the ensuing confrontation, three whites were shot. The owners—Thomas Moss, Calvin McDowell, and Henry Stewart—seemed to be safe, at least momentarily, in jail. But when a black contingent guarding them decided after three days that they needed no further protection, the white mob abducted and lynched the businessmen.

As co-editor and part owner of the Memphis Free Speech, Wells swung into action. A blistering series of articles chronicled the extent of lynching as local social control (eight lynching cases in the Memphis area in just one month of 1892), contended that a Winchester rifle should have a “place of honor in every black home,” and advised readers of their last resort: “save our money and leave a town which will neither protect our lives and property, nor give us a fair trial in the courts, but takes us out and murders us in cold blood when accused by white persons.”

Eventually Wells was run out of town over a particularly pointed editorial noting that, contrary to the notion that lynchings were justified as defenses against the rape of white womanhood, some involved black men who had engaged in consensual sex with white women. But she merely continued her crusade elsewhere.

In Britain, Wells delivered 102 lectures in an attempt to bring international pressure to bear on the United States to enact anti-lynching legislation. In Chicago, she married an attorney who sold her his shares of the Chicago Conservator, enabling her to become full owner of the city’s first African-American newspaper.

As fearless as she was indefatigable, Wells tangled not only with racists but also with other advocates for women’s or civil rights. She upbraided Frances Willard when the head of the Women’s Christian Temperance Union, in an attempt to recruit southern women, accepted the rape myth and condoned lynching and the color line. And, in a low point in an otherwise often-brilliant career, W.E.B. DuBois bragged about marginalizing her influence within the NAACP, an organization that she (along with him) was instrumental in establishing.

In a time when reporters have devolved into gotcha pestering of Presidential candidates when they’re not chasing celebrities on their way in or out of rehab, it’s important to recall an era when journalists made a difference in the lives of others.

Shakespeare wrote that “some are born great, some achieve greatness, and some have greatness thrust upon them.” The daughter of slave parents, Wells assuredly was not born great. It was the unfortunate times in which she lived—a true dark age for African-Americans—that thrust greatness upon her. But she also achieved greatness, over and over again. Her life and work stand as monuments to what one person can do armed only with facts and courage.

Monday, March 3, 2008

This Day in Congressional History (Jefferson Davis Wins OK to Buy Camels for Army in Southwest)

March 3, 1855—In one of the most visionary or oddest (take your pick) actions in its history, Congress appropriated $30,000 at the request of Secretary of War Jefferson Davis to purchase and import Egyptian camels for military use in the deserts of the Southwest.

One thought went through my mind as I pondered why Congress might not “walk a mile for a camel” (as the old TV cigarette commercial put it) but would pay a heckuva lot for a batch of them. How did Davis win approval for this at a time when a) the South was disinclined to fund even internal-improvement projects, and b) like now, Congress was better at creating projects than appropriating money to keep them going? (Take a look at Yellowstone National Park, which Congress created in 1872 without bothering to fork over the necessary maintenance funds.)

Given the ultimate fate of the sides they chose in the Civil War, most Americans today would be surprised to learn that in 1861, many of their ancestors thought more highly of the executive abilities of newly elected Confederate President Jefferson Davis than of the North’s Abraham Lincoln.

It wasn’t only that Davis had become a leading Senator while Lincoln had served only one largely unnoticed term in the House of Representatives, or even that Davis’ military glory in the Mexican War contrasted starkly with Lincoln’s almost comic-opera service in the Black Hawk War of 1832.

Like his predecessor as the South’s spokesman for states’ rights, John C. Calhoun, Davis really left his mark as one of the ablest Secretaries of War in this country’s history—something admitted by even some of his worst enemies. (One of whom was Senate colleague Sam Houston, who termed Davis “as ambitious as Lucifer and cold as a lizard.”)

All of this occurred, of course, before the faults as a politician and administrator that helped doom his leadership of the Confederacy came to the fore, including a lack of flexibility, an inability to work with others, and, of course, a rather large moral blind spot in regard to slavery.

Halfway through his term as Secretary of War under President Franklin Pierce, Davis had come a long way toward his goal of modernizing the Army. On March 2, he had gotten Congress to create two regiments each for the infantry and cavalry, reorganizing the latter as a separate army branch. 

(Many of the officers he installed in this crack new unit went on to gain renown for the Confederacy, including Robert E. Lee, Jeb Stuart, Joseph E. and Albert S. Johnston, and John Bell Hood.)

A supporter of a railroad stretching from the Mississippi to the Pacific, Davis hoped to build the new route through the South. In reports to President Pierce concerning explorations and surveys in 1853 and 1854, he took notice of the use of camels in the Mideast, and, believing they could also be effectively used to open up commerce in the vast Southwest desert, suggested that a small number be purchased for use here. Congress adopted his suggestion.

With no camels in the U.S., Davis had to bring them from abroad. This required sending a combined army-navy force to the Eastern Mediterranean on two separate trips to buy a total of 77 Bactrian (two-hump) and dromedary (one-hump) camels; enlisting the help of six Arabs and a Turk to help pick them; transporting the 70-plus beasts of burden back to Indianola, Texas, a year later; and having a short, cheerful camel-driver named Hadji-Ali (nicknamed “Hi Jolly” by the Americans) teach his art to Western mule-skinners.

In 1857, Edward F. Beale used 25 of these camels to help survey a wagon road from Fort Defiance in New Mexico to eastern California. His subsequent report praised their abilities as pack animals (they could haul 600 pounds for 30 miles in desert conditions), with one significant caveat: they smelled so bad and were so mean that neither horses nor men wanted to get near them.

Davis’ successor at the War Department, John B. Floyd, was encouraged enough by the experiment to ask Congress for a thousand more. By this time, though, sectional tensions were running so high that Congress wanted no part of it, and the request died.

The “Camel Corps”—and with it, an early American encounter with the mostly unknown Middle East—ended, largely unnoticed, in 1866. By then, the remaining camels had been shipped to Benicia Arsenal in California, where they were auctioned to the public. Some were used to haul freight to Nevada mining camps, while others were turned loose in the Southwest desert, where they popped up from time to time to scare the crap out of unwary travelers.

Today, the Benicia Historical Museum at the Camel Barns commemorates this most unusual period in American military history.

Monday, February 25, 2008

This Day in New York History (Nativist 'Bill the Butcher' Meets His Maker)

February 25, 1855—William Poole, better known as “Bill the Butcher” of the Bowery Boys Gang, was shot at Stanwix Hall in New York by gunmen doing the bidding of Tammany Hall tough John Morrissey.

The Butcher, who died two weeks later, was buried in an unmarked grave in Green-Wood Cemetery.

Poole became adept in using knives by working in his father’s butcher shop, then by opening his own. 

His muscular frame made him a natural “shoulder-hitter,” or enforcer, for the virulently nativist True American or Know-Nothing Order of the Star-Spangled Banner (the Know-Nothings earned their nickname because their oath of secrecy enjoined them to answer, when questioned about the group, “I know nothing”). It was in this latter capacity that he made the fateful acquaintance of Morrissey.

Some months back, Morrissey—who made his name among the immigrant Irish in challenging the True Americans’ control of the streets and the polls by smashing up their meeting halls—had gone into the American Club on Water Street and challenged heavyweight champ Tom Hyer and Bill the Butcher. 

Morrissey was cut up so badly that The Butcher decided he could live, as an example of what could happen to the unwary.

On the night he died, Poole was in the Stanwix when Morrissey came in, drew his gun, and pulled the trigger three times—each time, the weapon misfired. (That sort of thing happened surprisingly often in those days.)

Poole was ready to use his knife on the Tammany tough when Morrissey was hustled out of the hotel in the nick of time. Some time later, Morrissey’s henchmen came back and gunned The Butcher down.

In 2003, a granite headstone was placed on Poole’s grave bearing his last words: “Goodbye, boys: I die a true American.” The fact that this anti-Irish, anti-Catholic hood was posthumously remembered in this fashion might derive from his recent curious cinematic fame, as “Bill the Butcher” Cutting, in Martin Scorsese’s 2002 film,
Gangs of New York.

As in There Will Be Blood, Daniel Day-Lewis allows his Snidely Whiplash moustache to perform half his acting job in this role. (That's the actor in costume in the photo accompanying this blog entry.) Moreover, Scorsese took dramatic license with the real-life facts by placing The Butcher’s death closer to the New York City draft riots of 1863.

The director rather freely adapted the
Herbert Asbury history of the 19th-century New York underworld. Scorsese leaves his own very individual thumbprint on every one of his films (notably through his fascination with crime, betrayal, and the costs of living within a group's codes), so even his worst work is not without interest.

But aside from that ridiculous handlebar moustache of Day-Lewis', how much do you recall about the film? One great scene where the Irish gang led by Liam Neeson comes in from all sides to form a great mass to face off against the nativists; that curly wig of Cameron Diaz’s; and anything else? I didn’t think so.

Instead of concocting a ramshackle plot with events that nobody can remember, Scorsese would have been better off adapting a novel that covered the same time period and milieu (and even includes the Poole killing as a background incident)—Peter Quinn’s
Banished Children of Eve—which has the advantage of an actual plot and characters an audience could really care about.

Sunday, February 24, 2008

This Day in American History (House of Representatives Impeaches Andrew Johnson)

February 24, 1868—By a straight party-line vote of 126 to 47, the Republican-dominated House of Representatives impeached
President Andrew Johnson. Several months later, in the Senate, he escaped conviction and removal from office by a single vote.

Public perceptions of the impeachment of Johnson have changed, then changed again, over the years, in a fascinating case study of historiography. In the years immediately after the event, the President and his defenders were vilified in the North.

In the first half of the 20th century, a number of historians, following the lead of
Columbia University professors William Dunning and John W. Burgess, depicted Reconstruction as an era marked by corruption and Radical Republican vengeance. They viewed Andrew Johnson as at worst an imperfect upholder of the Constitution and at best a man of courage.

Starting in the mid-1950s and the civil rights movement, however, a more complicated picture emerged. 

Kenneth Stampp, James McPherson, David Herbert Donald, John Hope Franklin, Eric McKitrick and Eric Foner (the last two in the same Columbia history department as Dunning) pursued the lead of African-American historian W.E.B. DuBois’s Black Reconstruction in America, 1860-1880 (minus the Marxist-influenced ideology) in showing this era as a lost opportunity for American racial and social progress. 

These “revisionists” see President Johnson as deeply racist and his adversaries as more moderate than that loaded term “Radical Republican” would suggest.

The events leading up to the impeachment votes bolster the argument for the last viewpoint.

Johnson, a lifelong Democrat, was nominated by Republicans as Abraham Lincoln’s running mate in the 1864 election as a way to win the votes of Union Democrats, including those from Southern border states.

Lincoln’s assassination, then, put in the White House a man largely out of sympathy with the party that controlled Congress.

The Dunning school of thought on Reconstruction posited that Johnson was merely following his predecessor’s lenient Reconstruction policy. Its adherents ignored, however, both Lincoln’s political suppleness and the evolution of his policies.

Unlike Johnson, Lincoln was a master of men, a politician with a highly attuned sense for exactly what was politically possible. If anyone could have reconciled Southerners and Radical Republicans, it was he.

But there was a limit to how much even this man who called for “malice toward none” and “charity for all” would accept.

In Abraham Lincoln: Man Behind the Myths, Stephen B. Oates takes issue with the idea that Lincoln would have been accommodating to the South. In fact, he shows how, on issue after issue (notably that of compensation for slaves), the South’s stubbornness pushed him increasingly toward progressive, pro-freedmen policies.

Though initially interested in indicting leading Confederates (especially the Southern aristocrats that he blamed for starting the war) for treason, Johnson soon parted ways with the Republicans over using government to promote economic development and over greater political and economic opportunity for freed slaves.

At every turn, Johnson used his office to circumvent the will of Congress: by appointing conservative generals to administer Southern military districts, by giving civilian governments the authority to control voter registration and the election of convention delegates, and by obstructing any role for freedmen in the new Southern governments.

Johnson’s personal failings didn’t help. His intoxication during his inaugural address as Vice-President (Lincoln kept his eyes shut throughout the painful ordeal, then rose to his feet to deliver his immortal Second Inaugural Address) enabled detractors to depict him (wrongly) as a habitual drunkard. 

His customary intransigence and public belligerence toward opponents – even accusing Congress, on Washington’s birthday in 1866, of trying to assassinate him and to start a second rebellion – unnecessarily alienated many.

The showdown between President and Congress came over the
Tenure of Office Act, which provided that all officials confirmed by the Senate could only be removed with that body’s approval. 

Johnson sought to remove Secretary of War Edwin Stanton for defiantly claiming in a Cabinet meeting that military governors in the South were answerable to Congress, not to the President.

At first, Johnson sought to fire Stanton by offering his post to General Ulysses S. Grant. The Union hero, having his own disagreements with the President on Reconstruction, wouldn’t take the bait. 

A long, painful search for a successor then led the President to General Lorenzo Thomas, who made the mistake of acceding to Stanton’s wish for “time for reflection.”

Bolstered by a one-word message from Massachusetts Senator Charles Sumner—“Stick”—Stanton proceeded to barricade himself in his office! For a second time, Johnson removed him. 

A day later, on February 22, the spokesman for the powerful House Reconstruction Committee, Thaddeus Stevens, drafted 11 counts of impeachment—two dealing with Johnson’s indecorous behavior toward Congress, the others with the Tenure of Office Act.

(Particularly in the early 20th century, when he was caricatured as “Austin Stoneman,” the wild-eyed Congressman in D.W. Griffith’s Birth of a Nation, Stevens came in for harsh criticism. His constitutional case against Johnson was not the strongest, but this passionate supporter of civil rights long before it was fashionable deserves better than his caricature, and is finally getting his due from historians.

Though nearly all of his correspondence with his household “servant” has been lost, the few letters that survive strongly suggest that the two were an interracial couple in a common-law marriage. Before his death, not long after the impeachment proceedings, Stevens insisted on being buried in the only cemetery in his area that allowed the interment of blacks and whites together, so that he could lie next to the woman he loved.)

Partly because the coming trial had such important implications for the later impeachment proceedings against Nixon and Clinton, and partly because the trial was such a fascinating spectacle in itself, I will be returning to the story of Johnson’s impeachment at a later point.

Suffice it to say for now that his prospects for acquittal were enhanced immeasurably by his legal team’s insistence that he stay silent and not even attend the proceedings – his mere presence might have tipped the balance decisively against him.

Johnson, like all Presidents, had the responsibility to ensure that the laws be “faithfully executed.” The Radical Republicans surely overreached with impeachment charges based on the Tenure of Office Act, but Johnson just as surely failed in that constitutional duty.