
Archive for the ‘New York Times’ Category

Unconventional Wisdom

OP-EDS & REVIEWS

By Gil Troy, New York Times, 8-27-12

With fewer Americans interested in party conventions and television executives providing less prime time coverage, the calls to “just scrap ’em” are mounting.  This summer, CBS announced it preferred broadcasting a rerun of “Hawaii Five-0” to convention speeches, while Chris Wallace of Fox News toasted the good old days when “real business got done.”

Primary voters, not convention delegates, select the presidential nominees. The nominees announce their running mates before the conventions begin. Nearly everyone seems to agree: these party parleys risk irrelevance.

But the conventional wisdom about conventions is wrong. Conventions still count. They help define the candidates, frame the debate, command attention and inject some communal moments into an increasingly atomized political process.

Maintaining traditional rituals is an important, unappreciated element of the campaign as a whole, a key part of its legitimizing function. The way we mobilize citizens, build candidate credibility and reaffirm party identity in two parallel rituals — despite all the partisan enmity — helps explain America’s quicksilver shift from vicious campaigns to peaceful, often rapturous, inaugurations. These familiar political ceremonies broadcast a reassuring continuity and stability even as candidates promise change, and partisans warn of disaster if they lose.

Since the 1830s, these matching, deliciously democratic rites have shaped campaigns, enhancing the dialogue between candidates and voters. Until Andrew Jackson’s democratizing revolution, “King Caucus” reigned, as Congressional leaders picked party nominees secretly. The conventions reflected nineteenth-century Americans’ emergence as partisans and not just voters. Popular party politics became the first great American national pastime. Then as now, convention delegates were both mediators and validators, conveying messages to candidates from their constituents, while bathing the candidates in populist love with hoopla and huzzahs.

True, conventions were once kingmakers, selecting the party’s nominee, often to the people’s surprise — and occasionally to party elites’ chagrin.  Originally, delegates chosen by local party leaders convened in elaborately festooned halls, like the “Wigwam” in Chicago, where Abraham Lincoln was nominated by the Republican Party in 1860. Back then, nominations and even the basic character of the party were up for grabs, as local political bosses squabbled over the platform while choosing the party’s “standard bearers” – the campaign’s military metaphors announced the party’s commitment to mobilizing manpower while maintaining discipline.

More power-hungry than idea driven, the bosses were angling for “spoils” and protecting turf, not just advancing policy positions. And the defining convention cliché — when delegates from the “great state” of LouWHEEziana or CaliFOURRRnia or wherever else praised their home bases effusively — affirmed regional sensibilities while uniting an increasingly centralized polity.

Ticket to the 1928 Democratic National Convention, held in Houston, Texas. (Library of Congress)

Seeking a “balanced ticket” to reflect both parties’ traditional self-image as broad, umbrella coalitions, conventions often produced awkward shotgun marriages. The Republicans in 1900 paired the staid William McKinley with the bombastic Theodore Roosevelt. The Democrats in 1928 mismatched the Northern Catholic city slicker Governor Alfred E. Smith of New York with a “favorite son” candidate from Lonoke, Ark., Senator Joseph T. Robinson.

Divided and disputatious, conventions frequently deadlocked. In 1924, the Democrats took 103 ballots to nominate a candidate. Sometimes, the stalemates reflected the party’s fragmented politics, producing “dark horses” — unexpected, inoffensive compromise choices — such as the Democrat James Knox Polk in 1844 and the Republican James A. Garfield in 1880. Sometimes, great ideological divisions were at play. In 1852, the Whigs, splintering over slavery, nominated the antislavery General Winfield Scott on the 53rd ballot, even as the platform appeased Southerners by endorsing states’ rights. Antislavery Whigs supported their nominee while “spitting upon the platform,” in Horace Greeley’s memorable phrase. Twelve years later, during the Civil War, when the nominee George McClellan reversed the pacifist Democratic convention’s priorities by saying “the Union is the one condition of peace,” the vice presidential nominee, George H. Pendleton, was so furious that McClellan would not end the war unconditionally that he boycotted his running mate’s campaign events.

Originally, nominees rarely attended the conventions, and never addressed the delegates once chosen. Believing a candidate’s reluctance and passivity reflected his virtue and suitability, the party offered the nomination by mail, which the nominee accepted with a formal reply. In 1848, Zachary Taylor’s acceptance was delayed for weeks because the notification committee’s invitation, sent postage due, languished in the Dead Letter office. The thrifty Taylor only accepted letters with prepaid postage.

By 1852, Scott used a new invention — the telegraph — to accept immediately, becoming the first nominee to address a convention directly, albeit remotely. When the rival Democrats chose Franklin Pierce after 49 ballots, the notification committee’s visit to him created a tradition of sending a party delegation to make the offer in person.

Pierce kept quiet that day. The post-convention notification ceremony later grew into a spectacle, as large delegations representing the diverse party interests visited the nominee, who, increasingly, endorsed the party and the platform with a full speech. By 1892, the Democratic financier and strategist William C. Whitney rented out Madison Square Garden so that the ex-president Grover Cleveland, seeking a comeback, could accept in front of 20,000 people.  Democrats rejoiced that this ceremony “indicated that the candidates were in touch with the people.” Republicans mocked Cleveland as “Jumbo” the circus elephant playing Coney Island.

Even before they could be transmitted live, dramatic convention moments united Americans. After William Jennings Bryan’s electric “Cross of Gold” speech in 1896, the once-obscure 36-year-old Nebraska Congressman became a national celebrity.  From then on, his wife recalled, they lost their privacy: “The public had invaded our lives.”

In the twentieth century, the proliferation of primaries increasingly shifted the focus from the convention delegates to the people. Franklin Roosevelt’s decision to fly to Chicago and accept the nomination in person in 1932 was a twofer: it illustrated his vigor despite his polio and it signaled his readiness to offer a daring “New Deal.” Functioning more as coronations than contests, conventions now climaxed with acceptance speeches. Especially with the televising of the conventions starting in 1948, the Republican and Democratic gatherings became more about what the candidate stood for than who the nominee would be.

In 1912, Theodore Roosevelt, trying to recapture the presidency, championed direct primaries to bypass the party bosses who opposed him. The major issue, Roosevelt said, is “the right of the people to rule.” While these primaries were “beauty contests” sporadically reflecting voter appeal, candidates began arriving at conventions with established national reputations and independent power bases.

These blows to the conventions — and party bosses — boosted democracy. The spread of Republican and Democratic primaries, especially after the party reforms of the 1960s, popularized the nomination process. The drama of conventions now stemmed from what politicians said and did rather than which presidential aspirant won or lost. The Democrats’ divisive, disruptive conventions in 1968 and 1972 helped elect Richard Nixon to the presidency, twice. In 1968, Hubert Humphrey could not recover from the generational conflict that erupted in riots between mostly Democratic working-class Chicago cops and mostly Democratic radical student protesters. The botched convention helped him lose the presidency by a slim margin.

Four years later, the convention defined George McGovern as the candidate of “amnesty, abortion, and acid.” As one McGovern supporter put it later, “we should have had a coat-and-tie rule,” as many of the 50 million viewers at home saw too many long-haired hippies in tie-dyed T-shirts on the convention floor. McGovern became the first candidate since polling began to drop by two points rather than enjoy a “convention bump.”

In 1992, Pat Buchanan’s alienating, shrill call for “religious” and “cultural” war to “take back our country” taught Republicans the political dangers of convention extremism. By contrast, that year, Bill Clinton tapped into the convention’s contemporary power as a forum for communicating with the masses, strolling toward Madison Square Garden with his wife and daughter in tow, as part of an image makeover that helped him find his way.

Like the Olympics opening ceremony they always follow, these televised party carnivals forge party solidarity and launch the campaign, but they can still make or break candidacies. We could do without them because like the Olympics they are often overblown and self-important, but we would miss them (and we’d miss complaining about them too).

The conventions are part of the real action. Try explaining George Bush’s turnaround victory in 1988 without his convention call for a “kinder, gentler nation,” or George W. Bush’s surprisingly narrow 2004 victory without his joke that his “swagger” was merely considered walking in Texas, or Barack Obama’s entire career without his 2004 Democratic convention keynote speech proclaiming that “we worship an awesome God in the blue states, and we don’t like federal agents poking around in our libraries in the red states.”

Maintaining a democratic dialogue with 300 million citizens is hard. Using this traditional medium — resounding with history and the echoes of earlier speeches, incorporating the battles resolved and the triumphs achieved — roots the often stressful election in America’s proud and ongoing democratic heritage. The mirror image convention rituals of the seemingly hostile parties eloquently broadcast a message of commonality even amid the many policy differences.

Ultimately, these dueling conventions remind us that presidential campaigning is not just about choosing a winner, or debating the national future. It is also, like every good national ritual, about binding a community together through symbols and stories and reaffirming our joint past, common ties and shared fate.

Gil Troy, a professor of history at McGill University, is the editor, with Arthur Schlesinger Jr. and Fred Israel, of “History of American Presidential Elections, 1789-2008,” fourth edition. His most recent book, “Moynihan’s Moment: America’s Fight Against Zionism as Racism,” will be published this fall.




By Gil Troy, New York Times, 6-26-12

In running for re-election, Barack Obama commands the most powerful democratic platform in world history and the greatest backdrop, the White House. A seemingly casual announcement in a TV interview can trigger a political earthquake, as Obama did when he endorsed gay marriage. But the president’s magnificent residence can also be what Harry Truman called the Great White Jail.

Presidents are handcuffed by their power. Presidential statements can crash financial markets or start wars. The dignity of the presidency also inhibits, even in today’s brutal political environment. Obama’s campaign ad attacking Mitt Romney’s record at Bain Capital made some Democrats squirm as Republicans labeled the president “another gut-punching politician from Washington.”

The ambivalence about presidents politicking goes back to the nation’s founding. George Washington liked “going on tour,” getting “huzzahed” north and south – but, reflecting his contemporaries’ distaste for democracy, he avoided explicit political talk. When the less popular President Martin Van Buren toured before his 1840 re-election campaign, his fellow Democrats feted him. Nevertheless, the new partisanship polarizing American politics had Whig Party critics denouncing Van Buren’s activities as “undignified” and “insulting,” while mocking “His Majesty, King Martin the First.”

A cartoon depicted the obstacles facing President Martin Van Buren’s reelection effort in 1840. Weighed down by a bundle labeled “Sub Treasury,” Van Buren followed the lead of Andrew Jackson toward the White House. (Library of Congress)

In 1864, Abraham Lincoln said he was too busy to campaign for reelection — a common presidential posture. Still, “Honest Abe” was a crafty pol, who was “too busy looking after the election to think of anything else,” according to his treasury secretary, William Pitt Fessenden.

This posture of presidential passivity persisted, even after William Jennings Bryan’s 18,000-mile 1896 stumping tour ended the charade for challengers, who now campaigned openly and vigorously. The hyperkinetic President Theodore Roosevelt chafed under the restrictions in 1904, comparing the experience to “lying still under shell fire,” as he had as a Rough Rider. Still, T.R. understood that no matter what he did his election would be a “referendum on Roosevelt,” as one aide said.

The impression of energetic politicking Theodore Roosevelt conveyed — even while he felt constrained — propelled presidents more explicitly into politics. In the 1930s and 1940s, Franklin D. Roosevelt perfected the presidential techniques of campaigning by governing and scoring political points by pretending to be nonpolitical. Roosevelt showered voters with governmental goodies while parrying reporters’ political questions by saying “I don’t know nothin’ about politics.” Critics wondered how to criticize him as he saved starving children. Opponents “could only talk,” the Times columnist Arthur Krock marveled, as Roosevelt announced new initiatives in his campaign addresses. “The president acted.”

Unfortunately, F.D.R.’s act reinforced the traditional impression that politicking besmirched the president. Even while presiding over his party as adeptly as he presided over the nation, even while understanding how to sell policies not just develop them, Roosevelt disrespected the democratic dialogue. He treated the sacred act of soliciting voters’ support as a profane act of crass self-promotion.

In 1964, Lyndon Johnson, despite being a Roosevelt protégé, could not keep up the charade of acting presidential for long. “Get in your cars and come to the speakin’,” he yelled as he motorcaded – and showered farm aid, disaster relief, food stamps and pay raises on the communities he visited.

Eight years later, Richard Nixon took Roosevelt’s public prudishness and private ruthlessness to such extremes that he ruined his presidency. In 1972, President Nixon said that he would win re-election simply by “doing my job.” White House staffers froze out reporters who dared treat Nixon as a candidate, even as he privately called the campaign “a fight to the death.”

The Watergate revelations made all politicians look crooked. Nixon’s defense that every president acted ruthlessly resonated with the post-1960’s adversary culture epitomized by the hypercritical news media. Conflict-oriented stories emphasized politicians’ moral failings and the brutality of American politics.

The Watergate debacle prompted a new presidential primness. Gerald Ford and Jimmy Carter each followed a “Rose Garden strategy” while running for re-election, obscuring their political calculations in moralistic claims that the nation needed them working in the White House. This return to a nineteenth-century delicacy culminated in Michael Dukakis’s 1988 campaign. Dismissed by one reporter as “just a brain in the jar,” the bloodless Massachusetts technocrat — not yet even president — was so busy declaiming what was and wasn’t “worthy of a presidential campaign” that he blew a 20-point summertime lead.

As both candidate and president, Bill Clinton combated the growing perception that the Democrats had become the party of high-minded, long-winded, weak-chinned wimps who could not take a political punch. Clinton combined a Rooseveltian charm and duplicity with a shameless Nixonian ruthlessness that reassured Democrats after so many Reagan-era losses. “I find it appalling that a lot of well-established people don’t understand how important political skills are to governing,” James Carville, Clinton’s chief strategist, complained. If you don’t win, “you are never going to get anything done.”

Even before he became President, Barack Obama struggled with these mixed messages. In 2008, some aides welcomed stories that this high-minded philosopher-politician could be the tough Chicago pol when necessary. Now, Obama’s supporters are using the recent backlash against his Bain ads to emphasize that Obama “didn’t survive and triumph in battles with Chicago politicians, some of whom resembled dockside thugs, because he’s made of cotton candy,” as the Democratic consultant Donna Brazile wrote recently.

Obama’s image is a hologram, sometimes hovering above the fray, sometimes plunging into the political muck. With his Dream Act-like executive order halting the deportation of illegal immigrants who came to the United States as children, Obama is campaigning by governing as F.D.R. did, approaching the shamelessness of L.B.J. and the desperation of Clinton, banking on Americans’ appetite for presidential remorselessness. No president can govern effectively without being a consummate politician, which includes knowing how to sell yourself, push your agenda, trim, spin, compromise, build coalitions, punish enemies and trash opposing ideas.

While presidents also need to act proportionately and be statesmen-like, the presidential primness that began with George Washington was antidemocratic, reflecting the founders’ fears of mob rule. In our more democratic era, we still should fear demagogues while cherishing popular politics. The challenge is particularly difficult these days when politics seems so poisonous and presidents shrewdly seek insulation from the toxicity.

Treating politics as disreputable demeans democracy. The expanded involvement of voters in politics and the increased pressure on presidents to communicate with voters are among America’s greatest democratic achievements of the last two centuries. Political skills in the White House are like guns in Dodge City. You want your guys to have them but worry when the bad guys wield them. Perhaps it’s time to resurrect the 1964 complaint of the historian James MacGregor Burns, as the White House yet again becomes a “round-the-clock, round-the-year campaign headquarters.”

Gil Troy, a professor of history at McGill University, is the editor, with Arthur Schlesinger Jr. and Fred Israel, of “History of American Presidential Elections, 1789-2008,” fourth edition.



Culture Warriors Don’t Win

By Gil Troy, New York Times, 4-27-12

Campaign Stops - Strong Opinions on the 2012 Election

Ronald Reagan campaigning for governor on Nov. 5, 1966, in Hawthorne, Calif. (Associated Press)

Mitt Romney’s apparent nomination proves that Republican voters are more pragmatic and centrist than their reputation suggests. The Republican candidates this year fought a classic political battle. Rick Santorum, Newt Gingrich and Ron Paul campaigned as purists, echoing Henry Clay’s famous declaration from 1839, “I’d rather be right than president.” The realist Romney updated the belief of nineteenth-century partisans that a candidate’s most important ability is what they called his “availability,” as in “his ability to avail” – and prevail.

Gingrich and Santorum frequently justified their extremism by invoking the modern Republican demigod, Ronald Reagan. Gingrich is just now giving up on campaigning as a “Reagan conservative” against Romney, the “Massachusetts moderate.” In March, Santorum visited a Reaganite holy site – the Jelly Belly factory in Fairfield, Calif., which produced Reagan’s favorite jelly beans. “They’re asking you, people of principle, to compromise your principles and to be for someone who is less corely convicted than Ronald Reagan because we need to win,” Santorum said. He had a pragmatic argument too: “Every time we run someone that the moderate establishment of the Republican Party said we need to win, we lose.”

Santorum’s diction – corely convicted? – is as flawed as his historical memory. Republican voters have rejected culture wars and fanaticism in presidential campaigns repeatedly – they know culture warriors don’t win. Despite the talk about the rightward lurch of their party, a majority of Republicans have learned Reagan’s central political lesson. A Republican candidate can only win by wooing the center, and a president must govern as a national leader, not a factional chief or a cultural crusader.

Even when it began in the 1850s as an ideological anti-slavery breakaway group, the Republican Party favored more “available” nominees. The first Republican nominee, John C. Frémont, was most famous as “The Pathfinder.” In 1860, Abraham Lincoln was the compromise candidate, defeating the zealots Salmon P. Chase and William Henry Seward. Lincoln’s strategy was “to give no offence to others – leave them in a mood to come to us, if they shall be compelled to give up their first love.” He even made his acceptance letter “sufficiently brief to do no harm.”

There has been a more substance-oriented counter-tradition, epitomized by Grover Cleveland’s challenge, “What is the use of being elected or re-elected, unless you stand for something?” But the need to appeal broadly to America’s diverse electorate has usually prevailed. American voters’ weakness for popular icons over articulate ideologues ultimately frustrated even Henry Clay, the conscience of the Whig Party. As the Mexican War hero Zachary Taylor, who had never even voted for president before, conquered his party in 1848, Clay, well aware that Americans loved turning soldiers into presidents, moaned, “I have thought that I might yet be able to capture or to slay a Mexican.”

In the twentieth century, Ronald Reagan delivered his best lines as a culture warrior, including the grand slam — “A hippie is someone who looks like Tarzan, walks like Jane and smells like Cheetah” – while governing California, not while he was running for president. Reagan won in 1980 by moving beyond Barry Goldwater’s cranky conservatism, which had triggered the Democratic landslide of 1964.

Reagan’s conservatism with a smiley face emphasized economic issues. Within weeks of his inauguration in 1981, conservatives were complaining that Reagan’s Cabinet was too moderate. Their cry — “Let Reagan be Reagan” — demanded a more ideological and confrontational “corely convicted” leadership. But in compromising and popularizing, Reagan was being Reagan.

Nevertheless, conservatives revered Reagan because they never doubted his essential conservative identity. In Puritan terms, Reagan had a “covenant of grace” with conservatives, not a “covenant of works.” His salvation came from sharing core beliefs not engaging in particular acts.

Since Reagan, conservative ideologues like Santorum have inspired voters, disrupted primaries, enraged Democrats, alienated independents, but lost. In 1988, the evangelical preacher Pat Robertson surged in Iowa, then faltered. In 1992, Pat Buchanan was only popular enough to hurt President Bush, not to win. This pattern has held, with flareups of varying incandescence from Alan Keyes to Gary Bauer to Mike Huckabee. George W. Bush did not run as the conservative ideologue many saw when he governed but as the Romneyesque “compassionate conservative” whom many on the right at first mistrusted.

Winning candidates need a broad national reach. The appeal of the culture warrior is far more limited than the Tea Party crowd claims. If Americans actually embraced Rick Santorum’s worldview, the rates of premarital sex, abortion, births to single mothers, divorce, and same-sex relationships would be much lower, especially in the “red states.” But these are not “blue state” phenomena or liberal Democratic behaviors.

Most Americans are not ready to jettison traditional moral strictures even as many live non-traditional lives. Especially in this election, with no particularly pressing social or cultural issue demanding the attention of voters, Santorum’s sanctimony functioned as a form of identity politics, telegraphing membership in a self-selected club of the “virtuous,” while churning divisive emotions.

Romney should be wary because culture warriors can sabotage presidential campaigns. When, at the Republican National Convention in 1992, Pat Buchanan declared a “religious war,” a “cultural war,” a war “for the soul of America,” it was President Bush who suffered. Karl Rove blamed the 2000 electoral deadlock on millions of evangelical voters who stayed home because harsh conservative attacks on George W. Bush made them doubt his ideological purity.

Romney also has to worry because when smartphones and Facebook make everyone a reporter and modern journalists can shamelessly eavesdrop at Palm Beach fundraisers, it gets harder to reconcile primary-driven genuflection toward the right with more moderate inclinations. Both Republican conservatives and liberal Democrats will resurrect his most extreme statements as he veers toward the center. But in recalibrating, he will be behaving like most nominees. As one Republican Party founder, the passionate, wild-bearded Gideon Welles, advised his ambitious friend Franklin Pierce in 1852, when Welles was an anti-slavery Jacksonian Democrat: “Be the candidate of all.”

In 1984, Reagan’s chief of staff, James Baker, offered a recipe for victory that was more apple pie than red meat: “Crime, Education, Economics – Unity.” Reagan understood that Americans had complex feelings about many issues. He knew that a presidential campaign was not a Christian camp meeting. His covenant of grace gave the conservatives a popular victory they never would have achieved otherwise. And it taught Republicans (and Democrats) that even in primary season, winning the center and the swing voter remains the candidate’s central mission; political purity is useless if you lose.

Gil Troy, a professor of history at McGill University, is the editor, with Arthur Schlesinger Jr. and Fred Israel, of “History of American Presidential Elections, 1789-2008,” fourth edition.



By Gil Troy, New York Times, 1-10-12

Right now, while we indulge New Hampshire’s childish insistence on its presidential primary being “first in the nation,” Americans should decide to bury this tradition. Nearly a century is enough: the Granite State has somehow turned a fluke into an entitlement. Worse, its obsession with primacy prolongs, complicates and distorts the presidential nominating process. In a democracy, no state should be first forever.

People have been grumbling about this and other undemocratic anomalies for years. But the standoff between Barack Obama and Hillary Clinton in 2008 gave the nominating process the equivalent of a stress test, which it failed.

We can find redemption via randomization. Every four years — in March, not January — four different states, from the North, South, East and West, should begin the voting.

Since 1920, each presidential primary season has started with New Hampshire. Primaries to select national convention delegates first emerged for the 1912 campaign. When New Hampshire officially embraced this democratizing alternative to boss rule for the 1916 contest, the timing served voters’ needs, not state conceit.

The primary occurred in March during “mud season” — after the snow, before the plowing — the traditional time for New England politicking. As the New Hampshire Almanac proudly explains, the legislature scheduled primary day on town meeting day, the second Tuesday in March, because “frugal New Hampshirites” loathed lighting “the Town Hall twice.” By 1920, Indiana, which originally voted earlier, decided to vote in May, making New Hampshire’s primary the first.

A voter stepped out of a town hall in Canterbury, N.H. (T.J. Kirkpatrick/Getty Images)


New Hampshire continued to hold presidential primaries, even as the number of primaries dwindled and voter turnout plummeted. New Hampshire’s primary, like most in those days, selected unpledged national convention delegates. In 1949, the legislature popularized the process by allowing voters to designate favorite candidates, too, in what amounted to a non-binding straw poll.

Suddenly, in 1952, this “beauty contest” became significant. General Dwight D. Eisenhower proved his viability to Republicans, while Senator Estes Kefauver’s surprise victory in the Democratic primary inspired President Harry Truman to please his wife Bess and retire.

The legend of the cataclysmic “Live Free or Die” primary grew when President Lyndon B. Johnson in 1968 and Senator Edmund Muskie in 1972 each won but faltered by performing worse than expected. Four years later, Jimmy Carter soared after his “better than expected” win – by only 4,663 votes. In 1980 Ronald Reagan stopped George Bush’s “Big Mo.” From 1952 through 1988, every winning presidential candidate first won New Hampshire.

During the 1970s, the politics around this first presidential beauty contest started turning ugly. The New Hampshire primary – and its Iowa caucus doppelganger – was tainted by greed. With primaries proliferating nationwide to allow party members more democratic input in selecting their nominee, media scrutiny of the early contests intensified. Motel owners, car rental companies, printers, advertisers and caterers enjoyed the resulting bonanza, while otherwise obscure political hacks and journalists reveled in playing kingmaker.

This quaint ritual became a state fetish. In 1975, the legislature passed a law protecting the prerogative. Statute RSA 653:9 now mandates that the primary be scheduled at least seven days before all other primaries.

Jealous states like South Carolina and Florida tried front-loading their primaries to enhance their voters’ influence. New Hampshire advanced its primary date into February, then January — goodbye rain boots, hello snow shoes. The shifts prolonged presidential campaigns unnecessarily, annoying millions. In 1999, New Hampshire bullied candidates into signing the New Hampshire Primary Pledge boycotting states that pre-empted New Hampshire. For this current 2012 election cycle, New Hampshire’s zealous Secretary of State, William Gardner, even considered a December date to pre-empt Nevada’s caucus, until the Westerners caved.

In 2008, this silly situation became scandalous. When two large, important states, Florida and Michigan, dared to hold January primaries, New Hampshire and Iowa state officials demanded that candidates promise not to campaign in either state. Both Hillary Clinton and Barack Obama cravenly complied. Obama’s name did not even appear on the Michigan ballot.

Hillary Clinton won Michigan’s primary on Jan. 15 and Florida’s two weeks later.  Clinton’s Michigan vote of 328,309 was more than New Hampshire’s entire Democratic vote total of 287,542.  Still, the punitive Democratic National Committee initially refused to seat any delegates from those states. Desperately seeking delegates, Clinton rediscovered the democratic ideal of “one person, one vote” and insisted on counting the delegates she won in those states. Ultimately, the Democrats awarded Florida and Michigan delegates half a vote each. This compromise affirmed party officials’ scheduling power over state legislatures, while at least partially involving these two states’ citizens in the nominating process.

New Hampshire patriots describe their primary as downright Jeffersonian. Like their Iowa counterparts, they claim the state’s size favors humbler candidates who “make their case door-to-door,” intimately, substantively. Yet New Hampshire campaigns are as frivolous as any other American elections. Candidates spend days flipping pancakes, driving tractor-trailers, slurping chowder, sucking lobster claws. No worse but no better than other states, New Hampshire merits equal but not special treatment.

Once the New Hampshire primary ends, reporters, rather than locals, start behaving badly, exaggerating this one minor, peculiar state’s significance. Speaking in percentages magnifies margins. Hillary Clinton’s slim 7,589-vote victory sounded more impressive when rendered as 39 to 36 percent. Further amplification comes from the media echo chamber as words like “triumph,” “disappointment” and “momentum” transform minor tremors into electoral earthquakes.

In 1787, the “bundle of compromises” that created the Constitution repeatedly balanced small states’ prerogatives with those of big ones. Presidential elections are too important, and first impressions too lasting, to cede so much power to one small state today. Potential presidents must handle a huge, diverse, continental America. A randomized rotation, with four different states starting the nomination process every four years, would test the candidates fairly.

Fetishizing New Hampshire’s primary position has become big business, but it’s bad politics. An idiosyncratic state’s aggressive assertion of an absurd claim, indulged by two spineless national parties and a compliant news media, effectively disenfranchises other voters while exaggerating the importance and the effect of tiny wins of a few thousand votes in a nation of more than 300 million. We can do better. After all, we are selecting candidates for what is still the most important job in the world.



Gil Troy, professor of history at McGill University, is the editor, with Arthur Schlesinger Jr. and Fred Israel, of “History of American Presidential Elections, 1789-2008,” fourth edition.


OP-EDS & REVIEWS

By Gil Troy, New York Times, 12-2-11

To select someone worthy of sitting in George Washington’s chair, sleeping in Abraham Lincoln’s bed and governing from Franklin Roosevelt’s desk, Americans crave a substantial presidential campaign, as long as they don’t have to endure too many boring speeches. Like every human decision-making process, presidential campaigns seesaw between the serious and the silly.

Recent breakthroughs in neuroscience, political science and the dismal science demonstrate what we know intuitively, that human decision-making involves our heads and our hearts. We are neither fully rational nor totally emotional. Similarly, campaigns fluctuate between profound policy exchanges and trivial personality clashes, between significant indicators of future presidential performance and serious idiocy.

A substantial campaign is most likely when history conspires to offer high stakes with stark choices or an incumbent seeks reelection (and it helps if the candidates avoid defining gaffes or temper tantrums). Re-election campaigns in particular are usually well-focused, because at least one nominee presents a defined track record.

The 2012 campaign seems primed to be portentous, with an embattled incumbent confronting an opponent from an ideologically charged party amid economic turmoil. But every campaign, no matter how high-minded, flirts daily with farce. “Unfortunately, when you run for the presidency your wife’s hair or your hair or something else always becomes of major significance,” John F. Kennedy said, when Walter Cronkite asked about his forelock. “I don’t think it’s a great issue, though, in 1960.” Actually, the Kennedys’ good looks brought John Kennedy great political luck.

Hair has been the subject of political debate for Republican presidential candidate Mitt Romney, left, just as it was for John F. Kennedy, right. (Jim Watson/Agence France-Presse — Getty Images, left; Associated Press, right)

The first substantial presidential campaign — which brought about America’s first transition from a ruling party to the opposition — also teetered between frivolity and philosophy. An ugly brawl between two founding fathers preceded the great deadlock of 1800, which you may remember hearing about during the election standoff in 2000. Federalists called Vice President Thomas Jefferson an atheist, a libertine, a traitor, “the infidel.” Democratic-Republicans called the short, fat, pompous president, John Adams, “His Rotundity.” But the election also contrasted Adams’ centralized government championing industrial development against Jefferson’s vision of limited government with limited growth.

The 1800 election was the first to show how presidential re-election campaigns crystallize issues and polarize positions. A challenger need not be as doctrinaire as Barry Goldwater to offer “a choice not an echo,” when pitting boundless hopes against a first-term president’s adjustments to reality. Running for re-election in 1936, Franklin Roosevelt admitted there would be only one issue: “It’s myself, and people must either be for me or against me.” Running a referendum on Roosevelt, the Republican candidate, Alfred M. Landon, called himself “the direct antithesis of the present executive.”

Winners beware, though. The binary choice most American elections offer frequently overstates differences and oversimplifies results, especially when presidents win re-election. The twentieth century’s most lopsided victories kept incumbents like Theodore Roosevelt, Franklin Roosevelt, Lyndon Johnson, Richard Nixon and Ronald Reagan in power, but often helped spawn the dreaded second-term curse.


Even landslides do not offer the political equivalent of a blank check, however much it might feel that way. Roosevelt overstepped during his second term, especially when he tried packing the Supreme Court. He wrongly interpreted his 523-8 electoral-vote triumph as a more sweeping mandate for his New Deal than voters intended. Lyndon Johnson went from feeling, “for the first time in all my life,” truly “loved by the American people,” marveling at “millions upon millions of people, each one marking my name on their ballot, each one wanting me as their president,” to being hounded out of office.

Sometimes campaigns turn serious by coinciding with serious trouble, especially impending wars, ongoing hostilities or economic busts. Voters in 1860, in choosing Abraham Lincoln, knew that they were empowering abolitionists and risking war. Four years later, a worried President Lincoln needed battlefield victories to woo voters who were doubting him and his war. Ultimately, bullets swayed the ballots as General William T. Sherman’s conquest of Atlanta two months before Election Day helped vindicate Lincoln’s war strategy, leading to his re-election.

While wartime campaigns often become votes of confidence — or no confidence — regarding the incumbent, the downswing in an American business cycle often yields an upswing in surprisingly theoretical, intensely polemical debates about American capitalism. During a recession, suddenly everyone is an economics major — or a philosopher. The Panic of 1893 triggered 1896’s “Battle of the Standards.” Americans escalated arcane questions about valuing paper money, silver coins and gold into a searing philosophical divide that stirred fears of civil war. The major parties nominated candidates with contrasting stands. Converting from currency to morality, William McKinley, the Republican goldbug, said, “The American people hold the financial honor of our country as sacred as our flag.” And catapulting from economics to metaphysics, William Jennings Bryan, the Democratic-Populist silverbug defending “the producing masses of this nation and the world,” famously cried: “You shall not press down upon the brow of labor this crown of thorns, you shall not crucify mankind upon a cross of gold.” One Republican, John Hay, moaned: “The whole country has been set to talking about coinage — a matter utterly unfit for public discussion.”

Still, good intentions and clear visions do not guarantee Solomonic deliberations. In 1964, insisting that “I’m not one of those baby-kissing, hand-shaking, blintz-eating candidates,” Senator Barry Goldwater envisioned a “lofty, rational presentation of contending beliefs” against President Lyndon Johnson. Goldwater loved his campaign slogan:  “In your heart, you know he’s right.” But with Democrats sneering “In your guts, you know he’s nuts,” and his numbers tanking, Goldwater retaliated. By October he was snarling “Would you buy a used car from Lyndon?” and saying all Johnson did was “lie and lie and lie” — although the patriotic senator recoiled when crowds, riled by his rhetoric, booed the president. Time magazine deemed the 1964 campaign “one of the most disappointing ever.”

Just as ideologues can end up mudslinging, moderates do not necessarily sling mush. Mocking moderates is a great American tradition. Some, like Lewis Cass, the Democrats’ compromise nominee in 1848, earn the contempt. As Americans polarized over slavery, Cass ran as a “doughface,” a Northern man who molded his politics to satisfy Southerners, impressing few, alienating many. “And he who still for Cass can be,” one Whig wrote, “he is a Cass without the C.”

America also enjoys a rich tradition of muscular moderates. Barack Obama has already shown he can run an exciting, crisp campaign from the center. In 2008, both parties nominated centrist senators seeking the swing voters who could sway the election. These crucial voters, like the Reagan Democrats and the Clinton soccer moms before them, made a clear choice, this time for Obama. Interestingly, even though both Obama and John McCain played to the center, they clashed on foreign affairs, economic policy and governing philosophy, and in the process they offered voters two quite distinct alternatives.

President Barack Obama, left, was accused of being an atheist, as was Thomas Jefferson, right. (Pool photo by Kevin Dietsch, left; United Press International, right)

The history of presidential campaigning reveals the ingredients that yield substantial campaigns, including a charged historical context, clashing world views and coherent candidacies.  Still, every candidate remains one slip of the tongue, one gotcha question, one feeding frenzy, away from the chaos that overwhelms so many campaigns. Americans genuinely yearn for an ideal democratic exercise, one-part university seminar, one-part town hall. Yet the blood rushes, the pulse quickens, interest peaks, when campaigning turns ugly, emotional, personal. The contradictions of popular politics, meaning mass democratic decision-making, don’t just mirror but magnify our all-too-human contradictions as personal decision-makers.



Gil Troy, professor of history at McGill University, is the editor, with Arthur Schlesinger Jr. and Fred Israel, of “History of American Presidential Elections, 1789-2008,” fourth edition.


OP-EDS & REVIEWS

By Gil Troy, New York Times, 11-6-11

There we go again. After nonstop headlines a year before Election Day and nine debates between the Republican candidates (number 10 is scheduled to take place on Wednesday in Michigan), Americans are already grumbling that the 2012 presidential campaign is ugly and interminable. But these quadrennial complaints about campaigning miss the point.  Presidential campaigns are nasty, long and expensive because they should be. Many aspects of campaigns that Americans hate reflect democratic ideals we love.

The presidential campaign’s length and fury are proportional to the electorate’s size and the presidency’s importance.  A new president should undergo a rigorous, countrywide, marathon job interview. Citizens need time to scrutinize the candidates. As David Axelrod, Barack Obama’s senior strategist, puts it: “Campaigns are like an MRI for the soul, whoever you are eventually people find out.” Already this year, “easy favorites” like Tim Pawlenty fizzled, while Rick Perry learned that years governing Texas do not provide as much political seasoning as weeks of presidential campaigning. Mitt Romney, his aides admit, worked out his campaigning “kinks” in 2008.  That year, Sarah Palin’s popularity waned while Barack Obama’s soared, the more each campaigned.

These nationwide courting rituals should be long enough to let great politicians flourish and bond with the nation. John F. Kennedy became a better president and person by encountering Appalachian poverty during the 1960 West Virginia Democratic primary. During his 18,009-mile, 600-speech campaign in 1896, the Populist Democrat William Jennings Bryan insisted that voters “have a right to know where I stand on public questions.” Franklin D. Roosevelt’s strategist advised his candidate in 1932 in strikingly modern terms: “You are you,” he said, and “have the faculty of making friends on a campaign tour.” Traditionally, candidates repeated stump speeches so frequently that, as Herbert Hoover noted, “paragraphs could be polished up, epigrams used again and again, and eloquence invented by repeated tryouts.”

A campaign is the defining democratic exercise for a country founded on the consent of the governed. Since the Jacksonian Democratic revolution against elitism in the 1820s, each revolution democratizing American life further popularized the campaign.  Democracy trumped dignity; mass politics required mass appeals that frequently became protracted, vulgar brawls.


Popular candidates stopped being passive kings-to-be, becoming active, articulate, prime-ministers-in-formation, introducing themselves to the people, who wanted to vet their leaders. Most Americans still yearned for George Washington’s dignified silence, even as they cheered candidates engaging in what Hubert Humphrey would later call “armpit politics,” intense and intimate. In 1840, William Henry Harrison explained that “appearing among my fellow citizens” was the “only way to disprove” rivals’ libels that he was a “caged simpleton.” Similarly, a century later, in 1948, President Harry Truman traveled to California to give the locals a chance to examine him in person. “I had better come out and let you look at me to see whether I am the kind of fellow they say I am,” he said.

Like automotive crash tests, nasty campaigns determine a potential president’s strength and durability. George H.W. Bush deflected ridicule in 1988 as a “wimp,” a “weenie” and “every woman’s first husband,” by mudslinging. “Two things voters have to know about you,” his aide Roger Ailes advised. “You can take a punch and you can throw a punch.”

Alternatively, a well-placed blow can pulverize a vulnerable candidacy. Franklin Roosevelt’s Secretary of the Interior, Harold Ickes, a ferociously partisan Democrat, twice devastated Republican contender Thomas Dewey. First, in 1940, Ickes said the 38-year-old New Yorker had “thrown his diaper into the ring.” Ickes was also popularly credited with suggesting four years later that the dapper, mustachioed Dewey looked “like the groom on the wedding cake.” Both barbs stuck, crystallizing concerns about Dewey.

Voters oversimplify, viewing presidential campaigns as presidential dress rehearsals. After Bill Clinton’s 1992 victory, the defeated Vice President Dan Quayle predicted:  “If he runs the country as well as he ran the campaign, we’ll be all right.” Actually, campaigns are auditions for certain aspects of the job. Although the contrast between Barack Obama as candidate and as president suggests that great campaigners do not always make great presidents, every great president must now be a great campaigner first.

Campaign budgets reflect the time candidates require to capture attention across America’s continental expanse. Candidates compete against the din of modern life, not just against each other. Considering that Procter & Gamble spent $8.7 billion in 2008 peddling detergents and razors, spending $4.3 billion for the 2008 campaign appears a reasonable price to pay for democracy.

The time and money invested pay off because campaigns matter. The stakes in elections are high, the outcomes often in doubt. Despite frequently feeling powerless in modern America, voters can make history. The George W. Bush-Al Gore deadlock in 2000 reminded Americans that in close elections, old-fashioned civics teachers were proved right: every vote counts. When Truman upset Dewey in 1948, the St. Louis Star-Times saluted unpredictability as an “essential part of freedom.”

Ronald Reagan used his four presidential runs in 1968, 1976, 1980 and 1984 to become a better candidate – and the Great Communicator. He relished voters’ sweaty handshakes, sloppy kisses, hearty backslaps and soaring hopes, explaining simply, “I happen to like people.”  Reagan instinctively understood the Progressive philosopher John Dewey’s teaching that “democracy begins in conversation.”  That conversation can turn ridiculous, raucous or tedious, but it serves as both safety valve and social salve. Presidential campaigns historically have had happy endings, with America’s leader legitimized by the open, rollicking process.

So, yes, campaigns are excessive, part old-fashioned carnival and part modern reality show. But in these extraordinary, extended democratic conversations, a country of more than 300 million citizens chooses a leader peacefully, popularly and surprisingly efficiently. As Reagan told Iowans during his costly, nasty, lengthy – but successful – 1984 campaign, “It’s a good idea – and it’s the American way.”



Gil Troy, professor of history at McGill University, is the editor, with Arthur Schlesinger Jr. and Fred Israel, of “History of American Presidential Elections, 1789-2008,” fourth edition, just released by Facts on File of Infobase Publishing.
