Reviewed:
Nolan McCarty, Keith T. Poole, and Howard Rosenthal, Polarized America: The Dance of Ideology and Unequal Riches, The MIT Press, 2006.
In the final days of July, the American Congress voted to do something it had not done in nearly ten years: raise the minimum wage. Since 1997, the minimum wage has been locked in at $5.15 an hour. This period largely, though not completely, coincides with the Republican party’s domination of all elected branches of the American government [1]. So why did this long-neglected issue suddenly become a matter of pressing concern for George W. Bush’s party? With midterm elections in November, the Republicans, facing the toughest challenge to their majorities in years, were eager to claim some achievements in the final days of what many have derided as a “do-nothing” Congress. Yet the Republican plan, which would have raised the minimum wage by $2.10 in several increments over three years, to $7.25, did not reflect a sudden concern with the well-being of their most impoverished compatriots. This nod to the less fortunate had its price: attached to the minimum wage legislation was a provision to permanently reduce the estate tax, which would, by 2015, extend the tax exemption on inherited wealth to $5 million. By connecting these two measures, the Republicans hoped to cajole Senate Democrats into backing a major tax cut that, with the help of the Senate rules, they might otherwise have been in a position to block. This rather transparent ruse gave the Democrats a perfect opportunity to roll out the rhetoric they know best: the denunciation of economic inequalities. Thus the Senate minority leader, Harry Reid of Nevada, declared in late July: “If the Republicans were serious about raising the minimum wage for the first time in nearly 10 years and extending tax relief for working Americans, they would not hold them hostage in their effort to give the wealthiest Americans hundreds of billions more in additional tax giveaways.” [2]. In the end, however, it all came to naught: on August 3, the Senate reached an impasse, and the next day, the congressmen and congresswomen flew back to their districts for several weeks of summer campaigning, each party preparing to blame the other for its indifference to the needs of ordinary Americans [3].
In connecting the incomes of the poorest wage-earners to the fortunes of the wealthiest Americans, this congressional debate stands as an ironic symbol of one of the more alarming trends in contemporary American society: ever-increasing economic inequality. While this problem may in itself hardly be new, what is remarkable is how frequently the current conversation about inequality invokes the word “class.” One of the main traits of what is known as “American exceptionalism,” after all, has been the constitutive unwillingness of Americans to view their society through the lens of class, at least in the way that socialists have in many parts of the world. This was the problem that motivated the German sociologist Werner Sombart to write his controversial 1906 essay Why Is There No Socialism in the United States?, the text that launched the debate about American exceptionalism, at least in one of its most important forms. How could it be, Sombart wondered, that the United States was the most completely capitalist society in the world, while also being the society most completely lacking in what should have been the natural consequence of capitalism: class struggle, expressing itself in a powerful socialist party? The answers that Sombart offered—the fact that American political parties cut across class lines, the relatively high standard of living of American workers, a democratic rather than an aristocratic style of life, the availability of cheap land, and the fact that American workers are not only patriotic, but truly “love” capitalism [4]—continue to be invoked to explain why, despite the glaring inequalities in American society, class war and socialism have not become permanent fixtures of the American social universe the way they have (or at least did) in Europe. Of course, American history, like French grammar, is filled with exceptions to exceptions (or to exceptionalism). In the early decades of the twentieth century, the so-called “progressive historians” insisted that conflict between economic groups had been the dominant force in American history; one of them, Charles A. Beard, even argued in a famous study that the provisions of the United States’ much revered constitution served the economic interests of the Founding Fathers [5]. Franklin Roosevelt denounced the “royalists of the economic order” and, in 1944, proposed to supplement the provisions of the Bill of Rights with a broad array of social and economic rights [6]. Cold War liberals like the late John Kenneth Galbraith insisted on the need for “countervailing power” to check corporations when they trampled on the delicate texture of democratic life [7].
The idea that a specter might be haunting George W. Bush’s America no doubt sounds implausible—to say the least. Yet if a spate of press stories, new books, and scholarly studies is any indication, there are signs that a new consciousness is emerging about the scale of the economic disparities afflicting American society, and with it, a growing willingness to think about American society in terms of class. Take Barbara Ehrenreich. At the peak of the economic boom of the late nineties, Ehrenreich, a journalist and prolific writer on social issues, sought employment in a series of low-wage jobs—as a waitress, a maid, a Wal-Mart employee—to determine whether it was possible to make ends meet as a member of the working class in the new economy. As she recounts in Nickel and Dimed: On (Not) Getting By in America, she soon discovered a hidden world of men and women working several jobs with few (if any) days off; living in motels or cars (because they lack the bank accounts and credit ratings needed even to be offered regular—and cheaper—housing); not seeing doctors when they are sick because they lack health insurance and cannot afford to miss work; nourishing themselves on fast food because they have neither the time nor the utensils (and thus the money) to cook; and being forced by their employers to submit to humiliating drug tests and searches of their belongings. Ehrenreich remarks: “I grew up hearing over and over again, to the point of tedium, that ‘hard work’ was the secret of success … No one ever said that you could work hard—harder than you ever thought possible—and still find yourself sinking ever deeper into poverty and debt.” [8]. In another recent study, the journalist David Shipler has elaborated on many of Ehrenreich’s insights, detailing how the low wages upon which the “working poor” subsist effectively shut them out of the mainstream of American society, depriving them not only of the income, but also of the homes, the telephones, the clothes, the health, and the access to credit necessary to acquire a minimum of social visibility [9].
Last year, no less a bastion of the establishment than the New York Times brought these questions to its front page in a prolonged series on—the dirty little word is out—“class” in America. The Times assessed the role of class in contemporary American society by presenting a series of social vignettes, considering, for instance, how class determines the choice of a spouse, or the likelihood that one will graduate from a four-year college. A particularly revealing piece compared the different experiences of upper-class, middle-class, and working-class heart attack victims. Explaining the discrepancies in the care they received, the article quotes a doctor who believes that the United States is “transforming health, which used to be like fate, into a commodity. Like the distribution of BMWs or goat cheese.” [10]. If there is a general insight to be gleaned from this series, it is that, in contemporary America, class plays a decidedly paradoxical role. On the one hand, class seems to be disappearing, as Janny Scott and David Leonhardt explain: “Today, the country has gone a long way towards an appearance of classlessness. Americans of all sorts are awash in luxuries that would have dazzled their grandparents. Social diversity has erased many of the old markers. It has become harder to read people’s status in the clothes they wear, the cars they drive, the votes they cast, the god they worship, the color of their skin.” And yet, at the same time, “class is still a powerful force in American life. Over the past three decades it has come to play a greater, not a lesser, role in important ways. At a time when education matters more than ever, success in school remains linked tightly to class. At a time when the country is increasingly integrated racially, the rich are isolating themselves more and more.” [11]. One imagines Werner Sombart smiling.
This new rhetoric of class is not just the residual utopianism or poorly digested Marxism of old “New Leftists”; it has found proponents on the right as well. David Brooks, the conservative New York Times columnist who once claimed to have discovered a new social class (the “bobos” [12]), has shown an interest in class dynamics in his recent writings. In a recent article entitled “Karl’s New Manifesto,” he quoted (no doubt with a wink and a nod) the famous opening lines of Marx’s text (“The history of all hitherto existing society is the history of class struggles”), approving the central insight, but qualifying Marx’s theory (no doubt to the relief of his admirers) by arguing that class conflict is now primarily mediated through education: “in the information age, in which knowledge is power and money, the class struggle is fought between an educated elite and the undereducated masses.” [13]. Then there is the strange case of Kevin Phillips. In 1968, Phillips was a brilliant young Republican election guru. He is widely credited with devising Richard Nixon’s “southern strategy,” which successfully pitched the Republican party to southern voters who, for nearly a century, had lived under one-party (i.e., Democratic) rule, thus hastening the most significant political realignment of the past century. Yet since the late eighties, Phillips has soured on the very party whose current hegemony he helped to establish. He came to see the Reagan Era as a “Second Gilded Age” (a reference to the closing decades of the nineteenth century, when the fantastic fortunes of the Rockefellers and Carnegies were made), and feared that conservative values were threatened by massive concentrations of wealth and by government policies that favored them [14]. Recently, Phillips’ denunciations of the plutocratic tendencies of the Republican party, delivered in weighty diatribes at an almost yearly rate, have grown ever more bitter—and surprisingly personal. In 2004, he deplored the growth of inequality by presenting a scathing history of “the House of Bush,” which he cast as a plutocratic American “dynasty” whose rise to power, through a shady collusion of business interests and politics, represents a “Machiavellian Moment for the American Republic.” [15]. A disillusioned conservative, Phillips sees inequality not in Marxist but in Jeffersonian terms: it is a consequence of the corruption of republican government, in which the commonweal is undermined by the private interests of the fabulously rich.
While the new American conversation about class leads in different directions, it rests on a diagnosis that few dispute: the emergence, since the 1970s, of a minute class of phenomenally rich people, whose wealth not only dwarfs that of the middle class, but even vastly exceeds that of the “merely” rich. David Cay Johnston, a financial reporter for the Times and a leading authority on tax matters, has determined that the top one-thousandth of American taxpayers—some 145,000 people—received an average income of $3 million in 2002, compared to “only” $1.2 million in 1980. No other income group grew at anything close to this rate. Moreover, over the same period, the share of the nation’s income that went to this category doubled, reaching 7.4% [16]. At the very summit of the income pyramid, some four hundred individuals earned an average of $174 million in 2000, or three times what they made in 1993. This, Johnston points out, “works out to be about half a million dollars per day, or $2.5 million every five days, which is more than most Americans make in a lifetime.” [17].
Wal-Mart, a Total Social Fact?
How times have changed for the Democrats! From 1986 to 1992, as her husband stood poised to realign his party along Third Way lines, Hillary Clinton sat on the board of directors of Wal-Mart, that beacon of American capitalism founded in her adopted state of Arkansas in 1962. Yet in August 2006, Democrats vying for office were lining up to take shots at Wal-Mart, now derided as the devil incarnate. Joe Biden, a Delaware senator with his eye on the presidency, harangued the sweltering summer crowds in Iowa: “My problem with Wal-Mart is that I don’t see any indication that they care about the fate of middle-class people. They talk about paying them $10 an hour. That’s true. How can you live a middle-class life on that?” [18].
In the looming American debate about inequality, all roads lead to Wal-Mart, the most striking emblem of contemporary American capitalism. It is the largest private employer in the United States, as well as in Mexico; it is the world’s second-largest corporation, and its largest retailer; with revenues of $258 billion in 2003, it accounts for 2% of the United States’ GDP [19]. Each week, it is estimated, some 100 million Americans—over a third of the population—enter one of Wal-Mart’s 3,800 or so stores to buy groceries, toys, clothes, gas—and pretty much anything else. And yet even as it embodies the dazzling success of American entrepreneurialism, Wal-Mart is, sociologically and culturally, part of the fabric of American working-class life. The source of Wal-Mart’s success is its commitment to “Every Day Low Prices,” thanks to which it consistently undersells its competitors. As a result, it caters to the needs of the least prosperous Americans: its customers, studies show, often earn salaries well below the national average, and they are much less likely to have bank accounts than their compatriots [20]. Yet low wage-earners not only make up Wal-Mart’s clientele; they are also its employees. The company’s extraordinary gains in productivity in recent decades are closely connected to its relentless efforts to hold down labor costs. In 2003, the average hourly pay of a Wal-Mart clerk was $8.50, which adds up to $14,000 a year—or $1,000 less than the federal government’s definition of poverty for a three-person family [21].
In contemporary America, the social question is, in many respects, the Wal-Mart question. In addition to paying low salaries, Bentonville—the town in rural Arkansas where the company has its corporate headquarters—stands accused, among other vices, of providing its employees with insufficient social benefits: only 44% of its 1.3 million American employees can afford the company health insurance plan [22]. Moreover, because of the poor wages and benefits, many Wal-Mart employees depend on aid from the states and the federal government to supplement their income. George Miller, a California Democrat in the House of Representatives, noted in a 2004 report that a hypothetical Wal-Mart store with 200 employees could cost taxpayers as much as $420,750 in children’s health care subsidies, housing assistance, federal tax credits, and other forms of aid [23]. Rather than the shining star of American entrepreneurialism, Wal-Mart, these critics maintain, is in fact a shameless recipient of “corporate welfare” (i.e., of public subsidies to private corporations). Bentonville is also regularly accused of hiring illegal immigrants, of discriminating against female workers, and of imposing patronizing and excessively disciplinary workplace rules on its employees—whom it insists on calling “associates.”
The problem that best reveals Wal-Mart’s place in a burgeoning class war is its policy on unionization. Bentonville maintains a vigilant watch over its stores, protecting them from any sign of union activity. Store managers, it is reported, are issued a guide called “The Manager’s Toolbox to Remaining Union Free,” which counsels them to beware of “frequent meetings at associates’ homes” and of “associates who are never seen together...talking or associating with each other.” [24]. In the event that a unionization drive actually does get underway, Bentonville immediately dispatches a crack legal team to thwart it. Thus in 2000, meat-cutters employed by a Wal-Mart in Jacksonville, Texas, succeeded, against the odds, in establishing the first union in the company’s history, voting by a margin of 7 to 3 to join the United Food and Commercial Workers Union (UFCW). A mere two weeks later, Wal-Mart announced that it would be ending meat-cutting operations in 180 stores across six states—including the trouble-making store in Jacksonville. Bentonville’s reason? The company, it insisted, needed to keep up with the industry-wide trend towards selling pre-wrapped meats. “This decision was in no way related to the Jacksonville situation,” Bentonville asserted [25].
An emblem of class conflict, Wal-Mart is also a symbol of the polarization of American politics. Surveying the data from the last presidential election, the pollster John Zogby noticed striking correlations between shopping and voting habits: 76% of voters who shopped at a Wal-Mart once a week cast ballots for George W. Bush, while 80% of those who never set foot in one supported John Kerry. “Wal-Mart is more than retail,” Zogby remarks. “It has become a culture unto itself.” [26].
What explains the emergence over the past three decades of this new class of the “hyper-rich”? According to Johnston, they have benefited from the emergence of a global marketplace for American business, as well as from the development of new technology. Yet the full extent of their prosperity would no doubt have been impossible without the U.S. government’s tax policy. In 1980, the highest income tax rate was 70%. By 1987, Ronald Reagan had cut it to 28%; though George H. W. Bush (in 1991) and Bill Clinton (in 1993) raised it, George W. Bush has since cut it once more, to 35% [27]. Yet even the lip-service paid to the idea of progressive taxation has been rendered meaningless by other features of the tax code. Thanks to the maze of loopholes and deductions that lobbyists write into tax legislation, “people who make their money through ownership of businesses, investments, and property … have enormous opportunities to understate income, invent deductions, and shift their tax burden” onto the less wealthy [28]. Moreover, earnings from dividends and investments, to which the wealthiest owe much of their fortune, are not subject to payroll taxes, and the income taxes owed on these revenues have been cut in recent years. Meanwhile, the taxes that most directly affect working- and middle-class Americans are frankly regressive: the Social Security tax, for instance, applies only to the first $90,000 of income. In such a system, Johnston reports, absurdities abound. Under the Bush tax cuts, the richest four hundred taxpayers (who earned at least $87 million in 2000) pay the same percentage of their income in income, Medicare, and Social Security taxes as those making between $50,000 and $75,000; and those earning over $10 million owe a smaller portion of their income to these taxes than those making between $100,000 and $200,000 [29]. The income tax has become de facto what some conservatives have long demanded it be: a flat tax, in which the wealthiest Americans pay—at best—the same percentage of their incomes as those who are simply well-off.
As the income of the wealthy has soared, other income groups have met with a more mixed fate. As James Lardner, the director of Inequality.org, a website devoted to raising awareness about economic disparities, puts it: “While the United States remains a spectacularly rich country by any standard, we are drifting towards a Third World-like distribution of our riches.” [30] According to the economists Heather Boushey and Christian E. Weller, the period from 1979 to 1989 witnessed the most dramatic increase in inequality, as the wages of workers at the lower end of the income pyramid fell steeply, while those of the median worker remained unchanged. Between 1989 and 2000, lower-wage workers, particularly in the tight labor markets of the 1990s, made up for their previous losses; it was in this period, however, that the rich (and the “hyper-rich”) pulled away from the middle class [31].
Arguably even more serious is the evidence that, over the same period, social mobility has plummeted. The “American dream,” after all, was never really a belief in actual equality: rather, it has always been about equality of opportunity—the idea that the social class into which one is born should not spell one’s economic fate. Yet while many continue to cling to this dream, there are troubling signs that it may be becoming a thing of the past. According to Boushey and Weller, in 1973, 23% of sons whose fathers belonged to the bottom quarter of the socioeconomic scale (as determined by income, education, and occupation) worked their way up to the top quarter; in 1998, only 10% accomplished the same feat [32]. Moreover, at the same time that many are denied their American dream, it is becoming unclear whether there is anything distinctly American about it. As the New York Times notes, social mobility in the United States now appears to be no greater than in France or the United Kingdom, and seems distinctly lower than in Canada and Scandinavia [33]. The new lines of class have clearly exacerbated the crisis of the American dream, as those lacking the economic, cultural, and educational capital that success requires find themselves hopelessly disadvantaged when they enter the job market. Janny Scott and David Leonhardt of the New York Times conclude: “A paradox lies at the heart of this new American meritocracy. Merit has replaced the old system of inherited privilege, in which parents to the manor born handed down the manor to their children. But merit, it turns out, is at least partly class-based. Parents with money, education, and connections cultivate in their children the habits that the meritocracy rewards. When their children then succeed, their success is seen as earned.” [34]
Increasing economic inequality does not merely pose serious challenges to American society. According to a penetrating new study, it is also responsible for one of the most visible trends in American politics: polarization. Since sometime in the mid-1990s, but most visibly in the presidential elections of 2000 and 2004, Americans have seemed almost evenly divided between two political parties that are more ideologically opposed, less inclined to compromise, and less accommodating to moderates than at any other moment in recent memory. The originality of Polarized America: The Dance of Ideology and Unequal Riches, by the political scientists Nolan McCarty (Princeton), Keith T. Poole (University of California, San Diego), and Howard Rosenthal (New York University), is its contention that the split between “red” (Republican) and “blue” (Democratic) America is not merely a political phenomenon, but one fundamentally rooted in social—and economic—conditions. Praising their work on the editorial page of the New York Times, the liberal columnist and economist Paul Krugman declared: “Polarized America is a technical book written for political scientists. But it’s essential reading for anyone who wants to understand what’s happening in America.” [35]
What precisely does it mean to say that polarization has increased? McCarty, Poole, and Rosenthal assert that polarization can actually be quantified. Measuring it, they found, requires determining “who votes with whom and how often.” As they explain: “For example, if Arlen Specter [a Republican senator from Pennsylvania] votes with both Hillary Clinton [the Democratic senator from New York] and Bill Frist [the Republican senator and majority leader from Tennessee] much more frequently than Clinton and Frist vote together, then these techniques position Specter as moderate, in between those more extreme senators.” Applying this algorithm to millions of votes—their database includes every roll-call vote in the history of the U.S. Congress—McCarty, Poole, and Rosenthal have developed a tool that can, with considerable precision, locate each member on a liberal-conservative spectrum [36]. The “polarization index” they have compiled reveals that, during the first seven decades of the twentieth century, political polarization declined, reaching a nadir over the thirty-year period between the mid-forties and the mid-seventies (i.e., under the Roosevelt, Truman, Kennedy, Johnson, and Nixon administrations). Since then, polarization has risen sharply and continuously.
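The intuition behind this kind of scaling can be conveyed with a toy sketch. The Python fragment below is the reviewer's illustration, with invented votes, and not the authors' actual procedure (their DW-NOMINATE method is considerably more sophisticated): it computes pairwise disagreement rates from a small roll-call matrix and embeds them on a single dimension, so that legislators who frequently vote together land near one another.

```python
# Toy illustration (invented data): placing legislators on one
# liberal-conservative dimension from "who votes with whom and how often."
import numpy as np

# Rows are legislators, columns are roll-call votes (1 = yea, 0 = nay).
votes = np.array([
    [1, 1, 0, 1, 0, 1],  # Clinton (D)
    [1, 0, 0, 1, 0, 1],  # a second Democrat
    [1, 1, 1, 1, 1, 0],  # Specter (moderate R): votes with both sides
    [0, 1, 1, 0, 1, 0],  # Frist (R)
    [0, 0, 1, 0, 1, 0],  # a second Republican
])
names = ["Clinton", "Democrat 2", "Specter", "Frist", "Republican 2"]

n = votes.shape[0]
# Disagreement rate between every pair of legislators.
dis = np.array([[np.mean(votes[i] != votes[j]) for j in range(n)]
                for i in range(n)])

# Classical multidimensional scaling on the disagreement "distances":
# keep only the first coordinate, i.e., a one-dimensional spectrum.
J = np.eye(n) - np.ones((n, n)) / n           # centering matrix
B = -0.5 * J @ (dis ** 2) @ J
eigvals, eigvecs = np.linalg.eigh(B)          # eigenvalues in ascending order
score = eigvecs[:, -1] * np.sqrt(max(eigvals[-1], 0.0))

for name, s in zip(names, score):
    print(f"{name:12s} {s:+.2f}")
# The sign of the axis is arbitrary; what matters is that Specter, who
# votes with both Clinton and Frist, should land between the two blocs.
```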
Concretely, polarization refers to five interrelated trends. In the first place, political conflict increasingly occurs along the liberal-conservative axis alone (other conflicts, such as those over racial equality, now dovetail with the liberal-conservative axis, rather than deviating from it, as they did in the sixties). At the same time, the positions of the parties on the liberal-conservative axis have dispersed, meaning that extremists on both sides are more likely to be represented in Congress. Meanwhile, each party has become more ideologically homogeneous, with the Democrats becoming more liberal, and the Republicans growing increasingly conservative. Symptomatic of this shift is the long, slow, and now almost complete disappearance of Southern Democrats (though, as the authors note, liberal Northern Republicans, of the Rudy Giuliani variety, are an increasingly rare breed as well). Consequently, the divergence between the average Democrat and the average Republican has grown as well. Finally, the parties’ positions are less and less likely to overlap—in other words, moderate politicians are disappearing [37]. The passions aroused by this summer’s Democratic senatorial primary in Connecticut are thus a sign of the times: the incumbent, Joe Lieberman, whose moderate credentials earned him a spot next to Al Gore on the 2000 Democratic presidential ticket, has seen his political career threatened by the fury of his party’s left, which has thrown all of its weight behind an insurgent candidate to punish Lieberman for his treachery in supporting Bush on Iraq.
Theories attempting to explain the polarized tenor of American politics abound. Some analysts blame the “southern realignment”—the South’s progressive abandonment, since 1964, of the Democratic party for the Republican party. Others attribute it to institutional reforms within Congress, which have given party leaders more power to punish moderate members of their caucus if they fail to toe the partisan line. Still others see polarization as a consequence of congressional redistricting procedures, whereby the majority parties in state legislatures can, in some instances, create districts for the House of Representatives that are guaranteed to send one of their own to Washington (witness the controversy surrounding the 2003 redistricting in Texas, which led to the defeat of four incumbent Democrats the following year). Districts in which there is little inter-party competition are thus more likely to elect extremist candidates who please the party faithful (who are more likely to vote in primaries), rather than moderates who can elicit broader support.
McCarty, Poole, and Rosenthal argue that, whatever the explanatory value of these factors, the deeper cause of polarization lies elsewhere. Since the 1970s, they observe, the curve followed by the polarization index closely tracks that of income inequality (as measured by the Gini coefficient [38]). During the postwar decades, decreasing inequality was mirrored by a relatively low level of polarization (corresponding to the period when, under the administrations of Roosevelt, Johnson, and even Nixon, the government pursued redistributive economic policies). When inequality began to take off (around 1969, according to these authors), polarization followed soon after. Moreover, the authors find that, over the past three decades, the average income of a congressional district has become increasingly predictive of its representative’s position on the liberal-conservative spectrum. Polarization is thus not, as some political scientists have held, an elite-driven phenomenon, in which the divisions between polarized political leaders have become unrepresentative of the population at large [39]. For McCarty, Poole, and Rosenthal, polarization does indeed “have some basis in the preference of voters.” [40] Drawing on these insights, the authors propose an arresting interpretation of contemporary American politics and society. Political life consists of a “dance” between income inequality and political polarization: on the one hand, economic inequality increases polarization, as the rich (for instance) devote themselves to promoting political parties that will represent their interests (in, say, tax cuts and anti-redistributive policies); on the other, polarized politics exacerbates inequality, either because the Republicans, when they have a majority, limit redistribution, or because, when in the minority, they successfully impede (thanks to the particularities of the American legislative process) Democratic efforts to counter the trend towards inequality [41].
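For readers unfamiliar with the measure, it may help to recall the standard definition of the Gini coefficient (a gloss added by the reviewer, not a quotation from the book): for a population of $n$ incomes $x_1, \ldots, x_n$ with mean $\bar{x}$,

$$G = \frac{1}{2 n^2 \bar{x}} \sum_{i=1}^{n} \sum_{j=1}^{n} \lvert x_i - x_j \rvert,$$

so that $G = 0$ corresponds to perfect equality, while values approaching 1 indicate that nearly all income is concentrated in a few hands.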
Perhaps the most striking consequence of this argument is its implication for the debate over the role of “moral values” in American electoral politics. Many have assumed that the Republican party’s ability to present itself as a champion of moral virtue and religious faith has been critical to its success. Recently, Democrats have attempted—so far unsuccessfully—to beat the Republicans on their own terrain, inspired by George Lakoff’s admonition that they must make the moral “frames” of their politics more visible [42], and by Jim Wallis’s contention that Christian theology is fundamentally progressive [43]. The journalist Thomas Frank, in his recent essay What’s the Matter with Kansas?, has explained the appeal of moral and religious values as a form of false consciousness, leading poor voters in many parts of the country to vote, contrary to their economic interests, for their own exploiters [44]. Yet McCarty, Poole, and Rosenthal argue that, compared to income, moral values account for little in explaining polarization. For instance, voters who strongly supported the Republican Congress’s efforts to impeach Bill Clinton in 1998 tended to have relatively high incomes [45]. “Perhaps the biggest fallacy about conservative Christians,” they argue, “is that they systematically vote against their economic interests.” [46] This hypothesis rests on a number of faulty assumptions, notably that blue states are richer than red states (while this is true, it does not mean that rich people tend to vote Democratic) and that Christian conservatives are significantly poorer than average. Moreover, the Frank thesis overlooks the fact that income level determines the degree of partisanship even among so-called “values” voters: Christian conservatives are more likely to be Republican the richer they are [47]. Furthermore, the Republican grip on the South appears to have less to do with the cultural conservatism of these states than with their economic prosperity: not only did per capita income grow faster in the South between 1959 and 1989 than in the rest of the nation, but there has also been a large migration of northern middle- and upper-class whites—of which the Bush family is only the most celebrated example [48]. Moral values, the obsession of American electoral analysts, are thus revealed, in the analysis of McCarty, Poole, and Rosenthal, to be little more than superstructure, yielding before the ultimately determining force of economic realities.
Yet if income inequality and polarization are so closely connected, why have only the Republicans benefited? Why, in other words, has the emergence of the “hyper-rich” not led the less affluent to reward Democratic candidates working for more economic redistribution? The answer, McCarty, Poole, and Rosenthal contend, lies in another major trend of American society: immigration. The percentage of the foreign-born living in the United States follows the inequality and polarization curves: from the restrictive laws of the 1920s until the immigration reform laws of 1965, the percentage of the foreign-born declined sharply; since then, however, it has steadily climbed, reaching 7.8% of the population in 2000 [49]. Consequently, at the lower end of the income distribution, there has emerged a large class of (often non-white) immigrants who, lacking citizenship, are politically disenfranchised. What this means is that while the economic position of the median voter has remained generally stable, that of the median family has, as a result of immigration, declined [50]. Immigration thus skews the usual expectations concerning the political consequences of inequality in a democracy. Normally, increased inequality should, over time, lead a majority of voters to favor a party advocating greater redistribution (thus correcting the inequality). But the presence of a significant class of economically disadvantaged immigrants produces, in the analysis of McCarty, Poole, and Rosenthal, two effects that impede this self-correcting mechanism. It creates a “sharing effect,” meaning that all citizens become less favorable to redistributive measures as immigrants “shrink the per capita pie that has to be shared equally with all residents.” [51] At the same time, immigration produces a “disenfranchisement effect,” as the exclusion of the foreign-born from elections makes the rich relatively more influential and the poor relatively less so. In other words, the fact that an important stratum of the poor is disenfranchised prevents the redistributive self-correction that would take place if its members had voting rights. Thus while immigration may not have contributed to inequality and polarization themselves, McCarty, Poole, and Rosenthal suggest, it has exacerbated their effects.
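A stylized numerical sketch may make the "disenfranchisement effect" concrete (the incomes below are invented by the reviewer, and the framing follows standard Meltzer-Richard median-voter logic rather than the authors' own model): excluding the poorest residents from the electorate pulls the median voter up the income scale relative to the median family, shrinking the gap between the decisive voter and the mean income, and with it the electoral pressure for redistribution.

```python
# Invented incomes (in thousands of dollars) for ten resident families;
# the poorest three are non-citizen immigrants who cannot vote.
import statistics

residents = [12, 15, 18, 30, 35, 40, 48, 60, 90, 400]
voters = residents[3:]  # disenfranchisement: drop the poorest three

mean_income = statistics.mean(residents)      # 74.8
median_family = statistics.median(residents)  # 37.5
median_voter = statistics.median(voters)      # 48.0

# In median-voter models, pressure for redistribution grows with the gap
# between the decisive (median) voter's income and the mean income.
print(f"mean income:                {mean_income:.1f}")
print(f"median family:              {median_family:.1f}")
print(f"median voter:               {median_voter:.1f}")
print(f"gap with everyone voting:   {mean_income - median_family:.1f}")  # 37.3
print(f"gap with the poor excluded: {mean_income - median_voter:.1f}")   # 26.8
# The smaller gap means the pivotal voter has less to gain from
# redistribution, even though inequality among residents is unchanged.
```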
The politics of inequality and polarization have produced, McCarty, Poole, and Rosenthal conclude, a “very stable political system” [52]. It has been in existence since the 1970s, and nothing suggests that the era of George W. Bush has fundamentally changed it. Part of the problem is that American political institutions do not lend themselves to correcting the pressures of growing inequality, notably because they are non-majoritarian (i.e., minority parties have considerable power to obstruct the work of the majority party). If anything, McCarty, Poole, and Rosenthal suggest, the non-economic issues that so many have seen as integral to Bush’s 2004 triumph may have hurt him: given the economic conditions in the election year, all existing models would have had Bush winning a far more comfortable mandate than the razor-thin victory he eked out. (The antagonism with which fellow Republicans greeted Bush’s recent proposal to set illegal immigrants on a path to citizenship may be another instance in which Bush has departed from the interests of his supporters.)
Despite the existing system’s solidity, the authors envisage some conditions under which it could budge. Republican positions on social issues like abortion or gay marriage could scare away secular voters; the Republicans might “overreach” in their pursuit of free-market policies, alienating even some of their middle-class supporters; and, of course, an economic crisis or a new terrorist attack could have wildly unpredictable effects. The only solution they envisage that would depend on the actions of the Democrats, rather than on the failures of the Republicans, is the pursuit of Bill Clinton’s strategy of the 1990s. Contra Lakoff and Wallis, McCarty, Poole, and Rosenthal do not think that the Democrats must match Republican religious and moral rhetoric to compete for socially conservative middle-class voters. They cite Joe Lieberman and Hillary Clinton as politicians who learned this lesson well—though given the contempt in which they are currently held by the left of their party, one wonders whether Democratic primary voters, itching for a fight after eight years of Bush, will be prepared to swallow the easy platitudes of “Third Way” centrism.
“At the present time,” Werner Sombart wrote in 1906, “it may be said indisputably that the absolute contrasts between poor and rich are nowhere in the world anything like as great as they are in the United States.” [53]. Considered from the other side of the social leveling that occurred between the 1930s and the 1970s, this statement once again seems true today. The new attention to class among scholars, policy intellectuals, and journalists reflects this trend, as do a number of social and political movements, such as the drive to unionize Wal-Mart workers, or efforts in various states to raise the minimum wage by bypassing Congress and placing it directly on the ballot. Yet these trends notwithstanding, many of Sombart’s basic insights about American society remain valid. The United States (to paraphrase Niall Ferguson) remains something of a class society in denial. “The idea of fixed class positions,” an article in the New York Times series observes, “rubs many the wrong way. Americans have never been comfortable with the notion of a pecking order based on anything other than talent and hard work. Class contradicts their assumptions about the American dream, equal opportunity, and the reasons for their own successes and even failures.” [54] The challenge facing the American political class is to find a way to articulate the problem of mounting inequality without triggering the American aversion to thinking in terms of class.