It’s a curious feature of American history that some of its major turning points are best summed up by books. In the years just before the American Revolution, Thomas Paine’s Common Sense was the book; it had a huge role in focusing colonial grievances to the point that they were ready to burst into flame. In the years before the Civil War, it was Harriet Beecher Stowe’s novel Uncle Tom’s Cabin; that’s the book that made the North redefine a troubled national dialogue over a range of regional differences as a moral debate over slavery, pure and simple, and so pushed both halves of the country into positions from which they couldn’t back down short of war.
Both of those books stayed famous long after the issues they influenced were settled, and back when American children actually learned about American history in school, at least, most people knew the titles—though you won’t find many people of any recent generation who read either one. The book that played a similar role in launching America on its career as a global empire didn’t get the same kind of treatment. Unless you know a fair amount about military history, you’ve probably never heard of it. Its title is The Influence of Sea Power upon History, and its author was Alfred Thayer Mahan.
Mahan was an officer in the US Navy; he’d seen combat duty in the Civil War, and remained in the service during the postwar decades when the country’s naval forces were basically tied up at the dock and allowed to rot. In the 1880s, while serving at the Naval War College, he became a leading figure among the intellectuals—a small minority at that point—who hoped to shake the United States out of its focus on internal concerns and transform it into an imperial power. He was among the most original of American military strategists as well as a capable writer, and he had an ace in the hole that neither he nor anybody else knew about when his book saw print in 1890: his good friend and fellow lecturer at the Naval War College, a New York politician and passionate imperialist named Theodore Roosevelt, would become president of the United States just over a decade later by way of an assassin’s bullet.
Mahan’s theory of naval power was influential enough, then and now, that it’s going to be necessary to sketch out the central themes of his book. He argued, first of all, for the importance of maritime trade to a national economy, partly because shipping was (and is) cheaper than land transport, and partly because most international trade had to go by sea; second, for the necessity of a strong navy to protect shipping routes and project force to defend national economic interests overseas; and third, for the need to establish permanent naval bases at a distance from the nation’s own shores, along important trade routes, so that naval forces could be refueled and supported, and so that a naval blockade could be effectively countered—Mahan here was thinking about his own experiences with the Union blockade of the Confederacy during the Civil War, a crucial element in the North’s victory. He backed up all these points with detailed case studies from history, but his aim wasn’t limited to understanding the past; he was proposing a plan of action for the United States for the near future.
In 1890, the United States had spent a quarter century following exactly the opposite advice. The Union victory in the Civil War, as discussed in the last two posts, handed control of the nation’s economic policy to industrial and agrarian interests that wanted high tariffs and trade barriers to protect domestic industry. As those took effect, other nations followed suit by raising tariffs and barriers against goods from the United States, and America distanced itself from the global economy of the late 19th century. Straight through the Long Depression of 1873-1896, economic self-sufficiency was one of the core elements of national policy; the idea was that American farms and factories should produce the goods and services Americans needed and wanted, so that the United States could avoid the state of permanent dependency that British-supported policies of free trade, backed by the superlative size and power of the British Navy, were imposing on so many other countries at that time.
As we saw in last week’s post, though, Mahan’s advocacy of naval expansion came at a crucial time, when the wealth pump of America’s industrial system was struggling to keep from consuming itself, and a growing number of Americans were beginning to look enviously at Europe’s global empires. The huge success of The Influence of Sea Power upon History—it was an international bestseller, was translated into more than a dozen languages, and became required reading for politicians and naval officers around the world—had a massive role in reformulating the debate around imperialism. Armed with Mahan’s logic, the proponents of an American empire could redefine the pursuit of global power in terms of the nation’s safety and prosperity. By the mid-1890s, the obsolete Civil War-era ships that made up what there was of the Navy a decade earlier were rapidly being replaced by a new fleet on the cutting edge of naval technology. All that was left was an opportunity to put the new fleet to use and begin carving out an American empire.
That last step came in 1898, with the Spanish-American war. Those of my readers who think that the neoconservatives marked any kind of radical departure from America’s previous behavior in the world should take the time to read a book or two on this now-forgotten conflict. Spain at that time was the weakest of the European colonial powers, with only a handful of possessions remaining from her once-vast empire; among the most important were a few islands in the Caribbean, notably Cuba and Puerto Rico, and the Philippines. The project of seizing Cuba from Spain had been a popular subject of discussion in the South in the years before the Civil War, when finding new acreage for the plantation system had been a central theme of regional politics; Mahan’s book argued forcefully that the United States needed at least one large naval base somewhere in the islands to the south of the US mainland, and the hope that new territorial possessions might become captive markets for American industry gave new incentive to the old plan.
The Philippines were another matter. In the pre-trade barrier era before the Civil War, the United States had begun to establish a presence along the western shores of the Pacific, sending a fleet to wring trade concessions from Japan in 1853 and making substantial inroads into the lucrative markets in China. The Civil War and the years of relative isolation that followed put paid to that, but regaining a place along the shores of east Asia was a high priority for the pro-empire party. The possibility of a US naval base in the Philippines was a tempting one, and added to the incentives for a war with Spain.
All that was needed was a provocation. That was provided, first, by propaganda campaigns in the American mass media accusing the Spanish government in Cuba of atrocities against the Cuban population, and second, by a boiler explosion aboard the USS Maine, one of the Navy’s new battleships, which was making a port call in Havana. The explosion was instantly blamed on a Spanish mine; public opinion in the United States, fanned by the media, favored war; Congress, which in those days still fulfilled its constitutional role by setting policies that presidents were expected to carry out, duly declared war; US naval forces were already in position, and sailed at once. Ten weeks later Cuba and Puerto Rico were conquered, two Spanish fleets had been crushed in separate battles nearly half the world apart, and the United States had its overseas naval bases and its empire.
The American president at that time, William McKinley, was not among the cheering majority. He was no opponent of American expansion—it was during his presidency that the United States annexed Hawai’i and what is now American Samoa—but service in the Union infantry in the Civil War gave him a more realistic attitude toward war, and he did what he could, with the limited power presidents had in those days, to stop the rush to war with Spain. He won reelection easily in 1900, but the next year he was killed by a lone gunman. His vice president was none other than Theodore Roosevelt, who proceeded to turn Mahan’s strategic principles into national policy. It’s an interesting commentary on the difference between the two eras that nobody, as far as I know, has ever proposed a conspiracy theory to account for McKinley’s death.
The dawn of American empire had impacts reaching well beyond the handful of territories the United States seized and held in McKinley’s day. The same Congress that declared war against Spain had passed a resolution forbidding the annexation of Cuba—this was partly to win support for the war from the anti-empire faction in Congress, partly a bit of pork-barrel protectionism for the American sugar and tobacco industries—and that limit forced the proponents of empire to take a hard look at other options. The system that resulted was one that remains standard throughout the American empire to this day. Cuba got a new constitution and an officially independent government, but the United States reserved the right to interfere in Cuban affairs at will, got a permanent lease on a naval base at Guantánamo Bay, and turned the Cuban economy into a wholly owned subsidiary of American commercial interests. The result fed the wealth pump of empire, but cost the United States much less than an ordinary colonial government would have done.
It also proved easy to export. In 1903, using a stage-managed revolution backed by US ships and Marines, the United States manufactured the new nation of Panama out of a chunk of northern Colombia, and established a Cuba-style government there under tight American control to provide a suitable context for a canal uniting the Pacific Ocean with the Caribbean Sea. Other Latin American countries fell under United States control in the years that followed, and had their resources fed into the increasingly busy wealth pump of American empire. Standards of living across Latin America duly began their long downward slide, while the United States boomed.
Meanwhile, as one of the last major acts of his presidency, Roosevelt launched what would be the definitive announcement that America had arrived on the world stage: the voyage of the “Great White Fleet.” In December 1907, sixteen battleships and their support vessels—their hulls painted stark white, the Navy’s peacetime paint scheme just then—sailed out of East Coast harbors to begin a voyage around the world, stopping at ports on the way. By the time they returned to Hampton Roads in February 1909, governments around the world had been forced to deal with the fact that a new power had entered the global political order.
All of this—Mahan’s theories, the Spanish-American war and its aftermath, the growth of a US empire in Latin America, and the military implications of America’s huge naval buildup and sudden attainment of global reach—was discussed at great length in books and periodicals at the time. What very few people noticed, because the intellectual tools needed to make sense of it hadn’t been developed yet, was that the United States was developing what amounted to a second empire, parallel to the one just described, during these same years. Where the imperial expansion we’ve just examined established an empire across space, this second empire was an empire across time. Like the move to global empire, this empire of time built on an earlier but more limited method of feeding the wealth pump, and turned a large but otherwise ordinary nation into a world power.
This “empire of time,” of course, consisted of the American fossil fuel industries. Where an empire extracts wealth from other countries for the benefit of an imperial nation, fossil fuel exploitation extracts wealth in the form of very cheap thermal energy from the distant past for the benefit of one or more nations in the present. The parallels are remarkably precise. An empire is profitable for an imperial nation because that nation’s citizens don’t have to produce the wealth that comes from foreign colonies and subject nations; they simply have to take it, either by force or by unbalanced systems of exchange backed by the threat of force. In the same way, fossil fuel extraction is so profitable because nobody nowadays has to invest their own labor and resources to grow and harvest prehistoric trees or extinct sea life, or to concentrate the resulting biomass into coal, oil, and natural gas. Equally, as we’ve seen already, empires go under when the wealth pump drives colonies and subject nations into poverty, just as fossil fuels become problematic when sustained extraction depletes them. In both cases, it’s a matter of drawing down a nonrenewable resource, and that leads to trouble.
Nobody seems to know for sure when coal was first mined by European settlers in the New World, but the anthracite coal fields of eastern Pennsylvania were already being developed by the time of the Revolution, and the coming of the industrial revolution made coal an important commodity. Like the real estate that fueled America’s westward expansion, coal was abundant, widely distributed, and of even more widely varying value; it was more than adequate to fuel the growth of a national economy, but not enough by itself to open the door to world power. It took the second empire of time—the one embodied in petroleum—to do that, just as the concentrated wealth that could be had from overseas empire made it possible for the United States to transform itself into a global force.
There’s another fascinating parallel between America’s overseas empire of space and its second empire of time. That latter began in 1859, with the drilling of America’s first oil well in western Pennsylvania, right about the time that the United States was making its first tentative movements toward intervention in Asia. For decades thereafter, though, petroleum was used mostly as a source of lamp oil. It took a flurry of inventions in the 1880s and 1890s—right around the time the push for overseas empire was taking shape in the United States—to turn petroleum from a useful commodity to a source of nearly limitless mechanical power. It was in the wake of that transformation that the two empires fused, and the United States vaulted into global power. We’ll talk about that next week.
****************
End of the World of the Week #15
The apocalyptic thinking discussed in previous posts here has percolated in plenty of odd directions over the centuries, and traces of it can be found in plenty of unexpected places today. One example that’s worth at least a glance is the role of apocalyptic ideas in helping to shape the remarkably messianic notions the liberal end of the Baby Boomer generation has generally had of itself and its place in history.
Some of my readers may recall The Greening of America by Charles Reich, a book published to much fanfare in 1970. Reich argued that American history could be understood in part as a process of shifting modes of consciousness in which Consciousness I, which had been glued firmly in place from colonial times to the Second World War, had morphed into Consciousness II, or square consciousness. This, Reich insisted, was about to be replaced by Consciousness III, or hip consciousness, which would become universal just as soon as all the squares either died off or got a clue.
Ten years later, Reich wasn’t exactly looking like a prophet, but that didn’t stop Marilyn Ferguson from making much the same claim in The Aquarian Conspiracy (1980). Ferguson didn’t use Reich’s historical scheme, but the basic argument—that those of the baby boomer generation who were into the 1980 equivalent of hippie culture were the forerunners of a great wave of change that would make the world much better—was essentially the same.
Twenty years further down the road, the same claim was being circulated with a little less generational slant by Paul H. Ray and Sherry Ruth Anderson in their 2000 book Cultural Creatives. Under that flattering label, Ray and Anderson lumped together the same ideas and attitudes that Reich assigned to Consciousness III and Ferguson to her Aquarian Conspiracy, and paired them with the same claim, that a great positive change of consciousness was on its way and would give boomer idealists the world they thought they wanted.
None of these grand transformations, it bears remembering, has happened, but it may be worth noting what happened instead. In the aftermath of 1970, the Sixties guttered out, and in the next presidential election, Nixon won by a landslide. In the aftermath of 1980, the alternative scene of the Seventies collapsed, and Reagan won the presidency. In the aftermath of 2000, we got the rise of the neoconservatives and George W. Bush in the White House. It seems unlikely that any of these sudden rightward turns were what the authors had in mind.
—story from Apocalypse Not
Thursday, March 29, 2012
Wednesday, March 21, 2012
America: Crossing the Line
The struggle between Northern and Southern models of human ecology in nineteenth-century America, the theme of last week’s post, determined more than the shape of American continental expansion. The South followed what was becoming the standard pattern in the non-European world during that century, focusing on the production of commodities that were traded on global markets to pay for manufactured goods from European factories. That’s what the South did with cotton, tobacco, and a variety of lesser cash crops, and it’s also what British North America (that’s Canada nowadays) was doing at that same time with grain, lumber, fish, and the like.
Had the South kept the dominant position it originally held in American national politics, and arranged the nation’s trade policy to its own satisfaction, that’s what would have happened between the Mason-Dixon line and the Canadian border, too. Without the protection of tariffs and trade barriers, the North’s newborn industrial system would have been flattened by competition from Britain’s far more lavishly capitalized factories and mercantile firms. The products of America’s farms, mines, and logging camps would have had to be traded for hard currency to pay for manufactured products from overseas. That would have locked the United States into the same state of economic dependency as the nations of Latin America, where British banks and businesses—backed whenever necessary by the firepower of the Royal Navy—maintained the unequal patterns of exchange by which Britain prospered at the rest of the world’s expense.
That possibility went whistling down the wind once the rising spiral of conflict between North and South exploded into war. Southern opposition to trade barriers was no longer an issue once Southern congressmen packed their bags and went home in 1860; with that difficulty out of the way, Northern industries got the protection they needed, and the requirements of the war poured millions of dollars—yes, that was a lot of money back then—into Northern factories. The North’s total victory put the seal on the process, and not incidentally put paid to any lingering thoughts of regime change in America that might have been aired in private among Europe’s upper classes. Prussian general Helmuth von Moltke could glare through his monocle and claim that the American Civil War consisted of “two armed mobs chasing each other around the country, from which nothing could be learned,” but a great many others paid close attention, and blanched.
If the powers of Europe needed any reminder of these issues, it came in 1867, when the short-lived French colonial regime in Mexico was terminated with extreme prejudice. That’s another of those bits of history remembered by nobody north of the Rio Grande and everybody south of it, and it deserves discussion here for reasons that will quickly become apparent to all of my readers who have been watching the situation in Greece. The short version is that banks in Britain, Spain and France loaned large sums of money to the government of Mexico, which then fell on hard times—there was a civil war involved, the Guerra de la Reforma, which is a bit further than Greece has gotten yet—and had to suspend payment on its debts. The British, Spanish and French governments responded by putting pressure on the Mexican government to pay up; this being the 19th century instead of the 21st, that took the form of military intervention.
The British and Spanish forces were willing to settle for cash, but the French emperor Napoleon III had a wider agenda and launched a full-scale invasion. After more than a year of heavy fighting, the French army controlled enough of the country to install a friend of Napoleon’s, an Austrian prince named Maximilian, as Emperor of Mexico. That’s one of several good reasons the Union forces in the Civil War threw so much effort into seizing the Mississippi valley in the first part of the war; Napoleon III was known to be sympathetic to the Southern cause, and so any land route by which he could get money and arms to the main Confederate armies and population centers had to be sealed off. As soon as the Civil War ended, in turn, toppling Maximilian’s government became a top priority for the United States government; money and arms poured south across the Rio Grande to support guerrillas loyal to Mexican president Benito Juarez, while the French came under heavy American pressure to withdraw their forces from Mexico. Napoleon III pulled his troops out in 1866, and a year later Maximilian got marched out in front of a Mexican firing squad. That was the last time any European power attempted to expand its holdings in the New World.
North of the Rio Grande, though, the potential for further conflict was hard to miss. It’s a commonplace of history that the aftermath of a war normally includes quarreling among the victors, since all the disagreements that had to be kept at bay while there was still an enemy to defeat typically come boiling up once that little obstacle is removed. That’s what happened across the North in the wake of the Civil War, as the loose alliance between industrial and agrarian interests began to splinter about the time the last of the confetti from the victory celebrations got swept up. Alongside the ordinary sources of economic and political disagreement was a hard fact better understood then than now: the farm states of the Midwest were unwilling to accept the unequal patterns of exchange that the industrial states of the East required.
To make sense of this, it’s necessary to glance back at Alf Hornborg’s analysis of industrial production as a system of wealth concentration. To build and maintain an industrial system takes vast amounts of capital, since factories don’t come cheap. All that capital has to be extracted from the rest of the economy, placed in the hands of a few magnates, and kept there, in order for an industrial economy to come into being and sustain itself. That’s why, in a market economy, the technological dimension of industrialism—the replacement of human labor with machines—is always paired with the economic and social dimension of industrialism—the creation of unequal patterns of exchange that concentrate wealth in the hands of factory owners at the expense of workers, farmers, and pretty much everybody else. The exact mechanisms used to impose and maintain those unequal exchanges vary from case to case, but some such mechanism has to be there, because an economy that allows the wealth produced by an industrial system to spread out through the population pretty quickly becomes an economy that no longer has the concentrated capital an industrial system needs to survive.
That’s the problem the United States faced in the latter third of the 19th century. The rising industrial economy of what would eventually turn into the Rust Belt demanded huge concentrations of capital, but attempts to extract that capital from the farm states ran into hard limits early on. The epic struggle between the railroad barons and the Grange movement over shipping rates for farm commodities made it uncomfortably clear to the industrialists that if they pushed the farm belt too far, the backlash could cost them much more than they wanted to pay. During the Reconstruction era, the defeated South could have what was left of its wealth fed into the business end of the industrial wealth pump, but that only worked for so long. When it stopped working, in the 1870s, the result was what normally happens when the industrial wealth pump runs short of fuel: depression.
They called it the Long Depression, though you’ll have a hard time finding references to that term in most economic texts these days. The first warning came with a spectacular stock market crash in 1873. The US economy faltered, struggled, then plunged into full-scale deflation in 1876 and 1877. There were plenty of ups and downs, and some relatively calm years in the 1880s, but a good many economic measures stayed on the wrong side of the scale until better times finally arrived in 1896.
There’s one good reason and at least three bad ones that you won’t hear much discussion of the Long Depression in today’s troubled economic time. The good reason is that most of today’s economic theories came into being in response to a later crisis—the Great Depression of the 1930s—and the desire to avoid a repeat of the ghastly consequences of that latter collapse has inspired a certain amount of tunnel vision on the part of economic historians. The bad ones? Well, that’s a little more complex.
Many of my readers, to begin with, will have heard pundits insist that economic crises happen because modern currencies aren’t based on a gold standard, or because central bankers always mismanage the economy, or both. That’s a popular belief just now, but it’s nonsense, and it only takes a glance at American economic history between the Civil War and the founding of the Federal Reserve in 1913 to prove once and for all that it’s nonsense. The Panic of 1873, the Long Depression, the Panic of 1893, the Depression of 1900-1904, the Panic of 1907, and several lesser economic disasters all happened in an era when the US dollar was on the strictest of gold standards and the United States didn’t have a central bank. That’s bad reason #1: once you discuss the Long Depression, most of the rhetoric backing a very popular set of economic notions pops like a punctured whoopee cushion.
More broadly, across nearly all of the squabbling theological sects of modern economic thought, Adam Smith’s belief in the invisible hand remains glued in place. Smith, as longtime readers of mine will recall from an earlier series of posts, insisted that a free market economy is innately self-regulating, as though controlled by an invisible hand, and tends to maximize everybody’s prosperity so long as it’s left to its own devices. Exactly how much leeway should be left to the invisible hand is a matter of much disagreement among economists. There’s a broad spectrum from the Keynesians, who want government to cushion the market’s wilder vagaries, to the Austrian school, which insists that whatever the market does by itself is by definition right, but you’ll have a hard time finding anybody in the economic mainstream willing to consider the possibility that the market, left entirely to itself, might dive into a depression twenty-three years long. That’s bad reason #2; once you discuss the Long Depression, it becomes very hard to ignore the fact that an economy left to its own devices can dole out decades of misery to everybody.
Then there’s bad reason #3, which is that the cause of intractable problems like the Long Depression was well understood at the time, but nobody wants to talk about it now. That unwillingness, in turn, reflects the way that a concept once very widespread in economics—the concept of overproduction—came to be associated with a single economic school or, even more precisely, with a single economist, Karl Marx. Overproduction is one, though only one, of the elements Marx wove into his system of economic ideas, and generations of Marxist theorists and publicists used it as a reason why capitalist economies must eventually collapse; with the inevitability of Pavlov’s drooling dogs, capitalist theorists and publicists thus automatically shy away from it; and the Long Depression makes it excruciatingly hard to shy away from it.
For all that, overproduction’s easy to understand, and it offers a crucial insight into how industrial economies work—or, more precisely, how they stop working. An industrial system, as we’ve discussed already, relies on unequal patterns of exchange that extract wealth from the many and concentrate it in the hands of the few, to provide the capital concentrations needed to build and maintain the industrial system itself. The way this usually works in practice is that whatever the people on the losing end of the exchange have to exchange—whether it’s the labor of a work force, the raw materials of a foreign country, or what have you—is given an artificially low value, while the products of the industrial system have an artificially high value, so the people on the losing end get a pittance for their labor, their crops, and so on, while high prices for industrial products keep the factory owners rich.
The problem, of course, comes when the people who are getting next to nothing for their labor, crops, and so on are also the people who are supposed to buy those expensive industrial products. As the people on the losing end of the exchange get poorer, their ability to buy industrial products goes down, and unsold products pile up in warehouses. What happens next puts the entire system at risk: if the factory owners cut prices to move product, they risk dispersing the concentration of capital they need to keep the system going; if they cut production and lay off workers, they decrease the number of people able to buy their products even further; there are other options, but all of them add up to serious trouble for the industrial wealth pump.
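Since the feedback loop just described is a dynamic process, a toy simulation may make it concrete. Everything here is an arbitrary illustration invented for this sketch—the starting numbers, the 10% wage cut, the assumption that wages are the only source of demand—not historical data or a serious economic model:

```python
# Toy sketch of the overproduction feedback loop: workers' wages are the
# only source of demand for the factory's output, and the owners respond
# to unsold goods by cutting the wage bill, which shrinks demand further.
# All parameters are illustrative assumptions, nothing more.

def overproduction_cycle(periods=10, wage_bill=50.0, price=10.0, output=100.0):
    inventory = 0.0
    demand_history = []
    for _ in range(periods):
        demand = wage_bill / price          # units the workers can afford
        unsold = max(0.0, output - demand)  # the rest sits in the warehouse
        inventory += unsold
        demand_history.append(demand)
        if unsold > 0:
            wage_bill *= 0.9                # the owners' "fix" deepens the slump
    return inventory, demand_history

total_unsold, demand_history = overproduction_cycle()
```

Run with these made-up numbers, demand shrinks every period while unsold inventory climbs without limit: the "fix" available to the factory owners is exactly what keeps the crisis going.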
That’s overproduction. In the Long Depression, as in the Great Depression, it was an everyday reality, driving severe deflation and high unemployment, and we’d still be talking about it today if Marx hadn’t been turned into an intellectual figurehead for one side in the bare-knuckle brawl over global power that dominated the second half of the 20th century. There’s much to be said for talking about it again, since it’s becoming an everyday reality in America as we speak—we’ll be exploring that in more detail in later posts—but it also needs to be factored into any understanding of the rise of America’s global empire, because the decision to go into the empire business in a big way was driven, in large part, by the overproduction crises that pounded the American economy in the late 19th century.
Read the literature of empire from the Victorian period and the connection is impossible to miss. Why did industrial nations want imperial colonies? The reason given in book after book and speech after speech at the time is that the industrial nations needed markets. Free trade rhetoric, then as now, insisted that all an industrial nation had to do was to build a better mousetrap and the world would beat a path to its door, but then as now, that’s not how things worked; the markets that mattered were the ones where a single industrial nation could exclude competitors and impose the unequal exchange of cheap labor and raw materials for expensive manufactured products that would keep the wealth pump churning away.
That was the option that faced America as it approached the beginning of the 20th century. It says something for the influence of ideals in American public life that this option wasn’t chosen without a fight. The debate over an American empire was fought out in the halls of Congress, in the letters pages of hundreds of newspapers, in public meetings, and any number of other venues. Important politicians of both major parties opposed imperial expansion with every resource and procedural trick they could muster, and scores of cultural figures—Mark Twain was among them—filled the popular magazines of the time with essays, stories, and poems challenging the imperial agenda.
In the end, though, they lost. To a majority of Americans, the economic case for empire outweighed the moral and political arguments against it. By 1898, the pro-Empire faction had become strong enough that it could push the country into action; the annexation of Hawai’i and the Spanish-American War that year crossed the line and redefined America as an imperial power, launching it along a trajectory that would very quickly draw it into conflicts that generations of Americans had done their best to avoid. We’ll discuss that next week.
****************
End of the World of the Week #14
In apocalyptic belief systems, just as in baseball, you often can’t tell the players without a program, and it’s not at all uncommon for prophets of any given end-of-the-world prediction to provide helpful guides explaining which figure in contemporary public life ought to be identified with the Great Beast of Revelations or the equivalent. Life can get complicated, though, when competing prophets disagree about who corresponds with which legendary figure—and when the two sides of a savage political and religious quarrel end up accusing each other of being the Antichrist, it can be hard indeed to figure out who if anyone is on the side of the angels.
That’s what happened in the early thirteenth century when the Holy Roman Emperor Frederick II faced off against Pope Gregory IX. The popes and the emperors spent most of the Middle Ages getting into it over a galaxy of issues that even historians have trouble remembering nowadays, but Frederick was not your usual medieval emperor. His contemporaries called him Stupor Mundi, which more or less translates out as “the astonishment of the world;” he was fluent in six languages, ruled a ramshackle empire that extended from Sicily to Germany with a bit of Palestine thrown in for good measure, and carried out a successful crusade while excommunicated by the Catholic church, which in medieval terms was definitely a puzzler.
It was probably inevitable that Gregory IX would call Frederick the Antichrist—it was par for the course for thirteenth-century popes to use that timeworn label for the people on their enemies list—but not even a pope could get away with doing that to the Stupor Mundi. Frederick immediately had his public relations people—yes, medieval emperors had public relations people, or at least Frederick did—start churning out tracts proving that the real Antichrist was Gregory IX. The two of them continued along the same lines for years, filling the medieval equivalent of the media with colorful denunciations extracted from the usual parts of the Bible .
Still, Gregory never made a convincing Antichrist to anyone but Frederick and his friends, since he was simply a common or garden variety medieval pope, and spent his career in the usual way, fighting bloodthirsty wars against his personal enemies and issuing proclamations condemning Jews, heretics, and cats. Frederick was another matter; a great many people in his lifetime thought that there was a good chance the Stupor Mundi might just be the Great Beast after all. In his inimitable way, though, he surprised them all by suddenly dying of dysentery in 1250.
—story from Apocalypse Not
Had the South kept the dominant position it originally held in American national politics, and arranged the nation’s trade policy to its own satisfaction, that’s what would have happened between the Mason-Dixon line and the Canadian border, too. Without the protection of tariffs and trade barriers, the North’s newborn industrial system would have been flattened by competition from Britain’s far more lavishly capitalized factories and mercantile firms. The products of America’s farms, mines, and logging camps would have had to be traded for hard currency to pay for manufactured products from overseas. That would have locked the United States into the same state of economic dependency as the nations of Latin America, where British banks and businesses—backed whenever necessary by the firepower of the Royal Navy—maintained the unequal patterns of exchange by which Britain prospered at the rest of the world’s expense.
That possibility went whistling down the wind once the rising spiral of conflict between North and South exploded into war. Southern opposition to trade barriers was no longer an issue once Southern congressmen packed their bags and went home in 1860; with that difficulty out of the way, Northern industries got the protection they needed, and the requirements of the war poured millions of dollars—yes, that was a lot of money back then—into Northern factories. The North’s total victory put the seal on the process, and not incidentally put paid to any lingering thoughts of regime change in America that might have been aired in private among Europe’s upper classes. Prussian general Helmuth von Moltke could glare through his monocle and claim that the American Civil War consisted of “two armed mobs chasing each other around the country, from which nothing could be learned,” but a great many others paid close attention, and blanched.
If the powers of Europe needed any reminder of these issues, it came in 1867, when the short-lived French colonial regime in Mexico was terminated with extreme prejudice. That’s another of those bits of history remembered by nobody north of the Rio Grande and everybody south of it, and it deserves discussion here for reasons that will quickly become apparent to all of my readers who have been watching the situation in Greece. The short version is that banks in Britain, Spain and France loaned large sums of money to the government of Mexico, which then fell on hard times—there was a civil war involved, the Guerra de la Reforma, which is a bit further than Greece has gotten yet—and had to suspend payment on its debts. The British, Spanish and French governments responded by putting pressure on the Mexican government to pay up; this being the 19th century instead of the 21st, that took the form of military intervention.
The British and Spanish forces were willing to settle for cash, but the French emperor Napoleon III had a wider agenda and launched a full-scale invasion. After more than a year of heavy fighting, the French army controlled enough of the country to install a friend of Napoleon’s, an Austrian prince named Maximilian, as Emperor of Mexico. That’s one of several good reasons the Union forces in the Civil War threw so much effort into seizing the Mississippi valley in the first part of the war; Napoleon III was known to be sympathetic to the Southern cause, and so any land route by which he could get money and arms to the main Confederate armies and population centers had to be sealed off. As soon as the Civil War ended, in turn, toppling Maximilian’s government became a top priority for the United States government; money and arms poured south across the Rio Grande to support guerrillas loyal to Mexican president Benito Juarez, while the French came under heavy American pressure to withdraw their forces from Mexico. Napoleon III pulled his troops out in 1866, and a year later Maximilian got marched out in front of a Mexican firing squad. That was the last time any European power attempted to expand its holdings in the New World.
North of the Rio Grande, though, the potential for further conflict was hard to miss. It’s a commonplace of history that the aftermath of a war normally includes quarreling among the victors, since all the disagreements that had to be kept at bay while there was still an enemy to defeat typically come boiling up once that little obstacle is removed. That’s what happened across the North in the wake of the Civil War, as the loose alliance between industrial and agrarian interests began to splinter about the time the last of the confetti from the victory celebrations got swept up. Alongside the ordinary sources of economic and political disagreement was a hard fact better understood then than now: the farm states of the Midwest were unwilling to accept the unequal patterns of exchange that the industrial states of the East required.
To make sense of this, it’s necessary to glance back at Alf Hornborg’s analysis of industrial production as a system of wealth concentration. To build and maintain an industrial system takes vast amounts of capital, since factories don’t come cheap. All that capital has to be extracted from the rest of the economy, placed in the hands of a few magnates, and kept there, in order for an industrial economy to come into being and sustain itself. That’s why, in a market economy, the technological dimension of industrialism—the replacement of human labor with machines—is always paired with the economic and social dimension of industrialism—the creation of unequal patterns of exchange that concentrate wealth in the hands of factory owners at the expense of workers, farmers, and pretty much everybody else. The exact mechanisms used to impose and maintain those unequal exchanges vary from case to case, but some such mechanism has to be there, because an economy that allows the wealth produced by an industrial system to spread out through the population pretty quickly becomes an economy that no longer has the concentrated capital an industrial system needs to survive.
That’s the problem the United States faced in the latter third of the 19th century. The rising industrial economy of what would eventually turn into the Rust Belt demanded huge concentrations of capital, but attempts to extract that capital from the farm states ran into hard limits early on. The epic struggle between the railroad barons and the Grange movement over shipping rates for farm commodities made it uncomfortably clear to the industrialists that if they pushed the farm belt too far, the backlash could cost them much more than they wanted to pay. During the Reconstruction era, the defeated South could have what was left of its wealth fed into the business end of the industrial wealth pump, but that only worked for so long. When it stopped working, in the 1870s, the result was what normally happens when the industrial wealth pump runs short of fuel: depression.
They called it the Long Depression, though you’ll have a hard time finding references to that term in most economic texts these days. The first warning came with a spectacular stock market crash in 1873. The US economy faltered, struggled, then plunged into full-scale deflation in 1876 and 1877. There were plenty of ups and downs, and some relatively calm years in the 1880s, but a good many economic measures stayed on the wrong side of the scale until better times finally arrived in 1896.
There’s one good reason and at least three bad ones that you won’t hear much discussion of the Long Depression in today’s troubled economic times. The good reason is that most of today’s economic theories came into being in response to a later crisis—the Great Depression of the 1930s—and the desire to avoid a repeat of the ghastly consequences of that latter collapse has inspired a certain amount of tunnel vision on the part of economic historians. The bad ones? Well, that’s a little more complex.
Many of my readers, to begin with, will have heard pundits insist that economic crises happen because modern currencies aren’t based on a gold standard, or because central bankers always mismanage the economy, or both. That’s a popular belief just now, but it’s nonsense, and it only takes a glance at American economic history between the Civil War and the founding of the Federal Reserve in 1913 to prove once and for all that it’s nonsense. The Panic of 1873, the Long Depression, the Panic of 1893, the Depression of 1900-1904, the Panic of 1907, and several lesser economic disasters all happened in an era when the US dollar was on the strictest of gold standards and the United States didn’t have a central bank. That’s bad reason #1: once you discuss the Long Depression, most of the rhetoric backing a very popular set of economic notions pops like a punctured whoopee cushion.
More broadly, across nearly all of the squabbling theological sects of modern economic thought, Adam Smith’s belief in the invisible hand remains glued in place. Smith, as longtime readers of mine will recall from an earlier series of posts, insisted that a free market economy is innately self-regulating, as though controlled by an invisible hand, and tends to maximize everybody’s prosperity so long as it’s left to its own devices. Exactly how much leeway should be left to the invisible hand is a matter of much disagreement among economists. There’s a broad spectrum from the Keynesians, who want government to cushion the market’s wilder vagaries, to the Austrian school, which insists that whatever the market does by itself is by definition right, but you’ll have a hard time finding anybody in the economic mainstream willing to consider the possibility that the market, left entirely to itself, might dive into a depression twenty-three years long. That’s bad reason #2; once you discuss the Long Depression, it becomes very hard to ignore the fact that an economy left to its own devices can dole out decades of misery to everybody.
Then there’s bad reason #3, which is that the cause of intractable problems like the Long Depression was well understood at the time, but nobody wants to talk about it now. That unwillingness, in turn, reflects the way that a concept once very widespread in economics—the concept of overproduction—came to be associated with a single economic school or, even more precisely, with a single economist, Karl Marx. Overproduction is one, though only one, of the elements Marx wove into his system of economic ideas, and generations of Marxist theorists and publicists used it as a reason why capitalist economies must eventually collapse; with the inevitability of Pavlov’s drooling dogs, capitalist theorists and publicists thus automatically shy away from it; and the Long Depression makes it excruciatingly hard to shy away from it.
For all that, overproduction’s easy to understand, and it offers a crucial insight into how industrial economies work—or, more precisely, how they stop working. An industrial system, as we’ve discussed already, relies on unequal patterns of exchange that extract wealth from the many and concentrate it in the hands of the few, to provide the capital concentrations needed to build and maintain the industrial system itself. The way this usually works in practice is that whatever the people on the losing end of the exchange have to exchange—whether it’s the labor of a work force, the raw materials of a foreign country, or what have you—is given an artificially low value, while the products of the industrial system have an artificially high value, so the people on the losing end get a pittance for their labor, their crops, and so on, while high prices for industrial products keep the factory owners rich.
The problem, of course, comes when the people who are getting next to nothing for their labor, crops, and so on are also the people who are supposed to buy those expensive industrial products. As the people on the losing end of the exchange get poorer, their ability to buy industrial products goes down, and unsold products pile up in warehouses. What happens next puts the entire system at risk: if the factory owners cut prices to move product, they risk dispersing the concentration of capital they need to keep the system going; if they cut production and lay off workers, they decrease the number of people able to buy their products even further; there are other options, but all of them add up to serious trouble for the industrial wealth pump.
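The feedback loop just described can be sketched as a toy model. To be clear, the figures and the wage-share parameter below are purely illustrative assumptions of mine, not historical data—the point is only to show how the arithmetic of the trap works:

```python
# Toy model of the overproduction trap: owners keep the wage share low
# to concentrate capital, but workers are also the customers, so a low
# wage share leaves goods unsold. All figures are illustrative only.

def unsold_inventory(wage_share, years=10, output_value=100.0):
    """Total value of goods piled up in warehouses after `years`."""
    inventory = 0.0
    for _ in range(years):
        wages = output_value * wage_share        # paid out to workers
        demand = wages                           # workers spend all they earn
        inventory += max(0.0, output_value - demand)  # goods nobody can afford
    return inventory

# The more wealth the exchange concentrates, the faster product piles up:
print(unsold_inventory(wage_share=0.9))  # mild imbalance
print(unsold_inventory(wage_share=0.5))  # severe imbalance
```

Cutting the wage share raises the owners’ margin on each sale while steadily shrinking the pool of buyers—exactly the bind the factory owners faced.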
That’s overproduction. In the Long Depression, as in the Great Depression, it was an everyday reality, driving severe deflation and high unemployment, and we’d still be talking about it today if Marx hadn’t been turned into an intellectual figurehead for one side in the bare-knuckle brawl over global power that dominated the second half of the 20th century. There’s much to be said for talking about it again, since it’s becoming an everyday reality in America as we speak—we’ll be exploring that in more detail in later posts—but it also needs to be factored into any understanding of the rise of America’s global empire, because the decision to go into the empire business in a big way was driven, in large part, by the overproduction crises that pounded the American economy in the late 19th century.
Read the literature of empire from the Victorian period and the connection is impossible to miss. Why did industrial nations want imperial colonies? The reason given in book after book and speech after speech at the time is that the industrial nations needed markets. Free trade rhetoric, then as now, insisted that all an industrial nation had to do was to build a better mousetrap and the world would beat a path to its door, but then as now, that’s not how things worked; the markets that mattered were the ones where a single industrial nation could exclude competitors and impose the unequal exchange of cheap labor and raw materials for expensive manufactured products that would keep the wealth pump churning away.
That was the option that faced America as it approached the beginning of the 20th century. It says something for the influence of ideals in American public life that this option wasn’t chosen without a fight. The debate over an American empire was fought out in the halls of Congress, in the letters pages of hundreds of newspapers, in public meetings, and any number of other venues. Important politicians of both major parties opposed imperial expansion with every resource and procedural trick they could muster, and scores of cultural figures—Mark Twain was among them—filled the popular magazines of the time with essays, stories, and poems challenging the imperial agenda.
In the end, though, they lost. To a majority of Americans, the economic case for empire outweighed the moral and political arguments against it. By 1898, the pro-Empire faction had become strong enough that it could push the country into action; the annexation of Hawai’i and the Spanish-American War that year crossed the line and redefined America as an imperial power, launching it along a trajectory that would very quickly draw it into conflicts that generations of Americans had done their best to avoid. We’ll discuss that next week.
****************
End of the World of the Week #14
In apocalyptic belief systems, just as in baseball, you often can’t tell the players without a program, and it’s not at all uncommon for prophets of any given end-of-the-world prediction to provide helpful guides explaining which figure in contemporary public life ought to be identified with the Great Beast of Revelation or the equivalent. Life can get complicated, though, when competing prophets disagree about who corresponds with which legendary figure—and when the two sides of a savage political and religious quarrel end up accusing each other of being the Antichrist, it can be hard indeed to figure out who if anyone is on the side of the angels.
That’s what happened in the early thirteenth century when the Holy Roman Emperor Frederick II faced off against Pope Gregory IX. The popes and the emperors spent most of the Middle Ages getting into it over a galaxy of issues that even historians have trouble remembering nowadays, but Frederick was not your usual medieval emperor. His contemporaries called him Stupor Mundi, which more or less translates out as “the astonishment of the world”; he was fluent in six languages, ruled a ramshackle empire that extended from Sicily to Germany with a bit of Palestine thrown in for good measure, and carried out a successful crusade while excommunicated by the Catholic church, which in medieval terms was definitely a puzzler.
It was probably inevitable that Gregory IX would call Frederick the Antichrist—it was par for the course for thirteenth-century popes to use that timeworn label for the people on their enemies list—but not even a pope could get away with doing that to the Stupor Mundi. Frederick immediately had his public relations people—yes, medieval emperors had public relations people, or at least Frederick did—start churning out tracts proving that the real Antichrist was Gregory IX. The two of them continued along the same lines for years, filling the medieval equivalent of the media with colorful denunciations extracted from the usual parts of the Bible.
Still, Gregory never made a convincing Antichrist to anyone but Frederick and his friends, since he was simply a common or garden variety medieval pope, and spent his career in the usual way, fighting bloodthirsty wars against his personal enemies and issuing proclamations condemning Jews, heretics, and cats. Frederick was another matter; a great many people in his lifetime thought that there was a good chance the Stupor Mundi might just be the Great Beast after all. In his inimitable way, though, he surprised them all by suddenly dying of dysentery in 1250.
—story from Apocalypse Not
Wednesday, March 14, 2012
America: Modes of Expansion
The three settlement patterns that emerged in the American colonies in the century or so before independence—New England’s attempt to copy its namesake across the Atlantic, the Tidewater economy of plantations feeding cash crops to Old World markets, and the fusion of immigrant traditions that was giving birth to American frontier society—were anything but fixed. By the time they had finished taking shape, they were already blurring into one another at the edges, and responding in various ways to the new influences brought by further waves of immigration. Still, the patterns are worth watching, because they played a significant role in shaping the modes of expansion that would define America’s age of empire in a later century.
The New England pattern, as already mentioned, had two sides with profoundly different possibilities for expansion. While many people from rural New England moved westward with the frontier, nearly all of them abandoned the settlement patterns of their home for the freer, more flexible frontier way of doing things; the village greens, town meetings and Puritan attitudes of the New England countryside sparked few imitations elsewhere. The waterwheels and shipyards of New England’s nascent mill towns and cities turned out to be a more enduring contribution, driving the first wave of an industrial revolution parallel to the one that transformed England not long before.
The frontier pattern also had a twofold form, though the dividing line there was different. The classic frontier society of independent subsistence farmers emerged at a time when the inland reaches of the middle colonies had no transportation links to the coast except a few muddy trails, and remained viable only when distance or geographical barriers replicated this condition. Elsewhere, as roads, canals, and (eventually) railroads began to wind their way westward, inland farmers discovered that there was ample money to be made by shipping grain eastwards for local use and export, and plenty of ways to spend that money on manufactured goods shipped west in exchange. That’s why the Appalachians, for example, which remain a challenge to transportation even today, kept the old frontier pattern long after the frontier itself had vanished out of sight over the western horizon, while upstate New York morphed into a prosperous mix of farms and mill towns as soon as the Erie Canal and a network of feeder roads opened it up to efficient freight transport.
In the Ohio River basin, the first of America’s many wild Wests, the industrial system from New England hybridized with the export-oriented reworking of the frontier settlement pattern to create a new and extremely successful human ecology. Along a network of navigable rivers and canals spilling north to the Great Lakes, towns sprang up, and those that had good sites for waterwheels—the prime mover of industry in the days before coal—normally transformed themselves into industrial cities as soon as population permitted. The space in between the towns was given over to small farms, most of them family-owned and operated, which produced nearly all the food needed locally and also raised grains and other bulk products for sale. It turned out to be very easy to extend this hybrid system further west across the northern and central Mississippi basin, and the idea that it could and should be extended straight across the continent ended up freighted with feelings of very nearly religious intensity as the 19th century unfolded.
The two parts of the hybrid—the rising mill towns and industrial centers, on the one hand, and the agricultural hinterlands on the other—had conflicting interests of great importance, but until 1865 both sides had a very good reason to find grounds for compromise. That reason, of course, was the existence of a radically different system of human ecology on the other side of the Mason-Dixon line.
The plantation economy of the Tidewater region, like the economies further north, adapted to changing conditions as westward expansion proceeded. Unlike New England’s rural economy, it could expand: across the southern half of the new nation, wherever climate and geography made it profitable to put large acreages into cash crop monocultures, above all cotton, plantations spread west. They had to spread, because the plantation economy had a critical weakness: like all cash crop monocultures from Roman latifundia to the latest agrobusiness models, Southern plantations stripped fertility from the soil. The equation’s a simple one: growing one crop repeatedly on the same acreage uses up the nutrient base of the soil, and in a farming economy dominated by cash, a farm that invests the money necessary to restore soil fertility will always be less profitable, at least in the short term, than a farm on new soil that concentrates on cutting costs and maximizing profit.
That specific equation is one form of a much more general rule. As providers of raw materials for industry in another nation, the plantations of the South were the business end of a wealth pump; the Southern states may not have been directly ruled by Britain but, economically speaking, they were as much a part of the British Empire as Canada or India. The Southern upper class, like upper classes in Third World nations today, benefited substantially from their role as guardians of the pump’s intake pipe, but the fertility stripped from Southern soils to provide cheap cotton for Lancashire mills still represented wealth pumped out of the Southern states for the benefit of Britain. New lands had to be brought into the system to keep the pump fed without beggaring those who fed it.
The expansion of the plantation system brought it into a complex relationship with the frontier society that moved westward ahead of it. Regions that were unsuited to plantation farming in the South, like regions that were difficult for transportation technologies in the North, became enclaves of the old frontier pattern, and where these were large enough, they became enclaves of support for the Union once the Civil War broke out—the northwestern third of Virginia, which broke away to become the state of West Virginia, and the eastern hill country of Tennessee were strategically important examples. Elsewhere, as plantations spread, the frontier society was absorbed into the plantation system. Some frontier folk rose to the top—Jefferson Davis, US senator and Confederate president, was born in a log cabin in rural Kentucky in 1808, when it was still well out on the frontier, less than a hundred miles from the log cabin where Abraham Lincoln was born eight months later. (One family went south, the other north; it’s by no means impossible that if the families had chosen differently, the two men might have ended up filling each other’s places.) Most of the others became the poor white not-quite-underclass of the rural South, and provided the South with the bulk of its soldiers in the Civil War, just as their equivalents further north made up a very large fraction of Union soldiers.
The plantation society, then, had to expand in order to survive. The mixed farming society further north was not quite so dependent on expansion, but desired it intensely, and population pressure from a booming birthrate and a steady flood of immigrants backed up that desire with potent economic pressures. While the Mississippi valley was free for the taking, both systems could expand without coming into conflict, but by the late 1840s people on either side were looking westward across the arid west to the Pacific, still distant but too close for comfort. That’s when the national debate over the shape of America’s human ecology—framed south of the Mason-Dixon line as a debate about local autonomy, and north of it as a debate over the ethics of slavery—began to spin out of control.
There were plenty of other issues involved, to be sure. Across the board, on almost every point of national policy that touched on economics, the measures that would support the plantation economy of the South were diametrically opposed to the measures that would support the industrial and farming economies of the North. Trade policy is one good example: to the North, trade barriers and protective tariffs to shelter rising industries from competition by the industrial behemoth of Britain were simple common sense; to the South, free trade was essential so that British markets would remain open to Southern cotton. The endless debates over Federal funding for canals and other internal improvements are another: investments that were essential to the expansion of the Northern economy were useless to the plantation system—which is why it was the North that wove a web of canals and railroads from the Hudson River to the upper Mississippi, while the South built few railways and fewer canals, and relied instead on shallow-draft riverboats that were adequate for getting cotton to market but for very little else, an economy that would cost the South terribly once war came.
Still, the issue that couldn’t be resolved short of war was the future shape of America’s territorial expansion. That’s why Southern leaders, for all their belief in the virtues of local autonomy, bitterly opposed any compromise that would give the people of each newly settled territory the right to decide whether or not slavery would exist within that territory. That’s why the South backed the annexation of Texas and the war with Mexico so fervently in 1844 and 1845, and why so many Southerners in the decades before the Civil War supported the Knights of the Golden Circle, a society that advocated the outright military conquest of Mexico, Central America, northern South America and the islands of the Caribbean, so that the plantation economy would have ample room to expand. Meanwhile, north of the cotton belt, younger sons of farmers looked hungrily westwards at the Great Plains and imagined farms of their own, if only slavery could be kept out—and that, in turn, is what made “Bleeding Kansas” the scene of a decade of terrorism and guerrilla war between pro- and antislavery factions, and lit the fuse that finally went off at Fort Sumter.
By the time the war ended at Appomattox Court House four lean and bloody years later, four points had been settled for the foreseeable future. The first was that victory in the wars of the next century would be determined not by which side had the best generals—the South had them, hands down—but by which side had a bigger industrial base, a larger population, and a greater willingness to chuck the traditional rules of war and treat enemy civilians as a military target. The second point was that if wealth was going to be pumped out of the South, and of course it was, it was going to benefit the United States—more precisely, the industrial states of the North—rather than England or any other foreign power.
The third point was that the United States had become a major military power, capable of fighting and funding both sides of one of the 19th century’s biggest wars, and potentially capable of intervening in the affairs of Europe if it came to that. Every major European power had military attachés prowling the battlefields of the Civil War, and this was partly because that uncomfortable reality was beginning to dawn on politicians in Europe’s capitals. Partly, though, it was because the technological advances of the 19th century had as dramatic an impact on the battlefield as elsewhere, and the Civil War provided a disquieting glimpse of how repeating rifles, improved cannon, ironclad ships, and rail transport could transform warfare. Most of them drew exactly the wrong conclusions—a point we’ll discuss in some detail later on in this series of posts—but the fact that they were there points up the extent to which America, a backwater in world affairs fifty years previously, had become much less so by 1860.
The fourth point, though, was the most crucial for the theme we’re exploring here. The end of the war was also the end of the debate over the mode of American expansion. The plantation economy wasn’t abolished—textile mills in the North depended on Southern cotton just as much as mills in the English Midlands did—but the door was slammed on its hopes of expansion as a series of Homestead Acts threw open the Great Plains to the family-farm model of the Northern economy. Questions of public policy that had been central to prewar debates—trade policy, internal improvements, and the rest—were settled for the rest of the century to the North’s satisfaction. The wealth pump kicked back into gear without the safety valve of new lands, and the South’s relative prosperity in the prewar era gave way to a regional depression that didn’t end until after the Second World War. Meanwhile, protected by tariffs and trade barriers, supported by federal investments in railroads and the like, and buoyed by the wealth pump, the Northern economy boomed.
The settlement of the rest of the continent followed promptly, and it followed the Northern pattern. The military technologies that had broken the South were turned on those First Nations that still defended their tribal territories, with even more devastating effect. European military attachés—yes, they were still prowling around during the Indian Wars; the United States was a continuing object of interest to all the major European powers throughout the 19th century—wrote admiringly that the Plains tribes were the finest light cavalry in all of history, but they were still unable to hold their own against repeating rifles, Gatling guns, and a systematic campaign of extermination directed against the buffalo that provided the bulk of their food. As the tribes were driven onto tiny, barren reservations, white settlers streamed onto their land, laying out the same pattern of towns and farms that had succeeded so well further east. It would not succeed anything like so well on the plains, and further west it would not succeed at all, but the first signs of its failure went utterly unnoticed for many decades.
Still, in the age of the railroad, the West simply wasn’t that big any longer, and the shores of the Pacific put a hard limit in the way of further territorial expansion. During the heady days of the 1840s, when it was still possible to forget the cost of war, American politicians seriously debated the invasion and conquest of all of Mexico—they settled for half—and during the Oregon Territory controversy of the same decade, a substantial faction had demanded the seizure of what’s now the southern half of Canada’s four westernmost provinces, even if it meant war with Britain. Cooler heads prevailed in each of these debates, and by the last decades of the 19th century, nobody was seriously suggesting either option: Britain by then had far and away the world’s most powerful and technologically advanced military, and the idea of absorbing the rest of Mexico into an expanded United States ran headlong into a pervasive racism that would not have tolerated the idea of millions of Mexicans suddenly becoming American citizens.
The modes of expansion that defined 19th century America thus ended before the century did, and that hard fact ultimately launched America into its age of overseas empire. We’ll discuss that next week.
***************
Those of my readers who might happen find themselves within reach of rural Pennsylvania over Memorial Day weekend this year might be interested to hear of a conference then and there that will discuss many of the themes I’ve been covering in this blog over the last half dozen years. Its name? The Age of Limits. I’ll be there, and presenting; so will Carolyn Baker, Dmitry Orlov, Gail Tverberg, and Tom Whipple, just to name the big name speakers. It’s intended for those who have grasped the fact that the age of abundance is ending, and want to discuss what can still be done as industrial society unravels. It should be a worthwhile time; I hope to see some of you there. Check out http://ageoflimits.org for the details.
***************
End of the World of the Week #13
One of the embarrassments of history, at least for anyone who believes in the ability of human beings to learn from their mistakes, is the way that the same bad ideas keep on being rehashed under new labels every few decades. The so-called “Law of Attraction” marketed so vigorously a decade ago in a variety of New Age products—basically, the claim that if you want something bad enough, the universe is obligated to give it to you—is a case in point. The previous time it was new and hot was in the 1920s, and it helped feed the clueless optimism that drove the stock market bubble that crashed so disastrously in 1929; it became new and hot again during the last decade, and helped feed the same clueless optimism that drove the real estate bubble that crashed so disastrously in 2008.
Apocalyptic thought is well supplied with similar examples. One of them is the notion, very popular for centuries, that the Book of Genesis could be used as a template for the history of the world. Genesis describes the process of creation as taking six days, with a day of rest to follow; 2 Peter 3:8 states that a day of the Lord is as a thousand years; equate the Millennium, the thousand years of Utopia that’s supposed to follow the Second Coming, with the thousand-year day of rest at the end of the week of creation, and you’ve got a world history six thousand years long. Figure out the location of the present year in that six-millennia sequence, and you know the date of the Second Coming.
Of course that’s the difficult part, and for something like fifteen hundred years, prophets imitated Harold Camping by coming up with dates for the apocalypse, on that basis that rolled on past without any noticeable result. The apocalyptic frenzy around the year 1000 AD was driven by exactly this calculation; any number of prophets had insisted that the birth of Christ marked the beginning of the sixth day, so the end was clearly nigh. When it didn’t arrive, other prophets decided that the Crucifixion was the beginning of the sixth day, and predicted 1033 AD as the big date; they were just as wrong, of course, but it didn’t keep others from trying the same thing later on.
It’s not often remembered that Archbishop James Ussher, who notoriously calculated the date of the Creation using the Bible as his guide, was still working under the Book of Genesis paradigm. His date of 4004 BC for the beginning of the world implies that 1996, exactly 6000 years later, would mark the Second Coming. Those interested in exact dates will probably want to know that God said "Let there be light" at 9:00 am on October 23, 4004, so the end certainly should have arrived at the same date and time in 1996. That it didn’t can probably be credited to the essential cussedness of things.
—story from Apocalypse Not
Wednesday, March 07, 2012
America: Origins of an Empire
To understand the decline and approaching fall of the American empire, it’s necessary to understand how that empire came into being. That’s a complex issue, as all historical questions are these days, and a grasp of the tangled role of history in today’s political discourse may make it a little easier to avoid certain common but unhelpful habits of thought as we proceed.
Until the 18th century, in the Western world as elsewhere around the planet, the core language of political rhetoric came from religion. From monarchs who based their claims to legitimacy on theories of the divine right of kings, straight across the spectrum to revolutionaries who borrowed the rhetoric of Old Testament prophets to call for the slaughter of the rich, political argument drew primarily on theology’s vision of an eternal order imperfectly reflected in the material cosmos. Conservatives argued that the existing structure of society more or less mirrored God’s order, or would do so if the liberals would only shut up and behave; liberals argued that the existing structure of society was moving toward a more perfect reflection of God’s order, and would get there more quickly if the conservatives would only stop dragging their heels; and radicals argued that the existing structure of society was in utter conflict with God’s order, and had to be terminated with extreme prejudice (along, often enough, with the liberals and the conservatives) so that a new and perfect world could come into being.
The displacement of religion by secular ideologies in the 18th century left all three of these basic political viewpoints in place, but levered them neatly off their theological foundations, leaving their adherents floundering for new justifications. The standard response at the time, and ever since, was to force history to play theology’s role, by mapping theological ideas of good and evil onto the complexities of the past. Whenever a political question comes up for debate, accordingly, it’s a safe bet that all sides will immediately drag in canned historical narratives that have been stretched and lopped to fit whatever simplistic moral dualism their ideology requires, so that they can tar their opponents by associating them with history’s villains (to the contemporary American right, socialists; to the contemporary American left, fascists) and wrap themselves in the mantle of history’s good guys.
Thus it’s vanishingly rare to see any public discussion of historical events these days that doesn’t fixate, often to the extent of caricature, on distinguishing the good guys from the bad guys—and when this is attempted, the first reaction of a great many listeners or readers is to figure out how to cram what’s been said into that same simplistic moral dualism. While there’s a point to applying ethical philosophy to history (and vice versa), though, there are entire realms of understanding that can’t be reached so long as the center of discussion is who was right and who was wrong. Over the next few weeks, I plan on talking about the rise of America’s current empire as a historical phenomenon and not a morality play, and leave my readers free to make their own moral judgments if they find those useful.
The use of history as moral ammunition in contemporary politics, mind you, accounts for only part of the complexity of the subject we’ll be discussing. Another part, a crucial one, comes from the intricate history of America’s empire itself; that, in turn, comes from the fact that the United States of America may be a single political unit but it has never been a single culture or, really, a single country; and the fault lines along which America has split repeatedly for more than three centuries can be traced right back to the European settlement of the continent’s eastern seaboard. We can start there.
When the first waves of colonists from western and central Europe arrived on the Atlantic shores of North America in the 17th century, none of them seem to have realized that they were the beneficiaries of a cataclysm. Around the periphery of the Old World, the European voyages of discovery found crowded nations with no spare territory for migrants, but the Americas and Australasia seemed all but empty. The native peoples of all three continents have reasonably enough objected to this description—after all, they were there—but the perception of empty space wasn’t simply propaganda. It reflected the aftermath of the most appalling demographic disaster in recorded history.
The accident of plate tectonics that opened oceanic barriers between the Old and New Worlds had an impact on disease that wasn’t clearly understood until quite recently. Most of the world’s serious human pathogens came to our species from domestic livestock, and nearly all of that happened in the Old World, because Eurasia happened to have many more species suitable for domestication than the New World did. One at a time, over the tens of millennia between the closing of the Bering land bridge and the voyages of Columbus, pathogens found their way from animal vectors into the human population, epidemics swept the Old World, and the survivors gradually picked up a certain level of resistance. Those pathogens didn’t cross the ocean to the New World until the first European ships began to arrive, but when they did, they hit the native people of the Americas all at once. Within a century of 1492, as a result, native populations collapsed to 10% or less of their precontact levels.
The scale of the dieoff can be measured by a simple fact still rarely mentioned outside of the specialist literature: in 1500 the Amazon jungle as we now know it did not exist. At that time, and for many centuries before, the Amazon basin was a thickly settled agricultural region full of sizeable cities and towns with thriving local and long distance trade. The first Spanish explorers to travel down the Amazon described it in these terms, which were dismissed as fables by later writers who knew only the “green hell” of the postcollapse Amazon. Only in the last two decades or so have sophisticated archeological studies shown that the conquistadors were right and their critics wrong.
The same collapse swept the eastern seaboard of North America, where settled farming villages were established by 2000 BCE, and complex agricultural societies with rich political, cultural and religious traditions thrived for many centuries before 1492. (A thousand years before the founding of Jamestown, the level of cultural sophistication in the Chesapeake Bay tribes was arguably higher than that found among the inhabitants of Dark Age England.) After a century of dieoff, the survivors were scattered in small communities across a mostly vacant landscape. That was what the first waves of European colonists encountered. They told themselves they were settling in a wilderness, but they were quite wrong: they were arriving in a land that had been settled and farmed for countless generations before their time, and benefited immensely from the legacies of the peoples whose surviving descendants they elbowed out of the way.
Compared to cramped and crowded Europe, the eastern seaboard of North America seemed almost unimaginably vast—the distance between the two early colonies at Jamestown and Plymouth is greater than the entire length of England from the cliffs of Dover to the border with Scotland—and the sheer impact of space, together with sharp differences in climate and even sharper differences in the people who came to settle, drove the newly founded colonies in radically different directions. In what would become New England, English religious minorities made up much of the first wave of arrivals, and the society they built replicated 17th century English rural society as closely as the new environment would permit. The result proved impossible to transplant further into the country, which is why rural New England remains something of a world unto itself, but it wasn’t accidental that the Industrial Revolution got started in New England not much later than it did in the English Midlands: the same cultural forms that drove industrialization at home did much the same thing in the transplanted society, and the industrial society that emerged out of the transformation spread westwards as the country did.
The New England pattern, as already mentioned, had two sides with profoundly different possibilities for expansion. While many people from rural New England moved westward with the frontier, nearly all of them abandoned the settlement patterns of their home for the freer, more flexible frontier way of doing things; the village greens, town meetings and Puritan attitudes of the New England countryside sparked few imitations elsewhere. The waterwheels and shipyards of New England’s nascent mill towns and cities turned out to be a more enduring contribution, driving the first wave of an industrial revolution parallel to the one that transformed England not long before.
The frontier pattern also had a twofold form, though the dividing line there was different. The classic frontier society of independent subsistence farmers emerged at a time when the inland reaches of the middle colonies had no transportation links to the coast except a few muddy trails, and remained viable only when distance or geographical barriers replicated this condition. Elsewhere, as roads, canals, and (eventually) railroads began to wind their way westward, inland farmers discovered that there was ample money to be made by shipping grain eastwards for local use and export, and plenty of ways to spend that money on manufactured goods shipped west in exchange. That’s why the Appalachians, for example, which remain a challenge to transportation even today, kept the old frontier pattern long after the frontier itself had vanished out of sight over the western horizon, while upstate New York morphed into a prosperous mix of farms and mill towns as soon as the Erie Canal and a network of feeder roads opened it up to efficient freight transport.
In the Ohio River basin, the first of America’s many wild Wests, the industrial system from New England hybridized with the export-oriented reworking of the frontier settlement pattern to create a new and extremely successful human ecology. Along a network of navigable rivers and canals spilling north to the Great Lakes, towns sprang up, and those that had good sites for waterwheels—the prime mover of industry in the days before coal—normally transformed themselves into industrial cities as soon as population permitted. The space in between the towns was given over to small farms, most of them family-owned and operated, which produced nearly all the food needed locally and also raised grains and other bulk products for sale. It turned out to be very easy to extend this hybrid system further west across the northern and central Mississippi basin, and the idea that it could and should be extended straight across the continent ended up freighted with feelings of very nearly religious intensity as the 19th century unfolded.
The two parts of the hybrid—the rising mill towns and industrial centers, on the one hand, and the agricultural hinterlands on the other—had conflicting interests of great importance, but until 1865 both sides had a very good reason to find grounds for compromise. That reason, of course, was the existence of a radically different system of human ecology on the other side of the Mason-Dixon line.
The plantation economy of the Tidewater region, like the economies further north, adapted to changing conditions as westward expansion proceeded. Unlike New England’s rural economy, it could expand: across the southern half of the new nation, wherever climate and geography made it profitable to put large acreages into cash crop monocultures, above all cotton, plantations spread west. They had to spread, because the plantation economy had a critical weakness: like all cash crop monocultures from Roman latifundia to the latest agrobusiness models, Southern plantations stripped fertility from the soil. The equation’s a simple one: growing one crop repeatedly on the same acreage uses up the nutrient base of the soil, and in a farming economy dominated by cash, a farm that invests the money necessary to restore soil fertility will always be less profitable, at least in the short term, than a farm on new soil that concentrates on cutting costs and maximizing profit.
That specific equation is one form of a much more general rule. As providers of raw materials for industry in another nation, the plantations of the South were the business end of a wealth pump; the Southern states may not have been directly ruled by Britain but, economically speaking, they were as much a part of the British Empire as Canada or India. The Southern upper class, like upper classes in Third World nations today, benefited substantially from their role as guardians of the pump’s intake pipe, but the fertility stripped from Southern soils to provide cheap cotton for Lancashire mills still represented wealth pumped out of the Southern states for the benefit of Britain. New lands had to be brought into the system to keep the pump fed without beggaring those who fed it.
The expansion of the plantation system brought it into a complex relationship with the frontier society that moved westward ahead of it. Regions that were unsuited to plantation farming in the South, like regions that were difficult for transportation technologies in the North, became enclaves of the old frontier pattern, and where these were large enough, they became enclaves of support for the Union once the Civil War broke out—the northwestern third of Virginia, which broke away to become the state of West Virginia, and the eastern hill country of Tennessee were strategically important examples. Elsewhere, as plantations spread, the frontier society was absorbed into the plantation system. Some frontier folk rose to the top—Jefferson Davis, US senator and Confederate president, was born in a log cabin in rural Kentucky in 1808, when it was still well out on the frontier, less than a hundred miles from the log cabin where Abraham Lincoln was born eight months later. (One family went south, the other north; it’s by no means impossible that if the families had chosen differently, the two men might have ended up filling each other’s places.) Most of the others became the poor white not-quite-underclass of the rural South, and provided the South with the bulk of its soldiers in the Civil War, just as their equivalents further north made up a very large fraction of Union soldiers.
The plantation society, then, had to expand in order to survive. The mixed farming society further north was not quite so dependent on expansion, but desired it intensely, and population pressure from a booming birthrate and a steady flood of immigrants backed up that desire with potent economic pressures. While the Mississippi valley was free for the taking, both systems could expand without coming into conflict, but by the late 1840s people on either side were looking westward across the arid west to the Pacific, still distant but too close for comfort. That’s when the national debate over the shape of America’s human ecology—framed south of the Mason-Dixon line as a debate about local autonomy, and north of it as a debate over the ethics of slavery—began to spin out of control.
There were plenty of other issues involved, to be sure. Across the board, on almost every point of national policy that touched on economics, the measures that would support the plantation economy of the South were diametrically opposed to the measures that would support the industrial and farming economies of the North. Trade policy is one good example: to the North, trade barriers and protective tariffs to shelter rising industries from competition by the industrial behemoth of Britain were simple common sense; to the South, free trade was essential so that British markets would remain open to Southern cotton. The endless debates over Federal funding for canals and other internal improvements are another: investments that were essential to the expansion of the Northern economy were useless to the plantation system—which is why it was the North that wove a web of canals and railroads from the Hudson River to the upper Mississippi, while the South built few railways and fewer canals, and relied instead on shallow-draft riverboats that were adequate for getting cotton to market but for very little else, an arrangement that would cost the South terribly once war came.
Still, the issue that couldn’t be resolved short of war was the future shape of America’s territorial expansion. That’s why Southern leaders, for all their belief in the virtues of local autonomy, bitterly opposed any compromise that would give the people of each newly settled territory the right to decide whether or not slavery would exist within that territory. That’s why the South backed the annexation of Texas and the war with Mexico so fervently in 1844 and 1845, and why so many Southerners in the decades before the Civil War supported the Order of the Golden Circle, a society that advocated the outright military conquest of Mexico, Central America, northern South America and the islands of the Caribbean, so that the plantation economy would have ample room to expand. Meanwhile, north of the cotton belt, younger sons of farmers looked hungrily westwards at the Great Plains and imagined farms of their own, if only slavery could be kept out—and that, in turn, is what made “Bleeding Kansas” the scene of a decade of terrorism and guerrilla war between pro- and antislavery factions, and lit the fuse that finally went off at Fort Sumter.
By the time the war ended at Appomattox Court House four lean and bloody years later, four points had been settled for the foreseeable future. The first was that victory in the wars of the next century would be determined not by which side had the best generals—the South had them, hands down—but by which side had a bigger industrial base, a larger population, and a greater willingness to chuck the traditional rules of war and treat enemy civilians as a military target. The second point was that if wealth was going to be pumped out of the South, and of course it was, it was going to benefit the United States—more precisely, the industrial states of the North—rather than England or any other foreign power.
The third point was that the United States had become a major military power, capable of fighting and funding both sides of one of the 19th century’s biggest wars, and potentially capable of intervening in the affairs of Europe if it came to that. Every major European power had military attachés prowling the battlefields of the Civil War, and this was partly because that uncomfortable reality was beginning to dawn on politicians in Europe’s capitals. Partly, though, it was because the technological advances of the 19th century had as dramatic an impact on the battlefield as elsewhere, and the Civil War provided a disquieting glimpse of how repeating rifles, improved cannon, ironclad ships, and rail transport could transform warfare. Most of them drew exactly the wrong conclusions—a point we’ll discuss in some detail later on in this series of posts—but the fact that they were there points up the extent to which America, a backwater in world affairs fifty years previously, had become much less so by 1860.
The fourth point, though, was the most crucial for the theme we’re exploring here. The end of the war was also the end of the debate over the mode of American expansion. The plantation economy wasn’t abolished—textile mills in the North depended on Southern cotton just as much as mills in the English Midlands did—but the door was slammed on its hopes of expansion as a series of Homestead Acts threw open the Great Plains to the family-farm model of the Northern economy. Questions of public policy that had been central to prewar debates—trade policy, internal improvements, and the rest—were settled for the rest of the century to the North’s satisfaction. The wealth pump kicked back into gear without the safety valve of new lands, and the South’s relative prosperity in the prewar era gave way to a regional depression that didn’t end until after the Second World War. Meanwhile, protected by tariffs and trade barriers, supported by federal investments in railroads and the like, and buoyed by the wealth pump, the Northern economy boomed.
The settlement of the rest of the continent followed promptly, and it followed the Northern pattern. The military technologies that had broken the South were turned on those First Nations that still defended their tribal territories, with even more devastating effect. European military attachés—yes, they were still prowling around during the Indian Wars; the United States was a continuing object of interest to all the major European powers throughout the 19th century—wrote admiringly that the Plains tribes were the finest light cavalry in all of history, but they were still unable to hold their own against repeating rifles, Gatling guns, and a systematic campaign of extermination directed against the buffalo that provided the bulk of their food. As the tribes were driven onto tiny, barren reservations, white settlers streamed onto their land, laying out the same pattern of towns and farms that had succeeded so well further east. It would not succeed anything like so well on the plains, and further west it would not succeed at all, but the first signs of its failure went utterly unnoticed for many decades.
Still, in the age of the railroad, the West simply wasn’t that big any longer, and the shores of the Pacific placed a hard limit on further territorial expansion. During the heady days of the 1840s, when it was still possible to forget the cost of war, American politicians seriously debated the invasion and conquest of all of Mexico—they settled for half—and during the Oregon Territory controversy of the same decade, a substantial faction had demanded the seizure of what’s now the southern half of Canada’s four westernmost provinces, even if it meant war with Britain. Cooler heads prevailed in each of these debates, and by the last decades of the 19th century, nobody was seriously suggesting either option: Britain by then had far and away the world’s most powerful and technologically advanced military, and the idea of absorbing the rest of Mexico into an expanded United States ran headlong into a pervasive racism that would not have tolerated the idea of millions of Mexicans suddenly becoming American citizens.
The modes of expansion that defined 19th century America thus ended before the century did, and that hard fact ultimately launched America into its age of overseas empire. We’ll discuss that next week.
***************
Those of my readers who might happen to find themselves within reach of rural Pennsylvania over Memorial Day weekend this year might be interested to hear of a conference being held there that will discuss many of the themes I’ve been covering in this blog over the last half dozen years. Its name? The Age of Limits. I’ll be there, and presenting; so will Carolyn Baker, Dmitry Orlov, Gail Tverberg, and Tom Whipple, just to name the big-name speakers. It’s intended for those who have grasped the fact that the age of abundance is ending, and want to discuss what can still be done as industrial society unravels. It should be a worthwhile time; I hope to see some of you there. Check out http://ageoflimits.org for the details.
***************
End of the World of the Week #13
One of the embarrassments of history, at least for anyone who believes in the ability of human beings to learn from their mistakes, is the way that the same bad ideas keep on being rehashed under new labels every few decades. The so-called “Law of Attraction” marketed so vigorously a decade ago in a variety of New Age products—basically, the claim that if you want something bad enough, the universe is obligated to give it to you—is a case in point. The previous time it was new and hot was in the 1920s, and it helped feed the clueless optimism that drove the stock market bubble that crashed so disastrously in 1929; it became new and hot again during the last decade, and helped feed the same clueless optimism that drove the real estate bubble that crashed so disastrously in 2008.
Apocalyptic thought is well supplied with similar examples. One of them is the notion, very popular for centuries, that the Book of Genesis could be used as a template for the history of the world. Genesis describes the process of creation as taking six days, with a day of rest to follow; 2 Peter 3:8 states that a day of the Lord is as a thousand years; equate the Millennium, the thousand years of Utopia that’s supposed to follow the Second Coming, with the thousand-year day of rest at the end of the week of creation, and you’ve got a world history six thousand years long. Figure out the location of the present year in that six-millennium sequence, and you know the date of the Second Coming.
Of course that’s the difficult part, and for something like fifteen hundred years, prophets anticipated Harold Camping by coming up with dates for the apocalypse on that basis, dates that rolled on past without any noticeable result. The apocalyptic frenzy around the year 1000 AD was driven by exactly this calculation; any number of prophets had insisted that the birth of Christ marked the beginning of the sixth day, so the end was clearly nigh. When it didn’t arrive, other prophets decided that the Crucifixion was the beginning of the sixth day, and predicted 1033 AD as the big date; they were just as wrong, of course, but it didn’t keep others from trying the same thing later on.
It’s not often remembered that Archbishop James Ussher, who notoriously calculated the date of the Creation using the Bible as his guide, was still working under the Book of Genesis paradigm. His date of 4004 BC for the beginning of the world implies that 1996, exactly 6000 years later, would mark the Second Coming. Those interested in exact dates will probably want to know that God said "Let there be light" at 9:00 am on October 23, 4004 BC, so the end certainly should have arrived at the same date and time in 1996. That it didn’t can probably be credited to the essential cussedness of things.
—story from Apocalypse Not
Far to the south, in the band of settlement that started at Jamestown, matters were different. The settlers in what became the tidewater South weren’t religious minorities fleeing discrimination, by and large, but the employees of English magnates who simply wanted, like their masters, to make as much money as possible. From Chesapeake Bay south, the climate was suited to grow tobacco, and like most drugs, this was a hugely lucrative cash crop; after a few generations, cotton joined tobacco, and the basic pattern of antebellum Southern life was set. Sprawling plantations worked first by indentured servants shipped over from Britain and Ireland, and then by slaves shipped over from Africa, became the defining land use pattern along the southern half of the coast, and spread inland wherever climate and topography allowed.
Between New England and the tidewater South lay a poorly defined intermediate zone, a scattering of small colonies—New Jersey, Delaware, Maryland—and one very large one, Pennsylvania. Maryland and Delaware were mostly tidewater and might have gone the Southern path, Pennsylvania and New Jersey weren’t and might have gone the New England path, but Pennsylvania and Maryland both enacted religious liberty statutes early on and welcomed all comers, so the middle zone got dealt a couple of wild cards that ended up transforming the entire colonial enterprise: a torrent of religious and political refugees from central Europe, who fled the aftermath of the Thirty Years War, and a torrent of economic and political refugees from northern Ireland, who fled England’s tightening grip on her first and most thoroughly looted imperial colony. West of Chesapeake Bay lay the Potomac valley, one of the few easy routes into the mountains, and it’s likely that somewhere up that way—by the nature of the thing, nobody will ever know when or where—German and Scots-Irish traditions blended with scraps of a dozen other ethnic heritages to create the first draft of American frontier culture. Think log cabins and long rifles, homespun cloth and home-brewed liquor, a fierce habit of local independence and an equally fierce disdain for the cultures of the coast, and all the rest: that’s where it came from, and it spread westward along a wide front from the Great Lakes to the middle South.
All this ought to be part of any basic education in American history, though as often as not it gets lost in the teach-to-the-sound-bites frenzy that passes for education in America these days. What doesn’t get in even in those rare schools that teach history worth the name, though, is that these three nascent American cultures—call them New England, Tidewater, and Frontier, if you like—also define three modes of expansion, two imperial and one much less so.
The first mode is the New England industrial model, which spread west to the Great Lakes early on and trickled gradually southward from there. It’s one of the shibboleths of modern thought that industrial systems create wealth, but as Alf Hornborg points out usefully in The Power of the Machine, their main function is actually to concentrate wealth; the wealth that would have gone to a large number of small proprietors and skilled craftspeople in a nonindustrial society goes instead to the very small minority with the money and political connections to build and run factories, control access to raw materials and energy resources, and the like. That’s why every nation on Earth that has ever built an industrial economy within a free market system has ended up polarized between vast fortunes on the one hand and an even vaster number of hopelessly impoverished workers on the other. That’s the New England model—it was also the English model, but that will be relevant a bit later on—and it drives a very specific kind of imperial expansion, in which sources of raw materials, on the one hand, and markets where industrial products can be exported, on the other, are the central targets of empire.
The second mode is the Southern plantation model, which spread due west from the tidewater country until it ran up against certain hard political realities we’ll discuss next week. The plantation model started out as a straightforward export economy, but found itself drawn into the orbit of the rising industrial system; cotton from Southern plantations was eagerly sought by the textile mills of the English Midlands, and the political economy of the cotton belt morphed into a pattern that ought to be profoundly familiar to Americans today, though it’s generally not: it’s the pattern found today in Third World nations under American or European domination, in which raw materials for industry overseas are produced under harsh conditions by a vast and impoverished labor force, while a small upper class is well rewarded for keeping the system running smoothly. That’s the Southern model, and it drives a very different mode of imperial expansion, in which arable land and cheap labor are the central targets of empire.
The Frontier model is something else again. It also had a powerful expansionist dynamic, but it was egalitarian rather than hierarchical, and didn’t provide anybody with a convenient place to hook up a wealth pump. What Frontier culture craved from expansion was simply real estate, where people could build a cabin, break the sod, plant crops, and make a life for themselves. Over time, as the model ripened and values shifted, it gave rise to a vision of American expansion in which an entire continent would be seeded first with frontier homesteads, then with prosperous farms and nascent towns, and replicate political and economic democracy straight across to the Pacific. What would happen once that limit was reached was a question very few Americans asked themselves.
Before that point was reached, though, these three cultures were going to have to sort out their relative strength and influence on the new American nation. We’ll talk about that next week.
***************
End of the World of the Week #12
Hegel, whose personal contribution to the history of false prophecy was the subject of last week’s End of the World of the Week, was one of dozens of 19th century intellectuals who were convinced that their careers marked a great turning point in human history. Unlike his competitors, though, Hegel proved to be a major inspiration to future generations. Most of the ideological follies of the 19th and 20th centuries drew on Hegel in one way or another; Marx was only the most successful of the people who fell under the enchantment of Hegelian dialectic and spouted prophecies that turned out to be just as wrong as Hegel’s had been.
Still, a special place belongs to Francis Fukuyama. A US State Department policy wonk turned neo-Hegelian academic, Fukuyama got his fifteen minutes of fame in 1989 by proclaiming, in a widely read essay and a book, that history was over. His argument, a sort of pop Hegelianism reduced to the lowest common denominator, was that history is a Darwinian struggle among different systems of political economy, in which whichever one crushes the competition is by definition the best; that the defeat of Communism showed that “liberal democracy”—that is, the country club Republicanism of George Bush senior—was the winner of the great contest; and that, just as soon as the last stragglers got with the program, humanity would henceforth bask in peace and prosperity forever.
What made all this a masterpiece of unintended irony is that almost identical claims were retailed by the Marxist regimes whose collapse Fukuyama’s essay was intended to celebrate. Not that long before, in fact, the American conservative movement was notable for its skepticism of Hegelian handwaving and grand theories of history’s march to perfection, while the Marxists they opposed spent their time proclaiming that history was on their side and everybody else simply had to get with the program. By Fukuyama’s time, that skepticism had given way to blatant imitation—compare the official US arguments for the invasion of Iraq with articles in Pravda justifying the Soviet Union’s invasion of Afghanistan sometime, and see if you can find a significant difference—and the neoconservative movement, which was heavily influenced by Fukuyama’s work for a time, proceeded to launch itself along the same track to history’s dustbin that the Marxist regimes they loathed had followed before them.
—story from Apocalypse Not
Until the 18th century, in the Western world as elsewhere around the planet, the core language of political rhetoric came from religion. From monarchs who based their claims to legitimacy on theories of the divine right of kings, straight across the spectrum to revolutionaries who borrowed the rhetoric of Old Testament prophets to call for the slaughter of the rich, political argument drew primarily on theology’s vision of an eternal order imperfectly reflected in the material cosmos. Conservatives argued that the existing structure of society more or less mirrored God’s order, or would do so if the liberals would only shut up and behave; liberals argued that the existing structure of society was moving toward a more perfect reflection of God’s order, and would get there more quickly if the conservatives would only stop dragging their heels; and radicals argued that the existing structure of society was in utter conflict with God’s order, and had to be terminated with extreme prejudice (along, often enough, with the liberals and the conservatives) so that a new and perfect world could come into being.
The displacement of religion by secular ideologies in the 18th century left all three of these basic political viewpoints in place, but levered them neatly off their theological foundations, leaving their adherents floundering for new justifications. The standard response at the time, and ever since, was to force history to play theology’s role, by mapping theological ideas of good and evil onto the complexities of the past. Whenever a political question comes up for debate, accordingly, it’s a safe bet to assume that all sides will immediately drag in canned historical narratives that have been stretched and lopped to fit whatever simplistic moral dualism their ideology requires, so that they can tar their opponents by associating them with history’s villains (to the contemporary American right, socialists; to the contemporary American left, fascists) and wrap themselves in the mantle of history’s good guys.
Thus it’s vanishingly rare to see any public discussion of historical events these days that doesn’t fixate, often to the extent of caricature, on distinguishing the good guys from the bad guys—and when this is attempted, the first reaction of a great many listeners or readers is to figure out how to cram what’s been said into that same simplistic moral dualism. While there’s a point to applying ethical philosophy to history (and vice versa), though, there are entire realms of understanding that can’t be reached so long as the center of discussion is who was right and who was wrong. Over the next few weeks, I plan on talking about the rise of America’s current empire as a historical phenomenon and not a morality play, and leave my readers free to make their own moral judgments if they find those useful.
The use of history as moral ammunition in contemporary politics, mind you, accounts for only part of the complexity of the subject we’ll be discussing. Another part, a crucial one, comes from the intricate history of America’s empire itself; that, in turn, comes from the fact that the United States of America may be a single political unit but it has never been a single culture or, really, a single country; and the fault lines along which America has split repeatedly for more than three centuries can be traced right back to the European settlement of the continent’s eastern seaboard. We can start there.
When the first waves of colonists from western and central Europe arrived on the Atlantic shores of North America in the 17th century, none of them seem to have realized that they were the beneficiaries of a cataclysm. Around the periphery of the Old World, the European voyages of discovery found crowded nations with no spare territory for migrants, but the Americas and Australasia seemed all but empty. The native peoples of all three continents have reasonably enough objected to this description—after all, they were there—but the perception of empty space wasn’t simply propaganda. It reflected the aftermath of the most appalling demographic disaster in recorded history.
The accident of plate tectonics that opened oceanic barriers between the Old and New Worlds had an impact on disease that wasn’t clearly understood until quite recently. Most of the world’s serious human pathogens came to our species from domestic livestock, and nearly all of that happened in the Old World, because Eurasia happened to have many more species suitable for domestication than the New World did. One at a time, over the tens of millennia between the closing of the Bering land bridge and the voyages of Columbus, pathogens found their way from animal vectors into the human population, epidemics swept the Old World, and the survivors gradually picked up a certain level of resistance. Those pathogens didn’t cross the ocean to the New World until the first European ships began to arrive, but when they did, they hit the native people of the Americas all at once. Within a century of 1492, as a result, native populations collapsed to 10% or less of their precontact levels.
The scale of the dieoff can be measured by a simple fact still rarely mentioned outside of the specialist literature: in 1500 the Amazon jungle as we now know it did not exist. At that time, and for many centuries before, the Amazon basin was a thickly settled agricultural region full of sizeable cities and towns with thriving local and long distance trade. The first Spanish explorers to travel down the Amazon described it in these terms, which were dismissed as fables by later writers who knew only the “green hell” of the postcollapse Amazon. Only in the last two decades or so have sophisticated archeological studies shown that the conquistadors were right and their critics wrong.
The same collapse swept the eastern seaboard of North America, where settled farming villages were established by 2000 BCE, and complex agricultural societies with rich political, cultural and religious traditions thrived for many centuries before 1492. (A thousand years before the founding of Jamestown, the level of cultural sophistication in the Chesapeake Bay tribes was arguably higher than that found among the inhabitants of Dark Age England.) After a century of dieoff, the survivors were scattered in small communities across a mostly vacant landscape. That was what the first waves of European colonists encountered. They told themselves they were settling in a wilderness, but they were quite wrong: they were arriving in a land that had been settled and farmed for countless generations before their time, and benefited immensely from the legacies of the peoples whose surviving descendants they elbowed out of the way.
Compared to cramped and crowded Europe, the eastern seaboard of North America seemed almost unimaginably vast—the distance between the two early colonies at Jamestown and Plymouth is greater than the entire length of England from the cliffs of Dover to the border with Scotland—and the sheer impact of space, together with sharp differences in climate and even sharper differences in the people who came to settle, drove the newly founded colonies in radically different directions. In what would become New England, English religious minorities made up much of the first wave of arrivals, and the society they built replicated 17th century English rural society as closely as the new environment would permit. The result proved impossible to transplant further into the country, which is why rural New England remains something of a world unto itself, but it wasn’t accidental that the Industrial Revolution got started in New England not much later than it did in the English Midlands: the same cultural forms that drove industrialization at home did much the same thing in the transplanted society, and the industrial society that emerged out of the transformation spread westwards as the country did.
Far to the south, in the band of settlement that started at Jamestown, matters were different. The settlers in what became the tidewater South weren’t religious minorities fleeing discrimination, by and large, but the employees of English magnates who simply wanted, like their masters, to make as much money as possible. From Chesapeake Bay south, the climate was suited to grow tobacco, and like most drugs, this was a hugely lucrative cash crop; after a few generations, cotton joined tobacco, and the basic pattern of antebellum Southern life was set. Sprawling plantations worked first by indentured servants shipped over from Britain and Ireland, and then by slaves shipped over from Africa, became the defining land use pattern along the southern half of the coast, and spread inland wherever climate and topography allowed.
Between New England and the tidewater South lay a poorly defined intermediate zone, a scattering of small colonies—New Jersey, Delaware, Maryland—and one very large one, Pennsylvania. Maryland and Delaware were mostly tidewater and might have gone the Southern path; Pennsylvania and New Jersey weren’t, and might have gone the New England path; but Pennsylvania and Maryland both enacted religious liberty statutes early on and welcomed all comers, so the middle zone got dealt a couple of wild cards that ended up transforming the entire colonial enterprise: a torrent of religious and political refugees from central Europe, who fled the aftermath of the Thirty Years War, and a torrent of economic and political refugees from northern Ireland, who fled England’s tightening grip on her first and most thoroughly looted imperial colony. West of Chesapeake Bay lay the Potomac valley, one of the few easy routes into the mountains, and it’s likely that somewhere up that way—by the nature of the thing, nobody will ever know when or where—German and Scots-Irish traditions blended with scraps of a dozen other ethnic heritages to create the first draft of American frontier culture. Think log cabins and long rifles, homespun cloth and home-brewed liquor, a fierce habit of local independence and an equally fierce disdain for the cultures of the coast, and all the rest: that’s where it came from, and it spread westward along a wide front from the Great Lakes to the middle South.
All this ought to be part of any basic education in American history, though as often as not it gets lost in the teach-to-the-sound-bites frenzy that passes for education in America these days. What doesn’t get taught even in those rare schools that teach history worth the name, though, is that these three nascent American cultures—call them New England, Tidewater, and Frontier, if you like—also define three modes of expansion, two imperial and one much less so.
The first mode is the New England industrial model, which spread west to the Great Lakes early on and trickled gradually southward from there. It’s one of the shibboleths of modern thought that industrial systems create wealth, but as Alf Hornborg usefully points out in The Power of the Machine, their main function is actually to concentrate wealth; the wealth that would have gone to a large number of small proprietors and skilled craftspeople in a nonindustrial society goes instead to the very small minority with the money and political connections to build and run factories, control access to raw materials and energy resources, and the like. That’s why every nation on Earth that has ever built an industrial economy within a free market system has ended up polarized between vast fortunes on the one hand and an even vaster number of hopelessly impoverished workers on the other. That’s the New England model—it was also the English model, but that will be relevant a bit later on—and it drives a very specific kind of imperial expansion, in which sources of raw materials, on the one hand, and markets where industrial products can be exported, on the other, are the central targets of empire.
The second mode is the Southern plantation model, which spread due west from the tidewater country until it ran up against certain hard political realities we’ll discuss next week. The plantation model started out as a straightforward export economy, but found itself drawn into the orbit of the rising industrial system; cotton from Southern plantations was eagerly sought by the textile mills of the English Midlands, and the political economy of the cotton belt morphed into a pattern that ought to be profoundly familiar to Americans today, though it’s generally not: it’s the pattern found today in Third World nations under American or European domination, in which raw materials for industry overseas are produced under harsh conditions by a vast and impoverished labor force, while a small upper class is well rewarded for keeping the system running smoothly. That’s the Southern model, and it drives a very different mode of imperial expansion, in which arable land and cheap labor are the central targets of empire.
The Frontier model is something else again. It also had a powerful expansionist dynamic, but it was egalitarian rather than hierarchical, and didn’t provide anybody with a convenient place to hook up a wealth pump. What Frontier culture craved from expansion was simply real estate, where people could build a cabin, break the sod, plant crops, and make a life for themselves. Over time, as the model ripened and values shifted, it gave rise to a vision of American expansion in which an entire continent would be seeded first with frontier homesteads, then with prosperous farms and nascent towns, replicating political and economic democracy straight across to the Pacific. What would happen once that limit was reached was a question very few Americans asked themselves.
Before that point was reached, though, these three cultures were going to have to sort out their relative strength and influence on the new American nation. We’ll talk about that next week.
***************
End of the World of the Week #12
Hegel, whose personal contribution to the history of false prophecy was the subject of last week’s End of the World of the Week, was one of dozens of 19th century intellectuals who were convinced that their careers marked a great turning point in human history. Unlike his competitors, though, Hegel proved to be a major inspiration to future generations. Most of the ideological follies of the 19th and 20th centuries drew on Hegel in one way or another; Marx was only the most successful of the people who fell under the enchantment of Hegelian dialectic and spouted prophecies that turned out to be just as wrong as Hegel’s had been.
Still, a special place belongs to Francis Fukuyama. A US State Department policy wonk turned neo-Hegelian academic, Fukuyama got his fifteen minutes of fame by proclaiming, in a widely read 1989 essay and a subsequent book, that history was over. His argument, a sort of pop Hegelianism reduced to the lowest common denominator, was that history is a Darwinian struggle among different systems of political economy, in which whichever one crushes the competition is by definition the best; that the defeat of Communism showed that “liberal democracy”—that is, the country club Republicanism of George Bush senior—was the winner of the great contest; and that, just as soon as the last stragglers got with the program, humanity would henceforth bask in peace and prosperity forever.
What made all this a masterpiece of unintended irony is that almost identical claims were retailed by the Marxist regimes whose collapse Fukuyama’s essay was intended to celebrate. Not that long before, in fact, the American conservative movement was notable for its skepticism of Hegelian handwaving and grand theories of history’s march to perfection, while the Marxists they opposed spent their time proclaiming that history was on their side and everybody else simply had to get with the program. By Fukuyama’s time, that skepticism had given way to blatant imitation—compare the official US arguments for the invasion of Iraq with articles in Pravda justifying the Soviet Union’s invasion of Afghanistan sometime, and see if you can find a significant difference—and the neoconservative movement, which was heavily influenced by Fukuyama’s work for a time, proceeded to launch itself along the same track to history’s dustbin that the Marxist regimes they loathed had followed before them.
—story from Apocalypse Not