Wednesday, May 27, 2015

The Era of Response

The third stage of the process of collapse, following what I’ve called the eras of pretense and impact, is the era of response. It’s easy to misunderstand what this involves, because both of the previous eras have their own kinds of response to whatever is driving the collapse; it’s just that those kinds of response are more precisely nonresponses, attempts to make the crisis go away without addressing any of the things that are making it happen.

If you want a first-rate example of the standard nonresponse of the era of pretense, you’ll find one in the sunny streets of Miami, Florida, right now. As a result of global climate change, sea level has gone up and the Gulf Stream has slowed down. One consequence is that these days, whenever Miami gets a high tide combined with a stiff onshore wind, salt water comes boiling up through the storm sewers of the city all over the low-lying parts of town. The response of the Florida state government has been to issue an order to all state employees that they’re not allowed to utter the phrase “climate change.”

That sort of thing is standard practice in an astonishing range of subjects in America these days. Consider the role that the essentially nonexistent recovery from the housing-bubble crash of 2008-9 has played in political rhetoric since that time. The current inmate of the White House has been insisting through most of two terms that happy days are here again, and the usual reams of doctored statistics have been churned out in an effort to convince people who know better that they’re just imagining that something is wrong with the economy. We can expect to hear that same claim made in increasingly loud and confident tones right up until the day the bottom finally drops out.

With the end of the era of pretense and the arrival of the era of impact comes a distinct shift in the standard mode of nonresponse, which can be used quite neatly to time the transition from one era to another. Where the nonresponses of the era of pretense insist that there’s nothing wrong and nobody has to do anything outside the realm of business as usual, the nonresponses of the era of impact claim just as forcefully that whatever’s gone wrong is a temporary difficulty and everything will be fine if we all unite to do even more of whatever activity defines business as usual. That this normally amounts to doing more of whatever made the crisis happen in the first place, and thus reliably makes things worse, is just one of the little ironies history has to offer.

What unites the era of pretense with the era of impact is the unshaken belief that in the final analysis, there’s nothing essentially wrong with the existing order of things. Whatever little difficulties may show up from time to time may be ignored as irrelevant or talked out of existence, or they may have to be shoved aside by some concerted effort, but it’s inconceivable to most people in these two eras that the existing order of things is itself the source of society’s problems, and has to be changed in some way that goes beyond the cosmetic dimension. When the inconceivable becomes inescapable, in turn, the second phase gives way to the third, and the era of response has arrived.

This doesn’t mean that everyone comes to grips with the real issues, and buckles down to the hard work that will be needed to rebuild society on a sounder footing. Winston Churchill once noted with his customary wry humor that the American people can be counted on to do the right thing, once they have exhausted every other possibility. He was of course quite correct, but the same rule can be applied with equal validity to every other nation this side of Utopia, too. The era of response, in practice, generally consists of a desperate attempt to find something that will solve the crisis du jour, other than the one thing that everyone knows will solve the crisis du jour but nobody wants to do.

Let’s return to the two examples we’ve been following so far, the outbreak of the Great Depression and the coming of the French Revolution. In the aftermath of the 1929 stock market crash, once the initial impact was over and the “sucker’s rally” of early 1930 had come and gone, the federal government and the various power centers and pressure groups that struggled for influence within its capacious frame were united in pursuit of a single goal: finding a way to restore prosperity without doing either of the things that had to be done in order to restore prosperity.  That task occupied the best minds in the US elite from the summer of 1930 straight through until April of 1933, and the mere fact that their attempts to accomplish this impossibility proved to be a wretched failure shouldn’t blind anyone to the Herculean efforts that were involved in the attempt.

The first of the two things that had to be tackled in order to restore prosperity was to do something about the drastic imbalance in the distribution of income in the United States. As noted in previous posts, an economy dependent on consumer expenditures can’t thrive unless consumers have plenty of money to spend, and in the United States in the late 1920s, they didn’t—well, except for the very modest number of those who belonged to the narrow circles of the well-to-do. It’s not often recalled these days just how ghastly the slums of urban America were in 1929, or how many rural Americans lived in squalid one-room shacks of the sort you pretty much have to travel to the Third World to see these days. Labor unions and strikes were illegal in 1920s America; concepts such as a minimum wage, sick pay, and health benefits didn’t exist, and the legal system was slanted savagely against the poor.

You can’t build prosperity in a consumer society when a good half of your citizenry can’t afford more than the basic necessities of life. That’s the predicament that America found clamped to the tender parts of its economic anatomy at the end of the 1920s. In that decade, as in our time, the temporary solution was to inflate a vast speculative bubble, under the endearing delusion that this would flood the economy with enough unearned cash to make the lack of earned income moot. That worked over the short term and then blew up spectacularly, since a speculative bubble is simply a Ponzi scheme that the legal authorities refuse to prosecute as such, and inevitably ends the same way.

There were, of course, effective solutions to the problem of inadequate consumer income. They were exactly those measures that were taken once the era of response gave way to the era of breakdown; everyone knew what they were, and nobody with access to political or economic power was willing to see them put into effect, because those measures would require a modest decline in the relative wealth and political dominance of the rich as compared to everyone else. Thus, as usually happens, they were postponed until the arrival of the era of breakdown made it impossible to avoid them any longer.

The second thing that had to be changed in order to restore prosperity was even more explosive, and I’m quite certain that some of my readers will screech like banshees the moment I mention it. The United States in 1929 had a precious-metal-backed currency in the most literal sense of the term. Paper bills in those days were quite literally receipts for a certain quantity of gold—roughly 1.5 grams to the dollar, for much of the time the US spent on the gold standard. That sort of arrangement was standard in most of the world’s industrial nations; it was backed by a dogmatic orthodoxy all but universal among respectable economists; and it was strangling the US economy.

It’s fashionable among certain sects on the economic fringes these days to look back on the era of the gold standard as a kind of economic Utopia in which there were no booms and busts, just a warm sunny landscape of stability and prosperity until the wicked witches of the Federal Reserve came along and spoiled it all. That claim flies in the face of economic history. During the entire period that the United States was on the gold standard, from 1873 to 1933, the US economy was a moonscape cratered by more than a dozen significant depressions. There’s a reason for that, and it’s relevant to our current situation—in a backhanded manner, admittedly.

Money, let us please remember, is not wealth. It’s a system of arbitrary tokens that represent real wealth—that is, actual, nonfinancial goods and services. Every society produces a certain amount of real wealth each year, and those societies that use money thus need to have enough money in circulation to more or less correspond to the annual supply of real wealth. That sounds simple; in practice, though, it’s anything but. Nowadays, for example, the amount of real wealth being produced in the United States each year is contracting steadily as more and more of the nation’s economic output has to be diverted into the task of keeping it supplied with fossil fuels. That’s happening, in turn, because of the limits to growth—the awkward but inescapable reality that you can’t extract infinite resources, or dump limitless wastes, on a finite planet.

The gimmick currently being used to keep fossil fuel extraction funded and cover the costs of the rising impact of environmental disruptions, without cutting into a culture of extravagance that only cheap abundant fossil fuel and a mostly intact biosphere can support, is to increase the money supply ad infinitum. That’s become the bedrock of US economic policy since the 2008-9 crash. It’s not a gimmick with a long shelf life; as the mismatch between real wealth and the money supply balloons, distortions and discontinuities are surging out through the crawlspaces of our economic life, and crisis is the most likely outcome.

In the United States in the first half or so of the twentieth century, by contrast, the amount of real wealth being produced each year soared, largely because of the steady increases in fossil fuel energy being applied to every sphere of life. While the nation was on the gold standard, though, the total supply of money could only grow as fast as gold could be mined out of the ground, which wasn’t even close to fast enough. So you had more goods and services being produced than there was money to pay for them; people who wanted goods and services couldn’t buy them because there wasn’t enough money to go around; businesses that wanted to expand and hire workers were unable to do so for the same reason. The result was that moonscape of economic disasters I mentioned a moment ago.
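
For those who like to see the arithmetic spelled out, here’s a minimal numerical sketch of that mismatch. It uses the textbook quantity equation M × V = P × Q, and the growth rates in it (3% for real output, 1.5% for the money supply) are assumptions chosen purely for illustration, not historical measurements; the point is simply that when real output grows faster than a gold-constrained money supply, and the velocity of money stays roughly constant, prices and wages have nowhere to go but down.

```python
# Illustrative sketch, not historical data: what happens to the price level
# when the money supply is capped by the pace of gold mining while the real
# economy grows briskly. All starting values and growth rates are made up
# for the example.

def simulate(years=30, output_growth=0.03, money_growth=0.015, velocity=4.0):
    """Implied price level each year from the quantity equation M*V = P*Q,
    holding the velocity of money constant."""
    output = 100.0   # real goods and services, arbitrary index
    money = 25.0     # money supply, chosen so the price level starts at 1.0
    price_levels = []
    for _ in range(years):
        price_levels.append(money * velocity / output)  # P = M*V / Q
        output *= 1 + output_growth   # real wealth grows quickly
        money *= 1 + money_growth     # gold-backed money lags behind
    return price_levels

prices = simulate()
print(f"price level after 30 years: {prices[-1]:.2f}")  # well below 1.0: deflation
```

Run it and the price level in the toy model drops by roughly a third over the thirty years; in the real economy, that kind of pressure showed up as exactly the falling prices, low wages, and periodic depressions described above.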

The necessary response at that time was to go off the gold standard. Nobody in power wanted to do this, partly because of the dogmatic economic orthodoxy noted earlier, and partly because a money shortage paid substantial benefits to those who had guaranteed access to money. The rentier class—those people who lived off income from their investments—could count on stable or falling prices as long as the gold standard stayed in place, and the mere fact that the same stable or falling prices meant low wages, massive unemployment, and widespread destitution troubled them not at all. Since the rentier class included the vast majority of the US economic and political elite, in turn, going off the gold standard was unthinkable until it became unavoidable.

The period of the French Revolution from the fall of the Bastille in 1789 to the election of the National Convention in 1792 was an era of the same kind, though driven by different forces. Here the great problem was how to replace the Old Regime—not just the French monarchy, but the entire lumbering mass of political, economic, and social laws, customs, forms, and institutions that France had inherited from the Middle Ages and never quite gotten around to adapting to drastically changed conditions—with something that would actually work. It’s among the more interesting features of the resulting era of response that nearly every detail differed from the American example just outlined, and yet the results were remarkably similar.

Thus the leaders of the National Assembly who suddenly became the new rulers of France in the summer of 1789 had no desire whatsoever to retain the traditional economic arrangements that gave France’s former elites their stranglehold on an oversized share of the nation’s wealth. The abolition of manorial rights that summer, together with the explosive rural uprisings against feudal landlords and their chateaux in the wake of the Bastille’s fall, gutted the feudal system and left most of its former beneficiaries the choice between fleeing into exile and trying to find some way to make ends meet in a society that had no particular market for used aristocrats. The problem faced by the National Assembly wasn’t that of prying the dead fingers of a failed system off the nation’s throat; it was that of trying to find some other basis for national unity and effective government.

It’s a surprisingly difficult challenge. Those of my readers who know their way around current events will already have guessed that an attempt was made to establish a copy of whatever system was most fashionable among liberals at the time, and that this attempt turned out to be an abject failure. What’s more, they’ll have been quite correct. The National Assembly moved to establish a constitutional monarchy along British lines, bring in British economic institutions, and the like; it was all very popular among liberal circles in France and, naturally, in Britain as well, and it flopped. Those who recall the outcome of the attempt to turn Iraq into a nice pseudo-American democracy in the wake of the US invasion will have a tolerably good sense of how the project unraveled.

One of the unwelcome but reliable facts of history is that democracy doesn’t transplant well. It thrives only where it grows up naturally, out of the civil institutions and social habits of a people; when liberal intellectuals try to impose it on a nation that hasn’t evolved the necessary foundations for it, the results are pretty much always a disaster. The latter was the situation in France at the time of the Revolution. What happened thereafter is what almost always happens to a failed democratic experiment: a period of chaos, followed by the rise of a talented despot who’s smart and ruthless enough to impose order on a chaotic situation and allow new, pragmatic institutions to emerge to replace those destroyed by clueless democratic idealists. In many cases, though by no means all, those pragmatic institutions have ended up providing a bridge to a future democracy, but that’s another matter.

Here again, those of my readers who have been paying attention to current events already know this; the collapse of the Soviet Union was followed in classic form by a failed democracy, a period of chaos, and the rise of a talented despot. It’s a curious detail of history that the despots in question are often rather short. Russia has had the great good fortune to find, as its despot du jour, a canny realist who has successfully brought it back from the brink of collapse and reestablished it as a major power with a body count considerably smaller than usual. France was rather less fortunate; the despot it found, Napoleon Bonaparte, turned out to be a megalomaniac with an Alexander the Great complex who proceeded to plunge Europe into a quarter century of cataclysmic war. Mind you, things could have been even worse; when Germany ended up in a similar situation, what it got was Adolf Hitler.

Charismatic strongmen are a standard endpoint for the era of response, but they properly belong to the era that follows, the era of breakdown, which will be discussed next week. What I want to explore here is how an era of response might work out in the future immediately before us, as the United States topples from its increasingly unsteady imperial perch and industrial civilization as a whole slams facefirst into the limits to growth. The examples just cited outline the two most common patterns by which the era of response works itself out. In the first pattern, the old elite retains its grip on power, and fumbles around with increasing desperation for a response to the crisis. In the second, the old elite is shoved aside, and the new holders of power are left floundering in a political vacuum.

We could see either pattern in the United States. For what it’s worth, I suspect the latter is the more likely option; the spreading crisis of legitimacy that grips the country these days is exactly the sort of thing you saw in France before the Revolution, and in any number of other countries in the few decades just prior to revolutionary political and social change. Every time a government tries to cope with a crisis by claiming that it doesn’t exist, every time some member of the well-to-do tries to dismiss the collective burdens that their culture of executive kleptocracy imposes on the country by flinging abuse at critics, every time institutions that claim to uphold the rule of law defend the rule of entrenched privilege instead, the United States takes another step closer to the revolutionary abyss.

I use that last word advisedly. It’s a common superstition in every troubled age that any change must be for the better—that the overthrow of a bad system must by definition lead to the establishment of a better one. This simply isn’t true. The vast majority of revolutions have established governments that were far more abusive than the ones they replaced. The exceptions have generally been those that brought about a social upheaval without wrecking the political system: where, for example, an election rather than a coup d’etat or a mass rising put the revolutionaries in power, and the political institutions of an earlier time remained in place with only such reshaping as new necessities required.

We could still see that sort of transformation as the United States sees the end of its age of empire and has to find its way back to a less arrogant and extravagant way of functioning in the world. I don’t think it’s likely, but I think it’s possible, and it would probably be a good deal less destructive than the other alternative. It’s worth remembering, though, that history is under no obligation to give us the future we think we want.

Wednesday, May 20, 2015

The Era of Impact

Of all the wistful superstitions that cluster around the concept of the future in contemporary popular culture, the most enduring has to be the notion that somehow, sooner or later, something will happen to shake the majority out of its complacency and get it to take seriously the crisis of our age. Week after week, I field comments and emails that presuppose that belief. People want to know how soon I think the shock of awakening will finally hit, or wonder whether this or that event will do the trick, or simply insist that the moment has to come sooner or later.

To all such inquiries and expostulations I have no scrap of comfort to offer. Quite the contrary, what history shows is that a sudden awakening to the realities of a difficult situation is far and away the least likely result of what I’ve called the era of impact, the second of the five stages of collapse. (The first, for those who missed last week’s post, is the era of pretense; the remaining three, which will be covered in the coming weeks, are the eras of response, breakdown, and dissolution.)

The era of impact is the point at which it becomes clear to most people that something has gone wrong with the most basic narratives of a society—not just a little bit wrong, in the sort of way that requires a little tinkering here and there, but really, massively, spectacularly wrong. It arrives when an asset class that was supposed to keep rising in price forever stops rising, does its Wile E. Coyote moment of hang time, and then drops like a stone. It shows up when an apparently entrenched political system, bristling with soldiers and secret police, implodes in a matter of days or weeks and is replaced by a provisional government whose leaders look just as stunned as everyone else. It comes whenever a state of affairs that was assumed to be permanent runs into serious trouble—but somehow it never seems to succeed in getting people to notice just how temporary that state of affairs always was.

Since history is the best guide we’ve got to how such events work out in the real world, I want to take a couple of examples of the kind just outlined and explore them in a little more detail. The stock market bubble of the 1920s makes a good case study on a relatively small scale. In the years leading up to the crash of 1929, stock values in the US stock market quietly disconnected themselves from the economic fundamentals and began what was, for the time, an epic climb into la-la land. There were important if unmentionable reasons for that airy detachment from reality; the most significant was the increasingly distorted distribution of income in 1920s America, which put more and more of the national wealth in the hands of fewer and fewer people and thus gutted the national economy.

It’s one of the repeated lessons of economic history that money in the hands of the rich does much less good for the economy as a whole than money in the hands of the working classes and the poor. The reasoning here is as simple as it is inescapable. Industrial economies survive and thrive on consumer expenditures, but consumer expenditures are limited by the ability of consumers to buy the things they want and need. As money is diverted away from the lower end of the economic pyramid, you get demand destruction—the process by which those who can’t afford to buy things stop buying them—and consumer expenditures fall off. The rich, by contrast, divert a large share of their income out of the consumer economy into investments; the richer they get, the more of the national wealth ends up in investments rather than consumer expenditures; and as consumer expenditures falter, and investments linked to the consumer economy falter in turn, more and more money ends up in illiquid speculative vehicles that are disconnected from the productive economy and do nothing to stimulate demand.

That’s what happened in the 1920s. All through the decade in the US, the rich got richer and the poor got screwed, speculation took the place of productive investment throughout the US economy, and the well-to-do wallowed in the wretched excess chronicled in F. Scott Fitzgerald’s The Great Gatsby while most other people struggled to get by. The whole decade was a classic era of pretense, crowned by the delusional insistence—splashed all over the media of the time—that everyone in the US could invest in the stock market and, since the market was of course going to keep on rising forever, everyone in the US would thus inevitably become rich.

It’s interesting to note that there were people who saw straight through the nonsense and tried to warn their fellow Americans about the inevitable consequences. They were denounced six ways from Sunday by all right-thinking people, in language identical to that used more recently on those of us who’ve had the effrontery to point out that an infinite supply of oil can’t be extracted from a finite planet.  The people who insisted that the soaring stock values of the late 1920s were the product of one of history’s great speculative bubbles were dead right; they had all the facts and figures on their side, not to mention plain common sense; but nobody wanted to hear it.

When the stock market peaked just before the Labor Day weekend in 1929 and started trending down, therefore, the immediate response of all right-thinking people was to insist at the top of their lungs that nothing of the sort was happening, that the market was simply catching its breath before its next great upward leap, and so on. Each new downward lurch was met by a new round of claims along these lines, louder, more dogmatic, and more strident than the one that preceded it, and nasty personal attacks on anyone who didn’t support the delusional consensus filled the media of the time.

People were still saying those things when the bottom dropped out of the market.

Tuesday, October 29, 1929 can reasonably be taken as the point at which the era of pretense gave way once and for all to the era of impact. That’s not because it was the first day of the crash—there had been ghastly slumps on the previous Thursday and Monday, on the heels of two months of less drastic but still seriously ugly declines—but because, after that day, the pundits and the media pretty much stopped pretending that nothing was wrong. Mind you, next to nobody was willing to talk about what exactly had gone wrong, or why it had gone wrong, but the pretense that the good fairy of capitalism had promised Americans happy days forever was out the window once and for all.

It’s crucial to note, though, that what followed this realization was the immediate and all but universal insistence that happy days would soon be back if only everyone did the right thing. It’s even more crucial to note that what nearly everyone identified as “the right thing”—running right out and buying lots of stocks—was a really bad idea that bankrupted many of those who did it, and didn’t help the imploding US economy at all.

It’s probably necessary to talk about this in a little more detail, since it’s been an article of blind faith in the United States for many decades now that it’s always a good idea to buy and hold stocks. (I suspect that stockbrokers have had a good deal to do with the promulgation of this notion.) It’s been claimed that someone who bought stocks in 1929 at the peak of the bubble, and then held onto them, would have ended up in the black eventually, and for certain values of “eventually,” this is quite true—but it took the Dow Jones industrial average until the mid-1950s to return to its 1929 high, and so for a quarter of a century our investor would have been underwater on his stock purchases.
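
Readers who want to check that figure can do the sums themselves; here’s a minimal sketch using the commonly cited nominal milestones for the Dow (approximate closing values, with dividends and inflation left out of the picture).

```python
# Back-of-the-envelope check of the "buy and hold from 1929" story, using the
# commonly cited nominal milestones for the Dow Jones industrial average.
# The figures are approximate closing values; dividends and inflation are ignored.
from datetime import date

peak_value, peak_date = 381.17, date(1929, 9, 3)     # pre-crash high
trough_value, trough_date = 41.22, date(1932, 7, 8)  # Depression-era low
recovery_date = date(1954, 11, 23)                   # first close back above the 1929 peak

drawdown = 1 - trough_value / peak_value
years_underwater = (recovery_date - peak_date).days / 365.25

print(f"peak-to-trough loss: {drawdown:.0%}")                       # roughly 89%
print(f"years spent below the 1929 high: {years_underwater:.1f}")   # about 25
```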

What’s more, the Dow isn’t necessarily a good measure of stocks generally; many of the darlings of the market in the 1920s either went bankrupt in the Depression or never again returned to their 1929 valuations. Nor did the surge of money into stocks in the wake of the 1929 crash stave off the Great Depression, or do much of anything else other than provide a great example of the folly of throwing good money after bad. The moral to this story? In an era of impact, the advice you hear from everyone around you may not be in your best interest.

That same moral can be shown just as clearly in the second example I have in mind, the French Revolution. We talked briefly in last week’s post about the way that the French monarchy and aristocracy blinded themselves to the convulsive social and economic changes that were pushing France closer and closer to a collective explosion on the grand scale, and pursued business as usual long past the point at which business as usual was anything but a recipe for disaster. Even when the struggle between the Crown and the aristocracy forced Louis XVI to convene the États-Généraux—the rarely-held national parliament of France, which had powers more or less equivalent to a constitutional convention in the US—next to nobody expected anything but long rounds of political horse-trading from which some modest shifts in the balance of power might result.

That was before the summer of 1789. On June 17, the deputies of the Third Estate—the representatives of the commoners—declared themselves a National Assembly and staged what amounted to a coup d’etat; on July 14, faced with the threat of a military response from the monarchy, the Parisian mob seized the Bastille, kickstarting a wave of revolt across the country that put government and military facilities in the hands of the revolutionary National Guard and broke the back of the feudal system; on August 4, the National Assembly abolished all feudal rights and legal distinctions between the classes. Over less than two months, a political and social system that had been welded firmly in place for a thousand years all came crashing to the ground.

Those two months marked the end of the era of pretense and the arrival of the era of impact. The immediate response, with a modest number of exceptions among the aristocracy and the inner circles of the monarchy’s supporters, was frantic cheering and an insistence that everything would soon settle into a wonderful new age of peace, prosperity, and liberty. All the overblown dreams of the philosophes about a future age governed by reason were trotted out and treated as self-evident fact. Of course that’s not what happened; once it was firmly in power, the National Assembly used its unchecked authority as abusively as the monarchy had once done; factional struggles spun out of control, and before long mob rule and the guillotine were among the basic facts of life in Revolutionary France. 

Among the most common symptoms of an era of impact, in other words, is the rise of what we may as well call “crackpot optimism”—the enthusiastic and all but universal insistence, in the teeth of the evidence, that the end of business as usual will turn out to be the door to a wonderful new future. In the wake of the 1929 stock market crash, people were urged to pile back into the market in the belief that this would cause the economy to boom again even more spectacularly than before, and most of the people who followed this advice proceeded to lose their shirts. In the wake of the revolution of 1789, likewise, people across France were encouraged to join with their fellow citizens in building the shining new utopia of reason, and a great many of those who followed that advice ended up decapitated or, a little later, dying of gunshot or disease in the brutal era of pan-European warfare that extended almost without a break from the cannonade of Valmy in 1792 to the battle of Waterloo in 1815.

And the present example? That’s a question worth exploring, if only for the utterly pragmatic reason that most of my readers are going to get to see it up close and personal.

That the United States and the industrial world generally are deep in an era of pretense is, I think, pretty much beyond question at this point. We’ve got political authorities, global bankers, and a galaxy of pundits insisting at the top of their lungs that nothing is wrong, everything is fine, and we’ll be on our way to the next great era of prosperity if we just keep pursuing a set of boneheaded policies that have never—not once in the entire span of human history—brought prosperity to the countries that pursued them. We’ve got shelves full of books for sale in upscale bookstores insisting, in the strident language usual to such times, that life is wonderful in this best of all possible worlds, and it’s going to get better forever because, like, we have technology, dude! Across the landscape of the cultural mainstream, you’ll find no shortage of cheerleaders insisting at the top of their lungs that everything’s going to be fine, that even though they said ten years ago that we only have ten years to do something before disaster hits, why, we still have ten years before disaster hits, and when ten more years pass by, why, you can be sure that the same people will be insisting that we have ten more.

This is the classic rhetoric of an era of pretense. Over the last few years, though, it’s seemed to me that the voices of crackpot optimism have gotten more shrill, the diatribes more fact-free, and the logic even shoddier than it was in Bjorn Lomborg’s day, which is saying something. We’ve reached the point that state governments are making it a crime to report on water quality and forbidding officials from using such unwelcome phrases as “climate change.” That’s not the action of people who are confident in their beliefs; it’s the action of a bunch of overgrown children frantically clenching their eyes shut, stuffing their fingers in their ears, and shouting “La, la, la, I can’t hear you.”

That, in turn, suggests that the transition to the era of impact may be fairly close. Exactly when it’s likely to arrive is a complex question, and exactly what’s going to land the blow that will crack the crackpot optimism and make it impossible to ignore the arrival of real trouble is an even more complex one. In 1929, those who hadn’t bought into the bubble could be perfectly sure—and in fact, a good many of them were perfectly sure—that the usual mechanism that brings bubbles to a catastrophic end was about to terminate the boom of the 1920s with extreme prejudice, as indeed it did. In the last decades of the French monarchy, it was by no means clear exactly what sequence of events would bring the Ancien Régime crashing down, but such thoughtful observers as Talleyrand knew that something of the sort was likely to follow the crisis of legitimacy then under way.

The problem with trying to predict the trigger that will bring our current situation to a sudden stop is that we’re in such a target-rich environment. Looking over the potential candidates for the sudden shock that will stick a fork in the well-roasted corpse of business as usual, I’m reminded of the old board game Clue. Will Mr. Boddy’s killer turn out to be Colonel Mustard in the library with a lead pipe, Professor Plum in the conservatory with a candlestick, or Miss Scarlet in the dining room with a rope?

In much the same sense, we’ve got a global economy burdened to the breaking point with more than a quadrillion dollars of unpayable debt; we’ve got a global political system coming apart at the seams as the United States slips toward the usual fate of empires and its rivals circle warily, waiting for the kill; we’ve got a domestic political system here in the US entering a classic prerevolutionary condition under the impact of a textbook crisis of legitimacy; we’ve got a global climate that’s hammered by our rank stupidity in treating the atmosphere as a gaseous sewer for our wastes; we’ve got a global fossil fuel industry that’s frantically trying to pretend that scraping the bottom of the barrel means that the barrel is full, and the list goes on. It’s as though Colonel Mustard, Professor Plum, Miss Scarlet, and the rest of them all ganged up on Mr. Boddy at once, and only the most careful autopsy will be able to determine which of them actually dealt the fatal blow.

In the midst of all this uncertainty, there are three things that can, I think, be said for certain about the end of the current era of pretense and the coming of the era of impact. The first is that it’s going to happen. When something is unsustainable, it’s a pretty safe bet that it won’t be sustained indefinitely, and a society that keeps on embracing policies that swap short-term gains for long-term problems will sooner or later end up awash in the consequences of those policies. Timing such transitions is difficult at best; it’s an old adage among stock traders that the market can stay irrational longer than you can stay solvent. Still, points made above—especially the increasingly shrill tone of the defenders of the existing order—suggest to me that the era of impact may be here within a decade or so at the outside.

The second thing that can be said for certain about the coming era of impact is that it’s not the end of the world. Apocalyptic fantasies are common and popular in eras of pretense, and for good reason; fixating on the supposed imminence of the Second Coming, human extinction, or what have you, is a great way to distract yourself from the real crisis that’s breathing down your neck. If the real crisis in question is partly or wholly a result of your own actions, while the apocalyptic fantasy can be blamed on someone or something else, that adds a further attraction to the fantasy.

The end of industrial civilization will be a long, bitter, painful cascade of conflicts, disasters, and accelerating decline in which a vast number of people are going to die before they otherwise would, and a great many things of value will be lost forever. That’s true of any falling civilization, and the misguided decisions of the last forty years have pretty much guaranteed that the current example is going to have an extra helping of all these unwelcome things. I’ve discussed at length, in earlier posts in the Dark Age America sequence here and in other sequences as well, why the sort of apocalyptic sudden stop beloved of Hollywood scriptwriters is the least likely outcome of the predicament of our time; still, insisting on the imminence and inevitability of some such game-ending event will no doubt be as popular as usual in the years immediately ahead.

The third thing that I think can be said for certain about the coming era of impact, though, is the one that counts. If it follows the usual pattern, as I expect it to do, once the crisis hits there will be serious, authoritative, respectable figures telling everyone exactly what they need to do to bring an end to the troubles and get the United States and the world back on track to renewed peace and prosperity. Taking these pronouncements seriously and following their directions will be extremely popular, and it will almost certainly also be a recipe for unmitigated disaster. If forewarned is forearmed, as the saying has it, this is a piece of firepower to keep handy as the era of pretense winds down. In next week’s post, we’ll talk about comparable weaponry relating to the third stage of collapse—the era of response.

Wednesday, May 13, 2015

The Era of Pretense

I’ve mentioned in previous posts here on The Archdruid Report the educational value of the comments I receive from readers in the wake of each week’s essay. My post two weeks ago on the death of the internet was unusually productive along those lines. One of the comments I got in response to that post gave me the theme for last week’s essay, but there was at least one other comment calling for the same treatment. Like the one that sparked last week’s post, it appeared on one of the many other internet forums on which The Archdruid Report appears, and it unintentionally pointed up a common and crucial failure of imagination that shapes, or rather misshapes, the conventional wisdom about our future.

Curiously enough, the point that set off the commenter in question was the same one that incensed the author of the denunciation mentioned in last week’s post: my suggestion in passing that fifty years from now, most Americans may not have access to electricity or running water. The commenter pointed out angrily that I’d claimed that the twilight of industrial civilization would be a ragged arc of decline over one to three centuries. Now, he claimed, I was saying that it was going to take place in the next fifty years, and this apparently convinced him that everything I said ought to be dismissed out of hand.

I run into this sort of confusion all the time. If I suggest that the decline and fall of a civilization usually takes several centuries, I get accused of inconsistency if I then note that one of the sharper downturns included in that process may be imminent. If I point out that the United States is likely within a decade or two of serious economic and political turmoil, driven partly by the implosion of its faltering global hegemony and partly by a massive crisis of legitimacy that’s all but dissolved the tacit contract between the existing order of US society and the masses who passively support it, I get accused once again of inconsistency if I then say that whatever comes out the far side of that crisis—whether it’s a battered and bruised United States or a patchwork of successor states—will then face a couple of centuries of further decline and disintegration before the deindustrial dark age bottoms out.

Now of course there’s nothing inconsistent about any of these statements. The decline and fall of a civilization isn’t a single event, or even a single linear process; it’s a complex fractal reality composed of many different events on many different scales in space and time. If it takes one to three centuries, as usual, those centuries are going to be taken up by an uneven drumbeat of wars, crises, natural disasters, and assorted breakdowns on a variety of time frames with an assortment of local, regional, national, or global effects. The collapse of US global hegemony is one of those events; the unraveling of the economic and technological framework that currently provides most Americans with electricity and running water is another, but neither of those is anything like the whole picture.

It’s probably also necessary to point out that any of my readers who think that being deprived of electricity and running water is the most drastic kind of collapse imaginable have, as the saying goes, another think coming. Right now, in our oh-so-modern world, there are billions of people who get by without regular access to electricity and running water, and most of them aren’t living under dark age conditions. A century and a half ago, when railroads, telegraphs, steamships, and mechanical printing presses were driving one of history’s great transformations of transport and information technology, next to nobody had electricity or running water in their homes. The technologies of 1865 are not dark age technologies; in fact, the gap between 1865 technologies and dark age technologies is considerably greater, by most metrics, than the gap between 1865 technologies and the ones we use today.

Furthermore, whether or not Americans have access to running water and electricity may not have as much to say about the future of industrial society everywhere in the world as the conventional wisdom would suggest.  I know that some of my American readers will be shocked out of their socks to hear this, but the United States is not the whole world. It’s not even the center of the world. If the United States implodes over the next two decades, leaving behind a series of bankrupt failed states to squabble over its territory and the little that remains of its once-lavish resource base, that process will be a great source of gaudy and gruesome stories for the news media of the world’s other continents, but it won’t affect the lives of the readers of those stories much more than equivalent events in Africa and the Middle East affect the lives of Americans today.

As it happens, over the next one to three centuries, the benefits of industrial civilization are going to go away for everyone. (The costs will be around a good deal longer—in the case of the nuclear wastes we’re so casually heaping up for our descendants, a good quarter of a million years, but those and their effects are rather more localized than some of today’s apocalyptic rhetoric likes to suggest.) The reasoning here is straightforward. White’s Law, one of the fundamental principles of human ecology, states that economic development is a function of energy per capita; the immense treasure trove of concentrated energy embodied in fossil fuels, and that alone, made possible the sky-high levels of energy per capita that gave the world’s industrial nations their brief era of exuberance; as fossil fuels deplete, and remaining reserves require higher and higher energy inputs to extract, the levels of energy per capita the industrial nations are used to having will go away forever.
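
For what it’s worth, the law as Leslie White originally framed it is usually written

    C = E × T

where C stands for the degree of cultural (and economic) development, E for the energy harnessed per capita per year, and T for the efficiency of the technology that puts that energy to work. Efficiency improves only incrementally and runs into hard physical limits, so once E begins its long decline, no plausible value of T makes up the difference.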

It’s important to be clear about this. Fossil fuels aren’t simply one energy source among others; in terms of concentration, usefulness, and fungibility—that is, the ability to be turned into any other form of energy that might be required—they’re in a category all by themselves. Repeated claims that fossil fuels can be replaced with nuclear power, renewable energy resources, or what have you sound very good on paper, but every attempt to put those claims to the test so far has either gone belly up in short order, or become a classic subsidy dumpster surviving purely on a diet of government funds and mandates.

Three centuries ago, the earth’s fossil fuel reserves were the largest single deposit of concentrated energy in this part of the universe; now we’ve burnt through nearly all the easily accessible reserves, and we’re scrambling to keep the tottering edifice of industrial society going by burning through the dregs that remain. As those run out, the remaining energy resources—almost all of them renewables—will certainly sustain a variety of human societies, and some of those will be able to achieve a fairly high level of complexity and maintain some kinds of advanced technologies. The kind of absurd extravagance that passes for a normal standard of living among the more privileged inmates of the industrial nations is another matter, and as the fossil fuel age sunsets out, it will end forever.

The fractal trajectory of decline and fall mentioned earlier in this post is simply the way this equation works out on the day-to-day scale of ordinary history. Still, those of us who happen to be living through a part of that trajectory might reasonably be curious about how it’s likely to unfold in our lifetimes. I’ve discussed in a previous series of posts, and in my book Decline and Fall: The End of Empire and the Future of Democracy in 21st Century America, how the end of US global hegemony is likely to unfold, but as already noted, that’s only a small portion of the broader picture. Is a broader view possible?

Fortunately history, the core resource I’ve been using to try to make sense of our future, has plenty to say about the broad patterns that unfold when civilizations decline and fall. Now of course I know all I have to do is mention that history might be relevant to our present predicament, and a vast chorus of voices across the North American continent and around the world will bellow at rooftop volume, “But it’s different this time!” With apologies to my regular readers, who’ve heard this before, it’s probably necessary to confront that weary thoughtstopper again before we proceed.

As I’ve noted before, claims that it’s different this time are right where it doesn’t matter and wrong where it counts.  Predictions made on the basis of history—and not just by me—have consistently predicted events over the last decade or so far more accurately than predictions based on the assumption that history doesn’t matter. How many times, dear reader, have you heard someone insist that industrial civilization is going to crash to ruin in the next six months, and then watched those six months roll merrily by without any sign of the predicted crash? For that matter, how many times have you heard someone insist that this or that policy that’s never worked any other time that it’s been tried, or this or that piece of technological vaporware that’s been the subject of failed promises for decades, will inevitably put industrial society back on its alleged trajectory to the stars—and how many times has the policy or the vaporware been quietly shelved, and something else promoted using the identical rhetoric, when it turned out not to perform as advertised?

It’s been a source of wry amusement to me to watch the same weary, dreary, repeatedly failed claims of imminent apocalypse and inevitable progress being rehashed year after year, varying only in the fine details of the cataclysm du jour and the techno-savior du jour, while the future nobody wants to talk about is busily taking shape around us. Decline and fall isn’t something that will happen sometime in the conveniently distant future; it’s happening right now in the United States and around the world. The amusement, though, is tempered with a sense of familiarity, because the period in which decline is under way but nobody wants to admit that fact is one of the recurring features of the history of decline.

There are, very generally speaking, five broad phases in the decline and fall of a civilization. I know it’s customary in historical literature to find nice dull labels for such things, but I’m in a contrary mood as I write this, so I’ll give them unfashionably colorful names: the eras of pretense, impact, response, breakdown, and dissolution. Each of these is complex enough that it’ll need a discussion of its own; this week, we’ll talk about the era of pretense, which is the one we’re in right now.

Eras of pretense are by no means limited to the decline and fall of civilizations. They occur whenever political, economic, or social arrangements no longer work, but the immediate costs of admitting that those arrangements don’t work loom considerably larger in the collective imagination than the future costs of leaving those arrangements in place. It’s a curious but consistent wrinkle of human psychology that this happens even if those future costs soar right off the scale of frightfulness and lethality; if the people who would have to pay the immediate costs don’t want to do so, in fact, they will reliably and cheerfully pursue policies that lead straight to their own total bankruptcy or violent extermination, and never let themselves notice where they’re headed.

Speculative bubbles are a great setting in which to watch eras of pretense in full flower. In the late phases of a bubble, when it’s clear to anyone who has two spare neurons to rub together that the boom du jour is cobbled together of equal parts delusion and chicanery, the people who are most likely to lose their shirts in the crash are the first to insist at the top of their lungs that the bubble isn’t a bubble and their investments are guaranteed to keep on increasing in value forever. Those of my readers who got the chance to watch some of their acquaintances go broke in the real estate bust of 2008-9, as I did, will have heard this sort of self-deception at full roar; those who missed the opportunity can make up for the omission by checking out the ongoing torrent of claims that the soon-to-be-late fracking bubble is really a massive energy revolution that will make America wealthy and strong again.

The history of revolutions offers another helpful glimpse at eras of pretense. France in the decades before 1789, to cite a conveniently well-documented example, was full of people who had every reason to realize that the current state of affairs was hopelessly unsustainable and would have to change. The things about French politics and economics that had to change, though, were precisely those things that the French monarchy and aristocracy were unwilling to change, because any such reforms would have cost them privileges they’d had since time out of mind and were unwilling to relinquish.

Louis XV, who finished up his long and troubled reign a supreme realist, is said to have muttered “Après moi, le déluge”—“Once I’m gone, this sucker’s going down” may not be a literal translation, but it catches the flavor of the utterance—but that degree of clarity was rare in his generation, and all but absent in that of his feckless successor. Thus the courtiers and aristocrats of the Old Regime amused themselves at the nation’s expense, dabbled in avant-garde thought, and kept their eyes tightly closed to the consequences of their evasions of looming reality, while the last opportunities to excuse themselves from a one-way trip to visit the guillotine and spare France the cataclysms of the Terror and the Napoleonic wars slipped silently away.

That’s the bitter irony of eras of pretense. Under most circumstances, they’re the last period when it would be possible to do anything constructive on the large scale about the crisis looming immediately ahead, but the mass evasion of reality that frames the collective thinking of the time stands squarely in the way of any such constructive action. In the era of pretense before a speculative bust, people who could have quietly cashed in their positions and pocketed their gains double down on their investments, and guarantee that they’ll be ruined once the market stops being liquid. In the era of pretense before a revolution, in the same way, those people and classes that have the most to lose reliably take exactly those actions that ensure that they will in fact lose everything. If history has a sense of humor, this is one of the places that it appears in its most savage form.

The same points are true, in turn, of the eras of pretense that precede the downfall of a civilization. In a good many cases, where too few original sources survive, the age of pretense has to be inferred from archeological remains. We don’t know what motives inspired the ancient Mayans to build their biggest pyramids in the years immediately before the Terminal Classic period toppled over into a savage political and demographic collapse, but it’s hard to imagine any such project being set in motion without the usual evasions of an era of pretense being involved. Where detailed records of dead civilizations survive, though, the sort of rhetorical handwaving common to bubbles before the bust and decaying regimes on the brink of revolution shows up with knobs on. Thus the panegyrics of the Roman imperial court waxed ever more lyrical and bombastic about Rome’s invincibility and her civilizing mission to the nations as the Empire stumbled deeper into its terminal crisis, echoing any number of other court poets in any number of civilizations in their final hours.

For that matter, a glance through classical Rome’s literary remains turns up the remarkable fact that those of her essayists and philosophers who expressed worries about her survival wrote, almost without exception, during the Republic and the early Empire; the closer the fall of Rome actually came, the more certainty Roman authors expressed that the Empire was eternal and the latest round of troubles was just one more temporary bump on the road to peace and prosperity. It took the outsider’s vision of Augustine of Hippo to proclaim that Rome really was falling—and even that could only be heard once the Visigoths sacked Rome and the era of pretense gave way to the age of impact.

The present case is simply one more example to add to an already lengthy list. In the last years of the nineteenth century, it was common for politicians, pundits, and mass media in the United States, the British empire, and other industrial nations to discuss the possibility that the advanced civilization of the time might be headed for the common fate of nations in due time. The intellectual history of the twentieth century is, among other things, a chronicle of how that discussion was shoved to the margins of our collective discourse, just as the ecological history of the same century is among other things a chronicle of how the worries of the previous era became the realities of the one we’re in today. The closer we’ve moved toward the era of impact, that is, the more unacceptable it has become for anyone in public life to point out that the problems of the age are not just superficial.

Listen to the pablum that passes for political discussion in Washington DC or the mainstream US media these days, or the even more vacuous noises being made by party flacks as the country stumbles wearily toward yet another presidential election. That the American dream of upward mobility has become an American nightmare of accelerating impoverishment outside the narrowing circle of the kleptocratic rich, that corruption and casual disregard for the rule of law are commonplace in political institutions from local to Federal levels, that our medical industry charges more than any other nation’s and still provides the worst health care in the industrial world, that our schools no longer teach anything but contempt for learning, that the national infrastructure and built environment are plunging toward Third World conditions at an ever-quickening pace, that a brutal and feckless foreign policy embraced by both major parties is alienating our allies while forcing our enemies to set aside their mutual rivalries and make common cause against us: these are among the issues that matter, but they’re not the issues you’ll hear discussed as the latest gaggle of carefully airbrushed candidates go through their carefully scripted elect-me routines on their way to the 2016 election.

If history teaches anything, though, it’s that eras of pretense eventually give way to eras of impact. That doesn’t mean that the pretense will go away—long after Alaric the Visigoth sacked Rome, for example, there were still plenty of rhetors trotting out the same tired clichés about Roman invincibility—but it does mean that a significant number of people will stop finding the pretense relevant to their own lives. How that happens in other historical examples, and how it might happen in our own time, will be the theme of next week’s post.

Wednesday, May 06, 2015

The Whisper of the Shutoff Valve

Last week’s post on the impending decline and fall of the internet fielded a great many responses. That was no surprise, to be sure; nor was I startled in the least to find that many of them rejected the thesis of the post with some heat. Contemporary pop culture’s strident insistence that technological progress is a clock that never runs backwards made such counterclaims inevitable.

Still, it’s always educational to watch the arguments fielded to prop up the increasingly shaky edifice of the modern mythology of progress, and the last week was no exception. A response I found particularly interesting from that standpoint appeared on one of the many online venues where Archdruid Report posts appear. One of the commenters insisted that my post should be rejected out of hand as mere doom and gloom; after all, he pointed out, it was ridiculous for me to suggest that fifty years from now, a majority of the population of the United States might be without reliable electricity or running water.

I’ve made the same prediction here and elsewhere a good many times. Each time, most of my readers or listeners seem to have taken it as a piece of sheer rhetorical hyperbole. The electrical grid and the assorted systems that send potable water flowing out of faucets are so basic to the rituals of everyday life in today’s America that their continued presence is taken for granted.  At most, it’s conceivable that individuals might choose not to connect to them; there’s a certain amount of talk about off-grid living here and there in the alternative media, for example.  That people who want these things might not have access to them, though, is pretty much unthinkable.

Meanwhile, in Detroit and Baltimore, tens of thousands of residents are in the process of losing their access to water and electricity.

The situation in both cities is much the same, and there’s every reason to think that identical headlines will shortly appear in reference to other cities around the nation. Not that many decades ago, Detroit and Baltimore were important industrial centers with thriving economies. Along with more than a hundred other cities in America’s Rust Belt, they were thrown under the bus with the first wave of industrial offshoring in the 1970s.  The situation for both cities has only gotten worse since that time, as the United States completed its long transition from a manufacturing economy producing goods and services to a bubble economy that mostly produces unpayable IOUs.

These days, the middle-class families whose tax payments propped up the expansive urban systems of an earlier day have long since moved out of town. Most of the remaining residents are poor, and the ongoing redistribution of wealth in America toward the very rich and away from everyone else has driven down the income of the urban poor to the point that many of them can no longer afford to pay their water and power bills. City utilities in Detroit and Baltimore have been sufficiently sensitive to political pressures that large-scale utility shutoffs have been delayed, but shifts in the political climate in both cities are bringing the delays to an end; water bills have increased steadily, more and more people have been unable to pay them, and the result is as predictable as it is brutal.

The debate over the Detroit and Baltimore shutoffs has followed the usual pattern, as one side wallows in bash-the-poor rhetoric while the other side insists plaintively that access to utilities is a human right. Neither side seems to be interested in talking about the broader context in which these disputes take shape. There are two aspects to that broader context, and it’s a tossup which is the more threatening.

The first aspect is the failure of the US economy to recover in any meaningful sense from the financial crisis of 2008. Now of course politicians from Obama on down have gone into overtime grandstanding about the alleged recovery we’re in. I invite any of my readers who bought into that rhetoric to try the following simple experiment. Go to your favorite internet search engine and look up how much the fracking industry has added to the US gross domestic product each year from 2009 to 2014. Now subtract that figure from the US gross domestic product for each of those years, and see how much growth there’s actually been in the rest of the economy since the real estate bubble imploded.

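For readers who’d like to try that experiment, the arithmetic itself takes only a few lines of Python. The sketch below uses hypothetical placeholder figures rather than actual GDP or fracking statistics; plug in whatever numbers your own search turns up and the subtraction works the same way.

```python
# Hypothetical placeholder figures, NOT real BEA or fracking-industry data.
# Values are nominal US GDP and estimated fracking contribution, in trillions of dollars.
gdp_total = {2009: 14.4, 2010: 15.0, 2011: 15.5, 2012: 16.2, 2013: 16.7, 2014: 17.4}
fracking_contribution = {2009: 0.1, 2010: 0.2, 2011: 0.3, 2012: 0.4, 2013: 0.5, 2014: 0.6}

years = sorted(gdp_total)
# Remove the fracking contribution from each year's total.
rest_of_economy = {y: gdp_total[y] - fracking_contribution[y] for y in years}

# Year-over-year growth of the economy with the fracking contribution stripped out.
for prev, curr in zip(years, years[1:]):
    growth = (rest_of_economy[curr] - rest_of_economy[prev]) / rest_of_economy[prev]
    print(f"{prev}-{curr}: ex-fracking growth {growth:+.1%}")
```
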
What you’ll find, if you take the time to do that, is that the rest of the US economy has been flat on its back gasping for air for the last five years. What makes this even more problematic, as I’ve noted in several previous posts here, is that the great fracking boom about which we’ve heard so much for the last five years was never actually the game-changing energy revolution its promoters claimed; it was simply another installment in the series of speculative bubbles that has largely replaced constructive economic activity in this country over the last two decades or so.

What’s more, it’s not the only bubble currently being blown, and it may not even be the largest. We’ve also got a second tech-stock bubble, with money-losing internet corporations racking up absurd valuations in the stock market while they burn through millions of dollars of venture capital; we’ve got a student loan bubble, in which billions of dollars of loans that will never be paid back have been bundled, packaged, and sold to investors just like all those no-doc mortgages were a decade ago; car loans are getting the same treatment; the real estate market is fizzing again in many urban areas as investors pile into another round of lavishly marketed property investments—well, I could go on for some time. It’s entirely possible that if all the bubble activity were to be subtracted from the last five years or so of GDP, the result would show an economy in freefall.

Certainly that’s the impression that emerges if you take the time to check out those economic statistics that aren’t being systematically jiggered by the US government for PR purposes. The number of long-term unemployed in America is at an all-time high; roads, bridges, and other basic infrastructure are falling to pieces; measurements of US public health—generally considered a good proxy for the real economic condition of the population—are well below those of other industrial countries, heading toward Third World levels; abandoned shopping malls litter the landscape while major retailers announce more than 6000 store closures. These are not things you see in an era of economic expansion, or even one of relative stability; they’re markers of decline.

The utility shutoffs in Detroit and Baltimore are further symptoms of the same broad process of economic unraveling. It’s true, as pundits in the media have been insisting since the story broke, that utilities get shut off for nonpayment of bills all the time. It’s equally true that shutting off the water supply of 20,000 or 30,000 people all at once is pretty much unprecedented. Both cities, please note, have had very large populations of poor people for many decades now. Those who like to blame a “culture of poverty” for the tangled relationship between US governments and the American poor, and of course that trope has been rehashed by some of the pundits just mentioned, haven’t yet gotten around to explaining how the culture of poverty all at once inspired tens of thousands of people who had been paying their utility bills to stop doing so.

There are plenty of good reasons, after all, why poor people who used to pay their bills can’t do so any more. Standard business models in the United States used to take it for granted that the best way to run the staffing dimension of any company, large or small, was to have as many full-time positions as possible and to use raises and other practical incentives to encourage employees who were good at their jobs to stay with the company. That approach has become increasingly unfashionable in today’s America, partly due to perverse regulatory incentives that penalize employers for offering full-time positions, and partly due to the emergence of attitudes in corner offices that treat employees as just another commodity. (I doubt it’s any kind of accident that most corporations nowadays refer to their employment offices as “human resource departments.” What do you do with a resource? You exploit it.)

These days, most of the jobs available to the poor are part-time, pay very little, and include nasty little clawbacks in the form of requirements that employees pay out of pocket for uniforms, equipment, and other things that employers used to provide as a matter of course. Meanwhile housing prices and rents are rising well above their post-2008 dip, and a great many other necessities are becoming more costly—inflation may be under control, or so the official statistics say, but anyone who’s been shopping at the same grocery store for the last eight years knows perfectly well that prices kept on rising anyway.

So you’ve got falling incomes running up against rising costs for food, rent, and utilities, among other things. In the resulting collision, something’s got to give, and for tens of thousands of poor Detroiters and Baltimoreans, what gave first was the ability to keep current on their water bills. Expect to see the same story playing out across the country as more people on the bottom of the income pyramid find themselves in the same situation. What you won’t hear in the media, though it’s visible enough if you know where to look and are willing to do so, is that people above the bottom of the income pyramid are also losing ground, being forced down toward economic nonpersonhood. From the middle classes down, everyone’s losing ground.

That process doesn’t extend any higher up the income ladder than the middle class, to be sure. It’s been pointed out repeatedly that over the last four decades or so, the distribution of wealth in America has skewed further and further out of balance, with the top 20% of incomes taking a larger and larger share at the expense of everybody else. That’s an important factor in bringing about the collision just described. Some thinkers on the radical fringes of American society, which is the only place in the US you can talk about such things these days, have argued that the raw greed of the well-to-do is the sole reason why so many people lower down the ladder are being pushed further down still.

Scapegoating rhetoric of that sort is always comforting, because it holds out the promise—theoretically, if not practically—that something can be done about the situation. If only the thieving rich could be lined up against a convenient brick wall and removed from the equation in the time-honored fashion, the logic goes, people in Detroit and Baltimore could afford to pay their water bills!  I suspect we’ll hear such claims increasingly often as the years pass and more and more Americans find their access to familiar comforts and necessities slipping away.  Simple answers are always popular in such times, not least when the people being scapegoated go as far out of their way to make themselves good targets for such exercises as the American rich have done in recent decades.

John Kenneth Galbraith’s equation of the current US political and economic elite with the French aristocracy on the eve of revolution rings even more true than it did when he wrote it back in 1992, in the pages of The Culture of Contentment. The unthinking extravagances, the casual dismissal of the last shreds of noblesse oblige, the obsessive pursuit of personal advantages and private feuds without the least thought of the potential consequences, the bland inability to recognize that the power, privilege, wealth, and sheer survival of the aristocracy depended on the system the aristocrats themselves were destabilizing by their actions—it’s all there, complete with sprawling overpriced mansions that could just about double for Versailles. The urban mobs that played so large a role back in 1789 are warming up for their performances as I write these words; the only thing left to complete the picture is a few tumbrils and a guillotine, and those will doubtless arrive on cue.

The senility of the current US elite, as noted in a previous post here, is a massive political fact in today’s America. Still, it’s not the only factor in play here. Previous generations of wealthy Americans recognized without too much difficulty that their power, prosperity, and survival depended on the willingness of the rest of the population to put up with their antics. Several times already in America’s history, elite groups have allied with populist forces to push through reforms that sharply weakened the power of the wealthy elite, because they recognized that the alternative was a social explosion even more destructive to the system on which elite power depends.

I suppose it’s possible that the people currently occupying the upper ranks of the political and economic pyramid in today’s America are just that much more stupid than their equivalents in the Jacksonian, Progressive, and New Deal eras. Still, there’s at least one other explanation to hand, and it’s the second of the two threatening contextual issues mentioned earlier.

Until the nineteenth century, fresh running water piped into homes for everyday use was purely an affectation of the very rich in a few very wealthy and technologically adept societies. Sewer pipes to take dirty water and human wastes out of the house belonged in the same category. This wasn’t because nobody knew how plumbing works—the Romans had competent plumbers, for example, and water faucets and flush toilets were to be found in Roman mansions of the imperial age. The reason those same things weren’t found in every Roman house was economic, not technical.

Behind that economic issue lay an ecological reality.  White’s Law, one of the foundational principles of human ecology, states that economic development is a function of energy per capita. For a society before the industrial age, the Roman Empire had an impressive amount of energy per capita to expend; control over the agricultural economy of the Mediterranean basin, modest inputs from sunlight, water and wind, and a thriving slave industry fed by the expansion of Roman military power all fed into the capacity of Roman society to develop itself economically and technically. That’s why rich Romans had running water and iced drinks in summer, while their equivalents in ancient Greece a few centuries earlier had to make do without either one.

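White’s Law lends itself to a one-line calculation: divide the total energy a society captures each year by its population. The toy sketch below compares two imaginary societies using invented figures; the names and numbers are assumptions for illustration only, not historical measurements of Rome or anywhere else.

```python
def energy_per_capita(total_energy_gj: float, population: int) -> float:
    """Average annual energy available per person, in gigajoules."""
    return total_energy_gj / population

# Invented figures for two imaginary societies, for illustration only:
# (total annual energy captured in GJ, population)
societies = {
    "agrarian empire":      (5.0e8,  5_000_000),
    "fossil-fueled nation": (2.0e10, 5_000_000),
}

for name, (energy, pop) in societies.items():
    print(f"{name}: {energy_per_capita(energy, pop):,.0f} GJ per person per year")
```
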
Fossil fuels gave industrial civilization a supply of energy many orders of magnitude greater than any previous human civilization has had—a supply vast enough that the difference remains huge even after the immense expansion of population that followed the industrial revolution. There was, however, a catch—or, more precisely, two catches. To begin with, fossil fuels are finite, nonrenewable resources; no matter how much handwaving is employed in the attempt to obscure this point—and whatever else might be in short supply these days, that sort of handwaving is not—every barrel of oil, ton of coal, or cubic foot of natural gas that’s burnt takes the world one step closer to the point at which there will be no economically extractable reserves of oil, coal, or natural gas at all.

That’s catch #1. Catch #2 is subtler, and considerably more dangerous. Oil, coal, and natural gas don’t leap out of the ground on command. They have to be extracted and processed, and this takes energy. Companies in the fossil fuel industries have always targeted the deposits that cost less to extract and process, for obvious economic reasons. What this means, though, is that over time, a larger and larger fraction of the energy yield of oil, coal, and natural gas has to be put right back into extracting and processing oil, coal, and natural gas—and this leaves less and less for all other uses.

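That squeeze can be roughed out numerically. The sketch below assumes an energy-return ratio in the style of EROEI (units of energy delivered per unit of energy invested in extraction and processing) and shows how the surplus left over for the rest of the economy shrinks as that ratio falls. The ratios used are illustrative assumptions, not measured industry figures.

```python
def net_energy(gross: float, return_ratio: float) -> float:
    """Energy left for other uses after reinvesting the share needed to extract it.

    return_ratio is units of energy delivered per unit invested (EROEI-style);
    the ratios below are illustrative assumptions, not measured values.
    """
    return gross * (1.0 - 1.0 / return_ratio)

gross_production = 100.0  # arbitrary units of gross energy output

# As the return ratio falls, the surplus shrinks much faster than gross output does.
for ratio in (50, 20, 10, 5, 2):
    surplus = net_energy(gross_production, ratio)
    print(f"return ratio {ratio:>2}:1 -> surplus {surplus:5.1f} of {gross_production:.0f} units")
```

At a 50:1 ratio the reinvestment is barely noticeable; at 5:1 a fifth of everything produced vanishes back into production, and at 2:1 half of it does.
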
That’s the vise that’s tightening around the American economy these days. The great fracking boom, to the extent that it wasn’t simply one more speculative gimmick aimed at the pocketbooks of chumps, was an attempt to make up for the ongoing decline of America’s conventional oilfields by going after oil that was far more expensive to extract. The fact that none of the companies at the heart of the fracking boom ever turned a profit, even when oil brought more than $100 a barrel, gives some sense of just how costly shale oil is to get out of the ground. The financial cost of extraction, though, is a proxy for the energy cost of extraction—the amount of energy, and of the products of energy, that had to be thrown into the task of getting a little extra oil out of marginal source rock.

Energy needed to extract energy, again, can’t be used for any other purpose. It doesn’t contribute to the energy surplus that makes economic development possible. As the energy industry itself takes a bigger bite out of each year’s energy production, every other economic activity loses part of the fuel that makes it run. That, in turn, is the core reason why the American economy is on the ropes, America’s infrastructure is falling to bits—and Americans in Detroit and Baltimore are facing a transition to Third World conditions, without electricity or running water.

I suspect, for what it’s worth, that the shutoff notices being mailed to tens of thousands of poor families in those two cities are a good working model for the way that industrial civilization itself will wind down. It won’t be sudden; for decades to come, there will still be people who have access to what Americans today consider the ordinary necessities and comforts of everyday life; there will just be fewer of them each year. Outside that narrowing circle, the number of economic nonpersons will grow steadily, one shutoff notice at a time.

As I’ve pointed out in previous posts, the line of fracture between the senile elite and what Arnold Toynbee called the internal proletariat—the people who live within a failing civilization’s borders but receive essentially none of its benefits—eventually opens into a chasm that swallows what’s left of the civilization. Sometimes the tectonic processes that pull the chasm open are hard to miss, but there are times when they’re a good deal more difficult to sense in action, and this is one of those latter times. Listen to the whisper of the shutoff valve, and you’ll hear tens of thousands of Americans being cut off from basic services the rest of us, for the time being, still take for granted.