Wednesday, June 26, 2013

Imperfect Storms

Last week’s post on the need to check our narratives against the evidence of history turned out to be rather more timely than I expected. Over the weekend, following hints and nods from the Fed that the current orgy of quantitative easing may not continue, stock and bond markets around the globe did a swan dive. In response, with the predictability of a well-oiled cuckoo clock, the usual claims that total economic collapse is imminent have begun to spread across the peak oil blogosphere.

As I write these words, the slump seems to have stabilized, but it’s a safe bet that if it resumes—and there’s reason to think that it will—the same claims will get plenty of air time, as they did during the last half dozen market slumps.  If that happens, it’s an equally safe bet that a year from now, those who made and circulated those predictions will once again have egg on their faces, and the peak oil movement will have suffered another own goal, inflicted by those who have forgotten that the ability to offer accurate predictions about an otherwise baffling future is one of the few things that gives the peak oil movement any claim on the attention of the rest of the world.

Mind you, worries about the state of the world economy are far from misplaced just now.  In the wake of the 2008 crash, financial authorities in the US—first the Department of the Treasury, backed by Congressional appropriations, and then the Federal Reserve, backed by nothing but its own insistence that it had the right to spin the presses as enthusiastically as it wished—flooded markets in the US and overseas with a tsunami of money, in an attempt to forestall the contraction of the money supply that usually follows a market crash and ushers in a recession or worse.  The theory behind that exercise was outlined by Ben Bernanke in his famous “helicopter speech” in 2002:  keep the money supply from contracting in the wake of a market crash, if necessary by dumping money out of helicopters, and the economy will recover from the effects of the crash and return to robust growth in short order.

That theory was put to the test, and it failed. Five years after the 2008 crash, the global economy has not returned to robust growth. Across America and Europe, in the teeth of quantitative easing, hard times of a kind rarely seen since the Great Depression have become widespread. Official claims that happy days will be here again just as soon as everybody but the rich accepts one more round of belt-tightening (also a feature of the Great Depression, by the way) are increasingly hard to sustain in the face of the flat failure of current policies to bring anything but more poverty.  Meanwhile, the form taken by quantitative easing in the present case—massive purchases of worthless securities by central banks—has national governments drowning in debt, central banks burdened with mountains of the kind of financial paper that makes junk bonds look secure, and no one better off except a financial industry that has become increasingly disconnected from political and economic realities.

Thus the boom is coming down. On the 18th of this month, Obama commented in a media interview that Bernanke had been at the Fed’s helm “longer than he wanted,” an unsubtle way of announcing that the chairman would not be appointed to a third term in 2014. Shortly thereafter, the Fed let it be known that the ongoing quantitative easing program would be tapered off toward the end of the year, and the general manager of the Bank for International Settlements (BIS), one of the core institutions of global finance, gave a speech noting that central banks had gone too far in spinning the presses, and risked problems as bad as the ones quantitative easing was supposed to cure.

Markets around the world panicked, and for good reason. Most of the cash from quantitative easing in the US and elsewhere got paid out to large banks, on the theory that it would go to borrowers and drive another round of economic growth. That didn’t happen, because borrowing at interest only makes sense when growth can be expected to exceed the interest rate.  Whether it’s 18-year-olds taking out student loans to go to college, business owners issuing corporate paper to finance expansion, or what have you, the assumption is that the return on investment will be high enough to cover the cost of interest and still yield a profit. In the stagnant economy of the last five years, that assumption has not fared well, and where government guarantees didn’t distort the process—as happened with student loans in the US, for example—the result was a dearth of new loans, and thus a dearth of new economic activity.

Unused money in a bank’s coffers these days is about as secure as it is in the pocket of your average eight-year-old, though, and for most of the last five years, the world’s speculative markets were among the standard places for banks to go and spend it. That helped drive a series of boomlets in various kinds of speculative paper, and pushed some market indices to all-time highs. The end of the quantitative easing gravy train very likely means the end of that process, and for an assortment of other fiscal gimmicks that have been surfing the waves of cheap money pouring out of the Fed and other central banks in recent years. A prolonged bear market is thus likely.

Could that bear market trigger a run on the investment banks that, under the cozy illusion that they’re still too big to fail, have become too arrogant to survive?  Very possibly.  The twilight of “Helicopter Ben” and his spin-the-presses policies also marks the end of the line for a coterie of economists and bankers, most of them associated with Goldman Sachs, who came to power after the 2008 crisis insisting that they knew how to fix the broken economy. They didn’t, and they are now in the process of discovering—as the neoconservatives found out before them—that while the American political class has almost limitless patience with corruption and venality, it has no tolerance at all for failure.  I expect to see a fair number of prominent figures in the nation’s financial bureaucracies headed back to the same genteel obscurity that swallowed the neocons, and it’s by no means unlikely that Goldman Sachs or some other big financial firm may be allowed to crash and burn as part of the payback.

And beyond that? One way or another, the end of quantitative easing bids fair to trigger a wave of harsh economic readjustments, government defaults, corporate bankruptcies, and misery for all. An immense overhang of unpayable debt will have to be liquidated, and there’s no way for that to happen without a lot of pain.  That may well involve a recession harsh enough that the D-word will need to be pulled out of cold storage and used instead. Will the remaining scraps of democratic governance in Europe and America, and the increasingly fragile peace among the world’s military powers, survive several years of that?  That’s a good question, to which history offers mostly unencouraging answers.

Still, these deeply troubling possibilities aren’t the things you’ll hear aired across the more apocalyptic end of the peak oil scene, if recent declines in global stock markets continue. Rather, if experience is any guide, we can expect a rehash of the claims that the next big economic crisis will cause a total implosion of global financial systems, leading to a credit collapse that will prevent farmers from buying seed for next year’s crops, grocery stores from stocking their shelves, factories from producing anything at all, and thus land us all plop in the middle of the Dark Ages in short order.

It’s here that the issue discussed in last week’s post becomes particularly relevant, because there’s a difference—a big one—between the imaginary cataclysms that fill so much space on the doomward end of the blogosphere and what actually happens. Financial history is full of markets that imploded, economies that plunged into recession and depression, currencies that became worthless, and all the other stage properties of current speculations concerning total economic collapse, and it also has quite detailed things to say about what followed each of these crises. Without too much trouble, given access to the internet or a decently stocked library, you can find out what happens when a highly centralized economic system comes apart at the seams, no matter what combination of factors does the deed. The difference between what actually happens and the whole range of current fantasies about instant doom can be summed up in a single phrase: negative feedback.

That’s the process by which a thermostat works: when the house gets cold, the furnace turns on and heats it back up; when the house gets too warm, the furnace shuts down and lets it cool off. Negative feedback is one of the basic properties of whole systems, and the more complex the system, the more subtle, powerful, and multilayered the negative feedback loops tend to be.  The opposite process is positive feedback, and it’s extremely rare in the real world, because systems with positive feedback promptly destroy themselves—imagine a thermostat that responded to rising temperatures by heating things up further until the house burns down. Negative feedback, by contrast, is everywhere.
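The thermostat analogy can be reduced to a few lines of code. The sketch below is purely illustrative—the 0.5 gain and the 20-degree setpoint are arbitrary choices, not a model of any real furnace—but it shows the essential difference: a negative feedback rule shrinks deviations from the setpoint, while a positive feedback rule amplifies them until the system runs away.

```python
# Illustrative sketch of negative vs. positive feedback; all constants arbitrary.

def step_negative(temp, setpoint=20.0):
    # Negative feedback: push the temperature back toward the setpoint,
    # the way a thermostat-and-furnace pair does.
    return temp - 0.5 * (temp - setpoint)

def step_positive(temp, setpoint=20.0):
    # Positive feedback: push the temperature further from the setpoint—
    # the furnace that heats harder the hotter the house gets.
    return temp + 0.5 * (temp - setpoint)

def run(step, temp, n=30):
    # Apply one feedback rule repeatedly and return the final temperature.
    for _ in range(n):
        temp = step(temp)
    return temp

print(run(step_negative, 30.0))  # a 10-degree disturbance settles back to ~20.0
print(run(step_positive, 20.1))  # a 0.1-degree nudge blows up into the thousands
```

Negative feedback erases even a large disturbance; positive feedback turns a barely measurable one into a runaway—which is why systems governed by positive feedback destroy themselves, and why negative feedback is what you actually find everywhere in nature.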

That’s not something you’ll see referenced in any of the current crop of fast-crash theories, whether those fixate on financial markets, global climate, or what have you. Nearly all those theories make sweeping claims about some set of hypothetical positive feedback loops, while systematically ignoring the existence of well-documented negative feedback loops, and dismissing the evidence of history. The traditional cry of “But it’s different this time!” serves its usual function as an obstacle to understanding: no matter how many times a claim has failed in the past, and no matter how many times matters have failed to follow the predicted course, believers can always find some reason or other to insist that this time isn’t like all the others.

It happens that I’ve been doing plenty of thinking about negative feedback recently, because I’ve fielded yet another flurry of claims that my theory of catabolic collapse must be false because it doesn’t allow for the large-scale crises that we’re evidently about to experience. Mind you, I have no objection to having my theory critiqued, but it would be helpful if those who did so took the time to learn a little about the theory they think they’re critiquing. In point of fact—I encourage doubters to read a PDF of the original essay—the theory of catabolic collapse not only assumes but requires large-scale crises. What it explains is why those crises aren’t followed by a plunge into oblivion but by stabilization and partial recovery.

The reason is negative feedback. A civilization on the way down normally has much more capital—buildings, infrastructure, knowledge, population, and everything else a macroeconomist would put under this label—than it can afford to maintain. Crisis solves this problem by wrecking a great deal of excess capital, so that it no longer requires maintenance, and resources that had been maintaining it can be put to more immediate needs. In addition, much of the wrecked capital can be stripped for raw materials, cutting expenditures further. Since civilizations in decline are by and large desperately short of uncommitted resources, and are also normally squeezed by rising costs for resource extraction, both these windfalls make it possible for a crumbling society to buy time and stave off collapse for at least a little longer; that’s what drives the stairstep process of crisis, stabilization, partial recovery, and renewed crisis that shows up in the last centuries of every historically documented civilization.
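The stairstep pattern this paragraph describes can be caricatured in a few lines of code. Every number below is arbitrary and the rules are drastically simplified—this is a sketch of the feedback shape, not the formal catabolic collapse model—but it reproduces the sequence of crisis, stabilization, partial recovery, and renewed crisis:

```python
# Toy sketch of the stairstep decline described above; all constants arbitrary.

def catabolic_steps(capital=100.0, resources=30.0, years=12):
    """Return the capital stock year by year under crude catabolic rules."""
    trajectory = [capital]
    for _ in range(years):
        maintenance = 0.4 * capital              # upkeep scales with capital
        if maintenance > resources:
            capital *= 0.6                       # crisis: excess capital wrecked
            resources += 0.1 * capital           # salvage: wreckage stripped for raw materials
        else:
            capital += resources - maintenance   # surplus converted into new capital
        resources *= 0.95                        # resource extraction grows costlier
        trajectory.append(capital)
    return trajectory

print([round(c) for c in catabolic_steps()])
# Crisis, partial recovery, renewed crisis: each peak lower than the last.
```

Each crisis wrecks capital, the salvage windfall funds a partial recovery, and the next crisis arrives from a lower peak—a stairstep, rather than a one-way plunge into oblivion.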

That sequence is so reliable that Arnold Toynbee could argue, with no shortage of evidence, that there are usually three and a half rounds of it in the fall of any civilization—the last half-cycle being the final crisis from which the recovery is somebody else’s business.  Our civilization, by the way, has already been through its first cycle, the global crisis of 1914-1954 that saw Europe stripped of its once-vast colonial empires and turned into a battleground between American and Russian successor states. We’re just about due for the second, which will likely be at least as traumatic as the first; the third, if our civilization follows the usual pattern, should hit a battered and impoverished industrial world sometime in the 22nd century, and the final collapse will follow maybe fifty to a hundred years after that.

Now of course there are plenty of people these days insisting that industrial civilization can’t possibly take that long to fall, just as there are plenty of people who insist that it can’t fall at all. In both cases, the arguments normally rest on the blindness to negative feedback discussed above. Consider the currently popular notion, critiqued in one of last month’s posts, that humanity will go extinct by 2030 due to runaway climate change. The logic here follows the pattern I sketched out earlier—extreme claims about hypothetical positive feedback loops, combined with selective blindness to well-documented negative feedback loops that have put an end to greenhouse events in the past, propped up with the inevitable claim that the modest details that distinguish the present situation from similar events in the past mean that the lessons of the past don’t count. 

Current rhetoric aside, greenhouse events driven by extremely rapid CO2 releases are anything but rare in Earth’s history. The usual culprits are large-scale volcanic releases of greenhouse gases, which boosted CO2 levels in the atmosphere up above 1200 ppm—roughly three times current levels—and thus drove what geologists, not normally an excitable bunch, call “super-greenhouse events.”  If massive CO2 releases into the atmosphere were going to exterminate life on Earth, these would have done the trick—and super-greenhouse events have happened many times already, just within the small share of the planet’s history that geologists have enough evidence to study.

What stops them? Negative feedback. The most important of the many negative feedback loops that counter greenhouse events is the shutdown of the thermohaline circulation, the engine that drives the world’s ocean currents. The thermohaline circulation also puts oxygen into the deep oceans, and when it shuts down, you get an oceanic anoxic event.  Ocean waters below 50 meters or so run out of oxygen and become incapable of supporting life, and the rain of carbon-rich organic materials from the sunlit levels of the ocean, which normally supports a galaxy of deepwater ecosystems, falls instead to the bottom of the sea, taking all its carbon with it. It’s an extremely effective way of sucking excess carbon out of the biosphere:  around 70% of all known petroleum reserves, along with thick belts of carbon-rich black shale found over much of the world, were laid down in a handful of oceanic anoxic events in the Jurassic and Cretaceous periods.

Oceanographers aren’t sure yet of the mechanism that shuts off the thermohaline circulation, but it doesn’t require the steamy temperatures of the Mesozoic to do it. At least one massive oceanic anoxic event happened in the Ordovician period, in the middle of a glaciation, and there’s tolerably good evidence that a brief shutdown was responsible for the thousand-year-long Younger Dryas cold period at the end of the last ice age. Not that long ago, global warming researchers were warning about the possibility of a shutdown of the thermohaline circulation in the near future, and measurements of deepwater formation have not been encouraging to believers in business as usual.

Meanwhile, other patterns of negative feedback are already under way.  Across much of the tropical world, increased CO2 levels in the atmosphere are helping to drive bush encroachment—the rapid spread of thorny shrubs and trees across former grasslands.  Western media coverage so far has fixated on the plight of cheetahs—is there any environmental issue we can’t reduce to sentimentality about cute animals?—but the other side of the picture is that shrubs and trees soak up much more carbon than grasslands, and in many areas, the shrubs involved in bush encroachment make cattle raising impossible, cutting into another source of greenhouse gases. Meanwhile, the depletion of fossil fuels imposes its own form of negative feedback; as petroleum geologists have been pointing out for quite a while now, there aren’t enough economically recoverable fossil fuels in the world to justify even the IPCC’s relatively unapocalyptic predictions of climate change.

Apply the same logic to the economic convulsions I mentioned earlier and the same results follow. The reason a financial collapse won’t result in bare grocery shelves, deserted factories, fallow fields, and mass death is, again, negative feedback. The world’s political, economic, and military officials have plenty of options for preventing such an outcome, most of them thoroughly tested in previous economic breakdowns, and so these officials aren’t exactly likely to respond to crisis by wringing their hands and saying, “Oh, whatever shall we do?” For that matter, ordinary people caught in previous periods of extreme economic crisis have proven perfectly able to jerry-rig whatever arrangements might be necessary to stay fed and provided with other necessities. 

Whether the crisis is contained by federal loan guarantees and bank nationalizations that keep farms, factories, and stores supplied with the credit they need, by the repudiation of debts and the issuance of a new currency, by martial law and the government seizure of unused acreage, or by ordinary citizens cobbling together new systems of exchange in a hurry, as happened in Argentina, Russia, and other places where the economy suddenly went to pieces, the crisis will be contained. The negative feedback here is provided by the simple facts that people are willing to do almost anything to put food on the table, governments are willing to do even more to stay in power, and in hundreds of previous crises, their actions have proven more than sufficient to stop the positive feedback loops of economic crisis in their tracks, and stabilize the situation at some level.

None of this means the crisis will be easy to get through, nor does it mean that the world that emerges once the rubble stops bouncing and the dust settles will be anything like as prosperous, as comfortable, or as familiar as the one we have today. That’s true of all three of the situations I’ve sketched out in this post. While the next round of crisis along the arc of industrial civilization’s decline and fall will likely be over by 2070 or so, living through the interval between now and then will probably have more than a little in common with living through the First World War, the waves of political and social crises that followed it, the Great Depression, and the rise of fascism, followed by the Second World War and its aftermath—and this time the United States is unlikely to be sheltered from the worst impacts of crisis, as it was between 1914 and 1954.

In the same way, the negative feedback loops that counter greenhouse events in the Earth’s biosphere don’t prevent drastic climate swings, with all the agricultural problems and extreme weather events that those imply; they simply prevent those swings from continuing indefinitely, and impose reverse swings that could be just as damaging. If the thermohaline circulation shuts down, in particular, there’s a very real possibility that the world could be whipsawed by extreme weather in both directions—too hot for a few more decades, and then too cold for the next millennium—as happened around the beginning of the Younger Dryas period 12,800 years ago. Our species survived then, and on several other similar occasions, and the Earth as a whole has been through even more drastic climate shifts many times; still, it’s a sufficiently harsh prospect for those of us who may have to live through it that anything that can be done to prevent it is well worth doing.

It’s only the contemporary fixation on “perfect storms” of various imaginary kinds that leads so many people to forget that imperfect storms can cause quite a bit of damage all by themselves. Yet it’s the imperfect storms, the ones we can actually expect to get in the real world, that ought to feature in predictions of the future—if those predictions are meant to predict the future, that is, rather than serving as inkblots onto which to project emotionally charged fantasies, excuses for not abandoning unsustainable but comfortable lifestyles, or what have you.

Wednesday, June 19, 2013

What Actually Happens

When you think about it, it’s really rather odd that so many people nowadays should be so hostile to the suggestion that history moves in circles. Central to the rhetoric that celebrates industrial civilization’s supposed triumph over the ignorant and superstitious past is the notion that our beliefs about the world are founded on experience and tested against hard facts. Since the cyclic theory of history gave Oswald Spengler the basis for accurate predictions about the future—predictions, mind you, that contradicted the conventional wisdom of his time and ours, and proved to be correct anyway—wouldn’t it be more reasonable to consider the suggestion that his theory applies to our civilization too?

Reasonable or not, of course, that’s not what generally happens. Suggest that industrial civilization is following the same arc of rise and fall as all previous civilizations have done, and shows every sign of completing the rest of that trajectory in due time, and outside of a few circles of intellectual heretics on the fringes of contemporary culture, what you’ll get in the way of response is an angry insistence that it just ain’t so.  The overfamiliar claim that this time it really is different, that modern industrial civilization will either keep soaring ever higher on the way to some glorious destiny or plunge overnight into some unparalleled catastrophe, is wedged so tightly into the collective imagination of our age that not even repeated failure seems to be able to break it loose.

That last comment is anything but hyperbole; the repeated failures have happened, and are happening, without having the least effect on the claims just mentioned. Glance back over the last half century or so, to start with, and notice just how many prophecies of progress and apocalypse have ended up in history’s wastebasket.  From cities in orbit and regular flights to the Moon, through fusion power and household robots who can cook your dinner and do your laundry for you, to the conquest of poverty, disease, and death itself, how many supposedly inevitable advances have been proclaimed as imminent by scientists and the media, only to sink without a trace when it turned out that they couldn’t be done after all? Of all the dozens of great leaps forward that were being announced so confidently in my youth, only a few—notably the computer revolution—actually happened, and even there the gap between what was predicted and what we got remains vast.

It’s indicative that the humor magazine The Onion, which makes its money by saying the things nobody else in American life is willing to say, ran an edgy piece a few months back announcing that Americans had begun to grasp that the shiny new era of progress and innovation promised so many times was never actually going to happen.  No doubt sometime soon they’ll run a similar story about the claims of imminent cataclysm that fill the same role on the other side of the spectrum of industrial society’s folk beliefs about the future. Year after weary year, the same grandiose visions of destiny and disaster get dusted off for one more showing; they resemble nothing so much as a rerun of a television show that originally aired when your grandparents were on their first date, and yet audiences across the industrial world sit there and do their best to forget that they’ve watched the same show so often they could close their eyes and plug their ears and still recall every tawdry detail.

Meanwhile, over the same half century or so, a very different story has been unfolding here in America and, to a significant extent, elsewhere in the industrial world. Cheap, easily accessible deposits of the resources on which industrial civilization depends have been exhausted, and replaced with increasing difficulty by more expensive substitutes, at steadily rising costs in money, labor, energy, and other resources; the national infrastructure and the natural environment have both been drawn into an accelerating spiral of malign neglect; standards of living for most of the population have been sliding steadily, along with most measures of public health and meaningful education; constitutional rights and the rule of law have taken a beating, administered with equal enthusiasm by both major parties, who seem incapable of agreeing on anything else even when the welfare of the nation is obviously at stake.

In other words, while one set of true believers has been waiting hopefully for the arrival of a bright new golden age of scientific and technological progress, and another set of true believers has been waiting just as hopefully for the arrival of the vast catastrophe that will prove to their satisfaction just how wrong everyone else was, history ignored them both and brought what it usually brings at this season of a civilization’s life: that is to say, decline.

Even so, our collective fixation on those two failed narratives shows few signs of slipping. It’s uncomfortably easy to imagine an America a century from now, in fact, in which half the sharply reduced population lives in squalid shantytowns without electricity or running water, tuberculosis and bacterial infections are the leading causes of death, cars and computers are luxury goods assembled from old parts and reserved for the obscenely rich, and space travel is a distant memory—and in which one set of true believers still insists that the great leap upward into a golden age of progress will get going any day now, another set insists just as passionately that some immense cataclysm is about to kill us all, and only a few intellectual heretics on the fringes of society are willing to talk about the hard facts of ongoing decline or the destination toward which that decline is pretty obviously headed.

There’s no shortage of irony here, because modern industrial culture’s fixation on fantasies of progress and apocalypse and its irritable rejection of any other possibilities have contributed mightily to the process of decline that both sets of fantasies reject out of hand. Since the early 1980s, when the industrial world turned its back on the hopes of the previous decade and slammed the door on its best chance of a smooth transition to sustainability, every attempt to bring up the limits to growth or propose a useful response to the impending mess has been assailed by partisans of both fantasies; the rhetoric of progress—"I’m sure they’ll come up with something," "There are no limits to the power of technology," and so on—has been precisely balanced by the rhetoric of apocalypse—"Jesus will come soon so we don’t have to worry about that," "It’s too late to save humanity from inevitable extinction," and so on.  Thirty years on, the breakthroughs have proven just as elusive as the catastrophes, but the rhetoric still plods onward.

Behind both sides of that rhetoric, I’ve come to believe, is a habit of thought that’s deeply ingrained in contemporary consciousness—the habit, mentioned toward the end of last week’s post, of postulating an imaginary "real world" that contains some set of desirable features the actual world lacks, and then condemning the actual world for its failure to measure up to the imaginary one.  Few corners of modern life have escaped that habit of thinking, and fewer still have avoided being harmed by it.

Take politics, which used to be the process of finding acceptable compromises among the competing needs and wants of members of a community.  These days that process has been all but swamped by supporters of an assortment of fictive worlds—consider the heavily fictionalized pre-1960s America that features so heavily in Christian fundamentalist rhetoric, in which Christian faith was universal, happy families all prayed together on Sunday mornings, and gays, atheists, and other deviant types were safely quarantined in New York City, for example, or for that matter the assorted utopias of political correctness to be found on the other end of the political spectrum. People who are struggling to make the actual world conform to some imaginary one are rarely prepared to accept the compromises, the negotiations, and the quest for common ground that make for functional politics, and the result is the stalemate between entrenched factions that pervades politics on nearly all levels today.

From public health to personal ethics, from dietary choices to the management of the economy, the words are different but they’re all sung to the same old tune.  Abstract theories about how the world ought to work are treated as descriptions of how the world actually works, and heaven help you if you suggest that the theories might be judged by comparing them to the facts on the ground. All the usual contortions of cognitive dissonance then come into play when, as so often happens, measures that are supposed to improve public health make it worse, moral stances intended to improve the world cause more harm than good, diets that are supposed to make people healthy actually make them sick, economic programs proclaimed as the key to lasting prosperity run one economy after another straight into the ground, and so on.

What’s the alternative? Simply put, it involves setting aside our own desires, preferences, and sense of entitlement, and paying attention to the way things actually happen in the world.

It’s important not to overthink what’s being said here. Philosophers since ancient times have pointed out, and quite rightly, that human beings have no access to absolute truth; the world as we experience it comes into being out of the interaction between the "buzzing, blooming confusion" of raw sensory data and the structures of the individual consciousness. Whatever its relevance to the deeper questions of philosophy, a subject I don’t propose to address here, the world as we experience it is as close as we need to get to reality to apply the proposal I’ve just made. In the world as we experience it, some things happen reliably, other things happen unpredictably, and still other things never seem to get around to happening at all—and it’s not hard, even across cultural and linguistic barriers, to find common ground concerning which things belong in which of these categories.

That quest for common ground among the vagaries of individual experience is among other things the basis of modern science. The theory of gravitation is an elegant mathematical way of summing up the fact that billions of individual human beings have had the experience of watching something fall, and each one of those experiences had important features in common with all the others, as well as with such apparently unconnected things as the apparent movements of the Sun in the sky. The kind of knowledge found in the theory of gravitation, and the whole range of other scientific theories, is not absolute truth; it’s always at least a little tentative, subject to constant testing and reformulation as more data comes in, but it was good enough to put human bootprints on the Moon, and it was gained by setting aside narratives that played on the preferences of the individual and collective ego, in order to listen to what Nature herself was saying.

Suggest that this attentiveness to what actually happens is a good idea when dealing with falling rocks, and you’ll get little debate. It’s when you suggest that the same approach might be usefully applied to falling civilizations that the arguments spring up, but the principle is the same in both cases. Over the last five thousand years or so, scores of societies have risen and fallen, and their trajectories through time, like those of falling rocks, have had important features in common. It’s easy to insist that because contemporary industrial society differs from these other societies in various ways, those common features have nothing to say to our future, but what follows this claim? Inevitably, it’s yet another weary rehash of the familiar, failed narratives of perpetual progress and imminent apocalypse. If the present case really is unprecedented, wouldn’t it make more sense either to suggest some equally unprecedented model for the future, or simply to shrug and admit that nobody knows what will happen? Both these responses would make more sense than trotting out what amounts to scraps of medieval theology that have been dolled up repeatedly in pseudosecular drag since the market for religious prophecy turned south in the eighteenth century.

I’d like to suggest that it’s high time for both narratives to be put out to pasture. No, I’ll go further than that. I’d like to suggest that it’s high time for all our stories about the world and ourselves to be tested against the yardstick of what actually happens, and chucked if they can’t meet that test.

What I’m suggesting here needs to be understood with a certain amount of care. Knowledge about the world takes two broad forms, and the connection between them is rather like the connection between a pile of bricks and lumber, on the one hand, and the house that will be built out of the bricks and lumber, on the other.  The first form of knowledge is history in the broadest sense of the word—a sense that includes what used to be called "natural history," the careful collection of observed facts about the world of nature. Before Isaac Newton could sit down in his Cambridge study and work out the theory of gravitation, hundreds of other investigators had to note down their own observations about how things fall, and tens of thousands of astronomers down the centuries had to look up into the sky and notice where the little moving lights they called "wanderers"—planetai in Greek—had turned up that night. That was the gathering of the bricks and the milling of the lumber that would eventually be used to build the elegant structure of Newton’s gravitational theory.

Long before Newton got to work, though, his brick-hauling and lumber-gathering predecessors had picked up quite a bit of relevant knowledge about how rocks fall, how planets move, and a range of similar things, and could explain in quite some detail what these things did and didn’t do. The theoretical models they used to explain these regularities of behavior weren’t always that useful—I’m thinking here especially of those medieval mystics who were convinced that rocks were head over heels in love with the Earth, and would fling themselves in the direction of their beloved whenever other forces didn’t prevent them from doing so—but the regularities themselves were well understood. That’s the kind of knowledge that comes from a close study of history. Once enough historical data has been gathered, that empirical knowledge can often be summarized and replaced by a coherent theory, but that’s not always possible; if the subject is complex enough, the number of examples is small enough, or both, a meaningful theory may remain out of reach. In that case, though, the empirical knowledge is well worth having, since it’s the only real knowledge you have to go on.

The trajectory of human civilizations over time is an immensely complex subject, and the scores of societies that have risen and fallen during recorded history still form a small enough data set that strict theoretical models may be premature. That leaves the empirical knowledge gathered from history.  It’s impossible to prove from that knowledge that the same patterns will continue to happen, just as it was impossible for one of the medieval mystics I mentioned to disprove the claim that now and then a rock might have a lover’s quarrel with the Earth and fall straight up into the sky to get away from her. Still, when known patterns are already at work in a given society, it’s reasonable to accept that they’re likely to continue to their normal end, and when a given theory about the future has failed every time it’s been proposed, it’s just as reasonable to dismiss it from consideration and look for alternatives that work better in practice.

This is what I’d like to ask my readers to do. Each of us carries around an assortment of narratives about what the future might be like, most of them derived from one or another corner of popular culture or from various older traditions and writings. Each of us uses those narratives, consciously or otherwise, as templates into which scraps of information about the future are fitted, and very often this is done without paying attention to what history has to say about the narratives themselves.  Instead, I’d like to suggest that it’s worth taking a hard look at those narratives whenever they surface, and checking them against the evidence of history. Has anything like this happened before, and if so, what results followed? Has anyone ever believed something like this before, and if so, how did that belief work out in practice? These are the kinds of questions I encourage my readers to ask.

I’m aware that this is a heavy burden—much heavier than it may seem at first glance, because it involves discarding some of our most cherished cultural narratives, including those that have become central to a great many modern religious traditions. Those of my Christian readers who believe that their scriptures predict a total overturning of the order of history in the near future may feel that burden more sharply than most. To them, I would point out that the belief in an imminent and literal apocalypse is only one of several ways that devout Christians have interpreted the scriptures. A great many believers in Christ have seen his words on the Mount of Olives as a prophecy of the destruction of Jerusalem by the Romans in 70 AD, and the Book of Revelation as a prophecy—put in symbolic terms to get past Roman censors—of the impending decline and fall of the Roman Empire: both events, it bears noting, having far more immediate importance to their audiences than, say, a cataclysm in the distant twenty-first century. My Jewish readers will have to fill me in on the range of accepted interpretations of the prophecies concerning the Messianic kingdom—it’s not a subject I know much about—but I’d be very surprised, given the ebullient nature of rabbinic debate, if there weren’t plenty of options as well, including some that don’t require history to be stood on its head.

My atheist readers will have an easier time of it in one sense but, in at least some cases, as hard a time in others. To believe that the universe is mere matter and energy without purpose or consciousness, that humanity is simply one more biological species to which evolution has granted a few unusual gifts, and that nobody is peering anxiously down from the sky to observe our species’ foibles and bail it out from its mistakes, might seem to offer few obstacles to the sort of realism I’m proposing. Still, I’ve met an embarrassingly large number of atheists who accord humanity the same privileged status and glorious destiny that prophetic religions claim for their believers. It might seem odd to portray humanity as the Chosen Species while denying that there’s anybody to do the choosing, but such is the nature of the return of the repressed.  To those of my atheist readers who indulge in such imaginings, I would encourage attention to the presuppositions of their own beliefs, and a particularly close study of past claims of progress and apocalypse that didn’t happen to include a god as one of the stage properties.

To those of my readers who share my Druid faith, or any of the other movements in today’s inchoate but lively field of nature-centered spirituality, I hope I may speak even more frankly. For those who recognize the ways of Nature as a revelation of the powers that create and sustain the cosmos, as Druidry does, the notion that the world will abandon her normal ways and jump through hoops like a trained seal to satisfy our sense of entitlement or our craving for revenge is really pretty absurd. To study nature from a Druid perspective is to learn that limitation is the first law of existence, that what looks like a straight line to us is merely part of a circle too large to see at a single glance, that every movement generates and is balanced by a corresponding countermovement, that what systems theory calls negative feedback and an older way of thought calls the Royal Secret of equilibrium governs all things and all beings, with or without their conscious cooperation. In such a cosmos—and all things considered, a strong case can be made that this is the kind of cosmos we live in—there’s no room for the paired fantasies of perpetual progress and imminent apocalypse, except as exhibits in a display of the odd things human beings talk themselves into believing from time to time.

Other faiths face their own challenges in dealing with the task I’ve proposed. I hope that at least some of my readers will be willing to attempt that task, though, because it’s far less abstract than it might seem at first; it has practical applications that bear directly on the hard work of preparing for the difficult future ahead. We’ll discuss that next week.

Wednesday, June 12, 2013

A Question of Values

Oswald Spengler, whose theory of historical cycles was discussed in last week’s post, was far from the first scholar to propose that the future of modern industrial civilization might best be understood by paying at least a little attention to what happened to other civilizations in the past.  Back in 1725, as the industrial revolution was just getting under way, an Italian philosopher named Giambattista Vico traced out "the course the nations run"—the phrase is his—in the pages of his masterpiece, Principles of a New Science Concerning the Common Nature of Nations (for obvious reasons, scholars these days shorten this to The New Science). Since then, it’s been rare for more than a generation to go by without some historian or philosopher pointing out to readers that every previous society has followed the familiar arc of rise and fall, and ours seems to be doing exactly the same thing.

Spengler was thus contributing to an established tradition, rather than breaking wholly new ground, and there have been important works since his time—most notably Arnold Toynbee’s sprawling A Study of History, twelve weighty volumes packed with evidence and case studies—that carried the same tradition further. Vico spent his whole career laboring in obscurity, but Spengler and Toynbee were both major public figures in their day, as well as bestselling authors whose ideas briefly became part of the common currency of thought in the Western world. They and their work, in turn, were both consigned to oblivion once it stopped being fashionable to think about the points they raised, and you can read any number of hefty studies of the philosophy of history and never find either man mentioned at all.

What makes this disappearance fascinating to me is that very few critics ever made a serious attempt to argue the facts that Spengler and his peers discussed. There was never any shortage of disagreement, to be sure, but nearly all of it remained weirdly detached from the issues the theorists of historical cycles were attempting to raise. There was a great deal of quibbling about details, a great deal of handwaving about fatalism and pessimism, and whole armies of straw men were lined up and beaten with gusto, but next to nobody tried to show that the basic concept of historical cycles doesn’t work—that the patterns traced by the history of China, let’s say, contradict those displayed by the history of ancient Egypt—and the few attempts that were made in this direction were embarrassingly weak. 

By and large, those who disputed Vico, Spengler, Toynbee, et al. either brushed aside the entire question of patterns of historical change, or conceded that, well, of course, those other civilizations of the past might have followed a shared trajectory, but ours?  Never.  That’s still the predictable response to any suggestion that the past might have anything useful to say about the future, and regular readers of this blog will have seen it deployed countless times in critiques posted by commenters here: in words made famous in any number of speculative bubbles, it’s different this time.

There’s a wry amusement to be had by thinking through the implications of this constantly repeated claim. If our society was in fact shaking off the burdens of the past and breaking new ground with every minute that goes by, as believers in progress like to claim, wouldn’t it be more likely that the theory of historical cycles would be challenged each time it appears with dazzlingly new, innovative responses that no one had ever imagined before?  Instead, in an irony Nietzsche would have relished, the claim that history can’t repeat itself endlessly repeats itself, in what amounts to an eternal return of the insistence that there is no eternal return. What’s more, those who claim that it’s different this time seem blissfully unaware that anyone has made the same claim before them, and if this is pointed out to them, they insist—often with quite some heat—that what they’re saying has nothing whatsoever to do with all the other times the same argument was used to make the same point down through the years.

There are deep patterns at work here, but it’s probably necessary to tackle the different-this-time argument on its own terms first. Of course there are differences between contemporary industrial civilization and those older societies that have already traced out the completed arc of rise and fall.  Each of those previous civilizations differed from every other human society in its own unique ways, too. Each human life, to use an analogy Spengler liked to cite, differs from every other human life in a galaxy of ways, but certain processes—birth, infancy, childhood, puberty, and so on through the life cycle to old age and death—are hardwired into the basic structure of being human, and will come to every individual who lives out a normal lifespan.  The talents, experiences, and achievements that fit into the common sequence of life will vary, often drastically, from person to person, but those differences exist within a common frame. The same thing, the theorists of historical cycles suggest, is true of human societies, and they offer ample evidence to support that claim.

Furthermore, these same scholars point out, modern industrial civilization has passed so far through all the normal stages of social existence appropriate to its age. It emerged out of a feudal setting all but indistinguishable from those that provided the cradle for past civilizations; out of that background, it developed its own unique view of the cosmos, expressed first in religious terms, and later in the form of a rationalist philosophy; it passed through the normal political, economic, and social changes in the usual order, and at the same broad time intervals, as other civilizations; its current political, economic, and cultural condition has precise parallels in older civilizations as far back as records reach. Given that the uniqueness of modern industrial civilization has so far failed to nudge it off the standard trajectory, it’s hard to find any valid reason to insist that our future won’t continue along the same track.

Claims that it’s different this time usually rest on one of three foundations. The first is that this is the first global civilization on record. A difference of scale, though, does not necessarily equal a difference of kind; the trajectory we’re discussing appears in Neolithic societies limited to a single river basin and continental empires with thriving international trade networks, as well as every scale in between. While it might be argued that the greater size of contemporary industrial society amounts to a difference in kind, that claim would have to be backed up with evidence, rather than merely asserted—as, so far, it generally has been. Furthermore, when the slower speed of earlier transportation technologies is taken into account, the "worlds" inhabited by older societies were effectively as large as ours; if your fastest means of transport is a horse-drawn chariot, for example, ancient China is a very big place.

The second foundation for claims of our uniqueness is, of course, the explosive growth of technology made possible over the last three centuries by the reckless extraction and burning of fossil fuels. It’s true that no other civilization has done that, but the differences have had remarkably little impact on the political, cultural, and social trends that shape our lives and the destinies of our communities. The corruption of mass politics in the modern industrial world, for example, is following point for point the same patterns traced out by equivalent phenomena in ancient Greece and Rome; the weapons of war have changed, similarly, but the downward spiral of a failing empire trying to cling to fractious but strategically vital borderlands is the same for America in Afghanistan as it was for the Babylonians in the Elamite hill country in 1000 BCE. Our technology has given us new means, but by and large we’ve employed them for the same ancient purposes, and reaped the same consequences.

The third foundation is newer, and appears these days mostly in those corners of the blogosphere where the apocalyptic faith discussed in an earlier post in this sequence has become standard. This is the claim that the global disasters that are about to wallop industrial civilization go so far beyond anything in the past that there’s no basis for comparison. Now of course that argument is very often based on the well-worn tactic of heaping up an assortment of worst-case scenarios, insisting that the resulting cataclysm is the only possible outcome of current trends, and using that imagined future as a measuring rod with which to dismiss what really happened in the past. This is the sort of thinking I critiqued in a recent post about claims that humanity will inevitably go extinct in the next few decades: if you cherrypick a set of extreme scenarios backed by less than five per cent of current climate change research, and treat those highly speculative hypotheses as though they’re incontrovertible facts, it’s easy to paint the end of industrial civilization in colors as extreme as you happen to like.

What tends to be missed in the resulting discussions, though, is that ecological disasters of the sort we’re actually likely to face featured repeatedly in the collapse of earlier civilizations. Clive Ponting’s excellent A Green History of the World is a helpful corrective for this myopia.  The collapse of classic Mayan civilization, for example, was triggered in large part by catastrophic droughts, caused in large part by deforestation and farming practices that wrecked the hydrologic cycles on which Mayan life depended. Anthropogenic climate change, in other words, was just as destructive to Mayan civilization as it promises to be to ours.  While climate change in Mayan times was more localized than the present equivalent, it’s worth remembering that the Mayan world was equally circumscribed by its geographical knowledge and transport technologies; from a Mayan perspective, the whole world was being ravaged by climate change, as ours will be in its turn.

The downfall of classic Mayan civilization unfolded over a century and a half, and involved the loss of irreplaceable cultural treasures and scientific knowledge, the abandonment of nearly all of the Lowland Mayan cities to the encroaching jungle, and a dieoff so severe that postcollapse populations bottomed out around 5% of the Late Classic peak. It’s by no means impossible that the decline and fall of modern industrial civilization could involve losses on the same scale, and it’s a source of endless fascination to me that this suggestion—based as it is on the one source of useful evidence we’ve got, the experience of a previous society going through an equivalent process—should be dismissed by one set of disbelievers in historical cycles as too pessimistic, and by another set as too optimistic.

It’s exactly this twofold dismissal that points to the deeper, nonrational dimension underlying the whole discussion of possible futures. It bears repeating that the belief in progress, and the equal and opposite belief in apocalypse, are narratives about the unknowable. Both claim that the past has nothing to say about the future, that something is about to happen that has never happened before and that can’t be judged on the basis of any previous event. Whether the thing that’s about to happen is a shining new age of wonder, along the lines of Joachim’s Age of the Holy Spirit, or a sudden end of history, along the lines of Augustine’s version of the Second Coming, it refutes everything that has come before. This is what lies behind the endlessly repeated (and just as repeatedly disproved) insistence, on the part of believers in both narratives, that it’s different this time: if it’s not different this time, if the lessons of the past reveal the shape of the future, then both belief systems come crashing down at once.

That is to say, the belief in progress and in apocalypse are both matters of faith, not fact. The same is true of every set of beliefs about the future, however, or about anything else for that matter.  No system of logical inferences, however elaborate and exact, can prove its own presuppositions; dig down to the foundations, and you’ll find that the structure rests on assumptions about the nature of things that have to be taken on faith. It probably has to be pointed out that this is just as true of rationalist beliefs as it is of the most exotic forms of mysticism.  To say, as science does, that statements about the universe ought to be based on observation assumes, has to assume, that what we observe tells us truths about the universe—an assumption that the old Gnostics would have considered laughably naive. To claim that there are many gods, a few gods, only one god, or no gods at all is to insist on something about which human beings have no independently verifiable source of information whatsoever.

It’s tolerably common these days, outside of the surviving theist religions, to affect to despise faith, and you’ll find plenty of people who insist that they take nothing on faith at all. Of course they’re quite wrong. None of us can function in the world for five minutes without taking a galaxy of things on faith, from the solidity of the floor in front of us, through the connection between another person’s words and their thoughts, to the existence of places and times we will never experience.  Gregory Bateson pointed out, in a series of papers that have vanished as thoroughly from the literature of psychology as Spengler and Toynbee have from that of history, that an unwillingness to take anything on faith is at the core of schizophrenia; that’s what lies behind the frantic efforts of the paranoiac to find the hidden meaning in everything around him, and the catatonic’s ultimate refusal to have anything to do with the world at all.

Faith is, among other things, the normal and necessary human response to those questions that can’t be answered on the basis of any form of proof, but have to be answered in one way or another in order to live in the world. The question that deserves discussion is why different people, faced with the same unanswerable question, put their faith in different propositions. The answer is as simple to state as it is sweeping in its consequences: every act of faith rests on a set of values.

We’ll probably have to spend a good deal of time talking about the difference between facts and values one of these weeks, but that’s material for another post. The short form is that facts belong to the senses and the intellect, and they’re objective, at least to the extent that anyone with an ordinarily functioning set of senses and no reason to prevaricate can say, "yes, I see it too."  Values, by contrast, are a matter of the heart and the will, and they’re irreducibly subjective; to say "this is good" or "this is bad," or any other statement of value, does not communicate an objective fact about the thing being discussed, but always expresses a value judgment from some irreducibly individual point of view. More than half the confusions of contemporary thought result from attempts to treat personal value judgments as though they were objectively knowable facts—to say, for example, "x is better than y" without addressing such questions as "better by what criteria?" and "better for whom?"

The prejudices of modern industrial culture encourage that sort of confusion by claiming a higher status for facts than for values. Listen to atheists and Christians talking past each other, as they normally do, and you have a classic example of the result.  The real difference between the two, as the best minds on both sides have grasped, is a radical difference in values that defines equally profound differences in basic assumptions about humanity and the world.  Behind the atheist vision of humanity as a unique but wholly natural phenomenon, in the midst of a soulless universe of dead matter following natural laws, stands one set of value judgments about what counts as right and true; behind the Christian vision of humanity as the adopted child of divine omnipotence, placed temporarily in the material universe as a prologue to eternal bliss or damnation, stands a completely different set.  The difference in values is the heart of the matter, and no amount of bickering over facts can settle a debate rooted in that soil.

In the classical world, in an age when values were given at least as high a status as facts, debates of this sort were conducted on their natural ground, and systems of thought appealed to potential followers by presenting their own visions of the Good and calling into question the values presented by competing systems. Nowadays, such clarity is rare, and indeed it’s embarrassingly common to hear people insist that there’s no way to judge among competing value claims. It’s true that a value can’t be disproved in the same way as a fact, but values don’t exist in a vacuum; any statement of value has implications and consequences, and it’s by assessing these that each of us can judge whether a value is consistent with the other values we happen to hold, and with the universe of fact that we happen to experience.

We all know this, at least in practice. The reason why doctrines of racial inequality are widely and rightly dismissed by most people in the modern industrial world, for example, has little to do with the shoddy intellectual basis offered for such doctrines by their few defenders, quite a bit to do with lynch mobs, ethnic cleansing, concentration camps, and other well-known consequences of value systems that deny the humanity of other ethnic groups, and at least as much to do with the conflict between the values expressed in these consequences and other values, such as fairness and compassion, that most people embrace.  This is an extreme example, but the same principle applies more generally: when a statement is made about the unprovable, it’s always wise to ask what the consequences of believing that statement have been in the past, and what other values are consistent or inconsistent with the claim.

We can thus apply the same logic to the faith in perpetual progress and imminent apocalypse. One consequence of accepting these beliefs, and embracing the values that underlie them, is clear enough: both reliably yield false predictions about the future. Believers in progress like to claim the opposite, but you have to read descriptions of the future from before 1960 or so to grasp just how few the hits have always been compared to the flops. For all its failures, though, the faith in progress has a better track record than the faith in apocalypse; across the centuries, whenever anyone has insisted that the world was about to end, he or she has always been dead wrong. Aside from speculative bubbles or the quest for perpetual motion, it’s hard to name a more reliable source of utter hogwash.

For faiths that focus on the future as intently as these do, this inability to foresee the future is not exactly encouraging. It’s possible to go further, though, by noticing the values embodied in the progressive and apocalyptic faiths. Both of them insist that the world we know must shortly be swept away, to be replaced by some better age or annihilated in some grand final judgment. Both of them anchor their entire sense of meaning and value on an imaginary future, and disparage the present by contrast.  Both faiths are thus founded on a rejection of the world as it actually exists.  To borrow one of Nietzsche’s phrases, both are Nay-sayings to life, attempts to posit an unreal "real world" (the shining future of progress, the world after apocalypse) against which life as it actually is can be judged, condemned, and sentenced to death.  The mere fact that the executioners never do their job, though it’s an inconvenience to the believers on either side, does nothing to alter the furious zeal with which, over and over again, the sentence is handed down.

The religion of progress and its antireligion of apocalypse are by no means alone in their Nay-saying to life. The same world-condemning attitude has had a pervasive role in most (though not all) branches of Christianity, the theist faith from which these secular religions covertly derive a good many of their ideas and images.  In an upcoming post, we’ll discuss the way that this attitude in its many forms has helped send contemporary industrial society slamming face first into the wall of crises that shapes today’s headlines and will be defining our history for a good many years to come.

For the time being, though, I’d like to leave my readers with this reflection: what would it mean to found a set of values, and a corresponding set of presuppositions about the world, on life exactly as it is? In the course of opening that can of worms, and getting the worms inside more comfortably situated in their proper soil, we’ll begin the process of circling in toward the question at the center of this series of posts—the quest for a philosophy of life, and perhaps even a spirituality, that can make sense of the human reality of the Long Descent.

Wednesday, June 05, 2013

The Scheduled Death of God

There's a mordant irony in the fact that a society as fixated on the future as ours is should have so much trouble thinking clearly about it.  Partly, to be sure, that difficulty unfolds from the sheer speed of social and technological change in the age of cheap abundant energy that’s now coming to an end, but there’s more to it than that. 

In the civil religions of the modern world, the future functions as a surrogate for heaven and hell alike,  the place where the wicked will finally get the walloping they deserve and the good will be granted the promised benefits that the present never quite gets around to providing them. What Nietzsche called the death of God—in less colorful language, the fading out of living religious belief as a significant force in public life—left people across the Western world flailing for something to backstop the sense of moral order in the cosmos they once derived from religious faith. Over the course of the nineteenth century, a great many of them found what they wanted in one or another civil religion that projected some version of utopia onto the future.

It’s crucial not to underestimate the emotional force of the resulting beliefs. The future of perpetual betterment promised by the faith in progress, and the utopia on the far side of cataclysm promised with equal fervor by the faith in apocalypse, are no less important to their believers than heaven is to the ordinary Christian, and for exactly the same reason. Every human society has its own conception of the order of the cosmos; the distinctive concept of cosmic order that became central to the societies of Europe and the European diaspora envisioned a moral order that could be understood, down to the fine details, by human beings.  Since everyday life pretty clearly fails to follow such an order, there had to be some offstage location where everything would balance out, whether that location took the form of heaven, humanity’s future among the stars, a future society of equality and justice, or what have you.  Discard that imagined place and, for a great many people in the Western world, the cosmos ceases to have any order or meaning at all.

It was precisely against this sense of moral order, though, that Nietzsche declared war.  Like any good general, he sent his forces into action along several routes at once; the assault relevant to our theme was aimed at the belief that the arithmetic of morality would finally add up in some other place or time.  He rejected the idea of a utopian world of past or future just as forcefully as he did the concept of heaven itself.  That’s one of the things his doctrine of eternal return was intended to do:  by revisioning the past and the future as endless repetition, Nietzsche did his level best to block any attempt to make the past or the future fill the role once filled by heaven.

Here, though, he overplayed his hand. Strictly speaking, a cycle of eternal return is just as imaginary as any golden age in the distant past, or for that matter the glorious future come the Revolution when we will all eat strawberries and cream. In a philosophy that presents itself as a Yes-saying to life exactly as it is, his reliance on a theory of time just as unprovable as those he assaulted was a massive problem. Nietzsche’s madness, and the resolute effort on the part of most European intellectuals of the time not to think about any of the issues he tried to raise, left this point, among many others, hanging in the air. Decades passed before another German thinker tackled the same challenge with better results. His name, as I think most of my regular readers have guessed by now, was Oswald Spengler.

Spengler was in his own way as eccentric a figure as Nietzsche, though it was a more stereotypically German eccentricity than Nietzsche’s fey Dionysian aestheticism. A cold, methodical, solitary man, he spent his entire working life as a schoolteacher, and all his spare time—he never married—with his nose in a polymath’s banquet of books from every corner of scholarship. Old Kingdom Egyptian theology, traditional Chinese landscape design, the history of the medieval Russian church, the philosophical schools of ancient India, the latest discoveries in early twentieth century physics: all these and more were grist for his highly adaptable mill. In 1914, as the impending fall of the British empire was sweeping Europe into a vortex of war, he started work on the first volume of The Decline of the West; it appeared in 1918, and the second volume followed it in 1922.

The books became immediate bestsellers in German and several other languages—this despite a world-class collective temper tantrum on the part of professional historians. Logos, one of the most prestigious German scholarly journals of the time, ran an entire special issue on him, in which historians engaged in a frenzy of nitpicking about Spengler’s historical claims. (Spengler, unperturbed, read the issue, double-checked his facts, released a new edition of his book with corrections, and pointed out that none of the nitpicking addressed any of the major points of his book; he was right, too.) One study of the furore around Spengler noted more than 400 publications, most of them hostile, discussing The Decline of the West in the decade of the 1920s alone.

Interest in Spengler’s work peaked in the 1920s and 1930s and faded out after the Second World War; some of the leading figures of the "Beat generation" used to sit around a table reading The Decline of the West out loud, and a few other writers of the 1950s drew on his ideas, but thereafter silence closed in. There’s an ironic contrast here to Nietzsche, who provided Spengler with so many of his basic insights; Nietzsche’s work was almost completely unknown during his life and became a massive cultural presence after his death; with Spengler, the sequence ran the other way around.

The central reason why Spengler was so fiercely if inconclusively attacked by historians in his own time, and so comprehensively ignored since then, is the same reason why he’s relevant to the present theme. At the core of his work stood the same habit of morphological thinking I discussed in an earlier post in this sequence. Johann Wolfgang von Goethe, who launched the study of comparative morphology in the life sciences in the eighteenth century, remained a massive cultural presence in the Germany of Spengler’s time, and so it came naturally to Spengler to line up the great civilizations of history side by side and compare their histories, in the same way that a biologist might compare a dolphin’s flipper to a bat’s wing, to see the common patterns of deep structure that underlie the surface differences.

Such comparisons are surprisingly unfashionable in modern historical studies. Most other fields of study rely on comparisons as a matter of course:  the astronomer compares one nebula to another, just as the literary critic compares one experimental novel to another, and in both fields it’s widely accepted that such comparisons are the most important way to get past irrelevancies to an understanding of what’s really going on. There are historical works that compare, say, one revolution to others, or one feudal system to another, but these days they’re in the minority.  More often, historians consider the events of some period in the past by themselves, without placing them in the context of comparable periods or events, and either restrict themselves to storytelling or propose assorted theories about the causes of those events—theories that can never be put to the test, because it’s all but impossible to test a hypothesis when you’re limited to a sample size of one.

The difficulty with a morphological approach to history is precisely that a sample size of more than one turns up patterns that next to nobody in the modern industrial world wants to think about. By placing past civilizations side by side with that of the modern industrial West, Spengler found that all the great historical changes that our society sees as uniquely its own have exact equivalents in older societies. Each society emerges out of chaos as a decentralized feudal society, with a warrior aristocracy and an epic poetry so similar that an enterprising bard could have recited the Babylonian tale of Gilgamesh in an Anglo-Saxon meadhall without anyone present sensing the least incongruity.  Each then experiences corresponding shifts in social organization: the meadhalls and their equivalents give way to castles, the castles to fortified towns, the towns to cities, and then a few of the cities outgrow all the others and become the centers in which the last stages of the society’s creative life are worked out.

Meanwhile, in the political sphere, feudal aristocrats become subject to kings, who are displaced by oligarchies of the urban rich, and these latter eventually fall before what Spengler calls Caesarism, the emergence of charismatic leaders who attract a following from the urban masses and use that strength to seize power from the corrupt institutions of an oligarchic state.  Traditional religions rich in myth give way to rationalist philosophies as each society settles on the intellectual projects that will define its legacy to the future—for example, logical method in the classical world, and natural science in ours. Out of the diverse background of folk crafts and performances, each culture selects the set of art forms that will become the focus of its creative life, and these evolve in ever more distinctive ways; Gilgamesh and Beowulf could just as well have swapped swords and fought each other’s monsters, for example, but the briefest glance at plays from ancient Greece, India, China, and the Western world shows a wholly different dramatic and aesthetic language at work in each.

All this might have been forgiven Spengler, but the next step in the comparison passes into territory that makes most people in the modern West acutely uncomfortable. Spengler argued that the creative potential of every culture is subject to the law of diminishing returns. Sooner or later, everything worth bothering with that can be done with Greek sculpture, Chinese porcelain, Western oil painting, or any other creative art has been done; sooner or later, the same exhaustion occurs in every other dimension of a culture’s life—its philosophies, its political forms, you name it. At that point, in the terms that Spengler used, a culture turns into a civilization, and its focus shifts from creating new forms to sorting through the products of its creative centuries, choosing a selection of political, intellectual, religious, artistic, and social patterns that will be sustainable over the long term, and repeating those thereafter in much the same way that a classical orchestra in the modern West picks and chooses out of the same repertoire of standard composers and works.

As that last example suggests, furthermore, Spengler didn’t place the transition from Western culture to its subsequent civilization at some conveniently far point in the future. According to his chronology, that transition began in the nineteenth century and would be complete by 2100 or so. The traditional art forms of the Western world would reach the end of the line, devolving into empty formalism or staying on in mummified form, the way classical music is preserved today; political ideologies would turn into empty slogans providing an increasingly sparse wardrobe to cover the naked quest for power; Western science, having long since exhausted the low-hanging fruit in every field, would wind down into a repetition of existing knowledge, and most forms of technology would stagnate, while a few technological fields capable of yielding grandiose prestige projects would continue to be developed for a while; rationalism would be preserved in intellectual circles, while popular religious movements riddled with superstition would rule the mental life of the bulk of the population. Progress in any Western sense of the word would be over forever, for future cultures would choose their own directions in which to develop, as different from ours as ours is from the traditional Chinese or the Mayans.

Spengler didn’t leave these projections of the future in abstract form; he turned them into detailed predictions about the near future, and those predictions have by and large turned out to be correct.  He was wrong in thinking that Germany would become an imperial state that would unite the Western world the way Rome united the classical world, the kingdom of Qin united China, and so on, though it’s fair to say that Germany’s two efforts to fill that role came uncomfortably close to succeeding. Other than that, his aim has proved remarkably good. 

He argued, for example, that the only artistic forms that could have any vitality in 20th century Europe and America would take their inspiration from other, non-Western cultures. Popular music, which was dominated by African-derived jazz in the first half of the century and African-derived rock thereafter, is only one of many examples. As for politics, he suggested that the history of the twentieth and twenty-first centuries would be dominated by a struggle pitting charismatic national dictators against a globalized oligarchy of high finance lightly concealed under a mask of democracy, a struggle that the financiers would eventually lose.  Though the jury’s still out on the final outcome, the struggle itself is splashed over the news on a daily basis.

All these events took place in other times and places, and will take place in future societies, each in its own way. What distinguishes contemporary Western society from earlier urban civilizations, according to Spengler’s view, is not that it’s "more advanced," "more progressive"—every society goes in a different direction, and proceeds along that route until the same law of diminishing returns cuts in—but simply that it happened to take mastery of physical matter and energy as its special project, and in the process stumbled across the buried carbon we’re burning so extravagantly just now. It’s hard to think of any historical vision less flattering to the inherited egotism of the modern industrial West; it deprives us of our imagined role as the cutting edge of humanity in its grand upward march toward the stars, and plops us back down to earth as just one civilization among many, rising and falling along with the rest.

It’s in this way that Spengler proved to be Nietzsche’s heir.  Where Nietzsche tried to challenge the imaginary utopia at the end of history with an equally imaginary vision of eternal return, Spengler offered a vision that was not imaginary, but rather rested on a foundation of historical fact.  Where Nietzsche’s abandonment of a moral order to the cosmos left him staring into an abyss in which order and meaning vanished once and for all, Spengler presented an alternative vision of cosmic order in which morality is not a guiding principle, but simply a cultural form, human-invented, that came and went with the tides of history. Life was as much Spengler’s banner as it was Nietzsche’s, life in the full biological sense of the word, unreasoning, demanding, and resistant to change over less than geological time scales; the difference was that Nietzsche saw life as the abyss, while Spengler used it to found his sense of an ordered universe and ultimately his values as well.

It’s among the richest ironies of Spengler’s project that among the things that he relativized and set in a historic context was Nietzsche’s philosophy. Nietzsche liked to imagine himself as a figure of destiny, poised at the turning point of the ages—this was admittedly a common occupational disease of nineteenth-century philosophers. Spengler noted his debts to Nietzsche repeatedly in The Decline of the West, but kept a sense of perspective the older man lacked; in the table of historical parallels that finishes the first volume of Spengler’s book, Nietzsche has become one more symptom of the late, "Winter" phase of Western culture, one of many figures participating in the final disintegration of traditional religious thought at the hands of skeptical intellectuals proposing new systems of philosophical ethics.

When Nietzsche announced the death of God, in other words, he was filling a role familiar in other ages, announcing an event that occurs on schedule in the life of each culture. The Greek historian Plutarch had announced the death of Pan some eighteen centuries earlier, around the time that the classical world was settling firmly into the end-state of civilization; the people of ancient Crete, perhaps recalling some similar event even further back, used to scandalize Greek tourists by showing them the grave of Zeus. Every literate urban society, Spengler argued, followed the same trajectory from an original folk religion rich in myths, through the rise of intellectual theology, the birth of rationalism, the gradual dissolution of the religious worldview into rational materialism, and then the gradual disintegration of rational materialism into a radical skepticism that ends by dissolving itself; thereafter ethical philosophies for the intellectuals and resurgent folk religion for the masses provide the enduring themes for the civilization to come.

It’s a stark vision, especially painful to those who have been raised to see the most recent phases of that process in our own culture as the heralds of the bright new era of history presupposed by the Joachimist shape of time, or the initial shockwaves of the imminent apocalypse presupposed by its Augustinian rival. Defenders of these latter viewpoints have accordingly developed standard ways of countering Spengler’s challenge—or, more precisely, defenders of both have settled on the same way of doing so. We’ll discuss their argument, and place it in its own wider context, in next week’s post.