Wednesday, March 27, 2013

The Sound of the Gravediggers

Over the nearly seven years I’ve spent blogging on The Archdruid Report, the themes of my weekly posts have veered back and forth between pragmatic ways to deal with the crisis of our time and the landscape of ideas that give those steps their meaning.  That’s been unavoidable, since what I’ve been trying to communicate here is as much a way of looking at the world as it is a set of practices that unfold from that viewpoint, and a set of habits of observation that focus attention on details most people these days tend to ignore.

There’s a lot more that could be said about the practical side of a world already feeling the pressures of peak oil, and no doubt I’ll contribute to that conversation again as we go. For now, though, I want to move in a different direction, to talk about what’s probably the most explosive dimension of the crisis of our time. That’s the religious dimension—or, if you prefer a different way of speaking, the way that our crisis relates to the fundamental visions of meaning and value that structure everything we do, and don’t do, in the face of a troubled time.

There are any number of ways we could start talking about the religious dimensions of peak oil and the end of the industrial age. The mainstream religions of our time offer one set of starting points; my own Druid faith, which is pretty much as far from the mainstream as you can get, offers another set; then, of course, there’s the religion that nobody talks about and most people believe in, the religion of progress, which has its own dogmatic way of addressing such issues.

Still, I trust that none of my readers will be too greatly surprised if I choose a starting point a little less obvious than any of these. To be specific, the starting point I have in mind is a street scene in the Italian city of Turin, on an otherwise ordinary January day in 1889. Over on one side of the Piazza Carlo Alberto, at the moment I have in mind, a teamster was beating one of his horses savagely with a stick, and his curses and the horse’s terrified cries could be heard over the traffic noise. Finally, the horse collapsed; as it hit the pavement, a middle-aged man with a handlebar mustache came sprinting across the plaza, dropped to his knees beside the horse, and flung his arms around its neck, weeping hysterically.  His name was Friedrich Wilhelm Nietzsche, and he had just gone hopelessly insane.

At that time, Nietzsche was almost completely unknown in the worlds of European philosophy and culture. His career had a brilliant beginning—he was hired straight out of college in 1869 to teach classical philology at the University of Basel, and published his first significant work, The Birth of Tragedy, three years later—but strayed thereafter into territory few academics in his time dared to touch; when he gave up his position in 1879 due to health problems, the university was glad to see him go. His major philosophical works saw print in small editions, mostly paid for by Nietzsche himself, and were roundly ignored by everybody.  There were excellent reasons for this, as what Nietzsche was saying in these books was the last thing that anybody in Europe at that time wanted to hear.

Given Nietzsche’s fate, there’s a fierce irony in the fact that the most famous description he wrote of his central message is put in the mouth of a madman.  Here’s the passage in question, from The Joyous Science (1882):

Haven’t you heard of the madman who lit a lantern in the bright morning hours, ran into the marketplace, and shouted over and over, ‘I’m looking for God! I’m looking for God!’ There were plenty of people standing there who didn’t believe in God, so he caused a great deal of laughter. ‘Did you lose him, then?’ asked one.  ‘Did he wander off like a child?’ said another. ‘Or is he hiding?  Is he scared of us? Has he gone on a voyage, or emigrated?’ They shouted and laughed in this manner. The madman leapt into their midst and pierced them with his look.

‘Where is God?’ he shouted. ‘I’ll tell you. We’ve killed him, you and I! We are all his murderers. But how could we have done this?  How could we gulp down the oceans? Who gave us a sponge to wipe away the whole horizon? What did we do when we unchained the earth from the sun? Where is it going now? Where are we going now? Away from all suns? Aren’t we falling forever, backwards, sideways, forwards, in all directions at once? Do up and down even exist any more? Aren’t we wandering in an infinite void? Don’t we feel the breath of empty space? Hasn’t it become colder?  Isn’t night coming on more and more all the time? Shouldn’t we light lanterns in the morning? Aren’t we already hearing the sounds of the gravediggers who are coming to bury God?  Don’t we smell the stink of a rotting God—for gods rot too?

‘God is dead, God remains dead, and we have killed him. How can we, the worst of all murderers, comfort ourselves? The holiest and mightiest thing that the world has yet possessed has bled to death beneath our knives!’

Beyond the wild imagery—which was not original to Nietzsche, by the way; several earlier German writers had used the same metaphor before he got to it—lay a precise and trenchant insight. In Nietzsche’s time, the Christian religion was central to European culture in a way that’s almost unthinkable from today’s perspective. By this I don’t simply mean that a much greater percentage of Europeans attended church then than now, though this was true; nor that Christian narratives, metaphors, and jargon pervaded popular culture to such an extent that you can hardly make sense of the literature of the time if you don’t know your way around the Bible and the standard tropes of Christian theology, though this was also true.

The centrality of Christian thought to European culture went much deeper than that. The core concepts that undergirded every dimension of European thought and behavior came straight out of Christianity. This was true straight across the political spectrum of the time—conservatives drew on the Christian religion to legitimize existing institutions and social hierarchies, while their liberal opponents relied just as extensively on Christian sources for the ideas and imagery that framed their challenges to those same institutions and hierarchies. All through the lively cultural debates of the time, values and ethical concepts that could only be justified on the basis of Christian theology were treated as self-evident, and those few thinkers who strayed outside that comfortable consensus quickly found themselves, as Nietzsche did, talking to an empty room.

It’s indicative of the tenor of the times that even those thinkers who tried to reject Christianity usually copied it right down to the fine details.  Thus the atheist philosopher Auguste Comte (1798-1857), a well known figure in his day though almost entirely forgotten now, ended up launching a “Religion of Humanity” with a holy trinity of Humanity, the Earth, and Destiny, a calendar of secular saints’ days, and scores of other borrowings from the faith he thought he was abandoning. He was one of dozens of figures who attempted to create ersatz pseudo-Christianities of one kind or another, keeping most of the moral and behavioral trappings of the faith they thought they were rejecting. Meanwhile their less radical neighbors went about their lives in the serene conviction that the assumptions their culture had inherited from its Christian roots were eternally valid.

The only difficulty this posed was that a large and rapidly growing fraction of 19th-century Europeans no longer believed the core tenets of the faith that structured their lives and their thinking. It never occurred to most of them to question the value of Christian ethics, the social role of Christian institutions, or the sense of purpose and value they and their society had long derived from Christianity; straight across the spectrum of polite society, everyone agreed that good people ought to go to church, that missionaries should be sent forth to eradicate competing religions in foreign lands, and that the world would be a much better place if everybody would simply follow the teachings of Jesus, in whatever form those might most recently have been reworked for public consumption.  It was simply that a great many of them could no longer find any reason to believe in such minor details as the existence of God.

Even those who did insist loudly on this latter point, and on their own adherence to Christianity, commonly redefined both in ways that stripped them of their remaining relevance to the 19th-century world. Immanuel Kant (1724-1804), the philosopher whose writings formed the high water mark of western philosophy and also launched it on its descent into decadence, is among many other things the poster child for this effect.  In his 1793 book Religion Within The Limits of Bare Reason, Kant argued that the essence of religion—in fact, the only part of it that had real value—was leading a virtuous life, and everything else was superstition and delusion.

The triumph of Kant’s redefinition of religion was all but total in Protestant denominations, up until the rise of fundamentalism at the beginning of the 20th century, and left lasting traces on the leftward end of Catholicism as well.  To this day, if you pick an American church at random on a Sunday morning and go inside to listen to the sermon, your chances of hearing an exhortation to live a virtuous life, without reference to any other dimension of religion, are rather better than one in two.  The fact remains that Kant’s reinterpretation has almost nothing in common with historic Christianity. To borrow a phrase from a later era of crisis, Kant apparently felt that he had to destroy Christianity in order to save it, but the destruction was considerably more effective than the salvation turned out to be. Intellects considerably less acute than Kant’s had no difficulty at all in taking his arguments and using them to suggest that living a virtuous life was not the essence of religion but a suitably modern replacement for it.

Even so, public professions of Christian faith remained a social necessity right up into the 20th century.  There were straightforward reasons for this; even so caustic a critic of Christianity as Voltaire, when guests at one of his dinner parties spoke too freely about the nonexistence of God, is said to have sent the servants away and then urged his friends not to talk that way in front of them, asking, “Do you want your throats cut tonight?” Still, historians of ideas have followed the spread of atheism through the European intelligentsia from the end of the 16th century, when it was the concern of small and secretive circles, to the middle of the 18th, when it had become widespread; it spread through the middle classes over the course of the 18th century and then, in the 19th—continental Europe’s century of industrialization—reached the industrial working classes, who by and large abandoned their traditional faiths when they left the countryside to take factory jobs.

By the time Nietzsche wrote God’s epitaph, in other words, the core claims of Christianity were taken seriously only by a minority of educated Europeans, and even among the masses, atheism and secular religions such as Marxism were spreading rapidly at the expense of the older faith.  Despite this, however, habits of thought and behavior that could only be justified by the basic presuppositions of Christianity stayed welded in place throughout European society.  It was as though, to coin a metaphor that Nietzsche might have enjoyed, one of the great royal courts of the time busied itself with all the details of the king’s banquets and clothes and bedchamber, and servants and courtiers hovered about the throne waiting to obey the king’s least command, even though everyone in the palace knew that the throne was empty and the last king had died decades before.

To Nietzsche, all this was incomprehensible. The son and grandson of Lutheran pastors, raised in an atmosphere of more than typical middle-class European piety, he inherited a keen sense of the internal logic of the Christian faith—the way that every aspect of Christian theology and morality unfolds step by step from core principles clearly defined in the historic creeds of the early church. It’s not an accident that the oldest and most broadly accepted of these, the Apostles’ Creed, begins with the words
“I believe in God the Father almighty, Creator of heaven and earth.” Abandon that belief, and none of the rest of it makes any sense at all. 

This was what Nietzsche’s madman, and Nietzsche himself, were trying to bring to the attention of their contemporaries. Unlike too many of today’s atheists, Nietzsche had a profound understanding of just what it was that he was rejecting when he proclaimed the death of God and the absurdity of faith. To abandon belief in a divinely ordained order to the cosmos, he argued, meant surrendering any claim to objectively valid moral standards, and thus stripping words like “right” and “wrong” of any meaning other than personal preference.  It meant giving up the basis on which governments and institutions founded their claims to legitimacy, and thus leaving them no means to maintain social order or gain the obedience of the masses other than the raw threat of violence—a threat that would have to be made good ever more often, as time went on, to maintain its effectiveness. Ultimately, it meant abandoning any claim of meaning, purpose, or value to humanity or the world, other than those that individual human beings might choose to impose on the inkblot patterns of a chaotic universe.

I suspect that many, if not most, of my readers will object to these conclusions. There are, of course, many grounds on which such objections could be raised.  It can be pointed out, and truly, that there have been plenty of atheists whose behavior, on ethical grounds, compares favorably to that of the average Christian, and some who can stand comparison with Christian saints. On a less superficial plane, it can be pointed out with equal truth that it’s only in a distinctive minority of ethical systems—that of historic Christianity among them—that ethics start from the words “thou shalt” and proceed from there to the language of moral exhortation and denunciation that still structures Western moral discourse today.  Political systems, it might be argued, can work out new bases for their claims to legitimacy, using such concepts as the consent of the governed, while claims of meaning, purpose and value can be refounded on a variety of bases that have nothing to do with an objective cosmic order imposed on the world by its putative creator.

All this is true, and the history of ideas in the western world over the last few centuries can in fact be neatly summed up as the struggle to build alternative foundations for social, ethical, and intellectual existence in the void left behind by Europe’s gradual but unremitting abandonment of Christian faith. Yet this simply makes Nietzsche’s point for him, for all these alternative foundations had to be built, slowly, with a great deal of trial and error and no small number of disastrous missteps, and even today the work is nowhere near as solid as some of its more enthusiastic proponents seem to think. It has taken centuries of hard work by some of our species’ best minds to get even this far in the project, and it’s by no means certain even now that their efforts have achieved any lasting success.

A strong case can therefore be made that Nietzsche got the right answer, but was asking the wrong question. He grasped that the collapse of Christian faith in European society meant the end of the entire structure of meanings and values that had God as its first postulate, but thought that the only possible aftermath of that collapse was a collective plunge into the heart of chaos, where humanity would be forced to come to terms with the nonexistence of objective values, and would finally take responsibility for its own role in projecting values on a fundamentally meaningless cosmos; the question that consumed him was how this could be done. A great many other people in his time saw the same possibility, but rejected it on the grounds that such a cosmos was unfit for human habitation. Their question, the question that has shaped the intellectual and cultural life of the western world for several centuries now, is how to find some other first postulate for meaning and value in the absence of faith in the Christian God.

They found one, too—one could as well say that one was pressed upon them by the sheer force of events. The surrogate God that western civilization embraced, tentatively in the 19th century and with increasing conviction and passion in the 20th, was progress. In our time, certainly, the omnipotence and infinite benevolence of progress have become the core doctrines of a civil religion as broadly and unthinkingly embraced, and as central to contemporary notions of meaning and value, as Christianity was before the Age of Reason. 

That in itself defines one of the central themes of the predicament of our time.  Progress makes a poor substitute for a deity, not least because its supposed omnipotence and benevolence are becoming increasingly hard to take on faith just now.  There’s every reason to think that in the years immediately before us, that difficulty is going to become impossible to ignore—and the same shattering crisis of meaning and value that the religion of progress was meant to solve will be back, adding its burden to the other pressures of our time. 

Listen closely, and you can hear the sound of the gravediggers who are coming to bury progress. Next week, we’ll talk about what that implies.

Wednesday, March 20, 2013

The Illusion of Invincibility

One of the wry amusements to be had from writing a blog that routinely contradicts the conventional wisdom of our time is the way that defenders of that same conventional wisdom tend to react.  You might think that those who are repeating what most people believe would take advantage of that fact, and present themselves as the voice of the majority, speaking for the collective consensus of our time.

In the nearly seven years since I started this blog, though, the number of times that’s happened can be counted neatly on the fingers of one foot.  Instead, those who rehash the conventional wisdom of our day inevitably like to portray themselves as innovative thinkers bursting with ideas that nobody has ever thought before.  It’s those whose views most closely ape fashionable clichés culled from pop culture and the mass media, in fact, who are most likely to try to strike a pose of heroic originality, just as it’s those rare thinkers who stray from today’s popular orthodoxies who most reliably get accused of being rigid, dogmatic and closed-minded.

Quite often, for instance, I field flurries of emails and comments on my blog insisting that I really ought to consider the new and radical idea that technology can overcome the limits to growth. The latest occasion for this curious claim is a new book titled Abundance: The Future Is Better Than You Think by Peter Diamandis and Steven Kotler, which is currently benefiting from a well-funded publicity campaign featuring lavish praise from the likes of Richard Branson and Bill Gates. I haven’t read it; doubtless I’ll do so once the public library here in Cumberland gets a copy, if only to find out if the book can possibly be as full of meretricious twaddle as it looks. 

What interests me, though, is that by and large, the people who have emailed me recently invoking the book as proof that I’m wrong about the future admit that they haven’t read it either. The mere fact that somebody has insisted in print that we’re going to get a shiny high-tech future of limitless abundance seems to be enough to convince them.  That the same claim has been breathlessly retailed in print for the better part of three centuries, as of course it has, seems never to enter their minds, and when I point this out, the response is the online equivalent of a you-kicked-my-puppy look and an insistence that I ought to be more open-minded to their supposedly new ideas.

In reality, of course, it’s hard to think of any cliché in today’s pop culture more trite and hackneyed than the notion that technology always trumps resource limits. That shopworn Victorian trope very nearly defines the conventional wisdom of our age. The evidence doesn’t support such claims, for reasons I’ve discussed on this blog many times already, and claims about the future that take that notion as gospel have already proven problematic, to use no harsher word. Yet it’s as certain as anything can be that when the hullabaloo over this latest book dies down, and some new book comes out rehashing the same weary cliché, I’ll field yet another round of enthusiastic emails from people who insist that it’s saying something new and exciting that I must never have heard before.

Glance over at the other side of the conventional wisdom and you can see the same process at work. The flurries of emails and comments I get pushing the vision of a bright new future are equalled, and more than equalled, by the flurries I get insisting that I obviously haven’t heard about the exciting new notion that something or other is about to squash industrial civilization like a bug. It’s all the funnier in that these flurries continued apace during the year I spent running the End Of The World Of The Week Club, retelling the story of some failed apocalyptic prediction of the past with every single weekly post. When I point out that the people who make such claims are rehashing the oldest and most consistently mistaken of all historical clichés, in turn, I can count on fielding another flurry of angry rhetoric insisting that I need to be more open-minded about their allegedly innovative ideas.

Amusing as all this is, it’s anything but unexpected. Every human society draws a boundary between those ideas that are acceptable and those that are beyond the pale, and modern industrial civilization is no exception to this rule; it’s simply that modern industrial civilization also likes to preen itself on its supposed openness to novel ideas. The habit of pretending that repeated rehashes of the conventional wisdom are always new and innovative, no matter how many times the same things have been repeated down through the years, and insisting that ideas that challenge the conventional wisdom aren’t new and innovative, even when they are, is as good a way as any to duck the potential conflicts between these two emotionally powerful cultural themes.

Appealing as such habits might be—and they certainly help spare people the hard work of coming up with ideas that are genuinely original—they have at least one serious drawback.  If the conventional wisdom is leading straight toward disaster, and only a radically different way of looking at the world offers any hope of escaping a messy fate, a radically different way of looking at the world is exactly what you won’t get, because everybody thinks that the only way to get a radically different way of looking at the world is to keep on regurgitating the conventional wisdom that’s leading toward disaster in the first place.  It’s much as though people trapped in a burning building went around writing FIRE EXIT in bright red letters on every door that led straight back into the flames.

With these points in mind, I’d like to talk a bit about the latest attempt to rehash the conventional wisdom under the guise of rejecting it.

Over the last six weeks or so, I’ve fielded emails and comments from many sources insisting that peak oil has been disproved conclusively by the recent fracking phenomenon. This is hardly a new theme—in recent months, the same claims have been repeated almost daily at earsplitting volume in the mass media—but there’s a difference of some importance. The people who are sending these claims my way aren’t trying to claim that everything’s fine and the future of perpetual progress promised us by our culture’s most cherished mythology is on its way.  No, they’re insisting that because peak oil has been disproved, I and other peak oil writers and bloggers need to get with the program, stop talking about peak oil, and start talking about the imminent threat of climate change instead.

It’s a curious claim, all things considered.  For well over a decade now, predictions based on peak oil have proven far more accurate than predictions based on the conventional wisdom that insists resource limits don’t matter.  A decade ago, cornucopian theorist Daniel Yergin was loudly proclaiming that the price of oil had reached a permanent plateau at $38 a barrel, smart money was flooding into exciting new ethanol and biodiesel startups, and everyone other than a few peak oil writers out there on the fringes assumed as a matter of course that the market would provide, ahem, limitless supplies of energy from alternative sources if the price of oil ever did rise to the unthinkable level of $60 a barrel.

Meanwhile, those peak oil writers out there on the fringe were garnering almost universal denunciation by predicting a difficult future of triple-digit oil prices, spiraling economic dysfunction, and the failure of alternative energy technologies to provide more than a very modest fraction of the vast energetic largesse our society currently gets from fossil fuels. The conventional wisdom was that this couldn’t possibly happen. A decade on, it’s not exactly hard to see who was right.

As for the claim that the fracking phenomenon has disproved peak oil, it’s worth revisiting two graphs I’ve posted before. The first one tracks oil production in the United States between 1920 and 2012:


See the little bitty uptick over on the right hand side of the graph?  That’s the vast new outpouring of crude oil made possible by fracking technology. That’s what all the shouting and handwaving are about. I’d encourage my readers to take a long hard look at that very modest upward blip, and then turn to the second graph, which should also be familiar:


This is the diagram of peak oil from M. King Hubbert’s original 1956 paper on the subject. Those of my readers who are paying attention will already have noticed the very large area on the right hand side of the curve, more than two and a half times the size of all cumulative production and proven reserves  shown, labeled “future discoveries.”  The Bakken shale?  It’s included in there, along with many other oil fields that haven’t even been found yet.

The current fracking phenomenon, in other words, doesn’t disprove peak oil theory.  It was predicted by peak oil theory. As the price of oil rises, petroleum reserves that weren’t economical to produce when the price was lower get brought into production, and efforts to find new petroleum reserves go into overdrive; that’s all part of the theory. Since oil fields found earlier are depleting all the while, in turn, the rush to discover and produce new fields doesn’t boost overall petroleum production more than a little, or for more than a short time; the role of these new additions to productive capacity is simply to stretch out the curve, yielding the long tail of declining production Hubbert showed in his graph, and preventing the end of the age of oil from turning into the sort of sudden apocalyptic collapse imagined by one end of the conventional wisdom.
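For readers who like to see the logic laid out in numbers, here is a rough back-of-the-envelope sketch of the point just made. It is not Hubbert’s own 1956 calculation, and every figure in it is invented purely for illustration; it simply treats production as the derivative of a logistic curve, in the manner of Hubbert’s model, and then adds a smaller, later pulse to stand in for high-cost plays such as the Bakken that only come into production once prices rise.

```python
# A toy Hubbert-style model with made-up parameters, for illustration only.
import numpy as np

def hubbert_rate(t, urr, t_peak, steepness):
    """Annual production implied by a logistic cumulative-production curve.

    urr       -- ultimately recoverable resource (total area under the curve)
    t_peak    -- year of peak production
    steepness -- how sharply production rises and falls
    """
    x = np.exp(-steepness * (t - t_peak))
    return urr * steepness * x / (1.0 + x) ** 2

years = np.arange(1900, 2101)

# A large, early resource base plus a much smaller, later addition standing in
# for fields brought into production only after prices rise.
base_fields = hubbert_rate(years, urr=1000.0, t_peak=1990, steepness=0.08)
late_fields = hubbert_rate(years, urr=100.0, t_peak=2015, steepness=0.15)
combined = base_fields + late_fields

print("Peak year, base fields only:   ", years[np.argmax(base_fields)])
print("Peak year, with late additions:", years[np.argmax(combined)])
print(f"Production in 2040, base vs combined: "
      f"{base_fields[years == 2040][0]:.1f} vs {combined[years == 2040][0]:.1f}")
```

Run it and the combined peak lands within a year or so of the original one, while production decades out on the tail is only modestly higher: the late additions stretch the curve rather than repealing it.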

Pick up any decent book on peak oil, or spend ten minutes of independent research on the internet, and you can learn all of this. Somehow, though, the pundits whose heated denunciations of peak oil theory show up in the mainstream media nearly every day don’t manage to mention any of these points. It’s not the only noticeable gap in their reasoning, either:  I’ve long since lost track of the number of times I’ve seen media stories insist with a straight face that kerogen shales like the Green River formation are the same as oil-bearing shales like the Bakken, say, or duck the entire issue of depletion rates of fracked wells, or engage in other bits of evasion and misstatement that make our predicament look a great deal less challenging than it actually is.

Until recently, I’ve assumed that the failure to do basic research implied by these curious lapses was simply a product of the abysmal ignorance displayed by the media, and American society in general, concerning the important issues of our time. Still, I’ve had to rethink that, and a good part of the reason is a chart that was picked out of the mainstream media by one of the ever-vigilant Drumbeat commenters over on The Oil Drum—tip of the archdruid’s hat to Darwinian.  Here it is:


You can find this graph in various forms in quite a few places in the American media just now. You’ll notice that, at first glance, it appears to be showing domestic production of petroleum here in the US rising up inexorably to equal domestic consumption, and leaving imports far in the dust. Take another look, and you’ll see that the line tracking domestic production uses a different scale, on the right side of the chart, that just happens to make current production look three times bigger than it is.
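To see how much work that second scale is doing, here is a deliberately simplified sketch. The numbers in it are invented, not taken from the chart in question or from any official data; it merely shows how the same two data series can be made to tell very different stories depending on whether they share an axis.

```python
# Two plots of the same invented data: one with a shared axis, one with the
# dual-axis layout described above. Illustrative only; not the actual chart.
import matplotlib.pyplot as plt

years = list(range(2005, 2014))
consumption = [20.8, 20.7, 20.7, 19.5, 18.8, 19.2, 18.9, 18.5, 18.6]  # invented figures
production = [5.2, 5.1, 5.1, 5.0, 5.4, 5.5, 5.7, 6.5, 7.4]            # invented figures

fig, (shared, dual) = plt.subplots(1, 2, figsize=(10, 4))

# Honest version: both series on one scale, so the gap between them is obvious.
shared.plot(years, consumption, label="consumption")
shared.plot(years, production, label="production")
shared.set_ylim(0, 22)
shared.set_title("Shared scale")
shared.legend()

# Misleading version: production gets its own right-hand axis with a range about
# a third the size of the left-hand one, which visually inflates it threefold.
dual.plot(years, consumption)
right = dual.twinx()
right.plot(years, production, color="tab:orange")
dual.set_ylim(0, 22)
right.set_ylim(0, 8)
dual.set_title("Dual scales")

plt.tight_layout()
plt.show()
```

Both panels plot exactly the same numbers; only the axes differ.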

Perhaps some of my readers can think of an honest reason why the chart was laid out that way. I confess that I can’t. It seems uncomfortably likely, in other words, that peak oil theory has racked up another successful prediction.  It’s one that my regular readers will remember from several previous posts, including this one from last June: the opening up of a chasm between those who are willing to face the reality of our situation and those who flee from that reality into fantasy and self-deception.

That chasm runs straight through the middle of the contemporary environmental movement, very much including the subset of that movement that concerns itself with climate change.  It doesn’t run in the obvious place—say, between the techno-environmentalists who insist that everyone on the planet can have a lavish American middle class lifestyle powered by renewable energy, and the deep ecologists who see humanity as a gang of ecocidal apes yelling in triumph as they rush toward planetary dieoff. Both these extremes, and the entire spectrum of opinions between them, embrace the core presupposition that undergirds the conventional wisdom of our age.

What is that presupposition? Total faith in the invincibility of technological progress.

That’s the common thread that unites the whole spectrum of acceptable viewpoints in today’s industrial society, from the cornucopians who insist that the universe is obligated to give us all the resources we want if we just wave enough money around, through the faux-environmentalists who are out there shilling for the nuclear industry because the other options are a little bit worse, right across the landscape of ideas to the believers in imminent apocalypse and the darkest of dark green ecologists. What differentiates these viewpoints from one another is their assessment of the value of technological progress:  the cornucopians think it’s all good, the techno-environmentalists think most of it’s good, and so on along the line to those extreme neoprimitivists who have convinced themselves that the invention of spoken language was probably a bad idea.

If you want to trace the fault line mentioned above, suggest to any of them that technological progress might stop in its tracks and give way to regress, and see how they respond. Mind you, by making that suggestion you’ll put yourself on the far side of a different line, the line between those ideas that are acceptable in industrial society and those that are beyond the pale. It’s acceptable to glorify progress as a mighty steamroller that will inevitably flatten anything in its path; it’s acceptable to argue that the steamroller has to be steered onto a different course, so it doesn’t flatten something of value that’s currently in its path; it’s acceptable to rage and weep over all the things it’s going to flatten as it continues on its unstoppable way; it’s even acceptable to insist that the steamroller is so mighty a juggernaut that no one can stop it from rolling over a cliff and crashing to ruin on the rocks below.

You can say any of these things in polite company.  What you can’t say, not without meeting total incomprehension and violent hostility, is that the steamroller’s fuel gauge is swinging over inexorably toward the letter E, and the jerry cans in back are dead empty.  You can’t mention that ominous grinding sounds are coming out from under the hood, that trouble lights are flashing all over the dashboard, and that the steamroller’s forward motion is already visibly slowing down.  You can’t even suggest as a possibility that in the not too distant future, the mighty steamroller will be a rusting, abandoned hulk, buried up to its axles in mud, stripped of all usable parts by roaming scavengers, and left to the patient and pitiless wrecking crew of sun and wind and rain.

Now of course some people are saying this.  To step back out of the metaphor, they are saying that technological progress, as well as the sciences that helped to make it possible, is subject to the law of diminishing returns; furthermore, that what has been called progress is in large part a mere side effect of a short-term, self-limiting process of stripping the planet’s easily accessible carbon reserves at an extravagant pace, and will stop in its tracks and shift into reverse as those reserves run short; more broadly, that modern industrial society is in no way exempt from the common fate of civilizations.  Ideas such as these have a long and intriguing history in the modern world, and I’ll want to discuss that history here one of these days.

Still, the point I want to make just now is that until recently, those who embraced the conventional wisdom simply ignored those of us who embraced these deeply unfashionable ideas. Climate change activists, to return to the point at issue, could simply brush aside the peak oil perspective, and keep on insisting that the only thing that can stop technological progress from destroying the planet is more technological progress—biofuels, solar energy, geoengineering, you name it; plenty of technologies and their supporters vied for the lucrative role of planetary savior, but next to nobody questioned the assumption that some technology or other was going to play the part. When peak oil researchers pointed out that predictions of catastrophic climate change assumed continued increases in fossil fuel extraction at rates the planet couldn’t provide, nobody paid the least attention.

The flurry of emails and comments I’ve received of late suggests that we aren’t being ignored any longer, and I think I know why.  It’s the same reason why peak oil theory keeps on getting denounced on a daily basis in increasingly shrill tones in the mainstream media, even though nobody with access to the mainstream media is arguing in favor of it, and the reason why those denunciations have strayed further and further into what looks remarkably like overt dishonesty.  For the last decade and more, again, predictions based on peak oil theory have proven substantially correct, while predictions based on a rejection of peak oil theory have been embarrassingly wrong. For that matter, the "standard run" model from The Limits to Growth, the most savagely denounced of the Seventies-era predictions of industrial civilization’s fate, has proven to be the most accurate projection of future trends to come out of that decade. One more graph:


The further we go into the future traced out by M. King Hubbert and The Limits to Growth, and the wider the gap that opens up between the myth of perpetual progress and the realities of contraction and regress that are taking shape around us right now, the more effort the chorus of believers will likely put into drowning out dissenting voices and proclaiming the infallibility of an already disproved creed. Why this should be so, why the illusion of invincibility is so central to the myth of progress and its believers, is an intricate question, far more complex than a single paragraph or a single post can cover. To explore it, I’m going to have to plunge into one of the handful of subjects I’d originally decided to leave severely alone on this blog: the religious implications of the end of the age of oil. We’ll start that discussion next week.

Wednesday, March 13, 2013

Reinventing America

It’s been more than a year now since my posts here on The Archdruid Report veered away from the broader theme of this blog, the decline of industrial civilization, to consider the rise and impending fall of America’s global empire. That was a necessary detour, and the trends I’ve tried to explore since last February will have no small impact on the broader trajectory of our age.

It’s only in the imaginary worlds erected by madmen and politicians, after all, that the world is limited to one crisis at a time. In the real world, by contrast, multiple crises piling atop one another are the rule rather than the exception, and tolerably often it’s the pressure of immediate troubles that puts a solution to the major crises of an age out of reach. Here in America, at least, that’s the situation we face today.  The end of the industrial age, and the long descent toward the ecotechnic societies of the far future, defines the gravest of the predicaments of our time, but any action the United States might pursue to deal with that huge issue also has to cope with the less gargantuan but more immediate impacts of the end of America’s age of empire.

This latter issue has a great deal to say about what responses to the former predicament are and aren’t possible for us.  Among the minority of Americans who have woken up to the imminent twilight of the age of cheap energy, for example, far and away the most popular response is to hope that some grand technological project or other can be deployed in time to replace fossil fuels and keep what James Howard Kunstler calls “the paradise of happy motoring” rolling on into the foreseeable future. It’s an understandable hope, drawing on folk memories of the Manhattan Project and the Apollo program.  There are solid thermodynamic reasons why no such project could replace fossil fuels, but let’s set that aside for the moment, because there’s a more immediate issue here: can a post-imperial America still afford any project on that scale?

History is a far more useful guide here than the wishful thinking and cheerleader’s rhetoric so often used to measure such possibilities. What history shows, to sum up thousands of years of examples in a few words, is that empires accomplish their biggest projects early on, when the flow of wealth in from the periphery to the imperial center—the output of those complex processes I’ve termed the imperial wealth pump—is at its height, before the periphery is stripped of its movable wealth and the center has slipped too far into the inflation that besets every imperial system sooner or later.  The longer an empire lasts and the more lavish the burden it imposes on its periphery, the harder it is to free up large sums of money (or the equivalent in nonfinancial resources) for grand projects, until finally the government has to scramble to afford even the most urgent expenditures.

We’re well along that curve in today’s America. The ongoing disintegration of our built infrastructure is only one of the many problem lights flashing bright red, warning that the wealth pump is breaking down and the profits of empire are no longer propping up a disintegrating domestic economy. Most Americans, for that matter, have seen their effective standard of living decline steadily for decades. Fifty years ago, for example, many American families supported by one full time working class income owned their own homes and lived relatively comfortable lives. Nowadays?  In many parts of the country, one full time working class income won’t even keep a family off the street.

The US government’s ongoing response to the breakdown of the imperial wealth pump has drawn a bumper crop of criticism, much of it well founded.  Under most circumstances, after all, an economic policy that focuses on the mass production of imaginary wealth via the deliberate encouragement of speculative excess is not a good idea. Still, it’s only fair to point out that there really isn’t much else any US administration could do—not and survive the next election, at least. In the abstract, most Americans believe in fiscal prudence, but when any move toward fiscal prudence risks setting off an era of economic contraction that would put an end to the extravagant lifestyles most Americans see as normal, abstract considerations quickly give way.

Thus it’s a safe bet that the federal government will keep following its present course, pumping the economy full of imaginary wealth by way of the Fed’s printing presses, artificially low rates of interest, and a dizzying array of similar gimmicks, in order to maintain the illusion of abundance a little longer, and keep the pressure groups that crowd around the government feeding trough from becoming too unruly.  In the long run, it’s a fool’s game, but nobody in Washington DC can afford to think in the long run, not when their political survival depends on what happens right now.

That’s the stumbling block in the way of the grand projects that still take up so much space in the peak oil blogosphere: the solar satellites, the massive buildout of thorium reactors, the projects to turn some substantial portion of Nevada into algal biodiesel farms, or what have you. Any such project that was commercially viable would already be under construction—with crude oil hovering around $100 a barrel on world markets, remember, there’s plenty of incentive for entrepreneurs to invest in new energy technologies. Lacking commercial viability, in turn, such a project would have to find ample funding from the federal government, and any such proposal runs into the hard fact that every dollar that rolls off the Fed’s printing presses has a pack of hungry pressure groups baying for it already.

It’s easy to insist that solar satellites are more important than, say, jet fighters, the Department of Education, or some other federal program, and in a good many cases, this insistence is probably true.  On the other hand, jet fighters, the Department of Education, and other existing federal programs have large and politically savvy constituencies backing them, which are funded by people whose livelihoods depend on those programs, and which have plenty of experience putting pressure on Congress and the presidency if their pet programs are threatened. It’s easy to insist, in turn, that politicians ought to ignore such pressures, but those who want to survive the next election don’t have that luxury—and if they did make it a habit to ignore pressure from their constituents, where would that leave the people who want to lobby for solar satellites, thorium reactors, or the like?

Meanwhile the broader economic basis that could make a buildout of alternative energy technologies possible has mostly finished trickling away. The United States is a prosperous country on paper, because the imaginary wealth manufactured by government and the financial industry alike still finds buyers who are willing to gamble that business as usual will continue for a while longer. Mind you, the government’s paper wealth is finding a dwindling supply of takers these days. Most treasury bills are currently being bought by the Fed, and while any number of reasons have been cited for this policy, I’ve come to suspect that most of what’s behind it is the simple fact that most other potential buyers aren’t interested.

If the law of supply and demand were to come into play, interest rates on treasury bills would have to rise as the pool of buyers shrank. That’s not something any US government can afford—the double whammy of a major recession and a sharp rise in the cost of financing the national debt would almost certainly trigger the massive economic and political crisis both parties are desperately trying to avoid. Instead, the torrent of paper liquidity allows the same thing to happen more slowly and less visibly, as creditor nations take their shares of that torrent and use it to outbid the United States in the increasingly unruly global scramble for what’s left of the planet’s fossil fuels and other nonrenewable resources.

A great many people are wondering these days when the resulting bubble in US paper wealth—for that’s what it is, of course—is going to pop.  That might still happen, especially as a side effect of a sufficiently sharp political or military crisis, but it’s also possible that the trillions of dollars in imaginary wealth that currently prop up America’s domestic economy could trickle away more gradually, by way of stagflation or any of the other common forms of prolonged economic dysfunction.  We could, in other words, get the kind of massive crisis that throws millions of people out of work and erases the value of trillions of dollars of paper wealth in a matter of months; we could equally well get the more lengthy  and less visible kind of crisis, in which every year that passes sees an ever larger fraction of the population driven out of the work force, an ever larger fraction of the nation’s wealth reduced to paper that would be worth plenty if only anybody were willing and able to buy it, and an ever larger part of the nation itself turning visibly into one more impoverished and misgoverned Third World nation.

Either way, the economic unraveling is bound to end in political crisis. Take a culture that assumes an endlessly rising curve of prosperity, and put it in a historical setting that puts that curve forever out of reach, and sooner or later an explosion is going to happen. A glance back at the history of Communism makes a good reminder of what happens in the political sphere when rhetoric and reality drift too far apart, and the expectations cultivated by a political system are contradicted daily by the realities its citizens have to face. As the American dream sinks into an American nightmare of metastatic poverty, disintegrating infrastructure, and spreading hopelessness, presided over by a baroque and dysfunctional bureaucratic state that prattles about freedom while loudly insisting on its alleged constitutional right to commit war crimes against its own citizens, scenes like the ones witnessed in a dozen eastern European capitals in the late 20th century are by no means unthinkable here.

Whether or not the final crisis takes that particular form or some other, it’s a safe bet that it will mark the end of what, for the last sixty years or so, has counted as business as usual here in the United States. As discussed in an earlier post in this series, this has happened many times before. It’s as old as democracy itself, having been chronicled and given a name, anacyclosis, in ancient Greece.  Three previous versions of the United States—call them Colonial America, Federal America, and Gilded Age America—each followed the same trajectory toward a crisis all too familiar from today’s perspective.  Too much political power diffusing into the hands of pressure groups with incompatible agendas, resulting in gridlock, political failure, and a collapse of legitimacy that in two cases out of three could only be repaired the hard way, on the battlefield: we’re most of the way there this time around, too, as Imperial America follows its predecessors toward the recycle bin of history.

Our fourth trip around the track of anacyclosis may turn out to be considerably more challenging than the first three, though, partly for reasons already explored in this sequence of posts, and partly due to another factor entirely. The reasons discussed before are the twilight of America’s global empire and the end of the age of cheap abundant energy, both of which guarantee that whatever comes out of this round of anacyclosis will have to get by on much less real wealth than either of its two most recent predecessors. The reason I haven’t yet covered is a subtler thing, but in some ways even more potent.

The crises that ended Colonial America, Federal America, and Gilded Age America all happened in part because a particular vision of what America was, or could be, was fatally out of step with the times, and had to be replaced. In two of the three cases, there was another vision already in waiting: in 1776, a vision of an independent republic embodying the ideals of the Enlightenment; in 1933, a vision of a powerful central government using its abundant resources to dominate the world while, back at home, embodying the promises of social democracy. (Not, please note, socialism; socialism is state ownership of the means of production, social democracy is the extension of democratic ideals into the social sphere by means of government social welfare programs. The two are not the same, and it’s one of the more embarrassing intellectual lapses of today’s American pseudoconservatism that it so often tries to pretend otherwise.)

In the third, in 1860, there were not one but two competing visions in waiting: one that drew most of its support from the states north of the Mason-Dixon line, and one that drew most of its support from those south of it. What made the conflicts leading up to Fort Sumter so intractable was precisely that the question wasn’t simply a matter of replacing a failed ideal with one that might work, but deciding which of two new ideals would take its place. Would the United States become an aristocratic, agrarian society fully integrated with the 19th century’s global economy and culture, like the nations further south between the Rio Grande and Tierra del Fuego, or would it go its own way, isolating itself economically from Europe to protect its emerging industrial sector and decisively rejecting the trappings of European aristocratic culture?  The competing appeal of the two visions was such that it took four years of war to determine that one of them would triumph across a united nation.

Our situation in the twilight years of Imperial America is different still, because a vision that might replace the imperial foreign policy and domestic social democracy of 1933 has yet to take shape. The image of America welded into place by Franklin Roosevelt during the traumatic years of the Great Depression and the Second World War still guides both major parties—the Republicans, for all their eagerness to criticize Roosevelt’s legacy, have proven themselves as quick to use federal funds to pursue social agendas as any Democrat, while the Democrats, for all their lip service to the ideals of world peace and national self-determination, have proven themselves as eager to throw America’s military might around the globe as any Republican.

Both sides of the vision of Imperial America depended utterly on access to the extravagant wealth that America could get in 1933, partly from its already substantial economic empire in Latin America, partly from the even more substantial "empire of time" defined by Appalachia’s coal mines and the oilfields of Pennsylvania and Texas. Both those empires are going away now, and everything that depends on them is going away with equal inevitability—and yet next to nobody in American public life has begun to grapple with the realities of a post-imperial and post-industrial America, in which debates over the fair distribution of wealth and the extension of national power overseas will have to give way to debates over the fair distribution of poverty and the retreat of national power to the borders of the United States and to those few responsibilities the constitution assigns to the federal government.

We don’t yet have the vision that could guide that process. I sometimes think that such a vision began to emerge, however awkwardly and incompletely, in the aftermath of the social convulsions of the 1960s.  During the decade of the 1970s, between the impact of the energy crisis, the blatant failure of the previous decade’s imperial agendas in Vietnam and elsewhere, and the act of collective memory that surrounded the nation’s bicentennial, it became possible for a while to talk publicly about the values of simplicity and self-sufficiency, the strengths of local tradition and memory, and the worthwhile things that were lost in the course of America’s headlong rush to empire.

I’ve talked elsewhere about the way that this nascent vision helped guide the first promising steps toward technologies and lifestyles that could have bridged the gap between the age of cheap abundant energy and a sustainable future of relative comfort and prosperity.  Still, as we know, that’s not what happened; the hopes of those years were stomped to a bloody pulp by the Reagan counterrevolution, Imperial America returned with a vengeance, and stealing from the future became the centerpiece of a bipartisan consensus that remains welded into place today.

Thus one of the central tasks before Americans today, as our nation’s imperial age stumbles blindly toward its end, is that of reinventing America: that is, of finding new ideals that can provide a sense of collective purpose and meaning in an age of deindustrialization and of economic and technological decline.  We need, if you will, a new American dream, one that doesn’t require promises of limitless material abundance, one that doesn’t depend on the profits of empire or the temporary rush of affluence we got by stripping a continent of its irreplaceable natural resources in a few short centuries.

I think it can be done, if only because it’s been done three times already.  For that matter, the United States is far from the only nation that’s had to find a new meaning for itself in the midst of crisis, and a fair number of other nations have had to do it, as we will, in the face of decline and the failure of some extravagant dream.  Nor will the United States be the only nation facing such a challenge in the years immediately ahead:  between the tectonic shifts in geopolitics that will inevitably follow the fall of America’s empire, and the far greater transformations already being set in motion by the imminent end of the industrial age, many of the world’s nations will have to deal with a similar work of revisioning.

That said, nothing guarantees that America will find the new vision it needs, just because it happens to need one, and it’s very late in the day.  Those of us who see the potential, and hope to help fill it, will have to get a move on.

Wednesday, March 06, 2013

The Hard Road Ahead

The latest round of political theater in Washington DC over the automatic budget cuts enacted in the 2011 debt ceiling compromise—the so-called “sequester”—couldn’t have been better timed, at least as far as this blog is concerned. It’s hard to imagine better evidence, after all, that the American political process has finally lost its last fingernail grip on reality.

Let’s start with the basics. Despite all the bellowing on the part of politicians, pressure groups, and the media, the cuts in question total only 2.3% of the US federal budget.  They thus amount to a relatively modest fraction of the huge increases in federal spending that have taken place over the last decade or so. (I sincerely doubt that those of my readers who were in the US in 2003 noticed any striking lack of federal dollars being spent then.) In the same way, those who protested the “tax increases” at the beginning of this year by and large failed to mention that the increases in question were simply the expiration of some—by no means all—of the big tax cuts enacted a little over a decade ago in the second Bush administration.

At a time when the United States is spending hundreds of billions of dollars a year it doesn’t happen to have, and making up the difference by spinning the printing presses at ever-increasing speeds,  a strong case can be made that rolling back spending increases and giving up tax breaks are measures that deserve serious consideration.  Any such notion, though, is anathema to most Americans these days, at least to the extent that it might affect them. Straight across the convoluted landscape of contemporary American political opinion, to be sure, you can count on an enthusiastic hearing if you propose that budget cuts ought to be limited to whatever government payouts don’t happen to benefit your audience.  Make even the most timid suggestion that your audience might demand a little bit less for itself, though, and your chances of being tarred, feathered, and ridden out of town on a rail are by no means small.

The only consensus to be found about budget cuts in today’s America, in other words, is the belief that someone else ought to take the hit. As politicians in Washington DC try to sort out which of the many groups clamoring for handouts get how many federal dollars, that consensus isn’t exactly providing them useful guidance. I’ve wondered more than once if the whole sequestration business is a charade, crafted by the leadership of both parties and tacitly accepted by the rank and file in Congress, that permits them to impose roughly equivalent budget cuts on as many federal programs as they think they can get away with, while giving each party enough plausible deniability that they can still manage to keep blaming everything on the other side. If so, it’s an ingenious stratagem; the real challenge will come when Congress runs out of gimmicks of this kind and has to admit to the throng of needy, greedy pressure groups crowding close around the feeding trough that the gravy train has come to an end.

That latter detail is the one piece of news you won’t hear anywhere in the current uproar.  It’s also the one piece of news that has to be understood in order to make sense of American politics in the present and the near future. When the economics of empire start running in reverse, as they do in the latter years of every empire, familiar habits of extravagance that emerged during the glory days of the empire turn into massive liabilities, and one of the most crucial tasks of every empire in decline is finding some way to cut its expenses down to size. There are always plenty of people who insist that this isn’t necessary, and plenty more who are fine with cutting all expenditures but those that put cash in their own pocket; the inertia such people generate is a potent force, but eventually it gives way, either to the demands of national survival, or to the even more unanswerable realities of political, economic, and military collapse.

Between the point when a nation moves into the penumbra of crisis, and the point when that crisis becomes an immediate threat to national survival, there’s normally an interval when pretense trumps pragmatism and everyone in the political sphere goes around insisting that everything’s all right, even though everything clearly is not all right. In each of the previous cycles of anacyclosis in American history, such an interval stands out: the years leading up to the Revolutionary War, when leaders in the American colonies insisted that they were loyal subjects of good King George and the little disagreements they had with London could certainly be worked out; the bitter decade of the 1850s, when one legislative compromise after another tried to bandage over the widening gulf between slave states and free states, and succeeded only in making America’s bloodiest war inevitable; the opening years of the Great Depression, when the American economy crashed and burned as politicians and pundits insisted that everything would fix itself shortly.

We’re in America’s fourth such interval.  Like the ones that preceded it, it’s a time when the only issues that really matter are the ones that nobody in the nation’s public life is willing to talk about, and when increasingly desperate attempts to postpone the inevitable crisis a little longer have taken the place of any less futile pursuit. The first such interval ran from the end of the Seven Years War in 1763 to the first shots at Lexington in 1775; the second, from the Compromise of 1850 to the bombardment of Fort Sumter in 1861; the third, the shortest to date, from the stock market crash of 1929 to the onset of the New Deal in 1933.  How long this fourth interval will last is anyone’s guess at present; my sense, for what it’s worth, is that historians in the future will probably mark the crash of 2008 as its beginning, and I would be surprised to see it last out the present decade before crisis hits.

During the interval before the explosion, if history is any guide, the one thing nobody will be able to get out of the federal government is constructive action on any of the widening spiral of problems and predicaments facing the nation. That’s the cost of  trying to evade a looming crisis:  the effort that’s required to keep postponing the inevitable, and the increasing difficulty of patching together a coalition between ever more divergent and fractious power centers, puts any attempt to deal with anything else out of reach. The decade before the Civil War is as good an example as any; from 1850 until the final explosion, on any topic you care to name, there was a Northern agenda and a Southern agenda, and any attempt to get anything done in Washington DC ran headlong into ever more tautly polarized sectional rivalry. Replace the geographical labels with today’s political parties, and the scenery’s all too familiar.

If there’s going to be a meaningful response to the massive political, economic, and social impacts of the end of America’s age of empire, in other words, it’s not going to come from the federal government. It probably isn’t going to come from state governments, either.  There’s a chance that a state here and there may be able to buck the trend and do something helpful, but most US state governments are as beholden to pressure groups as the federal government, and are desperately short of discretionary funds as a result.  That leaves local governments, local community groups, families and individuals as the most likely sources of constructive change—if, that is, enough people are willing to make “acting locally” something more than a comforting slogan.

This is where the dysfunctional but highly popular form of protest politics critiqued in an earlier post in this sequence becomes a major obstacle to meaningful change, rather than a vehicle for achieving it. As that critique showed, protest is an effective political tool when it’s backed up by an independent grassroots organization, one that can effectively threaten elected officials—even those of the party its members normally support—with removal from office if those officials don’t pay attention to the protest.  When that threat isn’t there, protest is toothless, and can be ignored.

That distinction remains relevant, since very few of the groups gearing up to protest these days have taken the time and invested the resources to build the kind of grassroots support that gives a protest teeth.  Yet there’s another way that protest politics can become hopelessly dysfunctional, and that’s when what the protesters demand is something that neither the officials they hope to influence, nor anyone else in the world, can possibly give them.

If current attitudes are anything to judge by, we’re going to see a lot of that in the years immediately ahead. The vast majority of Americans are committed to the belief that the lavish wealth they enjoyed in the last half dozen decades is normal, that they ought to be able to continue to enjoy that wealth and all the perks and privileges it made possible, and that if the future looming up ahead of them doesn’t happen to contain those things, somebody’s to blame.  Try to tell them that they grew up during a period of absurd imperial extravagance, and that this extravagance and everything connected with it is going to go away forever in the near future, and you can count on getting a response somewhere on the spectrum that links blank incomprehension and blind rage.

The incomprehension and the rage will doubtless drive any number of large and vocal protest movements in the years immediately ahead, and it’s probably not safe to assume that those movements will limit themselves to the sort of ineffectual posturing that featured so largely in the Occupy protests a couple of years back. It’s all too easy, in fact, to imagine the steps by which armed insurgents, roadside bombs, military checkpoints, and martial law could become ordinary features of daily life here in America, and the easy insistence that everything that’s wrong with the country must be the fault of some currently fashionable scapegoat or other is to my mind one of the most important forces pushing in that direction.

Right now, the US government is one of those fashionable scapegoats.  The pornography of political fear that plays so large a role in American public discourse these days feeds into this habit. Those people who spent the eight years of the second Bush administration eagerly reading and circulating the meretricious claims that Bush was about to impose martial law and military tyranny on the US, and their exact equivalents on the other end of the political spectrum who are making equally dishonest claims about Obama right now, are helping to feed the crisis of legitimacy I’ve discussed in several posts here. The habit that Carl Jung described as “projecting the shadow”—insisting, that is, that all your own least pleasant traits actually belong to whoever you hate most—has a great deal to do with the spread of that mood. I’ve wondered more than once if there might be more to it than that, though.

It’s hard to think of anything that would give more delight to America’s rivals on the world stage, or play out more to their advantage, than a popular insurgency against the US government on American soil. Even if it were crushed, as it likely would be, such a rising would shred what’s left of the American economy, cripple the ability of the US to intervene outside its borders, and yield a world-class propaganda coup to any nation tired of the US government’s repeated posturing over issues of human rights. Funding antigovernment propaganda here in the United States without getting caught would be easy enough to do, and plenty of hostile governments might find it a gamble worth taking. I find myself suspecting at times that this might be what’s behind the remarkable way that American public life has become saturated with propaganda insisting that the current US system of government is evil incarnate, and that any replacement whatsoever would necessarily be an improvement.

Now of course that last notion is a common opinion in revolutionary eras; equally common is the discovery that as bad as the status quo might happen to be, its replacement can be much, much worse.  Those who witnessed the French and Russian revolutions, to name only two examples, got to find that out the hard way. It would be helpful, to use no stronger word, to avoid a repeat of that same unpleasant object lesson in the postimperial United States.  As long as Americans keep on trying to convince themselves that the limits to growth don’t matter, the profits of empire never came their way, and the reckless extravagance that American popular culture considers basic to an ordinary lifestyle is no more than their due, steering clear of some such outcome is going to be a very tricky proposition indeed.

It would be helpful, in other words, if more Americans were to come to terms with the  fact that deciding what kind of future they want, and then insisting at the top of their lungs that they ought to have it, is not a useful response.  Instead, it’s going to be necessary to start by thinking, hard, about the kind of futures a postimperial, postpetroleum America might be able to afford, and then trying to make the best possible choice among the available options. Making such a choice, in turn, will be made much easier once we have some practical experience of the way the various options work out in the real world—and this brings us back again to the question of local action.

Nobody knows what political, economic, and cultural forms will be best suited to thrive in the wake of America’s failed empire, or to deal with the broader consequences as the industrial world stumbles down the long, ragged slope toward the deindustrial world of the future.  Plenty of people think they know; there’s no shortage of abstract ideologies proclaiming the one true path to a supposedly better future; but betting the future on an untested theory or, worse, on a theory that’s failed every time it was put to the test, is not exactly a useful habit.

What’s needed instead, as the United States stumbles toward its fourth great existential crisis, is the broadest possible selection of options that have been shown to work. This is where local communities and community groups can play a critical role, for it’s precisely on the local scale that options can be tested, problems identified and fixed, and possibilities explored most easily.  Furthermore, since the whole country isn’t committed to any one response, options tested in different places can be compared with one another, and the gaudy rhetoric of triumphalism that fills so much space online and off—how many projects, dear reader, have you seen hailed as the one and only definitive answer to the crisis of our time, without the least bit of evidence to show that it actually works?—can be set aside in favor of straightforward demonstrations that a given option can do what it’s supposed to do.

In an earlier post in this sequence, for example, I discussed some of the possibilities that might come out of a revival of traditional democratic process.  The simplest and most effective way to launch such a revival would be by way of existing community groups, which very often retain the remnants of democratic process in their organizational structure, or in newly founded groups using democratic principles.  These groups would then become training grounds from which people who had learned the necessary skills could proceed to such other venues as local government, the organization of new political parties, or what have you, and put those skills to good use.

The same principle applies to almost any other aspect of our collective predicament you care to name. Whether the issue that needs a meaningful response is the impending shortage of energy and other resources, the increasingly unstable climate, the disintegration of an economy in which accounting fraud is nearly the only growth industry left, or anything else on down the list, the scale of the problem is clear but the details are murky, and the best way to deal with it remains shrouded in blackest night. For that matter, there’s no way to be sure that the response that works best in one place will be equally well suited to conditions elsewhere. Tackle the issues locally, trying out various options and seeing how well they work, and the chances of hitting on something useful go up sharply.

It will doubtless be objected that we don’t have time for any such program of trial and error. Quite the contrary, we no longer have time for anything else. Spinning grand theoretical programs, waiting for the improbable circumstances that might possibly lead to their being adopted on a national or global scale, and hoping that they work as advertised if they ever do get put to the test, is a luxury best suited to those eras when crisis is still comfortably far off in the future.  We don’t live in such an era, in case you haven’t noticed.

Over the decades ahead, the people of the United States and the rest of the industrial world are going to have to deal with the unraveling of an already declining American global empire, the end of a global economic order dominated by the dollar and thus by America’s version of the imperial wealth pump, the accelerating depletion of a long list of nonrenewable resources, and the shattering impact of rapid climate change, just for starters. If history is any guide, the impact of those already inevitable crises will likely be compounded by wars, revolutions, economic crises, and all the other discontinuities that tend to crop up when one global order gives way to another.  It’s going to be a very rough road—quite probably at least as rough as the road the world had to travel between 1914 and 1954, when the end of Britain’s global empire brought the long peace of 19th-century Europe to a messy close and unleashed a tidal wave of radical change and human blood.

Equally, the hard road ahead will likely be comparable in its scope and impacts to the harrowing times brought by America’s first three rounds of anacyclosis. To live through the Revolutionary War, the Civil War, or the Great Depression was not an easy thing; those of my readers who are curious about what might be ahead could probably do worse than to read a good history of one or more of those eras, or one of the many firsthand accounts penned by those who experienced them and lived to tell the tale. The records of such times do not give any noticeable support to the claim that we can have whatever kind of future we want. The kind of hope they do hold out is a point I plan on discussing next week.