Wednesday, July 31, 2013

On the Far Side of Progress

The pointless debates over evolution discussed in last week’s Archdruid Report post have any number of equivalents all through contemporary industrial culture. Pick a topic, any topic, and it’s a pretty safe bet that  the collective imagination defines it these days as an irreconcilable divide between two and only two points of view, one of which is portrayed as realistic, reasonable, progressive, and triumphant, while the other is portrayed as sentimental, nostalgic, inaccurate, and certain to lose—that is to say, as a microcosm of the mythology of progress.

According to that mythology, after all, every step of the heroic onward march of progress came about because some bold intellectual visionary or other, laboring against the fierce opposition of a majority of thinkers bound by emotional ties to outworn dogmas, learned to see the world clearly for the first time, and in the process deprived humanity of some sentimental claim to a special status in the universe. That’s the way you’ll find the emergence of the theory of evolution described in textbooks and popular nonfiction to this day.  Darwin’s got plenty of company, too:  all the major figures of the history of science from Copernicus through Albert Einstein get the same treatment in popular culture. It’s a remarkably pervasive bit of narrative, which makes it all the more remarkable that, as far as history goes, it’s essentially a work of fiction. 

I’d encourage those of my readers who doubt that last point to read Stephen Jay Gould’s fascinating book Time’s Arrow, Time’s Cycle. Gould’s subject is the transformation in geology that took place in the late 18th and early 19th centuries, when theories of geological change that centered on Noah’s flood gave way to the uniformitarian approach that’s dominated geology ever since. Pick up a popular book on the history of earth sciences, and you’ll get the narrative I’ve just outlined:  the role of nostalgic defender of an outworn dogma is assigned to religious thinkers such as Thomas Burnet, while that of heroic pioneer of reason and truth is conferred on geologists such as James Hutton.

What Gould demonstrates in precise and brutal detail is that the narrative can be imposed on the facts only by sacrificing any claim to intellectual honesty.  It’s simply not true, for example, that Burnet dismissed the evidence of geology when it contradicted his Christian beliefs, or that Hutton reached his famous uniformitarian conclusions in a sudden flash of insight while studying actual rock strata—two claims that have been endlessly repeated in textbooks and popular literature. More broadly, the entire popular history of uniformitarian geology amounts to a “self-serving mythology”—those are Gould’s words, not mine—that’s flatly contradicted by every bit of the historical evidence.

Another example? Consider the claim, endlessly regurgitated in textbooks and popular literature about the history of astronomy, that the geocentric theory—the medieval view of things that put the Earth at the center of the solar system—assigned humanity a privileged place in the cosmos. I don’t think I’ve ever read a popular work on the subject that didn’t include that factoid. It seems plausible enough, too, unless you happen to know the first thing about medieval cosmological thought.

The book to read here is The Discarded Image by C.S. Lewis—yes, that C.S. Lewis; the author of the Narnia books was also one of the most brilliant medievalists of his day, and the author of magisterial books on medieval and Renaissance thought. What Lewis shows, with a wealth of examples from the relevant literature, is that nobody in the Middle Ages thought of the Earth’s position as any mark of privilege, or for that matter as centrally placed in the universe. To the medieval mind, the Earth was one notch above the rock bottom of the cosmos, a kind of grubby suburban slum built on the refuse dump outside the walls of the City of Heaven. Everything that mattered went on above the sphere of the Moon; everything that really mattered went on out beyond the sphere of the fixed stars, where God and the angels dwelt.

The one scrap of pride left to fallen humanity was that, even though it was left to grub for a living on the dungheap of the cosmos, it hadn’t quite dropped all the way to the very bottom. The very bottom was Hell, with Satan trapped at its very center; the Earth was a shell of solid matter that surrounded Hell, the same way that the sphere of the Moon surrounded that of Earth, the sphere of Mercury that of the Moon, and so on outwards to Heaven.  Physically speaking, in other words, the medieval cosmos was diabolocentric, not geocentric—again, the Earth was merely one of the nested spheres between the center and the circumference of the cosmos—and the physical cosmos itself was simply an inverted reflection of the spiritual cosmos, which had God at the center, Satan pinned immovably against the outermost walls of being, and the Earth not quite as far as you could get from Heaven.

Thus the Copernican revolution didn’t deprive anybody of a sense of humanity’s special place in the cosmos; quite the contrary, eminent thinkers at the time wondered if it wasn’t arrogant to suggest that humanity might be privileged enough to dwell in what, in the language of the older cosmology, was the fourth sphere up from the bottom! It takes only a little leafing through medieval writings to learn that, but the fiction that the medieval cosmos assigned humanity a special place until Copernicus cast him out of it remains glued in place in the conventional wisdom of our time. When the facts don’t correspond to the mythology of progress, in other words, too bad for the facts.

Other examples could be multiplied endlessly, starting with the wholly fictitious flat-earth beliefs that modern writers insist on attributing to the people who doubted Columbus, but these will do for the moment, not least because one of the authors I’ve cited was one of the 20th century’s most thoughtful evolutionary biologists and the other was one of the 20th century’s most thoughtful Christians. The point I want to make is that the conventional modern view of the history of human thought is a fiction, a morality play that has nothing to do with the facts of the past and everything to do with justifying the distribution of influence, wealth, and intellectual authority in today’s industrial world. That’s relevant here because the divide sketched out at the beginning of this essay—the supposedly irreconcilable struggles between a way of knowing the world that’s realistic, progressive and true, and a received wisdom that’s sentimental, nostalgic, and false—is modeled on the narrative we’ve just been examining, and has no more to do with the facts on the ground than the narrative does.

The great difference between the two is that neither medieval cosmographers nor late 18th century geologists had the least notion that they were supposed to act out a morality play for the benefit of viewers in the early 21st century. Here in the early 21st century, by contrast, a culture that's made the morality play in question the center of its collective identity for more than three hundred years is very good at encouraging people to act out their assigned roles in the play, even when doing so flies in the face of their own interests.  Christian churches gain nothing, as I pointed out in last week's post, by accepting the loser's role in the ongoing squabble over evolution, and the huge amounts of time, effort, and money that have gone into the creationist crusade could have been applied to something relevant to the historic creeds and commitments of the Christian religion, rather than serving to advance the agenda of their enemies. That this never seems to occur to them is a measure of the power of the myth.

Those of my readers who have an emotional investment in the environmental movement might not want to get too smug about the creationists, mind you, because their own movement has been drawn into filling exactly the same role, with equally disastrous consequences.  It’s not just that the media consistently likes to portray environmentalism as a sentimental, nostalgic movement with its eyes fixed on an idealized prehuman or pretechnological past, though of course that’s true. A great many of the public spokespersons for environmental causes also speak in the same terms, either raging against the implacable advance of progress or pleading for one or another compromise in which a few scraps are tossed nature’s way as the engines of progress go rumbling on. 

According to the myth of progress, those are the sort of speeches that are assigned to the people on history’s losing side, and environmentalists in recent decades have done a really impressive job of conforming to the requirements of their assigned role.  When was the last time, for example, that you heard an environmentalist offer a vision of the future that wasn’t either business as usual with a coat of green spraypaint, a return to an earlier and allegedly greener time, or utter catastrophe?  As recently as the 1970s, it was quite common for people in the green end of things to propose enticing visions of a creative, sustainable, radically different future in harmony with nature, but that habit got lost in the next decade, about the time the big environmental lobbies sold out to corporate America.

Now of course once a movement redefines its mission as begging for scraps from the tables of the wealthy and influential, as mainstream environmentalism has done, it’s not going to do it any good to dream big dreams. Still, there’s a deeper pattern at work here.  The myth of progress assigns the job of coming up with bold new visions of the future to the winning side—which means in practice the side that wins the political struggle to get its agenda defined as the next step of progress—and assigns to the losing side instead the job of idealizing the past and warning about the dreadful catastrophes that are sure to happen unless the winners relent in their onward march. Raise people to believe implicitly in a social narrative, and far more often than not they’ll fill their assigned roles in that narrative, even at great cost to themselves, since the alternative is a shattering revaluation of all values in which the unthinking certainties that frame most human thought have to be dragged up to the surface and judged on their own potentially dubious merits.

Such a revaluation, though, is going to happen anyway in the not too distant future, because the onward march of progress is failing to live up to the prophecies that have been made in its name.  As noted in an earlier post in this sequence, civil religions are vulnerable to sudden collapse because their kingdom is wholly of this world; believers in a theist religion can console themselves in the face of continual failure with the belief that their sufferings will be amply repaid in heaven, but the secular worldview common to civil religions slams the door in the face of that hope.

The civil religion of Communism thus imploded when it became impossible for people on either side of the Iron Curtain to ignore the gap between prophecy and reality, and I’ve argued in an earlier series of posts that there’s good reason to think that the civil religion of Americanism may go the same way in the decades ahead of us. The civil religion of progress, though, is at least as vulnerable to that species of sudden collapse. So far, the suggestion that progress might be over for good is something you’ll encounter mostly in edgy humor magazines and the writings of intellectual heretics far enough out on the cultural fringes to be invisible to the arbiters of fashion; so far, “they’ll think of something” remains the soothing mantra du jour of the true believers in the great god Progress.

Nonetheless, history points up the reliability with which one era’s unquestioned truths become the next era’s embarrassing memories.  To return to a point raised earlier in this sequence, the concept of progress has no content of its own, and so it’s been possible so far for believers in progress to pretend to ignore all the things in American life that are blatantly retrogressing, and to keep scrabbling around for something, anything, that will still prop up the myth. In today’s America, living standards for most people have been falling for decades, along with literacy rates and most measures of public health; the nation’s infrastructure has been ravaged by decades of malign neglect, its schools are by most measures the worst in the industrial world, and even the most basic public services are being cut to Third World standards or below; the lunar landers scattered across the face of the Moon stare back blindly at a nation that no longer has a manned space program at all and, despite fitful outbursts of rhetoric from politicians and the idle rich, almost certainly will never have one again. None of that matters—yet.

Another of the lessons repeatedly taught by history, though, is that sooner or later these things will matter.  Sooner or later, some combination of events will push cognitive dissonance to the breaking point, and the civil religion of progress will collapse under the burden of its own failed prophecies. That’s almost unthinkable for most people in the industrial world these days, but it’s crucial to recognize that the mere fact that something is unthinkable is no guarantee that it won’t happen.

Thus it’s important for those of us who want to be prepared for the future to try to think about the unthinkable—to come to terms with the possibility that the future will see a widespread rejection of the myth of progress and everything connected to it. That wasn’t a likely option in an age when economic expansion and rapid technological development were everyday facts of life, but we no longer live in such an age, and the fading memories of the last decades when those things happened will not retain their power indefinitely. Imagine a future America where the available resources don’t even suffice to maintain existing technological systems, only the elderly remember sustained economic growth, and the new technological devices that still come onto the market now and then are restricted to the very few who are wealthy enough to afford them. At what point along that curve do the promises of progress become so self-evidently absurd that the power of the civil religion of progress to shape thought and motivate behavior breaks down completely?

It’s ironic but entirely true that actual technological progress could continue, at least for a time, after the civil religion of progress is busy pushing up metaphorical daisies in the cemetery of dead faiths. What gives the religion of progress its power over so many minds and hearts is not progress itself, but the extraordinary burden of values and meanings that progress is expected to carry in our society.  It’s not the mere fact that new technologies show up in the stores every so often that matters, but the way that this grubby commercial process serves to bolster a collective sense of entitlement and a galaxy of wild utopian dreams about the human future. If the sense of entitlement gives way to a sense of failure or, worse, of betrayal, and the dreamers wake up and recognize that the dreams were never anything more than pipe dreams in the first place, the backlash could be one for the record books.

One way or another, the flow of new products will eventually sputter to a halt, though at least some of today’s technologies will stay in use for as long as they can be kept functioning in the harsh conditions of an age of resource scarcity and ecological payback. A surprisingly broad range of technologies can be built and maintained by people who have little or no grasp of the underlying science, and thus it has happened more than once—as with the Roman aqueducts that brought water to medieval cities—that a relatively advanced technology can be kept running for centuries by people who have no clue how it was built. Over the short and middle term, in a world after progress, we can probably expect many current technologies to remain in place for a while, though it’s an open question how many people in America and elsewhere will still be able to afford to use them for how much longer.

Ultimately, that last factor may be the Achilles’ heel of most modern technologies.  In the not too distant future, any number of projects that might be possible in some abstract sense will never happen, because all the energy, raw materials, labor, and money that are still available are already committed twice over to absolute necessities, and nothing can be spared for anything else. In any age of resource scarcity and economic contraction, that’s a fairly common phenomenon, and it’s no compliment to contemporary thinking about the future that so many of the grand plans being circulated in the sustainability scene ignore the economics of contraction so completely.

Still, that’s a theme for a different post. The point I want to raise here has to do with the consequences of a collective loss of faith in the civil religion of progress—consequences that aren’t limited to the realm of technology, but spill over into economics, politics, and nearly every other dimension of contemporary life. The stereotyped debates introduced at the beginning of this post and discussed in more detail toward the middle will be abandoned, and their content will have to be reframed in completely different terms, once the myth of progress, which provides them with their basic script, loses its hold on the collective imagination. The historical fictions also discussed earlier will be up for the same treatment. It’s hard to think of any aspect of modern thought that hasn’t been permeated by the myth of progress, and when that myth shatters and has to be replaced by other narratives, an extraordinary range of today’s unquestioned certainties will be up for grabs.

That has implications I plan on exploring in a number of future posts. Some of the most crucial of those implications, though, bear directly on one of the core institutions of contemporary industrial culture, an institution that has derived much of its self-image and a galaxy of benefits from the historical fictions and stereotyped debates discussed earlier in this post. Next week, therefore, we’ll talk about what science might look like in a world on the far side of progress.

Wednesday, July 24, 2013

The Quest for a Common Language

It was probably inevitable that my comment last week about the pseudoconservative crusade against Darwinian evolution in today’s America would attract more attention, and generate more heat, than anything else in the post. Some of my readers abroad expressed their surprise that the subject was even worth mentioning any more, and it’s true that most religious people elsewhere on the planet, even those who revere the same Bible our American creationists insist on treating as a geology textbook, got over the misunderstandings that drive the creationist crusade a long time ago.

While it’s primarily an American issue, though, I’d like to ask the indulgence of my readers elsewhere in the world, and also of American readers who habitually duck under the nearest couch whenever creationists and evolutionists start shouting past each other.  As a major hot-button issue in the tangled relationship between science and religion, the quarrel over evolution highlights the way that this relationship has gotten messed up, and thus will have to be sorted out as the civil religion of progress comes unraveled and its believers have to find some new basis for their lives.

Mind you, I also have a personal stake in it. It so happens that I’m a religious person who accepts the validity of Darwin’s theory of evolution. That’s not despite my religion—quite the contrary, it’s part of my religion—and so I’m going to break one of my own rules and talk a little bit about Druidry here.

The traditions of modern Druidry, the faith I follow, actually embraced biological evolution even before Darwin provided a convincing explanation for it. Here’s part of a ritual dialogue from the writings of Edward Williams (1747-1826), one of the major figures of the early Druid Revival:

“Q. Where art thou now, and how camest thou to where thou art?”

“A. I am in the little world, whither I came, having traversed the circle of Abred, and now I am a man at its termination and extreme limits.”

“Q. What wert thou before thou didst become a man in the circle of Abred?”

“A. I was in Annwn the least possible that was capable of life, and the nearest possible to absolute death, and I came in every form, and through every form capable of a body and life, to the state of man along the circle of Abred.”

Like most 18th-century rituals, this one goes on for a good long while, but the passage just cited is enough to give the flavor and some of the core ideas. Abred is the realm of incarnate existence, and includes “every form capable of a body and life,” from what used to be called “infusoria” (single-celled organisms, nowadays) all the way up the scale of biological complexity and diversity, through every kind of plant and animal, including you and me. What the dialogue is saying is that we all, every one of us, embody all these experiences in ourselves. When Taliesin in his great song of triumph said “I have been all things previously,” this is what we believe he was talking about.

There are at least two ways in which all this can be taken. It might be referring to the long biological process that gave rise to each of us, and left our bodies and minds full of traces of our kinship with all other living things. It might also be referring to the transmigration of souls, which was a teaching of the ancient Druids and is fairly common in the modern tradition as well: the belief that there is a center of consciousness that survives the death of one body to be reborn in another, and that each such center of consciousness, by the time it first inhabits a human body, has been through all these other forms, slowly developing the complexity that will make it capable of reflective thought and wisdom. You’ll find plenty of Druids on either side of this divide; what you won’t find—at least I’ve yet to encounter one—are Druids who insist that the existence of a soul is somehow contradicted by the evolution of the body.

Yet you can’t bring up the idea of evolution in today’s America without being besieged by claims that Darwinian evolution is inherently atheistic. Creationists insist on this notion just as loudly as atheists do, which is really rather odd, considering that it’s nonsense. By this I don’t simply mean that an eccentric minority faith such as Druidry manages to combine belief in evolution with belief in gods; I mean that the supposed incompatibility between evolution and the existence of one or more gods rests on the failure of religious people to take the first principles of their own faiths seriously.

Let’s cover some basics first. First of all, Darwin’s theory of natural selection may be a theory, but evolution is a fact. Living things change over time to adapt to changing environments; we’ve got a billion years of fossil evidence to show that, and the thing is happening right now—in the emergence of the Eastern coyote, the explosive radiation of cichlid fishes in East Africa, and many other examples. The theory attempts to explain why this observed reality happens. A great deal of creationist rhetoric garbles this distinction, and tries to insist that uncertainties in the explanation are proof that the thing being explained doesn’t exist, which is bad logic. The theory, furthermore, has proven itself solidly in practice—it does a solid job of explaining things for which competing theories have to resort to ad hoc handwaving—and it forms the beating heart of today’s life sciences, very much including ecology.

Second, the narratives of the Book of Genesis, if taken literally, fail to match known facts about the origins and history of the Earth and the living things on it. Creationists have argued that the narratives are true anyway, but their attempts to prove this convince only themselves.  It’s been shown beyond reasonable doubt, for example, that the Earth came into being long before 4004 BCE, that animals and plants didn’t evolve in the order given in the first chapter of Genesis, that no flood large enough to put an ark on Mount Ararat happened during the chronological window the Bible allows for the Noah story, and so on.  It was worth suggesting back in the day that the narratives of the Book of Genesis might be literally true, but that hypothesis failed to fit the data, and insisting that the facts must be wrong if they contradict a cherished theory is not a useful habit.

Third, the value of the Bible—or of any other scripture—does not depend on whether it makes a good geology textbook, any more than the value of a geology textbook depends on whether it addresses the salvation of the soul. I don’t know of any religion in which faith and practice center on notions of how the Earth came into existence and got its current stock of living things. Certainly the historic creeds of Christianity don’t even consider the issue worth mentioning. The belief that God created the world does not require believing any particular claim about how that happened; nor does it say in the Bible that the Bible has to be taken literally, or that it deals with questions of geology or paleontology at all.

What’s happened here, as I’ve suggested in previous posts, is that a great many devout Christians in America have been suckered into playing a mug’s game. They’ve put an immense amount of energy into something that does their religion no good, and plays straight into the hands of their opponents.

It’s a mug’s game, to begin with, because the central strategy that creationists have been using since well before Darwin’s time guarantees that they will always lose. It’s what historians of science call the “God of the gaps” strategy—the attempt to find breaks in the evolutionary process that scientists haven’t yet filled with an explanation, and then to insist that only God can fill them. Back in Darwin’s own time, the usual argument was that there weren’t any transitional forms between one species and another; in response to the resulting talk about “missing links,” paleontologists spent the next century and a half digging up transitional forms, so that nowadays there are plenty of evolutionary lineages—horses, whales, and human beings among them—where every species is an obvious transition between the one before it and the one after. As those gaps got filled in, critics of evolution retreated to another set, and another, and another; these days, they’ve retreated all the way to fine details of protein structure, and when that gap gets filled in, it’ll be on to the next defeat. The process is reliable enough that I’ve come to suspect that biologists keep an eye on the latest creationist claims when deciding what corner of evolutionary theory gets intensively researched next.

Still, there’s a much deeper sense in which it’s a mug’s game, and explaining that deeper sense is going to require attention to some of the basic presuppositions of religious thought. To keep things in suitably general terms, we’ll talk here about what philosophers call classical theism, defined as the belief that the universe was created out of nothing by a unique, perfect, eternal, omnipotent and omniscient being. (There’s more to classical theism than that—you can find the details in any good survey of philosophy of religion—but these are the details that matter for our present purposes.) I’ve argued elsewhere that classical theism isn’t the best explanation of human religious experience, but we’ll let that go for now; it corresponds closely to the beliefs of most American creationists, and it so happens that arguments that apply to classical theism here can be applied equally well to nearly all other theist beliefs.

Of the terms in the definition just given, the one that gets misused most often these days is “eternal.” That word doesn’t mean “lasting for a very long time,” as when we say that a bad movie lasts for an eternity; it doesn’t even mean “lasting for all of time.” What it means instead is “existing outside of time.” (Connoisseurs of exact diction will want to know that something that lasts for a very long time is diuturnal, and something that lasts for all of time is sempiternal.) Eternal beings, if such there be, would experience any two moments in time the way you and I experience two points on a tabletop—distinct but simultaneously present. It’s only beings who exist in time who have to encounter those two moments sequentially, or as we like to say, “one at a time.”

That’s why, for example, the endless arguments about whether divine providence contradicts human free will are barking up the wrong stump. Eternal beings wouldn’t have to foresee the future—they would simply see it, because to them, it’s not in the future. An omniscient eternal being can know exactly what you’ll do in 2025, not because you lack free will, but because there you are, doing it right out in plain sight, as well as being born, dying, and doing everything else in between. An eternal being could also see what you’re doing in 2025 and respond to it in 2013, or at any other point in time from the Big Bang to whatever final destiny might be waiting for the universe billions of years from now. All this used to be a commonplace of philosophy through the end of the Middle Ages, and it’s no compliment to modern thought that a concept every undergraduate knew inside and out in 1200 has been forgotten even by people who think they believe in eternal beings.

Now of course believers in classical theism and its equivalents don’t just believe in eternal beings in general.  They believe in one, unique, perfect, eternal, omnipotent and omniscient being who created the universe and everything in it out of nothing. Set aside for the moment whether you are or aren’t one of those believers, and think through the consequences of the belief.  If it’s true, then everything in the universe without exception is there either because that being deliberately put it there, or because he created beings with free will in the full knowledge that they would put it there. Everything that wasn’t done by one of those created beings, in turn, is a direct manifestation of the divine will.  Gravity and genetics,  photosynthesis and continental drift, the origin of life from complex carbon compounds and the long evolutionary journey since then: grant the presuppositions of classical theism, and these are, and can only be, how beings in time perceive the workings of the eternally creative will of God.

Thus it’s a waste of time to go scrambling around the machinery of the cosmos, looking for scratches left by a divine monkeywrench on the gears and shafts. That’s what the “God of the gaps” strategy does in practice; without ever quite noticing it, it accepts the purely mechanistic vision of the universe that’s promoted by atheists, and then tries to prove that God tinkers with the machinery from time to time. Accept the principles of classical theism and you’ve given up any imaginable excuse for doing that, since a perfect, omniscient, and omnipotent deity leaves no scratches and doesn’t need to tinker. It’s not even a matter of winding up the gears of the cosmos and letting them run from there, in the fashion of the “clockmaker God” of the 18th century Deists; to an eternal divine being, all of time is present simultaneously, every atom is doing exactly and only what it was put there to do, and what looks like machinery to the atheist can only be, to the believer in classical theism or its equivalents, the action of the divine will in eternity acting upon the world in time.

Such a universe, please note, doesn’t differ from the universe of modern science in any objectively testable way, and this is as it should be. The universe of matter and energy is what it is, and modern science is the best toolkit our species has yet discovered for figuring out how it works. The purpose of theology isn’t to bicker with science over questions that science is much better prepared to address, but to relate the material universe studied by science to questions of ultimate concern—of value, meaning and purpose—which science can’t and shouldn’t address and are instead the proper sphere of religion. To return to a point I tried to raise in one of last month’s posts, not everything that matters to human beings can be settled by an objective assessment of fact; there are times, many of them, that you have to decide on some other basis which of several different narratives you choose to trust.

Step beyond questions of fact, that is, and you’re in the territory of faith—a label that properly includes the atheist’s belief in a purely material cosmos just as much as it does the classical theist’s belief in a created cosmos made by an infinite and eternal god, the traditional polytheist’s belief in a living cosmos shaped by many divine powers, and so on, since none of these basic presuppositions about the cosmos can be proven or disproven.  How do people decide between these competing visions, then?  As noted in the post just mentioned, when that choice is made honestly, it’s made on the basis of values. Values are always individual, and always relative to a particular person in a particular context. They are not a function of the intellect, but of the heart and will—or to use an old and highly unfashionable word, of character. Different sets of presuppositions about the cosmos speak to different senses of what values matter; which is to say that they speak to different people, in different situations.

This, of course, is what a great many religions have been saying all along. In most of the religions of the west, and many of those from other parts of the world, faith is a central theme, and faith is not a matter of passing some kind of multiple choice test; it’s not a matter of the intellect at all; rather, it’s the commitment of the whole self to a way of seeing the cosmos that can be neither proved nor disproved rationally, but has to be accepted or rejected on its own terms. To accept any such vision of the nature of existence is to define one’s identity and relationship to the whole cosmos; to refuse to accept any such vision is also to define these things, in a different way; and in a certain sense, you don’t make that choice—you are that choice. Rephrase what I’ve just said in the language of salvation and grace, and you’ve got one of the core concepts of Christianity; phrase it in other terms, and you’ve got an important element of many other religions, Druidry among them.

It’s important not to ignore the sweeping differences among these different visions of the nature of existence—these different faiths, to use a far from meaningless idiom. Still, there’s a common theme shared by many of them, which is the insight that human beings are born and come to awareness in a cosmos with its own distinctive order, an order that we didn’t make or choose, and one that imposes firm limits on what we can and should do with our lives.  Different faiths understand that experience of universal order in radically different ways—call it dharma or the Tao, the will of God or the laws of Great Nature, or what have you—but the choice is the same in every case:  you can apprehend the order of the cosmos in love and awe, and accept your place in it, even when that conflicts with the cravings of your ego, or you can put your ego and its cravings at the center of your world and insist that the order of the cosmos doesn’t matter if it gets in the way of what you think you want.  It’s a very old choice: which will you have, the love of power or the power of love?

What makes this particularly important just now is that we’re all facing that choice today with unusual intensity, in relation to part of the order of the cosmos that not all religions have studied as carefully as they might. Yes, that’s the order of the biosphere, the fabric of natural laws and cycles that keep all of us alive. It’s a teaching of Druidry that this manifestation of the order of things is of the highest importance to humanity, and not just because human beings have messed with that order in remarkably brainless ways over the last three hundred years or so. Your individual actions toward the biosphere are an expression of the divide just sketched out. Do you recognize that the living Earth has its own order, that this order imposes certain hard constraints on what human beings can or should try to do, and do you embrace that order and accept those constraints in your own life for the greater good of the living Earth and all that lives upon her? Or do you shrug it off, or go through the motions of fashionable eco-piety, and hop into your SUV lifestyle and slam the pedal to the metal?

Science can’t answer that question, because science isn’t about values. (When people start claiming otherwise, what’s normally happened is that they’ve smuggled in a set of values from some religion or other—most commonly the civil religion of progress.) Science can tell us how fast we’re depleting the world’s finite oil supplies, and how quickly the signs of unwelcome ecological change are showing up around us; it can predict how soon this or that or the other resource is going to run short, and how rapidly the global climate will start to cost us in blood; it can even tell us what actions might help make the future less miserable than it will otherwise be, and which ones will add to the misery—but it can’t motivate people to choose the better of these, to decide to change their lives for the benefit of the living Earth rather than saying with a shrug, “I’m sure they’ll think of something” or “I’ll be dead before it happens” or “We’re all going to be extinct soon, so it doesn’t matter,” and walking away.

That’s why I’ve been talking at such length about the end of the civil religion of progress here, and why I’ll be going into more detail about the religious landscape of the deindustrial world as we proceed.  Religion is the dimension of human culture that deals most directly with values, and values are the ultimate source of all human motivation. It’s for this reason that I feel it’s crucial to find a common language that will bridge the gap between religions and the environmental sciences, to get science and religion both to settle down on their own sides of the border that should properly separate them—and to show that there’s a path beyond the misguided struggle between them. We’ll talk more about that path next week.

Wednesday, July 17, 2013

Held Hostage by Progress

The continual recycling of repeatedly failed predictions in the peak oil community, the theme of last week’s post here, is anything but unique these days. Open a copy of today’s newspaper (or the online equivalent), and it’s a safe bet that you’ll find at least one op-ed piece calling enthusiastically for the adoption of some political, military, or economic policy that’s failed every single time it’s been tried. It’s hard, in fact, to think of any broadly accepted policy in any dimension of public life today that can’t be accurately described in those terms.

Arnold Toynbee, whose sprawling study of historical cycles is among the constellations by which this blog navigates, pointed out quite some time ago that this process of self-inflicted failure is one of the standard ways that civilizations write their own obituaries. In his formulation, societies thrive so long as the creative minority that leads them can keep on coming up with new responses to the challenges the world throws their way—a process that normally requires the regular replacement of the society’s leadership from below, so that new leaders with new ideas can rise to the top.

When that process of replacement breaks down, and the few people who still rise into the ruling class from lower down the pyramid are selected for their willingness to go along with the status quo rather than for their commitment to new ideas that work, what was once a creative minority degenerates into a dominant minority, which rules by coercion because it can no longer inspire by example. You can tell that this has happened to your society when every crisis gets met with the same stereotyped set of responses, even when those responses clearly don’t work. That happens because dominant minorities identify themselves with certain policies, and with the social roles and narratives that go with those policies, and it takes much more than mere failure to shake those obsessions loose.

The resulting one-track thinking can go very far indeed.  The ideology of the Roman Empire, for example, copied the theological vision of Roman Pagan religion and projected it onto the world of politics. Roman Pagans saw the universe as a place of chaotic powers that had to be subjected to the benevolent rule of a cosmic paterfamilias by way of Jove’s thunderbolts. Roman social thought understood history in the same way, as a process by which an original state of chaos was bashed into obedience by Rome’s legions and subjected to the benevolent rule of the emperor.  For much of Rome’s imperial history, that model even made a certain amount of sense, as substantial parts of the Mediterranean world that had been repeatedly ravaged by wars beforehand experienced an age of relative peace and prosperity under Roman rule.

The problem was simply that this way of dealing with problems had little relevance to the pressures that gutted the Roman Empire in its final years, and trying to apply it anyway very quickly turned into a massive source of problems in its own right. The endless expansion of the Roman military required by increasingly drastic attempts to hammer the world into obedience imposed crippling tax burdens across Roman society, driving whole economic sectors into bankruptcy, and the government responded to this by passing laws requiring every man to practice the same profession as his father, whether he could afford to do so or not. Across the dying empire, whenever one extension of centralized imperial authority turned into a costly flop, some even more drastic act of centralization was the only thinkable option, until finally the whole system fell to pieces.

Modern industrial civilization, especially but not only in its American expression, is well on its way to this same destination by a slightly different road. Across the board, in politics, in economics, in energy policy, in any other field you care to name, the enthusiastic pursuit of repeatedly failed policies has become one of the leitmotifs of contemporary life.  I’d like to focus on one of those briefly, partly because it’s a classic example of the kind, partly because it shows with rare clarity the thinking that underlies the whole phenomenon. The example I have in mind is the ongoing quest for fusion power.

Scientists in the US and something like a dozen other countries have been busy at that quest since the 1950s. In the process, they’ve discovered something well worth knowing about fusion power:  if it can be done at all, on any scale smaller than a star—and the jury’s still out on that one—it can’t be done at a price that any nation on Earth can possibly afford.  The dream of limitless cheap fusion power that filled the pages of gosh-wow newspaper articles and science fiction stories in the 1950s and 1960s is thus as dead as a sack full of doornails. Has this stopped the continuing flow of billions of dollars of grant money into round after futile round of gargantuan fusion-power projects? Surely you jest.

Thus fusion researchers are stuck in the same self-defeating loop as those peak oil mavens who repeat the same failed prediction for the umpteenth time in a row, in the serene conviction that this time it’ll come true.  They’re approaching the situation in a way that prevents them from learning from their mistakes, no matter how many times the baseball bat of failure whacks them upside the head. In the case of the fusion scientists, what drives that loop is evident enough:  the civil religion of progress and, in particular, the historical mythology at the core of that religion.

Fusion researchers by and large see themselves as figures standing at the cutting edge of one important branch of technological progress. Given their training, their history, and the cultural pressures that surround them and define their work, it’s all but impossible for them to do anything else. That’s what has them boxed into a dead end with no easy exits, because the way progress is conceptualized in contemporary culture is fatally out of step with the facts on the ground.

Progress, as the word literally means, is continued forward motion in one direction. To believers in the civil religion of progress, that’s the shape of history:  whatever it is that matters—moral improvement, technological prowess, economic expansion, or what have you—marches invincibly onward over time, and any setbacks in the present will inevitably be overcome in the future, just as equivalent setbacks in the past were overcome by later generations. To join the marching legions of progress, according to the myth, is to enlist on the side of history’s winners and to help the inevitable victory come about just that little bit sooner, just as to oppose progress is to fight valiantly in a misguided cause and lose.

That’s the myth that guides contemporary industrial society, just as the myth of Jupiter clobbering the Titans and imposing the rule of law on a fractious cosmos was the myth that guided Roman society. In the broadest sense, whether any given change is “progressive” or “regressive” has to be settled by good old-fashioned politics, since changes don’t arrive with these labels branded on their backsides. Once a group of people have committed themselves to the claim that a change they’re trying to bring about is progressive, though, they’re trapped; no matter what happens, the only action the myth allows them to consider is that of slogging gamely onwards under the conviction that the obstacles will inevitably give way if they just keep at it. Thus the fusion research community is stuck perpetually pushing on a door marked PULL and wondering why it won’t open.

Of course fusion researchers also have deeply pragmatic reasons for their refusal to learn the lessons of repeated failure. Careers, reputations, and million-dollar grants depend on keeping up the pretense that further investment in fusion research has any chance of producing something more than a collection of vastly overpriced laboratory curiosities, and the field of physics is so specialized these days that shutting down fusion research programs would leave most fusion researchers with few marketable job skills relevant to anything this side of flipping burgers. Thus the charade goes on, funded by granting agencies just as committed to that particular corner of the myth of progress as the researchers whose salaries they pay, and continuing to swallow vast amounts of money, resources, and intellectual talent that might accomplish quite a bit if they could be applied to some less futile task.

The fusion research community, in effect, is being held hostage by the myth of progress. I’ve come to think that a great deal of contemporary science is caught in the same bind.  By and large, the research programs that get funding and prestige are those that carry forward existing agendas, and the law of diminishing returns—which applies to scientific research as it does to all other human activities—means that the longer an existing agenda has been pursued, the fewer useful discoveries are likely to be made by pursuing it further.  Yet the myth of progress has no place for the law of diminishing returns; in terms of the myth, every step along the forward march of progress must lead to another step, and that to another still.  This is why, to glance briefly at another example, efforts to craft a unified field theory out of Einsteinian relativity and quantum physics still get ample funding today, despite a century of total failure, while scores of research projects that might actually yield results go unfunded.

It does no good to science, in other words, to be imprisoned within the myth of endless linear progress. I’ve wondered more than once what modern science would look like if some philosophical equivalent of a SWAT team were to kick down the doors of the temple of Progress and liberate the hostages held inside. My best guess is that, freed from the myth, science would look like a tree, rather than a road leading into infinite distance:  rooted in mathematics and logic, supported by the strong trunk of the scientific method, extending branches, twigs and leaves in all directions, some of which would thrive while others would inevitably fail. Its leaves would spread out to catch as many of the rays of the light of truth as the finite nature of the tree allowed, but if one branch—the one called “fusion research,” let’s say—strayed into a lightless zone, the tree of science would direct its resources elsewhere and let that branch turn into a dry stick.

Eventually, the whole tree would reach its maximum growth, and after a lifespan of some centuries or millennia more, it would weaken, fail, and die, leaving its remains as a nurse log to nurture a new generation of intellectual saplings. That’s the way that Greek logic unfolded over time, and modern science started out its existence as one of the saplings nurtured on classical logic’s vast fallen trunk. More generally, that’s history’s way with human intellectual, cultural, and artistic systems of all kinds, and only the blinders imposed by the myth of progress make it impossible for most people in today’s industrial world to see science in the same terms.

That same logic is not restricted to science, either.  If some force of philosophers packing high-caliber syllogisms and fallacy-piercing ammunition ever does go charging through the door of the temple of Progress, quite a few people may be startled by the identity of some of the hostages who are led out blinking into light and freedom. It’s not just the sciences that are tied up and blindfolded there; nearly all the Western world’s religions share the same fate.

It’s important here to recognize that the myth of progress provides two potential roles for those who buy into its preconceptions. As noted earlier in this post, they can join the winning side and enlist in the marching legions of progress, or they can join the losing side, struggle against progress, and heroically fail. Both those roles are necessary for the enactment of the myth, and the raw power of popular culture can be measured in the ease with which nearly every religious tradition in the Western world, including those whose traditions are radically opposed to either one, have been pushed into one role or the other. The major division is of course that between liberal and conservative denominations; the former have by and large been reduced to the role of cheerleaders for progress, while the latter have by and large been assigned the corresponding role as cannon fodder for the side that’s destined to lose. 

The interplay between the two sides of the religious spectrum has been made rather more complex by the spectacularly self-defeating behavior of most North American denominations following the Second World War. In those years, a series of wildly popular books—John A.T. Robinson’s Honest to God, Pierre Berton’s The Comfortable Pew, and others of the same kind—argued in effect that, in order to be properly progressive, Christian churches ought to surrender their historic beliefs, practices, and commitments, and accept whatever diminished role they might be permitted by the mainstream of liberal secular society.  Some of these books, such as Robinson’s, were written by churchmen; others, such as Berton’s, were not, but all of them were eagerly received by liberal churches across the English-speaking world.

The case of The Comfortable Pew is particularly intriguing, as the Anglican Church of Canada hired a well-known Canadian atheist to write a book about what was wrong with their church and what they should do about it, and then gamely took his advice.  Other denominations were not quite so forthright in expressing a death wish, but the results were broadly similar.  Across the board, liberal churches reworked seminary programs to imitate secular liberal arts degrees, abandoned instruction in religious practice, took up the most radical forms of scriptural criticism, and redefined their clergy as amateur social service providers and progressive activists with a sideline in rites of passage. Since most people who go to churches or synagogues are there to practice their religion, not to provide their clergy with an admiring audience for political monologues and lessons in fashionable agnosticism, this shift was promptly followed by a steep plunge in the number of people who attended services in all the liberal denominations. Here again, the logic of progress made it all but impossible for church leaders to learn the lesson taught by failure, and most liberal denominations have remained in a death spiral ever since.

Meanwhile, conservative denominations were busy demonstrating that the opposite of one bad idea is usually another bad idea. Panicked by the global expansion of Communism—you rarely heard that latter word in American public discourse in the 1950s and 1960s without the adjective “godless” tacked on its front end—and the sweeping social changes triggered by postwar prosperity, the leaders of the conservative denominations moved as eagerly as their liberal counterparts to embrace the role that the myth of progress offered them. Along with William F. Buckley and the other architects of postwar American pseudoconservatism, they redefined themselves in opposition to the progressive agenda of their time, and never seemed to notice that they were so busy standing against this, that, and the other that most of them forgot to stand for anything at all.

The social pressure to conform to stereotypes and resist progress in every sense drove the weirdest dimension of late 20th century American Christian pseudoconservatism, the holy war against Darwinian evolution. Nowhere in the Bible does it say that the luminous poetry of the first chapter of Genesis must be treated as a geology textbook, nor is a literal reading of Genesis mandated by any of the historic creeds of the Christian churches. Nonetheless “Thou shalt not evolve” got turned into an ersatz Eleventh Commandment, and devout Christians exercised their ingenuity to the utmost to find ways to ignore the immense and steadily expanding body of evidence from geology, molecular biology, paleontology, and genetics that backed Darwin’s great synthesis. That and such sideshows as the effort to insist on the historical reality of the Noah’s ark story, despite conclusive geological evidence disproving it, crippled the efforts of conservative Christians to reach outside their existing audience.

The conservative denominations never quite managed to discard their historic beliefs, practices and commitments with the same enthusiasm shown by their liberal counterparts, preferring to maintain them in mummified form while political activism took center stage; still, the result was much the same.  Today, the spokespersons for conservative religious denominations in America speak and act as though reinstating the mores and politics that America had in the late 1940s has become the be-all and end-all of their religion. In response, a growing number of former parishioners of conservative denominations have withdrawn into the rapidly growing Home Church movement, in which families meet in living rooms with their neighbors to pray and study the Bible together. If that trend accelerates, as it appears to be doing, today’s conservative megachurches may soon turn into cavernous spaces visited once a week by a handful of retirees, just like the once-bustling liberal churches across the road.

The hijacking of religious institutions by the competing civil religion of progress has thus turned out to be a disaster on both sides of the political divide.  The distortions imposed on religion, once it was taken hostage by the myth of progress, correspond closely to the distortions imposed on science during its own imprisonment by the same myth. As the civil religion of progress begins to lose its grip on the collective imagination of our time, both science and religion will have to undergo a difficult process of reappraisal, in which many of the mistaken commitments of recent decades will need to be renegotiated or simply abandoned. Harrowing as that process may be, it may just have an unexpected benefit—a negotiated truce in the pointless struggle between science and religion, or even a creative rapprochement between these two distinct and complementary ways of relating to the universe. I’ll discuss these possibilities in next week’s post.

Wednesday, July 10, 2013

Asking the Hard Questions

There are nights, now and then, when I sit up late with a glass of bourbon and look back over the long strange trip that’s unfolded over the last thirty years or so.  When a substantial majority of Americans straight across the political landscape convinced themselves in the early 1980s that mouthing feel-good slogans and clinging to extravagant lifestyles over the short term made more sense than facing up to the hard choices that might have given our grandchildren a livable future, that choice kickstarted a flight into fantasy that continues to this day.

Over the seven years that I’ve been writing and posting essays here on The Archdruid Report, in turn, a tolerably good sample of the resulting fantasies has been dumped on my electronic doorstep by readers who were incensed by my lack of interest in playing along. There’s a certain amusement value in reviewing that sample, but a retrospective glance that way has another advantage: the common threads that unite the fantasies in question form a pattern of central importance to the theme that this sequence of posts is trying to explore.

Back in 2006, when I made my first posts suggesting that the future waiting for us on the far side of Hubbert’s peak was a long, ragged descent punctuated by crises, there were three common ways of dismissing that prediction. The first insisted that once the price of petroleum got near $100 a barrel, the sheer cost of fueling the industrial economy would trigger the economic crisis to end all economic crises and bring civilization crashing down at once. The second insisted that once that same price threshold was met, any number of exciting new renewable energy technologies would finally become profitable, resulting in a green-energy boom and a shiny future.  The third insisted that once that price threshold was met, the law of supply and demand would flood the market with petroleum, force prices back down, and allow the march of economic growth to continue merrily on its way.

A case could be made that those were reasonable hypotheses at the time. Still, the price of oil went soaring past $100 a barrel over the next few years, and none of those predictions panned out. We did have a whopping economic crisis in 2008, but emergency actions on the part of central banks kept the global economy from unraveling; a variety of renewable energy technologies got launched onto the market, but it took massive government subsidies to make any of them profitable, and all of them together provide only a very small fraction of our total energy use; and, of course, as prices rose, a certain amount of previously uneconomical oil did find its way to market, but production remains locked into a plateau and the price remains stubbornly high.

That is to say, the perfect storms weren’t, the game-changing events didn’t, and a great many prophets ended up taking a total loss on their predictive investments. It’s the aftermath, though, that matters. By and large, the people who were making these claims didn’t stop, look around, and say, “Hmm, clearly I got something wrong.  Is there another way of thinking about the implications of peak oil that makes more sense of the data?” Instead, they found other arguments to back the same claims, or simply kept repeating them at higher volume. For a while there, you could go visit certain peak oil bloggers every January and read the same predictions of imminent economic doom that appeared there the year before, and then go to another set of peak oil bloggers and read equally recycled predictions that this would be the breakthrough year for some green energy source or other, and in neither case was there any sign that any of them had learned a thing from all the times those same predictions had failed before.

Nor were they alone—far from it.  When I think about the number of arguments that have been posted here over the last seven years, in an effort to defend the claim that the Long Descent can’t possibly happen, it’s enough to make my head spin, even without benefit of bourbon. I’ve fielded patronizing lectures from believers in UFOs, New Age channelers, and the fake-Mayan 2012 prophecy, airily insisting that once the space brothers land, the New Age dawns, or what have you, we’ll all discover that ecological limits and the laws of thermodynamics are illusions created by lower states of consciousness. Likewise, I’ve received any number of feverish pronouncements that asteroids, solar flares, methane burps from the sea floor or, really, just about anything you can imagine short of titanic space walruses with photon flippers, are going to wipe out humanity in the next few years or decades and make the whole issue moot.

It’s been a wild ride, really.  I’ve been labeled dogmatic and intolerant for pointing out to proponents of zero point energy, abiotic oil, and similar exercises in wishful thinking that insisting that a completely unproven theory will inevitably save us may not be the most sensible strategy in a time of crisis. I’ve been dismissed as closed-minded by believers in artificial intelligence, fusion power, and an assortment of other technological will-o’-the-wisps for asking why promises of imminent success that have been repeated word for word every few years since the 1950s still ought to be considered credible today.  I’ve been accused of being a stooge for the powers of evil for questioning claims that Bush—er, make that Clinton—uh, well, let’s try Dubya—um, okay, then, Obama, is going to suspend the constitution, impose a totalitarian police state and start herding us all into camps, and let’s not even talk about the number of people who’ve gotten irate with me when I failed to be impressed by their insistence that the Rapture will happen before we run out of oil.

Not one of these claims is new, any more than the claims of imminent economic collapse, green-energy breakthroughs, or oceans of petroleum just waiting to be drilled. Most of them have been recycled over and over again, some for over a century—the New Age, for example, was originally slated to arrive in 1879, and in fact the most popular alternative spirituality magazine in 1890s Britain was titled The New Age—and the few that have only been through a few seasons’ worth of reruns follow familiar patterns and thus fail in equally familiar ways. If the point of making predictions in the first place has anything to do with anticipating the future we’re actually likely to get, these claims have flopped resoundingly, and yet they remain wildly popular.

Now of course there are good reasons why they should be popular. All the claims about the future I’ve listed are, in practical terms, incentives to inaction and evasions of responsibility.  If rising oil prices are guaranteed to bring on a rush of new green energy options, then we don’t have to change our lifestyles, because pretty soon we’ll be able to power them on sun or wind or what have you; if rising oil prices are guaranteed to bring on a rush of new petroleum sources, well, then we don’t need to change our lifestyles, either, and we can make an extra donation to the Sierra Club or something to assuage any lingering ecological guilt we might have. The same goes for any of the other new technologies that are supposedly going to provide us with, ahem, limitless energy sometime very soon—and you’ll notice that in every case, supplying us with all that energy is someone else’s job.

On the other hand, if the global economy is sure to go down in flames in the next few years, or runaway climate change is going to kill us all, or some future president is finally going to man up, impose a police state and march us off to death camps, it’s not our fault, and there’s nothing we can do that matters anyway, so we might as well just keep on living our comfortable lifestyles while they’re still here, right? It may be impolite to say this, but it needs to be said: any belief about the future that encourages people to sit on their backsides and do nothing but consume scarce resources, when there’s a huge amount that could be done to make the future a better place and a grave shortage of people doing it, is a luxury this age of the world can’t afford.

Still, I’d like to cycle back to the way that failed predictions are recycled, because it leads straight to the heart of an unrecognized dimension of the predicament of our time. Since the future can’t be known in advance, attempts to predict it have to rely on secondhand evidence.  One proven way to collect useful evidence concerning the validity of a prediction is to ask what happened in the past when somebody else made that same prediction.  Another way is to look for situations in the past that are comparable to the one the prediction discusses, in order to see what happened then. A prediction that fails either one of these tests usually needs to be put out to pasture; one that fails both—that has been made repeatedly in the past and failed every time, and that doesn’t account for the way that comparable situations have turned out—ought to be sent to the glue factory instead.

It’s in this light that the arguments used to defend repeatedly failed predictions can be understood. I’ve discussed these arguments at some length in recent posts:  the endlessly repeated claim that it’s different this time, the refusal to think about the implications of well-documented sources of negative feedback, the insistence that a prediction must be true if no one’s proved that it’s impossible, and so on. All of them are rhetorical gimmicks meant to stonewall the kind of assessment I’ve just outlined. Put another way, they’re attempts to shield repeatedly failed predictions from the normal and healthy consequences of failure.

Think about that for a bit. From the time that our distant ancestors ventured out onto the East African savannas and started to push the boundaries of their nervous systems in ways for which millions of years of treetop living did little to prepare them, their survival and success have been a function of their ability to come up with mental models of the world that more or less correspond to reality where it counts. If there were ever australopithecines that couldn’t do the sort of basic reality testing that allows food to be distinguished from inedible objects, and predators from harmless animals, they didn’t leave any descendants. Since then, as hominids and then humans developed more and more elaborate mental models of the world, the hard-won ability to test those models against the plain facts of experience with more and more precision has been central to our achievement.

In the modern West, we’ve inherited two of the great intellectual revolutions our species has managed—the creation of logic and formal mathematics in ancient Greece, and the creation of experimental science in early modern Europe—and both of those revolutions are all about reality testing. Logic is a system for making sure that mental models make sense on their own terms, and don’t stray into fallacy or contradiction; experimental science is a system for checking some mental models, those that deal with the quantifiable behavior of matter and energy, against the facts on the ground. Neither system is foolproof, but then neither is anything else human, and if both of them survive the decline and fall of our present civilization, there’s every reason to hope that future civilizations will come up with ways to fill in some of their blind spots, and add those to the slowly accumulating body of effective technique that provides one of the very few long-term dynamics to history.

It remains true, though, that all the many methods of reality testing we’ve evolved down through the millennia, from the most basic integration of sense inputs hardwired into the human brain right on up to the subtleties of propositional logic and the experimental method, share one central flaw. None of them will work if their messages are ignored—and that’s what’s going on right now, as a vast majority of people across the modern industrial world scramble to find reasons to cling to a range of popular but failed predictions about the future, and do their level best to ignore the evidence that a rather more unpopular set of predictions about the future is coming true around them. 

Look around, dear reader, and you’ll see a civilization in decline, struggling ineffectually with the ecological overshoot, the social disintegration, the institutional paralysis, and the accelerating decay of infrastructure that are part and parcel of the normal process by which civilizations die. This is what the decline and fall of a civilization looks like in its early-to-middle stages—and it’s also what I’ve been talking about, very often in so many words, since not long after this blog got under way seven years ago.  Back then, as I’ve already mentioned, it was reasonable to propose that something else might happen, that we’d get the fast crash or the green-energy breakthrough or all the new petroleum that the law of supply and demand was supposed to provide us, but none of those things happened. (Of course, neither did the mass landing of UFOs or any of the other more colorful fantasies, but then that was never really in question.)  It’s time to recognize that the repetition of emotionally appealing but failed predictions is not a helpful response to the crisis of our time, and in fact has done a great deal to back us into the corner we’re now in. What was Ronald Reagan’s airy twaddle about “morning in America,” after all, but another emotionally appealing failed prophecy of the kind I’ve just been discussing?

Thus I’d like to suggest that from now on, any claim about the future needs to be confronted up front by the two hard questions proposed above.  What happened at other times when people made the same prediction, or one that’s closely akin to it? What happened in other situations that are comparable to the one the prediction attempts to address?  Any prediction that claims to be about a future we might actually encounter should be able to face these two questions without resorting to the kind of rhetorical evasions noted above. Any prediction that has to hide behind those evasions, in turn, needs to be recognized as being irrelevant to any future we might actually encounter. My own predictions, by the way, stand or fall by the same rule, and I encourage my readers to ask those questions of each prediction I make, and answer them through their own research.

Yes, I’m aware that those two questions pack an explosive punch that makes dynamite look weak. It’s embarrassingly common in contemporary life for theories to be embraced because of their emotional appeal, and then defended with every rhetorical trick in the book against any inconvenient contact with unsympathetic facts. As suggested in last week’s post, that’s a common feature of civilizations toward the end of their rationalist period, when abstract reason gets pushed to the point of absurdity and then well beyond it.  Fantasies about the shape of the future aren’t uncommon at such times, but I don’t know of another civilization in all of recorded history that has put as much energy as ours into creating and defending abstract theories about the shape of the future. With any luck, the civilizations that come after ours will learn from our mistakes, and direct their last and most overblown abstractions in directions that will do less harm.

In the meantime, those of us who are interested in talking about the kinds of future we might actually encounter might find it useful to give up the standard modern habit of choosing a vision of the future because it’s emotionally appealing, demanding that the world fulfill whatever dream we happen to have, and filling our minds with defensive gimmicks to keep from hearing when the world says “no.” That requires a willingness to ask the questions I mentioned above, and to accept the answers, even when they aren’t what we’d like them to be.  More generally, it requires a willingness to approach the universe of our experience from a standpoint that’s as stunningly unfashionable these days as it is necessary—a standpoint of humility.

What would it mean if, instead of trying to impose an emotionally appealing narrative on the future, and shouting down any data that conflicts with it, we were to approach the universe of our experience with enough humility to listen to the narratives the universe itself offers us?  That’s basically what I’ve been trying to suggest here all along, after all. That’s the point to my repeated references to history, because history is our species’ accumulated body of knowledge of the way human affairs unfold over time, and approaching that body of knowledge with humility and a willingness to listen to the stories it tells is a proven way to catch hints about the shape of the future as it unfolds.

That’s also the point to my equally frequent references to ecology, because history is simply one subset of the behavior of living things over time—the subset that deals with human organisms—and also because ecological factors have played a huge and all too often unrecognized role in the rise and fall of human societies. Whether humans are smarter than yeast is less important than the fact, and of course it is a fact, that humans, yeast, and all other living things are subject to the same ecological laws and thus inevitably experience similar processes over time. Attentive listening to the stories that history tells, and the even richer body of stories that nature tells, is the one reliable way we’ve got to figure out what those processes are before they clobber us over the head.

That act of humility, finally, may be the best ticket out of the confusion that the collective imagination of our time has created around itself, the proliferation of abstractions divorced from reality that makes it so hard to see the future looming up ahead of us.  By turning our attention to what actually happens in the world around us, and asking the hard but necessary questions about our preferred notions concerning that world and its future, we might just be able to extract ourselves far enough from that confusion to begin to grapple with the challenges of our time. In the process, we’ll have to confront once again the issues with which this series of posts started out—the religious dimension of peak oil and the end of the industrial age. We’ll proceed with that discussion next week.

Wednesday, July 03, 2013

A Peculiar Absence of Bellybones

The fixation on imaginary “perfect storms” critiqued in last week’s post is only one expression of a habit of thinking that pervades contemporary American culture and, to a lesser extent, most other industrial societies.  I’ve referred to this habit in a couple of posts in this series already, but it deserves closer attention, if only to help make sense of the way that individuals, institutions, and whole societies so often get blindsided these days by utterly predictable events.

Like several of the other themes already explored in this sequence, the habit of thinking I have in mind was explored by Oswald Spengler in The Decline of the West. His way of discussing it, though, relies on turns of phrase that don’t translate well into English, and philosophical concepts that were familiar to every reader in 1918 Germany and completely opaque to most readers in 2013 America. To make sense of it, I’ll need to reframe the discussion by way of an excursion into deep time, so we can talk about the difference between what can happen and what does happen.

Unlike the Marcellus shale, the Barnett shale, and some of its other distant geological cousins, the Burgess shale doesn’t contain any appreciable amounts of oil or natural gas. What it does contain is a vast number of delicate fossils from the Cambrian period. It’s been argued that your ancestors and mine are there in the Burgess shale, in the form of a tiny, wriggling whatsit called Pikaia with a little strip of cartilage running down its back, the first very rough draft of what eventually turned into your backbone. There are plenty of other critters there that are unlike anything else since that time, and it’s perfectly plausible to imagine that they, rather than Pikaia, might have left descendants who evolved into the readers of this blog, but that’s not what happened.  Intelligent beings descended from five-eyed, single-tentacled Opabinia were possible; they could have happened, but they didn’t, and once that was settled, a whole world of possibilities went away forever. There was no rational reason for that exclusion; it just happened that way.

Let’s take a closer look at Pikaia, though.  Study it closely, and you can just about see the fish that its distant descendants will become. The strip of cartilage runs along the upper edge of its body, where fish and all other vertebrates have their backbones. It didn’t have to be there; if Pikaia happened to have cartilage along its lower edge, then fish and all the other vertebrates to come would have done just as well with a bellybone in place of a backbone, and you and I would have the knobbly bumps of vertebrae running up our abdomens and chests. Once Pikaia came out ahead in the struggle for survival, that possibility went wherever might-have-beens spend their time. There’s no logical reason why we don’t have bellybones; it simply turned out that way, and the consequences of that event still constrain us today.

Fast forward 200 million years or so, and a few of Pikaia’s putative descendants were learning to deal with the challenges and possibilities of muddy Devonian swamps by wriggling up out of the water, and gulping air into their swim bladders to give them a bit of extra oxygen.  It so happens that these fish had four large fins toward the underside of their bodies. Many other fish at the time had other fin patterns instead, and if the successful proto-lungfish had happened to come from a lineage with six fins underneath, then amphibians, reptiles, birds, and mammals would have six limbs today instead of four.  A six-limbed body plan is perfectly viable—ask any insect—but the vertebrates that ventured onto land had four, and once that happened, the question was settled. Nothing makes six-legged mammals impossible, but there aren’t any and never will be.  In an abstract sense, they can happen, but in the real world, they don’t, and it’s only history that explains why.

Today, another 400 million years later, most of the possible variables shaping life in this planet’s biosphere are very tightly constrained by an intricate network of ecological pressures rooted in the long history of the planet. Those constraints, among other things, drive convergent evolution—the process by which living things from completely different evolutionary lineages end up looking and behaving like each other.  A hundred million years ago, when the Earth had its normal hothouse climate and reptiles were the dominant vertebrates, the ichthyosaurs, a large and successful family of seagoing reptiles, evolved what we now think of as the basic dolphin look; when they went extinct and a cooling planet gave mammals the edge, seagoing mammals competing for the same ecological niche gave us today’s dolphins and porpoises. Their ancestors, by the way, looked like furry crocodiles, and for good reason; if you’re going to fill a crocodile’s niche, as the protocetaceans did, the pressures that the rest of the biosphere brings to bear on that niche pretty much require you to look and act like a crocodile.

The lesson to be drawn from these examples, and countless others, is that evolution isn’t free to do everything that, in some abstract sense, it could possibly do. Between the limits imposed by the genetics of the organism struggling to adapt, and the even stronger limits imposed by the pressures of the environment within which that struggle is taking place, there are only so many options available, and on a planet that’s had living things evolving on it for two billion years or so, most of those options will have already been tried out at least once. Even when something new emerges, as happens from time to time, that doesn’t mean that all bets are off; it simply means that familiar genetic and environmental constraints are going to apply in slightly different ways. That means that there are plenty of things that theoretically could happen that never will happen, because the constraints pressing on living things don’t have room for them.

That much is uncontroversial, at least among students of evolutionary ecology. Apply the same point of view to human history, though, and you can count on a firestorm of protest. Nonetheless, that’s exactly what I’ve been trying to do in this blog over the last seven years—to point out that historical change is subject to limits imposed by the historical trajectories of societies struggling to adapt, and the even stronger limits imposed by the pressures of the environment within which that struggle is taking place; worse still, to point out that societies have an equivalent of convergent evolution, which can be studied by putting different societies side by side and comparing their historical trajectories, and that this reveals otherwise untraceable constraints and thus allows meaningful predictions to be made about the future of our own civilization. Each of those proposals offends several of the most basic assumptions with which most people nowadays approach the future; put them all together—well, let’s just say that it’s no surprise that each weekly post here can count on fielding its quota of spit-slinging denunciations.

As regular readers of this blog know, a great many of these quarrels arrange themselves around the distinction I’ve just drawn. Whether we’re talking about 2012 or near-term human extinction or the latest claim that some piece or other of energy-related vaporware will solve the world’s increasingly intractable energy and resource shortages, my critics say, “It could happen!” and I reply, “But it won’t.” They proceed to come up with elaborate scenarios and arguments showing that, in fact, whatever it is could possibly happen, and get the imperturbable answer, “Yes, I know all that, but it still won’t happen.” Then it doesn’t happen, and the normal human irritation at being wrong gets thrown in the blender with a powerful sense of the unfairness of things—after all, that arrogant so-and-so of an archdruid didn’t offer a single solitary reason why whatever it was couldn’t possibly happen!—to make a cocktail that’s uncommonly hard to swallow.

There’s a reason, though, why these days the purveyors of repeatedly disproved predictions, from economists through fusion-power proponents to believers in the current end of the world du jour, so constantly use arguments about what can happen and so consistently ignore what does happen. It’s a historical reason, and it brings us a big step closer to the heart of this sequence of posts.

When Nietzsche proclaimed the death of God to a mostly uninterested 19th century, as I mentioned in an earlier post in this sequence, he was convinced that he was doing something utterly unprecedented—and he was wrong. If he’d been a little more careful about checking his claims against what he’d learned as a classical philologist, he would have remembered that the gods also died in ancient Greece in the fourth century BCE, and that the rationalist revolt against established religion in the Greek world followed the same general course as its equivalent in western Europe and the European diaspora two millennia or so later.  Put the materialist philosophers of the Greek Enlightenment side by side with the corresponding figures in its European equivalent, or line up the skeptical barbs aimed at Homer’s portrayal of the gods and goddesses of Greece with those shot at the Bible’s portrayal of the god of Christianity—by Nietzsche among others!—and the similarities are hard to miss.

What’s more, the same thing has happened elsewhere.  India went through its rationalist period beginning in the sixth century BCE, giving rise to full-blown atomist and materialist philosophies as well as an important school of logic, the Nyaya; it’s indicative of the tone of that period that the two great religious movements founded then, Buddhism and Jainism, in their earliest documented forms were wholly uninterested in gods. The equivalent period in ancient China began about a century later, with its own achievements in logic and natural science and its own dismissal of formal religion—sacrifices and rites are important for social reasons, Confucius argues, but to busy oneself excessively with them shows that one is ignorant and unreasonable.

It’s a standard element of the trajectory of literate civilizations through time. Every human society comes out of the shadows of its origins well equipped with a set of beliefs about what does happen. Since most human societies in their early phases are either wholly innocent of writing, or have lost most of a former tradition of literacy in the collapse of some previous civilization, those beliefs are normally passed down by way of the oldest and most thoroughly proven system of information storage and transfer our species has invented—that is to say, mythology:  a collection of vivid, colorful stories, usually in verse, that can be learned starting in early childhood and remembered letter-perfect into advanced old age.  Since the information storage capacity of myths is large but not limitless, each myth in a mature mythology is meant to be understood and interpreted on several levels, and learning how to unpack the stories is an essential part of education as an adult in these societies.

For human societies that rely on hunter-gatherer, nomadic pastoral, or village horticultural economies, mythology is amply suited to their information storage and transfer needs, and it’s rare for these to go looking for other options. Those societies that take to field agriculture and build urban centers, though, need detailed records, and that usually means writing or some close equivalent, such as the knotted cords of the old Incas.  Widespread public literacy seems to be the trigger that sets off the collapse of mythic thinking.  Where literacy remains the specialty of a priesthood jealous of its privileges, as among the ancient Maya or in Egypt before the New Kingdom, writing is simply a tool for recordkeeping and ceremonial proclamations, but once it gets into general circulation, rationalism of one kind or another follows in short order; an age of faith gives way to an age of reason.

That transformation has many dimensions, but one of the more important is a refocusing from what does happen to what can happen. At the time, that refocusing is a very good thing. Literacy in an age of faith tends to drive what might be called the rationalization of religion; myths get written down, scribes quarrel over which versions are authentic and what interpretations are valid, until what had been a fluid and flexible oral tradition stiffens into scripture, while folk religion—for the time being, we can define that messy category “religion” in purely functional terms as the collection of customary rites and beliefs that go with a particular set of mythic narratives—goes through a similar hardening into an organized religion with its own creed and commandments. That process of rigidification robs oral tradition of the flexibility and openness to reinterpretation that gives it much of its strength, and helps feed the building pressures that will eventually tear the traditional religion to shreds.

It’s the rise of rational philosophy that allows people in a literate civilization to get out from under the weight of a mummified version of what does happen and start exploring alternative ideas about what can happen.  That’s liberating, and it’s also a source of major practical advantages, as life in a maturing urban civilization rarely fits a set of mythic narratives assembled in an older and, usually, much simpler time.  It becomes possible to ask new questions and speculate about the answers, and to explore a giddy range of previously unexamined options.

That much of the story is hardwired into the historical vision of contemporary Western culture. It’s the next part of the story, though, that leads to our present predicament. The wild freedom of the early days of the rationalist rebellion never lasts for long.  Some of the new ideas that unfold from that rebellion turn out to be more popular and more enduring than others, and become the foundations on which later rationalists build their own ideas.  With the collapse of traditional religions, in turn, people commonly turn to civil religions as a source of values and meaning, and popular civil religions that embrace some form of rationalist thought, as most do, end up imbuing it with their own aura of secondhand holiness.  The end result of the rationalist rebellion is thus a society as heavily committed to the supposed truth of some set of secular dogmas as the religion it replaced was to its theological dogmas.

You know that this point has arrived when the rebellion starts running in reverse, and people who want to explore ideas outside the box start phrasing them, not in terms of rational philosophy, but in terms of some new or revived religion.  The rebellion of rationalism thus eventually gives rise to a rebellion against rationalism, and this latter rebellion packs a great deal more punch than its predecessor, because the rationalist pursuit of what can happen has a potent downside: it can’t make accurate predictions of the phenomena that matter most to human beings, because it fixates on what can happen rather than paying attention to what does happen.

It’s only in the fantasies of extreme rationalists, after all, that the human capacity for reason has no hard limits. The human brain did not evolve for the purpose of understanding the universe and everything in it; it evolved to handle the considerably less demanding tasks of finding food, finding mates, managing relations with fellow hominids, and driving off the occasional leopard. We’ve done some remarkable things with a brain adapted for those very simple purposes, to be sure, but the limits imposed by our ancestry are still very much in place.

Those limits show most clearly when we attempt to understand processes at work in the world. There are some processes in the world that are simple enough, and sufficiently insulated from confounding variables, that a mathematical model that can be understood by the human mind is a close enough fit to allow the outcome of the process to be predicted.  That’s what physics is about, and chemistry, and the other “hard” sciences: the construction of models that copy, more or less, the behavior of parts of the world that are simple enough for us to understand.  The fact that some processes in the world lend themselves to that kind of modeling is what gives rationalism its appeal.

The difficulty creeps in, though, when those same approaches are used to try to predict the behavior of phenomena that are too complex to conform to any such model. You can make such predictions with fairly good results if you pay attention to history, because history is the product of the full range of causes at work in comparable situations, and if A leads to B over and over again in a sufficiently broad range of contexts, it’s usually safe to assume that if A shows up again, B won’t be far behind. Ignore history, though, and you throw away your one useful source of relevant data; ignore history, come up with a mental model that says that A will be followed by Z, and insist that since this can happen it will happen, and you’re doomed.

Human behavior, individual as well as collective, is sufficiently complex that it falls into the category of things that rational models divorced from historical testing regularly fail to predict.  So do many other things that are part of everyday life, but it’s usually the failure of rational philosophies to provide a useful understanding of human behavior that drives the revolt against rationalism. Over and over again, rational philosophies have proclaimed the arrival of a better world defined by some abstract model of how human beings ought to behave, some notion or other of what can happen, and the actions people have taken to achieve that better world have resulted in misery and disaster; the appeal of rationalism is potent enough that it normally takes a few centuries of repeated failures for the point to be made, but once it sinks in, the age of reason is effectively over.

That doesn’t mean that the intellectual tools of rationalism go away—quite the contrary; the rise of what Spengler called the Second Religiosity involves sweeping transformations of religion and rational philosophy alike. More precisely, it demands the abandonment of extreme claims on both sides, and the recognition of what it is that each does better than the other. What comes after the age of reason isn’t a new age of faith—not right away, at least; that’s further down the road—but an age in which the claims of both contenders are illuminated by the lessons of history: an age of memory.

That’s why, a few centuries after the rationalists of Greece, India, and China had denounced or dismissed the gods, their heirs quietly accepted a truce with the new religious movements of their time, and a few centuries further on, the heirs of those heirs wove the values taught by the accepted religion into their own philosophical systems. That’s also why, over that same time, the major religions of those cultures quietly discarded claims that couldn’t stand up to reasonable criticism.  Where the Greeks of the Archaic period believed in the literal truth of the Greek myths, and their descendants of the time of Socrates and Plato were caught up in savage debates over whether the old myths had any value at all, the Greeks of a later age accepted Sallustius’ neat summary—“Myths are things that never happened, but always are”—and saw no conflict at all between pouring a libation to Zeus the Thunderer and taking in a lecture on physics in which thunderbolts were explained by wholly physical causes.

That state of mind is very far from the way that most people in the contemporary industrial world, whether or not they consider themselves to be religious, approach religious beliefs, narratives, and practices.  The absurd lengths to which today’s Christian fundamentalists take their insistence on the historical reality of the Noah’s ark story, for example, in the face of conclusive geological evidence that nothing of the sort happened in the time frame the Biblical narrative provides for it, are equalled if not exceeded by the lengths to which their equal and opposite numbers in the atheist camp take their insistence that all religions everywhere can be reduced to these terms.

Still, I’d like to suggest that this rapprochement is the most likely shape for the religious future of a declining industrial world, and that it also offers the best hope we’ve got for getting at least some of the achievements of the last three centuries or so through the difficult years ahead. How that process might play out is a complex matter; we’ll begin discussing it next week.