Wednesday, May 25, 2016

Against Cultural Senility

For the connoisseur of sociopolitical absurdity, the last few weeks’ worth of news cycles very nearly defines the phrase “target-rich environment.” I note, for example, that arch-neoconservative Robert Kagan—the founder of the Project for a New American Century and principal architect of this nation’s idiotically bloodthirsty Middle East policies, a man who never met a body bag he didn’t like—has jumped party lines to endorse Hillary Clinton’s presidential ambitions.

Under other conditions I’d wonder if Kagan had decided to sandbag Clinton’s hopes, using a lethal dose of deadpan satire to point out that her policy stances are indistinguishable from those of George W. Bush: you know, the guy that so many Democrats denounced as evil incarnate just eight short years ago. Unfortunately, nothing so clever seems to be in the works. Kagan seems to be quite sincere in his adulation for Clinton. What’s more, his wife Victoria Nuland, a Hillary Clinton protégée in the State Department and a major player in the Obama administration’s pursuit of Cold War brinksmanship against Russia, is now being rumored as Clinton’s most likely pick for Secretary of State.

For unintended satire, that one’s hard to beat. Still, I’d say it has been outdone by another recent story, which noted that the students at Brown University, one of this nation’s Ivy League universities, are upset. Turns out they’re so busy protesting for social justice these days that they don’t have enough time to keep up with their classwork, and yet their professors are still expecting papers to be turned in on time—a demand that strikes the students as grossly unfair. A savage parody from some right-wing website? Nope; the story appeared in the Brown University student paper earlier this month.

To be fair to the students, they’re not the only ones who have redefined the purpose of a university education in a way that, for the sake of politeness, we’ll call “quirky.” Radical faculty members, who encourage this reenactment of their vanished youth as a political equivalent of Münchausen syndrome by proxy, are doing much the same thing. Then, of course, you’ve got corporations who think that universities are places where prospective employees go to pay for their own job training, university bureaucrats who burble marketing-firm sewage about offering students the “university experience,” and so on through an entire galaxy of self-regarding and self-important cant. The one thing that finds no place among all these competing redefinitions is, predictably enough, learning.

I’ve mentioned before on this blog the need to devise new opportunities for learning, and in particular a new structure for adult education that isn’t subservient to the increasingly blatant political and financial interests of the academic industry. More broadly, the concept of learning has been a core theme of this blog since it began—partly because modern industrial society’s stunning inability to learn the lessons of repeated failure looms so large in public life today, partly because learning ways to make sense of the world and practical skills for dealing with the converging crises of our time ranks high on the to-do list for anyone who takes the future seriously. I think, therefore, that it’s time to move that discussion to center stage, and talk about learning and education in the context of the Long Descent.

We could start that discussion in many different places, but the whinefest under way at Brown just now makes as good a springboard as any. We can, I think, presume that universities don’t exist for the sake of giving privileged youth a place to play at changing the world, before they settle down to a lifetime of propping up the status quo in corporate and government careers. Nor do they exist for any of the other dubious purposes mentioned above. What, then, is a university for?

That’s best approached by looking at the other two legs of the institutional tripod that once supported American education. In the long-gone days when the United States still had an educational system that worked, that system sorted itself out into three broad categories of schools: public schools, trade schools, and universities. Public schools existed for the purpose of providing the basic intellectual skills that would allow young people to participate in society as productive citizens. Trade schools existed for the purpose of teaching the technical skills that would allow graduates to find steady work in the skilled trades. In the trade school category, we can also include medical schools and the few law schools that existed then—most lawyers got their legal training through apprenticeship until well into the twentieth century—and other institutions meant to turn out trained professionals, such as divinity schools.

Then there were the universities. The grand old American habit of highfalutin’ obfuscation that used to double the length of commencement addresses and Congressional speeches alike makes it a bit difficult to tease out, from the rhetoric of the day, the intended purpose of a university education, but attending to what was actually taught there in the late nineteenth and very early twentieth centuries makes the point tolerably clear: universities existed to launch students into a full-on, face-first encounter with that foreign country we call the past. That’s why the university curriculum back then focused on such subjects as history, classics, literature, and the like—and why the word “literature” in an academic setting generally excluded anything written within living memory.

This was of course exactly the thing the educational revolutions of our time targeted and, for the most part, destroyed. Under the banner of “relevance,” reformers across the American academic scene in the 1960s and 1970s pushed for the replacement of the traditional curriculum with something more up-to-date, modern, progressive—in a word, fashionable. Alongside the great crusade for relevance came the proliferation of new departments and degree programs. Thereafter, what was left of the old curriculum was assailed by proponents of various flavors of postmodernism, and after that came what’s known in the academic biz as “critical theory”—that is, ideologies of condemnation and exclusion that focus on race, gender, and other markers of privilege and disprivilege in society.

All of these changes, among their other impacts, had the effect of distancing students from the collision with the past that was central to the older approach to university education. The crusade for relevance and the mass production of new departments and degree programs did this in a straightforward fashion, by redirecting attention from the past to the present—it’s not accidental that the great majority of the new departments and degree programs focused on one or another aspect of modernity, or that by “relevant” the educational radicals of the Sixties generally meant “written within our lifetimes.” The other two movements just named did the same thing, though, albeit in a somewhat subtler way.

The common theme shared by the various movements lumped together as “postmodernism” was the imposition of a thick layer of interpretive theory between the student and the text. The postmodernists liked to claim that their apparatus of theory enabled them to leap nimbly into and out of texts from every place and time while understanding them all, but that was precisely what the theory didn’t do. Instead, if you’ll excuse the metaphor, it functioned as a sort of intellectual condom, meant to prevent students from conceiving any unexpected ideas as a result of their intercourse with the past. Those of my readers who encountered the sort of scholarly publication that resulted will recall any number of “conversations with the text” written along these lines, which sedulously kept the text from getting a word in edgewise, while quoting Derrida et al. at dreary length in every second or third paragraph.

If postmodernism claimed to engage in a conversation with the text, though, critical theory—still the rage in many American universities these days—subjects it to a fair equivalent of the Spanish Inquisition: one by one, texts are hauled before a tribunal, tortured with an assortment of critical instruments until they confess, suffer condemnation for their purported errors, and are then dragged off by a yelling mob to be burnt at the stake. The erasure of the past here has two aspects. On the one hand, critical-theory proponents are fond of insisting that students should never be required to read any text that has been so condemned; on the other, one very effective way of learning nothing from the past is to be too busy preening oneself over one’s moral superiority to one’s ancestors to learn from anything they might have had to say.

Popular though these moves were in the academic industry, I’d like to suggest that they were disastrously misguided at best, and have played a large role in helping to generate a widespread and seriously destructive condition in our collective life. I’ll give a suggestive name to that condition a little later on. First, I want to talk about why the suppression of the past is as problematic as it is.

Johann Wolfgang von Goethe liked to point out that a person who knows only one language doesn’t actually know any languages at all. He was quite right, too. Only when you learn a second language do you begin to discover how many things you thought were true about the universe are merely artifacts of the grammatical and semantic structure of your first language. Where that language is vague, so are your thoughts; where that language runs several distinct meanings together in a single word, so do you; where that language imposes arbitrary structures on the complexities of experience—why, unless you have some experience with another way of assembling the world into linguistic patterns, it’s a safe bet that you’ll do the same thing even when you’re not talking or even thinking in words.

Here’s an example. People who only speak English tend to think in terms of linear cause-and-effect relationships. Listen to Americans try to understand anything, and you’ll see that habit in full flower. If something happens, they want to know what one thing caused it, and what one thing will result from it. In the real world, it almost never happens that just one cause sets just one process in motion and has just one effect; in the real world, wildly complex, tangled chains of interaction go into even the simplest event, and spin out from there to infinity—but that’s not the way Americans like to think.

Why? Because the normal sentence structure in English has a subject—someone who causes an action—followed by a verb—the action of the subject—and then usually by an object—the thing on which the action has an effect. That’s our usual grammar, and so that’s the usual pattern of our thoughts.

There are, as it happens, plenty of languages that don’t have the same structure. In modern Welsh, for example, most sentences begin with a form of the verb “to be.” Where an English speaker would say “The children are playing in the yard,” a Welsh speaker would say “Mae’r plant yn chwarae yn yr ardd,” literally “It is the children at play in the yard.” That is, most English sentences imply a cause-and-effect relationship (the cause “children” has the effect “playing”), while most Welsh sentences imply a complex condition of being (the current state of things includes the phenomena “children” in the condition of “playing”). If you know both languages well enough to think in both, you won’t default to either option—and you won’t necessarily be stuck with just those two options, either, because once you get used to switching from one to another, you can easily conceive of other alternatives.

What’s true of language, I’d like to suggest, is also true—and may in fact be even more true—of the ideas and preconceptions of an era: if you only know one, you don’t actually know one at all. Just as the person who knows only one language remains trapped in the grammatical and semantic habits of that language, the person who has only encountered the thought of one era remains trapped in the presuppositions, habitual notions, and unexamined assumptions of that era. 

I’ve used the word “trapped,” but that choice of phrasing misstates one very important aspect of the phenomenon: the condition that results is very comfortable. Most of the big questions have easy answers, and those that are still open—well, everyone’s secure in the knowledge that once those are solved, by some linear extrapolation of the current methods of inquiry, the answers will by definition fit easily into the framework that’s already been established for them. Debates about what’s right and wrong, what’s true and false, what’s sane and stark staring crazy all take place within the limits of a universally accepted structure of ideas that are all the more powerful because nobody discusses them and most people don’t even consciously notice that they’re there.

The supposed openness to innovation and diversity that’s said to characterize modern industrial society does precisely nothing to counteract that effect. The vagaries of intellectual and cultural trends, and the antics of dissident subcultures in art, religion, and politics, all take place within the narrow limits of a conventional wisdom which, again, is not so much believed as tacitly assumed. Watch any avant-garde movement closely, and it’s not hard to notice that its idea of rebelling against the status quo amounts to taking the conventional wisdom just a little further than anyone else has gotten around to going recently—and when that loses its charm, you can bet that in a generation or so, some new movement will come along and do it all over again, its members convinced that they’re being revolutionary in doing something their parents, grandparents, and great-grandparents did in their day.

Thus, for example, public masturbation as a form of performance art has been invented at intervals of thirty to forty years since the late nineteenth century. It’s happened so far, that I know of, in the 1890s, the 1920s, the 1950s, and the 1980s, and we can probably expect a new round any time now. Each of the self-proclaimed cutting-edge artistic movements that went in for this not especially interesting habit framed it as a revolutionary act, using whatever kind of grandiose rhetoric was popular just then; and then the crowds got bored, and three decades later the next generation was at it again.

The history of the flying car, which has been invented at regular intervals since the 1920s, follows exactly the same rhythm, and displays exactly the same total subservience to the conventional wisdom of modern industrial culture. (A case could probably be made that there’s no shortage of masturbatory features in our collective obsession with flying cars, but that’s a discussion for another time.) For the purposes of our present discussion, the flying car is a particularly useful example, because it points to the chief problem with unthinking subservience to the predigested thought of an era: people in that condition lose the ability to learn from their mistakes.

There’s a galaxy of good reasons why we don’t have flying cars, after all. One of the most important is that the engineering demands of aircraft design and automobile design are almost exactly opposed to one another—the lighter an airplane is, the better it flies, while a car needs a fair amount of weight to have good traction; aircraft engines need to be optimized for speed, while car engines need to be optimized for torque, and so on through a whole series of contrasts. A flying car is thus by definition going to be mediocre both as a car and as a plane, and due to the added complexities needed to switch from one mode of travel to the other, it’s going to cost so much that for the same price you can get a good car and a good plane, with enough left over to pay hangar rental for quite some time.

None of this is particularly hard to figure out. What’s more, it’s been demonstrated over and over again by the flying cars that have been invented, patented, and tested repeatedly down through the years. That being the case, why do audiences at TED Talks still clap frantically when someone tells them that they can expect flying cars on the market any day now? Because the presuppositions of modern industrial society deny the existence of limits and inescapable tradeoffs, and when the lessons of failure point up the reality of these things, those lessons remain unlearnt.

I wish that all the consequences of subservience to unnoticed presuppositions were that harmless. Take any of the rising spiral of crises that are building up around modern industrial society these days; in every single case, the reason that the obviously necessary steps aren’t being taken is that the conventional wisdom of our time forbids thinking about those steps, and the reason that the lessons of repeated failure aren’t being learned is that the conventional wisdom of our time denies that any such failures can happen. We live in an era of cultural senility, in which the vast majority of people stare blankly at an unwelcome future and keep on doing all the things that are bringing that future on.

The erasure of the past from the curriculum of American universities is far from the only factor that’s brought about that catastrophic reality, but I suspect its role in that process has been significant. The era of cultural senility came in when the generation of the Sixties, the generation that insisted on excising the past from its university education, hit its thirties and rose into positions of influence, and it’s gotten steadily worse since that time. The inability of our society to learn from its mistakes or question its preconceptions has thus become a massive political fact—and a massive political liability.

None of the consequences of that inability are particularly original. It so happens, for example, that a little less than 2500 years ago, influential voices in another rich and powerful democratic society embraced the same policies that Robert Kagan and his fellow neoconservatives have been promoting in our time. The backers of this Project for a New Athenian Century believed that these policies would confirm Athens’ hegemony over the ancient Greek world; what happened instead was a nightmare of imperial overstretch, war, and economic and political collapse, from which Athens, and Greece as a whole, never recovered. You can read all about it in the writings of Thucydides, one of the supposedly irrelevant authors that most educated people read before the 1960s and next to nobody reads today.

That’s an obvious benefit of reading Thucydides. Less obvious and even more important is the subtler insight that you can get from Thucydides, or for that matter from any long-dead author. Thucydides was not a modern politically correct American liberal, or for that matter a modern patriotically correct American neoconservative. His basic assumptions about the world differ drastically from those of any modern reader, and those assumptions will jar, over and over again, against the very different notions that form the automatic substructure of thought in the modern mind.

If Thucydides doesn’t offend you, in fact, you’re probably not paying attention—but that’s precisely the point. If you exercise the very modest amount of intellectual courage that’s needed to get past being offended, and try to understand why the world looked the way it did when seen through Thucydides’ eyes, your knowledge of your own preconceptions and your ability to make sense of the world when it doesn’t happen to fit those preconceptions will both expand. Both those gains are well worth having as our society hurtles down its current trajectory toward an unwelcome future.

**********
Homework Assignment #1

Since this series of posts is on education, yes, there’s going to be homework. Your assignment for the next two weeks consists of choosing a book-length work of fiction that (a) you haven’t previously read, and (b) was written before 1900, and reading it. It can be anything that fits these capacious limits: Little Women, The Epic of Gilgamesh, The Scarlet Letter, The Tale of Genji, or something else entirely—take your pick. Whatever book you choose, read it cover to cover, and pay attention to the places where the author’s assumptions about the world differ from yours. Don’t pass judgment on the differences; just notice them, and think about what it would have been like to see the world the way the author did.

Wednesday, May 18, 2016

Retrotopia: A Distant Scent of Blood

This is the sixteenth installment of an exploration of some of the possible futures discussed on this blog, using the toolkit of narrative fiction. Our narrator, having recovered from a bout of the flu, goes for a walk, meets someone he’s encountered before, and begins to understand why the Lakeland Republic took the path it did...

***********
The next morning I felt pretty good, all things considered, and got up not too much later than usual. It was bright and clear, as nice an autumn day as you could ask for. I knew I had two days to make up and a lot of discussions and negotiations with the Lakeland Republic government still waited, but I’d been stuck in my room for two days and wanted to stretch my legs a bit before I headed back into another conference room at the Capitol. I compromised by calling Melanie Berger and arranging to meet with her and some other people from Meeker’s staff after lunch. That done, once I’d finished my morning routine, I headed down the stairs and out onto the street.

I didn’t have any particular destination in mind, just fresh air and a bit of exercise, and two or three random turns brought me within sight of the Capitol. That sent half a dozen trains of thought scurrying off in a bunch of directions, and one of them reminded me that I hadn’t seen a scrap of news for better than two days. Another couple of blocks and I got to Kaufer’s News, where the same scruffy-looking woman was sitting on the same wooden stool, surrounded by the same snowstorm of newspapers and magazines. I bought that day’s Toledo Blade, and since it was still way too early to put anything into my stomach, I crossed the street, found a park bench in front of the Capitol that had sunlight all over it, sat down and started reading.

There was plenty of news. The president of Texas had just denounced the Confederacy for drilling for natural gas too close to the Texas border, and the Confederate government had issued the kind of curt response that might mean nothing and might mean trouble.  The latest word from the Antarctic melting season was worse than before; Wilkes Land had chucked up a huge jokulhlaup—yeah, I had to look the word up the first time I saw it, too; it means a flood of meltwater from underneath a glacier—that tore loose maybe two thousand square miles of ice and had half the southern Indian Ocean full of bergs.

There was another report out on the lithium crisis, from another bunch of experts who pointed out yet again that the world was going to run out of lithium for batteries in another half dozen years and all the alternatives were much more expensive; I knew better than to think that the report would get any more action than the last half dozen had.  Back home, meanwhile, the leaders of the Dem-Reps had a laundry list of demands for the new administration, most of which involved Montrose ditching her platform and adopting theirs instead.  There’d been no response from the Montrose transition team, which was probably just as well. I knew what Ellen would say to that and it wasn’t fit to print.

Still, the thing I read first was an article on the satellite situation. There was a squib on the front page about that, and a big article with illustrations on pages four and five. It was as bad as I’d feared. The weather satellite that got hit on Friday had thrown big chunks of itself all over, and two more satellites had already been hit. The chain reaction was under way, and in a year or so putting a satellite into the midrange orbits would be a waste of money—a few days, a week at most, and some chunk of scrap metal would come whipping out of nowhere at twenty thousand miles an hour and turn your umpty-billion-yuan investment into a cloud of debris ready to share the love with anything else in orbit.

That reality was already hitting stock markets around the world—telecoms were plunging, and so was every other economic sector that depended too much on satellites. Most of the Chinese manufacturing sector was freaking out, too, because a lot of their exports go by way of the Indian Ocean, and satellite data’s the only thing that keeps container ships out of the way of icebergs. Economists were trying to rough out the probable hit to global GDP, and though estimates were all over the map, none of them was pretty.  The short version was that everybody was running around screaming.

Everybody outside the Lakeland Republic, that is. The satellite crisis was an academic concern there. I mean that literally; the paper quoted a professor of astronomy from Toledo University, a Dr. Marjorie Vanich, about the work she and her grad students were doing on the mathematics of orbital collisions, and that was the only consequence the whole mess was having inside the Lakeland borders. I shook my head. Progress was going to win out eventually, I told myself, but the Republic’s retro policies certainly seemed to deflect a good many hassles in the short term.

I finished the first section, set down the paper. Sitting there in the sunlight of a clear autumn day, with a horsedrawn cab going clip-clop on the street in front of me, schoolchildren piling out of a streetcar and heading toward the Capitol for a field trip, pedestrians ducking into Kaufer’s News or the little hole-in-the-wall café half a block from it, and the green-and-blue Lakeland Republic flag flapping leisurely above the whole scene, all the crises and commotions in the newspaper I’d just read might as well have been on the far side of the Moon. For the first time I found myself wishing that the Lakeland Republic could find some way to survive over the long term after all.  The thought that there could be someplace on the planet where all those crises just didn’t matter much was really rather comforting.

I got up, stuck the paper into one of the big patch pockets of my trench coat, and started walking, going nowhere in particular. A clock on the corner of a nearby building told me I still had better than an hour to kill before lunch. I looked around, and decided to walk all the way around the Capitol, checking out the big green park that surrounded it and the businesses and government offices nearby. I thought of the Legislative Building back home in Philadelphia, with its walls of glass and metal and its perpetually leaky roof; I thought of the Presidential Mansion twelve blocks away, another ultramodern eyesore, where one set of movers hauling Bill Barfield’s stuff out would be crossing paths just then with another set of movers hauling Ellen Montrose’s stuff in; I thought of the huge bleak office blocks sprawling west and south from there, where people I knew were busy trying to figure out how to cope with a rising tide of challenges that didn’t look as though it was ever going to ebb.

I got to one end of the park, turned the corner. A little in from the far corner was what looked like a monument of some sort, a big slab of dark red stone up on end, with something written on it. Shrubs formed a rough ring around it, and a couple of trees looked on from nearby. I wondered what it was commemorating, started walking that way. When I got closer, I noticed that there was a ring of park benches inside the circle of shrubs, and one person sitting on one of the benches; it wasn’t until I was weaving through the gap between two shrubs that I realized it was the same Senator Mary Chenkin I’d met at the Atheist Assembly the previous Sunday. By the time I’d noticed that, she’d spotted me and got to her feet, and so I went over and did and said the polite thing, and we got to talking.

The writing on the monument didn’t enlighten me much. It had a date on it—29 APRIL 2024—and nothing else. I’d just about decided to ask Chenkin about it when she said, “I bet they didn’t brief you about this little memento of ours—and they probably should have, if you’re going to make any kind of sense of what we’ve done here in the Lakeland Republic. Do you have a few minutes?”

“Most of an hour,” I said. “If you’ve got the time—”

“I should be at a committee meeting later on, but there should be plenty of time.” She waved me to the bench and then perched on the front of it, facing me.

“You probably know about DM-386 corn, Mr. Carr,” she said. “The stuff that had genes from poisonous starfish spliced into it.”

“Yeah.” Ugly memories stirred.  “I would have had a kid brother if it wasn’t for that.”

“You and a lot of others.” She shook her head.  “Gemotek, the corporation that made it, used to have its regional headquarters right here.” She gestured across the park toward the Capitol. “A big silver glass and steel skyscraper complex, with a plaza facing this way.  It got torn down right after the war, the steel went to make rails for the Toledo streetcar system, and the site—well, you’ll understand a little further on why we chose to put our Capitol there.

“But it was 2020, as I recall, when Gemotek scientists held a press conference right here to announce that DM-386 was going to save the world from hunger.” Another shake of her head dismissed the words. “Did they plant much of it up where your family lived?”

“Not to speak of.  We were in what used to be upstate New York, and corn wasn’t a big crop.”

“Well, there you are. Here, we’re the buckle on the corn belt:  the old states of Ohio, Indiana, Illinois, and across into Iowa and Nebraska. Gemotek marketed DM-386 heavily via exclusive contracts with local seed stores, and it was literally everywhere. They insisted it was safe, the government insisted it was safe, the experts said the same thing—but nobody bothered to test it on pregnant women.”

“I remember,” I said.

“And down here, it wasn’t just in the food supply.  The pollen had the toxin in it, and that was in the air every spring.  After the first year’s crop, what’s more, it got into the water table in a lot of places. So there were some counties where the live birth rate dropped by half over a two year period.”

She leaned toward me. “And here’s the thing. Gemotek kept insisting that it couldn’t possibly be their corn, and the government backed them. They brought in one highly paid expert after another to tell us that some new virus or other was causing the epidemic of stillbirths. It all sounded plausible, until you found out that the only countries in the world that had this supposed virus were countries that allowed DM-386 corn to cross their borders. The media wouldn’t mention that, and if you said something about it on the old internet, or any other public venue, Gemotek would slap you with a libel suit. They’d win, too—they had all the expert opinion on their side that money could buy. All the farmers and the other people of the corn belt had on their side was unbiased epidemiology and too many dead infants.

“So by the fall and winter of 2023, the entire Midwest was a powderkeg. A lot of farmers stopped planting DM-386, even though Gemotek had a clause in the sales agreement that let them sue you for breach of contract if you did that. Seed stores that stocked it got burnt to the ground, and Gemotek sales staff who went out into farm country didn’t always come back. There were federal troops here by then—not just Homeland Security, also regular Army with tanks and helicopters they’d brought up from the South after the trouble in Knoxville and Chattanooga the year before—and you had armed bands of young people and military vets springing up all over the countryside. It was pretty bad.

“By April, it was pretty clear that next to nobody in the region was planting Gemotek seeds—not just DM-386, anything from that company. Farmers were letting their farms go fallow if they couldn’t get seed they thought was safe. That’s when Michael Yates, who was the CEO of Gemotek, said he was going to come to Toledo and talk some sense into the idiots who thought there was something wrong with his product. By all accounts, yes, that’s what he said.”

All of a sudden I remembered how the story ended, but didn’t say anything.

“So he came here—right where we’re sitting now.  The company made a big fuss in the media, put up a platform out in front of the building, put half a dozen security guards around it, and thought that would do the job. Yates was a celebrity CEO—” Unexpectedly, she laughed. “That phrase sounds so strange nowadays. Still, there were a lot of them before the Second Civil War: flashy, outspoken, hungry for publicity. He was like that. He flew in, and came out here, and started mouthing the same canned talking points Gemotek flacks had been rehashing since the first wave of stillbirths hit the media.

“I think he even believed them.” She shrugged. “He wasn’t an epidemiologist or even a geneticist, just a glorified salesman who thought his big paycheck made him smarter than anyone else, and he lived the sort of bicoastal lifestyle the rich favored in those days.  If he’d ever set foot in the ‘flyover states’ before then, I never heard of it. But of course the crowd wasn’t having any of it. Something like nine thousand people showed up.  They were shouting at him, and he was trying to make himself heard, and somebody lunged for the platform and a security guard panicked and opened fire, and the crowd mobbed the platform. It was all over in maybe five minutes. As I recall, two of the guards survived. The other four were trampled and beaten to death, and nineteen people were shot—and Michael Yates was quite literally torn to shreds. There was hardly enough left of him to bury.

“So that’s what happened on April 29th, 2024. The crowd scattered as soon as it was all over, before Homeland Security troops could get here from their barracks; the feds declared a state of emergency and shut Toledo down, and then two days later the riots started down in Birmingham and the National Guard units sent to stop them joined the rioters. Your historians probably say that that’s where and when the Second Civil War started, and they’re right—but this is where the seed that grew into the Lakeland Republic got planted.”

“Hell of a seed,” I said, for want of anything better.

“I won’t argue. But this—” Her gesture indicated the monument, and the shadow of a vanished building.  “—this is a big part of why the whole Midwest went up like a rocket once the Birmingham riots turned serious, and why nothing the federal government did to get people to lay down their arms did a bit of good. Every family I knew back in those days had either lost a child or knew someone who had—but it wasn’t just that. There had been plenty of other cases where the old government put the financial interests of big corporations ahead of the welfare of its people—hundreds of them, really—but this thing was that one straw too many.

“And then, when the fighting was over, the constitutional convention was meeting, and people from the World Bank and the IMF flew in to offer us big loans for reconstruction, care to guess what one of their very first conditions was?”

I didn’t have to answer; she saw on my face that I knew the story. “Exactly, Mr. Carr. The provisional government had already passed a law banning genetically modified organisms until adequate safety tests could be done, and the World Bank demanded that we repeal it.  To them it was just a trade barrier. Of course all of us in the provisional government knew perfectly well that if we agreed to that, we’d be facing Michael Yates’ fate in short order, so we called for a referendum.”

She shook her head, laughed reminiscently. “The World Bank people went ballistic. I had one of their economists with his face six inches from mine, shouting threats for fifteen minutes in half-coherent English without a break. But we held the referendum, the no vote came in at 89%, we told the IMF and the World Bank to pack their bags and go home, and the rest of our history unfolded as you’ve seen—and a lot of it was because of a pavement streaked with blood, right here.”

Something in her voice just then made me consider her face closely, and read something in her expression that I don’t think she’d intended me to see. “You were there, weren’t you?” I asked.

She glanced up at me, looked away, and after a long moment nodded.

A long moment passed. The clop-clop of a horsedrawn taxi came close, passed on into the distance. “Here’s the thing,” she said finally. “All of us who were alive then—well, those who didn’t help tear Michael Yates to pieces helped tear the United States of America to pieces.  It was the same in both cases:  people who had been hurt and deceived and cheated until they couldn’t bear it any longer, who finally lashed out in blind rage and then looked down and saw the blood on their hands.  After something like that, you have to come to terms with the fact that what’s done can never be undone, and try to figure out what you can do that will make it turn out to be worthwhile after all.”

She took a watch out of her purse, then, glanced at it, and said, “Oh dear. They’ve been waiting for me in the committee room for five minutes now. Thank you for listening, Mr. Carr—will I see you at the Assembly next Sunday?”

“That’s the plan,” I told her. She got up, we made the usual polite noises, and she hurried away toward the Capitol. Maybe she was late for her meeting, and maybe she’d said more than she’d intended to say and wanted to end the conversation. I didn’t greatly care, as I wanted a little solitude myself just then.

I’d known about DM-386 corn, of course, and my family wasn’t the only one I knew that had lost a kid to the fatal lung defects the starfish stuff caused if the mother got exposed to it in the wrong trimester. For that matter, plenty of other miracle products had turned out to have side effects nasty enough to rack up a fair-sized body count. No, what unsettled me was thinking of the pleasant old lady I’d just been sitting with as a young woman with blood dripping from her hands.

Every nation starts that way. The Atlantic Republic certainly did—I knew people back home who’d been guerrillas in the Adirondacks and the Alleghenies, and they’d talk sometimes about things they’d seen and done that made my blood run cold.  The old United States got its start the same way, two and a half centuries further back. I knew that, but I hadn’t been thinking about it when I’d sat on the park bench musing about how calm the Lakeland Republic seemed in the middle of all the consternation outside its borders. It hadn’t occurred to me what had gone into making that calm happen.

The breeze whispering past the stone monument seemed just then to have a distant scent of blood on it. I turned and walked away.

Wednesday, May 11, 2016

A Few Notes on Burkean Conservatism

Several times recently, in posts on this blog discussing the vagaries of current American politics, I’ve had occasion to reference my own political philosophy by name. This has caused a certain amount of confusion and curiosity, because the moniker I mentioned—“moderate Burkean conservative”—falls nowhere on the narrow range of political opinions allowed into our collective discourse these days.

Now of course a good part of the confusion arises because the word “conservative” no longer means what it once meant—that is to say, a person who wants to conserve something. In today’s America, conservatives who actually want to conserve are as rare as liberals who actually want to liberate.  The once-significant language of an earlier era has had the meaning sucked right out of it, the better to serve as camouflage for a kleptocratic feeding frenzy in which both establishment parties participate with equal abandon. Putting meaning back into the words can be a risky proposition, in turn, because so many Americans are used to waving them about as arbitrary noises linked to an assortment of vague emotions, the common currency of what passes for thought in so much of modern American life.

Nonetheless, I think the risk is worth taking, if only because a genuine conservatism—that is, a point of view oriented toward finding things worth conserving, and then doing something to conserve them—is one of the few options that offer any workable strategies for the future as the United States accelerates along the overfamiliar trajectory of a democracy in terminal crisis.

Let’s start with the least familiar of the terms I mentioned above, “Burkean.” The reference is to the Anglo-Irish writer, philosopher, and politician Edmund Burke (1729-1797), generally considered the founder of the Anglo-American conservative tradition. This is all the more interesting in that Burke himself was none of the things that get labeled “conservative” in today’s America. For example, while he was himself an Anglican Christian, he defended the rights of Catholics to freedom of worship at a time when this was a very unpopular stance—roughly on a par with defending the rights of Satanists in today’s America—and lent his own home to a group of Hindus traveling in Britain who had been refused any other place to celebrate one of their religious holidays.

He was also an outspoken supporter of the American colonists in their attempts to seek redress against the British government’s predatory and punitive trade policies, and maintained his support even when all peaceful options had been exhausted and the colonists rose in rebellion. Yet this was the man who, toward the end of his life, penned Reflections on the Revolution in France, which critiqued the French revolutionaries in incisive terms, and which has much the same place in the history of Anglo-American conservatism that The Communist Manifesto has in the history of the modern radical left.

This doesn’t mean, by the way, that Burkean conservatives quote Burke’s writings the way Marxists quote Marx or Objectivists quote Ayn Rand. Like other human beings, Burke was a blend of strengths and weaknesses, principles and pragmatism, and the political culture of his time and place accepted behavior that most people nowadays consider very dubious indeed.  Those of my readers who want to hear what Burke had to say can find Reflections on the Revolution in France online, or in any decent used book store; those who want to engage in ad hominem argument can find plenty of ammunition in any biography of Burke they care to consult. What I propose to do here is something a bit different—to take Burke’s core ideas and set them out in a frame many of my readers will recognize at once.

The foundation of Burkean conservatism is the recognition that human beings aren’t half as smart as they like to think they are. One implication of this recognition is that when human beings insist that the tangled realities of politics and history can be reduced to some set of abstract principles simple enough for the human mind to understand, they’re wrong. Another is that when human beings try to set up a system of government based on abstract principles, rather than allowing it to take shape organically out of historical experience, the results will pretty reliably be disastrous.

What these imply, in turn, is that social change is not necessarily a good thing. It’s always possible that a given change, however well-intentioned, will result in consequences that are worse than the problems that the change is supposed to fix. In fact, if social change is pursued in a sufficiently clueless fashion, the consequences can cascade out of control, plunging a nation into failed-state conditions, handing it over to a tyrant, or having some other equally unwanted result. What’s more, the more firmly the eyes of would-be reformers are fixed on appealing abstractions, and the less attention they pay to the lessons of history, the more catastrophic the outcome will generally be.

That, in Burke’s view, was what went wrong in the French Revolution. His thinking differed sharply from that of continental European conservatives, in that he saw no reason to object to the right of the French people to change a system of government that was as incompetent as it was despotic. It was the way they went about it—tearing down the existing system of government root and branch, and replacing it with a shiny new system based on fashionable abstractions—that was problematic. What made that problematic, in turn, was that it simply didn’t work. Instead of establishing an ideal republic of liberty, equality, and fraternity, the wholesale reforms pushed through by the National Assembly plunged France into chaos, handed the nation over to a pack of homicidal fanatics, and then dropped it into the waiting hands of an egomaniacal warlord named Napoleon Bonaparte.

Two specific bad ideas founded in abstractions helped feed the collapse of revolutionary France into chaos, massacre, tyranny, and pan-European war. The first was the conviction, all but universal among the philosophes whose ideas guided the revolution, that human nature is entirely a product of the social order. According to this belief, the only reason people don’t act like angels is that they live in an unjust society, and once that is replaced by a just society, why, everybody would behave the way the moral notions of the philosophes insisted they should. Because they held this belief, in turn, the National Assembly did nothing to protect their shiny up-to-date system against such old-fashioned vices as lust for power and partisan hatred, with results that made the streets of Paris run with blood.

The second bad idea had the same effect as the first. This was the conviction, also all but universal among the philosophes, that history moved inevitably in the direction they wanted: from superstition to reason, from tyranny to liberty, from privilege to equality, and so on. According to this belief, all the revolution had to do to bring liberty, equality, and fraternity was to get rid of the old order, and voila—liberty, equality, and fraternity would pop up on cue. Once again, things didn’t work that way. Where the philosophes insisted that history moves ever upward toward a golden age in the future, and the European conservatives who opposed them argued that history slides ever downward from a golden age in the past, Burke’s thesis—and the evidence of history—implies that history has no direction at all.

The existing laws and institutions of a society, Burke proposed, grow organically out of that society’s history and experience, and embody a great deal of practical wisdom. They also have one feature that the abstraction-laden fantasies of world-reformers don’t have, which is that they have been proven to work. Any proposed change in laws and institutions thus needs to start by showing, first, that there’s a need for change; second, that the proposed change will solve the problem it claims to solve; and third, that the benefits of the change will outweigh its costs. Far more often than not, when these questions are asked, the best way to redress any problem with the existing order of things turns out to be the option that causes as little disruption as possible, so that what works can keep on working.

That is to say, Burkean conservatism can be summed up simply as the application of the precautionary principle to the political sphere.

The precautionary principle? That’s the common-sense rule that before you do anything, you need to figure out whether it’s going to do more good than harm. We don’t do things that way in the modern industrial world. We dump pesticides into the biosphere, carbon dioxide into the air, and inadequately tested drugs into our bodies, and then figure out from the results what kind of harm they’re going to cause. That’s a thoroughly stupid way of going about things, and the vast majority of the preventable catastrophes that are dragging modern industrial society down to ruin result directly from that custom.

Behind it, in turn, lies one of the bad ideas cited above—the notion that history moves inevitably in the direction we want. Yes, that’s the myth of progress, the bizarre but embarrassingly widespread notion that history is marching ever onward and upward, and so anything new is better just because it’s new, which keeps so many people from asking obvious questions about where our civilization is headed and whether any sane person would want to go there. I’ve discussed this in quite a few earlier posts here, as well as in my book After Progress; I mention it here to point out one of the ways that the political views I’m explaining just now interface with the other ideas I’ve discussed here and elsewhere.

The way that a moderate Burkean conservatism works in practice will be easiest to explain by way of a specific example. With this in mind, I’m going to go out of my way to offend everyone, by presenting a thoroughly conservative argument—in the original, Burkean sense of that word “conservative,” of course—in favor of the right to same-sex marriage.

We’ll have to pause first for a moment, though, to talk about that word “right.” This is necessary because by and large, when Americans hear the word “right,” their brains melt into a puddle of goo. The assumption these days seems to be that there’s some indefinite number of abstract rights hovering out there in notional space, and all of them are absolute and incontrovertible, so that all you have to say is “I have a right to [whatever]!” and everybody is supposed to give you whatever it is right away. Of course everybody doesn’t, and the next step is the kind of shrill shouting match that makes up so much of American political nonconversation these days, in which partisans of the right to X and partisans of the right to Y yell denunciations at each other for trying to deprive each other of their rights.

If you happen to be a religious person, and believe in a religion that teaches that God or the gods handed down a set of rules by which humans are supposed to live, then it probably does make sense to talk like this, because you believe that rights exist in the mind of the deity or deities in question. If you’re not a religious person, and claim to have a right that other people don’t recognize, you’ll have a very interesting time answering questions like these: in what way does this supposed right exist? How do you “have it”—and how do the rest of us tell the difference between this right you claim to have and, say, an overdeveloped sense of entitlement on your part?

All these confusions come from the attempt to claim that rights have some kind of abstract existence of their own. To the Burkean conservative, this is utter nonsense. A right, from the Burkean point of view, is an agreement among the members of a community to allow some sort of behavior. That’s what it is, and that’s all it is. The right to vote, say, exists because the people of a given nation, acting through political institutions, confer it on a certain class of persons—say, all adult citizens.

What if you don’t have a right, and believe that you should have it? That’s called “having an opinion.” There’s nothing wrong with having an opinion, but it doesn’t confer a right. If you want to have the right you think you should have, your job is to get your community to confer it on you. In a perfect world, there would no doubt be some instant, foolproof way to establish a right, but we don’t live in a perfect world. We live in a world where the slow, awkward tools of representative democracy and judicial review, backed up by public debate, are the least easily abused options we’ve yet found to accomplish this task. (That doesn’t mean, please note, that they can’t be abused; it means that they’re not quite as prone to abuse as, say, the institutions of theocracy or military dictatorship.)

With that in mind, we can proceed to the right to same-sex marriage. The first question to ask is whether government has any business getting involved in the issue at all. That’s not a minor question. The notion that legislation is the solution to every problem has produced a vast number of avoidable disasters. In this case, though, what prevented same-sex couples from marrying was governmental regulations. Changing those regulations requires governmental action.

The second question to ask is whether government has any compelling interest in the existing state of affairs. History shows that letting government interfere in people’s private lives is a very risky thing to do, and while it can be necessary, there has to be a compelling interest to justify it—for example, in the case of laws prohibiting child abuse, the compelling interest of protecting children against violence. No such compelling interest justifies government interference in the marital decisions of legally competent, consenting adults; as noted further on, “Ewww, gross!” does not count as a compelling interest.

The third question to ask is whether the people who will be affected by the change actually want the change. That’s not a minor question, either; history is full of grand projects, supposedly meant to help some group of people, that were rejected by the people who were to be “helped,” and those inevitably turn out badly. In this case, though, there were plenty of same-sex couples who wanted to get married and couldn’t. Notice also that the proposed change was permissive rather than mandatory—that is, same-sex couples could get married, but they could also stay unmarried. As a general rule of thumb, permissive regulations don’t require the same level of suspicion as mandatory regulations.

The fourth question to ask is whether anyone would be harmed by the change. Here it’s important to keep in mind that “harmed” does not mean “offended;” nor, for that matter, are you harmed by being kept from forcing others to do what you want them to do. One of the eternal annoyances of liberty is that others inevitably use it in ways that you and I find offensive.  We put up with the inconvenience because that’s the price of having liberty ourselves. Claims that this or that person is going to be harmed by a change thus need to evince specific, concrete, measurable harm. In this case, that standard was not met, as there are no Purple Hearts issued for being butthurt.

The fifth question is whether the proposed change is a wholly new right, a significant expansion of an existing right, or the extension of an existing right in its current form to a group of people who did not previously have it. Creating a wholly new right can be a risky endeavor, as it’s hard to figure out in advance how that will interact with existing rights and institutions. A significant expansion of an existing right is less hazardous, but it still needs to be approached with care. Extending an existing right in its current form to people who don’t previously have it, by contrast, tends to be the safest of changes, since it’s easy to figure out what the results will be—all you have to do is see what effect it has had in its more restricted application. In this case, an existing right was to be extended to same-sex couples, who would have the same rights and responsibilities as couples who married under existing law.

The sixth question, given that the right in question is being extended in its current form to a group of people who didn’t previously have it, is whether that right has been extended before. In this case, the answer is yes. Marriage between people of different races used to be illegal in many American states. When extending the right of marriage to mixed-race couples was being debated, the same arguments deployed against same-sex marriage got used, but all of them amounted in practice to someone being offended. Mixed-race marriages were legalized, a lot of mixed-race couples got married, none of the horrible consequences imagined by the opposition ever got around to happening, and that was that.

So, to sum up, we have a group of people who want a permissive regulation granting them a right already held by other people. No actual harm has been demonstrated by those opposed to granting that right, and no compelling interest prevents government from granting that right. The same right has been extended before with no negative consequences, and a very simple change in the wording of existing marriage laws will confer the right. Under these circumstances, there is vastly more justification for granting the right than for refusing it, and it should therefore be granted.

No doubt some people will take offense at so mealy-mouthed an adding up of pros and cons. Where are the ringing affirmations of justice, equality, and other grand abstract principles? That, of course, is exactly the point. In the real world, grand abstract principles count for little. In a society that values liberty—not, please note, as a grand abstract principle, but as a mutual agreement that people can do as they wish so long as that doesn’t infringe on the established rights of others—what matters when someone petitions for redress of a grievance is simply whether that petition can be granted without any such infringement. The questions asked above, and the institutions of representative democracy and judicial review, are there to see to it that this happens. Do they always succeed? Of course not; they just do a marginally better job than any other system. In the real world, that’s justification enough.

What about the religious communities that are opposed to that right? (This is where I’m going to shift gears from offending my readers on the rightward end of things to offending those on the other end of the political spectrum.) Conservative Christian groups are a religious minority in America today, and it’s a well-established rule in American law and custom that reasonable accommodation should be made to religious minorities when this can be done without violating the agreed-upon rights of others. That doesn’t give conservative Christians the right to force other people to follow conservative Christian teachings, any more than it would give Jews the right to forbid the sale of pork in America’s grocery stores. It does mean that conservative Christians should not be forced to participate in activities they consider sinful, any more than Jewish delicatessens should be forced to sell pork.

By and large, businesses that serve the general public are rightly required to serve the general public, rather than picking and choosing who they will or won’t serve, but there are valid exceptions, and religion is one of them. I’m told that in New York State, orthodox Jewish businesses are legally allowed to post signage stating that Jewish religious law applies on the premises, and this exempts them from certain laws governing other businesses; thus, for example, a woman who enters such a business with uncovered hair will not be served.  It would be a reasonable accommodation for conservative Christian businesses that cater to weddings to be able to post signage noting that they only provide services to the kinds of weddings authorized by their own religious laws. That would let same-sex couples take their business elsewhere; it would also let people who support the right of same-sex marriage know which businesses to boycott, just as it would let conservative Christians support their co-religionists.

Again, any number of shiny abstractions could be brandished about to insist that conservative Christian businesses should not have that right, but here again, we’re not dealing with abstractions. We’re dealing with the need to find reasonable accommodation for differing beliefs in a society that, at least in theory, values liberty. Claims that this or that person will be harmed by letting a religious minority practice its faith on privately owned business premises, again, have to evince specific, concrete, measurable harm. Being offended doesn’t count here, either, nor does whatever suffering comes your way from being denied the power to make other people do what you think they ought to do.

My readers may have noticed that, given the arrangements just outlined, nobody in the debate over same-sex marriages would get everything they want. That’s at least as offensive as anything else I’ve suggested in this post, but it’s the foundation of Burkean conservatism, and of democratic politics in general. In the messy, gritty world of actual politics, nobody can ever count on getting everything they want—even if they shout at the top of their lungs that they have a right to it—and the best that can be expected is that each side in any controversy will get the things they most need. That’s the kind of resolution that allows a society to function, instead of freezing up into permanent polarization the way America has done in recent years—and it’s the kind of resolution that might just possibly bring some semblance of representative democracy intact through the era of crisis looming ahead of us just now.

*******
Two other things. First, I’m frankly astounded by the outpouring of congratulations—not to mention tip jar contributions—that came in response to last week’s post on the tenth anniversary of The Archdruid Report. On the off chance that anyone didn’t get thanked sufficiently, please know that the lapse wasn’t intentional! I’m more grateful than I can say for the support and encouragement I’ve received from the community of readers that’s emerged around this blog.

Second, I’m delighted to announce that the first issue of Into the Ruins, as far as I know the first-ever magazine of deindustrial science fiction, is now in print. This is the periodical equivalent of the After Oil anthologies, chock-full of new stories selected by editor Joel Caris; those who like compelling stories about the future we’re actually likely to get won’t want to miss it. Subscribers should be getting their copies shortly if those haven’t arrived already; as for the rest of you—well, what are you waiting for? ;-) You can order copies or buy a subscription at this website.

Wednesday, May 04, 2016

The Dawn of the Cthulhucene: A Retrospective

"This year’s Earth Day in Ashland, Oregon, where I live, featured an interfaith service at the local Unitarian church, and I wasn’t too surprised to get a call inviting me to be one of the presenters.”

That was the opening sentence of the first post ever to appear on The Archdruid Report, Real Druids, which went up ten years ago this Friday. When I typed those words, I had no clear idea of what I was going to do with the blog I’d just started. The end of the publishing industry I wrote for in those days was just then waking up to the marketing potential of author blogs; I was also in the third year of my unpaid day job as head of the Ancient Order of Druids in America (AODA), a small and old-fashioned Druid order distinctly out of step with the pop-culture Neopaganism of the time, and hoped to use a blog to bring the order to the attention of anyone out there who might be interested in something so unfashionable.

So I sat down at the computer, logged into my Blogger account, clicked on the button marked new post, and stared blankly at it for a while before I started to type. That, as they say, is how it all began.

In terms of the perspectives with which this blog deals—the grand sweep of human history, and the much vaster sweep of geological and evolutionary deep time—ten years is less than an eyeblink. In terms of a single human life, though, it’s a considerable span. Over that period I’ve moved from Ashland to Cumberland, Maryland, the red-brick mill town in the north central Appalachians where I now live. My writing career has burgeoned since then, too, helped along considerably by the two novels and nine nonfiction books that started out as sequences of blog posts.

My other career, the unpaid one mentioned above, also went through plenty of changes—if any of my readers ever have the opportunity to become the presiding officer of a nearly defunct Druid order and help it get back on its feet, I certainly recommend the experience! Still, twelve years in the hot seat was enough, and at the winter solstice just past I stepped down as Grand Archdruid of AODA with a sigh of relief, and handed the management of the order over to my successor Gordon Cooper.

There have been plenty of other changes over the last ten years, of course, and quite a few of them also affected The Archdruid Report. One that had a particularly significant impact was the rise, fall, and resurgence of the peak oil scene. Most of a decade before that first post, a handful of people—most of them petroleum geologists and the like—noticed that oil was being pumped much more quickly than new oilfields were being discovered. Now of course this turn of events had been predicted in quite a bit of detail well before then; back in the 1970s, in particular, when the phrase “limits to growth” hadn’t yet become taboo in polite company, plenty of people noticed that trying to extract an infinite supply of oil from a finite planet was guaranteed to end badly. 

That awareness didn’t survive the coming of the Reagan counterrevolution. More precisely, it survived only on the far fringes of the collective conversation of our time, where the few of us who refused to drink Ronnie’s koolaid spent most of two decades trying to figure out how to live in a civilization that, for all intents and purposes, seemed to have succumbed to a collective death wish. Still, our time in exile didn’t last forever.  It was 1998, as I recall, when I found the original Running On Empty email list—one of the first online meeting places for people concerned about peak oil—and I stayed with the movement thereafter as it slowly grew, and the rising tide of data made the case for imminent peak oil harder and harder to dismiss out of hand.

Two books published in the early 2000s—Richard Heinberg’s The Party’s Over and James Howard Kunstler’s The Long Emergency—helped launch the peak oil movement into public awareness. Not incidentally, those were also the books that convinced me that it might just be possible to talk frankly about the predicament of industrial society: not just peak oil, but the broader collision between the economic ideology of limitless growth and the hard realities of a fragile planet. The Archdruid Report came out of that recognition, though I thought at first that its audience would be limited to the Druid community; I figured that people who had embraced Druid nature spirituality might be more open to the kind of intellectual heresy I had in mind. The blog turned out to have a much broader audience than that, but it took me quite a while to realize that, and longer still to recognize its implications.

Meanwhile the peak oil movement hit its own peak between 2008 and 2010, and began skidding down the far side of its own Hubbert curve. That’s standard for movements for social change, though it was probably worsened by the premature triumphalism that convinced many peakniks that once they’d proved their case, governments had to do something about the impending crisis, and that also led some large peak oil organizations to spend money they didn’t have trying to run with the big dogs. At this point, as the fracking bubble falters and the economy misbehaves in ways that conventional economic theory can’t account for but peak oil theory can, the bottom has likely been reached, and a much shorter period of exile is duly ending.  Talk about peak oil in the media and the political sphere is picking up again, and will accelerate as the consequences of another decade of malign neglect bear down with increasing force on the industrial world.

One of the things I find most interesting about this trajectory is that it didn’t impact The Archdruid Report in the way I would have expected. During the years when the peak oil movement was all over large portions of the internet, my monthly page views and other site stats remained fairly modest. It wasn’t until 2010, when the peak oil scene was beginning to falter, that my stats started to climb steadily; my first breakout all-over-the-internet post came in 2011, and thereafter readership has remained high, wobbling up and down around an average of a quarter million page views a month. All ten of my top ten posts, in terms of total unique page views, appeared between 2011 and this year. On the off chance my readers are interested, here they are:

4. How Not to Play the Game, June 29, 2011
5. An Elegy for the Age of Space, August 24, 2011
6. The Next Ten Billion Years, September 4, 2013
7. Into an Unknown Country, January 2, 2013
9. The Recovery of the Human, February 1, 2012

(I discovered in the process of making this list, by the way, that the Blogger gizmo for tracking all time top posts doesn’t actually do what it’s supposed to do. Like so much of the internet, it provides the illusion of exact data but not the reality, and I had to go back over the raw numbers to get an accurate list. My readers may draw their own conclusions about the future of a society that increasingly relies on internet-filtered information as a source of guidance.)

None of these posts are only about peak oil, or even about peak energy.  You’ll find references to the hard physical and geological limits of the energy resources available to our species in most of them, to be sure, and quite a few detailed discussions of those limits and their implications among the other 489 posts that have appeared here in the last decade. That said, those limits aren’t quite central to this blog’s project. They derive, like the other common themes here, from something else.

The philosopher Arthur Schopenhauer noted that his treatise The World as Will and Representation, massive though it is, was simply the working out of a single idea in all its ramifications. The same is true of this blog, though I’m an essayist and novelist rather than an analytical philosopher, and thus my pursuit of the idea I’m trying to pin down has been somewhat more discursive and rambling than his. (I make no apologies for that fact; I write the way I like to write, for those who like to read it.) Not all ideas can be summed up in a few words or a snappy slogan.  In particular, the more thoroughly an idea challenges our basic preconceptions about the nature of things, and the more stark the gap between its implications and those of the conventional wisdom, the more thoroughly and patiently it must be explored if it’s going to be understood at all.

Even so, there are times when an unexpected turn of phrase can be used, if not to sum up a challenging idea, at least to point in its direction forcefully enough to break through some of the barriers to understanding. Thanks to one of my readers—tip of the archdruidical hat to Mgalimba—I encountered such a turn of phrase last week. That came via a 2014 talk by cutting-edge thinker Donna Haraway, in which she challenged a currently popular label for the geological period we’re entering, the Anthropocene, and proposed her own coinage: the Cthulhucene.

She had specific reasons for the proposal, and I’d encourage my readers to see what she had to say about those, but I have somewhat different reasons for adopting the term. H.P. Lovecraft, who invented the squid-faced, dragon-winged, monster-clawed devil-god Cthulhu for one of his best stories, used that being and the other tentacled horrors of his imaginary pantheon to represent a concept as alien to the conventional thought of industrial society as the Great Old Ones themselves. The term Lovecraft used for that concept was “indifferentism”—the recognition that the universe is utterly indifferent to human beings, not sympathetic, not hostile, not anything, and that it’s really rather silly of us, all things considered, to expect it to conform to our wishes, expectations, or sense of entitlement.

Does this seem embarrassingly obvious? The irony, and it’s a rich one, is that most people nowadays who insist that the universe is indifferent to humanity turn around and make claims about the future that presuppose exactly the opposite. I’ve long since lost track of the number of committed atheists I’ve met, for example, who readily agreed that the universe is indifferent to our desires, but then insisted there has to be some other energy resource out there at least as cheap, concentrated, and abundant as the ones we’re currently using up. That claim only makes sense if you assume that the supplies of matter and energy in the cosmos have somehow been arranged for our benefit; otherwise, no, there doesn’t have to be any other resource out there. We could simply use up what we’ve got, and then have to get by without concentrated energy sources for the rest of the time our species happens to exist.

That’s far from the only example of stealth anthropocentrism I’ve encountered in the same context. I’ve also long since lost track of the number of committed atheists who reject the idea of a caring cosmos out of hand, but then go on to claim that technological progress of the kind we’ve made is irreversible. That claim only makes sense if you assume that history is somehow arranged for our benefit, so that we don’t have to worry about sliding back down the long slope we climbed so laboriously over the last five centuries or so.  If history is indifferent to our preferences, by contrast, the way down is just as easy as the way up, and decline and fall waits for us as it did for all those dead civilizations in the past.

Then there’s the most embarrassing claim of all, the devout insistence that humanity’s destiny lies out there in space. “Destiny” is a theological concept, and it’s frankly risible to find it being tossed around so freely by people who insist they’ve rejected theology, but let’s go a step further here. If the universe is in fact indifferent to our wishes and desires, the mere fact that a certain number of people have gotten worked up over science-fiction visions of zooming off toward the stars does not oblige the universe to make space travel a viable option for our species. There are in fact very good reasons to think that it’s not a viable option, but you won’t get many people to admit that these days. We (or, rather, some of us) dream of going to the stars, therefore it must be possible for us to go to the stars—and before you claim that human beings can achieve anything they can imagine, dear reader, I encourage you to read up on the long history of attempts to build a working perpetual motion machine.

I’ve picked on atheists in these three examples, and to some extent that’s unfair. It’s true that most of the really flagrant examples of stealth anthropocentrism I’ve encountered over the last ten years came from people who made quite a point of their atheism, but of course there’s no shortage of overt toxic anthropocentrism over on the religious side of things—I’m thinking here of those Christian fundamentalists who claim that Christ is coming soon and therefore it doesn’t matter how savagely we lay waste a world they themselves claim that God made and called good. I’ve met atheists, to be fair, who recognize that their belief in the absence of purpose in the cosmos implies that no providence will protect us from the consequences of our own stupidity. I’ve also met religious people who recognize that the universe defined by their beliefs is theocentric, not anthropocentric, and that human beings might therefore want to cultivate the virtue of humility and attend to the purposes that God or the gods might have in mind, rather than assuming in blithe arrogance that whatever humanity thinks it wants, it ought to get.

The dawn of the Cthulhucene represents the arrival of a geological period in which those latter ways of understanding the world will be impossible to ignore any longer. We are beginning to learn that no matter how hard we scrunch our eyes shut and plug our ears and shout “La, la, la, I can’t hear you” to the rest of the universe, the universe is not going to give us what we want just because we want it: that the resources we waste so cluelessly will not be replaced for our benefit, and that we will have to face every one of the consequences of the damage we do to the planetary biosphere that keeps us alive. In place of the megalomaniacal fantasy of Man the Conqueror of Nature, striding boldly from star to star in search of new worlds to plunder, we are beginning to see a vast and alien shape rising before us out of the mists of the future, a shape we might as well call Cthulhu: winged, scaled, tentacled, clawed, like a summary of life on earth, regarding us with utterly indifferent eyes.

In those eyes, we balding social primates are of no more importance in the great scheme of things than the trilobites or the dinosaurs, or for that matter the countless species—intelligent or otherwise—that will come into being long after the last human being has gone to join the trilobites and dinosaurs in Earth’s library of fossil beds.  The sooner we grasp that, the easier it will be for us to drop the misguided anthropocentric delusions that blind us to our situation, wake up to the mess we’ve made of things, and get to work trying to save as many of the best achievements of the last three hundred years or so as we can before the long night of the deindustrial dark ages closes in around us.

Given that the universe is simply not interested in pandering to the fantasies of omnipotence currently fashionable among influential members of our species—given that no special providence is going to rescue us from the consequences of our assorted stupidities, no resource fairy is going to give us a shiny new energy source to make up for the resources we now squander so recklessly, and the laws of nature are already sending the results of our frankly brainless maltreatment of the biosphere back in our faces with an utter lack of concern for our feelings and interests—how should we then live? That’s the theme that I’ve been trying to explore, in one way or another, since this blog got under way. It’s a vast theme, and one that I haven’t even begun to exhaust yet. I have no idea if I’m still going to be blogging here ten years from now, but if not, it won’t be due to lack of things to talk about.

One more thing deserves to be said here, though. All along the journey that brought me from that first tentative post to this week’s retrospective, one of the things that’s made the way easier and a good deal more enjoyable has been the enthusiasm, understanding, and critical insight that’s been shown by so many of my readers. Time after time, faced with the choice of backing away from a controversial subject or plunging ahead, I’ve taken the plunge, and discovered that my readers were more than ready to jump with me. Time after time, too, when I’ve offered a rough sketch of some part of the landscape I’m trying to explore, my readers have asked questions and posed challenges that helped me immeasurably in clarifying my thinking and discarding approximations that didn’t work. As this blog begins its eleventh year, I’d like to thank everyone who’s made a comment here—and also everyone who’s made a donation to the tip jar and thus helped me afford the hours each week that go into these blog posts. My gratitude goes with each of you; I hope you’ve found the journey so far as rewarding as I have.