It’s a funny thing, this attempt to discuss the future in advance. Much of the time, like everyone else in the business, I talk about the future as though it’s a place we simply haven’t reached yet, with a geography that can be explored at least to some extent from the vantage point of the present. That’s not entirely inappropriate; so much of the near future has been defined in advance by choices already made and opportunities long since foregone that it’s not at all hard to sketch out the resulting shape of things.
Still, the choices we make in the present are as often as not defined by our beliefs about the future, and so there’s a complicated series of feedback loops that comes into play. Self-fulfilling prophecies are one option, but far from the most common. Much more often, predictions about the future that gain enough of an audience to become a force in their own right kickstart patterns of change that go ricocheting off in unexpected directions and bring about a future that nobody expected.
Industrial civilization’s attempt to expand out into interplanetary space, the theme of last week’s post here on The Archdruid Report, is a case in point. The handful of space technologies that turned out to have practical uses, and the technological advances that spun off from each of the major space programs, weren’t anticipated at all by the people who ordered the rockets to be built, the satellites to be launched and the astronauts to risk their lives. Cold War rivalry played a major role, to be sure, but that rivalry could have expressed itself in any number of terrestrial ways. What very few people noticed then or later was the extent to which all parties involved took their core assumptions and ideas from an utterly improbable source—a genre of pulp literature that most people at the time dismissed as lowbrow trash suitable only for twelve-year-old boys. Yes, I’m talking about science fiction.
I’m not sure how many people have noticed that science fiction is the one really distinctive form of fiction created by industrial civilization. Romances, satires, and adventure stories are practically as old as written language; the novel of character and manners had its first great flowering in tenth-century Japan, and detective stories were written in medieval China; even fantasy fiction of the modern kind, which deliberately cashes in on legends nobody actually believes any more, flourished in Renaissance Europe—it still amazes me that nobody has rewritten Amadis of Gaul to fit the conventions of modern fantasy fiction and republished it as “the sixteenth century’s number one fantasy bestseller,” which it unquestionably was. Science fiction—the branch of literature that focuses on the impact of scientific and technological progress—is the exception.
It’s important, for what follows, to be clear about definitions here. A story about traveling to another world isn’t necessarily a work of science fiction; the ancient Greek satirist Lucian wrote one about a ship tossed up to the Moon by a waterspout, and Cyrano de Bergerac—yes, that Cyrano; you didn’t know he was a real person, did you?—wrote a ripsnorter about traveling to the Moon and the Sun via a series of even more unlikely gimmicks; both of them were engaging pieces of absurdity riffing off the fact that nobody actually thought the thing could ever happen. It took Mary Shelley, watching the rain splash down on a Swiss vacation as her husband’s literary friends toyed with ghost stories in much the same spirit as Lucian imagined moonflight, to create a new kind of literature. Frankenstein, the novel she started on that vacation, became a bestseller precisely because it was believable; recent advances in the life sciences, especially Luigi Galvani’s eerie experiment that caused a frog’s amputated leg to kick by running electricity through it, made it entirely plausible at the time that some researcher might take things the next step and bring a dead body to life.
Take a single scientific or technological breakthrough, combine it with the modern world, and see what happens—all through the 19th century, and into the 20th, that’s what science fiction (or “scientifiction,” as it was often called) meant. Jules Verne and H.G. Wells, the two great masters of the early genre, rang just about every change on that theme that the technology of the age would justify. Of course both of them wrote voyages to the Moon; in an age of explosive technological progress, traveling to the Moon had moved just that little bit closer to plausibility, but it was just one of the many lively improbabilities they and other authors explored in their stories.
Except, of course, that a good many of them didn’t stay improbable for long. The feedback loop I mentioned earlier was coming into play; in the first decades of the twentieth century, a generation that had grown up on Verne and Wells started putting scientifiction’s dreams into practice. Captain Nemo’s Nautilus quickly took on an uncomfortable reality as the first U-boats slid down the ways. Wells’ “The Land Ironclads” provided the conceptual blueprint for the first generation of tanks, just as his The War in the Air got militaries around the world thinking of the possibilities of aerial bombardment. Most of the other technological notions in turn of the century science fiction got tried out by somebody or other during those years, and those that worked found ready acceptance among audiences that had plenty of fictional models in the back of their minds.
Meanwhile, the fictional models were shifting focus. It was in the 1920s and 1930s that science fiction changed from a genre about any kind of scientific and technological advance you care to name, which it had been until then, to a genre that was basically about space travel. Slowly—it wasn’t an overnight change by any means—stories about spaceships and alien worlds came to dominate the pulp magazines that were the major SF venue of the time; voyages to the Moon became old hat, something to stick in the backstory; Mars and Venus became preferred destinations, and then E.E. “Doc” Smith shot the characters in his Lensman series across interstellar space, and what Brian Aldiss later described as science fiction’s “billion year spree” was on.
By the late 1940s, many of the most popular science fiction writers were working within a common vision of the future—a future history that began sometime in the near future with the first voyages to the Moon and then went on from there, colonizing the solar system, then leaping the gap that separated our solar system from others and beginning the settlement of the galaxy. Whether humanity would meet alien life forms out there in space was a subject of much disagreement; the more romantic authors peopled Mars and Venus with intelligent species of their own, but the spectrum ran from there to authors who imagined a galaxy full of empty but habitable planets just waiting for Homo sapiens to settle them. Even in the imaginary galaxies that bristled with alien species, though, the aliens might as well have been human beings; the universe of the original Star Trek series, where the vast majority of “aliens” were extras from Central Casting with a bit of funny makeup splashed on, was a pretty fair reflection of the SF of a few decades earlier.
It’s a useful exercise to go back and read essays by the SF authors of the 20th century’s middle decades—Isaac Asimov and Arthur C. Clarke were particularly prolific in this vein, but there were plenty of others—and take in what they had to say about the coming Space Age. It wasn’t, by and large, something they felt any need to promote or argue for; it was simply, necessarily going to happen. There would be the first tentative flights into space, followed by the first Moon landing; somewhere in there the first of many space stations would go into orbit, perhaps as a way station to the Moon; Mars and Venus were next on the agenda, first the landings, then the bases, then the colonies, growing as naturally as Jamestown or Plymouth into booming new frontier societies; the asteroids and the moons of Jupiter and Saturn would follow in due order, followed by the outer planets and the cometary halo, and then would come the challenge of the stars.
Among the most fascinating details that popped up here and there in this literature, though, was the conviction that science fiction itself—the literature, the writers, and the subculture that grew up around it in the 1930s and became something like a force of nature in the decades that followed—would play a major role in all this. I’ve long mislaid the title of the Isaac Asimov essay that argued that science fiction had the role of advance scouts on the great onward march of human progress, revealing new avenues for advance here, discovering dead ends and boobytraps there. That wasn’t just Asimov exercising his unusually well-developed ego, either; SF fans, droves of them, shared his opinion. "Fans are Slans," the saying went—I wonder how many people these days even remember A.E. Van Vogt’s novel Slan, much less the race of superhuman mutants that gave it its title; a great many fans saw themselves in that overly roseate light.
What makes this all the more intriguing is that all this happened at a time when science fiction was widely considered very nearly the last word in lowbrow reading. Until the paperback revolution of the late 1950s, most science fiction appeared in pulp magazines—so called because of the wretched quality of the paper they were printed on—with trashy covers on the front and ads for X-ray spectacles and Charles Atlas strength lessons in the back. The cheap mass-marketed paperbacks that picked up from the pulps dropped the ads but by and large kept the tacky cover art. ("There has been a great deal of talk about the big questions of science fiction," SF author L. Sprague de Camp said once. "The truly big question of science fiction is ‘What is that woman in a brass brassiere doing on the cover of my book?’") As for the stories themselves—well, there were a handful of very good authors and some very good short stories and novels, but let’s be honest; there was much, much more that was really, astonishingly bad.
Just as the young engineers and military officers of 1910 had all grown up reading Jules Verne and H.G. Wells, though, as America stumbled into its age of global empire after the Second World War, a very large number of its young men (and a much smaller but still significant fraction of its young women) had grown up daydreaming of rockets to Mars and adventures with the Space Patrol. All that was required to make those daydreams a powerful force in the American collective imagination was a well-aimed shock, and that was supplied in 1957 when a small group of Soviet scientists and military officers talked their superiors into letting them strap a 23-inch aluminum sphere on top of a big new ICBM and launch it into Earth orbit.
The advent of Sputnik I sent the United States into something halfway between a tantrum and a nervous breakdown. Suddenly it became absolutely essential, in the minds of a great many Americans, for the US to beat "godless Russia" in the Space Race. For their part, delighted to find an effective way to goad the United States, Soviet leaders started putting real money into their space program, scoring one achievement after another while Americans played a feeble game of catch-up. Before long a new US president was announcing a massively funded project to put men on the Moon, the first rockets were blasting off from Cape Canaveral, and a nation already intrigued by the notion of outer space, and alternately amused and intrigued by the space-centered folk mythology of the UFO phenomenon, signed on to the opening stages of the grand future history already sketched out for them by decades of pulp science fiction.
For the next decade and a half or so, the feedback loop I’ve described shifted into overdrive as fantasies of a future among the stars shaped the decisions of politicians and the public alike. By the time the Apollo program was well underway, staff at NASA was already sketching out the next generation of manned interplanetary spacecraft that would follow the Moon landing and cutting blueprints for the probes that would begin the exploration of the solar system. That’s when things started to run off the rails that seemingly led to the stars, because the solar system revealed by those probes turned out to have very little in common with the "New Worlds for Man" that the fantasies required.
It takes a while reading old books on the prospects of space travel to grasp just how wide a gap those first planetary probes opened up. Respected scientists were claiming as late as the 1960s that the Moon was a world of romantic vistas with needle-pointed mountains glinting under starlight; it turned out to be gray, largely featureless, and stunningly dull. Venus was supposed to be a tropical planet, warmer than Earth but still probably inhabitable; it turned out to be a searing furnace of a world with surface temperatures hot enough to melt metal. Ever since 19th century astronomers mistook optical effects of telescopes pushed to their limit for markings on the Martian surface, Mars had been the great anchor for dreams of alien intelligence and offworld adventure; when the first Viking lander touched down in 1976, the Grand Canals and alien swordsmen of Barsoom and its godzillion equivalents went wherever wet dreams go to die, and were duly replaced by what looked for all the world like an unusually dull corner of Nevada, minus water, air, and life.
Those were also the years when Mariner and Voyager probes brought back image after image of a solar system that, for all its stunning beauty and grandeur, contained only one world that was fit for human habitation, and that happened to be the one on which we already lived. As the photos of one utterly uninhabitable world after another found their way into one lavish National Geographic article after another, you could all but hear the air leaking out of the dream of space, and even the most vigorous attempts to keep things moving launched by science fiction fans and other enthusiasts for space travel found themselves losing ground steadily. To stand the title of Frederik Pohl’s engaging memoir on its head, science fiction turned out to be the way the future wasn’t.
And science fiction itself? It fragmented and faded. The boost in respectability the space program gave to science fiction gave it a larger and more critical market, and thus midwifed some of the greatest works of the genre; a series of loudly ballyhooed literary movements, none of them particularly long-lived, zoomed off in assorted directions and, as avant-garde movements generally do, left most of their audience behind; efforts at crass commercial exploitation, of which the Star Wars franchise was the most lucrative example, came swooping down for their share of the kill. While other media boomed—visual media are always a couple of decades behind print—the sales of science fiction novels peaked in the late 1970s and early 1980s and then began a decline that still continues, and a genre that had once exercised a potent gravitational force on the collective imagination turned back into just another option in the smorgasbord of mass-produced popular entertainment.
It’s a trajectory worth studying for more reasons than one. The intersection of imperial extravagance, technological triumphalism, and anti-Communist panic that flung billions of dollars into a quest to put men on the Moon made it possible, for a little while, for a minority of visionaries with a dream about the future to think that their dream was about to become reality. The dream unraveled, though, when the rest of the universe failed to follow the script, and a great many of the visionaries found themselves sitting in the dust wondering what happened.
That’s not an uncommon event. The dream of a new American century hawked by the neoconservatives a decade and a half ago, though it ranked down there with the tawdriest bits of pulp science fiction, traced the same trajectory. The election of George W. Bush and the 2001 terror attacks on New York and Washington DC gave them a window of opportunity to try to make that dream a reality, and it turned into exactly the sort of disaster you’d expect when a group of academic intellectuals try to impose their theories on the real world. It would be less embarrassing if the notion of invading a Third World country and turning it into a happy little puppet democracy hadn’t been tried over and over again, without a single success, since the Spanish-American War.
For that matter, the movement toward sustainability in the 1970s, the subject of a great many posts on this blog, followed a similar trajectory. That movement, as I’ve argued, might have succeeded—I grant that it was a long shot at best. Yet the rush of initial enthusiasm, the real achievements that were made, and the bleak discovery that the political and cultural tide had turned against it and the rest of the dream was not going to come within reach are very reminiscent of the arc traced above.
Still, the example that comes most forcefully to mind just now is the one this blog is meant to address, the movement—or perhaps proto-movement—trying to do something useful in the face of peak oil. Right now it’s still gamely poised on the fringes, attracting members and brief bursts of attention, weaving disparate perspectives into early drafts of the vision that will eventually catapult it into the big time. That’s still several years and a Sputnik moment or two away, but the increasingly frantic attempts of both American parties to treat the end of the age of cheap energy as a public relations problem suggest to me that sooner or later that time is going to come.
The temptation when that happens, and it’s a potent one, will be to assume that whatever window of opportunity opens then can be counted upon to last, on the one hand, and will lead to whatever encouraging future the vision promises on the other. Neither of those is guaranteed, and depending on the shape the vision takes, neither one may even be possible. The question that needs to be kept in mind, straight through from the giddy enthusiasm of the initial successes to the bitter winding down that will more than likely follow, is how much useful work can be accomplished during the interval we get.
Wednesday, August 24, 2011
An Elegy for the Age of Space
The orbiters are silent now, waiting for the last awkward journey that will take them to the museums that will warehouse the grandest of our civilization’s failed dreams. There will be no countdown, no pillar of flame to punch them through the atmosphere and send them whipping around the planet at orbital speeds. All of that is over.
In Houston, the same silence creeps through rooms where technicians once huddled over computer screens as voices from space crackled over loudspeakers. The screens are black now, the mission control rooms empty, and most of the staff have already gotten their pink slips. On the Florida coast, where rusting gantries creak in the wind and bats flutter in cavernous buildings raised for the sake of a very different kind of flight, another set of launch pads sinks slowly into its new career as postindustrial ruins.
There are still rockets lifting off elsewhere, to be sure, adding to the globe’s collection of satellites and orbiting space junk. The International Space Station still wheels through the sky, visited at intervals by elderly Soyuz capsules, counting down the days and the missions until its scheduled deorbiting in 2016. In America, a few big corporations have manned space projects on the drawing boards, angling for whatever federal funding survives the next few rounds of our national bankruptcy proceedings, and a few billionaires here and elsewhere are building hobby spacecraft in roughly the same spirit that inspired their Gilded Age equivalents to maintain luxury yachts and thoroughbred stables.
Still, something has shifted. A tide that was expected to flow for generations and centuries to come has peaked and begun to ebb. There will still be rockets surging up from their launch pads for years or decades to come, and some few of them will have human beings on board, but the momentum is gone. It’s time to start coming to terms with the winding down of the age of space.
Ironically, one of the best pieces of evidence for that was the shrill reception given to an article in The Economist announcing The End of the Space Age. The irony was particularly delicious in that The Economist is a British periodical, and Britain has already been through its own retreat from space. During the first half of the 20th century, the British Interplanetary Society was among the most prestigious groups calling for manned space missions, but dreams of a British presence in space collapsed around the same time as Britain’s empire and industrial economy did. It’s hard to miss the schadenfreude in The Economist’s editorial stance, but it was even harder to overlook the bluster and denial splashed across the blogosphere in its wake.
A little perspective might be useful here. When the space shuttle first came off the drawing boards, the much-repeated theory was that it would be the first of a new breed of spacecraft that would make a flight from Cape Canaveral to orbit as commonplace as a flight from New York to Chicago. The next generation would swap out the shuttle’s disposable fuel tank and solid-fuel boosters for a fully reusable first stage that would take a shuttle-equivalent most of the way into orbit, then come back to Earth under its own power and get refueled for the next launch. Further down the road, but already in the concept phase, were spaceplanes that could take off from an ordinary runway and use standard jet engines to get to 50,000 feet or so, where rocket engines would cut in for the leap to orbit. Single-use rockets? In the minds of the space-savvy, they were already as outdated as Model T Fords.
Yet here we are in 2011, the space shuttle program is over, the replacements weren’t built, and for the five years of scheduled life the International Space Station has left, its crews will be getting there via the 1960s-era technology of Soyuz space capsules atop single-use rockets. As for the rest of the steps toward space everyone in the 1960s assumed we would have taken by now—the permanent space stations, the base on the Moon, the manned missions to Mars, and the rest of it—only the most hardcore space fans talk about them any more, and let’s not even discuss their chances of getting significant funding this side of the twelfth of never.
Mind you, I’m not cheering. Though I realized some years ago that humanity isn’t going to the stars—not now, not in the lifetime of our species—the end of the shuttle program with no replacement in sight still hit me like a body blow. It’s not just a generational thing, though it’s partly that; another large part of it was growing up where and when I did. By that I don’t just mean in the United States in the middle decades of the last century, but specifically in the triumphant years between John Glenn’s first orbital flight and Neil Armstrong’s final step onto lunar soil, in a suburb south of Seattle where every third family or so had a father who worked in the aerospace industry. Yes, I remember exactly where I was sitting and what was happening the moment that Walter Cronkite told the world that Apollo 11 had just landed on the Moon.
You didn’t grow up as a geeky, intellectual kid in that sort of setting without falling in love with space. Of course it didn’t hurt that the media was filled to the bursting point with space travel—turn on the tube any evening during my childhood, and if you didn’t get Lost In Space or Star Trek you’d probably catch The Invaders or My Favorite Martian—and children’s books were no different; among my favorites early on was Ronnie Rocket and Suzie Saucer, and I went from there to The Wonderful Flight to the Mushroom Planet, The Spaceship Under the Apple Tree—well, you get the picture. (I won’t even get into science fiction here; that’s a subject that deserves an entire post to itself.) Toys? The G.I. Joe accessory I treasured most in those days was a plastic Mercury space capsule with space suit to match; I also played with Major Matt Mason, Man In Space, and plenty of less efficiently marketed toys as well.
The future that most people imagined in those days had plenty of options primed to catch a young boy’s imagination, to be sure. Sealab—does anybody remember Sealab these days?—was the Navy’s attempt to compete with the romance of space, complete with breathless National Geographic articles about "a new world of limitless resources beneath the sea." (Ahem.) For a while, I followed Sealab as passionately as I did the space program, and yes, my G.I. Joe also had a wetsuit and scuba gear. That was common enough, and so were my less scientific fixations of the time, the monster lore and paranormal phenomena and the like; when you’re stuck growing up in suburbia in a disintegrating family and the only source of hope you can come up with is the prospect that the world isn’t as tepidly one-dimensional as everyone around you insists it has to be, you take encouragement where you find it.
You might think that a kid who was an expert on werewolf trivia at age ten would have gone in for the wildest of space fantasies, but I didn’t. Star Trek always seemed hokey to me. (I figured out early on that Star Trek was a transparent pastiche of mid-1960s US foreign policy, with the Klingons as Russia, the Vulcans as Japan, the Romulans as Red China, and Captain Kirk as a wish-fulfillment fantasy version of Gen. William Westmoreland who always successfully pacified his extraterrestrial Vietnams.) Quite the contrary; my favorite spacecraft model kit, which hung from a length of thread in my bedroom for years, was called the Pilgrim Observer: some bright kit designer’s vision of one of the workhorse craft of solar system exploration in the late 20th century.
Dilithium crystals, warp drives, and similar improbabilities had no place in the Pilgrim Observer. Instead, it had big tanks for hydrogen fuel, a heavily shielded nuclear engine on a long boom aft, an engagingly clunky command module up front bristling with telescopes and dish antennas—well, here again, you get the picture; if you know your way around 1970s space nonfiction, you know the kit. It came with a little booklet outlining the Pilgrim I’s initial flyby missions to Mars and Venus, all of it entirely plausible by the standards of the time. That was what delighted me. Transporter beams and faster-than-light starflight, those were fantasy, but I expected to watch something not too far from Pilgrim I lifting off from Cape Canaveral within my lifetime.
That didn’t happen, and it’s not going to happen. That was a difficult realization for me to reach, back in the day, and it’s one a great many Americans are doing their level best to avoid right now. There are two solid reasons why the future in space so many of us thought we were going to get never arrived, and each one provides its own reasons for evasion. We’ve talked about both of them in this blog at various times, and there’s more than the obvious reason to review them now.
The first, simply put, is that the United States has lost the space race. Now of course it was less a single race than a whole track and field competition, with the first event, the satellite shot-put contest (winner: Russia, with Sputnik I), followed by the single-orbit dash (winner: Russia, with Vostok I) and a variety of longer sprints (winner: much more often than not, Russia). The run to the Moon was the first real US gold medal—we did half a dozen victory laps back out there just to celebrate—and we also scored big in the planetary probe toss competition, with a series of successful Mariner and Voyager missions that mostly showed us just how stunningly inhospitable the rest of the solar system was. The race that ultimately counted, though, was the marathon, and Russia’s won that one hands down; they’re still in space, and we aren’t.
Behind that unwelcome news is the great geopolitical fact of the early 21st century, the decline and imminent fall of the American empire. Like any number of empires before us, we’ve gotten ourselves wedged tightly into the predictable downside of hegemony—the stage at which the costs of maintaining the economic imbalances that channel wealth from empire to imperial state outstrip the flow of wealth those imbalances are meant to produce. Once that stage arrives, the replacement of the failing empire by some new distribution of power is a foregone conclusion; the only question is how long the process will take and how brutal the final cost to the imperial state will turn out to be.
The Cold War competition between the United States and the Soviet Union was a standard contest to see which empire would outlast the other. The irony, and it’s a rich one, is that the loser of that contest was pretty much guaranteed to be the winner in a broader sense. When the Soviet Union collapsed, Russia had an empire wrenched out of its hands, and as a result it was forced to give up the struggle to sustain the unsustainable. The United States kept its empire intact, and as a result it has continued that futile but obsessive fight, stripping its national economy to the bare walls in order to prop up a global military presence that will sooner or later bankrupt it completely. That’s why Russia still has a functioning space program, while the United States may have trouble finding the money to launch cheap fireworks by the time its empire finally slips from its fingers.
It’s our decidedly mixed luck, as discussed here more than once in the past, that America is entering on the downslope of its imperial decline just as a much vaster curve has peaked and begun to arc in the same direction. That’s the second reason that the space age is ending, not just for us but for humanity. In the final analysis, space travel was simply the furthest and most characteristic offshoot of industrial civilization, and depended—as all of industrial civilization depends—on vast quantities of cheap, highly concentrated, readily accessible energy. That basic condition is coming to an end around us right now. Petroleum has already reached its global production peak as depletion rates shoot past the rate at which new fields can be found and brought on line; natural gas and coal are not far behind—the current bubble in shale gas will be over in five or, just possibly, ten years—and despite decades of animated handwaving, no other energy source has proven to yield anything close to the same abundance and concentration of energy at anything like the same cost.
That means, as I’ve shown in detail in past posts here, that industrial civilization will be a short-lived and self-terminating phenomenon. It doesn’t mean, or at least doesn’t have to mean, that future civilizations will have to make do with an equivalent of the much simpler technological suites that civilizations used before the industrial age; I’ve argued at some length here and elsewhere that an ecotechnic society—a civilization that supports a relatively advanced technology on a modest scale using the diffuse and limited energy provided by sustainable sources, without wrecking the planet—is a live option, if not in the immediate future, then after the dark age the misguided choices of the recent past have prepared for us.
Still, of the thousands of potential technological projects that might appeal to the limited ambitions and even more strictly limited resources of some future ecotechnic society, space travel will rank very, very low. It’s possible that the thing will be done, perhaps in the same spirit that motivated China a little while back to carry out a couple of crisp, technically capable manned orbital flights; ten thousand years from now, putting a human being into orbit will still probably be the most unanswerable way for a civilization to announce that it’s arrived. There are also useful things to be gained by lofting satellites for communication and observation purposes, and it’s not at all impossible that now and then, over the centuries and millennia to come, the occasional satellite will pop up into orbit for a while, and more space junk will be added to the collection already in place.
That’s not the vision that fired a generation with enthusiasm for space, though. It’s not the dream that made Konstantin Tsiolkovsky envision Earth as humanity’s cradle, that set Robert Goddard launching rockets in a Massachusetts farmyard and hurled Yuri Gagarin into orbit aboard Vostok I. Of all people, it was historical theorist Oswald Spengler who characterized that dream most precisely, anatomizing the central metaphor of what he called Faustian civilization—yes, that’s us—as an eternal outward surge into an emptiness without limit. That was never a uniquely American vision, of course, though American culture fixated on it in predictable ways; a nation that grew up on the edge of vastness and cherished dreams of heading west and starting life over again was guaranteed to think of space, in the words of the Star Trek cliché, as "the final frontier." That it did indeed turn out to be our final frontier, the one from which we fell back at last in disarray and frustration, simply adds a mordant note to the tale.
It’s crucial to realize that the fact that a dream is entrancing and appeals to our core cultural prejudices is no guarantee that it will come true, or even that it can. There will no doubt be any number of attempts during the twilight years of American empire to convince Americans to fling some part of the energies and resources that remain to them into a misguided attempt to relive the dream and claim some supposed destiny among the stars. That’s not a useful choice at this stage of the game. Especially but not only in America, any response to the crisis of our time that doesn’t start by using much less in the way of energy and resources simply isn’t serious. The only viable way ahead for now, and for lifetimes to come, involves learning to live well within our ecological limits; it might also help if we were to get it through our heads that the Earth is not humanity’s cradle, or even its home, but rather the whole of which each of us, and our species, is an inextricable part.
That being said, it is far from inappropriate to honor the failed dream that will shortly be gathering dust in museums and rusting in the winds that blow over Cape Canaveral. Every civilization has some sprawling vision of the future that’s destined never to be fulfilled, and the dream of infinite expansion into space was ours. The fact that it didn’t happen, and arguably never could have happened, takes nothing away from the grandeur of its conception, the passion, genius, and hard work that went into its pursuit, or the sacrifices made on its behalf. Some future poet or composer, perhaps, will someday gather it all up in the language of verse or music, and offer a fitting elegy to the age of space.
Meanwhile, some 240,000 miles from the room where I write this, a spidery metallic shape lightly sprinkled with meteoritic dust sits alone in the lunar night on the airless sweep of Mare Tranquillitatis. On it is a plaque which reads WE CAME IN PEACE FOR ALL MANKIND. Even if no other human eyes ever read that plaque again, as seems likely, it’s a proud thing to have been able to say, and a proud thing to have done. I can only hope that the remembrance that our species once managed the thing offers some consolation during the bitter years ahead of us.
Wednesday, August 17, 2011
The Twilight of Meaning
This is not going to be an easy post to write, and I’m not at all sure it will be any easier to understand; I trust my readers will bear with me. I could begin it in any number of places, but the one that seems most important just now is the vestibule of the little public library six blocks away from my house. It’s a solid if unimaginative brick rectangle of Eighties vintage, one room not quite so full of books as it ought to be, another room in back for the librarians to work, a meeting space, restrooms, and a vestibule where books that are being discarded from the collection are shelved for sale.
That’s standard practice in most public libraries these days. If a book hasn’t been checked out for three years, or if it needs repairs and there isn’t a huge demand for it, it goes onto the sale shelf. Prices range from cheap to absurdly cheap; the sale doesn’t bring in a huge amount, but at a time of sparse and faltering budgets, every bit helps. The exception is children’s books, which aren’t for sale at all. They’re in a cart marked FREE, and if they don’t get taken in a month or so, they go into the trash, because there simply isn’t any demand for them. That was where, a few months ago, I spotted a copy of Kate Seredy’s 1938 Newbery Award winner The White Stag.
The vast majority of my readers will no doubt find the reference opaque. Still, back when I was a child—no, dinosaurs didn’t quite walk the earth back then, though it sometimes feels that way—winners of the Newbery Award, one of the two most prestigious US awards for children’s literature, still counted for quite a bit. Most libraries with a children’s collection of any size had the whole set, and most children’s librarians were enthusiastic about getting them into the hands of young readers. That’s not how I found The White Stag—I needed nobody’s encouragement to read, and Seredy’s compelling illustrations of galloping horsemen and magical beasts were well aimed to catch my eye—but find it I did, and that’s how medieval Hungarian legends about the coming of Attila the Hun wove their way permanently into the crawlspaces of my imagination.
So that was the book, one among dozens, that was awaiting its fate in the free cart at the South Cumberland Public Library. I already have a copy, and I decided to take the risk that somebody would find the one in the cart before it got tossed in the trash. As it happens, it was the right choice; the next week it was gone. I’ll never know whether some grandparent recognized it from his or her own childhood and took it as a gift, or whether some child caught sight of the cover, pulled it from the cart, and was caught by the magic of a tale that makes today’s canned children’s fantasies look like the pasty commercial product they are, but at least I can hope that it was something like that.
The White Stag was written the year my father was born. In my youth you could find books that old and much older, plenty of them, in small town public libraries all over the country. Nowadays, increasingly, you can’t. What you get instead are shelf upon shelf of whatever’s new, glossy, popular and uncontroversial, massaged into innocuousness by marketing specialists and oozing a fetid layer of movie, toy, and video game tie-ins from all orifices, all part of the feedback loop that endlessly recycles the clichés of current popular culture into minds that, in many cases, have never encountered anything else. In the process, the threads of our collective memory are coming silently apart.
I don’t think it’s going too far to describe the result as a kind of cultural senility. That concept certainly goes a long way to explain the blank and babbling incoherence with which America in particular stares vacantly at its onrushing fate. Without a sense of the past and its meaning, without narratives that weave the events of our daily lives into patterns that touch the principles that matter, we lack the essential raw materials of thought, and so our collective reasoning processes, such as they are, spit out the same rehashed nonsolutions over and over again.
It will doubtless be objected that we have the internet, and thus all the information we could possibly need. We do indeed have the internet, where sites discussing the current color of Lady Gaga’s pubic hair probably outnumber sites discussing Newbery Award books by a thousand to one. We have an effectively limitless supply of information, but then, it’s not information that I got from reading The White Stag at age eight, and it’s not a lack of information that’s dragging us down to a sorry end.
The problem—for it is a problem, and thus at least in theory capable of solution, rather than a predicament, which simply has to be put up with—is the collapse of the framework of collective meanings that gives individual facts their relevance. That framework of meanings consists, in our culture and every other, of shared narratives inherited from the past that form the armature on which our minds place data as it comes in.
A couple of years ago, in a discussion on this blog that touched on this same point, I made the mistake of referring to those narratives by their proper name, which is myth. Those of you who know how Americans think know exactly what happened next: plenty of readers flatly insisted on taking the word in its debased modern sense of “a story that isn’t true,” and insisted in tones ranging from bafflement to injured pride that they didn’t believe in any myths, and what was I talking about?
The myths you really believe in, of course, are the ones you don’t notice that you believe. The myth of progress is still like that for most people. Even those who insist that they no longer believe in progress very often claim that we can have a better world for everybody if we do whatever they think we ought to do. In the same way, quite a few of the people who claim that they’ve renounced religion and all its works still believe, as devoutly as any other fundamentalist, that it’s essential to save everybody else in the world from false beliefs; the central myth of evangelical religion, which centers on salvation through having the right opinions, remains welded into place even among those who most angrily reject the original religious context of that myth.
But there’s a further dimension to the dynamics of—well, let’s just call them cultural narratives, shall we?—unfolding in America today. When the shared narratives from the past break apart, and all you’ve got is popular culture spinning feedback loops in the void, what happens then?
What happens is the incoherence that’s become a massive political fact in America today. That incoherence takes at least three forms. The first is the rise of subcultures that can’t communicate with one another at all. We had a display of that not long ago in the clash over raising the debt ceiling. To judge by the more thoughtful comments in the blogosphere, I was far from the only person who noticed that the two sides were talking straight past each other. It wasn’t simply that the two sides had differing ideas about government finance, though of course that’s also true; it’s that there’s no longer any shared narrative about government that’s held in common between the two sides. The common context is gone; it’s hard to think of a single political concept that has the same connotations and meanings to a New England liberal that it has to an Oklahoma conservative.
It’s crucial to recognize, though, that these subcultures are themselves riddled with the same sort of incoherence that pervades society as a whole; this is the second form of incoherence I want to address. I wonder how many of the devout Christians who back the Republican Party, for example, realize that the current GOP approach to social welfare issues is identical to the one presented by Anton Szandor LaVey in The Satanic Bible. (Check it out sometime; the parallels are remarkable.) It may seem odd that believers in a faith whose founder told his followers to give all they had to the poor now by and large support a party that’s telling America to give all it has to the rich, but that’s what you get when a culture’s central narratives dissolve; of course it’s also been my experience that most people who claim they believe in the Bible have never actually read more than a verse here and there.
Mind you, the Democratic Party is no more coherent than the GOP. Since the ascendancy of Reagan, the basic Democratic strategy has been to mouth whatever slogans you think will get you elected and then, if you do land in the White House, chuck the slogans, copy the policies of the last successful Republican president, and hope for the best. Clinton did that with some success, copying to the letter Reagan’s borrow-and-spend policies at home and mostly toothless bluster abroad; of course he had the luck to ride a monstrous speculative bubble through his two terms, and then hand it over to the GOP right as it started to pop. Obama, in turn, has copied the younger Bush’s foreign and domestic policies with equal assiduity but less success; partly that’s because the two Middle Eastern wars he’s pursued with such enthusiasm were a bad idea from the start, and partly because his attempts to repeat Bush’s trick of countering the collapse of one speculative bubble by inflating another haven’t worked so far.
I’ve discussed more than once before in these posts the multiple ironies of living at a time when the liberals have forgotten how to liberate and the conservatives have never learned how to conserve. Still, there’s a third dimension to the incoherence of contemporary America, and it appears most clearly in the behavior of people whose actions are quite literally cutting their own throats. The kleptocratic frenzy under way at the top of the economic pyramid is the best example I can think of.
Back in the 1930s, a substantial majority of the American rich realized that the only way to stop the rising spiral of depressions that threatened to end here, as in much of Europe, in fascist takeovers was to allow a much larger share of the national wealth to go to the working classes. They were quite correct, because it’s wages rather than investments that are the real drivers of economic prosperity. The logic here is as simple as it is irrefutable. When people below the rentier class have money, by and large, they spend it, and those expenditures provide income for businesses. Rising wages thus drive rising business income, increased employment, and all the other factors that make for prosperity.
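To put rough numbers on that chain of respending (the figures here are illustrative assumptions, not measurements): if working people spend ninety cents of each additional dollar that comes their way, then one new dollar of wages supports roughly

    1 + 0.9 + 0.9² + 0.9³ + … = 1/(1 − 0.9) = 10

dollars of total spending as it passes from hand to hand. How large that multiplier turns out to be depends entirely on how much of each dollar gets spent rather than parked.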
On the other hand, when more money shifts to the rentier class (the people who live on investments), a smaller fraction goes to consumer expenditures, and the higher up the ladder you go, the smaller the fraction becomes. Close to the summit, nearly all income gets turned into investments of a more or less speculative nature, which take it out of the productive economy altogether. (How many people are employed to manufacture a derivative?) This recognition was the basis for the American compromise of the 1930s, a compromise brokered by the very rich Franklin Roosevelt and backed by a solid majority of financiers and industrialists at the time, who recognized that pursuing their own short-term profit at the expense of economic prosperity and national survival was not exactly a bright idea.
Yet this not very bright idea is now standard practice across the board on the upper end of the American economy. The absurd bonuses “earned” by bankers in recent years are only the most visible end of a pervasive culture of executive profiteering, aided and abetted by both parties and shrugged off by boards of directors who have by and large misplaced their fiduciary duty to the stockholders. This and other equally bad habits have drawn a pre-1930s share of the national wealth to the upper end of the economic spectrum, and accordingly produced a classic pre-1930s sequence of bubbles and crashes.
None of this takes rocket science to understand; nor does it demand exceptional thinking capacity to realize that pushed too far, a set of habits that prioritizes short-term personal profits over the survival of the system that makes those profits possible could very well leave top executives dangling from lampposts—or, as was the case in the very late 19th and early 20th centuries, so common a target for homegrown terrorists that people throwing bombs through the windows of magnates’ cars was a theme for music-hall ditties. What it takes, rather, is the sense of context that comes from shared narratives inherited from the past—in this case, the recognition that today’s economic problems stem from the very policies that caused the same problems most of a century ago would probably be enough.
Still, that recognition—more broadly, the awareness that the lessons of the past have something to teach the present—requires a kind of awareness that’s become very uncommon in America these days, and I’ve come to think that the main culprit at all levels of society is precisely the feedback loop mentioned earlier, the transformation of culture into marketing that exists for no other purpose than to sell more copies of itself. The replacement of The White Stag and its peers with the Care Bears and theirs is only one small part of that transformation, though it’s a telling one. There’s no tragedy in the Care Bears universe, no history, and no change, just a series of interchangeable episodes in which one-dimensional figures lurch mechanically through their routines and end exactly where they started, just in time for the closing flurry of ads.
The popular culture on offer to adults is by and large more complex, but no less subject to the same manufacturing pressures. (The public library in Seattle, to my horror, once put up splashy ads asking, “What if everyone in Seattle read the same book?” Why, then we’d have even more of a mental monoculture than we’ve got already.) There the interchangeable unit is less often the episode than the movie, the novel, or the series. Whether the protagonist finds true love, catches the murderer, gets bitten by the vampire, saves the world from destruction, or whatever other generic gimmick drives the plot, you know perfectly well that when you finish this one there are hundreds more just like it ready to go through the same mechanical motions. Their sole originality is the effort to ring as many changes on a standard formula as possible—hey, let’s do another pirate zombie romantic mystery, but this time with Jane Austen! The result is like taking a loaf of Wonder Bread and spreading something different on every slice, starting with Marmite and ending with motor oil; there are plenty of surface variations, but underneath it’s always the same bland paste.
Business executives, you may be interested to know, read very little other than mystery novels and pop business books. I don’t know that anybody’s done a survey on what politicians read, but I doubt it’s anything more edifying. It’s really a closed loop; from the top to the bottom of the social pyramid, one or another form of mass-market popular culture makes up most of the mental input of Americans, and I trust most of my readers know the meaning of the acronym GIGO. Then we look baffled when things don’t work out, because we don’t know how to deal with tragedy or history or change, and trying to impose some form of Care Bear logic on the real world simply doesn’t work.
I mentioned earlier that this is a problem, not a predicament, and that it therefore has a solution. As it happens, I have no reason to think that more than a handful of people will be willing to embrace the solution, but it’s still worth mentioning for their sake, and for another reason I’ll get to in a bit. The solution? It’s got two steps, which are as follows.
1. Pull the plug on current popular culture in your own life. Cutting back a little doesn’t count, and no, you don’t get any points for feeling guilty about wallowing in the muck. Face it, your television will do you more good at the bottom of a dumpster than it will sitting in your living room, and the latest pirate zombie romantic mystery, with or without Jane Austen, is better off gathering cobwebs in a warehouse; you don’t need any of it, and it may well be wrecking your capacity to think clearly.
2. Replace it with something worth reading, watching, hearing, or doing. You may well have your own ideas about what goes in this category, but in case you don’t, I have a suggestion: go looking among things that are older than you are.
Yes, I’m quite serious, and for more than one reason. First, one of the advantages of time is that the most forgettable things get forgotten; there was a huge amount of vapid popular culture in the 19th century, for example, but only the most erudite specialists know much about it now. Your chances of finding something worth reading or watching or hearing or doing go up as time has more of a chance to run its filter on the results. Second, even if what you find is pablum, it’s the pablum of a different time, and will clash with mental habits tuned to the pablum of this time, with useful results. When the visual conventions of a Humphrey Bogart movie strike you as staged and hokey, stop and ask yourself how current popular culture will look fifty years from now—if anybody’s looking at it at all, that is.
That, of course, is the third reason, the one I hinted at a few paragraphs back: current popular culture, like so much else of contemporary American society, is almost uniquely vulnerable to the multiple impacts of an industrial civilization in decline. Fifty years from now, the way things are going just now, the chances that anybody will be able to watch a Care Bears video are pretty close to nil; most of today’s media don’t age well, and all of them depend directly and indirectly on energy inputs that our society can scarcely maintain now and almost certainly won’t be able to maintain for most Americans for more than a decade or two longer. Beyond that, you’re going to need something more durable, and a great deal of what was in circulation before the era of mass culture will still be viable after that era is over once and for all.
There’s more to it, too, but to get there we’re going to have to take a detour through a conversation that almost nobody in America wants to have just now. We’ll get into that next week.
Wednesday, August 10, 2011
Salvaging Health
The old chestnut about living in interesting times may not actually be a Chinese curse, as today’s urban folklore claims, but it certainly comes to mind when glancing back over the smoldering wreckage of the past week. In the wake of a political crisis here in America that left both sides looking more than ever like cranky six-year-olds, a long-overdue downgrade of America’s unpayable debt, and yet another round of fiscal crisis in the Eurozone, stock and commodity markets around the globe roared into a power dive from which, as I write this, they show no sign of recovering any time soon.
In England, meanwhile, one of those incidents Americans learned to dread in the long hot summers of the Sixties—a traffic stop in a poor minority neighborhood, a black man shot dead by police under dubious circumstances—has triggered four nights of looting and rioting, as mobs in London and elsewhere organized via text messages and social media, brushed aside an ineffectual police presence, plundered shops and torched police stations, and ripped gaping holes in their nation’s already shredding social fabric. It seems that “Tottenham” is how the English pronounce “Watts,” except that the fire this time is being spread rather more efficiently with the aid of Blackberries and flashmobs.
Government officials denounced the riots as “mindless thuggery,” but it’s considerably more than that. As one looter cited in the media said, “this is my banker’s bonus”—the response of the bottom of the social pyramid, that is, to a culture of nearly limitless corruption further up. It bears remembering that the risings earlier this year in Tunisia, Egypt, and elsewhere began with exactly this sort of inchoate explosion of rage against governments that responded to economic crisis by tightening the screws on the poor; it was only when the riots showed the weakness of the existing order that more organized and ambitious movements took shape amid the chaos. It’s thus not outside the bounds of possibility, if the British government keeps on managing the situation as hamhandedly as it’s done so far, that the much-ballyhooed Arab Spring may be followed by an English Summer—and just possibly thereafter by a European Autumn.
One way or another, this is what history looks like as it’s happening. Those of my readers who have been following along for a year or two, and have made at least a decent fraction of the preparations I’ve suggested, are probably as well prepared for the unfolding mess as anyone is likely to be. Those who have just joined the conversation, or were putting aside preparations for some later date—well, once the rubble stops bouncing and the smoke clears, you’ll have the chance to assess what possibilities are still open and what you have the resources to accomplish. In the meantime, I want to continue the sequence of posts already under way, and discuss another of the things that’s going to have to be salvaged as the current system grinds awkwardly to a halt.
The theme of this week’s discussion, I’m sorry to say, is another issue split down the middle by the nearly Gnostic dualisms that bedevil contemporary American society. Just as Democrats and Republicans denounce each other in incandescent fury, and fundamentalist atheists compete with fundamentalist Christians in some sort of Olympics of ideological intolerance, the issues surrounding health care in America these days have morphed unhelpfully into a bitter opposition between the partisans of mainstream medicine and the proponents of alternative healing. The radicals on both sides dismiss the other side as a bunch of murderous quacks, while even those with more moderate views tend to regard the other end of the spectrum through a haze of suspicion tinged with bad experiences and limited knowledge.
I stay out of such debates as often as I can, but this one hasn’t given me that choice. Ironically, that’s because I’ve experienced both sides of the issue. On the one hand, I’m alive today because of modern medicine. At the age of seven, I came down with a serious case of scarlet fever. That’s a disease that used to kill children quite regularly, and in a premodern setting, it almost certainly would have killed me. As it was, I spent two weeks flat on my back, and pulled through mostly because of horse doctor’s doses of penicillin, administered first with syringes that to my seven-year-old eyes looked better suited for young elephants, and thereafter in oral form, made palatable with an imitation banana flavoring I can still call instantly to mind.
Then there’s the other side of the balance. My wife has lifelong birth defects in her legs and feet, because her mother’s obstetrician prescribed a drug that was contraindicated for pregnant women because it causes abnormalities in fetal limb development. My only child died at birth because my wife’s obstetrician did exactly the same thing, this time with a drug that was well known to cause fatal lung abnormalities. Several years later we found out by way of a media exposé that the latter doctor had done the same thing to quite a few other women, leaving a string of dead babies in his wake. The response of the medical board, once the media exposure forced them to do something, was quite standard; they administered a mild reprimand. If this reminds you of the Vatican’s handling of pedophile priests, well, let’s just say the comparison has occurred to me as well.
Deaths directly caused by American health care are appallingly common. A widely cited 2000 study by public health specialist Dr. Barbara Starfield presented evidence that bad medical care kills more Americans every year than anything but heart disease and cancer, with adverse drug effects and nosocomial (hospital- and clinic-spread) infections the most common culprits. A more comprehensive study prepared outside the medical mainstream, but based entirely on data from peer-reviewed medical journals, argued that the actual rate was much higher—higher, in fact, than any other single cause. That’s part of what makes the controversies over American health care so challenging; mainstream medical care saves a lot of lives in America, but because of the pressures of the profit motive, and the extent to which institutional barriers protect incompetent practitioners and dangerous and ineffective remedies, it also costs a lot of lives.
Even so, if I could find a competent, affordable general practitioner to give me annual checkups and help me deal with the ordinary health issues middle-aged men tend to encounter, I’d be happy to do so. The catch here is that little word “affordable.” Along with those birth defects, my wife has celiac disease, a couple of food allergies, and a family history with some chronic health problems in it; my own family history is by no means squeaky clean either; and since we’re both self-employed, health insurance would cost us substantially more than our mortgage. That’s money we simply don’t have. Like a large and growing fraction of Americans, therefore, we’ve turned to alternative medicine for our health care.
The more dogmatic end of the mainstream medical industry tends to dismiss all alternative healing methods as ineffective by definition. That’s self-serving nonsense; the core alternative healing modalities, after all, are precisely the methods of health care that were known and practiced in the late 19th century, before today’s chemical and surgical medicine came on the scene, and they embody decades or centuries of careful study of health and illness. There are things that alternative health methods can’t treat as effectively as the current mainstream, of course, but the reverse is also true.
Still, behind the rhetoric of the medical industry lies a fact worth noting: alternative medical methods are almost all much less intensive than today’s chemical and surgical medicine. The best way to grasp the difference is to compare it to other differences between life in the late 19th century and life today—say, the difference between walking and driving a car. Like alternative medicine, walking is much slower, it requires more personal effort, and there are destinations that, realistically speaking, are out of its reach; on the other hand, it has fewer negative side effects, costs a lot less, and dramatically cuts your risk of ending up buttered across the grill of a semi because somebody else made a mistake.
Those differences mean that you can’t use alternative medicine the way you use the mainstream kind. If I neglect a winter cold, for example, I tend to end up with bacterial bronchitis. A physician nowadays can treat that with a simple prescription of antibiotics, and unless the bacterium happens to be resistant—an issue I’ll be discussing in more detail in a bit—that’s all there is to it. If you’re using herbs, on the other hand, handling bacterial bronchitis is a more complex matter. There are very effective herbal treatments, and if you know them, you know exactly what you’re getting and what the effects will be. On the other hand, you can’t simply pop a pill and go on with your day; you have to combine the herbal infusions with rest and steam inhalation, and pay attention to your symptoms so you can treat for fever or other complications if they arise. You very quickly learn, also, that if you don’t want the bronchitis at all, you can’t simply ignore the first signs of an oncoming cold; you have to notice it and treat it.
Here’s another example. I practice t’ai chi, and one of the reasons is that it’s been documented via controlled studies to be effective preventive medicine for many of the chronic health problems Americans tend to get as they get old. You can treat those same problems with drugs, to be sure, if you’re willing to risk the side effects, but again, you can’t just pop a t’ai chi pill and plop yourself back down on the sofa. You’ve got to put in at least fifteen minutes of practice a day, every day, to get any serious health benefits out of it. (I do more like forty-five minutes a day, but then I’m not just practicing it for health.) It takes time and effort, and if you’ve spent a lifetime damaging your health and turn to t’ai chi when you’re already seriously ill, it’s unlikely to do the trick.
All these points are relevant to the core project of this blog, in turn, because there’s another difference between alternative health care and the medical mainstream. The core alternative modalities were all developed before the age of cheap abundant fossil fuel energy, and require very little in the way of energy and raw material inputs. Conventional chemical and surgical medicine is another thing entirely. It’s wholly a creation of the age of petroleum; without modern transport and communications networks, gargantuan supply chains for everything from bandages through exotic pharmaceuticals to spare parts for lab equipment, a robust electrical supply, and many other products derived from or powered by cheap fossil fuels, the modern American medical system would grind to a halt.
In the age of peak oil, that level of dependency is not a survival trait, and it’s made worse by two other trends. The first, mentioned earlier in this post, is the accelerating spread of antibiotic resistance in microbes. The penicillin that saved my life in 1969 almost certainly wouldn’t cure a case of scarlet fever today; decades of antibiotic overuse created a textbook case of evolution in action, putting ferocious selection pressure on microbes in the direction of resistance. The resulting chemical arms race is one that the microbes are winning, as efforts by the pharmaceutical industry to find new antibiotics faster than microbes can adapt to them fall further and further behind. Epidemiologists are seriously discussing the possibility that within a few decades, mortality rates from bacterial diseases may return to 19th-century levels, when they were the leading cause of death.
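For readers who want to see how quickly that kind of selection pressure does its work, here is a toy calculation of my own; the starting share and the kill rates are invented for the sake of the example, not drawn from any real microbe or drug, but the shape of the result is the point.

    # Toy model of selection for antibiotic resistance (illustrative numbers only).
    resistant = 0.0001      # assumed initial share of resistant bacteria
    susceptible = 0.9999    # the rest start out susceptible

    for course in range(1, 11):      # ten courses of the same antibiotic
        susceptible *= 0.05          # assume the drug kills 95% of susceptible cells
        resistant *= 0.80            # ...but only 20% of the resistant ones
        total = resistant + susceptible
        resistant /= total           # the survivors regrow to fill the niche,
        susceptible /= total         # so only the relative shares carry forward
        print(course, round(resistant, 4))

Run it and the resistant share climbs from one in ten thousand to nearly the whole population within half a dozen courses; no new mutation is needed, only the relentless removal of the competition.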
The second trend is economic. The United States has built an extraordinarily costly and elaborate health care system, far and away the most expensive in the world, on the twin pillars of government subsidies and employer-paid health benefits. As we lurch further into what Paul Kennedy called “imperial overstretch”—the terminal phase of hegemony, when the costs of empire outweigh the benefits but the hegemonic power can’t or won’t draw back from its foreign entanglements—the government subsidies are going away, while health benefits on the job are being gutted by rising unemployment rates and the frantic efforts of the nation’s rentier class to maintain its standards of living at the expense of the middle classes and the poor.
Requiring people who can’t afford health insurance at today’s exorbitant rates to pay for it anyway under penalty of law—the centerpiece of Obama’s health care “reform”—was a desperation move in this latter struggle, and one that risks a prodigious political backlash. If Obama’s legislation takes effect as written in 2014, and millions of struggling American families find themselves forced to choose between paying a couple of thousand a month or more for health insurance they can’t afford and paying heavy fines they can’t afford either, it’s probably a safe bet that the US will elect a Tea Party president in 2016 and repeal that—along with much else. Whether that happens or not, it’s clear at this point that the United States can no longer afford the extraordinarily costly health care system it’s got, and the only question is what will replace it.
In the best of all possible worlds, the existing medical system would come to terms with the bleak limits closing in around it, and begin building a framework that could provide basic health care at a reasonable price to the poor and working classes. It actually wouldn’t be that difficult, but it would require the medical industry to remove at least some of the barriers that restrict medical practice to a small number of very highly paid professionals, and to accept significant declines in quarterly profits, doctors’ salaries, and the like. Maybe that could happen, but so far there doesn’t seem to be any sign of a movement in that direction. Instead, health care costs continue to rise as the economy stalls, moving us deeper into a situation where elaborate and expensive health care is available to a steadily narrowing circle of the well-to-do, while everyone outside the circle has to make do with what they can afford—which, more and more often, amounts to the 19th-century medicine provided by alternative health care.
Thus I’m not especially worried about the survival of alternative healing. Despite the fulminations of authority figures and the occasional FDA witch hunt, the alternative healing scene is alive and well, and its reliance on medicines and techniques that were viable before the age of cheap abundant fossil fuels means that it will be well equipped to deal with conditions after cheap energy of any kind is a thing of the past. No, what concerns me is the legacy of today’s mainstream medicine—the medicine that saved my life at age seven, and continues, despite its difficulties and dysfunctions, to heal cases that the best doctors in the world a century and a quarter ago had to give up as hopeless.
Even if a movement of the sort I’ve suggested above were to take place, a great deal of that would be lost or, at best, filed away for better times. The most advanced medical procedures at present require inputs that a deindustrial society simply isn’t going to be able to provide. Still, there’s quite a bit that could be saved, if those who have access to the techniques in question were to grasp the necessity of saving them. As it stands, the only people who can salvage those things are the physicians who are legally authorized to use them; the rest of us can at best get a working grasp of sanitation and sterile procedure, the sort of wilderness-centered first aid training that assumes that a paramedic won’t be there in ten minutes, and the sort of home nursing skills that the Red Cross used to teach in the 1950s and 1960s—you can still find the Red Cross Home Nursing Manual in the used book market, and it’s well worth getting a copy and studying it.
Other than that, it’s up to the physicians and the various institutions they staff and advise. If they step up to the plate, the deindustrial future will have the raw materials from which to evolve ways of healing that combine the best of mainstream and alternative methods. If they don’t, well, maybe enough written material will survive to enable the healers of the future to laboriously rediscover and reinvent some of today’s medical knowledge a few centuries down the road. While the decision is being made, those of us who don’t have a voice in it have our own decisions to make: if we have the money and are willing to accept one set of risks, to make use of today’s chemical and surgical medicine while it’s still around; if we have the interest and are willing to accept another set of risks, to make use of one or more methods of alternative medicine; or if neither option seems workable or desirable, to come to terms with a reality that all of us are eventually going to have to accept anyway, which is that life and health are fragile transitory things, and that despite drugs and surgeries on the one hand, or herbs and healing practices on the other, the guy with the scythe is going to settle the matter sooner or later with the one answer every human being gets at last.
In England, meanwhile, one of those incidents Americans learned to dread in the long hot summers of the Sixties—a traffic stop in a poor minority neighborhood, a black man shot dead by police under dubious circumstances—has triggered four nights of looting and rioting, as mobs in London and elsewhere organized via text messages and social media, brushed aside an ineffectual police presence, plundered shops and torched police stations, and ripped gaping holes in their nation’s already shredding social fabric. It seems that “Tottenham” is how the English pronounce “Watts,” except that the fire this time is being spread rather more efficiently with the aid of Blackberries and flashmobs.
Government officials denounced the riots as “mindless thuggery,” but it’s considerably more than that. As one looter cited in the media said, “this is my banker’s bonus”—the response of the bottom of the social pyramid, that is, to a culture of nearly limitless corruption further up. It bears remembering that the risings earlier this year in Tunisia, Egypt, and elsewhere began with exactly this sort of inchoate explosion of rage against governments that responded to economic crisis by tightening the screws on the poor; it was only when the riots showed the weakness of the existing order that more organized and ambitious movements took shape amid the chaos. It’s thus not outside the bounds of possibility, if the British government keeps on managing the situation as hamhandedly as it’s done so far, that the much-ballyhooed Arab Spring may be followed by an English Summer—and just possibly thereafter by a European Autumn.
One way or another, this is what history looks like as it’s happening. Those of my readers who have been following along for a year or two, and have made at least a decent fraction of the preparations I’ve suggested, are probably as well prepared for the unfolding mess as anyone is likely to be. Those who have just joined the conversation, or were putting aside preparations for some later date—well, once the rubble stops bouncing and the smoke clears, you’ll have the chance to assess what possibilities are still open and what you have the resources to accomplish. In the meantime, I want to continue the sequence of posts already under way, and discuss another of the things that’s going to have to be salvaged as the current system grinds awkwardly to a halt.
The theme of this week’s discussion, I’m sorry to say, is another issue split down the middle by the nearly Gnostic dualisms that bedevil contemporary American society. Just as Democrats and Republicans denounce each other in incandescent fury, and fundamentalist atheists compete with fundamentalist Christians in some sort of Olympics of ideological intolerance, the issues surrounding health care in America these days have morphed unhelpfully into a bitter opposition between the partisans of mainstream medicine and the proponents of alternative healing. The radicals on both sides dismiss the other side as a bunch of murderous quacks, while even those with more moderate views tend to regard the other end of the spectrum through a haze of suspicion tinged with bad experiences and limited knowledge.
I stay out of such debates as often as I can, but this one hasn’t given me that choice. Ironically, that’s because I’ve experienced both sides of the issue. On the one hand, I’m alive today because of modern medicine. At the age of seven, I came down with a serious case of scarlet fever. That’s a disease that used to kill children quite regularly, and in a premodern setting, it almost certainly would have killed me. As it was, I spent two weeks flat on my back, and pulled through mostly because of horse doctor’s doses of penicillin, administered first with syringes that to my seven-year-old eyes looked better suited for young elephants, and thereafter in oral form, made palatable with an imitation banana flavoring I can still call instantly to mind.
Then there’s the other side of the balance. My wife has lifelong birth defects in her legs and feet, because her mother’s obstetrician prescribed a drug that was contraindicated for pregnant women because it causes abnormalities in fetal limb development. My only child died at birth because my wife’s obstetrician did exactly the same thing, this time with a drug that was well known to cause fatal lung abnormalities. Several years later we found out by way of a media exposé that the latter doctor had done the same thing to quite a few other women, leaving a string of dead babies in his wake. The response of the medical board, once the media exposure forced them to do something, was quite standard; they administered a mild reprimand. If this reminds you of the Vatican’s handling of pedophile priests, well, let’s just say the comparison has occurred to me as well.
Deaths directly caused by American health care are appallingly common. A widely cited 2000 study by public health specialist Dr. Barbara Starwood presented evidence that bad medical care kills more Americans every year than anything but heart disease and cancer, with adverse drug effects and nosocomial (hospital- and clinic-spread) infections the most common culprits. A more comprehensive study prepared outside the medical mainstream, but based entirely on data from peer-reviewed medical journals, argued that the actual rate was much higher—higher, in fact, than any other single cause. That’s part of what makes the controversies over American health care so challenging; mainstream medical care saves a lot of lives in America, but because of the pressures of the profit motive, and the extent to which institutional barriers protect incompetent practitioners and dangerous and ineffective remedies, it also costs a lot of lives as well.
Even so, if I could find a competent, affordable general practitioner to give me annual checkups and help me deal with the ordinary health issues middle-aged men tend to encounter, I’d be happy to do so. The catch here is that little word "affordable." Along with those birth defects, my wife has celiac disease, a couple of food allergies, and a family history with some chronic health problems in it; for that matter, my family history is by no means squeaky clean; we’re both self-employed, and so health insurance would cost us substantially more than our mortgage. That’s money we simply don’t have. Like a large and growing fraction of Americans, therefore, we’ve turned to alternative medicine for our health care.
The more dogmatic end of the mainstream medical industry tends to dismiss all alternative healing methods as ineffective by definition. That’s self-serving nonsense; the core alternative healing modalities, after all, are precisely the methods of health care that were known and practiced in the late 19th century, before today’s chemical and surgical medicine came on the scene, and they embody decades or centuries of careful study of health and illness. There are things that alternative health methods can’t treat as effectively as the current mainstream, of course, but the reverse is also true.
Still, behind the rhetoric of the medical industry lies a fact worth noting: alternative medical methods are almost all much less intensive than today’s chemical and surgical medicine. The best way to grasp the difference is to compare it to other differences between life in the late 19th century and life today—say, the difference between walking and driving a car. Like alternative medicine, walking is much slower, it requires more personal effort, and there are destinations that, realistically speaking, are out of its reach; on the other hand, it has fewer negative side effects, costs a lot less, and dramatically cuts your risk of ending up buttered across the grill of a semi because somebody else made a mistake.
Those differences mean that you can’t use alternative medicine the way you use the mainstream kind. If I neglect a winter cold, for example, I tend to end up with bacterial bronchitis. A physician nowadays can treat that with a simple prescription of antibiotics, and unless the bacterium happens to be resistant—an issue I’ll be discussing in more detail in a bit—that’s all there is to it. If you’re using herbs, on the other hand, handling bacterial bronchitis is a more complex matter. There are very effective herbal treatments, and if you know them, you know exactly what you’re getting and what the effects will be. On the other hand, you can’t simply pop a pill and go on with your day; you have to combine the herbal infusions with rest and steam inhalation, and pay attention to your symptoms so you can treat for fever or other complications if they arise. You very quickly learn, also, that if you don’t want the bronchitis at all, you can’t simply ignore the first signs of an oncoming cold; you have to notice it and treat it.
Here’s another example. I practice t’ai chi, and one of the reasons is that it’s been documented via controlled studies to be effective preventive medicine for many of the chronic health problems Americans tend to get as they get old. You can treat those same problems with drugs, to be sure, if you’re willing to risk the side effects, but again, you can’t just pop a t’ai chi pill and plop yourself back down on the sofa. You’ve got to put in at least fifteen minutes of practice a day, every day, to get any serious health benefits out of it. (I do more like forty-five minutes a day, but then I’m not just practicing it for health.) It takes time and effort, and if you’ve spent a lifetime damaging your health and turn to t’ai chi when you’re already seriously ill, it’s unlikely to do the trick.
All these points are relevant to the core project of this blog, in turn, because there’s another difference between alternative health care and the medical mainstream. The core alternative modalities were all developed before the age of cheap abundant fossil fuel energy, and require very little in the way of energy and raw material inputs. Conventional chemical and surgical medicine is another thing entirely. It’s wholly a creation of the age of petroleum; without modern transport and communications networks, gargantuan supply chains for everything from bandages through exotic pharmaceuticals to spare parts for lab equipment, a robust electrical supply, and many other products derived from or powered by cheap fossil fuels, the modern American medical system would grind to a halt.
In the age of peak oil, that level of dependency is not a survival trait, and it’s made worse by two other trends. The first, mentioned earlier in this post, is the accelerating spread of antibiotic resistance in microbes. The penicillin that saved my life in 1969 almost certainly wouldn’t cure a case of scarlet fever today; decades of antibiotic overuse created a textbook case of evolution in action, putting ferocious selection pressure on microbes in the direction of resistance. The resulting chemical arms race is one that the microbes are winning, as efforts by the pharmaceutical industry to find new antibiotics faster than microbes can adapt to them fall further and further behind. Epidemiologists are seriously discussing the possibility that within a few decades, mortality rates from bacterial diseases may return to 19th-century levels, when they were the leading cause of death.
The second trend is economic. The United States has built an extraordinarily costly and elaborate health care system, far and away the most expensive in the world, on the twin pillars of government subsidies and employer-paid health benefits. As we lurch further into what Paul Kennedy called "imperial overstretch"—the terminal phase of hegemony, when the costs of empire outweigh the benefits but the hegemonic power can’t or won’t draw back from its foreign entanglements—the government subsidies are going away, while health benefits on the job are being gutted by rising unemployment rates and the frantic efforts of the nation’s rentier class to maintain its standards of living at the expense of the middle classes and the poor.
Requiring people who can’t afford health insurance at today’s exorbitant rates to pay for it anyway under penalty of law—the centerpiece of Obama’s health care "reform"—was a desperation move in this latter struggle, and one that risks a prodigious political backlash. If Obama’s legislation takes effect as written in 2014, and millions of struggling American families find themselves facing a choice between paying a couple of thousand a month or more for health insurance they can’t afford, or paying heavy fines they can’t afford either, it’s probably a safe bet that the US will elect a Tea Party president in 2016 and repeal that—along with much else. Whether that happens or not, it’s clear at this point that the United States can no longer afford the extraordinarily costly health care system it’s got; the only remaining question is what will replace it.
In the best of all possible worlds, the existing medical system would come to terms with the bleak limits closing in around it, and begin building a framework that could provide basic health care at a reasonable price to the poor and working classes. It actually wouldn’t be that difficult, but it would require the medical industry to remove at least some of the barriers that restrict medical practice to a small number of very highly paid professionals, and to accept significant declines in quarterly profits, doctors’ salaries, and the like. Maybe that could happen, but so far there doesn’t seem to be any sign of a movement in that direction. Instead, health care costs continue to rise as the economy stalls, moving us deeper into a situation where elaborate and expensive health care is available to a steadily narrowing circle of the well-to-do, while everyone outside the circle has to make do with what they can afford—which, more and more often, amounts to the 19th-century medicine provided by alternative health care.
Thus I’m not especially worried about the survival of alternative healing. Despite the fulminations of authority figures and the occasional FDA witch hunt, the alternative healing scene is alive and well, and its reliance on medicines and techniques that were viable before the age of cheap abundant fossil fuels means that it will be well equipped to deal with conditions after cheap energy of any kind is a thing of the past. No, what concerns me is the legacy of today’s mainstream medicine—the medicine that saved my life at age seven, and continues, despite its difficulties and dysfunctions, to heal cases that the best doctors in the world a century and a quarter ago had to give up as hopeless.
Even if a movement of the sort I’ve suggested above were to take place, a great deal of that would be lost or, at best, filed away for better times. The most advanced medical procedures at present require inputs that a deindustrial society simply isn’t going to be able to provide. Still, there’s quite a bit that could be saved, if those who have access to the techniques in question were to grasp the necessity of saving them. As it stands, the only people who can salvage those things are the physicians who are legally authorized to use them; the rest of us can at best get a working grasp of sanitation and sterile procedure, the sort of wilderness-centered first aid training that assumes that a paramedic won’t be there in ten minutes, and the sort of home nursing skills that the Red Cross used to teach in the 1950s and 1960s—you can still find the Red Cross Home Nursing Manual in the used book market, and it’s well worth getting a copy and studying it.
Other than that, it’s up to the physicians and the various institutions they staff and advise. If they step up to the plate, the deindustrial future will have the raw materials from which to evolve ways of healing that combine the best of mainstream and alternative methods. If they don’t, well, maybe enough written material will survive to enable the healers of the future to laboriously rediscover and reinvent some of today’s medical knowledge a few centuries down the road. While the decision is being made, those of us who don’t have a voice in it have our own decisions to make: if we have the money and are willing to accept one set of risks, to make use of today’s chemical and surgical medicine while it’s still around; if we have the interest and are willing to accept another set of risks, to make use of one or more methods of alternative medicine; or if neither option seems workable or desirable, to come to terms with a reality that all of us are eventually going to have to accept anyway, which is that life and health are fragile transitory things, and that despite drugs and surgeries on the one hand, or herbs and healing practices on the other, the guy with the scythe is going to settle the matter sooner or later with the one answer every human being gets at last.
Wednesday, August 03, 2011
Salvaging Science
Last week’s post on the collapse of American education strayed a certain distance from the core theme of the present series of posts, the reinvention of a “green wizardry” based on methods and technologies tried and proven during the energy crises of the Seventies. There’s a purpose to the divagation, since what’s being discussed here is an educational project in the broad sense of the term, and one that has no real chance of being embraced by the established educational institutions of American society.
Mind you, it’s quite possible that a university here or a community college there might make itself useful in one way or another. Once it becomes impossible to ignore the mismatch between the energy resources available to Americans and the habits of extravagant energy use that have grown up in this country over the last three decades, new college programs to train alternative energy professionals will no doubt pop up like mushrooms after a hard spring rain, competing with the handful of such programs that already exist to attract an expected torrent of students. Some of these future programs may even be worth the time, though with the current trajectory of college expenses, it would amaze me if any turn out to be worth the cost.
They’ll probably get the students, though. It’s hardwired into the American psyche these days that a problem for one person is a business opportunity for someone else, and preferably someone else with the right degree. Books with titles such as Profit from the Peak are already festooning bookstore and library shelves, and there will doubtless be much more of the same thinking on display as peak oil continues its journey from a fringe concern to an inescapable reality. That’s the American mindset as the 21st century moves deeper into its first great period of crisis; if scientists were to announce tomorrow that America was about to sink beneath the waves like Atlantis, I don’t doubt for a moment that tens of thousands of Americans would rush out and try to launch new careers manufacturing and selling water wings.
The green wizardry I’ve been discussing in these posts doesn’t lend itself to that sort of thinking, because it’s not intended for specialists. Now of course it will help a few careers along—unless you’re a dab hand with plumbing, for example, you’re better off getting a professional to install your solar water heating system—and it may get some started—I’ve spoken several times already about the range of small businesses that will be needed as the global economy winds down and maintaining, rebuilding, and repurposing old technologies becomes a significant economic sector. Still, most of the techniques and strategies I’ve been discussing aren’t well suited to make money for some new wave of specialists; their value is in making life more livable for ordinary people who hope to get by in the difficult times that are gathering around us right now.
It’s worth noting, in fact, that the twilight of the contemporary cult of specialization is one of the implications of peak oil. Several decades ago, the chemist Ilya Prigogine showed by way of dizzyingly complex equations that the flow of energy through a system tends to increase the complexity of the system over time. It’s a principle that’s seen plenty of application in biology, among other fields, but I don’t think it’s been applied to history as often as it should have been. There does seem to be a broad positive correlation between the energy per capita available, on average, to the members of a human society, and the number of different occupational roles available to members of that society.
As energy per capita soared to its peak in the industrial world of the late twentieth century, hyperspecialization was the order of the day; as energy per capita declines—and it’s been declining for some time now—the range of specializations that can be supported by the economy will also decline, and individuals and families will have to take up the slack, taking over tasks that for some decades now have been done by professionals. During the transitional period, at least, this will doubtless generate a great deal of commotion, as professional specialists whose jobs are going away try to defend their jobs by making life as difficult as possible for those people who, trying to get by in difficult times, choose the do-it-yourself route. That process is already well under way in a variety of professions, and we’ll be discussing a particularly visible example of it in next week’s post, but this week I want to use this lens to examine the future of one of the industrial world’s distinctive creations, the grab bag of investigative methods, ideas about the universe, and social institutions we call “science.”
It’s rarely remembered these days that until quite recently, scientific research was mostly carried on by amateurs. The word “scientist” wasn’t even coined until 1833; before then, and for some time after, the research programs that set modern science on its way were carried out by university professors in other disciplines, middle class individuals with spare time on their hands, and wealthy dilettantes for whom science was a more interesting hobby than horse racing or politics. Isaac Newton, for example, taught mathematics at Cambridge; Gilbert White founded the science of ecology with his Natural History of Selborne in his spare time as a clergyman; Charles Darwin came from a family with a share of the Wedgwood pottery fortune, had a clergyman’s education, and paid his own way around the world on the H.M.S. Beagle.
It took a long time for science as a profession to catch on, because—pace a myth very widespread these days—science contributed next to nothing to the technological revolutions that swept the western world in the eighteenth and nineteenth centuries. Until late in the nineteenth century, in fact, things generally worked the other way around: engineers and basement tinkerers discovered some exotic new effect, and then scientists scrambled to figure out what made it happen. James Clerk Maxwell, whose 1873 Treatise on Electricity and Magnetism finally got out ahead of the engineers to postulate the effects that would become the basis for radio, began the process by which science took the lead in technological innovation, but it wasn’t until the Second World War that science matured enough to become the engine of discovery it has been ever since. It was then that government and business investment in basic research took off, creating the institutionalized science of the present day.
Throughout the twentieth century, investment in scientific research proved to be a winning bet on the grand scale; it won wars, made fortunes, and laid the groundwork for today’s high-tech world. It’s a common belief these days that more of the same will yield more of the same—that more scientific research will make it possible to fix the world’s energy problems and, just maybe, its other problems as well. Popular as that view is, there’s good reason to doubt it.
The core problem is that scientific research was necessary, but not sufficient, to create today’s industrial societies. Cheap abundant energy was also necessary, and was arguably the key factor. In a very real sense, the role of science from the middle years of the nineteenth century on was basically figuring out new ways to use the torrents of energy that came surging out of wells and mines to power history’s most extravagant boom. Lacking all that energy, the technological revolutions of the last few centuries very likely wouldn’t have happened at all; the steam turbine, remember, was known to the Romans, who did nothing with it because all the fuel they knew about was committed to other uses. Since the sources of fuel we’ll have after fossil fuels finish depleting are pretty much the same as the ones the Romans had, and we can also expect plenty of pressing needs for the energy sources that remain, it takes an essentially religious faith in the inevitability of progress to believe that another wave of technological innovation is right around the corner.
The end of the age of cheap abundant energy is thus also likely to be the end of the age in which science functions as a force for economic expansion. There are at least two other factors pointing in the same direction, though, and they need to be grasped to make sense of the predicament we’re in.
First, science itself is well into the territory of diminishing returns, and most of the way through the normal life cycle of a human method of investigation. What last week’s post described as abstraction, the form of intellectual activity that seeks to reduce the complexity of experience into a set of precisely formulated generalizations, always depends on such a method. Classical logic is another example, and it’s particularly useful here because it completed its life cycle long ago and so can be studied along its whole trajectory through time.
Logic, like the scientific method, was originally the creation of a movement of urban intellectuals in a society emerging from a long and troubled medieval period. Around the eighth century BCE, ancient Greece had finally worked out a stable human ecology that enabled it to finish recovering from the collapse of Mycenaean society some four centuries before; olive and grapevine cultivation stabilized what was left of the fragile Greek soil and produced cash crops eagerly sought by markets around the eastern Mediterranean, bringing in a flood of wealth; the parallel with rapidly expanding European economies during the years when modern science first took shape is probably not coincidental. Initial ventures in the direction of what would become Greek logic explored various options, some more successful than others; by the fifth century BCE, what we may as well call the logical revolution was under way, and the supreme triumphs of logical method occupied the century that followed. Arithmetic, geometry, music theory, and astronomy underwent revolutionary developments.
That’s roughly where the logical revolution ground to a halt, too, and the next dozen centuries or so saw little further progress. There were social factors at work, to be sure, but the most important factor was inherent in the method: using the principles of logic as the Greeks understood them, there’s only so far you can go. Logical methods that had proved overwhelmingly successful against longstanding problems in mathematics worked far less well on questions about the natural world, and efforts to solve the problems of human life as though they were logical syllogisms tended to flop messily. Once the belief in the omnipotence of logic was punctured, on the other hand, it became possible to sort out what it could and couldn’t do, and—not coincidentally—to assign it a core place in the educational curriculum, a place it kept right up until the dawn of the modern world.
I know it’s utter heresy even to hint at this, but I’d like to suggest that science, like logic before it, has gotten pretty close to its natural limits as a method of knowledge. In Darwin’s time, a century and a half ago, it was still possible to make worldshaking scientific discoveries with equipment that would be considered hopelessly inadequate for a middle school classroom nowadays; there was still a lot of low-hanging fruit to be picked off the tree of knowledge. At this point, by contrast, the next round of experimental advances in particle physics depends on the Large Hadron Collider, a European project with an estimated total price tag around $5.5 billion. Many other branches of science have reached the point at which very small advances in knowledge are being made with very large investments of money, labor, and computing power. Doubtless there will still be surprises in store, but revolutionary discoveries are very few and far between these days.
Yet there’s another factor pressing against the potential advancement of science, and it’s one that very few scientists like to talk about. When science was drawn up into the heady realms of politics and business, it became vulnerable to the standard vices of those realms, and one of the consequences has been a great deal of overt scientific fraud.
A study published last year in the Journal of Medical Ethics surveyed papers formally retracted between 2000 and 2010 in the health sciences. About a quarter of them were retracted for scientific fraud, and half of these had a first author who had had another paper previously retracted for scientific fraud. Coauthors of these repeat offenders had, on average, three other papers each that had been retracted. Americans, it may be worth noting, had papers retracted for fraud, and were repeat offenders, far more often than their overseas colleagues.
I don’t know how many of my readers were taught, as I was, that science is inherently self-policing and that any researcher who stooped to faking data would inevitably doom his career. Claims like these are difficult to defend in the face of numbers of the sort just cited. Logic went through the same sort of moral collapse in its time; the English word "sophistry" commemorates the expert debaters of fourth-century Greece who could and did argue with sparkling logic for anyone who would pay them.
To be fair, scientists as a class would have needed superhuman virtue to overcome the temptations of wealth, status, and influence proffered them in the post-Second World War environment, and it’s also arguably true that the average morality of scientists well exceeds that of businesspeople or politicians. That still leaves room for a good deal of duplicity, and it’s worth noting that this has not escaped the attention of the general public. It’s an item of common knowledge these days that the court testimony or the political endorsement of a qualified scientist, supporting any view you care to name, can be had for the cost of a research grant or two. I’m convinced that this is the hidden subtext in the spreading popular distrust of science that is such a significant feature in our public life: a great many Americans, in particular, have come to see scientific claims as simply one more rhetorical weapon brandished by competing factions in the social and political struggles of our day.
This is unfortunate, because—like logic—the scientific method is a powerful resource; like logic, again, there are things it can do better than any other creation of the human mind, and some of those things will be needed badly in the years ahead of us. Between the dumping of excess specializations in a contracting economy, the diminishing returns of scientific research itself, and the spreading popular distrust of science as currently practiced, the likelihood that any significant fraction of today’s institutional science will squeeze through the hard times ahead is minimal at best. What that leaves, it seems to me, is a return to the original roots of science as an amateur pursuit.
There are still some corners of the sciences—typically those where there isn’t much money in play—that are open to participation by amateurs. There are also quite a few branches of scientific work that are scarcely being done at all these days—again, because there isn’t much money in play—and their number is likely to increase as funding cuts continue. To my mind, one of the places where these trends intersect with the needs of the future is in local natural history and ecology, the kind of close study of nature’s patterns that launched the environmental sciences, back in the day. To cite an example very nearly at random, it would take little more than a microscope, a notebook, and a camera to do some very precise studies of the effect of organic gardening methods on soil microorganisms, beneficial and harmful insects, and crop yields, or to settle once and for all the much-debated question of whether adding biochar to garden soil has any benefits in temperate climates.
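By way of illustration, here is a minimal sketch, using made-up placeholder numbers rather than measurements from any real garden, of how the yield comparison in a biochar-versus-control trial might be worked up once the notebook fills in; the arithmetic needs nothing more exotic than a few lines of Python, or a hand calculator and an old statistics textbook:

```python
# A minimal sketch with invented placeholder figures, not real measurements:
# compare vegetable yields from beds amended with biochar against control beds
# using Welch's t statistic, which tolerates unequal variances between groups.

from statistics import mean, stdev
from math import sqrt

# Hypothetical yields in kilograms per bed over one growing season.
biochar_beds = [4.1, 3.8, 4.5, 4.0, 3.9]
control_beds = [3.7, 3.9, 3.6, 4.0, 3.5]

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples."""
    var_a, var_b = stdev(sample_a) ** 2, stdev(sample_b) ** 2
    n_a, n_b = len(sample_a), len(sample_b)
    return (mean(sample_a) - mean(sample_b)) / sqrt(var_a / n_a + var_b / n_b)

print(f"Mean yield, biochar beds: {mean(biochar_beds):.2f} kg")
print(f"Mean yield, control beds: {mean(control_beds):.2f} kg")
print(f"Welch's t statistic: {welch_t(biochar_beds, control_beds):.2f}")
# A t statistic near zero suggests no detectable difference; several seasons of
# records would be needed before drawing any firm conclusion either way.
```

One backyard season won’t settle the biochar question by itself, of course; the value of the exercise lies in keeping careful records year after year and letting the comparisons accumulate, which is exactly the sort of patient observation the old natural historians excelled at.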
These are things the green wizards of the future are going to need to be able to figure out. With much scientific research in America moving in what looks uncomfortably like a death spiral, the only way those skills are likely to make it across the crisis ahead of us is if individuals and local groups pick them up and pass them on to others. Now is probably not too soon to get started, either.