Wednesday, February 27, 2013

The End of the Shale Bubble?

It’s been a little more than a year since I launched the present series of posts on the end of America’s global empire and the future of democracy in the wake of this nation’s imperial age. Over the next few posts I plan on wrapping that theme up and moving on. However traumatic the decline and fall of the American empire turns out to be, after all, it’s just one part of the broader trajectory that this blog seeks to explore, and other parts of that trajectory deserve discussion as well.

I’d planned to have this week’s post take last week’s discussion of voluntary associations further, and talk about some of the other roles that can be filled, in a time of economic contraction and social disarray, by groups of people using the toolkit of democratic process and traditional ways of managing group activities and assets. Still, that topic is going to have to wait another week, because one of the other dimensions of the broader trajectory just mentioned is moving rapidly toward crisis.

It’s hard to imagine that anybody in today’s America has escaped the flurry of enthusiastic media coverage of the fracking phenomenon.  Still, that coverage has included so much misinformation that it’s probably a good idea to recap the basics here. Hydrofracturing—“fracking” in oil industry slang—is an old trick that has been used for decades to get oil and natural gas out of rock that isn’t porous enough for conventional methods to get at them. As oil and gas extraction techniques go, it’s fairly money-, energy- and resource-intensive, and so it didn’t see a great deal of use until fairly recently.

Then the price of oil climbed to the vicinity of $100 a barrel and stayed there. Soaring oil prices drove a tectonic shift in the US petroleum industry, making it economically feasible to drill for oil in deposits that weren’t worth the effort when prices were lower. One of those deposits was the Bakken shale, a sprawling formation of underground rock in the northern Great Plains, which was discovered back in the 1970s and had sat neglected ever since due to low oil prices. To get any significant amount of oil out of the Bakken, you have to use fracking technology, since the shale isn’t porous enough to let go of its oil any other way. Once the rising price of crude oil made the Bakken a paying proposition, drilling crews headed that way and got to work, launching a lively boom.

Another thoroughly explored rock formation further east, the Marcellus shale, attracted attention from the drilling rigs for a different reason, or rather a different pair of reasons. The Marcellus contains no oil to speak of, but some parts of it have gas that is high in natural gas liquids—“wet gas” is the industry term for this—and since those liquids can replace petroleum in some applications, they can be sold at a much higher price than natural gas. Meanwhile, companies across the natural gas industry looked at the ongoing depletion of US coal reserves, and the likelihood of government mandates favoring natural gas over coal for power generation, and decided that these added up to a rosy future for natural gas prices.  Several natural gas production firms thus started snapping up leases in the Marcellus country of Pennsylvania and neighboring states, and a second boom got under way.

As drilling in the Bakken and Marcellus shales took off, several other shale deposits, some containing oil and natural gas, others just natural gas, came in for the same sort of treatment. The result was a modest temporary increase in US petroleum production, and a more substantial but equally temporary increase in US natural gas production.  It could never be anything more than temporary, for reasons hardwired into the way fracking technology works.

If you’ve ever shaken a can of soda pop good and hard and then opened it, you know something about fracking that countless column inches of media cheerleading on the subject have sedulously avoided. The technique is different, to be sure, but the effect of hydrofracturing on oil and gas trapped in shale is not unlike the effect of a hard shake on the carbon dioxide dissolved in soda pop:  in both cases, you get a sudden rush toward the outlet, which releases most of what you’re going to get.  Oil and gas production from fracked wells thus starts out high but suffers ferocious decline rates—up to 90% in the first year alone.  Where a conventional, unfracked well can produce enough oil or gas to turn a profit for decades if it’s well managed, fracked wells in tight shales like the Bakken and Marcellus quite often cease to be a significant source of oil or gas within a few years of drilling.

The obvious response to this problem is to drill more wells, and this accordingly happened. That isn’t a panacea, however. Oil and gas exploration is a highly sophisticated science, and oil and gas drilling companies can normally figure out the best sites for wells long before the drill bit hits the ground. Since they are in business to make money, they normally drill the best sites first. When that sensible habit intersects with the rapid production decline rates found in fracked wells, the result is a brutal form of economic arithmetic:  as the best sites are drilled and the largest reserves drained, drilling companies have to drill more and more wells to keep the same amount of oil or gas flowing.  Costs go up without increasing production, and unless prices rise, profits get hammered and companies start to go broke.
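To make that arithmetic concrete, here is a minimal sketch in Python. The per-well decline figures and well counts below are invented, chosen only to mimic the steep first-year drop described above; the point is simply that total output from a fracked field depends on a constant stream of new drilling, and sags the moment that stream slows.

```python
# Illustrative sketch only: hypothetical per-well decline figures, not field data.
# One well is assumed to produce 100 units in its first year, then fall off
# steeply, echoing the "up to 90% in the first year" decline noted above.
WELL_OUTPUT_BY_AGE = [100, 12, 7, 5, 4, 3]

def field_output(wells_drilled_each_year):
    """Total yearly output of a field, given how many wells were added each year."""
    years = len(wells_drilled_each_year)
    totals = []
    for year in range(years):
        total = 0
        for start_year, count in enumerate(wells_drilled_each_year[:year + 1]):
            age = year - start_year
            if age < len(WELL_OUTPUT_BY_AGE):
                total += count * WELL_OUTPUT_BY_AGE[age]
        totals.append(total)
    return totals

# Drill ten wells a year for a decade: output plateaus once older wells
# contribute only their thin tails.
print(field_output([10] * 10))
# Stop drilling after three years: output collapses almost immediately.
print(field_output([10] * 3 + [0] * 7))
```

Real decline curves are smoother than this stepwise caricature, but the shape of the problem is the same: to keep the flat part of the curve flat, the drilling rate has to be maintained, and once the best sites are gone it has to rise.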

They start to go broke even more quickly if the price of the resource they’re extracting goes down as the costs of maintaining production go up.  In the case of natural gas, that’s exactly what happened. Each natural gas production company drew up its projections of future prices on the assumption that ordinary trends in production would continue.  As company after company piled into shale gas, though, production soared and prices plummeted, and the harsh economic downturn that followed the 2008 housing market crash kept the resulting low prices from spurring increased use of the resource; so many people were so broke that even cheap natural gas was too expensive for any unnecessary use.

Up to that point, the fracking story followed a trajectory painfully familiar to anyone who knows their way around the economics of alternative energy.  From the building of the first solar steam engines before the turn of the last century, through the boom-and-bust cycle of alternative energy sources in the late 1970s, right up to the ethanol plants that were launched with so much fanfare a decade ago and sold for scrap much more quietly a few years later, the pattern’s the same:  a repeated rhythm of great expectations followed by shattered dreams.

Here’s how it works. A media panic over the availability of some energy resource or other sparks frantic efforts to come up with a response that won’t require anybody to change their lifestyles or, heaven help us, conserve. Out of the flurry of available resources and technologies, one or two seize the attention of the media and, shortly thereafter, the imagination of the general public.  Money pours into whatever the chosen solution happens to be, as investors convince themselves that there’s plenty of profit to be made backing a supposedly sure thing, and nobody takes the time to ask hard questions.  In particular, investors tend to lose track of the fact that something can be technically feasible without being economically viable, and rosy estimates of projected cash flow and return on investment take the place of meaningful analysis.

Then come the first financial troubles, brushed aside by cheerleading “analysts” as teething troubles or the results of irrelevant factors certain to pass off in short order. The next round of bad news follows promptly, and then the one after that; the first investors begin to pull out; sooner or later, one of the hot companies that has become an icon in the new industry goes suddenly and messily bankrupt, and the rush for the exits begins. Barring government subsidies big enough to keep some shrunken form of the new industry stumbling along thereafter, that’s usually the end of the road for the former solution du jour, and decades can pass before investors are willing to put their money into the same resource or technology again.

That’s the way that the fracking story started, too. By the time it was well under way, though, a jarring new note had sounded:  the most prestigious of the US mass media suddenly started parroting the most sanguine daydreams of the fracking industry.  They insisted at the top of their lungs that the relatively modest increases in oil and gas production from fracked shales marked a revolutionary new era, in which the United States would inevitably regain the energy independence it last had in the 1950s, and prosperity would return for all—or at least for all who jumped aboard the new bandwagon as soon as possible. Happy days, we were told, were here again.

What made this barrage of propaganda all the more fascinating were the immense gaps that separated it from the realities on and under the ground in Pennsylvania and North Dakota. The drastic depletion rates from fracked wells rarely got a mention, and the estimates of how much oil and gas were to be found in the various shale deposits zoomed upwards with wild abandon.  Nor did the frenzy stop there; blatant falsehoods were served up repeatedly by people who had every reason to know that they were false—I’m thinking here of the supposedly energy-literate pundits who insisted, repeatedly and loudly, that the Green River shale in the mountain West was just like the Bakken and Marcellus shales, and would yield abundant oil and gas once it was fracked. (The Green River shale, for those who haven’t been keeping score, contains no oil or gas at all; instead, it contains kerogen, a waxy hydrocarbon goo that would have turned into oil or gas if it had stayed deep underground for a few million years longer, and kerogen can’t be extracted by fracking—or, for that matter, by any other economically viable method.)

Those who were paying attention to all the hoopla may have noticed that the vaporous claims being retailed by the mainstream media around the fracking boom resembled nothing so much as the equally insubstantial arguments most of the same media were serving up around the housing boom in the years immediately before the 2008 crash. The similarity isn’t accidental, either. The same thing happened in both cases:  Wall Street got into the act.

A recent report from financial analyst Deborah Rogers, Shale and Wall Street (you can download a copy in PDF format here), offers a helpful glimpse into the three-ring speculative circus that sprang up around shale oil and shale gas during the last three years or so.  Those of my readers who suffer from the delusion that Wall Street might have learned something from the disastrous end of the housing bubble are in for a disappointment: the same antics, executed with the same blissful disregard for basic honesty and probity, got trotted out again, with results that will be coming down hard on what’s left of the US economy in the months immediately ahead of us. 

If you remember the housing bubble, you know what happened.  Leases on undrilled shale fields were bundled and flipped on the basis of grotesquely inflated claims of their income potential; newly minted investment vehicles of more than Byzantine complexity—VPPs, "volumetric production payments," are an example you’ll be hearing about quite a bit in a few months, once the court cases begin—were pushed on poorly informed investors and promptly began to crash and burn; as the price of natural gas dropped and fracking operations became more and more unprofitable, "pump and dump" operations talked up the prospects of next to worthless properties, which could then be unloaded on chumps before the bottom fell out.  It’s an old story, if a tawdry one, and all the evidence suggests that it’s likely to finish running its usual course in the months immediately ahead.

There are at least two points worth making as that happens. The first is that we can expect more of the same in the years immediately ahead.  Wall Street culture—not to mention the entire suite of economic expectations that guides the behavior of governments, businesses, and most individuals in today’s America—assumes that the close-to-zero return on investment that’s become standard in the last few years is a temporary anomaly, and that a good investment ought to bring in what used to be considered a good annual return: 4%, 6%, 8%, or more. What only a few thinkers on the fringes have grasped is that such returns are only normal in a growing economy, and we no longer have a growing economy.

Sustained economic growth, of the kind that went on from the beginning of the industrial revolution around 1700 to the peak of conventional oil production around 2005, is a rare anomaly in human history.  It became a dominant historical force over the last three centuries because cheap abundant energy from fossil fuels could be brought into the economy at an ever-increasing rate, and it stopped because geological limits to fossil fuel extraction put further increases in energy consumption permanently out of reach. Now that fossil fuels are neither cheap nor abundant, and the quest for new energy sources vast and concentrated enough to replace them has repeatedly drawn a blank, we face several centuries of sustained economic contraction—which means that what until recently counted as the groundrules of economics have just been turned on their head.

You will not find many people on Wall Street capable of grasping this. The burden of an outdated but emotionally compelling economic orthodoxy, to say nothing of a corporate and class culture that accords economic growth the sort of unquestioned aura of goodness other cultures assign to their gods, makes the end of growth and the coming of permanent economic decline unthinkable to the financial industry, or for that matter to the millions of people in the industrial world who rely on investments to pay their bills.  There’s a strong temptation to assume that those 8% per annum returns must still be out there, and when something shows up that appears to embody that hope, plenty of people are willing to rush into it and leave the hard questions for later.  Equally, of course, the gap thus opened between expectations and reality quickly becomes a happy hunting ground for scoundrels of every stripe.
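The mismatch between those expectations and a contracting economy is easy to put into numbers. Here is a toy calculation, with invented rates chosen purely for illustration, showing what happens when paper claims keep compounding at the returns investors consider normal while the real wealth available to honor them shrinks:

```python
# Toy arithmetic, not a model: the rates below are invented for illustration.
real_economy = 100.0      # real wealth available to honor financial claims
claims = 100.0            # paper value of investments expecting "normal" returns
growth = -0.02            # assume the real economy contracts 2% a year
expected_return = 0.08    # the 8% per annum investors still treat as normal

for year in range(1, 26):
    real_economy *= 1 + growth
    claims *= 1 + expected_return
    if year % 5 == 0:
        print(f"year {year:2d}: claims are {claims / real_economy:.1f}x the real economy")

# After 25 years the claims amount to more than ten times the wealth that
# could ever pay them off; a gap that size can only be closed by defaults,
# inflation, or the serial bubbles discussed below.
```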

Vigorous enforcement of the securities laws might be able to stop the resulting spiral into a permanent bubble-and-bust economy. For all the partisan bickering in Washington DC, though, a firm bipartisan consensus since the days of George W. Bush has placed even Wall Street’s most monumental acts of piracy above the reach of the law. The Bush and Obama administrations both went out of their way to turn a blind eye toward the housing bubble’s spectacular frauds, and there’s no reason to think Obama’s appointees in the Justice Department will get around to doing their jobs this time either.  Once the imminent shale bust comes and goes, in other words, it’s a safe bet that there will be more bubbles, each one propping up the otherwise dismal prospects of the financial industry for a little while, and then delivering another body blow to the economies of America and the world as it bursts.

This isn’t merely a problem for those who have investments, or those whose jobs depend in one way or another on the services the financial industry provides when it’s not too busy committing securities fraud to get around to it. The coming of a permanent bubble-and-bust economy puts a full stop at the end of any remaining prospect for even the most tentative national transition away from our current state of dependence on fossil fuels.  Pick a project, any project, from so sensible a step as rebuilding the nation’s long-neglected railroads all the way to such pie-in-the-sky vaporware as solar power satellites, and it’s going to take plenty of investment capital.  If it’s to be done on any scale, furthermore, we’re talking about a period of decades in which more capital every year will have to flow into the project. 

The transition to a bubble-and-bust economy makes that impossible.  Bubbles last for three years or so on average, so even if the bubble-blowers on Wall Street happen by accident on some project that might actually help, it will hardly have time to get started before the bubble turns to bust, the people who invested in the project get burned, and the whole thing tumbles down into disillusionment and bankruptcy.  If past experience is anything to go by, furthermore, most of the money thus raised will be diverted from useful purposes into the absurd bonuses and salaries bankers and brokers think society owes them for their services.

Over the longer run, a repeated drumbeat of failed investments and unpunished fraud puts the entire system of investment itself at risk.  The trust that leads people to invest their assets, rather than hiding them in a hole in the ground, is a commons; like any commons, it can be destroyed by abuse; and since the federal government has abandoned its statutory duty to protect that commons by enforcing laws against securities fraud, a classic tragedy of the commons is the most likely outcome, wrecking the system by which our society directs surplus wealth toward productive uses and putting any collective response to the end of the fossil fuel age permanently out of reach.

All these are crucial issues. Still, there’s a second point of more immediate importance. I don’t think anybody knows exactly how big the shale bubble has become, but it’s been one of Wall Street’s few really large profit centers over the last three years.  It’s quite possible that the bubble is large enough to cause a major financial panic when it bursts, and send the United States and the world down into yet another sharp economic downturn.  As Yogi Berra famously pointed out, it’s tough to make predictions, especially about the future; still, I don’t think it’s out of place to suggest that sensible preparations for hard times might be wise just now, and if any of my readers happen to have anything invested in the shale or financial industries, I’d encourage them to consider other options in the fairly near term.

Wednesday, February 20, 2013

In a Time of Limits

When the French nobleman Alexis de Tocqueville toured the young American republic in the early 1830s, he encountered plenty of things that left him scratching his head. The national obsession with making money, the atrocious food, and the weird way that high culture found its way into the most isolated backwoods settings—“There is hardly a pioneer's hut which does not contain a few odd volumes of Shakespeare,” he wrote; “I remember reading the feudal drama of Henry V for the first time in a log cabin”—all intrigued him, and found their way into the pages of his remarkable book Democracy in America.

Still, one of the things de Tocqueville found most astonishing bears directly on the theme I’ve been developing over the last several weeks here on The Archdruid Report.  The Americans of his time, when they wanted to make something happen, didn’t march around with placards or write their legislators demanding that the government do it. Instead, far more often than not, they simply put together a private association for the purpose, and did it themselves. De Tocqueville wrote:

“Americans combine to give fêtes, found seminaries, build churches, distribute books, and send missionaries to the antipodes. Hospitals, prisons, and schools take shape in that way. Finally, if they want to proclaim a truth or propagate some feeling by the encouragement of a great example, they form an association. In every case, at the head of any new undertaking, where in France you would find the government or in England some territorial magnate, in the United States you are sure to find an association. I have come across several types of association in America of which, I confess, I had not previously the slightest conception, and I have often admired the extreme skill they show in proposing a common object for the exertions of very many and in inducing them voluntarily to pursue it.”

The types of associations de Tocqueville encountered used an assortment of ingenious legal structures borrowed, for the most part, from English common law.  Those of my readers who dislike the role of corporations in contemporary American life may be interested to know that the distant and much less toxic ancestor of today’s corporate structure was one of them. A corporation, back in those days, was not a privileged legal pseudoperson with more rights and fewer responsibilities than a human being.  It was simply a group of people who set out to accomplish some purpose, and got a charter from the state government allowing them to raise money for that purpose by selling shares of stock.  Most had single purposes—building a hospital, say, or a canal—and limited lifespans, defined either as a fixed number of years or until the purpose of the corporation was completed.

Making money was sometimes an object in such exercises, but by no means always. Members of a local religious community who wanted to build a new church, for example, would very often do that by organizing a corporation to raise money for the construction costs. Each member would buy as many shares as he or she could afford, and fellow believers in neighboring towns who wanted to support the project might also buy shares to chip in.  When the church was finished, the corporation would be wound up, and thereafter a portion of the income from tithes and donations might be set apart to buy back the shares a few at a time; on the other hand, the members of the congregation might consider themselves well repaid when they filed into a new building on Sunday mornings. It was a remarkably effective means of local microfinance, and it paid for a huge number of civic improvements and public amenities in the young republic.

Not all associations that directed their efforts toward the public good in this way were corporations, and I hope I may be allowed a personal reminiscence about one of the others. I think most of my regular readers know that I’m a Freemason. Yes, I’m quite aware that this makes me an object of vilification across most of the further reaches of contemporary American political life; no, I don’t particularly care, though it tickles my sense of absurdity to be accused to my face now and then of being one of the evil space lizards David Icke rants about in his books. In the town where I live, the various Masonic bodies meet in a large and lovely century-old building—which was, by the way, constructed by the sort of building corporation described earlier in this essay—and share certain projects in common. This year, I serve as the secretary to one of those, the Masonic Endowment Fund. Half a century ago, it had a different and more revealing name:  the Masonic Relief Fund.

Here’s how it functioned back in the day. Donations from living members, bequests from dead ones, and a variety of fundraising activities kept the Fund supplied with money, which it invested, and the interest from those investments went to help Masonic brothers and their families who fell onto hard times.  The members of the Relief Board, who were appointed by each lodge or other Masonic body, investigated each case and then started writing checks. 

Elderly brethren still recall the days when a long hard winter meant that the poorer families in town would run out of money for coal well before spring, and those who had a family member in the Masons could count on the truck from the local coal firm making a delivery anyway.  Groceries, visiting nurse services, school expenses, funeral costs—the Relief Fund covered all those and much more.  Everyone in the local Masonic scene supported the project to whatever extent their means allowed.  Partly that was because that’s what you do when you’re a Freemason, and partly it was because everyone knew that, however well off they were at the moment, some unexpected turn of events might leave them in a situation where they had to rely on the Fund for their own survival.

The Fund no longer buys truckloads of coal for poor Masonic families in the winter, and the reason for that is a microcosm of the impact of empire on American communities. In the wake of Lyndon Johnson’s “Great Society” programs of the 1960s, government welfare programs took the place of the Masonic Relief Fund and its many equivalents in the lives of the local poor.  Requests for help slowed and then stopped, and the Relief Board found itself sitting on an increasing pile of money that no one seemed to need any more. After much discussion, and by vote of the Masonic bodies that funded the Board, its name was changed to the Masonic Endowment Fund and its income was put to work paying for improvements on an aging Masonic building.

The same thing, often in much more drastic terms, happened to many other voluntary organizations that once occupied the roles currently more or less filled by government social welfare programs. In 1920, for example, something like 3500 different fraternal orders existed in the United States, and around 50% of the country’s adult population—counting both genders and all ethnic groups, by the way—belonged to at least one of them. Reasons for belonging ranged across the whole spectrum of human social phenomena, but there were hard pragmatic reasons to put in a petition for membership at your local lodge of the Odd Fellows, Knights of Pythias, or what have you:  at a time when employers generally didn’t provide sick pay and other benefits for employees, most fraternal orders did.

If you belonged to the Odd Fellows, for example, you went to lodge one evening a week and paid your weekly dues, which were generally around 25 cents—that is, about the equivalent of a $20 bill today. In exchange, if you became too sick to work, the lodge would give you sick pay, and if you died, the lodge would cover the costs of your funeral and guarantee that your family would be taken care of.  If, as often happened, the husband belonged to the Odd Fellows and the wife to the Rebekahs, the women’s branch of the same organization, the family had a double claim on the order’s help.

Here again, I can call on something more personal than the abstract facts that can be gotten from old history books. My paternal grandfather’s father was a city police officer in the wide-open port town of Aberdeen, Washington, and an Odd Fellow. In 1920 he was shot to death in the line of duty, leaving a wife and thirteen children. The Aberdeen Odd Fellows lodge paid for his funeral and then took care of his family—his children until they reached adulthood, his widow for the rest of her life. It’s not an accident that my grandfather, when he was in his twenties, became a founding member of a community service organization, Active 20-30 International.

In 1920, Odd Fellowship was at the peak of its size and influence, and ranked as the largest fraternal organization in North America.  Today, it’s a faint and flickering shadow of its former self. When welfare programs and employer-paid pensions came in, the core function of the Odd Fellows and a great many organizations like it went out by the same door.  So, in due time, did most of the organizations. We once had a thriving Odd Fellows lodge here in Cumberland; the building is still there, with the letters IOOF in mosaic work on the front step, but the lodge is long gone.

Now it’s only fair to point out that the shift from private relief funds to government welfare programs had certain definite advantages. The voluntary associations that handled relief in the pre-welfare era—fraternal orders such as the Freemasons and Odd Fellows, religious bodies such as churches and synagogues, and the like—had limited resources, and usually conserved them by limiting their relief payments to their members, or to other narrowly defined populations.  To return to a point made earlier in these posts, the relief organizations of an earlier day had to treat their resources as a commons that could be destroyed by abuse, and so they put a variety of limits on access to those resources to make sure that the people who got help actually needed it, and weren’t simply trying to game the system.

The deep pockets of government  in an era of national expansion made it possible, for a while, to ignore such factors. The new social welfare programs could reach out to everyone who needed them, or at least to everyone whose claim to need them advanced the agenda of one political faction or another.  The resulting largesse was distributed in various amounts all over the class spectrum—welfare for the poor, a dizzying array of direct and indirect federal subsidies for the middle class, ample tax loopholes and corporate handouts to benefit the rich—and did a great deal to fund the lavish lifestyles Americans by and large enjoyed during their nation’s imperial zenith.

That’s the kind of thing a society can do when it can draw on half a billion years of stored sunlight to prop up its economy, not to mention a global empire that gives it privileged access to the energy, raw materials, and products of industry that half a billion years of stored sunlight made possible. Whether or not it was a good idea isn’t a question I particularly want to discuss at this point. It’s far more important, it seems to me, to recognize that the welfare states of the late 20th century were the product of a vast but temporary abundance of energy and the products of energy; they did not exist before that glut of energy arrived, and it’s thus a safe bet that they won’t exist after the glut is gone.

I think it’s at least as safe a bet, mind you, that nobody in America will be willing to face that fact until long after the abundance of the recent past is a fading memory.  The last decade or so of bickering in Washington DC is more than adequate evidence of the way the winds are blowing. Republicans talk about jobs, Democrats talk about justice, but in both cases what’s going on is best described as a bare-knuckle brawl over which party’s voting blocs get to keep their accustomed access to the federal feeding trough. Choose any point on the convoluted political landscape of modern America, and the people at that position eagerly criticize those handouts that don’t benefit them, but scream like irate wildcats if anything threatens their own access to government largesse. 

I suspect, for what it’s worth, that the last US government handouts to be paid out will be those that prop up the lifestyles of the American middle class: the financial aid that keeps middle class families from having to shoulder the whole cost of a college education for their children, the Social Security and Medicare that keep them from having to foot the whole bill for their old age and that of their parents, the galaxy of programs intended to make it easier for them to afford homeownership, and all the rest of it. Programs that benefit the middle class disproportionately already make up the largest share of US federal entitlement programs, dwarfing the 2% or so of the federal budget that goes to the poor, or the 5% or so that counts as corporate welfare, and that figure is a fair measure of the political clout the middle class can wield in defense of its privileges.

It would be pleasant to suppose, as the United States slides down the trajectory of imperial decline and the number of Americans in serious trouble increases, that middle class voters would recognize the severity of the situation and support, say, means tests on middle-class welfare programs, so that those who don’t actually need help can be asked to step aside in favor of those who do. I hope none of my readers plan on holding their breath and waiting for this to happen, though. Quite the contrary:  as economic contraction accelerates and energy and resource shortages bite harder, the fight around the feeding trough will just get worse. I doubt many of the combatants will take enough time from the struggle to notice that, in the long run, it’s a fight with no winners.

In the time of limits ahead of us, no country on earth will be able to afford a welfare state of the kind that was common in industrial societies over the last century or so. That’s one of the harsh realities of our predicament.  National economies powered by diffuse renewable energy sources, bound by strict ecological limits, and forced to cope with the cascading instabilities of a damaged planetary biosphere, simply won’t be able to produce the surplus wealth needed to make that a possibility. Methods of providing for urgent social needs that worked in the days before the economy of abundance are another matter, and for this reason it makes sense to suggest a revival of the old American custom of forming voluntary associations to fund and manage public amenities.

There are at least two important advantages to such a project, and both of them take the form of lessons that Americans once knew and are going to have to learn again in a hurry. The first is that a private association doesn’t have the luxury of pretending that it has limitless resources.  Currently some 60% of Americans receive more in the way of financial benefits from government than they pay in taxes.  Conservative pundits like to insist that this means the well-to-do are getting robbed by taxation, but the facts of the matter are considerably more troubling: the gap in question is being covered with borrowed money—which means, in an era of quantitative easing, that it’s being paid by printing money.  That’s a recipe for massive economic trouble in the short term. A private association, by contrast, can’t print its own money, and if its members vote themselves more benefits than the treasury can pay for, they will discover promptly enough why this isn’t a good idea. 

That’s the first advantage. The second is closely related to it.  The benefit funds of the old fraternal orders and their equivalents across the spectrum of voluntary associations learned two crucial lessons very early on.  The first was that some people are solely interested in gaming the system for whatever they can get out of it, and are unwilling to contribute anything in return. The second was that allowing such people to get their way, and drain the fund of its resources, is a fast road to failure. The standard response was to limit access to the benefit fund to those who had contributed to it, or to the organization that sponsored it, at least to the extent of joining the organization and keeping current on their dues.

That’s why the Masonic Relief Fund here in Cumberland only bought coal for those poor families who had a family member in Freemasonry, and why so many of the comparable funds operated by other lodges, by churches, and by a range of other social institutions in the pre-welfare days had similar rules.  The reasoning involved in this custom is the same logic of the commons we’ve discussed several times already in this series of posts. A relief fund is a commons; like any other commons it can be destroyed if those who have access to it are allowed to abuse it for their own benefit; to prevent that from happening, limits on access to the commons are essential.

There were also organizations that existed to provide help to people other than their own members, and in fact most of the old lodges directed some part of their efforts to helping people in the community at large—as indeed most of them still do. Here again, though, access to the limited resources that were available was controlled, so that those resources could be directed where, in the best judgment of the sponsoring organization, they would do the most good.  Nineteenth-century talk about “the deserving poor”—that is, those whose poverty was not primarily the result of their own choices and habits, and who thus might well be able to better their conditions given some initial help—is deeply offensive to many people in our present culture of entitlement, but it reflects a hard reality. 

Whether the habit of throwing money at the poor en masse is a useful response to poverty or not, the fact remains that a post-imperial America on the downslope of the age of cheap energy won’t have the resources to maintain that habit even with the poor it’s already got, much less the vastly increased numbers of the newly poor we’ll have as what’s left of our economy winds down.  Those who worry about ending up destitute as that process unfolds need to be aware that they won’t be able to turn to the government for help, and those whose sense of compassion leads them to want to do something helpful for the poor might want to consider some less futile response than trying to pry loose government funds for that purpose. 

In both cases, the proven approaches of a less extravagant time might be worth adopting, or at least considering. It’s fair to admit that the voluntary associations central to those approaches won’t be able to solve all the problems of a post-imperial society in a deindustrializing world, but then neither will anything else; they can, however, accomplish some good. In a time of limits, that may well be the best that can be done.

Wednesday, February 13, 2013

Skin In The Game

The old-fashioned school districts that provided me with a convenient example in last week’s post here on The Archdruid Report represent a mode of politics that nobody, but nobody, talks about in today’s America.  Across the whole landscape of our contemporary political life, with remarkably few exceptions, when people talk about the relationship between the political sphere and the rest of life, the political sphere they have in mind consists of existing, centralized governmental systems.

That’s as true of those who denounce political interference in the lives of individuals and communities, by and large, as it is of those who insist that such interference can be a very good thing. It’s as though, in the American collective imagination, the political sphere consists only of the established institutions of government, and the established—and distinctly limited—ways that individual citizens can have an influence on those institutions.  The idea that citizens might create their own local political structures, for purposes they themselves choose, and run them themselves, using the tools of democratic process, has vanished completely from our national conversation.

Less than a lifetime ago, however, this was a standard way of making constructive change in America. Local school districts were only one example, though they were probably the most pervasive. Most of the time, when people in a community wanted to create some public amenity or solve some community problem, they did it by creating a local, single-purpose governmental body with an elected board and strictly limited powers of taxation to pay the bills.  Sewer districts, streetcar lines, public hospitals, you name it, that’s usually how they were run.  The state government had supervision over all these bodies, which was normally taken care of by—you guessed it—state boards whose members were, once again, elected by the general public.

Was it a perfect system? Of course not.  The interlocking checks and balances of board supervision and elections were no more foolproof than any other mode of democratic governance, and a certain fraction of these single-purpose local governmental bodies failed due to corruption or mismanagement. Still, a substantial majority of them do seem to have worked tolerably well, and they had a crucial advantage not shared by today’s more centralized ways of doing things:  if something went wrong, the people who had the power to change things were also the people most directly affected.

If the management of your local sewer district turned out to be hopelessly incompetent, for example, you didn’t have to try to get a distant and uninterested state or federal bureaucracy to stir out of its accustomed slumber and do its job, nor did you have to try to find some way to convince tens of thousands of unaffected voters in distant parts of the state to cast their votes to throw somebody out of office for reasons that didn’t matter to them in the least.  The right to vote in the next sewer board election was limited to those people who were actually served by the sewer district, who paid the bills of the district with their monthly assessments, who’d had to deal with balky sewers for the last two years, and were thus qualified to judge whether a “Throw the Rascals Out” campaign was justified. Keeping control of the system in the hands of the people most directly affected by it thus served as a preventive to the serene indifference to failure that pervades so much of American government today.

It might be worth proposing as a general rule, in fact, that democratic governance works best when the people directly affected by any function of government have direct control over those people who run that function of government, subject to appropriate oversight by those responsible for maintaining the public commons.  In the case of our imaginary sewer district, that means giving those who live within the district the sole power to choose members of the board, while placing the local board under the supervision of a state board tasked with making sure local decisions didn’t violate state public health standards and the like. In the case of the school districts described in last week’s post, it meant giving the local school boards broad powers to set policy for the schools they administered, giving citizens who lived within the school district the sole right to vote in school elections, and placing the school boards under the supervision of a state board of education that was charged with enforcing a few very broad educational standards, health and safety regulations, and so on.

As long as the roles of state and federal governments remained that of policing the commons, the system worked quite well—better, by most measures, than the failed equivalents we have today.  What put paid to it was the explosive spread of government centralization after the Second World War, and this in turn was driven by the imperial tribute economy I described earlier in this series of posts:  the set of deliberately unbalanced economic arrangements by which something like a third of the world’s wealth is channeled every year to the five per cent of humanity that live in the United States.

The linchpin of local control, as it turned out, was local funding.  Sewer districts, school districts, and all the other little local governmental bodies received all their funding directly from the people they served, by whatever arrangements the voters in the district had accepted when the district was founded. When federal and state governments gained the power to dangle million-dollar grants in front of the various local governments, most if not all of them took the bait, and only later discovered that the power to grant or withhold funding trumps every other form of political power in our society.  That was how the local single-purpose governments were stripped of their autonomy and turned into instruments of centralized government, subject to micromanagement by state and federal bureaucracies.

That process of centralization was justified in many cases by claims of efficiency.  Now of course when somebody starts prattling about efficiency, the question that needs to be asked is “efficient at what?” A screwdriver is highly efficient at turning screws but very inefficient as a means for pounding nails; the modern corporate economy, in much the same sense, is highly efficient at concentrating paper wealth in the hands of the already rich, and very inefficient at such other tasks as producing goods and services. It’s interesting to speculate about just what it is that centralized bureaucracies can do more efficiently than local single-purpose governmental bodies, but in retrospect, we can be certain that running schools, sewer districts, and other public services does not belong in that category.

I discussed last week some of the reasons for thinking that today’s massively centralized American education system is much less effective at teaching children to read, write, calculate, and exercise the other basic skills essential to life in a modern society than the old-fashioned, locally managed, locally funded school districts of the not so distant past.  The responses I fielded to those comments intrigued me. One typical commenter insisted that she found it “incredibly hard to believe” that educational standards in the one-room schoolhouses of yesteryear were higher than those in school districts today. Now of course it takes only a glance at the old McGuffey’s Readers, the standard reading textbooks in those one-room schoolhouses, to show that levels of reading comprehension, grammar, and vocabulary that were considered normal at every elementary school grade level in the late 19th century were vastly greater than those achieved in today’s schools; in fact, the reading ability assumed by the first pages of the 8th grade McGuffey’s is by no means common in American college classes today.

The collapse of educational standards that can be observed here, and in a hundred similar examples, has had many causes.  Still, it’s far from irrelevant to note that a similar collapse has taken place in many other areas in which the old system of independent local governmental bodies has been replaced by micromanagement by state or federal bureaucracies.  That collapse has been discussed nearly as widely in the media as the implosion of American education, and it’s ironic to note that, just as media discussions of public education’s breakdown have gone out of their way to avoid considering the role of overcentralization in driving that collapse, the media coverage of the parallel breakdown I have in mind has been just as careful to avoid touching on this same issue.

The collapse in question? The disintegration of America’s infrastructure in recent decades.

A great many factors, to be sure, have had a part in creating the crisis in our national infrastructure, just as a great many factors have contributed to the parallel crisis in our national public education system. In both cases, though, I’d like to suggest that overcentralization has played a crucial role. There are at least three reasons why, all other things being equal, a centralized government bureaucracy will by and large be less able to provide good schools, working sewers, and other public goods than a local governmental body of the type we’ve been discussing.

First, centralized government bureaucracies aren’t accountable for their failures.  To borrow a bit of gambler’s slang, they have no skin in the game.  No matter how disastrous the consequences of an administrative decision made in the state or national capital, the bureaucrats who made the decision will continue to draw their pay, exercise their authority, and pursue whatever fashionable agendas they picked up in college or elsewhere, even if their actions turn out to be hopelessly counterproductive in terms of the goals their bureaucracy ostensibly exists to serve.  Local single-purpose governmental bodies by and large don’t have that freedom; if the local sewer board pursues policies that fail to provide adequate sewer service to the people in the sewer district, the members of the board had better look for other jobs come the next local election.

Second, centralized government bureaucracies provide many more places for money to get lost. If you’ve got a bureaucracy at the national level—say, the federal Department of Education—another bureaucracy at each state level—say, state Departments of Education—and still another bureaucracy at the local level—say, current school districts, a good many of which have hundreds of employees filling administrative positions these days—and all of these are doing a job that used to be done by a handful of employees working for each school board, one whale of a lot of money that might otherwise go to improve schools is being siphoned off into administrative salaries and expenses.  The same thing is true of the money that might go to repair bridges and sewer pipes; how much of that goes instead to pay for administrative staff in the federal Department of Transportation and the equivalent state and county bureaucracies? All this is aside from graft and corruption, which is also an issue; it’s a good general rule that the more hands money must pass through on its way to a project, the higher the likelihood that some of those hands will have sticky fingers. 

The third reason is subtler, and ties back into the proposal I made several weeks back, that the proper role of government is that of preserving the public commons. To make a commons work, there needs to be some system in place to monitor the state of the commons, assess how changes will impact it, and prohibit those things that will cause harm to it.  On a purely local level, as Elinor Ostrom showed, a self-regulating commons is easy to establish and easy to maintain, since it’s in the direct self-interest of everyone who benefits from the commons to prevent anyone else from abusing it. The local single-purpose governmental bodies discussed in this week’s post rely on that logic: if you depend on the local sewer board to provide you with sewage service, to return to our example, you have a very strong incentive not to permit the board to ignore its duties.

Still, for a variety of reasons, the mechanisms of local government don’t always function as they should.  It’s for this reason that the American political tradition long ago evolved the useful habit, already referred to, of making the decisions of local government subject to review at the state level, by way of the supervisory boards discussed earlier.  The state boards, like the local boards they supervised, were elected by the voters, so they remained accountable for their failures. More important, though, was the simple fact that the officials tasked with assessing the legality and appropriateness of policies were not the same officials that were making the policies.

This is a basic principle of cybernetics, by the way. If you’ve got one system carrying out a function, and another system monitoring how well the first system carries out its function, you need to make sure that the only input the second system receives from the first system is the input that allows the second system to carry out its monitoring function. Otherwise you get feedback loops that prevent the second system from doing what it’s supposed to do. That’s exactly the problem we have now.  When public schools are being micromanaged by regulations drafted by federal bureaucrats, who is assessing the legality and appropriateness of those regulations? The same federal bureaucrats—and whether you analyze this by way of cybernetics, politics, or plain common sense, this is a recipe for disaster.

These three factors—the lack of accountability endemic to centralized professional bureaucracies; the tendency for money to get lost as it works its way down through the myriad layers of a centralized system; and the unhelpful feedback loops that spring up when the policy-making and monitoring functions of government are confounded—go a long way toward explaining the cascading failure of many of the basic systems that an older, more localized, and less centralized approach to government used to maintain in relatively good order.  The accelerating decline of American public education and the disintegration of the national infrastructure are only two examples of this effect in practice; there are plenty of others—a great deal of what’s wrong with America’s health care system, for example, can be traced to the same process of overcentralization.

I’m pleased to say, though, that help is on the way. On second thought, “pleased” is probably not the right word, since the help in question will almost certainly bring about the wholesale implosion of a great many of the basic systems that provide public goods to Americans, and its arrival will have to be followed by the slow, costly, and potentially painful rebuilding of those systems from the ground up. The source of that unwelcome assistance, of course, is the twilight of America’s global empire.  In the absence of the torrents of unearned wealth American society currently receives from the imperial wealth pump, a great many of the centralized systems in place today—governmental, corporate, and nonprofit—will probably stop functioning altogether. Those who think they will cheer this development are invited to imagine how they will feel when their sewers stop working and nobody, anywhere, is willing or able to do anything about that fact.

As the impact of America’s imperial decline echoes through the fabric of the nation, a great many of the basic systems of everyday life will need to be repaired and replaced. One of the very few tools that might enable that to be done effectively is the system of local single-purpose governmental bodies that I’ve discussed in this post.  As municipal services become intermittent or stop altogether, schools shut down, and infrastructure collapses, people with skin in the game—local residents, that is, who want basic services enough to tax themselves at a modest rate to pay for them—could readily use the old system to pick up the pieces from imploding government bureaucracies. 

Equally, the same process can be used to pursue any number of public goods not currently served at all by existing governmental systems. All that’s needed is for something that used to be an integral part of American community life to be rediscovered and put back to work, before the imperial structures that replaced it finish coming apart.  Mind you, the system of local single-purpose governmental bodies is far from the only element of an older way of community life that could stand to be rediscovered and restored; next week we’ll talk about another.

**********************
Those who are interested in ordering an advance copy of my forthcoming book Not The Future We Ordered: The Psychology of Peak Oil and the Myth of Eternal Progress can now get a 20% discount from the North American distributor. Visit this webpage, place your order, and enter the code NTFWO during checkout; this code will be good until the end of March. This book covers aspects of peak oil and the future of industrial society that I haven’t discussed in detail in these blog posts, focusing on the psychological and emotional impacts of peak oil on societies committed to the civil religion of progress.  Those of my readers who can handle the end of progress as an imminent fact, and are interested in what that implies, may want to have a look at it.

Wednesday, February 06, 2013

The Center Cannot Hold

When William Butler Yeats put the phrase I’ve used as the title for this week’s post into the powerful and prescient verses of “The Second Coming,” he had deeper issues in mind than the crisis of power in a declining American empire.  Still, the image is anything but irrelevant here; the political evolution of the United States over the last century has concentrated so many of the responsibilities of government in Washington DC that the entire American system is beginning to crack under the strain.

This is admittedly not the way you’ll hear the centralization of power in America discussed by those few voices in our national conversation who discuss it at all. On the one hand are the proponents of centralized power, who insist that leaving any decision at all in the hands of state or local authorities is tantamount to handing it over to their bogeyman du jour—whether that amounts to the bedsheet-bedecked Southern crackers who populate the hate speech of the left, say, or the more diverse gallery of stereotypes that plays a similar role on the right.  On the other hand are those who insist that the centralization of power in America is the harbinger of a totalitarian future that will show up George Orwell as an incurable optimist.

I’ve already talked in a number of previous posts about the problems with this sort of thinking, with its flattening out of the complexities of contemporary politics into an opposition between warm fuzzy feelings and cold prickly ones.  To pursue the point a little further, I’d like to offer two unpopular predictions about the future of American government. The first is that the centralization of power in Washington DC has almost certainly reached its peak, and will be reversing in the decades ahead of us. The second is that, although there will inevitably be downsides to that reversal, it will turn out by and large to be an improvement over the system we have today.  These predictions unfold from a common logic; both are consequences of the inevitable failure of overcentralized power.

It’s easy to get caught up in abstractions here, and even easier to fall into circular arguments around the functions of political power that attract most of the attention these days—for example, the power to make war.  I’ll be getting to that latter function a bit further on in this post, but I want to start with a function of government slightly less vexed by misunderstandings. The one I have in mind is education.

In the United States, for a couple of centuries now, the provision of free public education for children has been one of the central functions of government.  Until fairly recently, in most of the country, it operated in a distinctive way.  Under legal frameworks established by each state, local school districts were organized by the local residents, who also voted to tax themselves to pay the costs of building and running schools.  Each district was managed by a school board, elected by the local residents, and had extensive authority over the school district’s operations. 

In most parts of the country, school districts weren’t subsets of city, township, or county governments, or answerable to them; they were single-purpose independent governments on a very small scale, loosely supervised by the state and much more closely watched by the local voters. On the state level, a superintendent of schools or a state board of education, elected by the state’s voters, had a modest staff to carry out the very limited duties of oversight and enforcement assigned by the state legislature.  On the federal level, a bureaucracy not much larger supervised the state boards of education, and conducted the even more limited duties assigned it by Congress.

Two results of that system deserve notice.  First of all, since individual school districts were allowed to set standards, choose textbooks, and manage their own affairs, there was a great deal of diversity in American education. While reading, writing, and ‘rithmetic formed the hard backbone of the school day, and such other standards as history and geography inevitably got a look in as well, what else a given school taught was as varied as local decisions could make it. What the local schools put in the curriculum was up to the school board and, ultimately, to the voters, who could always elect a reform slate to the school board if they didn’t like what was being taught.

Second, the system as a whole gave America a level of public literacy and general education that was second to none in the industrial world, and far surpassed the poor performance of the far more lavishly funded education system the United States has today. In a previous post, I encouraged readers to compare the Lincoln-Douglas debates of 1858 to the debates in our latest presidential contest, and to remember that most of the people who listened attentively to Lincoln and Douglas had what then counted as an eighth-grade education.  The comparison has plenty to say about the degeneration of political thinking in modern America, but it has even more to say about the extent to which the decline in public education has left voters unprepared to get past the soundbite level of thinking.

Those of my readers who want an even more cogent example are encouraged to leaf through a high school textbook from before the Second World War. You’ll find that the reading comprehension, reasoning ability, and mathematical skill expected as a matter of course from ninth-graders in 1930 is hard to find among American college graduates today.  If you have kids of high school age, spend half an hour comparing the old textbook with the one your children are using today.  You might even consider taking the time to work through a few of the assignments in the old textbook yourself.

Plenty of factors have contributed to the dumbing-down process that gave us our current failed system of education, to be sure, but I’d like to suggest that the centralization of power over the nation’s educational system in a few federal bureaucracies played a crucial role.  To see how this works, again, a specific example is useful.  Let’s imagine a child in an elementary school in Lincoln, Nebraska, who is learning how to read.  Ask yourself this:  of all the people concerned with her education, which ones are able to help that individual child tackle the daunting task of figuring out how to transform squiggles of ink into words in her mind?

The list is fairly small, and her teacher and her parents belong at the top of it. Below them are a few others:  a teacher’s aide if her classroom has one, an older sibling, a friend who has already managed to learn the trick.  Everyone else involved is limited to helping these people do their job.  Their support can make that job somewhat easier—for example, by making sure that the child has books, by seeing to it that the classroom is safe and clean, and so on—but they can’t teach reading. Each supporting role has supporting roles of its own; thus the district’s purchasing staff, who keep the school stocked with textbooks, depend on textbook publishers and distributors, and so on.  Still, the further you go from the child trying to figure out that C-A-T means “cat,” the less effect any action has on her learning process.

Now let’s zoom back 1200 miles or so to Washington DC and the federal Department of Education. It’s a smallish federal bureaucracy, which means that in the last year for which I was able to find statistics, 2011, it spent around $71 billion.  Like many other federal bureaucracies, its existence is illegal. I mean that quite literally; the US Constitution assigns the federal government a fairly limited range of functions, along with the powers “necessary and proper” to exercise them; by no stretch of the imagination can managing the nation’s public schools be squeezed into those limits. Only the Supreme Court’s embarrassingly supine response to federal power grabs during most of the twentieth century allows the department to exist at all.

So we have a technically illegal bureaucracy running through $71 billion of the taxpayers’ money in a year, which is arguably not a good start. The question I want to raise, though, is this:  what can the staff of the Department of Education do that will have any positive impact on that child in the classroom in Lincoln, Nebraska?  They can’t teach the child themselves; they can’t fill any of the supporting roles that make it possible for the child to be taught. They’re 1200 miles away, enacting policies that apply to every child in every classroom, irrespective of local conditions, individual needs, or any of the other factors that make teaching a child to read different from stamping out identical zinc bushings.

There are a few—a very few—things that can usefully be done for education at the national level. One of them is to make sure that the child in Lincoln is not denied equal access to education because of her gender, her skin color, or the like. Another is to provide the sort of overall supervision to state boards of education that state boards of education traditionally provided to local school boards. There are a few other things that belong on the same list.  All of them can be described, to go back to a set of ideas I sketched out a couple of weeks ago, as measures to maintain the commons.

Public education is a commons. The costs are borne by the community as a whole, while the benefits go to individuals:  the children who get educated, the parents who don’t have to carry all the costs of their children’s education, the employers who don’t have to carry all the costs of training employees, and so on. Like any other commons, this one is vulnerable to exploitation when it’s not managed intelligently, and like most commons in today’s America, this one has taken quite a bit of abuse lately, with the usual consequences. What makes this situation interesting, in something like the sense of the apocryphal Chinese proverb, is that the way the commons of public education is being managed has become the principal force wrecking the commons.

The problem here is precisely that of centralization. The research for which economist Elinor Ostrom won her Nobel Prize a few years back showed that, by and large, effective management of a commons is a grassroots affair; those who will be most directly affected by the way the commons is managed are also its best managers.  The more distance between the managers and the commons they manage, the more likely failure becomes, because two factors essential to successful management simply aren’t there. The first of them is immediate access to information about how management policies are working, or not working, so that those policies can be adjusted immediately if they go wrong; the second is a personal stake in the outcome, so that the managers have the motivation to recognize when a mistake has been made, rather than allowing the psychology of previous investment to seduce them into pursuing a failed policy right into the ground.

Those two factors don’t function in an overcentralized system.  Politicians and bureaucrats don’t get to see the consequences of their failed decisions up close, and they don’t have any motivation to admit that they were wrong and pursue new policies—quite the contrary, in fact. Consider, for example, the impact of the No Child Left Behind (NCLB) Act, pushed through Congress by bipartisan majorities and signed with much hoopla by George W. Bush in 2002. In the name of accountability—a term that in practice means “finding someone to punish”—the NCLB Act requires mandatory standardized testing at specific grade levels, and requires every year’s scores to be higher than the previous year’s, in every school in the nation. Teachers and schools that fail to accomplish this face draconian penalties.

My readers may be interested to know that next year, by law, every child in America must perform at or above grade level. It’s reminiscent of the imaginary town of Lake Wobegon—“where all the children are above average”—except that this is no joke; what’s left of America’s public education system is being shredded by the efforts of teachers and administrators to save their jobs in a collapsing economy, by teaching to the tests and gaming the system, under the pressure of increasingly unreal mandates from Washington DC. Standardized test scores have risen slightly; meaningful measures of literacy, numeracy, and other real-world skills have continued to move raggedly downward, and you can bet that the only response anybody in Washington is going to be willing to discuss is yet another round of federal mandates, most likely even more punitive and less effective than the current set.

Though I’ve used education as an example, nearly every part of American life is pervaded by the same failed logic of overcentralization. Another example?  Consider the Obama administration’s giddy pursuit of national security via drone attacks.  As currently operated, Predator drones are the ne plus ultra in centralized warfare; each drone attack has to be authorized by Obama himself, the drone is piloted via satellite link from a base in Nevada, and you can apparently sit in the situation room in the White House and watch the whole thing live. Hundreds of people have been blown to kingdom come by these attacks so far, in the name of a war on terror that Obama’s party used to denounce.

Now of course that habit only makes sense if you’re willing to define young children and wedding party attendees as terrorists, which seems a little extreme to me. Leaving that aside, though, there’s a question that needs to be asked:  is it working?  Since none of the areas under attack are any less full of anti-American insurgents than they have been, and the jihadi movement has been able to expand its war dramatically in recent weeks into Libya and Mali, the answer is pretty clearly no.  However technically superlative the drones themselves are, the information that guides them comes via the notoriously static-filled channels of intelligence collection and analysis, and the decision to use them takes place in the even less certain realms of tactics and strategy; nor is it exactly bright, if you want to dissuade people from seeking out Americans and killing them, to go around vaporizing people nearly at random in parts of the world where avenging the murder of a family member is a sacred duty.

In both cases, and plenty of others like them, we have other alternatives, but all of them require the recognition that the best response to a failed policy isn’t a double helping of the same.  That recognition is nowhere in our collective conversation at the moment. It would be useful if more of us were to make an effort to put it there, but there’s another factor in play.  The center really cannot hold, and as it gives way, a great many of today’s political deadlocks will give way with it.

Eliot Wigginton, the teacher in rural Georgia who founded the Foxfire project and thus offered the rest of us an elegant example of what can happen when a purely local educational venture is given the freedom to flower and bear fruit, used to say that the word “learn” is properly spelled F-A-I-L. That’s a reading lesson worth taking to heart, if only because we’re going to have some world-class chances to make use of it in the years ahead.  One of the few good things about really bad policies is that they’re self-limiting; sooner or later, a system that insists on embracing them is going to crash and burn, and once the rubble has stopped bouncing and the smoke clears away, it’s not too hard for the people standing around the crater to recognize that something has gone very wrong.  In that period of clarity, it’s possible for a great many changes to be made, especially if there are clear alternatives available and people advocating for them.

In the great crises that ended each of America’s three previous rounds of anacyclosis—in 1776, in 1861, and in 1933—a great many possibilities that had been unattainable due to the gridlocked politics of the previous generation suddenly came within reach. In those past crises, the United States was an expanding nation, geographically, economically, and in terms of its ability to project power in the world; the crisis immediately ahead bids fair to arrive in the early stages of the ensuing contraction. That difference has important effects on the nature of the changes before us.

Centralized power is costly—in money, in energy, in every other kind of resource.  Decentralized systems are much cheaper.  In the days when the United States was mostly an agrarian society, and the extravagant abundance made possible by a global empire and reckless depletion of natural resources had not yet arrived, the profoundly localized educational system I sketched out earlier was popular because it was affordable.  Even a poor community could count on being able to scrape together the political will and the money to establish a school district, even if that meant a one-room schoolhouse with one teacher taking twenty-odd children a day through grades one through eight. That the level of education that routinely came out of such one-room schoolhouses was measurably better than that provided by today’s multimillion-dollar school budgets is just one more irony in the fire.

On the downside of America’s trajectory, as we descend from empire toward whatever society we can manage to afford within the stringent limits of a troubled biosphere and a planet stripped of most of its nonrenewable resources, local systems of the one-room schoolhouse variety are much more likely to be an option than centralized systems of the sort we have today.  That shift toward the affordably local will have many more consequences; I plan on exploring another of them next week.