Wednesday, January 25, 2017

How Great the Fall Can Be

While I type these words, an old Supertramp CD is playing in the next room. Those of my readers who belong to the same slice of an American generation I do will likely remember the words Roger Hodgson is singing just now, the opening line from “Fool’s Overture”:

“History recalls how great the fall can be...”

It’s an apposite quote for a troubled time.

Over the last year or so, in and among the other issues I’ve tried to discuss in this blog, the US presidential campaign has gotten a certain amount of air time. Some of the conversations that resulted generated a good deal more heat than light, but then that’s been true across the board since Donald Trump overturned the established certainties of American political life and launched himself and the nation on an improbable trajectory toward our current situation. Though the diatribes I fielded from various sides were more than occasionally tiresome, I don’t regret making the election a theme for discussion here, as it offered a close-up view of issues I’ve been covering for years now.

A while back on this blog, for example, I spent more than a year sketching out the process by which civilizations fall and dark ages begin, with an eye toward the next five centuries of North American history—a conversation that turned into my book Dark Age America. Among the historical constants I discussed in the posts and the book was the way that governing elites and their affluent supporters stop adapting their policies to changing political and economic conditions, and demand instead that political and economic conditions should conform to their preferred policies. That’s all over today’s headlines, as the governing elites of the industrial world cower before the furious backlash sparked by their rigid commitment to the failed neoliberal nostrums of global trade and open borders.

Another theme I discussed in the same posts and book was the way that science and culture in a civilization in decline become so closely identified with the interests of the governing elite that the backlash against the failed policies of the elite inevitably becomes a backlash against science and culture as well. We’ve got plenty of that in the headlines, too. According to recent news stories, for example, the Trump administration plans to scrap the National Endowment for the Arts, the National Endowment for the Humanities, and the Corporation for Public Broadcasting, and get rid of all the federal offices that study anthropogenic climate change.

Their termination with extreme prejudice isn’t simply a matter of pruning the federal bureaucracy, though that’s a factor. All these organizations display various forms of the identification of science and culture with elite values just discussed, and their dismantling will be greeted by cheers from a great many people outside the circles of the affluent, who have had more than their fill of patronizing lectures from their self-proclaimed betters in recent years. Will many worthwhile programs be lost, along with a great deal that’s less than worthwhile?  Of course. That’s a normal feature of the twilight years of a civilization.

A couple of years before the sequence of posts on dark age America, for that matter, I did another series on the end of US global hegemony and the rough road down from empire. That sequence also turned into a book, Decline and Fall. In the posts and the book, I pointed out that one of the constants of the history of democratic societies—actual democracies, warts and all, as distinct from the imaginary “real democracy” that exists solely in rhetoric—is a regular cycle of concentration and diffusion of power. The ancient Greek historian Polybius, who worked it out in detail, called it anacyclosis.

A lot can be said about anacyclosis, but the detail that’s relevant just now is the crisis phase, when power has become so gridlocked among competing power centers that it becomes impossible for the system to break out of even the most hopelessly counterproductive policies. That ends, according to Polybius, when a charismatic demagogue gets into power, overturns the existing political order, and sets in motion a general free-for-all in which old alliances shatter and improbable new ones take shape. Does that sound familiar? In a week when union leaders emerged beaming from a meeting with the new president, while Democrats are still stoutly defending the integrity of the CIA, it should.

For that matter, one of the central themes of the sequence of posts and the book was the necessity of stepping back from global commitments that the United States can no longer afford to maintain. That’s happening, too, though it’s being covered up just now by a great deal of Trumped-up bluster about a massive naval expansion. (If we do get a 350-ship navy in the next decade, I’d be willing to bet that a lot of those ships will turn out to be inexpensive corvettes, like the ones the Russians have been using so efficiently as cruise missile platforms on the Caspian Sea.)  European politicians are squawking at top volume about the importance of NATO, which means in practice the continuation of a scheme that allows most European countries to push most of the costs of their own defense onto the United States, but the new administration doesn’t seem to be buying it.

Mind you, I’m far from enthusiastic about the remilitarization of Europe. Outside the brief interval of enforced peace following the Second World War, Europe has been a boiling cauldron of warfare since its modern cultures began to emerge out of the chaos of the post-Roman dark ages. Most of the world’s most devastating wars have been European in origin, and of course it escapes no one’s attention in the rest of the world that it was from Europe that hordes of invaders and colonizers swept over the entire planet from the sixteenth through the nineteenth centuries, as often as not leaving total devastation in their wake. In histories written a thousand years from now, Europeans will have the same sort of reputation that Huns and Mongols have today—and it’s only in the fond fantasies of those who think history has a direction that those days are definitely over.

It can’t be helped, though, for the fact of the matter is that the United States can no longer afford to foot the bill for the defense of other countries. Behind a facade of hallucinatory paper wealth, our nation is effectively bankrupt. The only thing that enables us to pay our debts now is the status of the dollar as the world’s reserve currency—this allows the Treasury to issue debt at a breakneck pace and never have to worry about the cost—and that status is trickling away as one country after another signs bilateral deals to facilitate trading in other currencies. Sooner or later, probably in the next two decades, the United States will be forced to default on its national debt, the way Russia did in 1998.  Before that happens, a great many currently overvalued corporations that support themselves by way of frantic borrowing will have done the same thing by way of the bankruptcy courts, and of course the vast majority of America’s immense consumer debt will have to be discharged the same way.

That means, among other things, that the extravagant lifestyles available to affluent Americans in recent decades will be going away forever in the not too distant future. That’s another point I made in Decline and Fall and the series of posts that became raw material for it. During the era of US global hegemony, the five per cent of our species who lived in the United States disposed of a third of the world’s raw materials and manufactured products and a quarter of its total energy production. That disproportionate share came to us via unbalanced patterns of exchange hardwired into the global economy, and enforced at gunpoint by the military garrisons we keep in more than a hundred countries worldwide. The ballooning US government, corporate, and consumer debt load of recent years was an attempt to keep those imbalances in place even as their basis in geopolitics trickled away. Now the dance is ending and the piper has to be paid.

There’s a certain bleak amusement to be had from the fact that one of the central themes of this blog not that many years back—“Collapse Now and Avoid the Rush”—has already passed its pull date. The rush, in case you haven’t noticed, is already under way. The fraction of US adults of working age who are permanently outside the work force is at an all-time high; so is the fraction of young adults who are living with their parents because they can’t afford to start households of their own. There’s good reason to think that the new administration’s trade and immigration policies may succeed in driving both those figures down, at least for a while, but of course there’ll be a price to be paid for that—and those industries and social classes that have profited most from the policies of the last thirty years, and threw their political and financial weight behind the Clinton campaign, will be first in line to pay it. Vae victis!*

More generally, the broader landscape of ideas this blog has tried to explore since its early days remains what it is. The Earth’s economically accessible reserves of fossil carbon dwindle day by day; with each year that passes, on average, the amount of coal, oil, and natural gas burnt exceeds the amount that’s discovered by a wider margin; the current temporary glut in the oil markets is waning so fast that analysts are predicting the next price spike as soon as 2018. Talk of transitioning away from fossil fuels to renewable energy, on the one hand, or nuclear power on the other, remains talk—I encourage anyone who doubts this to look up the amount of fossil fuels burnt each year over the last two decades and see if they can find a noticeable decrease in global fossil fuel consumption to match the much-ballyhooed buildout of solar and wind power.

The industrial world remains shackled to fossil fuels for most of its energy and all of its transportation fuel, for the simple reason that no other energy source in this end of the known universe provides the abundant, concentrated, and fungible energy supply that’s needed to keep our current lifestyles going. There was always an alternative—deliberately downshifting out of the embarrassing extravagance that counts as normal lifestyles in the industrial world these days, accepting more restricted ways of living in order to leave a better world for our descendants—but not enough people were willing to accept that alternative to make a difference while there was still a chance.

Meanwhile the other jaw of the vise that’s tightening around the future is becoming increasingly visible just now. In the Arctic, freak weather systems have sucked warm air up from lower latitudes and brought the normal process of winter ice formation to a standstill. In the Antarctic, the Larsen C ice shelf, until a few years ago considered stable by most glaciologists, is in the process of loosing an iceberg the size of Delaware into the Antarctic Ocean. I look out my window and see warm rain falling; here in the north central Appalachians, in January, it’s been most of a month since the thermometer last dipped below freezing. The new administration has committed itself to doing nothing about anthropogenic climate change, but then, despite plenty of talk, the Obama administration didn’t do anything about it either.

There’s good reason for that, too. The only way to stop anthropogenic climate change in its tracks is to stop putting greenhouse gases into the atmosphere, and doing that would require the world to ground its airlines, turn its highways over to bicycles and oxcarts, and shut down every other technology that won’t be economically viable if it has to depend on the diffuse intermittent energy available from renewable sources. Does the political will to embrace such changes exist? Since I know of precisely three climate change scientists, out of thousands, who take their own data seriously enough to cut their carbon footprint by giving up air travel, it’s safe to say that the answer is “no.”

So, basically, we’re in for it.

The thing that fascinates me is that this is something I’ve been saying for the whole time this blog has been appearing. The window of opportunity for making a smooth transition to a renewable future slammed shut in the early 1980s, when majorities across the industrial world turned their backs on the previous decade’s promising initiatives toward sustainability, and bought into the triumphalist rhetoric of the Reagan-Thatcher counterrevolution instead. Since then, year after weary year, most of the green movement—with noble exceptions—has been long on talk and short on action.  Excuses for doing nothing and justifications for clinging to lifestyles the planet cannot support have proliferated like rabbits on Viagra, and most of the people who talked about sustainability at all took it for granted that the time to change course was still somewhere conveniently off in the future. That guaranteed that the chance to change course would slide steadily further back into the past.

There was another detail of the post-Seventies sustainability scene that deserves discussion, though, because it’s been displayed with an almost pornographic degree of nakedness in the weeks just past. From the early days of the peak oil movement in the late 1990s on, a remarkably large number of the people who talked eagerly about the looming crisis of our age seemed to think that its consequences would leave them and the people and things they cared about more or less intact. That wasn’t universal by any means; there were always some people who grappled with the hard realities that the end of the fossil fuel age was going to impose on their own lives; but all things considered, there weren’t that many, in comparison to all those who chattered amiably about how comfortable they’d be in their rural doomsteads, lifeboat communities, Transition Towns, et al.

Now, as discussed earlier in this post, we’ve gotten a very modest helping of decline and fall, and people who were enthusiastically discussing the end of the industrial age not that long ago are freaking out six ways from Sunday. If a relatively tame event like the election of an unpopular president can send people into this kind of tailspin, what are they going to do the day their paychecks suddenly turn out to be worth only half as much in terms of goods and services as before—a kind of event that’s already become tolerably common elsewhere, and could quite easily happen in this country as the dollar loses its reserve currency status?

What kinds of meltdowns are we going to get when internet service or modern health care gets priced out of reach, or becomes unavailable at any price?  How are they going to cope if the accelerating crisis of legitimacy in this country causes the federal government to implode, the way the government of the Soviet Union did, and suddenly they’re living under cobbled-together regional governments that don’t have the money to pay for basic services? What sort of reaction are we going to see if the US blunders into a sustained domestic insurgency—suicide bombs going off in public places, firefights between insurgent forces and government troops, death squads from both sides rounding up potential opponents and leaving them in unmarked mass graves—or, heaven help us, all-out civil war?

This is what the decline and fall of a civilization looks like. It’s not about sitting in a cozy earth-sheltered home under a roof loaded with solar panels, living some close approximation of a modern industrial lifestyle, while the rest of the world slides meekly down the chute toward history’s compost bin, leaving you and yours untouched. It’s about political chaos—meaning that you won’t get the leaders you want, and you may not be able to count on the rule of law or even the most basic civil liberties. It’s about economic implosion—meaning that your salary will probably go away, your savings almost certainly won’t keep their value, and if you have gold bars hidden in your home, you’d better hope to Hannah that nobody ever finds out, or it’ll be a race between the local government and the local bandits to see which one gets to tie your family up and torture them to death, starting with the children, until somebody breaks and tells them where your stash is located.

It’s about environmental chaos—meaning that you and the people you care about may have many hungry days ahead as crazy weather messes with the harvests, and it’s by no means certain you won’t die early from some tropical microbe that’s been jarred loose from its native habitat to find a new and tasty home in you. It’s about rapid demographic contraction—meaning that you get to have the experience a lot of people in the Rust Belt have already, of walking past one abandoned house after another and remembering the people who used to live there, until they didn’t any more.

More than anything else, it’s about loss. Things that you value—things you think of as important, meaningful, even necessary—are going to go away forever in the years immediately ahead of us, and there will be nothing you can do about it.  It really is as simple as that. People who live in an age of decline and fall can’t afford to cultivate a sense of entitlement. Unfortunately, for reasons discussed at some length in one of last month’s posts, the notion that the universe is somehow obliged to give people what they think they deserve is very deeply engrained in American popular culture these days. That’s a very unwise notion to believe right now, and as we slide further down the slope, it could very readily become fatal—and no, by the way, I don’t mean that last adjective in a metaphorical sense.

History recalls how great the fall can be, Roger Hodgson sang. In our case, it’s shaping up to be one for the record books—and those of my readers who have worked themselves up to the screaming point about the comparatively mild events we’ve seen so far may want to save some of their breath for the times ahead when it’s going to get much, much worse.
_________________
*In colloquial English: “It sucks to lose.”

Wednesday, January 18, 2017

The Hate that Dare Not Speak its Name

As the United States stumbles toward the last act of its electoral process two days from now, and the new administration prepares to take over the reins of power from its feckless predecessor, the obligatory caterwauling of the losing side has taken on an unfamiliar shrillness. Granted, the behavior of both sides in the last few decades of American elections can be neatly summed up in the words “sore loser”; the Republicans in 1992 and 2008 behaved not one whit better than the Democrats in 1980 and 2000.  I think it’s fair, though, to say that the current example has plunged well past the low-water mark set by those dismal occasions. The question I’d like to discuss here is why that should be.

I think we can all admit that there are plenty of reasons why Americans might reasonably object to the policies and appointments of the incoming president, but the same thing has been true of every other president we’ve had since George Washington’s day. Equally, both of our major parties have long been enthusiastic practitioners of the fine art of shrieking in horror at the other side’s behavior, while blithely excusing the identical behavior on their side.  Had the election last November gone the other way, for example, we can be quite certain that all the people who are ranting about Donald Trump’s appointment of Goldman Sachs employees to various federal offices would be busy explaining how reasonable it was for Hillary Clinton to do exactly the same thing—as of course she would have.

That said, I don’t think reasonable differences of opinion on the one hand, and the ordinary hypocrisy of partisan politics on the other, explain the extraordinary stridency, the venom, and the hatred being flung at the incoming administration by its enemies. There may be many factors involved, to be sure, but I’d like to suggest that one factor in particular plays a massive role here.

To be precise, I think a lot of what we’re seeing is the product of class bigotry.

Some definitions are probably necessary here. We can define bigotry as the act of believing hateful things about all the members of a given category of people, just because they belong to that category. Thus racial bigots believe hateful things about everyone who belongs to races they don’t like, religious bigots do the same thing to every member of the religions they don’t like, and so on through the dismal chronicle of humanity’s collective nastiness.

Defining social class is a little more difficult to do in the abstract, as different societies draw up and enforce their class barriers in different ways. In the United States, though, the matter is made a good deal easier by the lack of a fully elaborated feudal system in our nation’s past, on the one hand, and on the other, the tolerably precise dependency of how much privilege you have in modern American society on how much money you make. Thus we can describe class bigotry in the United States, without too much inaccuracy, as bigotry directed against people who make either significantly more money than the bigot does, or significantly less. (Of course that’s not all there is to social class, not by a long shot, but for our present purposes, as an ostensive definition, it will do.)

Are the poor bigoted against the well-to-do? You bet. Bigotry directed up the social ladder, though, is far more than matched, in volume and nastiness, by bigotry directed down. It’s a source of repeated amusement to me that rich people in this country so often inveigh against the horrors of class warfare. Class warfare is their bread and butter. The ongoing warfare of the rich against the poor, and of the affluent middle and upper middle classes against the working class, creates and maintains the vast disparities of wealth and privilege in contemporary American society. What upsets the rich and the merely affluent about class warfare, of course, is the thought that they might someday be treated the way they treat everyone else.

Until last year, if you wanted to experience the class bigotry that’s so common among the affluent classes in today’s America, you pretty much had to be a member of those affluent classes, or at least good enough at passing to be present at the social events where their bigotry saw free play. Since Donald Trump broke out of the Republican pack early last year, though, that hindrance has gone by the boards. Those who want to observe American class bigotry at its choicest need only listen to what a great many of the public voices of the well-to-do are saying about the people whose votes and enthusiasm have sent Trump to the White House.

You see, that’s a massive part of the reason a Trump presidency is so unacceptable to so many affluent Americans:  his candidacy, unlike those of all his rivals, was primarily backed by “those people.”

It’s probably necessary to clarify just who “those people” are. During the election, and even more so afterwards, the mainstream media here in the United States have seemingly been unable to utter the words “working class” without sticking the labels “white” in front and “men” behind. The resulting rhetoric seems to be claiming that the relatively small fraction of the American voting public that’s white, male, and working class somehow managed to hand the election to Donald Trump all by themselves, despite the united efforts of everyone else.

Of course that’s not what happened. A huge majority of white working class women also voted for Trump, for example.  So, according to exit polls, did about a third of Hispanic men and about a quarter of Hispanic women; so did varying fractions of other American minority voting blocs, with African-American voters (the least likely to vote for Trump) still putting something like fourteen per cent in his column. Add it all up, and you’ll find that the majority of people who voted for Trump weren’t white working class men at all—and we don’t even need to talk about the huge number of registered voters of all races and genders who usually turn out for Democratic candidates, but stayed home in disgust this year, and thus deprived Clinton of the turnout that could have given her the victory.

Somehow, though, pundits and activists who fly to their keyboards at a moment’s notice to denounce the erasure of women and people of color in any other context are eagerly cooperating in the erasure of women and people of color in this one case. What’s more, that same erasure went on continuously all through the campaign. Those of my readers who followed the media coverage of the race last year will recall confident proclamations that women wouldn’t vote for Trump because his words and actions had given offense to feminists, that Hispanics (or people of color in general) wouldn’t vote for Trump because social-justice activists denounced his attitudes toward illegal immigrants from Mexico as racist, and so on. The media took these proclamations as simple statements of fact—and of course that was one of the reasons media pundits were blindsided by Trump’s victory.

The facts of the matter are that a great many American women don’t happen to agree with feminists, nor do all people of color agree with the social-justice activists who claim to speak in their name. For that matter, may I point out to my fellow inhabitants of Gringostan that the terms “Hispanic” and “Mexican-American” are not synonyms? Americans of Hispanic descent trace their ancestry to many different nations of origin, each of which has its own distinctive culture and history, and they don’t form a single monolithic electoral bloc. (The Cuban-American community in Florida, to cite only one of the more obvious examples, very often votes Republican and played a significant role in giving that electoral vote-rich state to Trump.)

Behind the media-manufactured facade of white working class men as the cackling villains who gave the country to Donald Trump, in other words, lies a reality far more in keeping with the complexities of American electoral politics: a ramshackle coalition of many different voting blocs and interest groups, each with its own assortment of reasons for voting for a candidate feared and despised by the US political establishment and the mainstream media.  That coalition included a very large majority of the US working class in general, and while white working class voters of both genders were disproportionately more likely to have voted for Trump than their nonwhite equivalents, it wasn’t simply a matter of whiteness, or for that matter maleness.

It was, however, to a very great extent a matter of social class. This isn’t just because so large a fraction of working class voters generally backed Trump; it’s also because Trump saw this from the beginning, and aimed his campaign squarely at the working class vote. His signature red ball cap was part of that—can you imagine Hillary Clinton wearing so proletarian a garment without absurdity?—but, as I pointed out a year ago, so was his deliberate strategy of saying (and tweeting) things that would get the liberal punditocracy to denounce him. The tones of sneering contempt and condescension they directed at him were all too familiar to his working class audiences, who have been treated to the same tones unceasingly by their soi-disant betters for decades now.

Much of the pushback against Trump’s impending presidency, in turn, is heavily larded with that same sneering contempt and condescension—the unending claims, for example, that the only reason people could possibly have chosen to vote for Trump was because they were racist misogynistic morons, and the like. (These days, terms such as “racist” and “misogynistic,” in the mouths of the affluent, are as often as not class-based insults rather than objective descriptions of attitudes.) The question I’d like to raise at this point, though, is why the affluent don’t seem to be able to bring themselves to come right out and denounce Trump as the candidate of the filthy rabble. Why must they borrow the rhetoric of identity politics and twist it (and themselves) into pretzel shapes instead?

There, dear reader, hangs a tale.

In the aftermath of the social convulsions of the 1960s, the wealthy elite occupying the core positions of power in the United States offered a tacit bargain to a variety of movements for social change.  Those individuals and groups who were willing to give up the struggle to change the system, and settled instead for a slightly improved place within it, suddenly started to receive corporate and government funding, and carefully vetted leaders from within the movements in question were brought into elite circles as junior partners. Those individuals and groups who refused these blandishments were marginalized, generally with the help of their more compliant peers.

If you ever wondered, for example, why environmental groups such as the Sierra Club and Friends of the Earth changed so quickly from scruffy fire-breathing activists to slickly groomed and well-funded corporate enablers, well, now you know. Equally, that’s why mainstream feminist organizations by and large stopped worrying about the concerns of the majority of women and fixated instead on “breaking the glass ceiling”—that is to say, giving women who already belong to the privileged classes access to more privilege than they have already. The core demand placed on former radicals who wanted to cash in on the offer, though, was that they drop their demands for economic justice—and American society being what it is, that meant that they had to stop talking about class issues.

The interesting thing is that a good many American radicals were already willing to meet them halfway on that. The New Left of the 1960s, like the old Left of the between-the-wars era, was mostly Marxist in its theoretical underpinnings, and so was hamstrung by the mismatch between Marxist theory and one of the enduring realities of American politics. According to Marxist theory, socialist revolution is led by the radicalized intelligentsia, but it gets the muscle it needs to overthrow the capitalist system from the working classes. This is the rock on which wave after wave of Marxist activism has broken and gone streaming back out to sea, because the American working classes are serenely uninterested in taking up the world-historical role that Marxist theory assigns to them. All they want is plenty of full time jobs at a living wage.  Give them that, and revolutionary activists can bellow themselves hoarse without getting the least flicker of interest out of them.

Every so often, the affluent classes lose track of this, and try to force the working classes to put up with extensive joblessness and low pay, so that affluent Americans can pocket the proceeds. This never ends well.  After an interval, the working classes pick up whatever implement is handy—Andrew Jackson, the Grange, the Populist movement, the New Deal, Donald Trump—and beat the affluent classes about the head and shoulders with it until the latter finally get a clue. This might seem promising for Marxist revolutionaries, but it isn’t, because the Marxist revolutionaries inevitably rush in saying, in effect, “No, no, you shouldn’t settle for plenty of full time jobs at a living wage, you should die by the tens of thousands in an orgy of revolutionary violence so that we can seize power in your name.” My readers are welcome to imagine the response of the American working class to this sort of rhetoric.

The New Left, like the other American Marxist movements before its time, thus had a bruising face-first collision with cognitive dissonance: its supposedly infallible theory said one thing, but the facts refused to play along and said something very different. For much of the Sixties and Seventies, New Left theoreticians tried to cope with this by coming up with increasingly Byzantine redefinitions of “working class” that excluded the actual working class, so that they could continue to believe in the inevitability and imminence of the proletarian revolution Marx promised them. Around the time that this effort finally petered out into absurdity, it was replaced by the core concept of the identity politics currently central to the American left: the conviction that the only divisions in American society that matter are those that have some basis in biology.

Skin color, gender, ethnicity, sexual orientation, disability—these are the divisions that the American left likes to talk about these days, to the exclusion of all other social divisions, and especially to the exclusion of social class.  Since the left has dominated public discourse in the United States for many decades now, those have become the divisions that the American right talks about, too. (Please note, by the way, the last four words in the paragraph above: “some basis in biology.” I’m not saying that these categories are purely biological in nature; every one of them is defined in practice by a galaxy of cultural constructs and presuppositions, and the link to biology is an ostensive category marker rather than a definition. I insert this caveat because I’ve noticed that a great many people go out of their way to misunderstand the point I’m trying to make here.)

Are the divisions listed above important when it comes to discriminatory treatment in America today? Of course they are—but social class is also important. It’s by way of the erasure of social class as a major factor in American injustice that we wind up in the absurd situation in which a woman of color who makes a quarter million dollars a year plus benefits as a New York stockbroker can claim to be oppressed by a white guy in Indiana who’s working three part time jobs at minimum wage with no benefits in a desperate effort to keep his kids fed, when the political candidates that she supports and the economic policies from which she profits are largely responsible for his plight.

In politics as in physics, every action produces an equal and opposite reaction, and so absurdities of the sort just described have kindled the inevitable blowback. The Alt-Right scene that’s attracted so much belated attention from politicians and pundits over the last year is in large part a straightforward reaction to the identity politics of the left. Without too much inaccuracy, the Alt-Right can be seen as a network of young white men who’ve noticed that every other identity group in the country is being encouraged to band together to further its own interests at their expense, and responded by saying, “Okay, we can play that game too.” So far, you’ve got to admit, they’ve played it with verve.

That said, on the off chance that any devout worshippers of the great god Kek happen to be within earshot, I have a bit of advice that I hope will prove helpful. The next time you want to goad affluent American liberals into an all-out, fist-pounding, saliva-spraying Donald Duck meltdown, you don’t need the Jew-baiting, the misogyny, the racial slurs, and the rest of it.  All you have to do is call them on their class privilege. You’ll want to have the popcorn popped, buttered, and salted first, though, because if my experience is anything to go by, you’ll be enjoying a world-class hissy fit in seconds.

I’d also like to offer the rest of my readers another bit of advice that, again, I hope will prove helpful. As Donald Trump becomes the forty-fifth president of the United States and begins to push the agenda that got him into the White House, it may be useful to have a convenient way to sort through the mix of signals and noise from the opposition. When you hear people raising reasoned objections to Trump’s policies and appointments, odds are that you’re listening to the sort of thoughtful dissent that’s essential to any semblance of democracy, and it may be worth taking seriously. When you hear people criticizing Trump and his appointees for doing the same thing his rivals would have done, or his predecessors did, odds are that you’re getting the normal hypocrisy of partisan politics, and you can roll your eyes and stroll on.

But when you hear people shrieking that Donald Trump is the illegitimate result of a one-night stand between Ming the Merciless and Cruella de Vil, that he cackles in Russian while barbecuing babies on a bonfire, that everyone who voted for him must be a card-carrying Nazi who hates the human race, or whatever other bit of over-the-top hate speech happens to be fashionable among the chattering classes at the moment—why, then, dear reader, you’re hearing a phenomenon as omnipresent and unmentionable in today’s America as sex was in Victorian England. You’re hearing the voice of class bigotry: the hate that dare not speak its name.

Wednesday, January 11, 2017

The Embarrassments of Chronocentrism

It's a curious thing, this attempt of mine to make sense of the future by understanding what’s happened in the past. One of the most curious things about it, at least to me, is the passion with which so many people insist that this isn’t an option at all. In any other context, “Well, what happened the last time someone tried that?” is one of the first and most obviously necessary questions to ask and answer—but heaven help you if you try to raise so straightforward a question about the political, economic, and social phenomena of the present day.

In previous posts here we’ve talked about thoughtstoppers of the “But it’s different this time!” variety, and some of the other means people these days use to protect themselves against the risk of learning anything useful from the hard-earned lessons of the past. This week I want to explore another, subtler method of doing the same thing. As far as I’ve been able to tell, it’s mostly an issue here in the United States, but here it’s played a remarkably pervasive role in convincing people that the only way to open a door marked PULL is to push on it long and hard enough.

It’s going to take a bit of a roundabout journey to make sense of the phenomenon I have in mind, so I’ll have to ask my readers’ forbearance for what will seem at first like several sudden changes of subject.

One of the questions I field tolerably often, when I discuss the societies that will rise after modern industrial civilization finishes its trajectory into history’s compost heap, is whether I think that consciousness evolves. I admit that until fairly recently, I was pretty much at a loss to know how to respond. It rarely took long to find out that the questioner wasn’t thinking about the intriguing theory Julian Jaynes raised in The Origin of Consciousness in the Breakdown of the Bicameral Mind, the Jungian conception Erich Neumann proposed in The Origins and History of Consciousness, or anything of the same kind. Nor, it turned out, was the question usually based on the really rather weird reinterpretations of evolution common in today’s pop-spirituality scene. Rather, it was political.

It took me a certain amount of research, and some puzzled emails to friends more familiar with current left-wing political jargon than I am, to figure out what was behind these questions. Among a good-sized fraction of American leftist circles these days, it turns out it’s become a standard credo that what drives the kind of social changes supported by the left—the abolition of slavery and segregation, the extension of equal (or more than equal) rights to an assortment of disadvantaged groups, and so on—is an ongoing evolution of consciousness, in which people wake up to the fact that things they’ve considered normal and harmless are actually intolerable injustices, and so decide to stop.

Those of my readers who followed the late US presidential election may remember Hillary Clinton’s furious response to a heckler at one of her few speaking gigs:  “We aren’t going back. We’re going forward.” Underlying that outburst is the belief system I’ve just sketched out: the claim that history has a direction, that it moves in a linear fashion from worse to better, and that any given political choice—for example, which of the two most detested people in American public life is going to become the nominal head of a nation in freefall ten days from now—not only can but must be flattened out into a rigidly binary decision between “forward” and “back.”

There’s no shortage of hard questions that could be brought to bear on that way of thinking about history, and we’ll get to a few of them a little later on, but let’s start with the simplest one: does history actually show any such linear movement in terms of social change?

It so happens that I’ve recently finished a round of research bearing on exactly that question, though I wasn’t thinking of politics or the evolution of consciousness when I launched into it. Over the last few years I’ve been working on a sprawling fiction project, a seven-volume epic fantasy titled The Weird of Hali, which takes the horror fantasy of H.P. Lovecraft and stands it on its head, embracing the point of view of the tentacled horrors and multiracial cultists Lovecraft liked to use as images of dread. (The first volume, Innsmouth, is in print in a fine edition and will be out in trade paper this spring, and the second, Kingsport, is available for preorder and will be published later this year.)

One of Lovecraft’s few memorable human characters, the intrepid dream-explorer Randolph Carter, has an important role in the fourth book of my series. According to Lovecraft, Carter was a Boston writer and esthete of the 1920s from a well-to-do family, who had no interest in women but a whole series of intimate (and sometimes live-in) friendships with other men, and decidedly outré tastes in interior decoration—well, I could go on. The short version is that he’s very nearly the perfect archetype of an upper-class gay man of his generation. (Whether Lovecraft intended this is a very interesting question that his biographers don’t really answer.) With an eye toward getting a good working sense of Carter’s background, I talked to a couple of gay friends, who pointed me to some friends of theirs, and that was how I ended up reading George Chauncey’s magisterial Gay New York: Gender, Urban Culture, and the Making of the Gay Male World, 1890-1940.

What Chauncey documents, in great detail and with a wealth of citations from contemporary sources, is that gay men in America had substantially more freedom during the first three decades of the twentieth century than they did for a very long time thereafter. While homosexuality was illegal, the laws against it had more or less the same impact on people’s behavior that the laws against smoking marijuana had in the last few decades of the twentieth century—lots of people did it, that is, and now and then a few of them got busted. Between the beginning of the century and the coming of the Great Depression, in fact, most large American cities had a substantial gay community with its own bars, restaurants, periodicals, entertainment venues, and social events, right out there in public.

Nor did the gay male culture of early twentieth century America conform to current ideas about sexual identity, or the relationship between gay culture and social class, or—well, pretty much anything else, really. A very large number of men who had sex with other men didn’t see that as central to their identity—there were indeed men who embraced what we’d now call a gay identity, but that wasn’t the only game in town by a long shot. What’s more, sex between men was by and large more widely accepted in the working classes than it was further up the social ladder. In turn-of-the-century New York, it was the working class gay men who flaunted the camp mannerisms and the gaudy clothing; upper- and middle-class gay men such as Randolph Carter had to be much more discreet.

So what happened? Did some kind of vast right-wing conspiracy shove the ebullient gay male culture of the early twentieth century into the closet? No, and that’s one of the more elegant ironies of this entire chapter of American cultural history. The crusade against the “lavender menace” (I’m not making that phrase up, by the way) was one of the pet causes of the same Progressive movement responsible for winning women the right to vote and breaking up the fabulously corrupt machine politics of late nineteenth century America. Unpalatable as that fact is in today’s political terms, gay men and lesbians weren’t forced into the closet in the 1930s by the right.  They were driven there by the left.

This is the same Progressive movement, remember, that made Prohibition a central goal of its political agenda, and responded to the total failure of the Prohibition project by refusing to learn the lessons of failure and redirecting its attentions toward banning less popular drugs such as marijuana. That movement was also, by the way, heavily intertwined with what we now call Christian fundamentalism. Some of my readers may have heard of William Jennings Bryan, the supreme orator of the radical left in late nineteenth century America, the man whose “Cross of Gold” speech became the great rallying cry of opposition to the Republican corporate elite in the decades before the First World War.  He was also the prosecuting attorney in the equally famous Scopes Monkey Trial, responsible for pressing charges against a schoolteacher who had dared to affirm in public Darwin’s theory of evolution.

The usual response of people on today’s left to such historical details—well, other than denying or erasing them, which is of course quite common—is to insist that this proves that Bryan et al. were really right-wingers. Not so; again, we’re talking about people who put their political careers on the line to give women the vote and weaken (however temporarily) the grip of corporate money on the US political system. The politics of the Progressive era didn’t assign the same issues to the categories “left” and “right” that today’s politics do, and so all sides in the sprawling political free-for-all of that time embraced some issues that currently belong to the left, others that belong to the right, and still others that have dropped entirely out of the political conversation since then.

I could go on, but let’s veer off in another direction instead. Here’s a question for those of my readers who think they’re well acquainted with American history. The Fifteenth Amendment, which barred the states from denying adult male citizens the right to vote on account of race, was ratified in 1870. Before then, did black men have the right to vote anywhere in the US?

Most people assume as a matter of course that the answer must be no—and they’re wrong. Until the passage of the Fifteenth Amendment, the question of who did and didn’t have voting rights was a matter for each state to decide for itself. Fourteen states either allowed free African-American men to vote in Colonial times or granted them that right when first organized. Later on, ten of them—Delaware in 1792, Kentucky in 1799, Maryland in 1801, New Jersey in 1807, Connecticut in 1814, New York in 1821, Rhode Island in 1822, Tennessee in 1834, North Carolina in 1835, and Pennsylvania in 1838—either denied free black men the vote or raised legal barriers that effectively kept them from voting. Four other states—Massachusetts, Vermont, New Hampshire, and Maine—gave free black men the right to vote in Colonial times and maintained that right until the Fifteenth Amendment made the whole issue moot. Those readers interested in the details can find them in The African American Electorate: A Statistical History by Hanes Walton Jr. et al., which devotes chapter 7 to the subject.

So what happened? Was there a vast right-wing conspiracy to deprive black men of the right to vote? No, and once again we’re deep in irony. The political movements that stripped free American men of African descent of their right to vote were the two great pushes for popular democracy in the early United States, the Democratic-Republican party under Thomas Jefferson and the Democratic party under Andrew Jackson. Read any detailed history of the nineteenth century United States and you’ll learn that before these two movements went to work, each state set a certain minimum level of personal wealth that citizens had to have in order to vote. Both movements forced through reforms in the voting laws, one state at a time, to remove these property requirements and give the right to vote to every adult white man in the state. What you won’t learn, unless you do some serious research, is that in many states these same reforms also stripped adult black men of their right to vote.

Try to explain this to most people on the leftward end of today’s American political spectrum, and you’ll likely end up with a world-class meltdown, because the Jeffersonian Democratic-Republicans and the Jacksonian Democrats, like the Progressive movement, embraced some causes that today’s leftists consider progressive, and others that they consider regressive. The notion that social change is driven by some sort of linear evolution of consciousness, in which people necessarily become “more conscious” (that is to say, conform more closely to the ideology of the contemporary American left) over time, has no room for gay-bashing Progressives and Jacksonian Democrats whose concept of democracy included a strict color bar. The difficulty, of course, is that history is full of Progressives, Jacksonian Democrats, and countless other political movements that can’t be shoehorned into the Procrustean bed of today’s political ideologies.

I could add other examples—how many people remember, for example, that environmental protection was a cause of the far right until the 1960s?—but I think the point has been made. People in the past didn’t divide up the political causes of their time into the same categories left-wing activists like to use today. It’s practically de rigueur for left-wing activists these days to insist that people in the past ought to have seen things in today’s terms rather than the terms of their own time, but that insistence just displays a bad case of chronocentrism.

Chronocentrism? Why, yes. Most people nowadays are familiar with ethnocentrism, the insistence by members of one ethnic group that the social customs, esthetic notions, moral standards, and so on of that ethnic group are universally applicable, and that anybody who departs from those things is just plain wrong. Chronocentrism is the parallel insistence, on the part of people living in one historical period, that the social customs, esthetic notions, moral standards, and so on of that period are universally applicable, and that people in any other historical period who had different social customs, esthetic notions, moral standards, and so on should have known better.

Chronocentrism is pandemic in our time. Historians have a concept called “Whig history”; it got that moniker from a long line of English historians who belonged to the Whig (i.e., Liberal) Party, and who wrote as though all of human history was to be judged according to how well it measured up to the current Liberal Party platform. Such exercises aren’t limited to politics, though; my first exposure to the concept of Whig history came via university courses in the history of science. When I took those courses—this was twenty-five years ago, mind you—historians of science were sharply divided between a majority that judged every scientific activity in every past society on the basis of how well it conformed to our ideas of science, and a minority that tried to point out just how difficult this habit made the already challenging task of understanding the ideas of past thinkers.

To my mind, the minority view in those debates was correct, but at least some of its defenders missed a crucial point. Whig history doesn’t exist to foster understanding of the past. It exists to justify and support an ideological stance of the present. If the entire history of science is rewritten so that it’s all about how the currently accepted set of scientific theories about the universe rose to their present privileged status, that act of revision makes currently accepted theories look like the inevitable outcome of centuries of progress, rather than jerry-rigged temporary compromises kluged together to cover a mass of recalcitrant data—which, science being what it is, is normally a more accurate description.

In exactly the same sense, the claim that a certain set of social changes in the United States and other industrial countries in recent years results from the “evolution of consciousness,” unfolding on a one-way street from the ignorance of the past to a supposedly enlightened future, doesn’t help make sense of the complicated history of social change. It was never supposed to do that. Rather, it’s an attempt to backstop the legitimacy of a specific set of political agendas here and now by making them look like the inevitable outcome of the march of history. The erasure of the bits of inconvenient history I cited earlier in this essay is part and parcel of that attempt; like all linear schemes of historical change, it falsifies the past and glorifies the future in order to prop up an agenda in the present.

It needs to be remembered in this context that the word “evolution” does not mean “progress.” Evolution is adaptation to changing circumstances, and that’s all it is. When people throw around the phrases “more evolved” and “less evolved,” they’re talking nonsense, or at best engaging in a pseudoscientific way of saying “I like this” and “I don’t like that.” In biology, every organism—you, me, koalas, humpback whales, giant sequoias, pond scum, and all the rest—is equally the product of a few billion years of adaptation to the wildly changing conditions of an unstable planet, with genetic variation shoveling in diversity from one side and natural selection picking and choosing on the other. The habit of using the word “evolution” to mean “progress” is pervasive, and it’s pushed hard by the faith in progress that serves as an ersatz religion in our time, but it’s still wrong.

It’s entirely possible, in fact, to talk about the evolution of political opinion (which is of course what “consciousness” amounts to here) in strictly Darwinian terms. In every society, at every point in its history, groups of people are striving to improve the conditions of their lives by some combination of competition and cooperation with other groups. The causes, issues, and rallying cries that each group uses will vary from time to time as conditions change, and so will the relationships between groups—thus it was to the advantage of affluent liberals of the Progressive era to destroy the thriving gay culture of urban America, just as it was to the advantage of affluent liberals of the late twentieth century to turn around and support the struggle of gay people for civil rights. That’s the way evolution works in the real world, after all.

This sort of thinking doesn’t offer the kind of ideological support that activists of various kinds are used to extracting from various linear schemes of history. On the other hand, that difficulty is more than balanced by a significant benefit, which is that linear predictions inevitably fail, and so by and large do movements based on them. The people who agreed enthusiastically with Hillary Clinton’s insistence that “we aren’t going back, we’re going forward” are still trying to cope with the hard fact that their political agenda will be wandering in the wilderness for at least the next four years. Those who convince themselves that their cause is the wave of the future are constantly being surprised by the embarrassing discovery that waves inevitably break and roll back out to sea. It’s those who remember that history plays no favorites who have a chance of accomplishing their goals.

Wednesday, January 04, 2017

How Not To Write Like An Archdruid

Among the occasional amusements I get from writing these weekly essays are earnest comments from people who want to correct my writing style. I field one of them every month or so, and the latest example came in over the electronic transom in response to last week’s post. Like most of its predecessors, it insisted that there’s only one correct way to write for the internet, trotted out a set of canned rules that supposedly encapsulate this one correct way, and assumed as a matter of course that the only reason I didn’t follow those rules is that I’d somehow managed not to hear about them yet.

The latter point is the one I find most amusing, and also most curious. Maybe I’m naive, but it’s always seemed to me that if I ran across someone who was writing in a style I found unusual, the first thing I’d want to do would be to ask the author why he or she had chosen that stylistic option—because, you know, any writer who knows the first thing about his or her craft chooses the style he or she finds appropriate for any given writing project. I field such questions once in a blue moon, and I’m happy to answer them, because I do indeed have reasons for writing these essays in the style I’ve chosen for them. Yet it’s much more common to get the sort of style policing I’ve referenced above—and when that happens, you can bet your bottom dollar that what’s being pushed is the kind of stilted, choppy, dumbed-down journalistic prose that I’ve deliberately chosen not to write.

I’m going to devote a post to all this, partly because I write what I want to write about, the way I want to write about it, for the benefit of those who enjoy reading it; those who don’t are encouraged to remember that there are thousands of other blogs out there that they’re welcome to read instead. Partly, though, the occasional thudding of what Giordano Bruno called “the battering rams of infants, the catapults of error, the bombards of the inept, and the lightning flashes, thunder, and great tempests of the ignorant”—now there was a man who could write!—raises issues that are central to the occasional series of essays on education I’ve been posting here.

Accepting other people’s advice on writing is a risky business—and yes, that applies to this blog post as well as any other source of such advice. It’s by no means always true that “those who can, do; those who can’t, teach,” but when we’re talking about unsolicited writing advice on the internet, that’s the way to bet.  Thus it’s not enough for some wannabe instructor to tell you “I’ve taught lots of people” (taught them what?) or “I’ve helped lots of people” (to do what?)—the question you need to ask is what the instructor himself or herself has written and where it’s been published.

The second of those matters as much as the first. It so happens, for example, that a great many of the professors who offer writing courses at American universities publish almost exclusively in the sort of little literary quarterlies that have a circulation in three figures and pay contributors in spare copies. (It’s not coincidental that these days, most of the little literary quarterlies in question are published by university English departments.) There’s nothing at all wrong with that, if you dream of writing the sort of stories, essays, and poetry that populate little literary quarterlies.

If you want to write something else, though, it’s worth knowing that these little quarterlies have their own idiosyncratic literary culture. There was a time when the little magazines were one of the standard stepping stones to a successful writing career, but that time went whistling down the wind decades ago. Nowadays, the little magazines have gone one way, the rest of the publishing world has gone another, and many of the habits the little magazines encourage (or even require) in their writers will guarantee prompt and emphatic rejection slips from most other writing venues.

Different kinds of writing, in other words, have their own literary cultures and stylistic customs. In some cases, those can be roughly systematized in the form of rules. That being the case, is there actually some set of rules that are followed by everything good on the internet?

Er, that would be no. I’m by no means a fan of the internet, all things considered—I publish my essays here because most of the older venues I’d prefer no longer exist—but it does have its virtues, and one of them is the remarkable diversity of style to be found there. If you like stilted, choppy, dumbed-down journalistic prose of the sort my commenter wanted to push on me, why, yes, you can find plenty of it online. You can also find lengthy, well-argued essays written in complex and ornate prose, stream-of-consciousness pieces that out-beat the Beat generation, experimental writing of any number of kinds, and more. Sturgeon’s Law (“90% of everything is crap”) applies here as it does to every other human creation, but there are gems to be found online that range across the spectrum of literary forms and styles. No one set of rules applies.

Thus we can dismiss the antics of the style police out of hand. Let’s go deeper, though. If there’s no one set of rules that internet writing ought to follow, are there different rules for each kind of writing? Or are rules themselves the problem? This is where things get interesting.

One of the consistent mental hiccups of American popular culture is the notion that every spectrum consists solely of its two extremes, with no middle ground permitted, and that bit of paralogic gets applied to writing at least as often as to anything else. Thus you have, on the one hand, the claim that the only way to write well is to figure out what the rules are and follow them with maniacal rigidity; on the other, the claim that the only way to write well is to throw all rules into the trash can and let your inner genius, should you happen to have one of those on hand, spew forth the contents of your consciousness all anyhow onto the page. Partisans of those two viewpoints snipe at one another from behind rhetorical sandbags, and neither one of them ever manages more than a partial victory, because neither approach is particularly useful when it comes to the actual practice of writing.

By and large, when people write according to a rigidly applied set of rules—any rigidly applied set of rules—the result is predictable, formulaic, and trite, and therefore boring. By and large, when people write without paying any attention to rules at all, the result is vague, shapeless, and maundering, and therefore boring. Is there a third option? You bet, and it starts by taking the abandoned middle ground: in this case, learning an appropriate set of rules, and using them as a starting point, but departing from them wherever doing so will improve the piece you’re writing.

The set of rules I recommend, by the way, isn’t meant to produce the sort of flat PowerPoint verbiage my commenter insists on. It’s meant to produce good readable English prose, and the source of guidance I recommend to those who are interested in such things is Strunk and White’s deservedly famous The Elements of Style. Those of my readers who haven’t worked with it, who want to improve their writing, and who’ve glanced over what I’ve published and decided that they might be able to learn something useful from me, could do worse than to read it and apply it to their prose.

A note of some importance belongs here, though. There’s a thing called writer’s block, and it happens when you try to edit while you’re writing. I’ve read, though I’ve misplaced the reference, that neurologists have found that the part of the brain that edits and the part of the brain that creates are not only different, they conflict with one another.  If you try to use both of them at once, your brain freezes up in a fairly close neurological equivalent of the Blue Screen of Death, and you stop being able to write at all. That’s writer’s block. To avoid it, NEVER EDIT WHILE YOU’RE WRITING. 

I mean that quite literally. Don’t even look at the screen if you can’t resist the temptation to second-guess the writing process. If you have to, turn the screen off, so you can’t even see what you’re writing. Eventually, with practice, you’ll learn to move smoothly back and forth between creative mode and editing mode, but if you don’t have a lot of experience writing, leave that for later. For now, just blurt it all out without a second thought, with all its misspellings and garbled grammar intact.

Then, after at least a few hours—or better yet, after a day or so—go back over the mess, cutting, pasting, adding, and deleting as needed, until you’ve turned it into nice clean text that says what you want it to say. Yes, we used to do that back before computers; the process is called “cut and paste” because it was done back then with a pair of scissors and a pot of paste, the kind with a little spatula mounted on the inside of the lid to help you spread the stuff. You’d cut out the good slices of raw prose and stick them onto a convenient sheet of paper, interspersed with handwritten or freshly typed additions, and then sit down and type your clean copy from the pasted-up mess thus produced. Now you know how to do it when the internet finally dries up and blows away. (You’re welcome.)

In the same way, you don’t try to write while looking up rules in Strunk & White. Write your piece, set it aside for a while, and then go over it with your well-worn copy of Strunk & White in hand, noting every place you broke one of the rules of style the book suggests you should follow. The first few times, as a learning exercise, you might consider rewriting the whole thing in accordance with those rules—but only the first few times. After that, make your own judgment call: is this a place where you should follow the rules, or is this a place where they need to be bent, broken, or trampled into the dust? Only you, dear reader-turned-writer, can decide.

A second important note deserves to be inserted at this point, though. The contemporary US public school system can be described without too much inaccuracy as a vast mechanism for convincing children that they can’t write. Rigid rules imposed for the convenience of educators rather than the good of the students, part of the industrial mass-production ethos that pervades public schools in this country, leave a great many graduates so bullied, beaten, and bewildered by bad pedagogy that the thought of writing something for anybody else to read makes them turn gray with fear. It’s almost as bad as the terror of public speaking the public schools also go out of their way to inflict, and it plays a comparable role in crippling people’s capacity to communicate outside their narrow circles of friends.

If you suffer from that sort of educational hangover, dear reader, draw a deep breath and relax. The bad grades and nasty little comments in red ink you got from Mrs. Melba McNitpick, your high school English teacher, are no reflection of your actual capacities as a writer. If you can talk, you can write—it’s the same language, after all. For that matter, even if you can’t talk, you may be able to write—there’s a fair number of people out there who are nonverbal for one reason or another, and can still make a keyboard dance.

The reason I mention this here is that the thought of making an independent judgment about when to follow the rules and when to break them fills a great many survivors of American public schools with dread. In far too many cases, students are either expected to follow the rules with mindless obedience and given bad grades if they fail to do so, or given no rules at all and then expected to conform to unstated expectations they have no way to figure out, and either of these forms of bad pedagogy leaves scars. Again, readers who are in this situation should draw a deep breath and relax; having left Mrs. McNitpick’s class, you’re not subject to her opinions any longer, and should ignore them utterly.

So how do you decide where to follow the rules and where to fold, spindle, and mutilate them? That’s where we walk through the walls and into the fire, because what guides you in your decisions regarding the rules of English prose is the factor of literary taste.

Rules can be taught, but taste can only be learned. Does that sound like a paradox? Au contraire, it simply makes the point that only you can learn, refine, and ripen your literary taste—nobody else can do it for you, or even help you to any significant extent—and your sense of taste is therefore going to be irreducibly personal. When it comes to taste, you aren’t answerable to Mrs. McNitpick, to me, to random prose trolls on the internet, or to anyone else. What’s more, you develop your taste for prose the same way you develop your taste for food: by trying lots of different things, figuring out what you like, and paying close attention to what you like, why you like it, and what differentiates it from the things you don’t like as much.

This is applicable, by the way, to every kind of writing, including those kinds at which the snobs among us turn up their well-sharpened noses. I don’t happen to be a fan of the kind of satirical gay pornography that Chuck Tingle has made famous, for example, but friends of mine who are tell me that in that genre, as in all others, there are books that are well written, books that are tolerable, and books that trip over certain overelongated portions of their anatomy and land face first in—well, let’s not go there, shall we? In the same way, if your idea of a good read is nineteenth-century French comedies of manners, you can find a similar spectrum extending from brilliance to bathos.

Every inveterate reader takes in a certain amount of what I call popcorn reading—the sort of thing that’s read once, mildly enjoyed, and then returned to the library, the paperback exchange, or whatever electronic Elysium e-books enter when you hit the delete button. That’s as inevitable as it is harmless. The texts that matter in developing your personal taste, though, are the ones you read more than once, and especially the ones you read over and over again. As you read these for the third or the thirty-third time, step back now and then from the flow of the story or the development of the argument, and notice how the writer uses language. Learn to notice the really well-turned phrases, the figures of speech that are so apt and unexpected that they seize your attention, the moments of humor, the plays on words, the passages that match tone and pacing to the subject perfectly.

If you’ve got a particular genre in mind—no, let’s stop for a moment and talk about genre, shall we? Those of my readers who endured a normal public school education here in the US probably don’t know that this is pronounced ZHON-ruh (it’s a French word) and it simply means a category of writing. Satirical gay pornography is a genre. The comedy of manners is a genre. The serious contemporary literary novel is a genre.  So are mysteries, romance, science fiction, fantasy, and the list goes on. There are also nonfiction genres—for example, future-oriented social criticism, the genre in which nine of my books from The Long Descent to Dark Age America have their place. Each genre is an answer to the question, “I just read this and I liked it—where can I find something else more or less like it?”

Every genre has its own habits and taboos, and if you want to write for publication, you need to know what those are. That doesn’t mean you have to follow those habits and taboos with the kind of rigid obedience critiqued above—quite the contrary—but you need to know about them, so that when you break the rules you do it deliberately and skillfully, to get the results you want, rather than clumsily, because you didn’t know any better. It also helps to read the classics of the genre—the books that established those habits and taboos—and then go back and read books in the genre written before the classics, to get a sense of what possibilities got misplaced when the classics established the frame through which all later works in that genre would be read.

If you want to write epic fantasy, for example, don’t you dare stop with Tolkien—it’s because so many people stopped with Tolkien that we’ve got so many dreary rehashes of something that was brilliantly innovative in 1949, complete with carbon-copy Dark Lords cackling in chorus and the inevitable and unendearing quest to do something with the Magic McGuffin that alone can save blah blah blah. Read the stuff that influenced Tolkien—William Morris, E.R. Eddison, the Norse sagas, the Kalevala, Beowulf.  Then read something in the way of heroic epic that he probably didn’t get around to reading—the Ramayana, the Heike Monogatari, the Popol Vuh, or what have you—and think through what those have to say about the broader genre of heroic wonder tale in which epic fantasy has its place.

The point of this, by the way, isn’t to copy any of these things. It’s to develop your own sense of taste so that you can shape your own prose accordingly. Your goal, if you’re at all serious about writing, isn’t to write like Mrs. McNitpick, like your favorite author of satirical gay pornography or nineteenth-century French comedies of manners, or like me, but to write like yourself.

And that, to extend the same point more broadly, is the goal of any education worth the name. The word “education” itself comes from the Latin word educatio, from ex-ducere, “to lead out or bring out”; it’s about leading or bringing out the undeveloped potentials that exist inside the student, not shoving some indigestible bolus of canned information or technique down the student’s throat. In writing as in all other things that can be learned, that process of bringing out those undeveloped potentials requires the support of rules and examples, but those are means to an end, not ends in themselves—and it’s in the space between the rules and their inevitable exceptions, between the extremes of rigid formalism and shapeless vagueness, that the work of creation takes place.

That’s also true of politics, by the way—and the conventional wisdom of our time fills the same role there that the rules for bad internet prose do for writing. Before we can explore that, though, it’s going to be necessary to take on one of the more pervasive bad habits of contemporary thinking about the relationship between the present and the past. We’ll tackle that next week.

********************
In not wholly unrelated news, I’m pleased to announce that Merigan Tales, the anthology of short stories written by Archdruid Report readers set in the world of my novel Star’s Reach, is now in print and available for purchase from Founders House. Those of my readers who enjoyed Star’s Reach and the After Oil anthologies won’t want to miss it.