Monday, August 03, 2009
The Archdruid Takes A Break
Once again The Archdruid Report will be on a brief hiatus until the middle of August, as I'll have essentially no internet access until then. Once I'm back online, I'll have some announcements to make, and something new to offer that I think longtime readers will find entertaining. In the meantime, The Ecotechnic Future, which is the sequel to The Long Descent and discusses many of the concepts introduced here over the last couple of years, can now be preordered from New Society. Thank you all for contributing to the ongoing conversation!
Posted by
John Michael Greer
at
10:01 PM
11 comments:
mjc said...
Who is next to take a break?
(I sure could use one.)
8/4/09, 9:41 PM
Kevin said...
I don't know whether the shape of things to come will assume the outlines you've delineated, but in the last few posts particularly you've displayed a Schumacher-like gift for making the complex comprehensible to the layman. You seem to be getting even better at it.
Yesterday I described some of your thesis concerning oil depletion to a hiking partner. He responded first with the argument that "we'll think of something else, we always have before," then remarked that you can't have read much history! Obviously you've encountered these responses before. It's just another demonstration that modernist faith in progress really is an unacknowledged form of religion.
8/9/09, 3:32 PM
Joseph said...
8/10/09, 4:42 PM
Joseph said...
I filled out a form for the library to order a copy and a copy of Ecotechnic.
8/10/09, 7:00 PM
Rich said...
I'm talking about the Singularity, which is the term used to describe the point when humans succeed in making an artificial intelligence (AI) that is at least as smart as ourselves. The significance is that, once we have an AI that can think (and program) as well as we humans can, it will be able to improve itself at a geometric rate, and will very quickly leave us in the dust ... if we allow it.
Now, this is not automatically a bad thing (Terminator-movie analogies notwithstanding), but it is, potentially, a global game-changing event for all of humanity, and is at least worthy of some consideration. What's more, the date of this milestone is definitely getting close, possibly within a decade ... and this is not some eternally-receding-as-we-approach-it milestone, like nuclear fusion. It is, rather, something of a technological inevitability, barring a significant collapse in societal support for technological innovation before then.
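To put a number on "geometric rate," here is a minimal sketch in Python -- every figure in it is an arbitrary assumption chosen for illustration, not a measurement or a prediction -- of why compounding self-improvement runs away so fast:

```python
# Toy model of "geometric rate" self-improvement. Every number here
# is an arbitrary assumption chosen for illustration, not a claim.

GROWTH_FACTOR = 2.0   # assumed capability multiplier per self-rewrite

def cycles_to_surpass(start=1.0, target=1000.0):
    """Count self-improvement cycles until capability exceeds target."""
    capability, cycles = start, 0
    while capability <= target:
        capability *= GROWTH_FACTOR
        cycles += 1
    return cycles

# Treating human-level capability as 1.0, a mere doubling per cycle
# reaches 1000x human level in just 10 cycles -- the arithmetic
# behind "very quickly leave us in the dust."
print(cycles_to_surpass(1.0, 1000.0))   # -> 10
```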
No need to post this publicly (although you're welcome to). Just wanted to suggest it as a possible future topic of discussion. On the off chance that this is a new concept for you, Raymond Kurzweil is a well-known futurist who talks about this a lot. Eliezer Yudkowsky is a much less well-known name, but he also provides much more detailed (and, I suspect, accurate) information about the concept.
Thanks,
rainbird
8/11/09, 11:15 AM
DIYer said...
The book-burning emperor.
It came up in discussion on TAE, and seems worth exploring. My comment was something to the effect of "when might we see the leadership decide knowledge is dangerous and smash all computers?" -- amongst a number of comments about extremist fundie religion and the power of mythology and belief systems.
8/13/09, 9:43 PM
John Michael Greer said...
MJC, by all means take one. I'll be posting again tomorrow evening.
Kevin, I've got Orlov's book up on my shelf of dire books about the future. Yes, your hiking partner's comments are familiar ground -- I get pretty much that same set of claims all the time, from people who haven't read much history. ;-)
Joseph, please do.
Rainbird, I've discussed the so-called "singularity" a number of times here in the past, though I haven't given it a post of its own -- probably something I should do one of these days. Basically, my take is that the "singularity" is a messianic myth dressed up in technological drag. Roger Penrose's The Emperor's New Mind -- which I'd encourage you to read -- shredded, quite some time ago, the claim that any currently conceivable computer technology is capable of creating an intelligent machine.
AI is far from a technological inevitability; indeed, it's precisely comparable to fusion power -- the same history of sweeping claims of imminence and inevitability followed by a noticeable failure to follow through on the promises pervades both technologies. The same thing is true of many of the other technotheological notions Kurzweil retails, by the way -- but that's fodder for a future post.
DIYer, that's a complex issue, not least because "the leadership" is a very fluid concept -- remember that the governing classes of a declining civilization are pretty much guaranteed early graves as things turn harsh and those with radically different skill sets -- say, "warlord" rather than "financier" -- have all the advantages. More on this later.
8/18/09, 8:15 AM
DIYer said...
As you mentioned a while back, there may be half a dozen individuals with some claim to the title "President of the United States" a few decades from now. I believe this is likely as well.
I suspect the title "financier" will be rather unpopular for quite a while. As you say, not a useful job skill.
Anyhow, there are a number of historical examples, as I understand it, where an inbred concentration of political power produced one of these knowledge-averse tyrants.
8/18/09, 8:12 PM
Betsy said...
8/20/09, 8:18 AM
Rich said...
An opposing view; I can't resist ...
This is not a priority for me. I enjoy reading your blog, and am always happy with the subjects you choose to write about.
That said, regarding the singularity, here are two reasons why I see this as qualitatively different from, for instance, nuclear fusion. First, there are no technological "hurdles": we know how to do this now. I am a programmer by trade, and I am confident that I could develop a "thinking" AI, given the resources. The only thing missing is processing power: faster computers, smaller chips, bigger hard drives. Assuming we continue to improve hardware capability as we have been, I see no clear reason why it is not inevitable.
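The usual way to run the "only missing processing power" argument is as a Moore's-law extrapolation. A minimal sketch, assuming capacity doubles every two years and that some fixed threshold of processing power suffices -- both assumptions for illustration, not established facts:

```python
import math

# Back-of-envelope Moore's-law extrapolation. ASSUMPTIONS (illustrative
# only): capacity doubles every 2 years, and some hypothetical
# "enough processing power for AI" threshold actually exists.

DOUBLING_PERIOD_YEARS = 2.0

def years_until(current, required):
    """Years until capacity reaches the requirement, given steady doubling."""
    doublings_needed = math.log2(required / current)
    return doublings_needed * DOUBLING_PERIOD_YEARS

# If the (hypothetical) requirement were 32x today's capacity,
# steady doubling would close the gap in about a decade:
print(years_until(1.0, 32.0))   # -> 10.0
```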
The other reason is that AI is a byproduct. There are people working to develop computer programs that mimic human thought, but frankly they don't count for much. The whole industry of programming is trending toward making applications and devices "smarter". I use applications today that write programs for me -- computers programming themselves -- and quite large, complicated programs at that, done in seconds, that would take me months to write by hand. Unless the entire IT industry collapses, or starts to veer sharply in some new direction, AI is exactly where we're headed.
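As a toy illustration of code generation -- the mundane core of "computers programming themselves" -- here is a self-contained sketch; the function names are invented for the example:

```python
# Trivial illustration of code generation: a program that writes
# Python source as text, then loads and runs it. Real tools
# (compilers, ORMs, GUI builders) do this at vastly larger scale,
# but the principle is the same.

def generate_adder(name, a, b):
    """Emit source code for a function returning the sum of two constants."""
    return f"def {name}():\n    return {a} + {b}\n"

source = generate_adder("generated_sum", 2, 3)
namespace = {}
exec(source, namespace)                  # load the generated code
print(namespace["generated_sum"]())      # -> 5
```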
Of course, the subject of your blog may well be the show-stopper. And it is quite likely I'm underestimating the timeframe. I will check out the book you mentioned. Also, I'm still working my way through your blog archives; I'll keep an eye out for past commentary on this subject. Thanks again.
8/21/09, 8:26 AM