
Analysis: a tale of two terrors

By Jaron Lanier
CIO

July 6, 2000
Web posted at: 8:27 a.m. EDT (1227 GMT)

(IDG) -- Every now and then, IT professionals stand back and marvel at the astonishing rate of change in our field. We then imagine where it might eventually lead, and the thoughtful among us become terrified.


One version of the terror was expressed recently by Bill Joy, cofounder and chief scientist at Sun Microsystems, in Wired magazine. Joy's terror is a little different from mine. He accepts the pronouncements of Ray Kurzweil and others, who believe that Moore's Law (which states that computers double in speed every year and a half or so) will lead to autonomous machines, perhaps by the year 2020. That is when computers will become, according to some estimates, about as powerful as human brains. (Not that anyone knows enough to really measure brains against computers yet. But for the sake of argument, let's suppose that the comparison is meaningful.) According to this scenario of the terror, computers won't be stuck in boxes. They'll be more like robots, all connected together on the Net, and they'll have quite a bag of tricks.

They'll be able to perform nanomanufacturing, for one thing. They'll quickly learn to reproduce and improve themselves. One fine day without warning, the new supermachines will brush humanity aside as casually as humans clear a forest for a new development. Or perhaps the machines will keep humans around to suffer the sort of indignity portrayed in the movie The Matrix.
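The timeline behind that scenario rests on simple compounding. As a rough back-of-the-envelope illustration (the sketch below assumes the 18-month doubling period cited above; the brain-equivalence threshold is Kurzweil's contested estimate, not something the arithmetic itself establishes):

```python
# A back-of-the-envelope sketch (illustrative only, not from the column) of the
# arithmetic behind the "computers as powerful as brains by 2020" extrapolation,
# assuming the 18-month doubling period the column cites.
def growth_factor(years: float, doubling_period_years: float = 1.5) -> float:
    """Multiplicative speedup after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

if __name__ == "__main__":
    factor = growth_factor(2020 - 2000)  # from the column's vantage point, 2000
    print(f"Implied speedup over 20 years: ~{factor:,.0f}x")
    # Roughly 10,000x -- the scale of jump the brains-by-2020 argument leans on,
    # granting (for the sake of argument) that brains and computers are comparable.
```

Twenty years of steady doubling works out to roughly a ten-thousand-fold speedup; everything else in the scenario hangs on what such machines could then do.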


Even if the machines would otherwise choose to preserve their human progenitors, evil humans will be able to manipulate the machines to do vast harm to the rest of us. This is a different scenario that Joy also explores. Biotechnology will have advanced to the point that computer programs will be able to manipulate DNA as if it were JavaScript. If computers can calculate the effects of drugs, genetic modifications and other biological trickery, and if the tools to realize such tricks are cheap, then all it takes is one madman to, say, create an epidemic targeted at a single race. Joy points out that biotechnology without a strong, cheap information technology component would not be sufficiently potent to bring about this scenario. Rather, it is the ability of software to cheaply model and guide the manipulation of biology that is at the root of this variant of the terror.

I can't fully convey Joy's concerns in this brief account, but I think you get the idea. My version of the terror is different. The main reason is that, so far, Moore's Law applies only to hardware, not software. I dearly wish we could claim that software is improving as quickly as hardware. Indeed, in some cases I wish we could claim that software is improving at all. I remember when I was a young pup in school in the late '70s, and Unix was the beloved operating system at my campus. How I hated Unix, that devilish accumulator of data trash, obscurer of function, enemy of the user! I was optimistic that within a few years Unix would be just a bad dream, vaguely remembered. Now, as we enter a new century and I enter my own middle age, the hot prospect in software, which has energized students and invigorated the investment scene, is none other than a return to Unix, in its Linux incarnation.

If anything, there's a reverse Moore's Law observable in software: As processors become faster and memory becomes cheaper, software becomes correspondingly slower and more bloated, using up all available resources.

Now I know I'm not being entirely fair here. We have better speech recognition and language translation than we used to, for example, and we are learning to run larger databases and networks. But our core software techniques and technologies simply haven't kept up with hardware. Just as the newborn race of robots is about to consume all humanity, our dear old species will likely be saved by a Windows crash. The poor robots will linger pathetically, begging us to reboot them, even though they'll know it would do no good.

What is a long-term future scenario in which hardware keeps getting better and software remains mediocre?

The great thing about crummy software is the amount of employment it generates. If Moore's Law is upheld for another 20 or 30 years, there will not only be a vast amount of computation happening on planet Earth, but the maintenance of that computation will consume the efforts of almost every living person. We're talking about a planet of help desks.

I've previously argued that this future would be a great thing, realizing the socialist dream of full employment by capitalist means. But let's consider the dark side.

Among the many processes that information systems make more efficient is the process of capitalism itself. A nearly friction-free economic environment allows fortunes to be accumulated in a few months instead of a few decades. Yet the individuals doing the accumulating are still living as long as they used to; longer, in fact. So those who are good at getting rich have a chance to become far richer before they die than their equally talented forebears ever could.

There are two dangers in this. The smaller, more immediate danger is that young people acclimatized to a deliriously receptive economic environment might be emotionally wounded by what the rest of us would consider brief returns to normalcy. I sometimes wonder if some of my students who have gone on to dotcom riches would be able to handle any financial frustration that lasted more than a few days without going into some sort of destructive depression or rage.

The greater danger is that the gulf between the richest and the rest could become transcendentally grave. That is, even if we agree that a rising tide lifts all boats, if the highest boats rise faster than the lowest, they will become ever more separated. (And indeed, as I pointed out in my last column, concentrations of wealth and poverty have increased in this country in recent years.)
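To make the divergence concrete, here is a minimal sketch with purely hypothetical growth rates (the figures are assumptions for illustration, not numbers from this column or the last one): even when both groups grow, the faster-compounding fortune pulls away in absolute terms every year.

```python
# A minimal illustration (hypothetical growth rates, not figures from the column)
# of why boats that rise at different rates drift ever farther apart:
# both groups grow, yet the absolute gap widens without bound.
def compound(start: float, annual_growth: float, years: int) -> float:
    """Compound `start` at `annual_growth` (0.05 = 5%) for `years` years."""
    return start * (1 + annual_growth) ** years

if __name__ == "__main__":
    for years in (0, 10, 20, 30):
        top = compound(1_000_000, 0.15, years)   # assumed 15% a year
        rest = compound(50_000, 0.03, years)     # assumed 3% a year
        print(f"year {years:2d}: top ${top:>14,.0f}   rest ${rest:>10,.0f}   "
              f"gap ${top - rest:>14,.0f}")
```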

If Moore's Law or something like it is running the show, the scale of the separation could become astonishing. This is where my terror resides, in considering the ultimate outcome of the increasing divide between the ultra-rich and the merely better off.

With the technologies that exist today, the wealthy and the rest aren't all that different; both bleed when pricked, to use the classic example. But with the technology of the next 20 or 30 years, they might become quite different indeed. Will the ultra-rich and the rest of us even be recognizable as the same species?

The possibilities are so obvious and so terrifying that there is almost a banality in stating them. The rich could have their children engineered to be genetically more intelligent, beautiful and joyous, for example. Perhaps they could even be genetically disposed to have a superior capacity for empathy, but only to other people who meet some narrow range of criteria.

Let's explore just one possibility, for the sake of argument. One day the richest among us could turn nearly immortal, becoming virtual gods to the rest of us. (An apparent lack of aging in cell cultures and in some organisms has been demonstrated in the laboratory, though not in a human being as yet.)

I don't want to focus here on the fundamental questions of near-immortality: whether it is moral or even desirable, or where one would find room if immortals insisted on continuing to have children. Instead, let's focus on the question of whether immortality is likely to be expensive.

My guess is that immortality will be cheap if information technology gets much better, and expensive if IT remains as crummy as it is. I suspect that the hardware/software dichotomy will reappear in biotechnology, and indeed in other 21st-century technologies. You can think of biotechnology as an attempt to make flesh into a computer, in the sense that biotechnology hopes to manage the processes of biology in ever greater detail, leading at some far horizon to perfect control. Likewise, nanotechnology hopes to do the same thing for materials science. If the body, and the material world at large, becomes more manipulable, more like a computer's memory, then the limiting factor will be the quality of the software that governs the manipulation.

Even though it's possible to program a computer to do virtually anything, we all know that's really not a very helpful feature of computers. The important fact is this: Getting computers to perform specific tasks of significant complexity in a reliable but modifiable way, without crashes or security breaches, is essentially impossible. We can only approximate this goal, and only at great expense.

Likewise, one can hypothetically program DNA to make virtually any modification in a living thing, and yet designing a particular modification and vetting it thoroughly will likely remain immensely difficult. That is one reason why biological evolution has never found a way to operate quickly. Similarly, one can hypothetically use nanotechnology to make matter do almost anything conceivable, but it will probably turn out to be much harder than we now imagine to get it to do any particular thing of complexity without disturbing side effects.

Scenarios that predict that biotechnology and nanotechnology will be able to quickly and cheaply create startling new things must also imagine that computers will become semiautonomous, super-intelligent, virtuoso engineers. But computers will do no such thing if the last half-century of progress in software serves as a predictor of the next half-century.

In other words, bad software will make biological feats like near-immortality expensive instead of cheap in the future. Even if everything else gets cheaper, the IT side of the effort will get more expensive.

Cheap near-immortality for everyone is a self-limiting proposition. There isn't enough room to accommodate such an adventure. Also, roughly speaking, if immortality were to become cheap, so would the horrific biological weapons of Joy's scenario. On the other hand, expensive near-immortality is something the world could absorb, at least for a good long while, because there would be fewer people involved. Maybe they could even keep the effort quiet.

So here is the irony. The very features of computers that drive us crazy today, and keep us gainfully employed, are the best insurance our species has for long-term survival as we explore the far reaches of technological possibility. On the other hand, those same annoying qualities are what could make the 21st century into a madhouse scripted by the fantasies and desperate aspirations of the super-rich.




RELATED STORIES:
Survey: The future of PCs (August 20, 1999)
Pundits' views differ on future of Internet, PCs (November 12, 1998)

RELATED IDG.net STORIES:
Read Jaron Lanier's previous column here (CIO)
Bill Joy's foreboding (Infoworld)
The robots are coming (Computerworld)
Kapor: Net hype is dying, reality moving in (Infoworld)
Nanotechnology poses U.S. policy challenge (IDG.net)
Retired Dell CIO cautions IT industry (Computerworld)
A guru's predictions for a wired world (PC World)
Internet usage intensifies social contact (Industry Standard)

RELATED SITES:
Read Bill Joy's essay on Wired
Jaron Lanier's Web site
National Tele-Immersion Initiative home page
