
Future Imperative

What if technology were being developed that could enhance your mind or body to extraordinary or even superhuman levels -- and some of these tools were already here? Wouldn't you be curious?

Actually, some are here. But human enhancement is an incredibly broad and compartmentalized field. We’re often unaware of what’s right next door. This site reviews resources and ideas from across the field and makes it easy for readers to find exactly the information they're most interested in.


The future is coming fast, and it's no longer possible to ignore how rapidly the world is changing. As the old order changes -- or more frequently crumbles altogether -- I offer a perspective on how we can transform ourselves in turn... for the better. Nothing on this site is intended as legal, financial or medical advice. Indeed, much of what I discuss amounts to possibilities rather than certainties, in an ever-changing present and an ever-uncertain future.

Monday, March 14, 2005

Review: Two Different Government Views of the Future - AI, AL, Bio, Cyber, Gov, Mind, Nano, Noo, Plan, Psych, Self, SF, SkiP, Soc, Rev, Tech

According to John Smart, "The original version of this article was written for a transformative technology scenario project set in 2032, for the U.S. Army Logistics Transformation Agency."

Smart has an interesting, information-technology-dominated take on the future. Essentially, the only real human enhancement that matters in his future is based on non-invasive computer systems -- advanced expert programs, computer modelling, emerging (limited) AI, etc.

He does a good job of fleshing out the capabilities of non-cybernetic infotech in the world he forecasts, but everything else that might lead to human enhancement -- even relatively pedestrian, non-controversial ideas like powered armor for soldiers -- seems to get brushed aside.

"All those old twentieth-century bioenhancement ideas about genetic engineering of humans, super-drugs for mental performance, extreme life extension, and brain-machine interfaces (except for people with disabilities), turned out to be like the 1900s' ideas about flying houses and atomic-powered vacuum cleaners: possible in theory, perhaps achievable some day, in theory, but always outcompeted in practice by far more powerful, efficient and less controversial digital alternatives every step of the way."

Here's the curious thing. Another recent government report on potential scientific breakthroughs says pretty much the opposite about four "converging technologies": nanotech, biotech, infotech and cognotech. (Cognotech is technology designed to enhance the mind.) The "Converging Technologies for Improving Human Performance" document states:

"This report sets goals for societal and educational transformation. Building on the suggestions developed in the five topical groups, and the ideas in the more than 50 individual contributions, the workshop recommended a national R&D priority area on converging technologies focused on enhancing human performance." That's quite a conclusion.

Since computers (infotech) are only one of the technologies this report examines, there is obviously a difference of opinion here. The implications of the 2002 "Converging Technologies" report are so huge that I intend to discuss them in another post. (Probably more than one, given time.) But John Smart's paper clearly takes a different position when he comments on the exclusive use of information technology to "enhance" humans:

"There's just no better bang for the buck, and the social and political repercussions are far less bothersome as well."

Smart adds, regarding the threat of bio-terrorism,

"What about global immunity against bioterror? Human-made super-viruses turned out to be way less dangerous than we feared. The bottom line is that our biological immune systems have always protected us, as a species, tremendously well against these simple invaders, and every plague in history occurred because of "differential immunity" that emerged between certain well immunized groups and other poorly immunized ones within the same species. That's why no pathogen in history has ever killed a species. In other words, in the history of life, immune systems always win, and the more our biologists understand them, the better we get defending against everything that comes."

Now, I appreciate the idea that improving computer resources should prove invaluable in dealing with bio-terrorism (speeding the rate at which epidemics can be analyzed genetically, vaccines can be developed, etc.). But dismissing human-made viruses at this point seems a little premature, given that bio-terrorism is exceeded only by nuclear terrorism in its immediate threat to humanity.

Neither one may wipe us out (unless one of them sparks a nuclear war), but that result will be of little comfort to millions of dead victims.

Like the offhand dismissal of powered armor as a factor in the militaries of 2032, this seems to rest on some huge assumptions about the state of technology a quarter-century from now in every field except infotech.

Smart may be limited by the parameters of the Pentagon scenario he was working with. It's not unknown for the military's futurists to look at extreme situations to see what kinds of capabilities would be needed if the world lurched as far as it could in one direction.

Hopefully this scenario was not imposed by someone who wanted to remove all mention of "disturbing" technology from a report. I belong to the school of thought which says that whatever you ultimately choose to do, you need to make your decisions based upon the best possible information available.

Then again, John Smart may simply have a very different viewpoint than my own. =)

Regardless, he has written two brief but interesting short stories, Future Heroes 2035: My Friends and I and Future Heroes 2035: The Big Picture. They're great reading if you want to share some of infotech's potential in a very accessible way. Not everybody we know out there is a futurist. Really. =)

