
Future Imperative

What if technology were being developed that could enhance your mind or body to extraordinary or even superhuman levels -- and some of these tools were already here? Wouldn't you be curious?

Actually, some are here. But human enhancement is an incredibly broad and compartmentalized field. We’re often unaware of what’s right next door. This site reviews resources and ideas from across the field and makes it easy for readers to find exactly the information they're most interested in.


The future is coming fast, and it's no longer possible to ignore how rapidly the world is changing. As the old order changes -- or more frequently crumbles altogether -- I offer a perspective on how we can transform ourselves in turn... for the better. Nothing on this site is intended as legal, financial or medical advice. Indeed, much of what I discuss amounts to possibilities rather than certainties, in an ever-changing present and an ever-uncertain future.

Sunday, June 25, 2006

Gaming AI

The Economist recently published this article on the evolution of limited artificial intelligence in computer games.

Good looks, the video-games industry is discovering, will get you only so far. The graphics on a modern game may far outstrip the pixellated blobs of the 1980s, but there is more to a good game than eye candy. Photo-realistic graphics make the lack of authenticity of other aspects of gameplay more apparent. It is not enough for game characters to look better—their behaviour must also be more sophisticated, say researchers working at the interface between gaming and artificial intelligence (AI).

Today's games may look better, but the gameplay is “basically the same” as it was a few years ago, says Michael Mateas, the founder of the Experimental Game Lab at the Georgia Institute of Technology. AI, he suggests, offers an “untapped frontier” of new possibilities. “We are topping out on the graphics, so what's going to be the next thing that improves gameplay?” asks John Laird, director of the AI lab at the University of Michigan. Improved AI is a big part of the answer, he says. Those in the industry agree. The high-definition graphics possible on next-generation games consoles, such as Microsoft's Xbox 360, are raising expectations across the board, says Neil Young of Electronic Arts, the world's biggest games publisher. “You have to have high-resolution models, which requires high-resolution animation,” he says, “so now I expect high-resolution behaviour.”

The curious thing about this kind of "AI research" is not that it is apt to create free-willed, self-aware, human-equivalent computers. Rather, the fact that a large and prosperous industry (computer gaming) is putting substantial research funds into AI may reflect a larger trend. The push to develop various killer apps and specialized systems has already brought a number of "AI" milestones into being, such as voice-recognition and dictation programs, and progress on facial recognition.

There is a very real possibility we will experience an AI revolution based not on the near-term development of full-fledged thinking machines, but on two other factors. First, a host of lesser applications that in themselves expand human capabilities, enabling ordinary humans to serve as the heart of their own AI systems. What if you had systems that could do more than merely take dictation or vacuum your floors -- systems that could research cures for diseases by sifting through medical journals for dual-use drugs, or discover the purpose of one base pair after another in a creature's genes by independently forming hypotheses and setting up experiments? (Yes, both of those already exist.) By cutting down on the number of extraneous tasks required of most people in our labor force, we can help people focus the energies they have on work that matters. And meanwhile, we can turn more and more of our IT resources to those questions and challenges that matter as well.

The other factor is the increased human participation made possible through globalization. There are many economic consequences to being able to hire highly skilled programmers, engineers and scientists on the other side of the Earth for remarkably little money. One of those consequences is that the massive amounts of human labor still required for so many enterprises can be provided cheaply enough to conserve investment capital... and thus make possible even more productive investments and new, small businesses (and thus, incidentally, more rewarding, high-paying jobs). Setting aside the very real labor issues outsourcing presents -- income inequality, the stagnation of wages, the erosion of job benefits, and, of course, job insecurity -- it becomes obvious that being able to mobilize masses of highly affordable, highly motivated workers at the drop of a hat could be as critical for an entrepreneur of transcendent creative talents as being able to whip up prototype inventions in an hour or less using a "fab lab."

In fact, if you look at those options carefully, you'll notice they're ultimately the same thing -- a way of cutting the cost of getting high-quality work done to the point that a small enterprise can afford to start up in the first place, and a large enterprise can afford to explore its options -- a new product line, blue-sky research, a new marketing approach -- that much more thoroughly.

Of course, to best utilize the options this new labor pool (both human and cybernetic) presents, the humans tapping it probably need to be a little more than ordinary themselves, and to encourage extraordinary capacities in their workers. But given the rapid pace of human augmentation technologies, this shouldn't be a problem.

Still, you would be justified in asking, "How is all this different from progress in information technology over the last ten years or so?" Well, aside from the obvious outsourcing twist, and the new computer programs mentioned above, there are a number of new permutations of this labor revolution.

This article in Wired discusses the consequences of all the peer production work being done on the Internet, while Amazon's "Mechanical Turk" serves as another route for people with time to kill to engage in (rather poorly paid) labor that would normally require far more employee training and allocated resources than a company might want to give to a one-time "busy-work" job. There is, of course, no shortage of "free" computational power for collective efforts like SETI@home, while some companies are "insourcing" jobs to idle people, the way the airline JetBlue has insourced the taking of phone reservations to housewives in Utah.

All of these new human and automated labor pools are not merely creating new resources for managers and entrepreneurs. To the extent these tools are easily bent to a corporation's purposes, they reduce the overall amount of time and attention required from some of a company's most prized employees -- the CEO and top managers and innovators whose talents are better used elsewhere.

This transformation may be a first step on the road to a runaway Singularity whose roots are ultimately as much economic in nature as they are technological.
