Stringing Onward and Upward

December 13, 2001

Book Review
Philip J. Davis

The Next 1,000 Years. Communications of the ACM, March 2001, Vol. 44, No. 3. Sixty-seven Short Articles.

A vision of the future. A chip buried deep in my body speaks:

Chip: Phil, you'll need to take a shave this morning.
Phil: Since I have you around to tell me, what do I need my wife for?
Chip: What indeed?

The celebration of the millennium elicited all sorts of predictive material, ranging from sober evaluations of the prospects for the next few years to religio-apocalyptic thrillers. From the mathematical sciences have come two books about what can be expected in the near future, both of them reviewed in this column*.

The Association for Computing Machinery has also gotten into the act, presenting us with the eyebrow-raising and occasionally disturbing publication under review.

Some of the contributing authors look forward just a few years, to things already on the horizon; some project fifty years ahead, and some mention 1000 years. Some authors are clearly having fun. Some are as serious as death and taxation. It would be superfluous for me to comment on the omnipresent and vital role of prediction in the history of civilization, or on its successes, its failures, its methodologies and formats, its rapid changes, its relation to the very concept of time. And it amazes me what idiotic things people will sometimes predict.

I apologize that I have not taken clips from all the articles. Moreover, the clips that caught my eye are not necessarily those that the individual authors might have selected as making their main points. The clips are followed by my immediate reactions, which appear in boldface.

*****

"I think there's a world market for about five computers."---Thomas Watson, IBM, 1943. (Page 125.)

The classic prediction in the computer field.

"The next 50 years will see computers vanish within people. . . . I predict that nanotechnology and novel modulation techniques as well as the simple density of 4th or 5th generation wireless networks will allow us to dispense with external devices altogether."---Jon Crowcroft, Computer Science, University College, London. (Page 81.)

The epigraph for this article expresses my thoughts here.

"Let's face it, books are dead. History. Toast. . . . Author, reader and critic all become one in the grandeur of the cyberpublishing experience."---Hal Berghel, Computer Science, University of Nevada, Las Vegas. (Page 17.)

An old story: "What Orwell† feared were those who would ban books. What Huxley‡ feared was that there would be no reason to ban a book, for there would be no one who would want to read one." (Neil Postman in Amusing Ourselves to Death.)

"Imagine valuing a technology on the degree to which it successfully addresses a social challenge instead of being one of the "-ests" mentioned earlier [i.e., fastest, smallest, biggest, or coolest]. Imagine the social challenge driving the investigation and opening up wild new areas that would never have been explored. . . . By presenting the major challenges of computing as technical challenges, we have lost the interest of many brilliant technical minds---often female---because their interest is in using that brilliance to solve real problems rather than creating technology for technology's sake."---Anita Borg, Institute for Women and Technology, Palo Alto. (Page 141.)

Right on, Anita!

"Without more and better math teachers---teachers who are skilled to accommodate the explosion in new knowledge---today's students will be ill-equipped to assume their role as visionaries, experts, producers, innovators the world will need to solve the problems and dream the dreams that will define America's future."---John Glenn, astronaut and former U.S. Senator. (Page 138.)

We often solve old problems by abandoning them. Then we go on to create new generations of problems.

"We are about to embark on a world where computation needs are always satisfied, storage needs are always satisfied, and bandwidth needs are always satisfied, and any device can be added to the network to take advantage of the network."---Ann Winblad and Mark Goremberg, partners in Hummer Winblad Venture, San Francisco. (Page 125.)

But how will the needs be developed? According to an article in a recent issue of American Prospect, no one is using the bandwidth currently available and venture capital is screaming. And there's another possibility: I recall reading years ago that we would soon have as many thruways as we need.

"As Arlene O'Leary has noted in the Educational Technical Review (Spring/Summer 2000), commenting on college undergraduates: 'They no longer want a 'just-in-time' education. They seek a 'just-for-you' customized education.' Imagine a future in which w-WBI [wireless Web-based instruction] provides very specific content that is personalized for its users. Interfaces and content will be tailored to the user's needs and history, and material of little interest will be filtered out."---Ron Vetter, Chair, Computer Science, University of North Carolina, Wilmington. (Page 61.)

Devastating. Destroys what little free will is left to us. I never know what will be of interest to me.

"We're currently exploring making a relationship manager that notices how you are treating people you care about, how you are choosing friends you say you want to spend time with, and encourages you to meet the people you should know. Our belief at MIT Context Aware Computing Group is that these kinds of decisions can often be better accomplished by a computer than a person."---Ted Selker, Context Aware Computing Group, MIT Media Laboratory. (Page 45.)

Ah yes, Yente the cyber-matchmaker. Cf. Fiddler on the Roof for the classic version.

"Displays are implanted directly into the lens of each eye, 'speakers' into the ears, and smells created in the nose or delivered directly to the brain."---Martin Cooper, CEO, ArrayComm, San Jose, California. (Page 55.)

The Prosthetic Man/Woman. Will sex differences be maintained? Do women see things differently than men? George Wald, Nobelist in vision physiology, used to ask: What does it mean to see?

"Should we stay in the mathematical context where precision and rigor are at a premium? Should we relax our notions and allow imperfection, imprecision, and flexibility to expand our notions? I claim that other notions of computing should not only be investigated conceptually but they should be implemented tomorrow."---Dennis Tsichritzis, Informatics Department, University of Geneva, Switzerland. (Page 100.)

Funny, I recently read the same criticism of mathematics itself.

"The smallest element envisioned by physicists are strings. . . . Let us also adopt the technologists' optimism that whatever is not impossible will be done and assume that computers in which the gates are individual strings will appear one day."---Whitfield Diffie, Sun Microsystems, Menlo Park, California. (Page 85.)

I grew up thinking that the world was made up of hard, tiny marbles bouncing around. And now it turns out that it's made up of even teentsier vibrating vermicularities that are all slated to be gates.

"The last two centuries mathematics played a key role in physics because it provided the foundation for establishing new theories. Computer science will play a key role in biology in the centuries to come because those [mathematical] theories are mostly unsuitable. For example, the human genome project would be infeasible without computers, and that task is just a minuscule fraction of what remains to be accomplished to comprehend the behavior of a cell."---Jacques Cohen, Computer Science, Brandeis University. (Page 76.)

No doubt of it. But what then will it mean to have a theory?

"Innovation and competition would be stifled if mandated trusted systems became law. Moreover, the market for digital information products may well be vastly smaller if every piece of information must be tightly locked up at all times."---Pamela Samuelson, Information Management and Law, UC Berkeley. (Page 99.)

Oh, oh. There goes my royalty check for $1.98.

"The day is close at hand when it will be feasible to create designer genetically altered pathogens in college laboratories. After that, we'll have to contend with self-replicating entities created through nanotechnology, the field devoted to manipulating matter on the scale of individual atoms. Although nanoengineered self-replicators are at least one, and probably more than two, decades away, the specter can be described as that of a unstoppable nonbiological cancer."---Ray Kurzweil, CEO, Kurzweil Technologies, Inc., Wellesley Hills, Massachusetts. (Page 91.)

I'm already swamped by e-junk, e-chain letters, and questionable jokes sent to me by lonely people.

"Laptops will be replaced by tablet computers and they, in turn, by digital paper. Walls in our offices will be reactive displays. All the digital accessories cluttering our briefcases, pocketbooks and belts will merge into much more general-purpose communication devices with knowledge of our preferences, our context, and our geographic location."---Andries Van Dam, Department of Computer Science, Brown University. (Page 50.)

Bringing to stark reality the old saying that the walls have ears.

"We believe that two-way immortality where one's experiences are digitally preserved and which then take on a life of their own, will be possible within this century."---Gordon Bell and Jim Gray, Microsoft, Redmond, Washington. (Page 29.)

The best of luck to my future digital clone. I hope it has fewer hangups than I've had.

"This may be the last generation to distinguish between the real and the virtual."---Norman Badler, Center for Human Modeling and Simulation, University of Pennsylvania. (Page 33.)

This may be the last generation to distinguish between heaven and hell.

*Handbook of Discrete and Combinatorial Mathematics, edited by Kenneth Rosen (see "While Surfing a Handbook," June 2001) and Mathematics Unlimited---2001 and Beyond, edited by Bjorn Engquist and Wilfried Schmid (see "The Power of Numerics," September 2001).

† George Orwell: Nineteen Eighty-Four, 1949.

‡ Aldous Huxley: Brave New World, 1932.

Both the special issue of the Communications of the ACM and this review were written before September 11 and may contain material that might now be treated differently.

Philip J. Davis, professor emeritus of applied mathematics at Brown University, is an independent writer, scholar, and lecturer. He lives in Providence, Rhode Island, and can be reached at [email protected].

