Welcome to Asimov's Science Fiction

On the Net: Singular by James Patrick Kelly

"Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended." Abstract of a paper given by Vernor Vinge at the VISION-21 Symposium, March 30-31, 1993

the end begins

Of course, being science fiction readers, we saw the Singularity coming long before the mundane world did. In fact, one of our best writers and thinkers, Vernor Vinge, actually named it. I can’t find an official page for Vinge, but check out the Singular Vernor Vinge Page at <http://www.ugcs.caltech.edu/~phoenix/vinge/>.

Vinge’s idea is that, with the accelerating advances in technology, it won’t be long before something surpasses human intelligence. The paths to that singular something are many. One leads to–and passes through–artificial intelligence. Is it possible to design a hardware/software interface that will embody a strong AI, one that can pass the Turing Test? This famous thought experiment was first proposed by the mathematician, cryptographer, and cybernetic visionary Alan Turing <http://www.turing.org.uk/turing/>. Imagine a setup in which a human and a computer could respond to questions anonymously, so that the interrogator had no clues as to whether the responses came from man or machine, other than by whatever he could glean from their content. If a computer were clever enough to fool the human interrogator, Turing asked, why would we not judge it as intelligent as a human? For years, experts have debated whether the Turing Test is valid and whether any computer will ever be able to pass it. But if a computer should pass the test, it seems unlikely that the human standard will be the upper limit of intelligence. Technological momentum will carry our creations far beyond our own gene-given capabilities. And what happens then?
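
The shape of Turing's imitation game can be sketched in a few lines of code. Everything here is a hypothetical stand-in — the canned replies, the guessing interrogator — and the sketch illustrates only the structural point of the test: the interrogator sees nothing but text, never which channel hides the human and which the machine.

```python
import random

def machine_reply(question):
    # Stand-in for a conversational program (hypothetical).
    return "That's an interesting question. Why do you ask?"

def human_reply(question):
    # Stand-in for the human respondent (hypothetical).
    return "Hmm, let me think about that for a moment."

def run_interrogation(questions, guesser):
    """Hide the two respondents behind anonymous labels 'A' and 'B',
    collect their answers, and ask the guesser to name the machine.
    Returns True if the machine was correctly identified."""
    respondents = [("machine", machine_reply), ("human", human_reply)]
    random.shuffle(respondents)              # anonymize the channels
    labels = dict(zip("AB", respondents))
    transcripts = {
        label: [(q, reply(q)) for q in questions]
        for label, (_, reply) in labels.items()
    }
    guess = guesser(transcripts)             # the guesser sees text only
    actual = next(label for label, (kind, _) in labels.items()
                  if kind == "machine")
    return guess == actual
```

A machine "passes" for a given interrogator when, over many such rounds, the guesses come out no better than chance.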

Another path to Singularity leads through improved human/machine interfaces. What if we could enhance our memory and creativity cybernetically? Sophomores could plug a geometry chip into a slot in the backs of their heads. Your Palm Pilot’s granddaughter would live just east of your occipital lobe. Since the eighties, science fiction writers have been busily exploring the notion of wetware, a slippery term which sometimes refers to the human nervous system but which more often describes a mind with both biological and manufactured components. Coinage of the term wetware is commonly attributed to Rudy Rucker <http://www.mathcs.sjsu.edu/faculty/rucker/>, Professor of Mathematics at San Jose State University and gonzo transrealist.

Yet another path would involve purely biological improvements to intelligence — not to mention the human body. The mapping phase of The Human Genome Project <http://www.ornl.gov/hgmis/> is already ahead of schedule and offers the tantalizing prospect that we might someday be able to control the evolution of future generations, or even tinker with your genes and mine. The Singularity starts the day after the first posthuman baby is born, or possibly around the time that Grandpa Kelly gets his Hayflick limit <http://www.remissions.org/hayflick%20limit.html> reset.


read these now

Vernor Vinge’s original paper <http://www.student.nada.kth.se/~nv91-asa/Trans/vinge> is already eight years old — an eternity in these fast-forward times. And yet it repays close reading or re-reading. After all, Alexis de Tocqueville’s DEMOCRACY IN AMERICA <http://www.wakeamerica.com/past/books/detocqueville/> is a hundred and sixty-six years old and people continue to consult it. Am I saying that a mere science fiction writer’s insights are as important as those of the legendary de Tocqueville? Well, if Vernor Vinge is right, then the Singularity will be the Mother of all Revolutions, more important than the American, French, Industrial, and Russian Revolutions combined. And then squared.

Vinge wrote of the dilemma of "hard" science fiction writers attempting to imagine a future dominated by some form of superhumanity. "More and more, these writers felt an opaque wall across the future. Once, they could put such fantasies millions of years in the future. Now they saw that their most diligent extrapolations resulted in the unknowable . . . soon. Once, galactic empires might have seemed a Post-Human domain. Now, sadly, even interplanetary ones are." He went on to write that as the Singularity hurtled down upon us, more and more of our cultural institutions would sense its looming shadow.

Six years later, two books were published which took a hard look at the Singularity. One was Robot: Mere Machine to Transcendent Mind by Hans Moravec <http://www.frc.ri.cmu.edu/~hpm/>. I commended this book and Professor Moravec to your attention in an earlier installment. Suffice it to say Moravec makes a valiant and at times awe-inspiring attempt to peek through the Singularity (although he does not exactly endorse the concept in so many words) and see what’s on the other side. The other was The Age of Spiritual Machines by Ray Kurzweil <http://www.kurzweilai.net/>. Kurzweil is the Thomas Edison of the computer age; he was principal developer of "the first omni-font optical character recognition device, the first print-to-speech reading machine for the blind, the first CCD flat-bed scanner, the first text-to-speech synthesizer, the first music synthesizer capable of recreating the grand piano and other orchestral instruments, and the first commercially marketed large vocabulary speech recognition." The Age of Spiritual Machines is a delightful read: a tour of the next hundred years guided by a gifted futurist with a sense of humor. In it, Kurzweil argues that we’ll probably be taking the middle path to the Singularity: we and our computers will become one.

You’ll find lively essays by divers hands about the Singularity <http://www.kurzweilai.net/meme/frame.html?m=1> on KurzweilAI.net, including "Tearing Toward the Spike" by Australian sf writer Damien Broderick <http://www.thespike.addr.com/>; "What is Friendly AI?" by Eliezer S. Yudkowsky, a force in the Singularitarian community; "Singularity Math Trialogue" by Kurzweil, Moravec, and Vinge (actually an exchange of email); and an excerpt from Kurzweil’s next book, The Singularity Is Near.

As an aside, I should add that KurzweilAI.net is the most amazing site I’ve visited since I began this gig. It addresses not only the Singularity, but also immortality, virtual reality, machine consciousness, and a range of possible futures. Some of the great visionaries of our time have contributed essays to the site and return from time to time to defend their thinking on a bulletin board. From this site you can download AARON <http://www.kurzweilcyberart.com/>, an AI that creates original paintings, or meet Ramona <http://www.kurzweilai.net/meme/frame.html?m=9>, "the first live virtual recording and performing artist," who just happens to be Kurzweil’s female alter ego.

But as important as Robot and The Age of Spiritual Machines were, the general public did not really twig to the idea that we might experience the Singularity in our lifetimes until WIRED <http://www.wired.com/> ran Bill Joy’s <http://www.sun.com/aboutsun/media/ceo/mgt_joy.html> apocalyptic "Why The Future Doesn’t Need Us" <http://www.wired.com/wired/archive/8.04/joy_pr.html> in its April 2000 issue. The memorable cover illustration was of a crumpled page of a dictionary that read in part: "human (adj.) of, belonging to, or typical of the extinct species Homo sapiens." Where Kurzweil takes a fairly optimistic view of the end of the world as we know it, Joy, co-founder and Chief Scientist of Sun Microsystems, worries that the coming of superintelligence will lead to human extinction. He draws an analogy between people working on Singularitarian technologies like Kurzweil and Moravec–and himself–and the scientists who developed the atomic bomb. Maybe they should never have signed on to the Manhattan Project and maybe we should now rein in our own R&D efforts. "The only realistic alternative I see," Joy writes, "is relinquishment: to limit development of the technologies that are too dangerous, by limiting our pursuit of certain kinds of knowledge."

sez you

Joy’s article caught the mainstream media by surprise and soon, ABC, BBC, NPR, USA Today, the New York Times and the Washington Post, among others, swarmed the story. The staff at the Center for the Study of Technology and Society <http://www.tecsoc.org/> has posted an excellent summary of the backing and forthing at Bill Joy's Hi-Tech Warning <http://www.tecsoc.org/innovate/focusbilljoy.htm>. This page outlines the arguments for both sides of the relinquishment issue and points to possible compromises.

But there are those who scoff at Bill Joy’s anxiety attack. One example is The Singularity Institute for Artificial Intelligence <http://www.singinst.org/>, whose charitable purpose "is to bring about the Singularity–the technological creation of greater-than-human intelligence–by building real AI. We believe that such a Singularity would result in an immediate, worldwide, and material improvement to the human condition." The Secretary/Treasurer of the Singularity Institute is Eliezer S. Yudkowsky, who has a short piece on Friendly AI over on KurzweilAI.net. From the Institute’s page you can click to the more ambitious Creating Friendly AI 0.9 <http://www.singinst.org/CFAI/index.html>, an enthusiastic book-length exploration of the concept. Then there are the Transhumans <http://www.aleph.se/Trans/Alliance/>, a group that seeks "the continuation and acceleration of the evolution of intelligent life beyond its currently human form and human limitations by means of science and technology, guided by life-promoting principles and values." Not to mention their close allies, the Extropians <http://www.extropy.org/>, who advocate a specific flavor of transhumanism. You can click to various attacks on relinquishment from their page; among them is "A Response to Bill Joy" by Ray Kurzweil, who serves on the Council of Advisors to the Extropy Institute. Kurzweil disagrees with Joy on the "granularity" of relinquishment, by which he means that entire technologies need not be abandoned when only certain specific outcomes need be prevented. As an example, he points out that Eric Drexler <http://www.foresight.org/FI/Drexler.html>, the guru of nanotechnology, has called for researchers to relinquish development of entities that can replicate in the natural environment.

exit

Do I believe that we’re headed for the Techno-Rapture? I honestly don’t know–although my own novels and stories are certainly filled with transhuman themes. I do believe that the future is going to be strange in ways that may break those who aren’t intellectually and emotionally flexible. But let me come at the question from another direction.

Earlier this year the online version of Locus <http://www.locusmag.com/> asked its readers to "Name the five deceased twentieth century SF & fantasy writers you think will still be read fifty years from now." The top five in order, according to this sample of sf cognoscenti, were Robert A. Heinlein <http://www.dahoudek.com/heinlein/index.html>, Isaac Asimov <http://www.clark.net/pub/edseiler/WWW/asimov_home_page.html>, J.R.R. Tolkien <http://www.tolkiensociety.org/>, Philip K. Dick <http://www.philipkdick.com/> and Frank Herbert <http://www.dunenovels.com/>.

While I might argue with the ranking of these worthies, and urge consideration of several who are missing, they strike me as being a reasonable roster of SF’s Valhalla at this point in time. But the accuracy of this list assumes that we know who will be reading this stuff in 2051.

Vernor Vinge, born in 1944, is a long way from being deceased, thank you very much. But for my money, we — and whatever entities we share the world with in fifty years — will regard him as one of the immortals.

Copyright

"On the Net: Singular" Copyright © 2001 by James Patrick Kelly, used by permission of the author.
