Vernor Vinge is an acclaimed science fiction author and futurist. The Long Now Foundation is an organization of technologists, artists, and others dedicated to pondering the challenges facing society on very long time scales, on the order of thousands of years. And “the Singularity” is a concept invented decades ago by Vinge that says, in effect: technological progress is advancing almost unavoidably toward a point (called the Singularity) at which technology itself will exceed the intelligence and abilities of humans. After the Singularity, continued technological advancement is in the hands of technology that’s literally superhuman. It proceeds at a superhuman pace according to superhuman motives. Just as the laws of physics break down at the singularity inside a black hole, it is in principle impossible for us to make predictions about the future beyond the Singularity, when things will be as incomprehensible to us humans as, in Vinge’s words, “opera is to a flatworm.”
Although Vinge believes that the Singularity is the likeliest non-catastrophic outcome for the future of humanity (and there are many who agree and many who don’t), his talk to The Long Now Foundation addressed alternative, non-Singularity possibilities. What might prevent the Singularity from occurring? War and various catastrophes on a global scale are obvious ones. But there are two interesting non-Singularity possibilities that Vinge did not discuss.
The less interesting and less likely of the two possibilities is that there is some fundamental limit on the complexity of information-processing systems, and that human brains are already at or near that limit. If these two suppositions are true, then it is not possible for technology to exceed human reasoning or inventing power by a significant amount, though it would still be possible to field vaster, harder-working armies of reasoning and inventing machines than one could ever recruit from human ranks. (Interestingly, Vinge posits just such a fundamental limitation in his science fiction masterpiece, A Fire Upon the Deep, a rousing and thought-provoking adventure, and the only sci-fi story I’ve ever come across that feels truly galactic in scope.)
Here’s the non-Singularity possibility I like better: though machine intelligence may exceed that of humans, human intelligence can keep up, like Dr. Watson arriving at a conclusion or two of his own while following Sherlock Holmes around, or like me surrounding myself with friends whose superior intellect and wit eventually rubbed off on me, at least a little.
Consider that a hundred years ago, it took geniuses at the pinnacle of human intelligence to devise the counterintuitive physical theories of relativity and quantum mechanics that, today, are grasped (in their rudiments) by children in middle school. Consider that the same race of beings that once gazed up at the heavens and made up fairy tales about the constellations has now charted and explained much of the visible universe, almost all the way back to the beginning of time, and it took only a few dozen centuries.
Perhaps there are realms of thought and invention that require posthuman brainpower to discover. But I’m optimistic that where our future technology leads, we can follow.