Singularity & The Ultraintelligent Machine
NPR had a piece yesterday on the idea that machines may one day become so intelligent that they are capable of designing and building even more intelligent machines, triggering an exponential growth of intelligence, an intelligence singularity, such that human intelligence becomes negligible by comparison. Some consider this intelligence singularity an existential threat.
Here is a link to the NPR Story:
The Singularity: Humanity's Last Invention?
In this story we learn of The Singularity Institute, dedicated to "Ensuring humanity's future in a world with advanced artificial intelligence."
I'm writing this short note simply because the man who introduced the idea in 1965, Dr. I. J. Good, was a friend of mine. Here is a citation for his seminal paper on the idea:
Good, I. J. 1965. Speculations Concerning the First Ultraintelligent Machine. In Advances in Computers, vol. 6, ed. F. Alt and M. Rubinoff, pp. 31-88. Academic Press.
This article is available online.
Dr. Good was a remarkable fellow. I knew him as a statistics professor at Virginia Tech in the mid-1970s. I only vaguely understood his very human intelligence, except to perceive it as much greater than my own.