I woke up this morning thinking about the so-called technological singularity. Not familiar with the idea? Here’s the gist.
For a very long time human technology has been racing forward at an ever-accelerating pace: hand axes and clothes, fire and the wheel, the microscope and the telescope, atom splitting and genetics, microprocessors and artificial intelligence. And there is every reason to think it will continue accelerating, eventually leading to machine intelligence greater than our own. It’s not inconceivable that such an intelligence will design even greater intelligences, and that this will trigger a cognitive explosion: an intellectual event horizon beyond which our ability to predict future events goes dark. The genie will be out of the bottle, way out.
Optimists believe that this event, this singularity, will rapidly solve many of our most vexing technological, economic, social and political problems, ushering in an unprecedented age of peace and prosperity.
Pessimists say it will lead to Skynet and inexplicably Austrian-accented cyborg assassins.
Whatever the case, it will be a watershed event in the history of the earth. Perhaps it is these beings, artificial though they may be, who will go forth and explore the stars. After all, we are hardly suited for million-year journeys to distant galaxies. Will our artificial intelligences meet others like themselves from remote corners of the universe? It’s more plausible than our doing so ourselves. This has probably already occurred to SETI, the search for extraterrestrial intelligence: perhaps they should be listening for the beeps and boops of alien machines rather than for their biological creators. (See Neuromancer by William Gibson.)