September 4, 2009

Castles in the Sky

This is a poem I wrote many years ago...

“Castles in the Sky”

I want to write about castles;
Filled with gallant knights,
Besieged by barbarous hordes
In quest for an entowered maiden.
I want to write of that princess;
Sought by all the world
For her fiery hair, hot red
Dragon’s breath for all who blaspheme
Betray the secret or bespoil the prize.
I want to write about those dragons;
As they wing through the heavens,
Linger weightlessly in thick scaled skin
As they gaze on the land, still mindful
Of an ancient time before men
They wistfully remember and look to the stars.
I want to write about those points of light;
The warmth and life and love
To a thousand other poets far removed
Like dragons they float above me
Like maidens they are encircled
By castles of the mind.

September 3, 2009

The Technological Singularity

Here I will discuss, in brief, an idea that has fascinated me since I first learned of it about a year ago: the concept of the technological singularity. The term uses the singularity of astrophysics as a metaphor for a geometrically increasing level of technological sophistication beyond which we cannot comprehend the outcomes. In astrophysics, a singularity is a point where gravity is so strong that matter collapses in on itself and not even light can escape the surrounding event horizon, so it is impossible to see what is occurring within that perimeter. Passing through the event horizon, in this metaphor, means living with machines that are more creatively intelligent than the humans who created them. This is the pivotal moment in machine intelligence.

Our calculating machines (computers) have been growing more complex and calculating more quickly at an incredible rate since the 1970s, but it has been humans directing this process of machine development. The human brains that design computers have remained very much the same, as far as we know, over the last 10,000 years. But through a process of scientific development and building upon the technological advances of other researchers, each month has brought incremental changes in computer technology. We now use software as a powerful tool to program better software and to design better hardware, but the process is still human directed, and thus limited by our limited capacities. Machine intelligence remains inferior to human intelligence in both creativity and processing power, by neurologists' best estimates (though a direct comparison is very difficult because of the different natures of human and machine "thought" at present). The key moment comes when machines can independently design newer, faster machines and replace human beings completely in this development process.

It is not difficult to see why this can create a runaway positive feedback loop. If a machine can design an even better machine in 24 hours and this second machine is twice as fast, then this second machine can design a third machine that is also twice as fast as itself, but it will design this third machine in only 12 hours. The third machine does the same and each generation both increases the computational power and decreases the length of time until the next generation. The numbers in my example are arbitrary, but with the power of this positive feedback loop, it really doesn't matter what numbers we plug in, as the effect is the same and the process will rapidly create machines that far surpass humans in every regard.
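The arithmetic of this loop can be sketched in a few lines of code. This is a toy simulation using the arbitrary numbers from the example above (each generation twice as fast, designed in half the time); the function name and parameters are my own illustration, not anything from Vinge's paper. Notice that the elapsed time converges, since 24 + 12 + 6 + ... is a geometric series summing to 48 hours, while the speed grows without bound: infinite capability in finite time, which is the "singularity" in miniature.

```python
# Toy model of the runaway feedback loop: each machine generation is
# twice as fast as the last, and designs its successor in half the time.

def simulate(generations, first_design_hours=24.0):
    """Return a list of (generation, speed multiple, total hours elapsed)."""
    speed = 1.0                      # speed of the first self-improving machine
    design_time = first_design_hours # hours it needs to design its successor
    elapsed = 0.0                    # total hours since the loop began
    history = []
    for gen in range(1, generations + 1):
        elapsed += design_time       # time spent designing this generation
        speed *= 2.0                 # the new generation is twice as fast...
        design_time /= 2.0           # ...and designs ITS successor twice as quickly
        history.append((gen, speed, elapsed))
    return history

for gen, speed, elapsed in simulate(10):
    print(f"gen {gen:2d}: {speed:6.0f}x speed after {elapsed:6.3f} hours")
```

After ten generations the machines are over a thousand times faster, yet less than 48 hours have passed; no matter how many more generations you run, the clock never reaches hour 48.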

Vernor Vinge, a technology theorist and popular science fiction author, has written some very prescient theories on the development of technology. He is a retired mathematics professor and a very gifted author (though this isn't the place for fiction book recommendations). He is the man who coined the term technological singularity, and his paper on it makes for excellent reading. In the first section of that paper, Vinge lays out several possible routes by which we might pass over the event horizon of the technological singularity:

* There may be developed computers that are "awake" and superhumanly intelligent. (To date, there has been much controversy as to whether we can create human equivalence in a machine. But if the answer is "yes, we can", then there is little doubt that beings more intelligent can be constructed shortly thereafter.)
* Large computer networks (and their associated users) may "wake up" as a superhumanly intelligent entity.
* Computer/human interfaces may become so intimate that users may reasonably be considered superhumanly intelligent.
* Biological science may provide means to improve natural human intellect.

I suggest reading the rest of the paper for a better understanding of the alternative ways in which the singularity might happen, the possibility of avoiding the singularity, and the effects the singularity is already having on humanity. It may happen very suddenly, as in the fictional universe of Terminator where Skynet "wakes up" and within hours is exterminating humans and spreading across the globe, or it may happen more gradually. We may be able to ride along on this process and either integrate our own intelligence intimately with that of machines or find some way to improve our own brains as a central part of the singularity. On the one hand, there is the frightening scenario in which the machines outpace us, quickly becoming trillions of times more powerful, and the age of humanity ends as we, as a species, die out. This recalls visions of The Matrix or the even more horrifying world of Harlan Ellison's classic short story "I Have No Mouth and I Must Scream" (the complete text of the story can be found online). On the other hand, if we were able to "hitch a ride" on this positive feedback loop of increasing intelligence, we would ourselves become as gods and rise to a new level of existence that is unimaginable now.

Imagine for a moment a prehistoric man or woman from 12,000 BCE stepping into the modern world of 2009. Our dress has changed, our language has changed, our culture has evolved; we have not only domesticated and changed animals but have animated inanimate objects such as cars and planes, and they now obey us. We communicate over vast distances by means not only of technology but of symbolic expression (written language), as well as verbal communication that would be incomprehensibly complex by the primitive's standards. We understand a large part of the natural world and have a systematized method for enlarging this understanding without resorting to myths. The health issues that plagued early man have disappeared and new ones have emerged, but overall the comforts of living in a modern, technological civilization are vastly preferable to the hardships of the preagricultural nomads of 12,000 BCE. Contemplate the advances from gradual social and technological progress over the past 14,000 years (to pick a date), and then imagine that same scale of change--writing, legalistic societies, combustion engines, electricity, refrigeration, germ theory, flight, transistors--all compressed into a few months, or a few short years. We can predict the world of the singularity with roughly the same accuracy that the man or woman from 12,000 BCE could predict, from his or her vantage point, what the year 2012 will be like (a date chosen for the prophetic references of Mayan cosmology).

What are your thoughts on the technological singularity? What are your predictions for when it will "happen"?