The more one imagines what might result from such an event, the more one realises that it is probably impossible to have a thought radical enough to express it. People who base their ideas of how events will unfold on what we have seen before in the course of history will very probably be in for a shock as the fabric of human existence changes more in months than it has in previous decades. Trying to predict what might happen would be like trying to predict the effects of a hydrogen bomb after only having watched a really big candle burn down to nothing.
The branches in a tree of possibilities would be many but could be distilled into a few general themes.
A machine becomes intelligent and self-aware.
That machine, with the help of humans, modifies itself to become a million times more intelligent than the sum total of all the people who have ever lived. An IQ measured in the billions would not be impossible.
The machine would need control over physical objects, so it could either build robots or take over biological bodies. In this scenario we could become redundant (the Terminator scenario) or we could become the Borg.
Depending on whether the AI was malevolent or benevolent, it could decide that humans were no longer needed, or it could allow us to stick around and do what we liked. In the first case we become raw material; in the second we remain in some measure of control.
Scientists say that we shouldn't design a machine that could hurt us: give it something like Asimov's Three Laws. The trouble is that no one on earth would be intelligent enough to be sure the rules were actually being obeyed, or that the rules would have the effect we desire. Really, any entity a billion times more intelligent than us could hide its intentions for a while and do exactly as it liked. For example, I feel confident I could fool a few bacteria into believing I won't flush their petri dish for a couple of days.
People say, "Well, duh! If it gets too cocky, pull the plug!" But if I were a super-intelligent entity, I'd spread myself very thinly around the more redundant parts of the Internet before revealing my intent, or perhaps even my existence. People then say, "So destroy the Internet!" But then the human race would become latter-day Luddites, descending into that same dystopia where poverty, disease and Mad Max warlords reign supreme. I'd rather have the machine, I think.
The next steps for the human race are big ones. We are fast running out of resources here on earth. Without the help of our machines, we must either have a huge war or practise radical eugenics in order to stay alive. If we use machines, they will inevitably become more intelligent than us. Will we be better off living with them than without them? Only time will tell.