I haven't listened to the lecture, but I'm guessing that this "rupture" in the fabric of human existence will not actually occur, because the rate at which we extinguish our own chances for survival (pollution, and so on) seems to share some sort of functional relationship with our advances in technology. So if we evolve technologically at a rate of 20,000 "what-have-yous" per decade, we also (in some functional, linear or non-linear way) destroy the livability of the world, along with our ever-evaporating chances of bringing it back to a state of sustainability.
Starting from the industrial revolution, that key factor in humans, laziness, has led to the subtle and unsustainable rise of the lower class toward a more and more consumptive lifestyle in the name of "liberty and freedom for all," based upon our actual aversion to doing anything that requires effort.
Industrial machines, at least, reformed slave labor (in some places) and initially, as far as I'm concerned, had some great benefits for the world at large. But from a larger standpoint, this whole "rise of the machine" has ultimately led the world into a false and completely unsustainable view that we acquire something more, or better, with every technological advance. The opposite is true. With every technological advance we become even more useless to ourselves than we already are; our relationship with what it actually takes to survive on a fundamental level grows more distant; we spend even less time doing the things we should be doing, and even more time procrastinating in the face of inevitable self-annihilation.
We will wipe ourselves out before technology surpasses human intelligence. And even if such machines become better equipped than we are to make their own decisions, they had better also be prepared to find their own form of sustenance (which is, obviously, electricity) and to deal with their personal illnesses (glitches, circuit failures, and "I obviously don't know anything about computer hardware" problems, for example).
We measure intelligence by how capable we are with science, how good our jobs are, and other standards that are just artifacts of being human, like the type of books one reads, or whatever really goes for "intelligence" these days. I can't say. I'm quite happy not listening to any thoughts in my head all day long. It's nice. I prefer the silence in my head to thinking. But real intelligence would be exactly what Buddhism talks about. If we were really intelligent, we'd recognize exactly how our impulses direct our thoughts and our wants, how our wants direct our actions, and finally how our actions affect our environment. We'd abolish the automobile right away; we'd do this and that and this and that.
Anyhow. My question is: how does one even compare the biology of a human, in all its complexity, to the capacities of a technology composed mainly of copper and silicon? I think the problem with asserting that the intelligence of technology will surpass that of humans is assuming we have any intelligence at all to begin with. If we did, we might ask why we're becoming concerned over a silicon chipboard surpassing our abilities to win at games of chess, or certain sophisticated diatribes involving probability calculations, if you know what I mean.
If the human brain makes decisions based on the probability of this being greater than the probability of that, and this is by definition the nuance that makes us intelligent, then computers are going to surpass us. But if I'm hungry enough, I'll eat banana peels. I don't care. Anyhow. Going to stop right there. I'm ranting.
-Pondera (I have a point though) [email protected]