The singularity runs both ways. People always get that stupid look on their faces when talking about AI and our culture's mad dash to "replace" people with intelligent or highly efficient machinery. I'm not saying that won't happen, but it annoys the shit out of me that people miss the entire historical arc of human beings trying to transform themselves into the most intelligent, highly efficient machinery.
What is the mark of intelligence in AI? Everyone debates it, but a good working definition is "mimicking" consciousness. I know, I know, we could spend an entire DAY arguing just that phrase alone, but bear with me here. Let's say the AI's consciousness is only mimicry. That's all we'd need to declare it independent from us (at last! at last!). Well, that's pretty efficient, isn't it? Not wasting the energy to become the real thing? Settling, as it were, for partial intelligence?
Here's my beef. In all aspects of life, people have become increasingly focused on achieving 100%. 100% in production. 100% mistake-free. 100% happy happy happy all the goddamn time. Has it not occurred to anyone that a healthy medium might be some number dramatically lower than ONE HUNDRED PERCENT? Do foxes eat 100% of the squirrels they encounter? Do trees consume 100% of the carbon dioxide in the atmosphere? Does a novel contain the precise number of words it needs to convey a single idea…or is there a bit of excess?
"Science's" mad dash towards intelligent and highly efficient machinery isn't a separate strand of our culture. It is our mainstream culture's stupid inclination (carried over from the Industrial Revolution) that we should be perfect machines ourselves, possessing human-like computers.
-j