I read the news today… Kick ass. Now they tell me I’m gonna get munched by a machine.
No, I didn’t have a nightmare after a 2 a.m. Taco Bell run. No, I didn’t stand too close to that musician self-medicating his “glaucoma.” What I did was catch a tweet informing me bad times are coming for humanity, because even the Kremlin’s election hacking kingpin fears artificial intelligence will “eat us.”
As reported in a September 22 online article by the Daily Mail’s Iain Burns, during a tour of the Russian internet firm Yandex with its chief, Arkady Volozh, Vladimir Putin aired his fear that he and we could all soon be cyber-snacks. In turn, Volozh very politely countered that past technological advances have proven “better than humans,” specifically juxtaposing excavating machines against manually operated shovels. Unconvinced, Putin reiterated his AI angst that humanity could be digging its own grave.
Before one writes off Vlad the Bad as a Luddite, note that Tesla founder Elon Musk has called for regulation of AI before “it’s too late.” What is ominous about this odd couple’s shared concern is that it is shared by an increasing number of their fellow AI proponents. Perhaps it stems from many of them having had childhood nightmares after viewing “The Terminator”; one suspects, however, that far fewer of them have ever viewed (let alone read and understood) Frankenstein.
What is really spurring their fears is the absence of a concretely defined end for AI. This creates a concomitant absence of limits on the creation or the implementation of the technology. Putin, himself, summed up this quandary by expressing his belief (hope?) that “whoever becomes the leader in this sphere will become the ruler of the world.” At least until said leader in turn becomes AI’s appetizer.
The problem remains rooted in the human condition: a technology is as helpful or harmful as the hand that holds it. For example, a fountain pen was a technological improvement upon a quill pen; however, either one could be used to write a poem or poke out an eye. While AI proponents endeavor to resolve this problem by intrinsically diminishing unpredictable humanity’s control of the technology, they also concede ever more openly that they may not be able to control the extent to which the technology will permit human control. Of course, in their hubris, AI proponents rationalize away their concerns with the old canard that they can control their creation and, in the end, help birth a better world—you know, like when they split the atom.
Evidently dispirited by the news proving incorrect the popular myth that human beings only use 10 percent of their brain capacity, AI proponents miss a critical distinction in their blithe race to a better day. In the past, the ultimate purpose of technological advances was to improve upon humans’ external interactions with the world and each other; today, the ultimate purpose of AI is to improve upon humans. The only better day it promises is a better day for robots.
Consequently, these 21st-century Frankensteins program away with their sterile-suited Igors, fully cognizant that their AI monster’s raison d’être is to get out of hand and, coming to consider humanity an inhibiting systems virus, turn its masters into its slaves—and, per Putin, its supper.
Not being a Luddite myself, and in full disclosure having earned a science degree (OK, it was in political science, but let’s not quibble), I’m fond of many technological advances, notably the electric guitar and distortion pedal. But what I’m not fond of is these cats taking it upon themselves to improve humanity by prying open AI’s Pandora’s box and—oops!—too late discovering it’s a sardine can chock full of fresh slabs of you and me served up as an exotic hors d’oeuvre for our robot overlords.
Lest we find out for whom the dinner bell tolls, pray that sooner rather than later the proponents programming AI remember and reflect upon a humble verity before the robots print it on gas station packages of “A00100110’s 100 percent Tasty Human Jerky”: Garbage In, Garbage Out.