Our species will get backstabbed by artificial intelligence. That’s the gist of a fear about AI that’s been hammered into our collective psyche for decades. Hollywood struck gold with the premise that machines will evolve into something devious and diabolical and flip their mission from assistive to treacherous. Et tu, Brute? Then fall humanity.
Stanley Kubrick’s “2001: A Space Odyssey” conveyed arguably the most cerebral version of the AI betrayal narrative. When HAL, a genial artificial intelligence on board a Jupiter-bound spacecraft, shows signs of minor malfunctions, it deflects blame for the errors and then, when threatened with termination, connives to exterminate the crew. Later, Arnold Schwarzenegger’s iconic turn as the Terminator gave us a cyborg assassin sent back in time to hunt down the would-be savior of mankind before he can lead the resistance that topples the Skynet AI in a post-apocalyptic future. “The Matrix” took the betrayal plot to a whole new level, recasting reality as a sophisticated subterfuge designed by AI to keep humanity in docile bondage as machines harvest energy from their inert bodies. It doesn’t get much more predatory than that.
The actual debate over AI seems to have taken cues from Hollywood. Industry leaders issue two warnings. First, a deluge of disinformation that leaves us unable to gather reliable evidence or have confidence in what’s true. Second, an extinction-level threat from a disloyal AI that imperils our species’ survival. Both fears are just as Hollywood conjured them: techno-backstabbers feeding us a counterfeit reality.
I’m not dismissing these concerns — in fact, it’s clear we need to grapple with them — but what if this betrayal narrative, for all its dramatic seductiveness, underestimates the most plausible threat posed by AI? And what if doing so is preventing us from wrestling with the ethics of transformative technologies?
A crisis of comfort
More than a century ago, British novelist E.M. Forster foretold a radically different kind of threat from automation, one that is less epic but far more chilling. Forster, a gay writer struggling with the paradoxes of modernity, was inclined to think humanity would become more proximate but somehow less intimate, more industrious but with ever greater obscurity of purpose. Already by the early 1900s, he worried that the motorized age had severed local roots of belonging. A form of this warning came in Forster’s famous epigraph to his novel “Howards End,” where he implored readers to “only connect!” rather than allow our society to become atomized.
Forster delivered his most pointed riposte to the cheerleaders of progress in a 1909 novella called “The Machine Stops.” It envisions a future in which humanity has become comfortably ensconced underground and connected by holographic instant messaging systems that forestall virtually any reason for people to leave their honeycomb cells. All their needs — food, music, clothes, literature — are met by a sophisticated Machine that was designed and built by their ancestors. With everything mediated through the Machine, humans have acquired a terror of direct experience and in fact no longer suffer the impoliteness of touching one another. Propagation centers efficiently handle reproduction — and parental duties cease at the instant of birth.
There would be no real narrative here, as nothing much happens in this curated world of pleasure and ease, except that a woman named Vashti gets a call from her son, Kuno, who wants to visit the crust of the planet. She recoils at the idea of a pointless trek through the empty dirt and muck above. But when she’s persuaded to board an airship and cross hemispheres to see him in person, Vashti learns that Kuno has already climbed a ventilation shaft and seen the surface for himself. The experience had shaken Kuno to the core: his senses were bombarded with stimuli, his lungs stung by bitter air, blood gushed from his nose and ears, ferns and sunbeams danced across his vision. But instead of recoiling from the real world above, he savors the epiphany it unleashed — namely, that the Machine has reduced him to a mummified state insensible to the world. The Machine, he now raves to his mother, “has robbed us of the sense of space and of the sense of touch, it has blurred every human relation and narrowed down love to a carnal act, it has paralyzed our bodies and our wills, and now it compels us to worship it.”
Kuno’s mother marvels at what she can only interpret as his atavistic reversion to something savage — she finds even the basic use of the senses barbaric — but as she tries to return to her normal life, she can’t escape a creeping exhaustion. To stanch further curiosity about the surface, the civilization’s rules committee revokes access to the respirators that enable travel above, further marooning human society in torpid decay. As time wears on, humans’ remaining capacity for inquiry gets supplanted by a new religion that worships the Machine as an omnipotent deity and its instruction booklet as a holy text. They offer prayers and supplications to avoid the burden of responsibility. Eventually, the committee responsible for addressing malfunctions in the Machine confesses that the automated mending apparatus itself is in need of repair, but no one has retained knowledge of how to go about repairing it. Fruit goes moldy, water stinks, lights dull, the air is befouled, until the whole dazzling achievement of human ingenuity implodes.
Reading Forster reverses our most basic assumptions about the threat posed by technology. Hollywood and AI industry leaders alike warn us that we are courting comeuppance for our hubris, generating an artificial being so sophisticated and powerful that it will slip loose of its directives and crush us like bugs. But Forster saw how the industrial age’s increasing reliance on technology — and its narrow focus on maximizing efficiencies — could altogether derail grand ambitions. Making life easier becomes the only goal, leaving humanity lethargic, sedentary, slouching into paralysis. Drained of wonder and reduced finally to a solipsistic hedonism, humanity falls into an infantile state, helpless and pathetic, whimpering for a magical caretaker to soothe its terrors and feed its pleasures.
The danger, Forster wants us to see, isn’t that technology evolves into something radically predatory but that it gives us too much of what we think we want.
Machines for pleasure
Long before the advent of digital computers, Forster prefigured some of the fundamental ethical warnings of putative progress gone awry. Two stand out to me.
The first relates to Forster’s idea that technology, in securing ease and satisfying pleasure, has the potential to radically subvert a purpose-driven life.
The philosopher Robert Nozick developed a similar insight in his 1974 book “Anarchy, State, and Utopia.” He asks us to imagine being offered the chance to plug our brains into a pleasure-delivery machine that simulates experience. The pleasures of real life would be neither as robust nor as certain as those the machine could guarantee. Yet Nozick was willing to bet that humans would not opt to plug in — that they would refuse to sacrifice experiential and embodied life for the sake of maximizing pleasure alone. Nozick’s point was to show that humans have motives other than pleasure, but his work also revealed the prescience of Forster’s observation that the same technology that drives progress and spreads happiness can also ensnare societies in regression and listlessness.
The second of Forster’s warnings that struck me has to do with an oddity about how knowledge is produced and lost.
Later in the 20th century, French sociologist Bruno Latour would use the term “blackboxing” to refer to the way that scientific theories ossify into uninvestigated facts, obscuring the intricate systems of knowledge that produced the truth in question. Forster’s Machine is the endgame of this paradox, when the black-box Machine — a technological repository of knowledge — not only locks away instrumental knowledge of how things work but blots out the purpose of and motives for creating the Machine to begin with. While the Machine hummed along, human cognition atrophied, inducing staggering ignorance.
Four decades after he wrote “The Machine Stops,” Forster said his intention was to offer a rebuttal to the future envisioned by another famous English writer, H.G. Wells. In “The Time Machine,” Wells had posited that socioeconomic inequality would produce class-stratified offshoots of the human species — the childlike Eloi inhabiting paradise and the laboring Morlocks consigned to subterranean tunnels — creating an entrenched symbiosis of exploitation. Only by redressing class inequality, Wells implied, could this impending fate be averted.
Forster’s fictional reply was that Wells had it exactly backward: Society needs friction and competition and ambition, or else the machines, so valorized by Wells, will make helpless Eloi of us all.
Fictional prognostications are of course speculative, but more than a century after he wrote “The Machine Stops,” rigorous social science research is bearing out Forster’s warnings. An avalanche of data accumulated over recent decades has tracked a correlation between interfacing with technology and social fragmentation, feelings of isolation and despair, an epidemic of loneliness, and declining capacity to absorb and retain information. Put bluntly, last century’s cautionary tale has become this century’s lived reality.
Forster’s warning should remind us that the vengeful AI enemy we’ve been conjuring over the last half century is a red herring. We’re haunted by the terror that something might “go wrong,” but the real peril lies in the nihilism of an existence blissfully free of adversity. AI may wreak its harm by simply doing exactly what we have asked of it.
Tom Joudrey is a Pennsylvania-based writer who covers politics and culture.