Cyberpunk is Dead

By John Semley

Source: The Baffler

“It was an embarrasser; what did I want? I hadn’t thought that far ahead. Me, caught without a program!”
—Bruce Bethke, “Cyberpunk” (1983)

Held annually in a downtown L.A. convention center so massive and glassy that it served as a futurist backdrop for the 1993 sci-fi action film Demolition Man and as an intergalactic “Federal Transport Hub” in Paul Verhoeven’s 1997 space-fascism satire Starship Troopers, the Electronic Entertainment Expo, a.k.a. “E3,” is the trade show of the future. Sort of.

With “electronic entertainment” now surpassing both music and movies (and, indeed, the total earnings of music and movies combined), the future of entertainment, or at least of entertainment revenue, is the future of video games. Yet it’s a future that’s backward-looking, its gaze locked in the rearview even as the medium hurtles forward.

Highlights of E3’s 2019 installment included more details about a long-gestating remake of the popular PlayStation 1-era role-playing game Final Fantasy VII, a fifth entry in the demon-shooting franchise Doom, a mobile remake of the jokey kids’ side-scroller Commander Keen, and playable adaptations of monster-budget movie franchises like Star Wars and The Avengers. But no title at E3 2019 garnered as much attention as Cyberpunk 2077, the unveiling of which was met with a level of slavish mania one might reserve for a stadium rock concert, or the ceremonial reveal of an efficacious new antibiotic.

An extended trailer premiere worked to whet appetites. Skyscrapers stretched upward, slashed horizontally with long windows of light and decked out with corporate branding for companies called “DATA INC.” and “softsys.” There were rotating wreaths of bright neon billboards advertising near-futuristic gizmos and gewgaws, and, at street level, sketchy no-tell motels and cars of the flying, non-flying, and self-piloting variety. In a grimy, high-security bunker, a man with a buzzcut, his face embedded with microchips, traded blows with another, slightly larger man with a buzzcut, whose fists were robotically augmented like the cyborg Special Forces brawler Jax from Mortal Kombat. The trailer smashed to its title, and to wild applause from congregated gamers and industry types.

Then, to a chug-a-lug riff provided by Swedish straight-edge punkers Refused (recording under the nom de guerre SAMURAI) that sounded like the sonic equivalent of a can of Monster energy drink, an enormous freight-style door lifted, revealing, through a haze of pumped-out fog, a vaguely familiar silhouette: a tall, lean-muscular stalk, scraggly hair cut just above the shoulders. Over the PA system, in smoothly undulating, bass-heavy movie trailer tones, a canned voice announced: “Please welcome . . . Keanu Reeves.” Applause. Pitchy screams. Hysterics in the front row prostrating themselves in Wayne’s World “we’re not worthy!” fashion. “I gotta talk to ya about something!” Reeves roared through the din. Dutifully reading from a teleprompter, he plugged Cyberpunk 2077’s customizable characters and its “vast open world with a branching storyline,” set in “a metropolis of the future where body modification has become an obsession.”

More than just stumping for Cyberpunk 2077, Reeves lent his voice and likeness to the game as a non-playable character (NPC) named “Johnny Silverhand,” who is described in the accompanying press materials as a “legendary rockerboy.” A relative newbie to the world of blockbuster Xbox One games, Reeves told the audience at E3 that Cyberpunk piqued his interest because he’s “always drawn to fascinating stories.” The comment is a bit rich—OK, yes, this is a trade show pitch, but still—considering that such near-futuristic, bodily augmented, neon-bathed dystopias are hardly new ground for Reeves. His appearance in Cyberpunk 2077 serves more to lend the game some genre cred, given Reeves’s starring roles in canonical sci-fi films such as Johnny Mnemonic (1995) and the considerably more fantastic Matrix trilogy (1999-2003)—now a quadrilogy, a fourth installment having been announced just recently. Like many of E3 2019’s other top-shelf titles, Cyberpunk 2077 looked forward by reflecting back, conjuring its tech-noir scenario from the nostalgic ephemera of cyberpunk futures past.

This was hardly lost amid all the uproar and excitement. Author William Gibson, a doyen of sci-fi’s so-called “cyberpunk” subgenre, offered his own withering appraisal of Cyberpunk 2077, tweeting that the game was little more than a cloned Grand Theft Auto, “skinned-over with generic 80s retro-future” upholstery. “[B]ut hey,” Gibson added, a bit glibly, “that’s just me.” One would imagine that, at least in the burrows of cyberpunk fandom, Gibson’s criticism carries considerable weight.

After all, the author’s 1984 novel Neuromancer is a core text in cyberpunk literature. Gibson also wrote the screenplay for Johnny Mnemonic, adapted from one of his own short stories, which likewise developed the aesthetic and thematic template for the cyberpunk genre: future dystopias in which corporations rule, computer implants (often called “wetware”) permit access to expansive virtual spaces that unfold before the user like a walk-in World Wide Web, scrappy gangs of social misfits unite to hack the bad guys’ mainframes, and samurai swords proliferate, along with Yakuza heavies, neon signs advertising noodle bars in kanji, and other fetish objects imported from Japanese pop culture. Gibson dissing Cyberpunk 2077 is a bit like Elvis Presley clawing his way out of the grave to disparage an aspiring Elvis impersonator.

Gibson’s snark speaks to a deeper malaise that has beset cyberpunk. Once a lively genre that offered a clear, if goofy, vision of the future, its structures of control, and the oppositional forces undermining those authoritarian edifices, cyberpunk has since been clouded by a kind of self-mythologizing nostalgia. This problem was diagnosed as early as 1991 by novelist Lewis Shiner, himself an early cyberpunk-lit affiliate.

“What cyberpunk had going for it,” Shiner wrote in a New York Times op-ed titled “Confessions of an Ex-Cyberpunk,” “was the idea that technology did not have to be intimidating. Readers in their teens and 20’s responded powerfully to it. They were tired of hearing how their home computers were tempting them into crime, how a few hackers would undermine Western civilization. They wanted fiction that could speak to the sense of joy and power that computers gave them.”

That sense of joy had been replaced, in Shiner’s estimation, by “power fantasies” (think only of The Matrix, in which Reeves’s moonlighting hacker becomes a reality-bending god), which offer “the same dead-end thrills we get from video games and blockbuster movies” (enter, in due time, the video games and blockbuster movies). Where early cyberpunk offerings rooted through the scrap heap of genre, history, and futurist prognostication to cobble together a genre that felt vital and original, its modern iterations have recourse only to the canon of cyberpunk itself, smashing together tropes, clichés, and old-hat ideas that, echoing Gibson’s complaint, feel pathetically unoriginal.

As Refused (in their pre-computer game rock band iteration) put it on the intro to their 1998 record The Shape of Punk to Come: “They told me that the classics never go out of style, but . . . they do, they do.”

Blade Ran

The word was minted by author Bruce Bethke, who gave the title “Cyberpunk” to a short story about teenage hackers written in 1980 and published in 1983. But cyberpunk’s origins can be fruitfully traced back to 1968, when Philip K. Dick published Do Androids Dream of Electric Sheep?, a novel that updated the speculative fiction of Isaac Asimov’s Robot series for the psychedelic era. It’s ostensibly a tale about a bounty hunter named Rick Deckard chasing rogue androids in a post-apocalyptic San Francisco circa 1992. But like Dick’s better stories, it used its ready-made pulp sci-fi premise to flick at bigger questions about the nature of sentience and empathy, playing to a readership whose conceptions of consciousness were expanding.

Ridley Scott brought Dick’s story to the big screen with a loose 1982 film adaptation, Blade Runner, which cast Harrison Ford as Deckard and pushed its drizzly setting ahead to 2019. With its higher order questions about what it means to think, to feel, and to be free—and about who, or what, is entitled to such conditions—Blade Runner effectively set a cyberpunk template: the billboards, the neon, the high-collared jackets, the implants, the distinctly Japanese-influenced mise-en-scène extrapolated from Japan’s 1980s-era economic dominance. It is said that William Gibson saw Blade Runner in theaters while writing Neuromancer and suffered something of a crisis of confidence. “I was afraid to watch Blade Runner,” Gibson told The Paris Review in 2011. “I was right to be afraid, because even the first few minutes were better.” Yet Gibson deepened the framework established by Blade Runner with a crucial invention that would come to define cyberpunk as much as drizzle and dumpsters and sky-high billboards. He added another dimension—literally.

Henry Case, Gibson establishes early on, “lived for the bodiless exultation of cyberspace.” As delineated in Neuromancer, cyberspace is an immersive, virtual dimension. It’s a fully realized realm of data—“bright lattices of logic unfolding across that colorless void”—which hackers can “jack into” using strapped-on electrodes. That the matrix is “bodiless” is a key concept, both of Neuromancer and of cyberpunk generally. It casts the Gibsonian idea of cyberspace against another of the genre’s hallmarks: the high-tech body mods flogged by Keanu Reeves during the Cyberpunk 2077 E3 demo.

Early in Neuromancer, Gibson describes these sorts of robotic, cyborg-like implants and augmentations. A bartender called Ratz has a “prosthetic arm jerking monotonously” that is “cased in grubby pink plastic.” The same bartender has implanted teeth: “a webwork of East European steel and brown decay.” Gibson’s intense, earthy descriptions of these body modifications clue the reader in to the fundamental appeal of Neuromancer’s matrix, in which the body itself becomes utterly immaterial. Authors from Neal Stephenson (Snow Crash) to Ernest Cline (Ready Player One, which is like a dorkier Snow Crash, if such a thing is conceivable) further developed this idea of what theorist Fredric Jameson called “a whole parallel universe of the nonmaterial.”

As envisioned in Stephenson’s Snow Crash, circa 1992, this parallel universe takes shape less as some complex architecture of unfathomable data, and more as an immersive, massively multiplayer online role-playing game (MMORPG). Stephenson’s “Metaverse”—a “moving illustration drawn by [a] computer according to specifications coming down the fiber-optic cable”—is not a supplement to our real, three-dimensional world of physical bodies, but a substitute for it. Visitors navigate the Metaverse using virtual avatars, which are infinitely customizable. As Snow Crash’s hero-protagonist, Hiro Protagonist (the book, it should be noted, is something of a satire), describes it: “Your avatar can look any way you want it to . . . If you’re ugly, you can make your avatar beautiful. If you’ve just gotten out of bed, your avatar can still be wearing beautiful clothes and professionally applied makeup. You can look like a gorilla or a dragon or a giant talking penis in the Metaverse.”

Beyond Meatspatial Reasoning

The Metaverse seems to predict the wide-open, utopian optimism of the internet: that “sense of joy and power” Lewis Shiner was talking about. It echoes early 1990s blather about the promise of a World Wide Web free from corporate or government interests, where users could communicate with others across the globe, forge new identities in chat rooms, and sample from a smorgasbord of lo-res pornographic images. Key to this promise was, to some extent, forming new identities and relationships by leaving one’s physical form behind (or jacked into a computer terminal in a storage locker somewhere).

Liberated from such bulky earthly trappings, we’d be free to pursue grander, more consequential adventures inside what Gibson, in Neuromancer, calls “the nonspace of the mind.” Elsewhere in cyberpunk-lit, bodies are seen as impediments to the purer experience of virtuality. After a character in Cory Doctorow’s Down and Out in the Magic Kingdom unplugs from a bracingly real simulation immersing him in the life of Abraham Lincoln, he curses the limitations of “the stupid, blind eyes; the thick, deaf ears.” Or, as Case puts it in Neuromancer, the body is little more than “meat.”

In Stephenson’s Metaverse, virtual bodies don’t even obey the tedious laws of physics that govern our non-virtual world. To manage the high volume of pedestrian traffic within the Metaverse and prevent users from bumping around endlessly, the complicated computer programming permits avatars simply to pass through one another. “When things get this jammed together,” Hiro explains, “the computer simplifies things by drawing all of the avatars ghostly and translucent so you can see where you’re going.” Bodies—or their virtual representations—waft through one another, as if existing in the realm of pure spirit. There is an almost Romantic bent here (Neuromancer = “new romancer”). If the imagination, to the Romantics, opened up a gateway to deep spiritual truth, here technology serves much the same purpose. Philip K. Dick may have copped something of the 1960s psychedelic era’s ethos of expanding the mind to explore the radiant depths of the individual soul, spirit, or whatever, but cyberpunk pushed that ethos outside, creating a shared mental non-space accessible by anyone with the means—a kind of Virtual Commons, or what Gibson calls a “consensual hallucination.”

Yet outside this hallucination, bodies still persist. And in cyberpunk, the physical configurations of these bodies tend to express their own utopian dimension. Bruce Bethke claimed that “cyberpunk” resulted from a deliberate effort to “invent a new term that grokked the juxtaposition of punk attitudes and high technology.” Subsequent cyberpunk did something a bit different, not juxtaposing but dovetailing those “punk attitudes” with high tech. (“Low-life, high-tech” is a kind of cyberpunk mantra.) Neuromancer’s central heist narrative gathers a cast of characters—hacker Henry Case, a cybernetically augmented “Razorgirl” named Molly Millions, a drug-addled thief, a Rastafari pilot—that can be described as “ragtag.” The major cyberpunk blockbusters configure their anti-authoritarian blocs along similar lines.

In Paul Verhoeven’s cyberpunk-y action satire Total Recall, a mighty construction worker-cum-intergalactic-spy (Arnold Schwarzenegger) joins a Martian resistance led by sex workers, physically deformed “mutants,” little people, and others whose physical identities mirror their economic alienation and opposition to a menacing corporate-colonial overlord named Cohaagen.

In Johnny Mnemonic, Keanu Reeves’s businesslike “mnemonic courier” (someone who ferries information using computer implants embedded in the brain) is joined by a vixenish bodyguard (Dina Meyer’s Jane, herself a version of Neuromancer’s Molly Millions), a burly doctor (Henry Rollins), and a group of street urchin-like “Lo-Teks” engaged in an ongoing insurgency against the mega-corporation Pharmakom. Both Mnemonic and Recall rely on cheap twists, in which a figure integral to the central intrigue turns out to be something ostensibly less- or other-than-human. Total Recall has Kuato, a half-formed clairvoyant mutant who appears as a tumorous growth wriggling in the abdomen of his brother. Even more ludicrously, Mnemonic’s climax reveals that the Lo-Teks’ leader is not the resourceful J-Bone (Ice-T), but rather Jones, a computer-augmented dolphin. In cyberpunk, the body’s status as “dead meat” to be transcended through computer hardware and neurological implantation offers a corollary sense of freedom.

The idea of the cybernetic body as a metaphor for the politicized human body was theorized in 1985, cyberpunk’s early days, by philosopher and biologist Donna Haraway. Dense and wildly eclectic, by turns exciting and exasperating, Haraway’s “Cyborg Manifesto” is situated as an ironic myth, designed to smash existing oppositions between science and nature, mind and body. Haraway was particularly interested in developing an imagistic alternative to the idea of the “Goddess,” so common to the feminism of the time. Where the Goddess was backward-looking in orientation, attempting to connect women to some prelapsarian, pre-patriarchal state of nature, the cyborg was a myth of the future, or at least of the present. “Cyborg imagery,” she writes, “can suggest a way out of the maze of dualisms in which we have explained our bodies and our tools to ourselves.” Haraway visualizes the cyborg—part machine, part flesh—as a being that threatens existing borders and assumes responsibility for building new ones.

Though they are not quite identical concepts, Haraway’s figure of the cyborg and the thematics of cyberpunk have much in common. A character like Gibson’s Molly Millions, for example, could be described as a cyborg, even if she is still essentially gendered as female (the gender binary was one of the many “dualisms” Haraway believed the cyborg could collapse). Cyborgs and cyberpunk are connected in their resistance to an old order, be it political and economic (as in Neuromancer, Johnny Mnemonic, etc.) or metaphysical (as in Haraway). The cyborg and the cyberpunk both dream of new futures, new social relationships, new bodies, and whole new categories of conceptions and ways of being.

The historical problem is that, for the most part, these new categories and these new relationships failed to materialize, as cyberpunk’s futures were usurped and commodified by the powers they had hoped to oppose.

Not Turning Japanese

In an introduction to the Penguin Galaxy hardcover reissue of Neuromancer, sci-fi-fantasy writer Neil Gaiman ponders precisely how the 1980s cyberpunk visions came to shape the future. “I wonder,” he writes, “to what extent William Gibson described a future, and how much he enabled it—how much the people who read and loved Neuromancer made the future crystallize around his vision.”

It’s a paradox that dogs most great sci-fi writers, whose powers for Kuato-style clairvoyance have always struck me as exaggerated. After all, it’s not as if, say, Gene Roddenberry literally saw into the future, observed voice-automated assistants of the Siri and Alexa variety, and then invented his starship’s speaking computers. It’s more that other people saw the Star Trek technology and went about inventing it. The same is true of Gibson’s matrix or Stephenson’s Metaverse, or the androids of Asimov and Dick. And the realization of many technologies envisioned by cyberpunk—including the whole concept of the internet, which now operates not as an escapist complement to reality, but as an essential part of its fabric, like water or heat—has occurred not because of scrappy misfits and high-tech lowlifes tinkering in dingy basements, but because of gargantuan corporate entities. Or rather, the cyberpunks have become the corporate overlords, making the transition from the Lo-Teks to Pharmakom, from Kuato to Cohaagen. In the process, the genre and all its aspirations have been reduced to so much dead meat. This is what Shiner was reacting to when, in 1991, he renounced his cyberpunk affiliations, or when Bruce Bethke, who coined the term, began referring to “cyberpunk” as “the c-word.”

The commodification of the cool is a classic trick of capitalism, which has the frustrating ability to mutate faster than the forces that oppose it. Yet even this move toward commodification and corporatization is anticipated in much cyberpunk. “Power,” for Neuromancer’s Henry Case, “meant corporate power.” Gibson goes on: “Case had always taken it for granted that the real bosses, the kingpins in a given industry, would be both more and less than people.” For Case (and, it follows, Gibson, at least at the time of his writing), this power had “attained a kind of immortality” by evolving into an organism. Taking out one or another malicious CEO hardly matters when lines of substitutes are waiting in the wings to assume the role.

It’s here that cyberpunk critiques another kind of body. Not the ruddy human form that can be augmented and perfected by prosthetics and implants, but the economic body. Regarding the economy as a holistic organism—or a constituent part of one—is an idea that dates back at least as far as Adam Smith’s “invisible hand.” The rhetoric of contemporary economics is similarly biological. An edifying 2011 essay in Al Jazeera by Paul Rosenberg examined the power of such symbolic conceptions of the economy. “The organic metaphor,” Rosenberg writes, “tells people to accept the economy as it is, to be passive, not to disturb it, to take a laissez faire attitude—leave it alone.”

This idea calls back to another of cyberpunk’s key aesthetic influences: the “body economic” of Japan in the 1980s. From the 2019 setting of 1982’s Blade Runner, to the conspicuous appearance of yakuza goons in Gibson’s stories, to Stephenson’s oddly anachronistic use of “Nipponese” in Snow Crash, cyberpunk’s speculative futures proceed from the economic ascendancy of 1980s Japan, and the attendant anxiety that Japan would eventually eclipse America as an economic powerhouse. This idea, that Japan somehow is (or was) the future, has persisted all the way up to Cyberpunk 2077’s aesthetic template, and its foregrounding of villains like the shadowy Arasaka Corporation. It suggests that, even though it unfolds nearly sixty years in our future, the blockbuster video game is still obsessed with a vision of the future past.

Indeed, it’s telling that as the robust Japanese economy receded in the 1990s, its burly body giving up the proverbial ghost, Japanese cinema became obsessed with avenging spirits channeled into the present by various technologies (a haunted video cassette in Hideo Nakata’s Ringu, the internet itself in Kiyoshi Kurosawa’s Kairo, etc.). But in the 1980s, Japan’s economic and technological dominance seemed like a foregone conclusion. In a 2001 Time article, Gibson called Japan cyberpunk’s “de facto spiritual home.” He goes on:

I remember my first glimpse of Shibuya, when one of the young Tokyo journalists who had taken me there, his face drenched with the light of a thousand media-suns—all that towering, animated crawl of commercial information—said, “You see? You see? It is Blade Runner town.” And it was. It so evidently was.

Gibson’s analysis features one glaring mistake. His insistence that “modern Japan simply was cyberpunk” is tethered to the country’s actual history as an economic and technological powerhouse circa the 1980s, and not to its own science-fictional preoccupations. “It was not that there was a cyberpunk movement in Japan or a native literature akin to cyberpunk,” he writes. Except there so evidently was.

The Rusting World

Even beyond the limp, Orwellian connotations, 1984 was an auspicious year for science fiction. There was Neuromancer, yes. But 1984 also saw the first collected volume of Akira, a manga written and illustrated by Katsuhiro Otomo. Originally set, like Blade Runner, in 2019, Akira imagines a cyberpunk-y Neo-Tokyo, in which motorcycle-riding gangs do battle with oppressive government forces. Its 1988 anime adaptation was even more popular, in both Japan and the West. (The film’s trademark cherry red motorcycle has been repeatedly referenced in the grander cyberpunk canon, appearing in Steven Spielberg’s film adaptation of Ready Player One and, if pre-release hype is to be believed, in Cyberpunk 2077 itself.) In 2018, the British Film Institute hailed Akira, accurately, as “a vital cornerstone of the cyberpunk genre.”

Japan has plenty of other, non-Akira cyberpunk touchstones. As a cinematic subgenre, Japanese cyberpunk feels less connected to the “cyber” and more to the spirit of “punk,” whether in the showcasing of actual Japanese punk rock bands (as in 1982’s Burst City) or the films’ own commitment to a rough-hewn, low-budget, underground aesthetic. Chief among the latter films is Shinya Tsukamoto’s Tetsuo: The Iron Man, which was shot on 16mm over a grueling year and a half, mostly in and around Tetsuo actress and cinematographer Kei Fujiwara’s apartment, which also housed most of the film’s cast and crew.

Next to the Western cyberpunk classics, Tsukamoto’s vision of human-machine hybridization is demonstrably more nightmarish. The film follows two characters, credited as the Salaryman (Tomorowo Taguchi) and the Guy (a.k.a. “The Metal Fetishist,” played by writer/director/producer/editor Tsukamoto himself), bound by horrifying mutations, which see their flesh and internal organs sprouting mechanical hardware.

In its own way, Tetsuo works as a cyberpunk-horror allegory for the Japanese economy. As the Salaryman and the Fetishist learn to accept the condition of their mechanization, they merge together, absorbing all the inorganic matter around them, growing enormously like a real-world computer virus or some terrifying industrial Katamari. Their mission resonates like a perverse inversion of Japan’s post-industrial promise. As Tsukamoto’s Fetishist puts it: “We can rust the whole world and scatter it into the dust of the universe.”

Like Haraway’s development of the cyborg as a metaphoric alternative to the New Age “goddess,” Tetsuo’s titular Iron Man can offer a similar corrective. If cyberpunk has become hopelessly obsessed with its own nostalgia, recycling all its 1980s bric-a-brac endlessly, then we need a new model. Far from the visions of Gibson, in which technology provides an outlet for a scrappy utopian impulse that jeopardizes larger corporate-political dystopias, Tetsuo is more pessimistic. It sees the body—both the individual physical body and the grander corpus of political economy—as being machine-like. Yet, as Rosenberg notes in his Al Jazeera analysis of economic rhetoric, it may be more useful to conceive of the economy not as a “body” or an organism but as a machine. The body metaphor is conservative, “with implications that tend toward passivity and acceptance of whatever ills there may be.” Machines, by contrast, can be fixed, greased, re-oriented. They are, unlike bodies, a thing separate from us, and so subject to our designs.

Cybernetic implants and cyborg technology are not some antidote to corporate hegemony. The human does not meld with technology to transcend the limitations of humanity. Rather, technology and machinery pose direct threats to precisely that condition. We cannot, in Tsukamoto’s film, hack our way to a better future, or technologically augment our way out of collective despair. Technology—and the mindless rush to reproduce it—are, to Tsukamoto, the very conditions of that despair. Even at thirty years old, Tetsuo offers a chilling vision not of the future, or of 1980s Japan, but of right now: a present where the liberating possibilities of technology have been turned inside-out; where hackers become CEOs whose platforms despoil democracy; where automation offers not the promise of increased wealth and leisure time, but joblessness, desperation, and the wholesale redundancy of the human species; where the shared hallucination of the virtual feels less than consensual.

There’s nothing utopian about the model of cyberpunk developed in Tetsuo: The Iron Man. It is purely dystopian. But this defeatism offers clarity. And in denying the collaborative, collectivist, positive vision of a technological future in favor of a vision of identity-destroying, soul-obliterating horror, Tsukamoto’s stone-cold classic of Japanese cyberpunk invites us to imagine our own anti-authoritarian, anti-corporate arrangements. The enduring canon of American-style cyberpunk may have grown rusty. It has been caught, as Bethke put it in his genre-naming story, “without a program.” But the genre’s gnarlier, Japanese iterations have plenty to offer, embodying sci-fi’s dream of imagining a far-off future as a deep, salient critique of the present. It is only when we accept this cruel machinery of the present that we can freely contemplate how best to tinker with its future.

Left to peddle such a despairing vision in a packed-out L.A. convention center, even cyberpunk’s postmortem poster boy Keanu Reeves would be left with little to say but a resigned, bewildered, “Whoa . . .”

Whose Dystopia Is It Anyway?

Reason writers debate which fictional dystopia best predicted our current moment.

By Mike Riggs, Katherine Mangu-Ward, Todd Krainin, Nick Gillespie, Jesse Walker, Robby Soave, Eric Boehm, Christian Britschgi, Peter Suderman & Brian Doherty

Source: Reason

With social media platforms seemingly unable to distinguish Russian trolls from red-blooded Americans, the last two years have felt like a Deckardian purgatory. The frequency with which intellectual elites accuse their detractors of laboring on behalf of an always-approaching-never-arriving foreign power, meanwhile, smacks of Orwell. And if the proliferation of opioids in the American heartland doesn’t sound like “delicious soma,” what does? (Marijuana? Alcohol? Twitter?)

“We live in Philip K. Dick’s future, not George Orwell’s or Aldous Huxley’s,” George Washington University’s Henry Farrell recently argued in the Boston Review. Despite being a poor prognosticator of what future technologies would look like and do, Dick, Farrell writes, “captured with genius the ontological unease of a world in which the human and the abhuman, the real and the fake, blur together.”

But the universe of possibilities is much larger than just Orwell, Huxley, or Dick. Below, Reason‘s editorial staffers make the case for nearly a dozen other Nostradamii of the right now, ranging from Edgar Allan Poe to Monty Python’s Terry Gilliam. As for why we’re debating dystopias, and not utopias: Because there is no bad in a utopia, and because no dystopia could persist for long without at least a little good, it’s safe to assume that if you’re living in an imperfect world—and you very much are—it’s a dystopian one.

Dick wasn’t wrong, but Edgar Allan Poe got there first, writes Nick Gillespie:

At the core of Philip K. Dick’s work is a profound anxiety about whether we are autonomous individuals or being programmed by someone or something else. In Do Androids Dream of Electric Sheep?, are the characters human or Nexus-6 androids? In The Three Stigmata of Palmer Eldritch and A Scanner Darkly, you’re never quite sure what’s real and what’s the product of too much “Chew-Z” and “Substance D,” hallucinogenic, mind-bending drugs that erode the already-thin line between reality and insanity.

Which is to say that Dick’s alternately funny and terrifying galaxy is a subset of the universe created by Edgar Allan Poe a century earlier. Poe’s protagonists—not really the right word for them, but close enough—are constantly struggling with basic questions of what is real and what is the product of their own demented minds.

This dilemma is front and center in Poe’s only novel, The Narrative of Arthur Gordon Pym (1838), which tells the story of a stowaway who ships out on the Grampus and endures mutiny, shipwreck, cannibalism, and worse. It becomes harder and harder for Pym to trust his senses about the most basic facts, such as what side of a piece of paper has writing on it. The conclusion—not really the right word for the book’s end, but close enough—dumps Pym’s epistemological problem into the reader’s lap in violent and hysterical fashion. A friend told me he threw the book across the room in disbelief when he read its final page, which anticipates the frustration so many of us feel while following the news these days. Just when you think reality can’t get any stranger or less believable, it does exactly that, in both Poe’s fictional world and our real one.

2018’s turn toward hamfisted authoritarianism echoes Terry Gilliam’s Brazil, says Christian Britschgi:

No-knock raids by masked, militarized police officers. A ludicrously inefficient bureaucracy. Crackdowns on unlicensed repairmen. If all this sounds eerily familiar, you may have seen it coming in 1985’s Brazil.

Set in a repressive near-future Britain, the film tells the story of lowly civil servant Sam Lowry, who wants nothing more than to hide in the comically inefficient bureaucratic machine that employs him, all while doing his level best to quietly resist both a narcissistic culture demanding he rise higher, and a brutish security apparatus looking to punish anyone who steps out of line.

Directed by Monty Python alum Terry Gilliam, Brazil is surreal, ridiculous, and often just plain silly. Yet there is something chilling about the film’s depiction of the state as a bumbling, byzantine bureaucracy that can’t help but convert every aspect of life into an endless series of permission slips, reinforced by a system of surveillance, disappearance, and torture.

Evil and inefficiency are intimately intertwined in Brazil—with the whole plot set in motion by a literal bug in the system that sends jackbooted thugs to raid the wrong house and arrest the wrong man. While the regime in Brazil lacks a central, dictatorial figure at the top of the pyramid, there is definitely something distinctly current about the world it depicts, with every application of force complemented by an equal element of farce. Trump’s first crack at imposing a travel ban, for instance, proved incredibly draconian and cruel precisely because of how rushed, sloppy, and incoherent the actual policy was.

Fortunately, our own world does manage to be far less authoritarian than the one depicted in Brazil, and it has mercifully better-functioning technology as well. The parallels can still give one pause, however, when one considers what direction we might be headed in.

The current moment definitely tilts toward Ray Bradbury’s Fahrenheit 451, says Eric Boehm:

We are not living in a world where government agents raid homes to set books ablaze, but “there is more than one way to burn a book, and the world is full of people running about with lit matches,” as Ray Bradbury warned in a coda appended to post-1979 editions of his 1953 classic.

Specifically, Bradbury was warning about the dangers of authoritarian political correctness. In that coda, he relates anecdotes about an undergrad at Vassar College asking if he’d consider revising The Martian Chronicles to include more female characters, and a publishing house asking him to remove references to the Christian god in a short story they sought to reprint.

More generally, though, Bradbury was commenting on the common misunderstanding of Fahrenheit 451 as a story about an authoritarian government burning books. It is that, of course, but it’s really about how cultural decay allows authoritarianism to flourish. It was only after people had decided for themselves that books were dangerous that the government stepped in to enforce the consensus, Guy Montag’s boss tells him in one of the novel’s best scenes. “Technology, mass exploitation, and minority pressure carried the trick,” Captain Beatty explains. “Politics? One column, two sentences, a headline! Whirl man’s mind around so fast…that the centrifuge flings off all unnecessary, time-wasting thought!”

In place of literature and high culture, Bradbury’s dystopia has an eerily accurate portrayal of reality television. Montag’s wife is obsessed with the “parlor family” who inhabit the wall-sized television screens in the living room, and clearly has a closer attachment to them than to her husband. The ubiquity of those screens—and how the government exploits them—is on full display near the end of the story, when Montag is on the lam for revolting against orders to burn books, and messages are flashed across every parlor screen in the city telling people to look for the dangerous runaway fireman.

We might not live in Montag’s specific version of Bradbury’s dystopia, but we exist somewhere on the timeline that leads there—which is exactly what Bradbury, and Captain Beatty, are trying to tell us.

Wrong book! We’re really living in Neal Stephenson’s Snow Crash, says Katherine Mangu-Ward:

It’s 1992. Computers are running Windows 3.1. Mobile phones are rare and must be carried in a suitcase. A few nerds in Illinois are getting pretty close to inventing the first popular web browser, but they’re not quite there yet.

This is the year Neal Stephenson publishes Snow Crash, a novel whose action centers on a global fiber optic network, which can be accessed wirelessly via tiny computers and wearables. On this network, users are identifiable by their avatars—“avatar” being a Sanskrit word that Stephenson’s novel popularized—which may or may not be reliable indications of what the users are like in real life. Many of the characters work as freelancers, coding, delivering goods, or collecting information piecemeal. They are compensated in frictionless micropayments, some of which take place in encrypted online digital currency. Intellectual property is the most valuable kind of property, but knowledge is stored in vast digital libraries that function as fully searchable encyclopedias and compendia. Plus there’s this really cool digital map where you can zoom in and see anywhere on the planet.

Basically what I’m saying here is that every other entry in the feature is baloney. We are living in the world Neal Stephenson hallucinated after spending too much time in the library in the early 1990s. End of story.

Is it a dystopia? Sure, if you want to get technical about it: Our antihero, Hiro Protagonist (!), is beset by all manner of typical Blade Runner–esque future deprivations, including sub-optimal housing, sinister corporate villains, and a runaway virus that threatens to destroy all of humanity.

But in addition to the this-guy-must-have-a-secret-time-machine prescience of the tech, the book offers a gritty/pretty vision of anarcho-capitalism that’s supremely compelling—when they’re in meatspace, characters pop in and out of interestingly diverse autonomous quasi-state entities, and what remains of the U.S. government is just one of the governance options.

Stephenson’s semi-stateless cyberpunk vision is no utopia, that’s for darn sure. But the ways in which it anticipated our technological world are astonishing, and I wouldn’t mind if our political reality inched a little closer to Snow Crash’s imagined future as well.

Katherine is off by three years. 1989’s Back to the Future: Part II is actually the key to understanding 2018, says Robby Soave:

Back to the Future: Part II has always been the least-appreciated entry in the series: It’s the most confusing and kid-unfriendly, lacking both the originality of the first film and the emotional beats of the third. But almost 30 years after its release, the middle installment of Robert Zemeckis’s timeless time-travel epic is newly relevant: not for accurately depicting the future, but for warning us what life would be like with a buffoonish, bullying billionaire in charge.

2015, the furthest point in the future visited by Marty McFly and “Doc” Emmett Brown, has come and gone, and we still don’t have flying cars, hoverboards, or jackets that dry themselves. But we do have a president who seems ripped from the film’s alternate, hellish version of Hill Valley in 1985, where the loathsome Biff Tannen has become a powerful mogul after traveling into the past and using his knowledge of the future to rig a series of events in his favor.

The similarities between Trump and alternate-reality Biff are so numerous that Back to the Future writer Bob Gale has retroactively (and spuriously) claimed the 45th president as inspiration for the character. Biff buys Hill Valley’s courthouse and turns it into a casino hotel. Biff is a crony capitalist who weaponizes patriotism for personal enrichment (“I just want to say one thing: God bless America”). Biff is a paunchy playboy with two supermodel ex-wives, a bad temper, and even worse hair. There’s no escaping Biff: He’s a media figure, a businessman, a civic leader, and even a member of the family.

“Biff is corrupt, and powerful, and married to your mother!” Doc Brown laments to Marty. Millions of Americans no doubt feel the same way about a man who similarly possesses the uncanny ability to commandeer our attention and insert himself into every facet of modern life. Sometimes it’s hard to avoid the feeling that we’re simply living through the wrong timeline—thanks, McFly.

We may not have hoverboards, but America is teeming with the legal “Orb” from Woody Allen’s Sleeper, observes Todd Krainin:

The world never recovered after Albert Shanker, president of the United Federation of Teachers, acquired a nuclear warhead. Two hundred years later, in the year 2173, the territory once known as the United States is ruled by The Leader, the avuncular figurehead of a police state that brainwashes, surveils, and pacifies every citizen.

Every citizen except for our hero, Miles Monroe. Cryogenically frozen in the late 20th century, Monroe is thawed out in the 22nd. As the only person alive with no biometric record, Monroe is essentially an undocumented immigrant from the past, making him the ideal secret weapon for an underground revolutionary movement.

“What kind of government you guys got here?” asks a bewildered Monroe, after learning the state will restructure his brain. “This is worse than California!”

Monroe’s quest to take down the worse-than-Sacramento government takes him through a world that’s amazingly prescient for a film that aims for slapstick comedy. He gets high on the orb (space age marijuana), crunches on a 15-foot-long stalk of techno-celery produced on an artificial farm (GMOs), impersonates a domestic assistant (Alexa), and joins a crunchy underground (#Resist), in order to defeat The Leader (guess who).

Sleeper’s most memorable invention is the Orgasmatron, a computerized safe space that provides instant climaxes for a frigid and frightened populace. It’s basically internet porn and sex robots for today’s intimacy-averse millennials.

At the high point of the film, Monroe attempts to clone The Leader from his nose. This in a film released 23 years before real doctors cloned Dolly the sheep from the cell of a mammary gland.

By the film’s end, Monroe is faced with the prospect of replacing The Leader with a revolutionary band of eco-Marxists. But some things never change.

“Political solutions don’t work,” he prophesies. “It doesn’t matter who’s up there. They’re all terrible.”

For a journalism outlet, we’ve been embarrassingly slow to recognize that Orson Scott Card’s Ender’s Game explains the media world we live in, argues Peter Suderman:

In a 2004 feature for Time, Lev Grossman explored a new form of web-based journalism that was then radically reshaping both the political and media landscapes: blogs. Grossman profiled several bloggers, most of whom were young and relatively unknown, with little experience in or connection to mainstream journalism. Yet “blogs showcase some of the smartest, sharpest writing being published,” Grossman wrote. In particular, bloggers were influencing some pretty big national conversations about U.S. military actions and politics.

From the vantage of 2018, all this might seem like old news: The mainstream media has adopted and amplified many blogging practices. But even in 2004, the idea of user-produced, semi-anonymous journalism, posted directly to the net with no editorial filter, had been in circulation for years as a sci-fi conceit—perhaps most prominently in Orson Scott Card’s 1985 novel, Ender’s Game.

In the book, a child genius named Ender Wiggin is sent to an orbiting military academy to prepare for a war with an alien enemy. While he’s away, his adolescent siblings—themselves unusually gifted—hatch a plan to manipulate world politics by posting pseudonymous political arguments on “the nets.” These essays are read by citizens and politicians alike, and both siblings develop powerful followings. Eventually, they help prevent the world from exploding into planetary war and pave the way for mankind’s colonial expansion into space.

Card’s narrative was too compact, its assumptions about the influence of online writing too simplistic. But it previewed the ways in which the internet would expand the reach and influence of little-known writers—especially political pundits—who lack conventional journalistic training or credentials. Today’s internet-based media landscape is neither a utopia nor a dystopia, but a lively, raucous, fascinating, and occasionally frustrating extrapolation of what Card imagined before any of it existed in the real world.

This year is definitely one of Heinlein’s “crazy years,” says Brian Doherty:

Robert Heinlein was one of the first science fiction writers to create a fictional structure that seemed to privilege prediction, with his “Future History” sequence, collected in the volume The Past Through Tomorrow.

Prediction was not Heinlein’s purpose—storytelling was. But his “Future History” chart started off with the “Crazy Years”: “Considerable technical advance during this period, accompanied by a gradual deterioration of mores, orientation, and social institutions, terminating in mass psychoses in the sixth decade, and the interregnum.” Heinlein made this prediction in 1941, so the “sixth decade” meant the 1950s.

Did he really predict the Trump era? Heinlein fans have seen in wild ideological excesses on both left and right a clear sign that we are, collectively, losing our minds. Instapundit’s Glenn Reynolds thinks we are certainly in Heinlein’s Crazy Years, noting that saying as much has become a cliché among Heinlein fans. As evidence he cites totemic but useless responses to policy crises, and a social networking age that allows for ever-tighter epistemic bubbles for information consumers and producers. In fact, the internet makes it stunningly easy for anyone to hold opinions about politics and policy far better informed by accurate facts and trends than in any previous era. That so many choose not to shows why predictions of “crazy years” can seem so eternally prescient: people can just be crazy (colloquially).

A lot of the “crazy” news these days that might prompt the never-witty declaration that it’s “not The Onion” comes from the unusual personal qualities of our president; some comes from excesses of the desire to control others’ thought and expression. But if “crazy” means dangerous, then recent trends in crime domestically and in wealth and health worldwide indicate we are mucking along well enough.

Indeed, as per the title of Heinlein’s anthology, the past is tomorrow and probably always will be. That times of technological advance will be followed by “gradual deterioration” (read: changes) in mores, orientation, and social institutions is the kind of golden prediction—of a dystopia we are eternally moving into (and always moving through)—with which it’s hard to lose.

Long before the 2016 Flyover Takeover, Walker Percy predicted a frayed nation would disassemble itself, writes Mike Riggs:

It’s the 1980s, and liberals have taken “In God We Trust” off the penny, while “knotheads”—conservatives—have mired the U.S. in a 15-year war with Ecuador. Liberals love “dirty movies from Sweden,” knotheads gravitate toward “clean” films, like The Sound of Music, Flubber, and Ice Capades of 1981. America’s big cities, meanwhile, are shells of themselves. “Wolves have been seen in downtown Cleveland, like Rome during the black plague.” Political polarization has even led to a change in international relations: “Some southern states have established diplomatic ties with Rhodesia. Minnesota and Oregon have their own consulates in Sweden.”

Our guide through the social hellscape of Love in the Ruins is Thomas More, a descendant of Sir Thomas More (author of 1516’s Utopia) and a lecherous Catholic psychiatrist with an albumin allergy who nevertheless chugs egg-white gin fizzes like water. A stand-in for Percy, More is a keen social taxonomist and a neutral party in the culture war. He notes that liberals tend to favor science and secularism; conservatives, business and God. But “though the two make much of their differences, I do not notice a great deal of difference between the two.” In the bustling Louisiana town of Paradise, wealthy knotheads and wealthy leftists live side by side, in nice houses, with new cars parked in their driveways, just as they currently do in Manhattan, Georgetown, and Palm Beach. One group may go to church on Sundays, the other bird watching, but they are more like each other than they are the “dropouts from, castoffs of, and rebels against our society” who live in the swamp on the edge of town.

Yet even the wealthy must bear the brunt of social friction. A local golf course magnate alternates between depression and indignation as the poor of Paradise challenge his decision to automate the jobs at his country club.

Love in the Ruins is the most radical timeline extending from the King assassination, Kent State, and the Tate Murders, three historical moments that helped undo the World War II–era fantasy—ever more childish in hindsight—of America as a cohesive unit. We were not one then, and are not now. Percy saw 2018 coming from a four-decade mile.

You are all wrong, says Jesse Walker:

Identity has never been as fluid, fungible, and multiple as it is today. That guy you’re arguing with on Twitter might actually be a crowd of people. That crowd of people you’re arguing with might actually be just one guy. Trolls try on a persona for an hour, then discard it for something new. Bots adopt a persona and stick with it, but without an actual mind in command. Your identity might be stolen altogether, leaving you to learn that an entity that looks like you has been spending money, sending messages, or otherwise borrowing your life. You might even wake one day to discover that someone has inserted your head onto someone else’s body, all so a stranger can live out a fantasy.

You can decide for yourself how much of that is a utopia and how much is a dystopia. All I know is that at some point we started living in Being John Malkovich.