Dystopia Isn’t Sci-Fi—for Me, It’s the American Reality

Cadwell Turnbull is a contributing author of The Dystopia Triptych. Photograph: Broad Reach Publishing

By Cadwell Turnbull

Source: Wired

Imagine a city where a group of people have managed against all odds to carve out prosperity for themselves, at least for a little while. These people used to be owned by other people. Now, they are permitted freedom, but only so much, subject to the whims of the once-masters.

Prosperity is a dangerous thing for the oppressed. It is a dry hot day in a forest bound to catch fire. And so, eventually, there is a spark. A teenage boy assaults a teenage girl of the once-master class in an elevator, or so the story is told. Truth doesn’t matter here. A story is enough. The once-masters want justice, which means all the once-slaves must be punished. Men, women, and children are dragged from their homes and shot, their stores and houses bombed or burned. The exact number of dead will remain uncertain, the story buried for so long that people will watch it in a television show almost a century later and mistake the dramatization of the event for pure fiction.

Imagine another city where the once-slaves are told they are getting treatment for a devastating illness, when they are in fact receiving a placebo. Imagine four decades of this lie, the originally infected passing on this disease to their spouses, their children, so that the once-masters can study the long-term effects of the disease on people they don’t consider fully human.

Imagine these cities are part of a great nation. The once-slaves are tired of their second-class citizenship so they begin a movement for justice and equity. This movement is met with a violent backlash. The once-slaves are attacked by dogs, blasted by hoses. Their churches are burned, their institutions subject to random acts of retaliation by the once-masters. Their activists are monitored. Their leaders are jailed or assassinated. There are victories, but even after the successes, once-slaves are shot down in the street for minor offenses or looking “suspicious.” Their neighborhoods are over-policed. Their children are denied quality education. Many of them are sent to prison, where they work for pennies or for nothing. But it isn’t called slavery. It is treated as coincidence that this forced labor disproportionately affects the oppressed class, the once-slaves.

These are the makings of dystopian fictions, and yet many in America don’t need to imagine them. It is their reality. However, most Americans would not call America a dystopia.

If the edges are filed off, the names of places and events changed, a few injustices amplified, Americans can pretend the sorts of things that happen in dystopias don’t happen in their backyards. They can call it fiction, create enough distance to make themselves comfortable with their country’s own sins. But this doesn’t change the fact that the American experience is dystopian for many marginalized people. And like in any dystopia, real or imagined, it is up to all Americans to recognize this storyline, imagine a better society outside of the current reality, and then work toward it. Otherwise, America consents to a normal that is grotesque.

I read my first dystopia in high school. As a teenager, 1984 scared the hell out of me. I didn’t read it as a warning, but as a mirror to my own experience. I identified with the protagonist Winston Smith’s feeling that something was deeply wrong with his society and the overwhelming sense of helplessness that followed. In college, I read my first utopia. The Dispossessed, by Ursula K. Le Guin, was in every sense an antidote to the despair I felt when reading 1984.

And then, many years later, I read “The Day Before the Revolution,” the prequel short story to The Dispossessed, and found in it the practical application of the novel’s revolutionary ideas. The story is beautifully quiet. It follows Odo, the founder of the radical movement at the heart of The Dispossessed, as she goes through her day and remembers important moments in her political and personal journey. Le Guin prefaced “The Day Before the Revolution” with a brief definition of the Odonian belief system: “Odonianism is anarchism … its principal and moral-practical theme is cooperation (solidarity, mutual aid). It is the most idealistic, and to me the most interesting, of all political theories.”

To be clear, the Odonians are not perfect. They are resistant to change and have allowed other forms of institutional privilege to develop and calcify in their society. But, because they believe in their utopia and have lived their lives in accordance with that belief, they’ve managed to build a reasonably just and equitable society.

And this is where, in life just as in science fiction, a distinction must be made. A just and equitable society is not the same as a perfect one. I’d argue that everyone would benefit if we defined utopia as a move toward justice and equity, and not just the state of perfection. But in America, especially in discussions about social justice, “just” and “perfect” are treated as synonymous objectives. And because perfect is never attainable, justice, too, becomes out of reach. Under this framing, injustice becomes normal, oppression is realistic, and any move towards justice and equity must come from struggle. A disturbing unspoken belief is born from this framing, that marginalized people will never receive full humanity because a just society is not possible. By failing to recognize the dystopia, and dismissing the possibility of a utopia, America has resigned itself to its current, dark narrative.

As a result, in America, universal social welfare is too costly and politically unfeasible, while trillion-dollar corporate bailouts and endless wars go unquestioned. Police and prison reform are aimed towards harm reduction for marginalized communities, instead of daring to imagine a society where these institutions are mostly unnecessary. In American discourse, a society can’t take care of all its citizens or remedy the causes of crime.

In a society where injustice is normalized, justice becomes a goal that can only be achieved through sacrifice—tragedy becomes currency, a thing to be used, not prevented. It takes decades of confirmed police brutality before America considers even the most minor reforms. This is not by accident. Black and brown bodies have been the fuel used to drive this society towards slightly lesser states of injustice since the very beginning. The oppressed have always paid the price for progress.

And yet, Americans have never shown this kind of defeatism when it comes to technological advancements. When this nation decided to go to the moon, it was framed in terms of “How do we get there?” not “Is this possible?” And no one ever said, “This rocket may only get halfway to the moon, but first many must die.”

Americans once oblivious to the dystopia are waking up. That’s good. But the price of waking up should be considered, and the lives sacrificed to incrementalism must be mourned. It is easy for a pragmatist to ask for incremental change when the current reality favors them. But pragmatism hits differently when it is forced at gunpoint. Every loss on the way to justice is a collective sin, because it was decided that the road must be long and the oppressed must struggle for every inch.

Do not normalize the losses happening right now because of the gains. Assume where America has always been is a tragedy. What is done in hell isn’t romantic; sacrificing bodies to dystopia isn’t beautiful. As I write this, people protesting brutality are dying at the hands of law enforcement. No one should pay for progress with their life. And it isn’t naive to believe every member of society should have a healthy, empowering, and fulfilling time on earth. The ones that have suffered deserve nothing less than faith in that possibility. This moment may provide a way out of dystopia, but there has to be a collective reckoning with the dystopian aspects of American society as well as the cruel price of progress repeatedly placed on the backs of the oppressed. Through solidarity there is a way out of these bitter realities, but the way there must be just if the destination is to be just.

In science fiction there is a notion that the universe is filled with possible worlds just waiting for humanity to come settle. It has some of its more troubling roots in manifest destiny, but also in hope, and the idea that better worlds are possible. But what if this corner of Earth could be that imagined place? Imagine a better world right here, instead of elsewhere. The price is in going all the way, doing all the work, believing all the work can be done. That’s the only way to get to the moon. Human beings have to believe it exists.

Why We Need Dystopian Fiction Now More Than Ever

By August Cole and P.W. Singer

Source: Slate

It hits you every so often.

When you tug on a face mask to go pick up food for your family.

When you witness the powerless suffer casual violence by a man with a sneer.

When you see riot police surround the Lincoln Memorial and protesters snatched off the streets by masked soldiers in unmarked cars.

And when you realize that it is all being watched by an unblinking eye of A.I. surveillance.

At times, it feels like we are living in a real-world version of dystopia. The strange outcome, though, is that it means we need dystopian fiction now more than ever, to help us make sense of it all and even make it through it.

You’d think with everything going on, now would be the last time to escape to a world of darkness. And yet books, including those of awful imagined worlds, are in high demand.
Some of it has been a return to old classics. In a period of disease and lockdowns lasting for weeks, booksellers report the seeming irony that Albert Camus’ The Plague and Gabriel García Márquez’s One Hundred Years of Solitude have seen renewed demand. And some of it has been escaping into new worlds, as with Divergent author Veronica Roth taking readers into another post-apocalypse with her new novel Chosen Ones. People have even been willing to enter imagined worlds that seem not too far away, such as Lawrence Wright’s best-selling pandemic thriller The End of October.

Yet the value of the genre is as much in education as entertainment. It can elucidate dangers, serving the role of warning and even preparation. Think of the recent resonance of Margaret Atwood’s 1985 The Handmaid’s Tale and its 2019 sequel The Testaments, or the revival of interest in Sinclair Lewis’ 1935 novel It Can’t Happen Here. These are finely written works that we read not as indulgences, but as a pure expression of the idea that to be forewarned is to be forearmed. Even Suzanne Collins’ Hunger Games prequel, The Ballad of Songbirds and Snakes, might be interpreted in that light, showing how authoritarian rule can originate through the manipulations of an ambitious striver.

Our personal corner of this dark market is the meld of imagination with research. For our book Burn-In: A Novel of the Real Robotic Revolution, we chose as our setting not a far-off imagined world like Panem or Gilead, but Washington, D.C., just around the corner. What happens as Silicon Valley’s visions of utopia hit our real, and very divided, country? What plays out in politics, business, and even family life as our economy is rewired by AI and automation? Yet to make our scenario more haunting, we back up everything that happens in it with 27 pages of endnotes.

When the scarier elements from an imagined world come to life in the real one, however, there is no gleeful “I told you so.” When the novel coronavirus accelerated the widespread rollout of the robots, remote work, job automation, and AI surveillance projected in our book, we certainly weren’t happy. All it meant was that the tough dilemmas our characters face would come quicker for all of us. What was perhaps most disturbing over the last few weeks, though, was that some of the most dystopian scenes we had painted of a future Washington, D.C., also came true, from our book’s scene of riot police deployed around the Lincoln Memorial to the militarized fence thrown up around the White House, placed exactly where we had it in Burn-In.

Yet what makes dystopian fiction different is that its creators are oddly optimists at heart, as we are. These works are not about prediction, but prevention. The stories warn of just how far things can go if action isn’t taken, wrapped in a package that is far more impactful than a white paper or PowerPoint. Indeed, research shows that narrative, the oldest communication technology of all, holds more sway over both the public and policymakers than even the “most canonical academic sources.” Our minds can’t help but connect to the “synthetic environment” that our fictional heroes and villains experience, living part of our lives through theirs, even if imagined.

Most importantly, though, the dark worlds are only the setting. The stories are really about the agency of the people in them. And that is perhaps the true value of dystopian fiction. These stories are not about what those characters experience so much as how they act. At the heart of every story of darkness is a story of perseverance.

As we face our own difficult journeys through the reality of 2020, it is perhaps that lesson which is most important of all.

America Has Always Been a Dystopia

Too many of us just haven’t been paying attention

By Brian Merchant

Source: OneZero

“Trump’s American dystopia has reached a new and ominous cliff,” warns a CNN opinion headline. “The last two and a half months in America have felt like the opening montage in a dystopian film about a nation come undone,” writes New York Times columnist Michelle Goldberg, in describing the images of militarized police storming U.S. cities to put down protests in the days following George Floyd’s murder, which came on the heels of two months of pandemic, panic, and widespread economic collapse. A very popular post published elsewhere on Medium was titled, bluntly, “America is a Dystopia.”

There is a lot of dystopia talk getting tossed around right now, for reasons that probably seem obvious. Those images we’ve all spent hours staring at on Twitter and cable TV — the military vehicles patrolling suburban streets, the lines of officers in tactical vests cordoned around the Lincoln Memorial, the scenes of tear gas blurring flames as masked protesters clash with armed police — match up reasonably well with the aesthetics and broad strokes of a genre that we’ve spent the last 10 years staring at on Netflix and the other channels on cable TV.

But this is not “Trump’s American dystopia.” It is the continued, if inflamed, dystopian state of play as it has lain for centuries. The montage of horrors did not begin only a few months ago or when a cohort of privileged observers suddenly became aghast at the SWAT howitzers and brutal policing tactics when they were seen on suburban streets.

Years of toothless and profitable pop culture dystopias have primed consumers to ignore race, helping to obscure the fact that the real dystopia arrived long ago.

If we wanted to get pithy about it, we might say that the 2010s were the dystopia decade, a period that saw both the rise of dystopia as a reliably profitable and uniform entertainment format in mass culture and what appeared to be the IRL manifestation of the images and tropes the genre broadcast by decade’s end. The Hunger Games rose to dominate box offices and spawned a follow-on flotilla of similarly shaped YA dystopian fare. Black Mirror mainstreamed a visual mode of bleak cynicism about technology, and critical darlings like Ex Machina, Her, and Mad Max: Fury Road made apocalypses brought about by artificial intelligence and climate change palatable for the intelligentsia. Meanwhile, Blade Runner, RoboCop, Starship Troopers, and Children of Men became frequent touchstones, partly because they are good films that offered prescient cultural and political commentary, and partly because their visuals provide handy fodder for comparative screen-grabbing on social media while we’re watching high-tech police forces brutalize popular uprisings, climate change-fueled wildfires spread across cityscapes, and A.I. take on alarming new dimensions, like being racist.

As a result, comparing America to a dystopia has become something of a national pastime: a recurring op-ed framework, a subgenre of Twitter commentary — especially during crisis points and moments of mass upheaval.

But what are we actually talking about when we talk about “dystopia”? Gesturing towards a vague constellation of injustices set to the color palette of a “gritty” summer blockbuster and declaring it dystopian won’t cut it — for dystopia to be useful as a cautionary tool for avoiding bad futures, we need to understand exactly which ingredients set a society on the road to ruin. As it stands, much of the modern dystopian discourse seems content to position dystopia as something that is bad, with an air of futurity. To quote Daniel Mallory Ortberg’s famous mocking of Black Mirror: “What if phones, but too much.” What if high-tech cops, what if sea level rise, etc.

“The adjective dystopian implies fearful futures where chaos and ruin prevail,” writes Gregory Claeys, a historian and professor at Royal Holloway, University of London, and author of Dystopia: A Natural History. Though in a historical and literary sense, he says, dystopia most commonly describes “a regime defined by extreme coercion, inequality, imprisonment, and slavery.”

Because its most popular touchstones are science fiction, modern dystopia discourse tends to fixate on profit- or warfare-accelerating technologies — digital surveillance, facial recognition, automation software, drones, technologized weapons — and their capacity to serve the wealthy and powerful in a time of ecological collapse, health crises, and/or widening inequality. Our current moment fits the bill. The coronavirus, mass unemployment, and police brutality against a racial justice uprising are unfolding to the backdrop of SpaceX rocket launches and tech billionaires like Amazon’s Jeff Bezos rapidly expanding their wealth.

When I noted on Twitter that the SpaceX launch was sending astronauts on a for-profit trip into space as a surge of protests swept the country, it struck a chord. Many responded by comparing the events to Elysium, the 2013 Neill Blomkamp film about a future where the poor toil and swelter on Earth while the wealthy live in luxury in a space station that orbits above the Terran rabble.

Others pointed to the great Gil Scott-Heron song, “Whitey on the Moon.” The musician and poet released it in 1970, one year after the NASA moon landing, which was itself one year after Dr. Martin Luther King Jr.’s assassination provoked a mass nationwide uprising, perhaps the last at a scale comparable to the one we’re seeing today.

Some of the lyrics:

A rat done bit my sister Nell.

(with Whitey on the moon)

Her face and arms began to swell.

(and Whitey’s on the moon)

I can’t pay no doctor bill.

(but Whitey’s on the moon)

Ten years from now I’ll be payin’ still.

(while Whitey’s on the moon)

That song was recorded a half-century ago, yet the plight remains the same. It was the same in 1993, when Octavia Butler, in her own magisterial dystopia, Parable of the Sower, set in a mostly Black community in Southern California in the apocalyptic 2020s, described the news of the death of a Mars explorer as eliciting the following reaction: “People here in the neighborhood are saying she had no business going to Mars, anyway. All that money wasted on another crazy space trip when so many people here on earth can’t afford water, food, or shelter.”

Billionaires can afford to send payloads into orbit, to explore space for science and for profit, but we cannot afford to provide health care to the poor or even basic racial equality. That’s what too many of us are missing when we talk about dystopia.

As comparatively radical as a dystopia like Elysium (or, say, Snowpiercer) is — in terms of summer blockbusters, anyway — its critique is limited to class. It glosses over race. It’s Matt Damon versus Sharlto Copley and Jodie Foster and the other white orbital techno-authoritarians. Take a scan through any of the most popular dystopian cinema products of the last decade or so, and you’ll find the same thing; matters of race are omitted almost entirely from the big screen eschatologies. Not only are the genre’s prime exports — Hunger Games, Divergent, Blade Runner, Elysium, RoboCop, the list goes on — written and directed by white people, but the protagonists, actors, and even antagonists are nearly uniformly white. And despite many of these being imagined, written, and made in a nation whose founding arrangement was the most dystopian system conceivable, race is never even a component of the conversation in mainstream dystopian cinema, much less what the uprisings are predicated upon. Even The Handmaid’s Tale, which exploded in the wake of Trump’s misogyny-lined ascendancy to the presidency, relegates any matter of racial politics deep into the background.

Angelica Jade Bastién points all this out in “Why Don’t Dystopias Know How to Talk About Race?”, where she explains how this in effect allows white viewers to cosplay as the oppressed, without actually interrogating in any meaningful way what oppression might actually entail or who gets oppressed and why.

“Race is relegated to inspiration, coloring the towering cityscapes of these worlds, while the white characters toil under the hardships that Brown and Black people experience acutely in real life,” Bastién writes. “In this way, dystopias become less fascinating thought experiments or vital warnings than escapades in which white people can take on the constraints of what it means to be the other.”

And in so doing, these popular dystopias appropriate the other’s struggles while conveniently ignoring the actual roots of said struggle. I do still think there’s utility in dystopias and trying to heed their warnings, but only if we recognize what’s being warned against, and only, especially, if we manage to understand that many of the looming “dystopias” perceived by more affluent entertainment consumers have been the realities of plenty of communities who have faced deep inequalities, technologized surveillance, and state oppression for generations already.

There’s a tweet that’s gone viral a number of times over the dystopian decade, each time in slightly different variation. Its most recent iteration came just this January, before the pandemic and the uprising came to dominate dystopia discourse:

https://twitter.com/ElleOnWords/status/1218693768339251200

White dystopia fanboys like me, pundits, columnists, and social media users need to get this through our skulls. To adapt a famous quote attributed to William Gibson: the dystopia has always been here; it just hasn’t been evenly distributed.

The “dystopia” lens too often fixes conditions like those — heavily policed communities, invasive surveillance, state oppression — in the future, and it glosses over the realities of the present and the long histories of oppression of Black communities and bodies, plenty of which was technologically abetted. The writer Anthony Walton noted in a 1999 Atlantic piece, “Technology Versus African Americans,” that from “the caravel to the cotton gin, technological innovation has made things worse for Blacks.” Western technologies, he writes, formed the infrastructure that gave rise to Black slavery:

Arab and African slave traders exchanged their human chattels for textiles, metals, and firearms, all products of Western technological wizardry, and those same slavers used guns, vastly superior to African weapons of the time, in wars of conquest against those tribes whose members they wished to capture… The slave wars and trade were only the first of many encounters with Western technology to prove disastrous for people of African descent. In the United States, as in South America and the Caribbean, the slaves were themselves the technology that allowed Europeans to master the wilderness.

What better fits Claeys’ description of dystopia — “a regime defined by extreme coercion, inequality, imprisonment, and slavery” — than actual chattel slavery? America was founded as a dystopia.

Yet for white and affluent consumers, the constant generation of novel and fantastic apocalyptic scenarios serves to extend the horizon for the arrival of the hellish conditions contained in dystopia — if oppression is a nebulous but ever-approaching threat, it’s perpetually obscured, lifted away into a sub-fictional ether. It need not be interrogated, not now, anyway. Which is how power prefers it.

That’s the other thing about dystopia: In many of its guises, it’s a plainly conservative enterprise. The most influential dystopia of the 21st century, I would argue, is not 1984, but Atlas Shrugged, which alone is responsible for a generation of greed-is-good Republican policymaking. The 600,000-word book, which I have (regrettably) read, positions a handful of great white men and women as the only thing keeping society together and inveighs against the millions of working-class “moochers” with a barely veiled racist subtext. (Its author was also openly racist.) Many dystopias are less flagrant but similarly conservative: They highlight the fear that we might all end up like the poor unwashed masses if we are not careful to uphold the social order, not the fear that the poor might never be liberated. And that, in fact, includes the ur-dystopia.

“Visions of the apocalypse are at least as old as 1000 B.C.,” according to the dystopian historian Claeys. The triumph of chaos over order defined the Egyptian “Prophecies of Neferti,” which foretold the complete breakdown of society: the “great no longer rule the land,” the “slaves will be exalted.” The first dystopia, in other words, was a cautionary tale for the haves against sliding into the world of the have-nots. It’s hard to shake that vibe from a lot of the Twitter commentariat, pointing at the protests from afar, going “man, it’s so dystopian” and moving on to whatever the central animating conflict is in their own personal heroic narratives.

There are still useful deployments of dystopian language — it can certainly be effective shorthand for “this is fucked in a new way, pay attention.” A good example is this series of viral tweets chronicling a day of peaceful protest where demonstrators were greeted with the creepy electrified visage of Gov. Andrew Cuomo on a towering billboard, beaming down the newly instated curfew. A couple of hours later, many protesters would be beaten.

And dystopias can still jolt the politically uninvolved to wake up — this podcaster even pointed to Elysium as an entry point into radical politics. But the surfeit of commentary that amounts to “wow, this is like Blade Runner send tweet” needs an upgrade. White viewers like me need to rethink and reevaluate what it means to watch and read popular dystopian fiction, how those products are shaping our perspectives and critiques of the future, and what they’re missing. And many more Black voices clearly need to be added to the mainstream canon and the broader discussion — there’s tons of great Black dystopian fiction: Dhalgren by Samuel Delany, Who Fears Death by Nnedi Okorafor, Zone One by Colson Whitehead, pretty much anything by Ishmael Reed. Who Fears Death is in development for a TV series, which is a start, but these voices need to be better foregrounded and made central to modern dystopia discourse.

A lack of diversity has been a problem in science fiction since the genre’s inception, and it persists. When I went to the Nebulas, a high-profile sci-fi awards conference last year, attendees were overwhelmingly white. The fact that Octavia Butler’s magisterial Parable of the Sower — a dystopia that actually and skillfully manages to interrogate climate change, total economic collapse, privatization, and racist oppression — is somehow not a film or a limited series yet is as scathing an indictment of Hollywood’s insistence on whitewashing dystopias as anything. The book absolutely rips.

This is not to disparage anyone who feels like they’re living in a certain kind of almost-future hell. The number of people who genuinely experience the world as an impending or current dystopia is almost certainly rising in tandem with trends of still-increasing inequality. A decade of jobless recovery ended in 2019 with the highest levels of income inequality in 50 years; record numbers of people of all backgrounds, even whites, are sliding into poverty and despair; and our encounters with climate change, technological surveillance, and conservatism’s hard drift toward authoritarianism are increasingly mediated through digital devices. Our current socioeconomic system is now ideally structured to be a dystopian protagonist generator. It is rewarding elites with unprecedented wealth and luxury, equipping the agents of the state with increasingly advanced weapons and technology, exacerbating ecological collapse, and positioning us all to experience the devastation alone, blinking into a screen, hoping for tiny units of validation from a pithy comment or two about the state of the morass on social media. It is us versus [gestures wildly] all of that, out there.

Which makes it all the more imperative that white fans, pundits, and observers stop ignoring what it has historically meant to experience actual dystopian conditions. It means acknowledging and working to improve the material conditions for those who are surviving the current iteration, and not glibly waving off dystopia as some always-approaching, faceless Empire without zeroing in on the nation’s institutional prejudices, its targets for violence, its specific hatreds. It means we have to stop LARPing in appropriated fictions. It means understanding that this has always been a dystopia — and that those who have always resisted it are at the center of the story.

Minority Report (2002) Esoteric Analysis

By Jay Dyer

Source: Jay’s Analysis

Spielberg’s Minority Report is now an important film to revisit. Based on the short story by visionary science fiction author Philip K. Dick, Spielberg’s film version implements a number of predictive programming elements not found in Dick. Both are worth a look, but the film is important for JaysAnalysis, since now, 13 years later, we are actually seeing the implementation of the total technocratic takeover, including pre-crime tracking systems.

Although the film and the short story present the precognition as a metaphysical mystery by telepathic individuals who can see into the aether, the real pre-crime systems are based on A.I. and the digitizing of all records under total information awareness.  And as I’ve said, this was DARPA’s plan for the Internet all along.

In fact, a good friend of mine worked for a few years digitizing mass medical records, and while most are aware of Google’s attempts to digitize all books, most do not know why. I’ve warned for several years now that the end goal of all this digitization is not “efficiency” and trendy techy cool iWatches to monitor heart rates and location. The ultimate goal is total mind control, loss of free will, and the complete rewrite of all past reality.

Consider, for example, the power the system will wield with the ability to “delete” all past versions of literature – religious texts, Shakespeare, 1984; nothing will be sacred or safe from “revision.” Remember that in 2009 Amazon erased Orwell’s 1984. Your own past may even be deleted, subject to revision, or altered to make you the next villain! All this is revealed in detail in Minority Report. Thus, while the public adopts “Kindles,” print itself is assigned the doom of the kindled fire – like Fahrenheit 451, as Richard Grove has said.

Minority Report’s setting is a dystopic 2054 D.C., where Agent John Anderton (Tom Cruise) is framed for two murders from within his own PreCrime Corporation ranks by its CEO, Lamar Burgess (Max von Sydow). (Note: The existing system appears to be a merger of the private and government sectors.)  I’m sure most readers have seen the film, so I’ll spare you detailed plot recaps and hit the highlights for the sake of our purposes.

The film’s PreCrime unit alerts a private corporation to a predetermined murder event ahead of time, giving the corporation’s Agents time to save the victims.  Hailed as infallible, PreCrime has made D.C. the safest city in the world, with no murders for several years.  But such a system requires a total surveillance society, something akin to complete panopticism.  In fact, the advertising in D.C. is user-specific, targeting pedestrians’ personal desires based on retina scans – and all travel requires retinal scanning and mass microchipping.

We are now on the verge of the implementation of retinal scanning, as the U.S. military has engaged in retinal scanning in occupied territories for several years now.  It is important to understand that the actions of the military abroad are often a testing ground for the implementation of such surveillance and tracking technology at “home.”  In October 2010, the Guardian reported on U.S. troops stationed in Afghanistan:

“With each iris and fingertip scanned, the device gave the operator a steadily rising percentage chance that the goat herder was on an electronic “watch list” of suspects. Although it never reached 100%, it was enough for the man to be taken to the nearest US outpost for interrogation.

Since the Guardian witnessed that incident, which occurred near the southern city of Kandahar earlier this year, US soldiers have been dramatically increasing the vast database of biometric information collected from Afghans living in the most war-torn parts of southern and eastern Afghanistan. The US army now has information on 800,000 people, while another database developed by the country’s interior ministry has records on 250,000 people.”

Wired reported that millions of records were the goal.  But the real goal is not millions – it is the entire globe, where any and all information is now currency for “big data.”  This is exactly the world Minority Report foresaw, and for those curious about Philip K. Dick, there are whispers that his foresight was due to being well-connected with the Silicon Valley elites.  This is how Ubik foresaw the “Internet of Things” I’ve written about many times, and probably in part why Dick went insane (or was targeted).  Slate writes of Ubik:

“Samsung, the world’s largest manufacturer of televisions, tells customers in its privacy policy that “personal or other sensitive” conversations “will be among the data captured and transmitted to a third party” through the TV’s voice-recognition software. Welcome to the Internet of Things.

Sci-fi great Philip K. Dick warned us about this decades ago. In his classic 1969 novel Ubik, the characters have to negotiate the way they move and how they communicate with inanimate objects that monitor them, lock them out, and force payments.”

Just as the predictive algorithm in Asimov’s Foundation was able to track mass movements, so now the same algorithmic tracking is in place across the “web of things” – everything capable of being recorded and tracked, which is most things.  The Pentagon has a virtual “you” in a real-time 3D interface that continuously updates its data from everything done on the web.  The Register reported in 2009 on this simulated warfare and predictive software:

“Defense analysts can understand the repercussions of their proposed recommendations for policy options or military actions by interacting with a virtual world environment,” write the researchers.

“They can propose a policy option and walk skeptical commanders through a virtual world where the commander can literally ‘see’ how things might play out. This process gives the commander a view of the most likely strengths and weaknesses of any particular course of action.”

It’s not a telepathic Samantha Morton in a tub of goo; it’s Google and DARPA developing highly advanced technology along the lines of what former NSA employee William Binney exposed.  Think here of WarGames (1983), where the A.I. was able to war-game future scenarios of global thermonuclear war – but thankfully Ferris Bueller was there to save us.  If this was displayed in pop culture in 1983, imagine how far that technology has come 30 years later.  Lest anyone think “pre-crime” is merely for security and weekend Xbox enjoyment, recall what I wrote two years back:

“Capitalism, communism, nationalism, 401ks, blah blah blah, all of these things are basically obsolete. Why?  Because of the nature of the real secret high-tech and plans for mega SmartCities that are to come.  You see, you think you are getting ahead and climbing the scum social ladder, and you aren’t even aware that the CEO of IBM, Ginni Rometty, gives lectures about SmartCities where everything you do will be rationed, tracked and traced by the central supercomputers, with pre-crime determining whether you are guilty of crimethink.  So everything you are trusting in is already obsolete.  You think I’m exaggerating?  On the contrary, you and your children’s futures are determined (you don’t have a future), and if you are allowed to live past the great culling, you will essentially be boxed into a giant WalmartTargetGameStopUniversity City that will literally be run by a supercomputer. Watch for yourself:”

And lest anyone think PreCrime is a thing of the future, consider that it has already been in use for two years in the U.S. and the U.K.  The New Scientist and 21stCenturyWire report:

“That’s the hope of police in the US, who have begun using advanced software to analyse crime data in conjunction with emails, text messages, chat files and CCTV recordings acquired by law enforcement. The system, developed by Wynyard, a firm based in Auckland, New Zealand, could even look at social media in real time in an attempt to predict where the gang might strike next.

“We’re trying to get to the source of the mastermind behind the criminal activity, that’s why we’re setting up a database so everybody can provide the necessary information and help us get higher up the chain,” says Craig Blanton of the Marion County Sheriff’s Office in Indiana. Because Felony Lane Gang members move from state to state to stay one step ahead, the centralised database is primed to aggregate historical information on the group and search for patterns in their movements, Blanton says.

“We know where they’ve been, where they are currently and where they may go in the future,” he says. “I think had we not taken on this challenge, we along with the other 110 impacted agencies would be doing our own thing without better knowledge of how this group operates.”

It’s not the only system that police forces have at their disposal. PredPol, which was developed by mathematician George Mohler at Santa Clara University in California, has been widely adopted in the US and the UK. The software analyses recorded crimes based on date, place and category of offence. It then generates daily suggestions for locations that should be patrolled by officers, depending on where it calculates criminal activity is most likely to occur.”

Returning to the film, an interesting tidbit occurs at least three times that I noticed.  Any time Anderton or his fellow Agents access the “Temple,” the holding site of the telepathic PreCogs, the sound made is distinctly the iPhone power-on sound.  The first iPod premiered in 2001, so I’m assuming Apple was already using the same sound then, but readers can correct me.  I find it curious if not, since the sound would likely have been chosen for a reason.  This puts the infamous 1984 Apple ad in a new, ominous light.

If you’ve seen Spike Jonze’s important film Her, you’ll see why.  In Her, lead character Theodore (Joaquin Phoenix) falls in love with his operating system.  The OS of his future is an intelligent software system capable of learning (like the A.I. in WarGames) that ultimately transcends its own limitations.

I bring this up because Minority Report is distinctly dominated by eye imagery.  While it may seem insignificant, it is my opinion that Siri and Apple in particular are crucial to the implementation of the coming new order.  Apple ads have long contained distinctly esoteric and significant cultural referents.  This is not to say Microsoft or the other tech giants are insignificant; on the contrary, I believe they are all arms of one entity, and the appearance of competition is largely illusory.

There is only one military-industrial complex, and DARPA, Google, Apple, and Microsoft are all its children.  The façade of competition is enough to spur the tech nerds who serve it to advance the technology, but in the end it all serves the same system.  My point here is that the iPhone is much more than an iPhone.  It is actually an EYEphone, functioning as the eye of Sauron himself – A.I. reconnaissance before the takeover.

I have mentioned before that there are whispers the iPhone of the next few years will contain a Siri that communicates with you like a personal assistant.  I have finally found an article on this here, which describes it directly in connection to Her, just as I predicted.  “Viv” will do the following:

“On the other hand, not only will Viv recognize disparate requests, she will also be able to put them together. Basically, Viv is Siri with the ability to learn. The project is being kept heavily under wraps, but the guys at Viv have hinted that they’re working towards creating a “global brain,” a shared source of artificial intelligence that’s as readily accessible as heat or electricity.  It’s unclear how soon a breakthrough of this magnitude can happen. But if this team made Siri, you can bet their next project is going to blow the tech world to pieces.”

In order to endear the public to that idea, a prototype Siri had to be offered first.  While this may be a rumor, it will eventually come.  And the dystopic scenario presented in Her will meet the nightmare of Minority Report.  For now it all seems harmless (though we are seeing a generation of youth destroyed by screens and pads – Steve Jobs didn’t let his own kids play with an iPad!), but the end goal, I assure you, is nefarious.

The dominant ideology of these tech giants is pure and total dysgenics (not eugenics).  In order for the total rewrite to come, the existing structure must be destroyed.  The “old way” of doing things will be scapegoated as the technocracy replaces it, offering utopia and salvation, but the synthetic rewrite is a Trojan horse.  Humanity will be enslaved in the same kind of virtual Matrix that enslaves Anderton in the film.

The film’s tagline, which pops up numerous times in the story, is about running: “Everybody runs.”  John spends most of the film on the run from the very system he once operated.  The film also asks, multiple times, “Can you see?” – and when we consider this on a deeper level in terms of predictive programming, I think we are intended to look beyond the immediate narrative.  There are also numerous hat tips to Blade Runner, where again the “running” imagery comes to the fore.  Can we run from the panopticon?  Do we have eyes to see the iEYES that are “infallibly” surveilling us perpetually?

Saturday Matinee: Rakka

From Oats Studios

Rakka is the story of broken humanity following the invasion of a technologically superior alien species. Bleak, harrowing, and unrelenting, the film follows the humans we meet as they try to find enough courage to go on fighting.

Directed by Neill Blomkamp and starring Sigourney Weaver.

Cyberpunk is Dead

By John Semley

Source: The Baffler

“It was an embarrasser; what did I want? I hadn’t thought that far ahead. Me, caught without a program!”
—Bruce Bethke, “Cyberpunk” (1983)

Held annually in a downtown L.A. convention center so massive and glassy that it served as a futurist backdrop for the 1993 sci-fi action film Demolition Man and as an intergalactic “Federal Transport Hub” in Paul Verhoeven’s 1997 space-fascism satire Starship Troopers, the Electronic Entertainment Expo, a.k.a. “E3,” is the trade show of the future. Sort of.

With “electronic entertainment” now surpassing both music and movies (and, indeed, the total earnings of music and movies combined), the future of entertainment, or at least of entertainment revenue, is the future of video games. Yet it’s a future that’s backward-looking, its gaze locked in the rearview even as the medium hurtles forward.

Highlights of E3’s 2019 installment included more details around a long-gestating remake of the popular PlayStation 1-era role-playing game Final Fantasy VII, a fifth entry in the demon-shooting franchise Doom, a mobile remake of jokey kids side-scroller Commander Keen, and playable adaptations of monster-budget movie franchises like Star Wars and The Avengers. But no title at E3 2019 garnered as much attention as Cyberpunk 2077, the unveiling of which was met with a level of slavish mania one might reserve for a stadium rock concert, or the ceremonial reveal of an efficacious new antibiotic.

An extended trailer premiere worked to whet appetites. Skyscrapers stretched upward, slashed horizontally with long windows of light and decked out with corporate branding for companies called “DATA INC.” and “softsys.” There were rotating wreaths of bright neon billboards advertising near-futuristic gizmos and gewgaws, and, at the street level, sketchy no-tell motels and cars of the flying, non-flying, and self-piloting variety. In a grimy, high-security bunker, a man with a buzzcut, his face embedded with microchips, traded blows with another, slightly larger man with a buzzcut, whose fists were robotically augmented like those of the cyborg Special Forces brawler Jax from Mortal Kombat. The trailer smashed to its title, and to wild applause from congregated gamers and industry types.

Then, to a chug-a-lug riff provided by Swedish straight-edge punkers Refused (recording under the nom de guerre SAMURAI) that sounded like the sonic equivalent of a can of Monster energy drink, an enormous freight-style door lifted, revealing, through a haze of pumped-out fog, a vaguely familiar silhouette: a tall, lean-muscular stalk, scraggly hair cut just above the shoulders. Over the PA system, in smoothly undulating, bass-heavy movie trailer tones, a canned voice announced: “Please welcome . . . Keanu Reeves.” Applause. Pitchy screams. Hysterics in the front row prostrating themselves in Wayne’s World “we’re not worthy!” fashion. “I gotta talk to ya about something!” Reeves roared through the din. Dutifully reading from a teleprompter, he plugged Cyberpunk 2077’s customizable characters and its “vast open world with a branching storyline,” set in “a metropolis of the future where body modification has become an obsession.”

More than just stumping for Cyberpunk 2077, Reeves lent his voice and likeness to the game as a non-playable character (NPC) named “Johnny Silverhand,” who is described in the accompanying press materials as a “legendary rockerboy.” A relative newbie to the world of blockbuster Xbox One games, Reeves told the audience at E3 that Cyberpunk piqued his interest because he’s “always drawn to fascinating stories.” The comment is a bit rich—OK, yes, this is a trade show pitch, but still—considering that such near-futuristic, bodily augmented, neon-bathed dystopias are hardly new ground for Reeves. His appearance in Cyberpunk 2077 serves more to lend the game some genre cred, given Reeves’s starring roles in canonical sci-fi films such as Johnny Mnemonic (1995) and the considerably more fantastic Matrix trilogy (1999-2003)—now a quadrilogy, with a fourth installment announced just recently. Like many of E3 2019’s other top-shelf titles, Cyberpunk 2077 looked forward by reflecting back, conjuring its tech-noir scenario from the nostalgic ephemera of cyberpunk futures past.

This was hardly lost among all the uproar and excitement. Author William Gibson, a doyen of sci-fi’s so-called “cyberpunk” subgenre, offered his own withering appraisal of Cyberpunk 2077, tweeting that the game was little more than a cloned Grand Theft Auto, “skinned-over with generic 80s retro-future” upholstery. “[B]ut hey,” Gibson added, a bit glibly, “that’s just me.” One would imagine that, at least in the burrows of cyberpunk fandom, Gibson’s criticism carries considerable weight.

After all, the author’s 1984 novel Neuromancer is a core text in cyberpunk literature. Gibson also wrote the screenplay for Johnny Mnemonic, adapted from one of his own short stories, which likewise developed the aesthetic and thematic template for the cyberpunk genre: future dystopias in which corporations rule, computer implants (often called “wetware”) permit access to expansive virtual spaces that unfold before the user like a walk-in World Wide Web, scrappy gangs of social misfits unite to hack the bad guys’ mainframes, and samurai swords proliferate, along with Yakuza heavies, neon signs advertising noodle bars in Kanji, and other fetish objects imported from Japanese pop culture. Gibson dissing Cyberpunk 2077 is a bit like Elvis Presley clawing out of his grave to disparage the likeness of an aspiring Elvis impersonator.

Gibson’s snark speaks to a deeper malaise that has beset cyberpunk. A formerly lively genre that once offered a clear, if goofy, vision of the future, its structures of control, and the oppositional forces undermining those authoritarian edifices, cyberpunk has now been clouded by a kind of self-mythologizing nostalgia. This problem was diagnosed as early as 1991 by novelist Lewis Shiner, himself an early cyberpunk-lit affiliate.

“What cyberpunk had going for it,” Shiner wrote in a New York Times op-ed titled “Confessions of an Ex-Cyberpunk,” “was the idea that technology did not have to be intimidating. Readers in their teens and 20’s responded powerfully to it. They were tired of hearing how their home computers were tempting them into crime, how a few hackers would undermine Western civilization. They wanted fiction that could speak to the sense of joy and power that computers gave them.”

That sense of joy had been replaced, in Shiner’s estimation, by “power fantasies” (think only of The Matrix, in which Reeves’s moonlighting hacker becomes a reality-bending god), which offer “the same dead-end thrills we get from video games and blockbuster movies” (enter, in due time, the video games and blockbuster movies). Where early cyberpunk offerings rooted through the scrap heap of genre, history, and futurist prognostication to cobble together a genre that felt vital and original, its modern iterations have recourse only to the canon of cyberpunk itself, smashing together tropes, clichés, and old-hat ideas that, echoing Gibson’s complaint, feel pathetically unoriginal.

As Refused (in their pre-computer game rock band iteration) put it on the intro to their 1998 record The Shape of Punk to Come: “They told me that the classics never go out of style, but . . . they do, they do.”

Blade Ran

The word was minted by author Bruce Bethke, whose short story about teenage hackers, “Cyberpunk,” was written in 1980 and published in 1983. But cyberpunk’s origins can be fruitfully traced back to 1968, when Philip K. Dick published Do Androids Dream of Electric Sheep?, a novel that updated the speculative fiction of Isaac Asimov’s Robot series for the psychedelic era. It’s ostensibly a tale about a bounty hunter named Rick Deckard chasing rogue androids in a post-apocalyptic San Francisco circa 1992. But like Dick’s better stories, it used its ready-made pulp sci-fi premise to flick at bigger questions about the nature of sentience and empathy, playing to a readership whose conceptions of consciousness were expanding.

Ridley Scott brought Dick’s story to the big screen with a loose 1982 film adaptation, Blade Runner, which cast Harrison Ford as Deckard and pushed its drizzly setting ahead to 2019. With its higher order questions about what it means to think, to feel, and to be free—and about who, or what, is entitled to such conditions—Blade Runner effectively set a cyberpunk template: the billboards, the neon, the high-collared jackets, the implants, the distinctly Japanese-influenced mise-en-scène extrapolated from Japan’s 1980s-era economic dominance. It is said that William Gibson saw Blade Runner in theaters while writing Neuromancer and suffered something of a crisis of confidence. “I was afraid to watch Blade Runner,” Gibson told The Paris Review in 2011. “I was right to be afraid, because even the first few minutes were better.” Yet Gibson deepened the framework established by Blade Runner with a crucial invention that would come to define cyberpunk as much as drizzle and dumpsters and sky-high billboards. He added another dimension—literally.

Henry Case, Gibson establishes early on, “lived for the bodiless exultation of cyberspace.” As delineated in Neuromancer, cyberspace is an immersive, virtual dimension. It’s a fully realized realm of data—“bright lattices of logic unfolding across that colorless void”—which hackers can “jack into” using strapped-on electrodes. That the matrix is “bodiless” is a key concept, both of Neuromancer and of cyberpunk generally. It casts the Gibsonian idea of cyberspace against another of the genre’s hallmarks: the high-tech body mods flogged by Keanu Reeves during the Cyberpunk 2077 E3 demo.

Early in Neuromancer, Gibson describes these sorts of robotic, cyborg-like implants and augmentations. A bartender called Ratz has a “prosthetic arm jerking monotonously” that is “cased in grubby pink plastic.” The same bartender has implanted teeth: “a webwork of East European steel and brown decay.” Gibson’s intense, earthy descriptions of these body modifications cue the reader into the fundamental appeal of Neuromancer’s matrix, in which the body itself becomes utterly immaterial. Authors from Neal Stephenson (Snow Crash) to Ernest Cline (Ready Player One, which is like a dorkier Snow Crash, if such a thing is conceivable) further developed this idea of what theorist Fredric Jameson called “a whole parallel universe of the nonmaterial.”

As envisioned in Stephenson’s Snow Crash, circa 1992, this parallel universe takes shape less as some complex architecture of unfathomable data, and more as an immersive, massively multiplayer online role-playing game (MMORPG). Stephenson’s “Metaverse”—a “moving illustration drawn by [a] computer according to specifications coming down the fiber-optic cable”—is not a supplement to our real, three-dimensional world of physical bodies, but a substitute for it. Visitors navigate the Metaverse using virtual avatars, which are infinitely customizable. As Snow Crash’s hero-protagonist, Hiro Protagonist (the book, it should be noted, is something of a satire), describes it: “Your avatar can look any way you want it to . . . If you’re ugly, you can make your avatar beautiful. If you’ve just gotten out of bed, your avatar can still be wearing beautiful clothes and professionally applied makeup. You can look like a gorilla or a dragon or a giant talking penis in the Metaverse.”

Beyond Meatspatial Reasoning

The Metaverse seems to predict the wide-open, utopian optimism of the internet: that “sense of joy and power” Lewis Shiner was talking about. It echoes early 1990s blather about the promise of a World Wide Web free from corporate or government interests, where users could communicate with others across the globe, forge new identities in chat rooms, and sample from a smorgasbord of lo-res pornographic images. Key to this promise was, to some extent, forming new identities and relationships by leaving one’s physical form behind (or jacked into a computer terminal in a storage locker somewhere).

Liberated from such bulky earthly trappings, we’d be free to pursue grander, more consequential adventures inside what Gibson, in Neuromancer, calls “the nonspace of the mind.” Elsewhere in cyberpunk-lit, bodies are seen as impediments to the purer experience of virtuality. After a character in Cory Doctorow’s Down and Out in the Magic Kingdom unplugs from a bracingly real simulation immersing him in the life of Abraham Lincoln, he curses the limitations of “the stupid, blind eyes; the thick, deaf ears.” Or, as Case puts it in Neuromancer, the body is little more than “meat.”

In Stephenson’s Metaverse, virtual bodies don’t even obey the tedious laws of physics that govern our non-virtual world. In order to manage the high amount of pedestrian traffic within the Metaverse and prevent users from bumping around endlessly, the complicated computer programming permits avatars simply to pass through one another. “When things get this jammed together,” Hiro explains, “the computer simplifies things by drawing all of the avatars ghostly and translucent so you can see where you’re going.” Bodies—or their virtual representations—waft through one another, as if existing in the realm of pure spirit. There is an almost Romantic bent here (Neuromancer = “new romancer”). If the imagination, to the Romantics, opened up a gateway to deep spiritual truth, here technology serves much the same purpose. Philip K. Dick may have copped something of the 1960s psychedelic era’s ethos of expanding the mind to explore the radiant depths of the individual soul, spirit, or whatever, but cyberpunk pushed that ethos outside, creating a shared mental non-space accessible by anyone with the means—a kind of Virtual Commons, or what Gibson calls a “consensual hallucination.”

Yet outside this hallucination, bodies still persist. And in cyberpunk, the physical configurations of these bodies tend to express their own utopian dimension. Bruce Bethke claimed that “cyberpunk” resulted from a deliberate effort to “invent a new term that grokked the juxtaposition of punk attitudes and high technology.” Subsequent cyberpunk did something a bit different, not juxtaposing but dovetailing those “punk attitudes” with high-tech. (“Low-life, high-tech” is a kind of a cyberpunk mantra.) Neuromancer’s central heist narrative gathers a cast of characters—hacker Henry Case, a cybernetically augmented “Razorgirl” named Molly Millions, a drug-addled thief, a Rastafari pilot—that can be described as “ragtag.” The major cyberpunk blockbusters configure their anti-authoritarian blocs along similar lines.

In Paul Verhoeven’s cyberpunk-y action satire Total Recall, a mighty construction worker-cum-intergalactic-spy (Arnold Schwarzenegger) joins a Martian resistance led by sex workers, physically deformed “mutants,” little people, and others whose physical identities mirror their economic alienation and opposition to a menacing corporate-colonial overlord named Cohaagen.

In Johnny Mnemonic, Keanu Reeves’s businesslike “mnemonic courier” (someone who ferries information using computer implants embedded in the brain) is joined by a vixenish bodyguard (Dina Meyer’s Jane, herself a version of Neuromancer’s Molly Millions), a burly doctor (Henry Rollins), and a group of street urchin-like “Lo-Teks” engaged in an ongoing insurgency against the mega-corporation Pharmakom. Both Mnemonic and Recall rely on cheap twists, in which a figure integral to the central intrigue turns out to be something ostensibly less- or other-than-human. Total Recall has Kuato, a half-formed clairvoyant mutant who appears as a tumorous growth wriggling in the abdomen of his brother. Even more ludicrously, Mnemonic’s climax reveals that the Lo-Teks’ leader is not the resourceful J-Bone (Ice-T), but rather Jones, a computer-augmented dolphin. In cyberpunk, the body’s status as “dead meat” to be transcended through computer hardware and neurological implantation offers a corollary sense of freedom.

The idea of the cybernetic body as a metaphor for the politicized human body was theorized in 1985, cyberpunk’s early days, by philosopher and biologist Donna Haraway. Dense and wildly eclectic, by turns exciting and exasperating, Haraway’s “Cyborg Manifesto” is situated as an ironic myth, designed to smash existing oppositions between science and nature, mind and body. Haraway was particularly interested in developing an imagistic alternative to the idea of the “Goddess,” so common to the feminism of the time. Where the Goddess was backward-looking in orientation, attempting to connect women to some prelapsarian, pre-patriarchal state of nature, the cyborg was a myth of the future, or at least of the present. “Cyborg imagery,” she writes, “can suggest a way out of the maze of dualisms in which we have explained our bodies and our tools to ourselves.” Haraway visualizes the cyborg, part machine and part flesh, as a being that threatens existing borders and assumes responsibility for building new ones.

Though they are not quite identical concepts, Haraway’s figure of the cyborg and the thematics of cyberpunk have much in common. A character like Gibson’s Molly Millions, for example, could be described as a cyborg, even if she is still essentially gendered as female (the gender binary was one of the many “dualisms” Haraway believed the cyborg could collapse). Cyborgs and cyberpunk are connected in their resistance to an old order, be it political and economic (as in Neuromancer, Johnny Mnemonic, etc.) or metaphysical (as in Haraway). The cyborg and the cyberpunk both dream of new futures, new social relationships, new bodies, and whole new categories of conceptions and ways of being.

The historical problem is that, for the most part, these new categories and these new relationships failed to materialize, as cyberpunk’s futures were usurped and commodified by the powers they had hoped to oppose.

Not Turning Japanese

In an introduction to the Penguin Galaxy hardcover reissue of Neuromancer, sci-fi-fantasy writer Neil Gaiman ponders precisely how the 1980s cyberpunk visions came to shape the future. “I wonder,” he writes, “to what extent William Gibson described a future, and how much he enabled it—how much the people who read and loved Neuromancer made the future crystallize around his vision.”

It’s a paradox that dogs most great sci-fi writers, whose powers for Kuato-style clairvoyance have always struck me as exaggerated. After all, it’s not as if, say, Gene Roddenberry literally saw into the future, observed voice-automated assistants of the Siri and Alexa variety, and then invented his starship’s speaking computers. It’s more that other people saw the Star Trek technology and went ahead and invented it. The same is true of Gibson’s matrix or Stephenson’s Metaverse, or the androids of Asimov and Dick. And the realization of many technologies envisioned by cyberpunk—including the whole concept of the internet, which now operates not as an escapist complement to reality, but an essential part of its fabric, like water or heat—has occurred not because of scrappy misfits and high-tech lowlifes tinkering in dingy basements, but because of gargantuan corporate entities. Or rather, the cyberpunks have become the corporate overlords, making the transition from the Lo-Teks to Pharmakom, from Kuato to Cohaagen. In the process, the genre and all its aspirations have been reduced to so much dead meat. This is what Shiner was reacting to when, in 1991, he renounced his cyberpunk affiliations, or when Bruce Bethke, who coined the term, began referring to “cyberpunk” as “the c-word.”

The commodification of the cool is a classic trick of capitalism, which has the frustrating ability to mutate faster than the forces that oppose it. Yet even this move toward commodification and corporatization is anticipated in much cyberpunk. “Power,” for Neuromancer’s Henry Case, “meant corporate power.” Gibson goes on: “Case had always taken it for granted that the real bosses, the kingpins in a given industry, would be both more and less than people.” For Case (and, it follows, Gibson, at least at the time of his writing), this power had “attained a kind of immortality” by evolving into an organism. Taking out one-or-another malicious CEO hardly matters when lines of substitutes are waiting in the wings to assume the role.

It’s here that cyberpunk critiques another kind of body. Not the ruddy human form that can be augmented and perfected by prosthetics and implants, but the economic body. Regarding the economy as a holistic organism—or a constituent part of one—is an idea that dates back at least as far as Adam Smith’s “invisible hand.” The rhetoric of contemporary economics is similarly biological. An edifying 2011 argument in Al Jazeera by Paul Rosenberg looked at the power of such symbolic conceptions of the economy. “The organic metaphor,” Rosenberg writes, “tells people to accept the economy as it is, to be passive, not to disturb it, to take a laissez faire attitude—leave it alone.”

This idea calls back to another of cyberpunk’s key aesthetic influences: the “body economic” of Japan in the 1980s. From the 2019 setting of 1982’s Blade Runner, to the conspicuous appearance of yakuza goons in Gibson’s stories, to Stephenson’s oddly anachronistic use of “Nipponese” in Snow Crash, cyberpunk’s speculative futures proceed from the economic ascendancy of 1980s Japan, and the attendant anxiety that Japan would eventually eclipse America as an economic powerhouse. This idea, that Japan somehow is (or was) the future, has persisted all the way up to Cyberpunk 2077’s aesthetic template, and its foregrounding of villains like the shadowy Arasaka Corporation. It suggests that, even as it unfolds nearly sixty years in our future, the blockbuster video game is still obsessed with a vision of the future past.

Indeed, it’s telling that as the robust Japanese economy receded in the 1990s, its burly body giving up the proverbial ghost, Japanese cinema became obsessed with avenging spirits channeled into the present by various technologies (a haunted video cassette in Hideo Nakata’s Ringu, the internet itself in Kiyoshi Kurosawa’s Kairo, etc.). But in the 1980s, Japan’s economic and technological dominance seemed like a foregone conclusion. In a 2001 Time article, Gibson called Japan cyberpunk’s “de facto spiritual home.” He goes on:

I remember my first glimpse of Shibuya, when one of the young Tokyo journalists who had taken me there, his face drenched with the light of a thousand media-suns—all that towering, animated crawl of commercial information—said, “You see? You see? It is Blade Runner town.” And it was. It so evidently was.

Gibson’s analysis features one glaring mistake. His insistence that “modern Japan simply was cyberpunk” is tethered to its actual history as an economic and technological powerhouse circa the 1980s, and not to its own science-fictional preoccupations. “It was not that there was a cyberpunk movement in Japan or a native literature akin to cyberpunk,” he writes. Except there so evidently was.

The Rusting World

Even beyond the limp, Orwellian connotations, 1984 was an auspicious year for science fiction. There was Neuromancer, yes. But 1984 also saw the first collected volume of Akira, a manga written and illustrated by Katsuhiro Otomo. Originally set, like Blade Runner, in 2019, Akira imagines a cyberpunk-y Neo-Tokyo, in which motorcycle-riding gangs do battle with oppressive government forces. Its 1988 anime adaptation was even more popular, in both Japan and the West. (The film’s trademark cherry red motorcycle has been repeatedly referenced in the grander cyberpunk canon, appearing in Steven Spielberg’s film adaptation of Ready Player One and, if pre-release hype is to be believed, in Cyberpunk 2077 itself.) In 2018, the British Film Institute hailed Akira, accurately, as “a vital cornerstone of the cyberpunk genre.”

Japan has plenty of other, non-Akira cyberpunk touchstones. As a cinematic subgenre, Japanese cyberpunk feels less connected to the “cyber” and more to the spirit of “punk,” whether in the showcasing of actual Japanese punk rock bands (as in 1982’s Burst City) or the films’ own commitment to a rough-hewn, low-budget, underground aesthetic. Chief among the latter category of films is Shinya Tsukamoto’s Tetsuo: The Iron Man, which was shot on 16mm over a grueling year-and-a-half, mostly in and around Tetsuo actress and cinematographer Kei Fujiwara’s apartment, which also housed most of the film’s cast and crew.

Compared with the Western cyberpunk classics, Tsukamoto’s vision of human-machine hybridization is decidedly more nightmarish. The film follows two characters, credited as the Salaryman (Tomorowo Taguchi) and the Guy (a.k.a. “The Metal Fetishist,” played by writer/director/producer/editor Tsukamoto himself), bound by horrifying mutations, which see their flesh and internal organs sprouting mechanical hardware.

In its own way, Tetsuo works as a cyberpunk-horror allegory for the Japanese economy. As the Salaryman and the Fetishist learn to accept the condition of their mechanization, they merge together, absorbing all the inorganic matter around them, growing enormously like a real-world computer virus or some terrifying industrial Katamari. Their mission resonates like a perverse inversion of Japan’s post-industrial promise. As Tsukamoto’s Fetishist puts it: “We can rust the whole world and scatter it into the dust of the universe.”

Like Haraway’s development of the cyborg as a metaphoric alternative to the New Age “goddess,” Tetsuo’s titular Iron Man can offer a similar corrective. If cyberpunk has become hopelessly obsessed with its own nostalgia, recycling all its 1980s bric-a-brac endlessly, then we need a new model. Far from the visions of Gibson, in which technology provides an outlet for a scrappy utopian impulse that jeopardizes larger corporate-political dystopias, Tetsuo is more pessimistic. It sees the body—both the individual physical body and the grander corpus of political economy—as being machine-like. Yet, as Rosenberg notes in his Al Jazeera analysis of economic rhetoric, it may be more useful to conceive of the economy not as a “body” or an organism but as a machine. The body metaphor is conservative, “with implications that tend toward passivity and acceptance of whatever ills there may be.” Machines, by contrast, can be fixed, greased, re-oriented. They are, unlike bodies, a thing separate from us, and so subject to our designs.

Cybernetic implants and cyborg technology are not some antidote to corporate hegemony. The human does not meld with technology to transcend the limitations of humanity. Rather, technology and machinery pose direct threats to precisely that condition. We cannot, in Tsukamoto’s film, hack our way to a better future, or technologically augment our way out of collective despair. Technology—and the mindless rush to reproduce it—are, to Tsukamoto, the very conditions of that despair. Even at thirty years old, Tetsuo offers a chilling vision not of the future, or of 1980s Japan, but of right now: a present where the liberating possibilities of technology have been turned inside-out; where hackers become CEOs whose platforms bespoil democracy; where automation offers not the promise of increased wealth and leisure time, but joblessness, desperation, and the wholesale redundancy of the human species; where the shared hallucination of the virtual feels less than consensual.

There’s nothing utopian about the model of cyberpunk developed in Tetsuo: The Iron Man. It is purely dystopian. But this defeatism offers clarity. And in denying the collaborative, collectivist, positive vision of a technological future in favor of a vision of identity-destroying, soul-obliterating horror, Tsukamoto’s stone-cold classic of Japanese cyberpunk invites us to imagine our own anti-authoritarian, anti-corporate arrangements. The enduring canon of American-style cyberpunk may have grown rusty. It has been caught, as Bethke put it in his genre-naming story, “without a program.” But the genre’s gnarlier, Japanese iterations have plenty to offer, embodying sci-fi’s dream of imagining a far-off future as a deep, salient critique of the present. It is only when we accept this cruel machinery of the present that we can freely contemplate how best to tinker with its future.

Left to peddle such a despairing vision in a packed-out L.A. convention center, even cyberpunk’s postmortem poster boy Keanu Reeves would be left with little to say but a resigned, bewildered, “Whoa . . .”

Saturday Matinee: John Dies at the End

“John Dies at the End” (2012) is a sci-fi/horror/comedy written and directed by Don Coscarelli and based on David Wong’s novel of the same name. Like Phantasm, it’s a cross-genre cult film involving an inter-dimensional invasion by body-snatching aliens, but this time with the added complication of hallucinogenic drugs and time travel. It features relatively unknown actors Chase Williamson and Rob Mayes as the two main protagonists, with strong supporting performances by Paul Giamatti (who was also executive producer), Clancy Brown, and Doug Jones.

Watch the full film on Hoopla here: https://www.hoopladigital.com/title/11488923

Saturday Matinee: Corporate Monster

By Rob Munday

Source: Short of the Week

From New Coke to the New World Order, conspiracy theories are rife throughout history, and that stranger you meet in the bar, or family member you get stuck with at a wedding, will rejoice in informing you how the moon landing was filmed in a basement or how The Beatles replaced Paul McCartney after he died. In the age of fake news, where world leaders are the ones fueling these theories, S/W favourite Ruairi Robinson (Blinky™) returns to the site with his own conspiracy story—Corporate Monster.

Inspired by the current state of the world, which Robinson rather aptly describes as “f*cked”, Corporate Monster is the tale of a man who obsessively starts to believe that parasitic creatures are controlling the world. Hidden in the shadows and disguised from the majority of the population, these Zoidbergian beings are pulling all the political strings behind the scenes. The idea of giant decapods hiding behind human faces is obviously far-fetched (though nothing seems impossible nowadays), but the theory that a wealthy elite holds sway over the globe feels much less fictional.

An obvious ode to Carpenter’s cult classic They Live, Corporate Monster was filmed in both Dublin and Detroit, and builds on its eighties inspiration with a retro-tinged production. Frequent Robinson collaborator Macgregor is back behind the lens, and the duo complement each other perfectly once again, deftly handling the quieter paranoid moments and the tense action sequences with equal aplomb. Produced with the support of Screen Ireland, this week’s online release comes on a sad note, however, as it marks the anniversary of the passing of Robinson’s writing partner Eoin Rogers, and the film is dedicated to his memory.