Saturday Matinee: Invasion of the Body Snatchers (1978)

“Invasion of the Body Snatchers” (1978) is a science fiction horror film directed by Philip Kaufman (The Wanderers), with a screenplay by W.D. Richter (director of Buckaroo Banzai) and starring Donald Sutherland, Brooke Adams, Veronica Cartwright, Jeff Goldblum and Leonard Nimoy. It is a remake of the 1956 film of the same name, itself based on the 1955 novel “The Body Snatchers” by Jack Finney. The plot involves a San Francisco health inspector and his colleague who discover that humans are being replaced by physically identical alien clones devoid of emotion.

Saturday Matinee: The Adventures of Buckaroo Banzai

No Matter Where You Go, Here It Is

By Peter Sobczynski

Source: RogerEbert.com

Practically from the moment that I first saw it at the age of 13 during its very brief run at the long-defunct Golf Mill Theaters (thanks for the ride, Mom) in the fall of 1984, “The Adventures of Buckaroo Banzai: Across the Eighth Dimension” has been my all-time favorite movie. And yet, it occurs to me that while I’ve been lucky enough to write at length about any number of favorites over the years, I’ve never had the occasion to do so for that particular film. Oh sure, I’ve proclaimed it as a favorite many times and have made reference to it every now and then—I even gave “Sharknado 4: The 4th Awakens” an extra half-star for nodding to it—but I haven’t had the opportunity to properly explain my love of the film. Happily, it is now making its long-awaited Blu-ray debut in a package from Shout! Factory that includes all the bells and whistles that members of its ever-widening cult could possibly ask for. Even more happily, it gives me a chance to sit down and once and for all explain why I love this film so much.

Of course, that is easier said than done because, as anyone who has seen it can attest, it’s not exactly the kind of movie that can be summed up in a sentence or two. Even the most basic, no-frills explanation of it will send many heads reeling, either out of excitement or confusion. Perhaps the best place to start is to look at its hero, the one and only Buckaroo Banzai himself. The Japanese-American son of a pair of brilliant scientists, he first studied medicine and became a renowned neurosurgeon. However, he chose to become a modern-day Renaissance man and soon branched out into particle physics, designing high-powered automobiles, occasionally saving the world with the aid of his band of Blue Blaze Irregulars and performing with his other band, the hard-rocking Hong Kong Cavaliers, a group made up of geniuses from various scientific endeavors. (All of this is summed up for viewers in an opening roll of text not dissimilar from the ones that kick off the “Star Wars” films.)

As the film proper opens, Buckaroo (Peter Weller), along with his men and mentor Dr. Hikita (Robert Ito), is ostensibly preparing to test a new Jet Car with the capacity to drive at the speed of sound. The real experiment, however, involves the Oscillation Overthruster, a secret device that they hope will let them drive through solid matter. This is not the first attempt to make a go of the Overthruster. In 1938, Dr. Hikita was working for the eminent physicist Dr. Emilio Lizardo (John Lithgow) when Lizardo tried to pass through a wall—the experiment was a botch that lodged him partway through, and landed him in the Trenton Home for the Criminally Insane. In 1955, an attempt by Hikita and Buckaroo’s parents was sabotaged by crime lord Hanoi Xan via a bombing that killed them both. (A flashback to this scene was cut from the original release but can be seen in the deleted scenes, where we discover that Buckaroo’s mother was played by Jamie Lee Curtis.) Buckaroo, however, succeeds: not only does he drive through a mountain with nary a scratch, he returns with some kind of alien organism attached to the Jet Car. Upon hearing this news, Lizardo breaks out of the asylum, claiming that he is going home. (Get ready because now things are about to get a little confusing.)

As it turns out, when Lizardo was trapped in the eighth dimension all those decades ago, he had his mind taken over by Lord John Whorfin, a fearsome Red Lectroid who was banished there alongside many of his followers after an unsuccessful attempt to take over their home world of Planet 10 from the more peaceable Black Lectroids. Before being locked up, he managed to bring many fellow Red Lectroids to Earth, where they have been living in plain sight and are now running a defense contracting company based in Grover’s Mills, New Jersey that is currently in charge of building a new bomber for the US Air Force. What they have really been doing with the government’s money is building a spaceship that will allow them to rescue their comrades still trapped in the 8th Dimension and return to take over Planet 10 once and for all. Now that Buckaroo has perfected the necessary Overthruster, all they need to do is steal it and they are home free.

After receiving a mysterious electric shock that allows him to see Lectroids as they really are and prevent the attempted theft of the Overthruster during a press conference, Buckaroo learns of the existence of Yoyodyne. But when the Hong Kong Cavaliers hack into the company’s computer database, they discover that every single employee has the first name of John, a bizarre surname and an application for a Social Security card dated November 1, 1938. Around this time, a Black Lectroid emissary arrives with a message from their leader stating that if Buckaroo is unable to stop Whorfin/Lizardo from using the Overthruster to return to the eighth dimension, they will protect themselves by faking a nuclear attack that will start World War III. With the fate of the world now in his hands, Buckaroo and his team set off to do battle with the Lectroids, recover the Overthruster, save humanity and, if time permits, explain exactly what that watermelon is doing there. Along for the ride are new Hong Kong Cavalier recruit Sidney Zweibel (Jeff Goldblum), a neurosurgeon and piano player who dresses up in full cowboy gear (including chaps) and calls himself New Jersey, and Penny Priddy (Ellen Barkin), a mysterious woman who meets Buckaroo after being accused of trying to shoot him during a concert (she was actually trying to kill herself but was accidentally bumped by a waitress at the key moment).

In other words, “The Adventures of Buckaroo Banzai” is your typical sci-fi/action/comedy/rock&roll/kung-fu/political satire/neo-western/guys-on-a-mission extravaganza. The film was the brainchild of writer Earl Mac Rauch, who had written a couple of novels and co-wrote the screenplay to the Martin Scorsese musical “New York, New York,” and W.D. Richter, who had already established himself as a writer of quirky screenplays through such films as the goofy action comedy “Slither” and the masterful 1978 remake of “Invasion of the Body Snatchers.” One day, Mac Rauch dreamed up the character that would eventually be called Buckaroo Banzai and Richter encouraged him to write a screenplay involving his adventures. Supposedly, Mac Rauch would start one, get about fifty pages into it and then abandon it to try again with a new story. Eventually, Richter and producer Neil Canton formed a company to make “Buckaroo Banzai” and got Mac Rauch to write a new treatment, using material from his previous attempts, that was then called “Lepers from Saturn.” Although it was rejected by many, it got noticed at MGM and studio chief David Begelman agreed to finance it. Unfortunately, the project was then delayed for nearly a year because of a writers’ strike, and Begelman left MGM after a number of his expensive projects died at the box office. However, Begelman formed his own production company, bought the script back from MGM and made a deal with 20th Century Fox to produce it.

This would prove to be good news and bad news for the project. On the one hand, it was Begelman’s enthusiasm that eventually got the film up and running. On the other hand, he apparently saw it as a straightforward action film in the mold of “Raiders of the Lost Ark” and either overlooked the weirdo humor in the screenplay or just assumed that Richter and Mac Rauch would dump all of that nonsense somewhere along the way in order to ensure that it would be a hit. Once it became evident that the weird stuff was not going by the wayside, Begelman began battling with Richter, Mac Rauch and Canton over the most inexplicable things in a misguided attempt to exert authority and make the film that he wanted. For example, Richter had hired the great cinematographer Jordan Cronenweth, whose credits included such titles as “Brewster McCloud,” “Altered States” and, perhaps his most famous work, “Blade Runner.” The story goes that Begelman agreed to the hiring as long as Cronenweth didn’t make the film look in any way like “Blade Runner,” but after several weeks of shooting, Begelman decided that it was indeed looking like “Blade Runner” and had Cronenweth replaced with Fred J. Koenekamp, who had shot such epics as “Patton” and “The Towering Inferno.” At another point, he threatened to shut down the production over a pair of red-rimmed glasses that Buckaroo wore in a couple of scenes on the theory that heroes don’t wear red glasses.

The struggles to make the film were equaled only by the struggles to get it released and find an audience. Perhaps realizing early on that trying to sell the film to a mainstream audience might not be a wise idea, Fox decided to promote it at sci-fi conventions in the months leading up to its release by stressing that it was a cult movie in the making. Unfortunately, this approach wound up backfiring, as the sci-fi audience was understandably wary of anything announcing itself as a cult movie before anyone had actually seen it—in their eyes, a cult movie is one that is discovered and nurtured by a loyal audience, not one that arrives in theaters proclaiming itself as such right from the get-go. “Buckaroo Banzai” was originally scheduled for a wide release on June 8, 1984 (which would have pitted it against the opening weekends of “Ghostbusters” and “Gremlins”) but was bumped at the last minute to a much-reduced opening in a few cities in mid-August that barely made any impact, though it did receive good reviews from the likes of Pauline Kael and Vincent Canby. Over the next couple of months, it opened in a few more cities before finally disappearing from theaters altogether.

And this is the point where the film and I finally crossed paths. Back in the prehistoric days before the Internet, a kid obsessed with the world of film would have to get himself down to the local bookstore or newsstand to purchase magazines that contained articles about upcoming releases. One such magazine was Starlog, which was dedicated to new and classic films in the sci-fi/fantasy genres and even though they were not necessarily favorites of mine, there were usually enough items of interest in each issue to make it worth the purchase. Now, 1984 provided a bumper crop of titles for genre fans—this was the year of “Indiana Jones and the Temple of Doom,” “Ghostbusters,” “Streets of Fire,” “Star Trek III: The Search for Spock,” “Gremlins,” “2010,” “The Last Starfighter,” “Dune” and the proverbial many more—and while not all of them may have lived up to the hype, they sure looked tantalizing at the time. As good as most of those looked, it was this “Buckaroo Banzai” thing that looked most intriguing to me. Even at the borderline precocious age of 13, I had already fancied myself as someone who knew more than a thing or two about movies but I had never seen or heard of anything like this before. Needless to say, that June release date couldn’t come quickly enough and even though the August delay was frustrating, my enthusiasm did not wane. However, it was devastating to discover that Chicago was not a part of that August release and that when it did finally open locally a month or so later, it was only in a couple of theaters with the nearest one located about 40 miles away. Thanks to a supremely indulgent mother, I made it to that theater during its opening weekend and sat down in what was, aside from myself, my mother and maybe five other people, an almost totally empty house to finally bear witness to the film that I had been obsessing over for months. 
I must admit that as the lights went down, the pessimist in me was thinking “What if this isn’t that good after all?”

Fat chance of that happening. Not only did “The Adventures of Buckaroo Banzai” live up to all of my insanely inflated expectations, it somehow managed to exceed them. I loved that it took any number of film genres and slammed them all together into one crazy-quilt narrative. I loved the idea of a hero who was valued more for his brains than for his ability to beat the bad guys into a pulp. I loved the funky New Wave aesthetic. I loved the decidedly offbeat humor, especially since one of my problems with science fiction has always been its tendency to take itself a little too seriously. I loved the idea that all of the spaceships on display looked more like seashells or rotting fruit than the gleaming craft that whizzed through space on “Star Trek.” I loved the jaw-dropping performance by John Lithgow as Emilio Lizardo, a hilariously audacious turn that saw him using “frothing mad” as a mere stepping-off point to a level of pure craziness that at times seems more like a possession than a performance. I loved the sight of Ellen Barkin in that slinky pink dress. (Hey, I was a 13-year-old boy.) I even loved the end credits sequence that found Buckaroo and the Hong Kong Cavaliers traipsing through an empty L.A. aqueduct to the tune of the film’s jaunty theme music while the titles breathlessly promised that they would return in “Buckaroo Banzai vs. the World Crime League,” despite sensing (correctly, as it turned out) that the lack of people in the theaters meant such a prospect was unlikely at best. Watching this film, it was almost as if someone was tapping directly into my mind’s idea of a great movie and projecting it before my eyes. (For those of you who are curious, venerable Mom wound up enjoying it as well, though the few other patrons seemed more than a little bewildered when the lights went up afterwards.)

However, unlike a lot of things that seemed cool back in the day and eventually look fairly silly with the wisdom of age, my admiration for “The Adventures of Buckaroo Banzai” has only grown over the years as I have been able to appreciate just how innovative and groundbreaking it was. For example, while genre mash-ups are a relatively common occurrence these days, they were practically nonexistent back then—the fear being that such things would be impossible to market to people who preferred an undiluted take on their preferred genre over one that mixed it up with several others—and it is amazing to see how well Richter and Mac Rauch juggle the various genre tropes in ways that clearly have fun with them without crossing the line into overtly making fun of them. Additionally, I love the way it dropped this bizarre and complicated mythology in the laps of viewers without any elongated explanations and assumed that they would have the intelligence to figure things out as it went along. Now this wasn’t a completely unheard-of approach—“Star Wars” began in much the same way—but it was taken to such a level here that it almost felt like you were watching chapter five of a serial where you had already missed chapters one through four. Admittedly, this approach may have alienated as many viewers as it enchanted—some reviewers complained that they felt as if they were watching someone else’s private in-joke that made no effort to let them in on the fun—but to these eyes, the notion of creating this oddly detailed world (with stuff practically bursting out of every jam-packed frame) and then immersing viewers in it was an audacious approach that paid off beautifully. If you ever wondered what might have resulted if Robert Altman had been given the reins of a large-scale science-fiction project (not counting “Quintet”), this film may be the closest that we ever come to answering that question.

Another aspect of the film that may have bewildered viewers but now seems startlingly prescient is how it depicts a world in which popular culture has extended its tendrils into all areas of life in unexpectedly goofy ways. No matter where one goes in the film, there is an odd cultural reference there to comment upon it. During the Jet Car experiment, we see a scientific gauge labeled “Sine” that is eventually followed by ones marked “Seeled” and “Delivered.” When it is announced that Dr. Lizardo has escaped from the asylum, he is mistaken by one person for Mr. Wizard. During Whorfin’s manic speech rallying his men as they prepare to leave for Planet 10, he sort-of quotes the Beach Boys cover “Sloop John B” by exhorting “I feel so broke-up, I want to go home!” Orson Welles (“The guy from the old wine commercials?”) is the basis of a couple of gags, one fleeting (when we get a glimpse of the President of the United States, played by Ronald Lacey, he is made up to look exactly like Charles Foster Kane) and one that inspires one of its funniest conceits. As it turns out, the infamous “War of the Worlds” broadcast was not fiction at all—it was Lectroids landing in New Jersey, not Martians, and they hypnotized Orson Welles into broadcasting that it was all made up. Even Buckaroo himself is often depicted as a pop culture hero just as much as he is a regular hero—we see his face plastered upon comic books and video games and he is, to be sure, the rare hero who tops off a day of derring-do by playing a sold-out concert with his band that finds him belting out an especially soulful cover of the classic “Since I Don’t Have You.”

And yet, the oddest and perhaps most arcane cultural reference is the connection the film shares with the works of legendary author Thomas Pynchon. Not only does it share a certain thematic similarity with the dense narratives and weirdo humor prevalent in Pynchon’s work, his cheerfully surreal novel The Crying of Lot 49 was largely centered around a shadowy aerospace manufacturer known as Yoyodyne Systems. In fact, one could argue that long before the arrival of “Inherent Vice,” “The Adventures of Buckaroo Banzai,” at least in a metaphorical sense, was the first real stab at bringing the Pynchon perspective from the page to the screen. (The plot thickened when Pynchon published his 1990 novel Vineland, which itself contained a couple of not-so-subtle references to “The Adventures of Buckaroo Banzai,” leading to speculation that either Richter or Mac Rauch actually was the reclusive novelist.)

As for the performances, all of the actors clearly found just the right way to connect with the admittedly quirky tone of the material. Some have slagged Peter Weller for being a little stiff at times, but they are missing the point—this is a character who is so cool and above the fray that he doesn’t have to be constantly calling attention to himself. More importantly, he serves the necessary duty of being the anchor that keeps the film from flying apart amidst all of the other oddball characters—it is a smart piece of underplaying from an actor who has never quite gotten his due despite starring in two all-time genre classics (the other, of course, being “RoboCop”). Jeff Goldblum had already proven himself to be a more-than-reliable supporting player by the time he did this film, and his ability to put a unique and often hilarious spin on even the most seemingly mundane bit of dialogue never shone brighter than it did here. (He also deserves credit not only for donning one of the goofiest Western outfits ever seen but for somehow making it work against all odds.) As Penny Priddy, the one female character of note (perhaps the one aspect of the film that does not date very well today), Ellen Barkin more than holds her own with the guys. Christopher Lloyd turns up as John Bigboote, who has been in charge of the goings-on at Yoyodyne over the past few decades and whose name inspires a great running joke involving Whorfin repeatedly mispronouncing it as “big booty,” and he is a blast throughout. However, the scene-stealer of the bunch—indeed, one of the scene-stealing performances of all time—is unquestionably John Lithgow as Emilio Lizardo.
Given the rare opportunity to play a character where going too far over the top is simply impossible, Lithgow pulls out all the stops with an astoundingly flamboyant turn in which everything from his accent (which genuinely sounds like an alien trying to approximate an Italian dialect) to his wardrobe (which finds him wearing two of everything) is cranked up to maximum effect. And yet, even though he is playing a character who is clearly out of control, the performance never is—Lithgow knows exactly when to go for laughs or menace and hits those beats perfectly every time. He also gets many of the film’s best lines, and I guarantee that after you see it, you too will be quoting (no doubt in your best approximation of the accent) such classic lines as, “Laugh while you can, monkey boy!” and “Sealed with a curse as sharp as a knife/Doomed is-a your soul and damned is your life!”

Although the film tanked in theaters, it eventually began to develop a genuine cult following once it hit cable and home video and brave viewers were given the chance to experience it for themselves. The promised sequel never emerged (due in part to a tangled situation involving the rights and the bankruptcy of the original production company), but “Buckaroo Banzai” has continued to live on in the pop culture firmament in odd and unexpected ways. A couple of installments of the “Dick Tracy” comic strip made arcane references to the film, and the finale of Wes Anderson’s “The Life Aquatic with Steve Zissou” paid homage to the end credits sequence (and, of course, included Jeff Goldblum in the mix). After a long delay, the film came out on DVD in 2001 in an edition that deepened the meta-textual joke, positing that Buckaroo Banzai was indeed a real person and that the film was actually a docudrama depicting real-life events. Strangely enough, in 1998, Fox attempted to develop a TV adaptation that was to be titled “Buckaroo Banzai: Ancient Secrets and New Mysteries” but never got off the ground save for a brief bit of test computer animation that can be found as an extra on the new Blu-ray. Even more strangely, it was announced earlier this year that another attempt to bring it to television was being made by none other than Kevin Smith and that Amazon Studios might be producing it.

Whether or not this particular endeavor pans out remains to be seen. However, until it happens, the new Blu-ray should more than tide over fans of the film. The two-disc package contains all the material from the original DVD release—the original commentary with Richter and Mac Rauch maintaining the illusion that what they are presenting is fact, deleted scenes, the alternate opening with Buckaroo’s parents, a short featurette and the trailer—along with a new commentary from sci-fi experts Michael and Denise Okuda. More importantly, there is “Into the 8th Dimension,” a full-length documentary that chronicles every possible aspect of the film, from its strange beginnings to its occasionally tortured production to its long and fruitful afterlife, and that is packed with fascinating tidbits of information. For example, we learn that when Richter first found the actor he wanted to play Buckaroo, Begelman refused to cast him on the belief that he would never become a movie star—so long, Tom Hanks.

Of course, the best feature of all is the film itself in all its crazy, one-of-a-kind glory. For decades, I have loved this movie beyond all others. Watching it again, I realized that love had not been misplaced in the slightest. Now, those of you who have never seen it before may not react in quite the same way as I did, but I can pretty much guarantee that you have never seen a film quite like it—maybe the one that came closest to approximating its wild mixup of genres and strange humor was “Big Trouble in Little China,” on which Richter served as a co-writer—and that if you are able to accept its offbeat nature, you are in for the cinematic ride of your life. And when it is all over and you begin to delve into the special features, you will even finally learn exactly what that watermelon was doing there.

Watch the full film on Hoopla.

Dystopia Isn’t Sci-Fi—for Me, It’s the American Reality

Cadwell Turnbull is a contributing author of The Dystopia Triptych. Photograph: Broad Reach Publishing

By Cadwell Turnbull

Source: Wired

Imagine a city where a group of people have managed against all odds to carve out prosperity for themselves, at least for a little while. These people used to be owned by other people. Now, they are permitted freedom, but only so much, subject to the whims of the once-masters.

Prosperity is a dangerous thing for the oppressed. It is a dry hot day in a forest bound to catch fire. And so, eventually, there is a spark. A teenage boy assaults a teenage girl of the once-master class in an elevator, or so the story is told. Truth doesn’t matter here. A story is enough. The once-masters want justice, which means all the once-slaves must be punished. Men, women, and children are dragged from their homes and shot, their stores and houses bombed or burned. The exact number of dead will remain uncertain, the story buried for so long that people will watch it in a television show almost a century later and mistake the dramatization of the event for pure fiction.

Imagine another city where the once-slaves are told they are getting treatment for a devastating illness, when they are in fact receiving a placebo. Imagine four decades of this lie, the originally infected passing on this disease to their spouses, their children, so that the once-masters can study the long-term effects of the disease on people they don’t consider fully human.

Imagine these cities are part of a great nation. The once-slaves are tired of their second-class citizenship so they begin a movement for justice and equity. This movement is met with a violent backlash. The once-slaves are attacked by dogs, blasted by hoses. Their churches are burned, their institutions subject to random acts of retaliation by the once-masters. Their activists are monitored. Their leaders are jailed or assassinated. There are victories, but even after the successes, once-slaves are shot down in the street for minor offenses or looking “suspicious.” Their neighborhoods are over-policed. Their children are denied quality education. Many of them are sent to prison, where they work for pennies or for nothing. But it isn’t called slavery. It is treated as coincidence that this forced labor disproportionately affects the oppressed class, the once-slaves.

These are the makings of dystopian fictions, and yet many in America don’t need to imagine them. It is their reality. However, most Americans would not call America a dystopia.

If the edges are filed off, the names of places and events changed, a few injustices amplified, Americans can pretend the sorts of things that happen in dystopias don’t happen in their backyards. They can call it fiction, create enough distance to make themselves comfortable with their country’s own sins. But this doesn’t change the fact that the American experience is dystopian for many marginalized people. And like in any dystopia, real or imagined, it is up to all Americans to recognize this storyline, imagine a better society outside of the current reality, and then work toward it. Otherwise, America consents to a normal that is grotesque.

I read my first dystopia in high school. As a teenager, 1984 terrified the hell out of me. I didn’t read it as a warning, but as a mirror to my own experience. I identified with the protagonist Winston Smith’s feeling that something was deeply wrong with his society and the overwhelming sense of helplessness that followed. In college, I read my first utopia. The Dispossessed, by Ursula K. Le Guin, in every sense, was an antidote to that despair I felt when reading 1984.

And then, many years later, I read “The Day Before the Revolution,” the prequel short story to The Dispossessed, and found in it the practical application of the novel’s revolutionary ideas. The story is beautifully quiet. It follows Odo, the founder of the radical movement at the heart of The Dispossessed, as she goes through her day and remembers important moments in her political and personal journey. Le Guin prefaced “The Day Before the Revolution” with a brief definition of the Odonian belief system: “Odonianism is anarchism … its principal and moral-practical theme is cooperation (solidarity, mutual aid). It is the most idealistic, and to me the most interesting, of all political theories.”

To be clear, the Odonians are not perfect. They are resistant to change and have allowed other forms of institutional privilege to develop and calcify in their society. But, because they believe in their utopia and have lived their lives in accordance with that belief, they’ve managed to build a reasonably just and equitable society.

And this is where, in life just as in science fiction, a distinction must be made. A just and equitable society is not the same as a perfect one. I’d argue that everyone would benefit if we defined utopia as a move toward justice and equity, and not just the state of perfection. But in America, especially in discussions about social justice, “just” and “perfect” are treated as synonymous objectives. And because perfect is never attainable, justice, too, becomes out of reach. Under this framing, injustice becomes normal, oppression is realistic, and any move towards justice and equity must come from struggle. A disturbing unspoken belief is born from this framing, that marginalized people will never receive full humanity because a just society is not possible. By failing to recognize the dystopia, and dismissing the possibility of a utopia, America has resigned itself to its current, dark narrative.

As a result, in America, universal social welfare is too costly and politically unfeasible, while trillion-dollar corporate bailouts and endless wars go unquestioned. Police and prison reform are aimed towards harm reduction for marginalized communities, instead of daring to imagine a society where these institutions are mostly unnecessary. In American discourse, a society can’t take care of all its citizens or remedy the causes of crime.

In a society where injustice is normalized, justice becomes a goal that can only be achieved through sacrifice—tragedy becomes currency, a thing to be used, not prevented. It takes decades of confirmed police brutality before America considers even the most minor reforms. This is not by accident. Black and brown bodies have been the fuel used to drive this society towards slightly lesser states of injustice since the very beginning. The oppressed have always paid the price for progress.

And yet, Americans have never shown this kind of defeatism when it comes to technological advancements. When this nation decided to go to the moon, it was framed in terms of “How do we get there?” not “Is this possible?” And no one ever said, “This rocket may only get halfway to the moon, but first many must die.”

Americans once oblivious to the dystopia are waking up. That’s good. But the price of waking up should be considered, and the lives sacrificed to incrementalism must be mourned. It is easy for a pragmatist to ask for incremental change when the current reality favors them. But pragmatism hits differently when it is forced at gunpoint. Every loss on the way to justice is a collective sin, because it was decided that the road must be long and the oppressed must struggle for every inch.

Do not normalize the losses happening right now because of the gains. Assume where America has always been is a tragedy. What is done in hell isn’t romantic; sacrificing bodies to dystopia isn’t beautiful. As I write this, people protesting brutality are dying at the hands of law enforcement. No one should pay for progress with their life. And it isn’t naive to believe every member of society should have a healthy, empowering, and fulfilling time on earth. The ones that have suffered deserve nothing less than faith in that possibility. This moment may provide a way out of dystopia, but there has to be a collective reckoning with the dystopian aspects of American society as well as the cruel price of progress repeatedly placed on the backs of the oppressed. Through solidarity there is a way out of these bitter realities, but the way there must be just if the destination is to be just.

In science fiction there is a notion that the universe is filled with possible worlds just waiting for humanity to come settle. It has some of its more troubling roots in manifest destiny, but also in hope, and the idea that better worlds are possible. But what if this corner of Earth could be that imagined place? Imagine a better world right here, instead of elsewhere. The price is in going all the way, doing all the work, believing all the work can be done. That’s the only way to get to the moon. Human beings have to believe it exists.

Why We Need Dystopian Fiction Now More Than Ever

By August Cole and P.W. Singer

Source: Slate

It hits you every so often.

When you tug on a face mask to go pick up food for your family.

When you witness the powerless suffer casual violence at the hands of a man with a sneer.

When you see riot police surround the Lincoln Memorial and protesters snatched off the streets by masked soldiers in unmarked cars.

And when you realize that it is all being watched by an unblinking eye of A.I. surveillance.

At times, it feels like we are living in a real-world version of dystopia. The strange outcome, though, is that it means we need dystopian fiction now more than ever, to help us make sense of it and even make it through it.

You’d think that with everything going on, now would be the last time to escape to a world of darkness. And yet books, including those set in awful imagined worlds, are in high demand.
Some of it has been a return to old classics. In a period of disease and lockdowns lasting for weeks, booksellers report the seeming irony that Albert Camus’ The Plague and Gabriel García Márquez’s One Hundred Years of Solitude have seen renewed demand. And some of it has been escaping into new worlds, as with Divergent author Veronica Roth taking readers into another post-apocalypse with her new novel Chosen Ones. People have even been willing to enter imagined worlds that seem not too far away, such as Lawrence Wright’s best-selling pandemic thriller The End of October.

Yet the value of the genre is as much in education as entertainment. It can elucidate dangers, serving the role of warning and even preparation. Think of the recent resonance of Margaret Atwood’s 1985 The Handmaid’s Tale and its 2019 sequel The Testaments, or the revival of interest in Sinclair Lewis’ 1935 It Can’t Happen Here. These finely written works endure not as indulgences, but as a pure expression of the idea that to be forewarned is to be forearmed. Even Suzanne Collins’ Hunger Games prequel, The Ballad of Songbirds and Snakes, might be interpreted in that light, showing how authoritarian rule can originate through the manipulations of an ambitious striver.

Our personal corner of this dark market is the meld of imagination with research. For our book Burn-In: A Novel of the Real Robotic Revolution, we chose as our setting not a far-off imagined world like Panem or Gilead, but Washington, D.C., just around the corner. What happens as Silicon Valley’s visions of utopia hit our real, and very divided, country? What plays out in politics, business, and even family life as our economy is rewired by AI and automation? Yet to make our scenario more haunting, we back up everything that happens in it with 27 pages of endnotes.

When the scarier elements from an imagined world come to life in the real one, however, there is no gleeful “I told you so.” When the novel coronavirus accelerated the wider rollout of the robots, remote work, job automation, and AI surveillance projected in our book, we certainly weren’t happy. All it meant was that the tough dilemmas our characters face would come quicker for all of us. Perhaps most disturbing, though, was that in the last few weeks some of the most dystopian scenes we had painted of a future Washington, D.C., also came true, from our book’s scene of riot police deployed around the Lincoln Memorial to the militarized fence thrown up around the White House, placed exactly where we had it in Burn-In.

Yet what makes dystopian fiction different is that its creators are, oddly, optimists at heart, as we are. These works are not about prediction, but prevention. The stories warn of just how far things can go if action isn’t taken, wrapped in a package that is far more impactful than a white paper or PowerPoint. Indeed, research shows that narrative, the oldest communication technology of all, holds more sway over both the public and policymakers than even the “most canonical academic sources.” Our minds can’t help but connect to the “synthetic environment” that our fictional heroes and villains experience, living part of our lives through theirs, even if imagined.

Most importantly, though, the dark worlds are only the setting. The stories are really about the agency of the people in them. And that is perhaps the true value of the dystopian fiction. These stories are not about what those characters experience so much as how they act. At the heart of every story of darkness is a story of perseverance.

As we face our own difficult journeys through the reality of 2020, it is perhaps that lesson which is most important of all.

America Has Always Been a Dystopia

Too many of us just haven’t been paying attention

By Bryan Merchant

Source: OneZero

“Trump’s American dystopia has reached a new and ominous cliff,” warns a CNN opinion headline. “The last two and a half months in America have felt like the opening montage in a dystopian film about a nation come undone,” writes New York Times columnist Michelle Goldberg, in describing the images of militarized police storming U.S. cities to put down protests in the days following George Floyd’s murder, which came on the heels of two months of pandemic, panic, and widespread economic collapse. A very popular post published elsewhere on Medium was titled, bluntly, “America is a Dystopia.”

There is a lot of dystopia talk getting tossed around right now, for reasons that probably seem obvious. Those images we’ve all spent hours staring at on Twitter and cable TV — the military vehicles patrolling suburban streets, the lines of tactical-vested officers cordoned around the Lincoln Memorial, the scenes of tear gas blurring flames as masked protesters clash with armed police — match up reasonably well with the aesthetics and broad strokes of a genre that we’ve spent the last 10 years staring at on Netflix and the other channels on cable TV.

But this is not “Trump’s American dystopia.” It is the continued, if inflamed, dystopian state of play as it has stood for centuries. The montage of horrors did not begin only a few months ago, or when a cohort of privileged observers suddenly became aghast at SWAT howitzers and brutal policing tactics seen on suburban streets.

Years of toothless and profitable pop culture dystopias have primed consumers to ignore race, helping to obscure the fact that the real dystopia arrived long ago.

If we wanted to get pithy about it, we might say that the 2010s were the dystopia decade, a period that saw both the rise of dystopia as a reliably profitable and uniform entertainment format in mass culture and what appeared to be the IRL manifestation of the images and tropes the genre broadcast by decade’s end. The Hunger Games rose to dominate box offices and spawned a follow-on flotilla of similarly shaped YA dystopian fare. Black Mirror mainstreamed a visual mode of bleak cynicism about technology, and critical darlings like Ex Machina, Her, and Mad Max: Fury Road made apocalypses brought about by artificial intelligence and climate change palatable for the intelligentsia. Meanwhile, Blade Runner, RoboCop, Starship Troopers, and Children of Men became frequent touchstones, partly because they are good films that offered prescient cultural and political commentary, and partly because their visuals provide handy fodder for comparative screen-grabbing on social media while we’re watching high-tech police forces brutalize popular uprisings, climate change-fueled wildfires spread across cityscapes, and A.I. take on alarming new dimensions, like being racist.

As a result, comparing America to a dystopia has become something of a national pastime: a recurring op-ed framework, a subgenre of Twitter commentary — especially during crisis points and moments of mass upheaval.

But what are we actually talking about when we talk about “dystopia”? Gesturing towards a vague constellation of injustices set to the color palette of a “gritty” summer blockbuster and declaring it dystopian won’t cut it — for dystopia to be useful as a cautionary tool for avoiding bad futures, we need to understand exactly which ingredients set a society on the road to ruin. As it stands, much of the modern dystopian discourse seems content to position dystopia as something that is bad, with an air of futurity. To quote Daniel Mallory Ortberg’s famous mocking of Black Mirror: “What if phones, but too much.” What if high-tech cops, what if sea level rise, etc.

“The adjective dystopian implies fearful futures where chaos and ruin prevail,” writes Gregory Claeys, a historian and professor at Royal Holloway, University of London, and author of Dystopia: A Natural History. Though in a historical and literary sense, he says, dystopia most commonly describes “a regime defined by extreme coercion, inequality, imprisonment, and slavery.”

Because its most popular touchstones are science fiction, modern dystopia discourse tends to fixate on profit- or warfare-accelerating technologies — digital surveillance, facial recognition, automation software, drones, technologized weapons — and their capacity to serve the wealthy and powerful in a time of ecological collapse, health crises, and/or widening inequality. Our current moment fits the bill. The coronavirus, mass unemployment, and police brutality against a racial justice uprising are unfolding to the backdrop of SpaceX rocket launches and tech billionaires like Amazon’s Jeff Bezos rapidly expanding their wealth.

When I noted on Twitter that the SpaceX launch was sending astronauts on a for-profit trip into space as a surge of protests swept the country, it struck a chord. Many responded by comparing the events to Elysium, the 2013 Neill Blomkamp film about a future where the poor toil and swelter on Earth while the wealthy live in luxury in a space station that orbits above the Terran rabble.

Others pointed to the great Gil Scott-Heron song, “Whitey on the Moon.” The musician and poet released it in 1970, one year after the NASA moon landing, which was itself one year after Dr. Martin Luther King Jr.’s assassination provoked a mass nationwide uprising, perhaps the last at a scale comparable to the one we’re seeing today.

Some of the lyrics:

A rat done bit my sister Nell.

(with Whitey on the moon)

Her face and arms began to swell.

(and Whitey’s on the moon)

I can’t pay no doctor bill.

(but Whitey’s on the moon)

Ten years from now I’ll be payin’ still.

(while Whitey’s on the moon)

That song was recorded a half-century ago, yet the plight remains the same. It was the same in 1993, when Octavia Butler, in her own magisterial dystopia, Parable of the Sower, set in a mostly Black community in Southern California in the apocalyptic 2020s, described the news of the death of a Mars explorer as eliciting the following reaction: “People here in the neighborhood are saying she had no business going to Mars, anyway. All that money wasted on another crazy space trip when so many people here on earth can’t afford water, food, or shelter.”

Billionaires can afford to send payloads into orbit, to explore space for science and for profit, but we cannot afford to provide health care to the poor or even basic racial equality. That’s what too many of us are missing when we talk about dystopia.

As comparatively radical as a dystopia like Elysium (or, say, Snowpiercer) is — in terms of summer blockbusters, anyway — its critique is limited to class. It glosses over race. It’s Matt Damon versus Sharlto Copley and Jodie Foster and the other white orbital techno-authoritarians. Take a scan through any of the most popular dystopian cinema products of the last decade or so, and you’ll find the same thing: matters of race are omitted almost entirely from the big-screen eschatologies. Not only are the genre’s prime exports — Hunger Games, Divergent, Blade Runner, Elysium, RoboCop, the list goes on — written and directed by white people; the protagonists, actors, and even antagonists are nearly uniformly white. And despite many of these being imagined, written, and made in a nation whose founding arrangement was the most dystopian system conceivable, race is never even a component of the conversation in mainstream dystopian cinema, much less what the uprisings are predicated upon. Even The Handmaid’s Tale, which exploded in the wake of Trump’s misogyny-lined ascendancy to the presidency, relegates any matter of racial politics deep into the background.

Angelica Jade Bastién points all this out in “Why Don’t Dystopias Know How to Talk About Race?”, where she explains how this in effect allows white viewers to cosplay as the oppressed, without actually interrogating in any meaningful way what oppression might actually entail or who gets oppressed and why.

“Race is relegated to inspiration, coloring the towering cityscapes of these worlds, while the white characters toil under the hardships that Brown and Black people experience acutely in real life,” Bastién writes. “In this way, dystopias become less fascinating thought experiments or vital warnings than escapades in which white people can take on the constraints of what it means to be the other.”

And in so doing, these popular dystopias appropriate the other’s struggles while conveniently ignoring the actual roots of said struggle. I do still think there’s utility in dystopias and trying to heed their warnings, but only if we recognize what’s being warned against, and only, especially, if we manage to understand that many of the looming “dystopias” perceived by more affluent entertainment consumers have been the realities of plenty of communities who have faced deep inequalities, technologized surveillance, and state oppression for generations already.

There’s a tweet that’s gone viral a number of times over the dystopian decade, each time in slightly different variation. Its most recent iteration came just this January, before the pandemic and the uprising came to dominate dystopia discourse:

https://twitter.com/ElleOnWords/status/1218693768339251200

White dystopia fanboys like me, pundits, columnists, and social media users need to get this through our skulls. To adapt a famous line attributed to William Gibson: the dystopia has always been here; it just hasn’t been evenly distributed.

The “dystopia” lens too often fixes conditions like those — heavily policed communities, invasive surveillance, state oppression — in the future, and it glosses over the realities of the present and the long histories of oppression of Black communities and bodies, plenty of which was technologically abetted. The writer Anthony Walton noted in a 1999 Atlantic piece, “Technology Versus African Americans,” that from “the caravel to the cotton gin, technological innovation has made things worse for Blacks.” Western technologies, he writes, formed the infrastructure that gave rise to Black slavery:

Arab and African slave traders exchanged their human chattels for textiles, metals, and firearms, all products of Western technological wizardry, and those same slavers used guns, vastly superior to African weapons of the time, in wars of conquest against those tribes whose members they wished to capture… The slave wars and trade were only the first of many encounters with Western technology to prove disastrous for people of African descent. In the United States, as in South America and the Caribbean, the slaves were themselves the technology that allowed Europeans to master the wilderness.

What better fits Claeys’ description of dystopia — “a regime defined by extreme coercion, inequality, imprisonment, and slavery” — than actual chattel slavery? America was founded as a dystopia.

Yet for white and affluent consumers, the constant generation of novel and fantastic apocalyptic scenarios serves to extend the horizon for the arrival of the hellish conditions contained in dystopia — if oppression is a nebulous but ever-approaching threat, it’s perpetually obscured, lifted away into a sub-fictional ether. It needs not be interrogated, not now, anyway. Which is how power prefers it.

That’s the other thing about dystopia: In many of its guises, it’s a plainly conservative enterprise. The most influential dystopia of the 21st century, I would argue, is not 1984, but Atlas Shrugged, which alone is responsible for a generation of greed-is-good Republican policymaking. The 600,000-word book, which I have (regrettably) read, positions a handful of great white men and women as the only thing keeping society together and inveighs against the millions of working-class “moochers” with a barely veiled racist subtext. (Its author was also openly racist.) Many dystopias are less flagrant but similarly conservative: They highlight the fear that we might all end up like the poor unwashed masses if we are not careful to uphold the social order, not the fear that the poor might never be liberated. And that, in fact, includes the ur-dystopia.

“Visions of the apocalypse are at least as old as 1000 B.C.,” according to the dystopian historian Claeys. The triumph of chaos over order defined the Egyptian “Prophecies of Neferti,” which foretold the complete breakdown of society, a world where the “great no longer rule the land” and the “slaves will be exalted.” The first dystopia, in other words, was a cautionary tale for the haves against sliding into the world of the have-nots. It’s hard to shake that vibe from a lot of the Twitter commentariat, pointing at the protests from afar, going “man, it’s so dystopian” and moving on to whatever the central animating conflict is in their own personal heroic narratives.

There are still useful deployments of dystopian language — it can certainly be effective shorthand for “this is fucked in a new way, pay attention.” A good example is this series of viral tweets that chronicle a day of peaceful protest where demonstrators were in turn greeted with the creepy electrified visage of Gov. Andrew Cuomo on a towering billboard, beaming down the newly instated curfew. A couple hours later, many protesters would be beaten.

And dystopias can still jolt the politically uninvolved to wake up — this podcaster even pointed to Elysium as an entry point into radical politics. But the surfeit of commentary that amounts to “wow, this is like Blade Runner send tweet” needs an upgrade. White viewers like me need to rethink and reevaluate what it means to watch and read popular dystopian fiction, how those products are shaping our perspectives and critiques of the future, and what they’re missing. And many more Black voices clearly need to be added to the mainstream canon and the broader discussion — there’s tons of great Black dystopian fiction: Dhalgren by Samuel Delany, Who Fears Death by Nnedi Okorafor, Zone One by Colson Whitehead, pretty much anything by Ishmael Reed. Who Fears Death is in development for a TV series, which is a start, but these voices need to be better foregrounded and made central to modern dystopia discourse.

A lack of diversity has been a problem in science fiction since the genre’s inception, and it persists. When I went to the Nebulas, a high-profile sci-fi awards conference last year, attendees were overwhelmingly white. The fact that Octavia Butler’s magisterial Parable of the Sower — a dystopia that actually and skillfully manages to interrogate climate change, total economic collapse, privatization, and racist oppression — is somehow not a film or a limited series yet is as scathing an indictment of Hollywood’s insistence on whitewashing dystopias as anything. The book absolutely rips.

This is not to disparage anyone who feels like they’re living in a certain kind of almost-future hell. The number of people who genuinely experience the world as an impending or current dystopia is almost certainly rising in tandem with trends of still-increasing inequality. A decade of jobless recovery ended in 2019 with the highest levels of income inequality in 50 years; record numbers of people of all backgrounds, even whites, are sliding into poverty and despair; and our encounters with climate change, technological surveillance, and conservatism’s hard drift toward authoritarianism are increasingly mediated through digital devices. Our current socioeconomic system is now ideally structured to be a dystopian protagonist generator. It is rewarding elites with unprecedented wealth and luxury, equipping the agents of the state with increasingly advanced weapons and technology, exacerbating ecological collapse, and positioning us all to experience the devastation alone, blinking into a screen, hoping for tiny units of validation from a pithy comment or two about the state of the morass on social media. It is us versus [gestures wildly] all of that, out there.

Which makes it all the more imperative that white fans, pundits, and observers stop ignoring what it has historically meant to experience actual dystopian conditions. It means acknowledging and working to improve the material conditions for those who are surviving the current iteration, and not glibly waving off dystopia as some always-approaching, faceless Empire without zeroing in on the nation’s institutional prejudices, its targets for violence, its specific hatreds. It means we have to stop LARPing in appropriated fictions. It means understanding that this has always been a dystopia — and that those who have always resisted it are at the center of the story.

Minority Report (2002) Esoteric Analysis

By Jay Dyer

Source: Jay’s Analysis

Spielberg’s Minority Report is now an important film to revisit.  Based on the short story by visionary science fiction author Philip K. Dick, Spielberg’s film version implements a number of predictive programming elements not found in Dick.  Both are worth a look, but the film is important for JaysAnalysis, since now, 13 years later, we are actually seeing the implementation of the total technocratic takeover, including pre-crime tracking systems.

Although the film and the short story present the precognition as a metaphysical mystery by telepathic individuals who can see into the aether, the real pre-crime systems are based on A.I. and the digitizing of all records under total information awareness.  And as I’ve said, this was DARPA’s plan for the Internet all along.

In fact, a good friend of mine worked for a few years digitizing mass medical records, and while most are aware of Google’s attempts to digitize all books, most do not know why.  I’ve warned for several years now that the end goal of all this digitization is not “efficiency” or trendy techy cool iWatches to monitor heart rates and location.  The ultimate goal is total mind control, loss of free will and the complete rewrite of all past reality.

Consider, for example, the power the system will wield with the ability to “delete” all past versions of literature – religious texts, Shakespeare, 1984, nothing will be sacred or safe from “revision.”  Remember that in 2009 Amazon erased Orwell’s 1984.  Your own past may even be deleted, subject to revision or altered to make you the next villain!  All this is revealed in detail in Minority Report.  Thus, while the public adopts “Kindles,” print itself is assigned the doom of the kindled fire – like Fahrenheit 451, as Richard Grove has said.

Minority Report’s setting is a 2054 dystopic D.C., where Agent John Anderton (Tom Cruise) is framed for two murders from within his own PreCrime Corporation ranks by the CEO, Lamar Burgess (Max von Sydow). (Note: The existing system appears to be a merger of the private and government sectors.)  I’m sure most readers have seen the film, so I’ll spare you detailed plot recaps and hit the highlights for our purposes.

In the film, PreCrime alerts a private corporation to a predicted murder ahead of time, giving the corporation’s Agents time to save the victims.  Hailed as a perfect system, the infallibility of PreCrime has made D.C. the safest city in the world, with no murders for several years.  At the same time, PreCrime requires a total surveillance society, something akin to complete panopticism.  In fact, the advertising in D.C. is user-specific, targeting pedestrians’ personal desires based on retina scans – and all travel requires retinal scanning and mass microchipping.

We are now on the verge of the mass implementation of retinal scanning, as the U.S. military has used it in occupied territories for several years now.  It is important to understand that the actions of the military abroad are often a test ground for the implementation of such surveillance and tracking technology at “home.”  In October 2010, the Guardian reported on U.S. troops stationed in Afghanistan:

“With each iris and fingertip scanned, the device gave the operator a steadily rising percentage chance that the goat herder was on an electronic “watch list” of suspects. Although it never reached 100%, it was enough for the man to be taken to the nearest US outpost for interrogation.

Since the Guardian witnessed that incident, which occurred near the southern city of Kandahar earlier this year, US soldiers have been dramatically increasing the vast database of biometric information collected from Afghans living in the most war-torn parts of southern and eastern Afghanistan. The US army now has information on 800,000 people, while another database developed by the country’s interior ministry has records on 250,000 people.”

Wired Magazine reported millions were the goal.  The goal is not millions, but the entire globe, where any and all information is now currency for “big data.”  This is exactly the world Minority Report foresaw, and for those curious about Philip K. Dick, the whisper is that his foresight was due to his connections with the Silicon Valley elites.  This is how Ubik foresaw the “Internet of Things” I’ve written about many times, and probably in part why Dick went insane (or was targeted).  Slate writes of Ubik:

“Samsung, the world’s largest manufacturer of televisions, tells customers in its privacy policy that “personal or other sensitive” conversations “will be among the data captured and transmitted to a third party” through the TV’s voice-recognition software. Welcome to the Internet of Things.

Sci-fi great Philip K. Dick warned us about this decades ago. In his classic 1969 novel Ubik, the characters have to negotiate the way they move and how they communicate with inanimate objects that monitor them, lock them out, and force payments.”

Just as the predictive algorithm in Asimov’s Foundation was able to track mass movements, so now the same algorithmic tracking is in place across the “web of things” that are capable of being recorded and tracked – and that’s most things.  The Pentagon has a virtual “you” in a realtime 3D interface that constantly updates its data from everything done on the web.  The Register reported in 2009 on this simulated warfare and predictive software:

“Defense analysts can understand the repercussions of their proposed recommendations for policy options or military actions by interacting with a virtual world environment,” write the researchers.

“They can propose a policy option and walk skeptical commanders through a virtual world where the commander can literally ‘see’ how things might play out. This process gives the commander a view of the most likely strengths and weaknesses of any particular course of action.”

It’s not a telepathic Samantha Morton in a tub of goo; it’s Google and DARPA developing highly advanced technology along the lines of what former NSA employee William Binney exposed.  Think here of War Games (1983), where the A.I. was able to war-game future scenarios of global thermonuclear war, but thankfully Ferris Bueller was there to save us.  If this was on display in 1983 in pop culture, imagine how far that technology has come 30 years later.  Lest anyone think “precrime” is merely for security and weekend Xbox enjoyment, recall what I wrote two years back:

“Capitalism, communism, nationalism, 401ks, blah blah blah, all of these things are basically obsolete. Why?  Because of the nature of the real secret high-tech and plans for mega SmartCities that are to come.  You see, you think you are getting ahead and climbing the scum social ladder, and you aren’t even aware that the CEO of IBM Ginni Rometty gives lectures about SmartCities where everything you do will be rationed, tracked and traced by the central supercomputers, with pre-crime determining whether you are guilty of crimethink.  So everything you are trusting in is already obsolete.  You think I’m exaggerating?  On the contrary, you and your children’s futures are determined (you don’t have a future), and if you are allowed to live past the great culling, you will essentially be boxed into a giant WalmartTargetGameStopUniversity City that will literally be run by a supercomputer. Watch for yourself:”

And lest anyone think PreCrime is a thing of the future, consider that it has been in use for two years in the U.S. and the U.K.  The New Scientist and 21stCenturyWire report:

“That’s the hope of police in the US, who have begun using advanced software to analyse crime data in conjunction with emails, text messages, chat files and CCTV recordings acquired by law enforcement. The system, developed by Wynyard, a firm based in Auckland, New Zealand, could even look at social media in real time in an attempt to predict where the gang might strike next.

“We’re trying to get to the source of the mastermind behind the criminal activity, that’s why we’re setting up a database so everybody can provide the necessary information and help us get higher up the chain,” says Craig Blanton of the Marion County Sheriff’s Office in Indiana. Because Felony Lane Gang members move from state to state to stay one step ahead, the centralised database is primed to aggregate historical information on the group and search for patterns in their movements, Blanton says.

“We know where they’ve been, where they are currently and where they may go in the future,” he says. “I think had we not taken on this challenge, we along with the other 110 impacted agencies would be doing our own thing without better knowledge of how this group operates.”

It’s not the only system that police forces have at their disposal. PredPol, which was developed by mathematician George Mohler at Santa Clara University in California, has been widely adopted in the US and the UK. The software analyses recorded crimes based on date, place and category of offence. It then generates daily suggestions for locations that should be patrolled by officers, depending on where it calculates criminal activity is most likely to occur.”
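The quoted description of PredPol (daily patrol suggestions derived from the date, place, and category of past offences) can be sketched in miniature. What follows is a toy illustration only, not PredPol’s actual model, which is reported to be a self-exciting point-process model fit to crime histories; the function name, grid-cell labels, and half-life parameter here are all illustrative assumptions:

```python
from collections import defaultdict

def suggest_patrols(incidents, today, half_life_days=14.0, top_k=3):
    """Toy hotspot scorer (illustrative, not PredPol's real algorithm).

    incidents: list of (cell, day) pairs, where cell is a grid-cell label
    and day is an integer day index. Each incident contributes a weight
    that decays exponentially with age, so recent crimes count more.
    Returns the top_k cells ranked by recency-weighted incident score.
    """
    decay = 0.5 ** (1.0 / half_life_days)  # per-day decay factor
    scores = defaultdict(float)
    for cell, day in incidents:
        age = today - day
        if age >= 0:                       # ignore incidents "from the future"
            scores[cell] += decay ** age
    ranked = sorted(scores.items(), key=lambda kv: -kv[1])
    return [cell for cell, _ in ranked[:top_k]]

# Hypothetical incident log: cell C3 has both a recent cluster and an
# old incident, A1 has two very recent hits, B2 only an old one.
incidents = [("A1", 29), ("A1", 28), ("B2", 5),
             ("C3", 27), ("C3", 26), ("C3", 2)]
print(suggest_patrols(incidents, today=30))  # prints ['C3', 'A1', 'B2']
```

The design choice worth noting is the recency decay: without it the system would keep sending patrols to historically troubled areas forever, which is exactly the feedback-loop criticism commonly leveled at real predictive-policing deployments.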

Returning to the film, there is an interesting tidbit I noticed that occurs about three times. Any time Anderton or his fellow Agents access the “Temple,” the holding site of the telepathic PreCogs, the sound made is distinctly the iPhone power-on sound. The first iPod premiered in 2001, a year before the film’s release, so I’m assuming it’s the same start-up sound, but readers can correct me. If it isn’t, I find that curious, since the sound would likely have been chosen for a reason. Either way, this puts the infamous 1984 Apple ad in a new, ominous light.

If you’ve seen Spike Jonze’s important film Her, you’ll see why. In Her, lead character Theodore (Joaquin Phoenix) falls in love with an iOS – his operating system. The OS of his future is an intelligent software system with the capability for learning (like the A.I. in WarGames) that ultimately transcends its own limitations.

I bring this up because Minority Report is distinctly dominated by eye imagery. While it may seem insignificant, it is my opinion that Siri and Apple in particular are crucial to the implementation of the coming new order. Apple ads have long contained distinctly esoteric and significant cultural referents. This is not to say Microsoft or any of the other tech giants are insignificant; on the contrary, I believe they are all arms of one entity, and the appearance of competition is largely illusory.

There is only one military-industrial complex, and DARPA and Google and Apple and Microsoft are all its children. The façade of competition is enough to advance the technology through the tech nerds who serve it, but in the end, it all serves the same system. My point here is that the iPhone is much more than an iPhone. It is actually an EYEphone, functioning as the eye of Sauron himself, as A.I. reconnaissance before the takeover.

I have mentioned before the whispers that the iPhone of the next few years will contain a Siri that communicates with you like a personal assistant. I have finally found an article on this here, which describes it directly in connection with Her, as I said here. “Viv” will do the following:

“On the other hand, not only will Viv recognize disparate requests, she will also be able to put them together. Basically, Viv is Siri with the ability to learn. The project is being kept heavily under wraps, but the guys at Viv have hinted that they’re working towards creating a “global brain,” a shared source of artificial intelligence that’s as readily accessible as heat or electricity.  It’s unclear how soon a breakthrough of this magnitude can happen. But if this team made Siri, you can bet their next project is going to blow the tech world to pieces.”

In order to endear the public to that idea, a prototype Siri had to be offered. While this may be a rumor, it will eventually come. And the dystopic scenario presented in Her will meet the nightmare of Minority Report. For now, it all seems harmless (though we are seeing a generation of youth destroyed by screens and pads – Steve Jobs didn’t let his own kids play with an iPad!), but the end goal, I assure you, is nefarious.

The dominant ideology of these tech giants is pure and total dysgenics (not eugenics). In order for the total rewrite to come, the existing structure must be destroyed. The “old way” of doing things will be scapegoated as the technocracy replaces it, offering utopia and salvation, but the synthetic rewrite is a Trojan horse. Humanity will be enslaved in the same kind of virtual Matrix that ensnares Anderton in the film.

The film’s tagline, which pops up numerous times in the story, is about running: “Everybody runs,” and John spends most of the film on the run from the very system he once operated. The film also asks, multiple times, “Can you see?” and when we think of this on a deeper level in terms of predictive programming, I think we are intended to look beyond the immediate narrative. There are also numerous hat tips to Blade Runner, where again the “running” imagery comes to the fore. Can we run from the panopticon? Do we have eyes to see the iEYES that are “infallibly” surveilling us perpetually?

Saturday Matinee: Rakka

From Oats Studios

Rakka is the story of broken humanity following the invasion of a technologically superior alien species. Bleak, harrowing and unrelenting, it follows humans who must find the courage to go on fighting.

Directed by Neill Blomkamp and starring Sigourney Weaver.

Cyberpunk is Dead

By John Semley

Source: The Baffler

“It was an embarrasser; what did I want? I hadn’t thought that far ahead. Me, caught without a program!”
—Bruce Bethke, “Cyberpunk” (1983)

Held annually in a downtown L.A. convention center so massive and glassy that it served as a futurist backdrop for the 1993 sci-fi action film Demolition Man and as an intergalactic “Federal Transport Hub” in Paul Verhoeven’s 1997 space-fascism satire Starship Troopers, the Electronic Entertainment Expo, a.k.a. “E3,” is the trade show of the future. Sort of.

With “electronic entertainment” now surpassing both music and movies (and, indeed, the total earnings of music and movies combined), the future of entertainment, or at least entertainment revenue, is the future of video games. Yet it’s a future that’s backward-looking, its gaze locked in the rearview even as the medium propels itself forward.

Highlights of E3’s 2019 installment included more details around a long-gestating remake of the popular PlayStation 1-era role-playing game Final Fantasy VII, a fifth entry in the demon-shooting franchise Doom, a mobile remake of jokey kids side-scroller Commander Keen, and playable adaptations of monster-budget movie franchises like Star Wars and The Avengers. But no title at E3 2019 garnered as much attention as Cyberpunk 2077, the unveiling of which was met with a level of slavish mania one might reserve for a stadium rock concert, or the ceremonial reveal of an efficacious new antibiotic.

An extended trailer premiere worked to whet appetites. Skyscrapers stretched upward, slashed horizontally with long windows of light and decked out with corporate branding for companies called “DATA INC.” and “softsys.” There were rotating wreaths of bright neon billboards advertising near-futuristic gizmos and gee-gaws, and, at the street level, sketchy no-tell motels and cars of the flying, non-flying, and self-piloting variety. In a grimy, high-security bunker, a man with a buzzcut, his face embedded with microchips, traded blows with another, slightly larger man with a buzzcut, whose fists were robotically augmented like the cyborg Special Forces brawler Jax from Mortal Kombat. The trailer smashed to its title, and to wild applause from congregated gamers and industry types.

Then, to a chug-a-lug riff provided by Swedish straight-edge punkers Refused (recording under the nom de guerre SAMURAI) that sounded like the sonic equivalent of a can of Monster energy drink, an enormous freight-style door lifted, revealing, through a haze of pumped-out fog, a vaguely familiar silhouette: a tall, lean-muscular stalk, scraggly hair cut just above the shoulders. Over the PA system, in smoothly undulating, bass-heavy movie trailer tones, a canned voice announced: “Please welcome . . . Keanu Reeves.” Applause. Pitchy screams. Hysterics in the front row prostrating themselves in Wayne’s World “we’re not worthy!” fashion. “I gotta talk to ya about something!” Reeves roared through the din. Dutifully reading from a teleprompter, he plugged Cyberpunk 2077’s customizable characters and its “vast open world with a branching storyline,” set in “a metropolis of the future where body modification has become an obsession.”

More than just stumping for Cyberpunk 2077, Reeves lent his voice and likeness to the game as a non-playable character (NPC) named “Johnny Silverhand,” who is described in the accompanying press materials as a “legendary rockerboy.” A relative newbie to the world of blockbuster Xbox One games, Reeves told the audience at E3 that Cyberpunk piqued his interest because he’s “always drawn to fascinating stories.” The comment is a bit rich—OK, yes, this is a trade show pitch, but still—considering that such near-futuristic, bodily augmented, neon-bathed dystopias are hardly new ground for Reeves. His appearance in Cyberpunk 2077 serves more to lend the game some genre cred, given Reeves’s starring roles in canonical sci-fi films such as Johnny Mnemonic (1995) and the considerably more fantastic Matrix trilogy (1999-2003)—now a quadrilogy, with a fourth installment announced just recently. Like many of E3 2019’s other top-shelf titles, Cyberpunk 2077 looked forward by reflecting back, conjuring its tech-noir scenario from the nostalgic ephemera of cyberpunk futures past.

This was hardly lost among all the uproar and excitement. Author William Gibson, a doyen of sci-fi’s so-called “cyberpunk” subgenre, offered his own withering appraisal of Cyberpunk 2077, tweeting that the game was little more than a cloned Grand Theft Auto, “skinned-over with generic 80s retro-future” upholstery. “[B]ut hey,” Gibson added, a bit glibly, “that’s just me.” One would imagine that, at least in the burrows of cyberpunk fandom, Gibson’s criticism carries considerable weight.

After all, the author’s 1984 novel Neuromancer is a core text in cyberpunk literature. Gibson also wrote the screenplay for Johnny Mnemonic, adapted from one of his own short stories, which likewise developed the aesthetic and thematic template for the cyberpunk genre: future dystopias in which corporations rule, computer implants (often called “wetware”) permit access to expansive virtual spaces that unfold before the user like a walk-in World Wide Web, scrappy gangs of social misfits unite to hack the bad guys’ mainframes, and samurai swords proliferate, along with Yakuza heavies, neon signs advertising noodle bars in Kanji, and other fetish objects imported from Japanese pop culture. Gibson dissing Cyberpunk 2077 is a bit like Elvis Presley clawing out of his grave to disparage the likeness of an aspiring Elvis impersonator.

Gibson’s snark speaks to a deeper malaise that has beset cyberpunk. Once a lively genre that offered a clear, if goofy, vision of the future, its structures of control, and the oppositional forces undermining those authoritarian edifices, cyberpunk has now been clouded by a kind of self-mythologizing nostalgia. This problem was diagnosed as early as 1991 by novelist Lewis Shiner, himself an early cyberpunk-lit affiliate.

“What cyberpunk had going for it,” Shiner wrote in a New York Times op-ed titled “Confessions of an Ex-Cyberpunk,” “was the idea that technology did not have to be intimidating. Readers in their teens and 20’s responded powerfully to it. They were tired of hearing how their home computers were tempting them into crime, how a few hackers would undermine Western civilization. They wanted fiction that could speak to the sense of joy and power that computers gave them.”

That sense of joy had been replaced, in Shiner’s estimation, by “power fantasies” (think only of The Matrix, in which Reeves’s moonlighting hacker becomes a reality-bending god), which offer “the same dead-end thrills we get from video games and blockbuster movies” (enter, in due time, the video games and blockbuster movies). Where early cyberpunk offerings rooted through the scrap heap of genre, history, and futurist prognostication to cobble together a genre that felt vital and original, its modern iterations have recourse only to the canon of cyberpunk itself, smashing together tropes, clichés, and old-hat ideas that, echoing Gibson’s complaint, feel pathetically unoriginal.

As Refused (in their pre-computer game rock band iteration) put it on the intro to their 1998 record The Shape of Punk to Come: “They told me that the classics never go out of style, but . . . they do, they do.”

Blade Ran

The word was minted by author Bruce Bethke, who titled a short story about teenage hackers “Cyberpunk” (written in 1980, published in 1983). But cyberpunk’s origins can be fruitfully traced back to 1968, when Philip K. Dick published Do Androids Dream of Electric Sheep?, a novel that updated the speculative fiction of Isaac Asimov’s Robot series for the psychedelic era. It’s ostensibly a tale about a bounty hunter named Rick Deckard chasing rogue androids in a post-apocalyptic San Francisco circa 1992. But like Dick’s better stories, it used its ready-made pulp sci-fi premise to flick at bigger questions about the nature of sentience and empathy, playing to a readership whose conceptions of consciousness were expanding.

Ridley Scott brought Dick’s story to the big screen with a loose 1982 film adaptation, Blade Runner, which cast Harrison Ford as Deckard and pushed its drizzly setting ahead to 2019. With its higher order questions about what it means to think, to feel, and to be free—and about who, or what, is entitled to such conditions—Blade Runner effectively set a cyberpunk template: the billboards, the neon, the high-collared jackets, the implants, the distinctly Japanese-influenced mise-en-scène extrapolated from Japan’s 1980s-era economic dominance. It is said that William Gibson saw Blade Runner in theaters while writing Neuromancer and suffered something of a crisis of confidence. “I was afraid to watch Blade Runner,” Gibson told The Paris Review in 2011. “I was right to be afraid, because even the first few minutes were better.” Yet Gibson deepened the framework established by Blade Runner with a crucial invention that would come to define cyberpunk as much as drizzle and dumpsters and sky-high billboards. He added another dimension—literally.

Henry Case, Gibson establishes early on, “lived for the bodiless exultation of cyberspace.” As delineated in Neuromancer, cyberspace is an immersive, virtual dimension. It’s a fully realized realm of data—“bright lattices of logic unfolding across that colorless void”—which hackers can “jack into” using strapped-on electrodes. That the matrix is “bodiless” is a key concept, both of Neuromancer and of cyberpunk generally. It casts the Gibsonian idea of cyberspace against another of the genre’s hallmarks: the high-tech body mods flogged by Keanu Reeves during the Cyberpunk 2077 E3 demo.

Early in Neuromancer, Gibson describes these sorts of robotic, cyborg-like implants and augmentations. A bartender called Ratz has a “prosthetic arm jerking monotonously” that is “cased in grubby pink plastic.” The same bartender has implanted teeth: “a webwork of East European steel and brown decay.” Gibson’s intense, earthy descriptions of these body modifications cue the reader into the fundamental appeal of Neuromancer’s matrix, in which the body itself becomes utterly immaterial. Authors from Neal Stephenson (Snow Crash) to Ernest Cline (Ready Player One, which is like a dorkier Snow Crash, if such a thing is conceivable), further developed this idea of what theorist Fredric Jameson called “a whole parallel universe of the nonmaterial.”

As envisioned in Stephenson’s Snow Crash, circa 1992, this parallel universe takes shape less as some complex architecture of unfathomable data, and more as an immersive, massively multiplayer online role-playing game (MMORPG). Stephenson’s “Metaverse”—a “moving illustration drawn by [a] computer according to specifications coming down the fiber-optic cable”—is not a supplement to our real, three-dimensional world of physical bodies, but a substitute for it. Visitors navigate the Metaverse using virtual avatars, which are infinitely customizable. As Snow Crash’s hero-protagonist, Hiro Protagonist (the book, it should be noted, is something of a satire), describes it: “Your avatar can look any way you want it to . . . If you’re ugly, you can make your avatar beautiful. If you’ve just gotten out of bed, your avatar can still be wearing beautiful clothes and professionally applied makeup. You can look like a gorilla or a dragon or a giant talking penis in the Metaverse.”

Beyond Meatspatial Reasoning

The Metaverse seems to predict the wide-open, utopian optimism of the internet: that “sense of joy and power” Lewis Shiner was talking about. It echoes early 1990s blather about the promise of a World Wide Web free from corporate or government interests, where users could communicate with others across the globe, forge new identities in chat rooms, and sample from a smorgasbord of lo-res pornographic images. Key to this promise was, to some extent, forming new identities and relationships by leaving one’s physical form behind (or jacked into a computer terminal in a storage locker somewhere).

Liberated from such bulky earthly trappings, we’d be free to pursue grander, more consequential adventures inside what Gibson, in Neuromancer, calls “the nonspace of the mind.” Elsewhere in cyberpunk-lit, bodies are seen as impediments to the purer experience of virtuality. After a character in Cory Doctorow’s Down and Out in the Magic Kingdom unplugs from a bracingly real simulation immersing him in the life of Abraham Lincoln, he curses the limitations of “the stupid, blind eyes; the thick, deaf ears.” Or, as Case puts it in Neuromancer, the body is little more than “meat.”

In Stephenson’s Metaverse, virtual bodies don’t even obey the tedious laws of physics that govern our non-virtual world. In order to manage the high amount of pedestrian traffic within the Metaverse and prevent users from bumping around endlessly, the complicated computer programming permits avatars simply to pass through one another. “When things get this jammed together,” Hiro explains, “the computer simplifies things by drawing all of the avatars ghostly and translucent so you can see where you’re going.” Bodies—or their virtual representations—waft through one another, as if existing in the realm of pure spirit. There is an almost Romantic bent here (Neuromancer = “new romancer”). If the imagination, to the Romantics, opened up a gateway to deep spiritual truth, here technology serves much the same purpose. Philip K. Dick may have copped something of the 1960s psychedelic era’s ethos of expanding the mind to explore the radiant depths of the individual soul, spirit, or whatever, but cyberpunk pushed that ethos outside, creating a shared mental non-space accessible by anyone with the means—a kind of Virtual Commons, or what Gibson calls a “consensual hallucination.”

Yet outside this hallucination, bodies still persist. And in cyberpunk, the physical configurations of these bodies tend to express their own utopian dimension. Bruce Bethke claimed that “cyberpunk” resulted from a deliberate effort to “invent a new term that grokked the juxtaposition of punk attitudes and high technology.” Subsequent cyberpunk did something a bit different, not juxtaposing but dovetailing those “punk attitudes” with high-tech. (“Low-life, high-tech” is a kind of a cyberpunk mantra.) Neuromancer’s central heist narrative gathers a cast of characters—hacker Henry Case, a cybernetically augmented “Razorgirl” named Molly Millions, a drug-addled thief, a Rastafari pilot—that can be described as “ragtag.” The major cyberpunk blockbusters configure their anti-authoritarian blocs along similar lines.

In Paul Verhoeven’s cyberpunk-y action satire Total Recall, a mighty construction worker-cum-intergalactic-spy (Arnold Schwarzenegger) joins a Martian resistance led by sex workers, physically deformed “mutants,” little people, and others whose physical identities mirror their economic alienation and opposition to a menacing corporate-colonial overlord named Cohaagen.

In Johnny Mnemonic, Keanu Reeves’s businesslike “mnemonic courier” (someone who ferries information using computer implants embedded in the brain) is joined by a vixenish bodyguard (Dina Meyer’s Jane, herself a version of Neuromancer’s Molly Millions), a burly doctor (Henry Rollins), and a group of street urchin-like “Lo-Teks” engaged in an ongoing counterinsurgency against the mega-corporation Pharmakom. Both Mnemonic and Recall rely on cheap twists, in which a figure integral to the central intrigue turns out to be something ostensibly less- or other-than-human. Total Recall has Kuato, a half-formed clairvoyant mutant who appears as a tumorous growth wriggling in the abdomen of his brother. Even more ludicrously, Mnemonic’s climax reveals that the Lo-Teks’ leader is not the resourceful J-Bone (Ice-T), but rather Jones, a computer-augmented dolphin. In cyberpunk, the body’s status as “dead meat” to be transcended through computer hardware and neurological implantation offers a corollary sense of freedom.

The idea of the cybernetic body as a metaphor for the politicized human body was theorized in 1985, cyberpunk’s early days, by philosopher and biologist Donna Haraway. Dense and wildly eclectic, by turns exciting and exasperating, Haraway’s “Cyborg Manifesto” is situated as an ironic myth, designed to smash existing oppositions between science and nature, mind and body. Haraway was particularly interested in developing an imagistic alternative to the idea of the “Goddess,” so common to the feminism of the time. Where the Goddess was backward-looking in orientation, attempting to connect women to some prelapsarian, pre-patriarchal state of nature, the cyborg was a myth of the future, or at least of the present. “Cyborg imagery,” she writes, “can suggest a way out of the maze of dualisms in which we have explained our bodies and our tools to ourselves.” Part machine and part flesh, Haraway visualizes the cyborg as a being that threatens existing borders and assumes responsibility for building new ones.

Though they are not quite identical concepts, Haraway’s figure of the cyborg and the thematics of cyberpunk share much in common. A character like Gibson’s Molly Millions, for example, could be described as a cyborg, even if she is still essentially gendered as female (the gender binary was one of the many “dualisms” Haraway believed the cyborg could collapse). Cyborgs and cyberpunk are connected in their resistance to an old order, be it political and economic (as in Neuromancer, Johnny Mnemonic, etc.) or metaphysical (as in Haraway). The cyborg and the cyberpunk both dream of new futures, new social relationships, new bodies, and whole new categories of conceptions and ways of being.

The historical problem is that, for the most part, these new categories and these new relationships failed to materialize, as cyberpunk’s futures were usurped and commodified by the powers they had hoped to oppose.

Not Turning Japanese

In an introduction to the Penguin Galaxy hardcover reissue of Neuromancer, sci-fi-fantasy writer Neil Gaiman ponders precisely how the 1980s cyberpunk visions came to shape the future. “I wonder,” he writes, “to what extent William Gibson described a future, and how much he enabled it—how much the people who read and loved Neuromancer made the future crystallize around his vision.”

It’s a paradox that dogs most great sci-fi writers, whose powers for Kuato-style clairvoyance have always struck me as exaggerated. After all, it’s not as if, say, Gene Roddenberry literally saw into the future, observed voice-automated assistants of the Siri and Alexa variety, and then invented his starship’s speaking computers. It’s more that other people saw the Star Trek technology and went about inventing it. The same is true of Gibson’s matrix or Stephenson’s Metaverse, or the androids of Asimov and Dick. And the realization of many technologies envisioned by cyberpunk—including the whole concept of the internet, which now operates not as an escapist complement to reality, but as an essential part of its fabric, like water or heat—has occurred not because of scrappy misfits and high-tech lowlifes tinkering in dingy basements, but because of gargantuan corporate entities. Or rather, the cyberpunks have become the corporate overlords, making the transition from the Lo-Teks to Pharmakom, from Kuato to Cohaagen. In the process, the genre and all its aspirations have been reduced to so much dead meat. This is what Shiner was reacting to when, in 1991, he renounced his cyberpunk affiliations, or when Bruce Bethke, who coined the term, began referring to “cyberpunk” as “the c-word.”

The commodification of the cool is a classic trick of capitalism, which has the frustrating ability to mutate faster than the forces that oppose it. Yet even this move toward commodification and corporatization is anticipated in much cyberpunk. “Power,” for Neuromancer’s Henry Case, “meant corporate power.” Gibson goes on: “Case had always taken it for granted that the real bosses, the kingpins in a given industry, would be both more and less than people.” For Case (and, it follows, Gibson, at least at the time of his writing), this power had “attained a kind of immortality” by evolving into an organism. Taking out one-or-another malicious CEO hardly matters when lines of substitutes are waiting in the wings to assume the role.

It’s here that cyberpunk critiques another kind of body. Not the ruddy human form that can be augmented and perfected by prosthetics and implants, but the economic body. Regarding the economy as a holistic organism—or a constituent part of one—is an idea that dates back at least as far as Adam Smith’s “invisible hand.” The rhetoric of contemporary economics is similarly biological. An edifying 2011 argument in Al Jazeera by Paul Rosenberg looked at the power of such symbolic conceptions of the economy. “The organic metaphor,” Rosenberg writes, “tells people to accept the economy as it is, to be passive, not to disturb it, to take a laissez faire attitude—leave it alone.”

This idea calls back to another of cyberpunk’s key aesthetic influences: the “body economic” of Japan in the 1980s. From the 2019 setting of 1982’s Blade Runner, to the conspicuous appearance of yakuza goons in Gibson’s stories, to Stephenson’s oddly anachronistic use of “Nipponese” in Snow Crash, cyberpunk’s speculative futures proceed from the economic ascendency of 1980s Japan, and the attendant anxiety that Japan would eventually eclipse America as an economic powerhouse. This idea, that Japan somehow is (or was) the future, has persisted all the way up to Cyberpunk 2077’s aesthetic template, and its foregrounding of villains like the shadowy Arasaka Corporation. It suggests that, even as it unfolds nearly sixty years from our future, the blockbuster video game is still obsessed with a vision of the future past.

Indeed, it’s telling that, as the robust Japanese economy receded in the 1990s, its burly body giving up the proverbial ghost, Japanese cinema became obsessed with avenging spirits channeled into the present by various technologies (a haunted video cassette in Hideo Nakata’s Ringu, the internet itself in Kiyoshi Kurosawa’s Kairo, etc.). But in the 1980s, Japan’s economic and technological dominance seemed like a foregone conclusion. In a 2001 Time article, Gibson called Japan cyberpunk’s “de facto spiritual home.” He goes on:

I remember my first glimpse of Shibuya, when one of the young Tokyo journalists who had taken me there, his face drenched with the light of a thousand media-suns—all that towering, animated crawl of commercial information—said, “You see? You see? It is Blade Runner town.” And it was. It so evidently was.

Gibson’s analysis features one glaring mistake. His insistence that “modern Japan simply was cyberpunk” is tethered to its actual history as an economic and technological powerhouse circa the 1980s, and not from its own science-fictional preoccupations. “It was not that there was a cyberpunk movement in Japan or a native literature akin to cyberpunk,” he writes. Except there so evidently was.

The Rusting World

Even beyond the limp, Orwellian connotations, 1984 was an auspicious year for science fiction. There was Neuromancer, yes. But 1984 also saw the first collected volume of Akira, a manga written and illustrated by Katsuhiro Otomo. Originally set, like Blade Runner, in 2019, Akira imagines a cyberpunk-y Neo-Tokyo, in which motorcycle-riding gangs do battle with oppressive government forces. Its 1988 anime adaptation was even more popular, in both Japan and the West. (The film’s trademark cherry red motorcycle has been repeatedly referenced in the grander cyberpunk canon, appearing in Steven Spielberg’s film adaptation of Ready Player One and, if pre-release hype is to be believed, in Cyberpunk 2077 itself.) In 2018, the British Film Institute hailed Akira, accurately, as “a vital cornerstone of the cyberpunk genre.”

Japan has plenty of other, non-Akira cyberpunk touchstones. As a cinematic subgenre, Japanese cyberpunk feels less connected to the “cyber” and more to the spirit of “punk,” whether in the showcasing of actual Japanese punk rock bands (as in 1982’s Burst City) or the films’ own commitment to a rough-hewn, low-budget, underground aesthetic. Chief among the latter category of films is Shinya Tsukamoto’s Tetsuo: The Iron Man, which was shot on 16mm over a grueling year-and-a-half, mostly in and around Tetsuo actress and cinematographer Kei Fujiwara’s apartment, which also housed most of the film’s cast and crew.

Unlike the Western cyberpunk classics, Tsukamoto’s vision of human-machine hybridization is demonstrably more nightmarish. The film follows two characters, credited as the Salaryman (Tomorowo Taguchi) and the Guy (a.k.a. “The Metal Fetishist,” played by writer/director/producer/editor Tsukamoto himself), bound by horrifying mutations, which see their flesh and internal organs sprouting mechanical hardware.

In its own way, Tetsuo works as a cyberpunk-horror allegory for the Japanese economy. As the Salaryman and the Fetishist learn to accept the condition of their mechanization, they merge together, absorbing all the inorganic matter around them, growing enormously like a real-world computer virus or some terrifying industrial Katamari. Their mission resonates like a perverse inversion of Japan’s post-industrial promise. As Tsukamoto’s Fetishist puts it: “We can rust the whole world and scatter it into the dust of the universe.”

Like Haraway’s development of the cyborg as a metaphoric alternative to the New Age “goddess,” Tetsuo’s titular Iron Man can offer a similar corrective. If cyberpunk has become hopelessly obsessed with its own nostalgia, recycling all its 1980s bric-a-brac endlessly, then we need a new model. Far from the visions of Gibson, in which technology provides an outlet for a scrappy utopian impulse that jeopardizes larger corporate-political dystopias, Tetsuo is more pessimistic. It sees the body—both the individual physical body and the grander corpus of political economy—as being machine-like. Yet, as Rosenberg notes in his Al Jazeera analysis of economic rhetoric, it may be more useful to conceive of the economy not as a “body” or an organism but as a machine. The body metaphor is conservative, “with implications that tend toward passivity and acceptance of whatever ills there may be.” Machines, by contrast, can be fixed, greased, re-oriented. They are, unlike bodies, a thing separate from us, and so subject to our designs.

Cybernetic implants and cyborg technology are not some antidote to corporate hegemony. The human does not meld with technology to transcend the limitations of humanity. Rather, technology and machinery pose direct threats to precisely that condition. We cannot, in Tsukamoto’s film, hack our way to a better future, or technologically augment our way out of collective despair. Technology—and the mindless rush to reproduce it—are, to Tsukamoto, the very conditions of that despair. Even at thirty years old, Tetsuo offers a chilling vision not of the future, or of 1980s Japan, but of right now: a present where the liberating possibilities of technology have been turned inside-out; where hackers become CEOs whose platforms bespoil democracy; where automation offers not the promise of increased wealth and leisure time, but joblessness, desperation, and the wholesale redundancy of the human species; where the shared hallucination of the virtual feels less than consensual.

There’s nothing utopian about the model of cyberpunk developed in Tetsuo: The Iron Man. It is purely dystopian. But this defeatism offers clarity. And in denying the collaborative, collectivist, positive vision of a technological future in favor of a vision of identity-destroying, soul-obliterating horror, Tsukamoto’s stone-cold classic of Japanese cyberpunk invites us to imagine our own anti-authoritarian, anti-corporate arrangements. The enduring canon of American-style cyberpunk may have grown rusty. It has been caught, as Bethke put it in his genre-naming story, “without a program.” But the genre’s gnarlier, Japanese iterations have plenty to offer, embodying sci-fi’s dream of imagining a far-off future as a deep, salient critique of the present. It is only when we accept this cruel machinery of the present that we can freely contemplate how best to tinker with its future.

Left to peddle such a despairing vision in a packed-out L.A. convention center, even cyberpunk’s postmortem poster boy Keanu Reeves would be left with little to say but a resigned, bewildered, “Whoa . . .”