Pop Culture Has Become an Oligopoly

By Adam Mastroianni

Source: Experimental History

You may have noticed that every popular movie these days is a remake, reboot, sequel, spinoff, or cinematic universe expansion. In 2021, only one of the ten top-grossing films––the Ryan Reynolds vehicle Free Guy––was an original. There were only two originals in 2020’s top 10, and none at all in 2019.

People blame this trend on greedy movie studios or dumb moviegoers or competition from Netflix or humanity running out of ideas. Some say it’s a sign of the end of movies. Others claim there’s nothing new about this at all.

Some of these explanations are flat-out wrong; others may contain a nugget of truth. But all of them are incomplete, because this isn’t just happening in movies. In every corner of pop culture––movies, TV, music, books, and video games––a smaller and smaller cartel of superstars is claiming a larger and larger share of the market. What used to be winners-take-some has grown into winners-take-most and is now verging on winners-take-all. The (very silly) word for this is oligopoly: like a monopoly, but with a few players instead of just one.

I’m inherently skeptical of big claims about historical shifts. I recently published a paper showing that people overestimate how much public opinion has changed over the past 50 years, so naturally I’m on the lookout for similar biases here. But this shift is not an illusion. It’s big, it’s been going on for decades, and it’s happening everywhere you look. So let’s get to the bottom of it.

(Data and code available here.)

Movies 

At the top of the box office charts, original films have gone extinct. 

I looked at the 20 top-grossing movies going all the way back to 1977 (source), and I coded whether each was part of what film scholars call a “multiplicity”—sequels, prequels, franchises, spin-offs, cinematic universe expansions, etc. This required some judgment calls. Lots of movies are based on books and TV shows, but I only counted them as multiplicities if they were related to a previous movie. So 1990’s Teenage Mutant Ninja Turtles doesn’t get coded as a multiplicity, but 1991’s Teenage Mutant Ninja Turtles II: The Secret of the Ooze does, and so does the 2014 Teenage Mutant Ninja Turtles remake. I also probably missed a few multiplicities, especially in earlier decades, since sometimes it’s not obvious that a movie has some connection to an earlier movie.

Regardless, the shift is gigantic. Until the year 2000, about 25% of top-grossing movies were prequels, sequels, spinoffs, remakes, reboots, or cinematic universe expansions. Since 2010, it’s been over 50% every year. In recent years, it’s been close to 100%.
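To make that concrete, here is a minimal sketch of the computation, assuming a hand-coded table with one row per film. The file and column names here are hypothetical, not the article’s actual code (which is linked above):

```python
import pandas as pd

# Hypothetical hand-coded table: one row per film, with columns
# "year", "title", and "is_multiplicity" (1 if the film is a sequel,
# prequel, remake, reboot, spinoff, or universe expansion; else 0).
movies = pd.read_csv("top20_movies_1977_2021.csv")

# Fraction of each year's top 20 grossers coded as multiplicities.
multiplicity_share = movies.groupby("year")["is_multiplicity"].mean()
print(multiplicity_share.tail(10))  # approaches 1.0 in recent years
```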

Original movies just aren’t popular anymore, if they even get made in the first place.

Top movies have also recently started taking a larger chunk of the market. I extracted the revenue of the top 20 movies and divided it by the total revenue of the top 200 movies, going all the way back to 1986 (source). The top 20 movies captured about 40% of all revenue until 2015, when they started gobbling up even more.
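As a sketch, again with hypothetical file and column names rather than the article’s actual code, the concentration figure boils down to one grouped division:

```python
import pandas as pd

# Hypothetical table: one row per film, with columns "year" and "revenue",
# covering at least each year's 200 highest-grossing films.
box = pd.read_csv("top200_grosses_1986_2021.csv")

def top20_share(year_df: pd.DataFrame) -> float:
    """Revenue of a year's top 20 films as a fraction of its top 200."""
    top200 = year_df.nlargest(200, "revenue")  # sorted descending by revenue
    return top200["revenue"].head(20).sum() / top200["revenue"].sum()

concentration = box.groupby("year").apply(top20_share)
print(concentration)  # hovers near 0.40 until ~2015, then climbs
```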

Television

Thanks to cable and streaming, there’s way more stuff on TV today than there was 50 years ago. So it would make sense if a few shows ruled the early decades of TV, and now new shows constantly displace each other at the top of the viewership charts.

Instead, the opposite has happened. I pulled the top 30 most-viewed TV shows from 1950 to 2019 (source) and found that fewer and fewer franchises rule a larger and larger share of the airwaves. In fact, since 2000, about a third of the top 30 most-viewed shows are either spinoffs of other shows in the top 30 (e.g., CSI and CSI: Miami) or multiple broadcasts of the same show (e.g., American Idol on Monday and American Idol on Wednesday). 

Two caveats to this data. First, I’m probably slightly undercounting multiplicities from earlier decades, where the connections between shows might be harder for a modern viewer like me to understand––maybe one guy hosted multiple different shows, for example. And second, the Nielsen ratings I’m using only recently started accurately measuring viewership on streaming platforms. But even in 2019, only 14% of viewing time was spent on streaming, so this data isn’t missing much.

Music

It used to be that a few hitmakers ruled the charts––The Beatles, The Eagles, Michael Jackson––while today it’s a free-for-all, right?

Nope. A data scientist named Azhad Syed has done the analysis, and he finds that the number of artists on the Billboard Hot 100 has been decreasing for decades.

And since 2000, the number of hits per artist on the Hot 100 has been increasing. 

(Azhad says he’s looking for a job––you should hire him!)

A smaller group of artists tops the charts, and they produce more of the chart-toppers. Music, too, has become an oligopoly.

Books

Literature feels like a different world than movies, TV, and music, and yet the trend is the same.

Using LiteraryHub’s list of the top 10 bestselling books for every year from 1919 to 2017, I found that the oligopoly has come to book publishing as well. There are a couple ways we can look at this. First, we can look at the percentage of repeat authors in the top 10––that is, the number of books in the top 10 that were written by an author with another book in the top 10.

It used to be pretty rare for one author to have multiple books in the top 10 in the same year. Since 1990, it’s happened almost every year. No author ever had three top 10 books in one year until Danielle Steel did it in 1998. In 2011, John Grisham, Kathryn Stockett, and Stieg Larsson all had two chart-topping books each.

We can also look at the percentage of authors in the top 10 who were already famous––say, because they had a top 10 book within the past 10 years. That has increased over time, too. 

In the 1950s, a little over half of the authors in the top 10 had been there before. These days, it’s closer to 75%.
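Both metrics reduce to simple counting. A minimal sketch, with hypothetical file and column names rather than the article’s actual code:

```python
import pandas as pd

# Hypothetical table of yearly top 10 bestsellers: "year", "title", "author".
books = pd.read_csv("top10_books_1919_2017.csv")

def repeat_author_share(year_df: pd.DataFrame) -> float:
    """Share of a year's top 10 written by an author with another book
    in that same year's top 10."""
    counts = year_df["author"].value_counts()
    return (year_df["author"].map(counts) > 1).mean()

def familiar_author_share(year_df: pd.DataFrame, all_books: pd.DataFrame) -> float:
    """Share of a year's top 10 written by an author who had a top 10
    book in the preceding 10 years."""
    year = year_df["year"].iloc[0]
    window = all_books[all_books["year"].between(year - 10, year - 1)]
    return year_df["author"].isin(window["author"]).mean()

repeat = books.groupby("year").apply(repeat_author_share)
familiar = books.groupby("year").apply(familiar_author_share, all_books=books)
```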

Video games

I tracked down the top 20 bestselling video games for each year from 1995 to 2021 (sources: 1, 2, 3, 4, 5, 6, 7) and coded whether each belongs to a preexisting video game franchise. (Some games, like Harry Potter and the Sorcerer’s Stone, belong to franchises outside of video games. For these, I coded the first installment as an original and any subsequent installments as franchise games.)

The oligopoly rules video games too:

In the late 1990s, 75% or less of bestselling video games were franchise installments. Since 2005, it’s been above 75% every year, and sometimes it’s 100%. At the top of the charts, it’s all Mario, Zelda, Call of Duty, and Grand Theft Auto.

Why is this happening?

Any explanation for the rise of the pop oligopoly has to answer two questions: why have producers started producing more of the same thing, and why are consumers consuming it? I think the answers to the first question are invasion, consolidation, and innovation. I think the answer to the second question is proliferation.

Invasion

Software and the internet have made it easier than ever to create and publish content. Most of the stuff that random amateurs make is crap and nobody looks at it, but a tiny proportion gets really successful. This might make media giants choose to produce and promote stuff that independent weirdos never could, like an Avengers movie. This can’t explain why oligopolization started decades ago––YouTube only launched in 2005, for example, and most Americans didn’t have broadband until 2007––but it might explain why it’s accelerated and stuck around.

Consolidation

Big things like to eat, defeat, and outcompete smaller things. So over time, big things should get bigger and small things should die off. Indeed, movie studios, music labels, TV stations, and publishers of books and video games have all consolidated. Maybe it’s inevitable that major producers of culture will suck up or destroy everybody else, leaving nothing but superstars and blockbusters. Indeed, maybe cultural oligopoly is merely a transition state before we reach cultural monopoly.

Innovation

You may think there’s nothing left to discover in art forms as old as literature and music, and that they simply iterate as fashions change. But it took humans thousands of years to figure out how to create the illusion of depth in paintings. Novelists used to think that sentences had to be long and complicated until Hemingway came along, wrote some snappy prose, and changed everything. Even very old art forms, then, may have secrets left to discover. Maybe the biggest players in culture discovered some innovations that won them a permanent, first-mover chunk of market share. I can think of a few:

  • In books: lightning-quick plots and chapter-ending cliffhangers. Nobody thinks The Da Vinci Code is high literature, but it’s a book that really really wants you to read it. And a lot of people did!
  • In music: sampling. Musicians seem to sample more often these days. Now we not only remake songs; we franchise them too.
  • In movies, TV, and video games: cinematic universes. Studios have finally figured out that once audiences fall in love with fictional worlds, they want to spend lots of time in them. Marvel, DC, and Star Wars are the most famous, but there are also smaller universe expansions like Better Call Saul and El Camino from Breaking Bad and The Many Saints of Newark from The Sopranos. Video game developers have understood this for even longer, which is why Mario does everything from playing tennis to driving go-karts to, you know, being a piece of paper.

Proliferation

Invasion, consolidation, and innovation can, I think, explain the pop oligopoly from the supply side. But all three require a willing audience. So why might people be more open to experiencing the same thing over and over again?

As options multiply, choosing gets harder. You can’t possibly evaluate everything, so you start relying on cues like “this movie has Tom Hanks in it” or “I liked Red Dead Redemption, so I’ll probably like Red Dead Redemption II,” which makes you less and less likely to pick something unfamiliar. 

Another way to think about it: more opportunities mean higher opportunity costs, which could lead to lower risk tolerance. When the only way to watch a movie is to go pick one of the seven playing at your local AMC, you might take a chance on something new. But when you’ve got a million movies to pick from, picking a safe, familiar option seems more sensible than gambling on an original.

This could be happening across all of culture at once. Movies don’t just compete with other movies. They compete with every other way of spending your time, and those ways are both infinite and increasing. There are now 60,000 free books on Project Gutenberg, Spotify says it has 78 million songs and 4 million podcast episodes, and humanity uploads 500 hours of video to YouTube every minute. So uh, yeah, the Tom Hanks movie sounds good.

What do we do about it?

Some may think that the rise of the pop oligopoly means the decline of quality. But the oligopoly can still make art: Red Dead Redemption II is a terrific game, “Blinding Lights” is a great song, and Toy Story 4 is a pretty good movie. And when you look back at popular stuff from a generation ago, there was plenty of dreck. We’ve forgotten the pulpy Westerns and insipid romances that made the bestseller lists while books like The Great Gatsby, Brave New World, and Animal Farm did not. American Idol is not so different from the televised talent shows of the 1950s. Popular culture has always been a mix of the brilliant and the banal, and nothing I’ve shown you suggests that the ratio has changed.

The problem isn’t that the mean has decreased. It’s that the variance has shrunk. Movies, TV, music, books, and video games should expand our consciousness, jumpstart our imaginations, and introduce us to new worlds and stories and feelings. They should alienate us sometimes, or make us mad, or make us think. But they can’t do any of that if they only feed us sequels and spinoffs. It’s like eating macaroni and cheese every single night forever: it may be comfortable, but eventually you’re going to get scurvy. 

We haven’t fully reckoned with what the cultural oligopoly might be doing to us. How much does it stunt our imaginations to play the same video games we were playing 30 years ago? What message does it send that one of the most popular songs in the 2010s was about how a 1970s rock star was really cool? How much does it dull our ambitions to watch 2021’s The Matrix Resurrections, where the most interesting scene is just Neo watching the original Matrix from 1999? How inspiring is it to watch tiny variations on the same police procedurals and reality shows year after year? My parents grew up with the first Star Wars movie, which had the audacity to create an entire universe. My niece and nephews are growing up with the ninth Star Wars movie, which aspires to move merchandise. Subsisting entirely on cultural comfort food cannot make us thoughtful, creative, or courageous.

Fortunately, there’s a cure for our cultural anemia. While the top of the charts has been oligopolized, the bottom remains a vibrant anarchy. There are weird books and funky movies and bangers from across the sea. Two of the most interesting video games of the past decade put you in the role of an immigration officer and an insurance claims adjuster. Every strange thing, wonderful and terrible, is available to you, but they’ll die out if you don’t nourish them with your attention. Finding them takes some foraging and digging, and then you’ll have to stomach some very odd, unfamiliar flavors. That’s good. Learning to like unfamiliar things is one of the noblest human pursuits; it builds our empathy for unfamiliar people. And it kindles that delicate, precious fire inside us––without it, we might as well be algorithms. Humankind does not live on bread alone, nor can our spirits long survive on a diet of reruns.

Tears in Rain: ‘Blade Runner’ and Philip K. Dick’s Legacy in Film

Blade Runner, and the work of Philip K. Dick, continues to find its way into our cinemas and minds. How did the visions of a paranoid loner become the most relevant science fiction of our time?

By Sean Bell

Source: PopMatters

“You are an unusual man, Mr. Asher,” the cop beside him said. “Crazy or not, whatever it is that has gone wrong with you, you are one of a kind.” — Philip K. Dick, Flow My Tears, The Policeman Said.
“I’ve seen things you people wouldn’t believe…”
— Roy Batty, Blade Runner.

Los Angeles, 2019. The script simply reads: “Ext. Hades – Dusk”. Chemical flame bursts into the perpetual, post-nuclear Los Angeles gloom from skyscraping smokestacks, rising out of an industrial landscape Hieronymus Bosch might have envisaged. Flying cars flit through the darkness like fireflies. The camera moves slowly over impossible architecture with the immensity and decaying grandeur of ancient Egypt; a phantasmagorical megalopolis strung with necklaces of neon. The music rises.

The last war is over and nobody won. The Earth is living on borrowed time. Science has destroyed all boundaries between the real and unreal except those we choose to impose. Policemen are murderers, androids are lovers, nobody can be trusted, and everybody dies. Welcome back to the world of Blade Runner. Every time we return, it becomes more recognisable. Perhaps we never left.

Santa Ana, 1981. By now, Philip K. Dick — author, religious visionary and possible schizophrenic, who once said “You would have to kill me and prop me up in the seat of my car with a smile painted on my face to get me to go near Hollywood” — has read and approved the shooting script for Blade Runner by Hampton Fancher and David Peoples, as well as viewed a test reel of the movie’s groundbreaking special effects. In an effusive letter to Jeff Walker, the man in charge of the film’s marketing, Dick makes his prophetic opinion clear.

“Let me sum it up this way,” he wrote. “Science fiction has slowly and ineluctably settled into a monotonous death: it has become inbred, derivative, stale. Suddenly you people come in, some of the greatest talents currently in existence, and now we have a new life, a new start. As for my own role in the BLADE RUNNER project, I can only say that I did not know that a work of mine or a set of ideas of mine could be escalated into such stunning dimensions. My life and creative work are justified and completed by BLADE RUNNER… It will prove invincible.” (‘Letter to Jeff Walker regarding Blade Runner’, PhilipKDick.com)

Dick, with a writer’s knack, chose the right words. Blade Runner has survived everything that could be thrown at it, including its initial critical reception, successive unsatisfactory edits, and dilution by both the science fiction genre and the film industry, which would plagiarise its vision with varying degrees of shamelessness for the next three decades. Yet nothing has been able to replicate its unique synthesis of elements, melding equal parts noir, action, romance, cyberpunk, dystopia and a meditation on what it means to be human. Whether praised or disparaged, ignored or overexposed, it has seen off all critics and challengers. Blade Runner‘s invincibility endures.

Unfortunately, the inimitable artistic fortitude of Ridley Scott‘s best work also allowed Hollywood to practice its favourite hobby of missing the point. The novels (or rather, in the film industry netherworld, ‘properties’) of Philip K. Dick — a writer largely unappreciated in his own lifetime who, at his lowest, claimed he could not afford the late fee for a library book and infamously sustained himself on horse meat from a local pet store — have become highly prized and sought-after by studios eager to import intelligent ideas, rather than go through the hassle of conceiving their own. In the (twisted) spirit of Dick’s short story “We Can Remember It For You Wholesale“, no need for originality, we can develop it for you wholesale.

Question the nature of identity with Colin Farrell’s Douglas Quaid/Hauser as he flees the futuristic gunfire in Len Wiseman’s remake of Total Recall (2012; an adaptation of an adaptation, as it were). Consider the limitations of free will with Tom Cruise’s Chief John Anderton (a well-known fan of unstable science fiction writers) in Steven Spielberg’s Minority Report (2002) while wondering how short he really is. Watch John Woo’s Paycheck (2003) and wonder how much Ben Affleck is actually being paid to appear in this piece of shit.

The names change, but the formula remains the same. Each summer, there must be blockbusters, and to fill the gaps in between the explosions with the barest bones of a Phil Dick story is, to the studio mindset, to buy instant, ready-made philosophical depth and artistic worth. Just add CGI!

Improbably, with 11 films based on his work and more in the pipeline, Dick has become the most adapted science fiction author in cinema history… And I’m not sure that’s a good thing.

“It’s a film about whether or not you can have a meaningful relationship with your toaster.” — Harrison Ford on Blade Runner in an interview with The Washington Post, 11 September 1992

Scott is a director second only to George Lucas in his determination to tinker with a bygone masterwork until it meets his ultimate, exacting satisfaction (at last count, there are seven different versions of Blade Runner floating around the ether, finally culminating in Scott’s so-called ‘final cut’ in 2007). Yet when a startling new interpretation of the film appeared online this June to wild acclaim, he had nothing to do with it. Instead, it was the work of the artist Anders Ramsell, who painstakingly recreated every frame of the movie’s opening 13 minutes with 3,285 gorgeous, haunting, impressionistic watercolours.

The painting technique employed to create this effect is known as aquarelle, which also acts as a sly commentary on Blade Runner‘s repeatedly-rejiggered legacy: because of its transparency, with aquarelle nothing can be painted over — once a mistake has been made, it has to be lived with. But as I watched what Ramsell called his ‘paraphrase’ of the movie, the familiar scenes and faces and voices emerging from the flickering, sensuous wash of colour, it seemed all the more appropriate. Other than Derek Jarman’s Caravaggio, I can think of no other film that so looks like a painting without paint — a piece of art that evokes other art, and yet remains entirely itself.

This is all a long way of saying that Blade Runner is beautiful, in almost every way possible. This has, ridiculously, sometimes been described as being to its detriment: Roger Ebert, one of the film’s more famous naysayers, wrote in a contemporary review in the Chicago Sun-Times that “It looks fabulous… but it is thin in its human story,” (11 September 1992) an oddly myopic remark to make about a parable on the nature of humanity. True, Blade Runner has an embarrassment of style, but never at the expense of substance, much like the best examples of the Cinema du look movement, which was just emerging in French cinema at the time of Blade Runner‘s release. Indeed, in appearance and atmosphere, Blade Runner is so distinct that it changed the aesthetic of science fiction, and new subgenres have since been invented simply to describe it — ‘future noir’, coined by the critic and filmmaker Paul M. Sammon, being the most enduring.

To those who have yet to experience the film, I wonder what to say that has not been said elsewhere. Based on Philip K. Dick’s 1968 novel Do Androids Dream of Electric Sheep?, Blade Runner somehow manages to be simultaneously the truest adaptation of Dick’s work yet produced — in spirit, if not plot — and also the freest in its interpretation of the material.

The central premise remains in both book and film: Rick Deckard (Harrison Ford), halfway between policeman and bounty hunter, makes his living by tracking and killing androids that are almost entirely indistinguishable from humans, living amongst us under false identities. The supposed test for distinguishing real from unreal humans is that of empathy, but as Deckard methodically adds to his ‘artificial’ bodycount, he begins to question who is really the unemotional machine.

Blade Runner dumps much of the novel’s narrative furniture — Scott justified this to Dick by saying: “You know you’re so dense, mate, that by page 32, there’s about 17 storylines” — in a way that, in almost any other adaptation, would be considered sacrilege. Most noticeably, in the novel, Deckard is trapped in a barely functional marriage, while in the movie, Harrison Ford’s laconic protagonist is the classic bachelor gumshoe.

Also prominent in the book are ‘Mercerism’, the religion based around its followers’ empathy for a man whom others were throwing rocks at, and the ‘Mood Organ’, the household device which stimulates any mood the user wishes to experience, and which much of the population now depends on in order to face another bleak, post-apocalyptic day — two tragicomic creations typical of Dick, but too wry to fit with Blade Runner‘s overall tone. Despite such changes, Dick’s reading of the screenplay convinced him that this film could express his ideas in a way he had not thought possible.

The production could hardly be called smooth. Even before its release, troubling rumours had begun circulating, and as Paul M. Sammon wrote, ”by the time BR (Blade Runner) officially completed principal photography in July of 1981, the gossip had become more specific — BR‘s workload had been horrendous, its shooting conditions miserable, and its director, difficult. Moreover, the whispers went, BR’s moneymen were unhappy, its leading man had clashed with his director, and the crew had been near revolt.” Some of these stories were indeed true, but one should bear in mind that movie moneymen are nearly always unhappy, and Harrison Ford will always be the man who told George Lucas that “you can type this shit, but you sure as hell can’t say it.”

Paranoia Runs Through Blade Runner Like a Knife Edge

When Blade Runner was first released, the reviews ran from lukewarm to underwhelmed to baffled, a fact that many critics have been doing their best to forget in the intervening years, polishing the film’s legend in a frantic effort to make up for missing the boat first time round. Considered today, in our gleaming plastic future of bad credit and worse politics, where we are reliant not on humanised robots but dehumanising gadgets, and where much of the movie’s retro-futurism appears as quaint as it does prescient, what is left to say about Blade Runner?

Should I recite the plaudits that have become ritualistic? Should I point out that Roy Batty’s (Rutger Hauer) ‘tears in rain’ monologue is among the most moving and eloquent ever written or performed? Or that Vangelis revolutionised jazz, electronic music and film scoring simultaneously in one of the best soundtracks ever composed? Is it really necessary to highlight the acting of all involved; Harrison Ford as one of the best soulful detectives to grace noir of any kind, an understated and heartbreaking Sean Young (as Rachael), a dangerous but innocent Daryl Hannah (as Pris), or Rutger Hauer’s mixture of the psychotic and the Shakespearean? Do I need to praise the costumes, the set design, or the script which balances so many themes, so many hidden or implied meanings, with such grace and economy?

Well, yes. You want a criticism of Blade Runner? I would’ve liked more of Edward James Olmos’ Gaff. He’s cool. I also want Deckard’s coat.

But we need to be reminded of these things, every now and again, as each new generation discovers the film afresh, and especially when the next logical question becomes: how did Blade Runner get it so right, more so than any other adaptation of Dick’s work since? What did it understand that they did not? To try and answer that, we have to go back to the source.

“Phil acknowledged that his talk sometimes sounded like “religious nonsense & occult nonsense” — but somewhere in it all was the truth. And he would never find it. God himself had assured him of that.” — Lawrence Sutin, Divine Invasions: A Life of Philip K. Dick.

In 2000, at the one and only Disinfo Con — a friendly gathering of millennial deviants concerning all things ‘counterculture’ — Grant Morrison, a writer who could arguably be counted as one of Dick’s prodigious bastard literary offspring, opened his keynote speech marvelling at the fact he spent his youth reading Robert Anton Wilson and now, all these years later, “we’re standing here, we’re talking about this shit and it’s real.”

Fans of Philip K. Dick have been feeling like that since at least the ’70s, when the dark realities on the other side of the Drug Revolution and the true possibilities of an all-powerful Nixonian surveillance state were becoming painfully apparent. Then as now, we use Dick’s paranoia to express our own.

Dick once commented that, with the making of Blade Runner, what had once been a private world that only he inhabited was now open to all; they would live in its murky depths, just as he did every day. At first, one might think that a paranoid and delusional visionary — and I use the word in its most literal sense — would want nothing more than for others to see and experience what he does. But perhaps Dick didn’t want anyone in his world. Maybe he didn’t think his world should be suffered by anyone else.

The fantastic, tortured mind of Philip K. Dick has been discussed, analysed and treated like an object of supreme curiosity since his death in 1982 (a few months shy of Blade Runner‘s release) and before. He had eked out an impoverished existence writing science fiction since the ’50s, when it was less a genre than a ghetto, and could only make a living by producing at a furious pace which he fuelled with huge amounts of amphetamines, at one point producing 12 novels over the course of two years.

But Dick was ever a seeker of truth, which made him, strangely, something of an oddity in science fiction. Within these novels, beneath the aliens and spaceships, moon colonies and interstellar wars, was arguably the best satire being done in science fiction on American life outside of Kurt Vonnegut‘s oeuvre, the philosophical undertones and untrammelled, often gothic imagination of which called to mind Jorge Luis Borges more than Isaac Asimov. The uniting thread that runs through his work — and, eventually, his life — is the notion that reality, as we know it, is fundamentally untrustworthy.

As the ’60s wore on, Dick’s involvement with the drug culture increased, adding a psychedelic aspect to both his ever-evolving philosophy and his ever-present paranoia. In 1971, Dick experienced a defining moment: his home was broken into, his safe blown open, and many of his personal papers stolen. He never discovered who was responsible, though he had many suspects — local drug addicts, Black Panthers, the police, the FBI, the Soviets, or any of these, pretending to be one of the others.

The effect it left on Dick was to give him proof, unexplained and terrifying, that his paranoia was somehow justified. There had been comfort in simply telling himself he was crazy. But after years of grappling with his mental health, someone, it seemed, truly was out to get him.

“I mean,” he told the Aquarian magazine in a 1974 interview, “it’s a very frightening thing when the head of a police department tells you that you better leave the county because you have enemies, and you don’t know who these enemies are or why you’ve incurred their wrath.”

When one is peripherally aware of one’s own tenuous psychological condition (which we must be, we tell ourselves, if we have any hope of overcoming it), there follows a sheer anguish, a self-doubt, a fear the uninitiated cannot imagine: you know you cannot trust your own mind, and you cannot trust anyone to help you. The luckiest paranoids are also egotists, or better yet, megalomaniacs; they face the lurking forces of imagined persecution with a defiant roar or a knowing smirk.

Dick, on the other hand, was a sensitive, fragile, frightened soul, as well as a possible undiagnosed schizophrenic. His subsequent religious experiences — a series of hallucinations in which a beam of pink light connected him with a “transcendentally rational mind” — may have been his brain’s attempt to soothe his paranoia, or merely the next stage of his mental instability. In either case, as always, it provided inspiration for another novel.

Critics and biographers often talk about Dick’s paranoia and delusions like they were shocking fashion statements or extreme political opinions — just another interesting aspect to the bizarro image of a literary titan. What they don’t say, although the evidence is all around us, is that Dick’s paranoia is ours, as well. Ours may not have such colourful outlets or dramatic results, but we shall always carry our paranoia with us. It cannot be blotted out. There will forever be dark fears lurking in the deeper pools of our mind about our untrustworthy friends, co-workers, policemen, criminals, the FBI, the CIA, the Communists, the aliens, or God himself… and beyond. Existence will always be open to question, forever taunting us with its uncertainty. We can’t trust reality, but we have to live there.

This paranoia runs through Blade Runner like a knife edge. You cannot trust others, or yourself. You cannot trust your memories, your past, or your future. The entire world may be against you, even if it doesn’t appear to be. Then again, you don’t have to like it.

The heroes and villains of Blade Runner do not like, or accept, the oppression of the paranoid worldview. Deckard fights a series of seemingly impossible battles, even against what he thought was true, and finds romance where it should be impossible. Roy Batty, the leader of the renegade replicants, fights against death itself, seeking to extend and outlive his designer-mandated expiration date. The inhuman fights to become human, so humans must prove their humanity. This is the message at Blade Runner‘s core, and it has never been replicated.

Near the end of Dick’s 1976 novel Radio Free Albemuth, the protagonist, an author surrogate for Dick who also writes pulp science fiction, is faced by a gloating representative of the government conspiracy that the writer has become entangled with, who coolly informs him they will continue to put out books under his name — lurid trash scattered with a few of the author’s trademark concepts to keep it recognisable for his fans, but encoded with none-too-subtle propaganda messages designed to keep its readers afraid and unenlightened.

At our most cynical, it’s sometimes difficult not to think of Hollywood the same way: trading on Dick’s name and style and core ideas, but discarding the message and the mind behind them.

Authors often have to face a string of misfires and misinterpretations before their work gets the cinematic treatment it deserves; Dick, on the other hand, got a masterpiece on the first go. And it remains ours to enjoy: Blade Runner, as Dick wrote, is invincible.

Maybe it’s finally time for Hollywood to leave the legacy of Philip K. Dick at the local bookstore, where it belongs.

“What matters to me is the writing, the act of manufacturing the novel, because while I am doing it, at that particular moment, I am in the world I am writing about. It is real to me, completely and utterly. Then, when I’m finished, and I have to stop, withdraw from that world forever — that destroys me… I promise myself: I will never write another novel. I will never again imagine people from whom I will eventually be cut off. I tell myself this… and, secretly and cautiously, I begin another book.” — Philip K. Dick, “Notes Made Late At Night By A Weary SF Writer”, 1968.

How to Avert a Digital Dystopia

By Jumana Abu-Ghazaleh

Source: OneZero

“What I find [ominous] is how seldom, today, we see the phrase ‘the 22nd century.’ Almost never. Compare this with the frequency with which the 21st century was evoked in popular culture during, say, the 1920s.”

—William Gibson, famed science-fiction author, in an interview on dystopian fiction.

The 2010s are almost over. And it doesn’t quite feel right.

When the end of 2009 came into view, the end of the 2000s felt like a relatively innocuous milestone. The current moment feels so much more, what’s the word?

Ah, yes: dystopian.

Looking back, “dystopia” might have been the watchword of the 2010s. Black Mirror debuted close to the beginning of the decade, and early in its run, it was sometimes critiqued for how over-the-top it all felt. Now, at the end of the decade, it’s regularly critiqued as made obsolete by reality.

And it’s not just prestige TV like Black Mirror reflecting the decade’s mood of incipient collapse. Of the 2010s’ top 10 highest-grossing films, by my count at least half involve an apocalypse either narrowly averted or, in fact, taking place (I’m looking at you, Avengers movies).

People have reasons to wallow. I get it. The existential threat of climate change alone — and seeing efforts to mitigate it slow down precisely as it becomes more pressing — could fuel whole libraries of dystopian fiction.

Meanwhile, our current tech landscape — the monopolies, the wild spread of disinformation, the sense that your most private data could go public whenever, with no recourse, all the things that risk making Black Mirror feel quaint — truly feels dystopian.

Since no one in a position to actually do something about our dystopian reality seems to be admitting it — no business leaders, politicians or legacy media — it makes sense that you might get catharsis of acknowledgment from pop culture instead. And yet, the most popular end-of-the-world fiction isn’t about actual imminent threats from climate or tech. It’s about Thanos coming to snap half of life out of existence. Or Voldemort threatening to destroy us Muggles.

Maybe that kind of pop culture, which acknowledges dystopia but not the actual threats we currently face, gives us a feeling of control: Sure, Equifax could leak my social security number and face zero consequences, but there are no Hunger Games. Wow — it really could be so much worse! Maybe we enjoy watching distant, imaginary dystopias because they distract us from oncoming, real dystopias.

But let’s look at those actual potential dystopias for a moment and think about what we need to do to avert them.

I’d suggest the big four U.S. tech giants — Amazon, Facebook, Apple, Google — each have a distinct possible dystopia associated with them. If we don’t turn around our current reality, we will likely get all four — after all, for all the antagonistic rhetoric among the giants, they are rather co-dependent. Let’s look at what we might have, ahem, look forward to — unless we demand the tech giants deliver on the utopia they purportedly set out to achieve when their respective founders raised their rounds of millions. I would argue not only that we can, but that we must hold them accountable.

“Mad Max,” or, slowly then all at once: starring Apple

“‘How did you go bankrupt?’ Bill asked. ‘Two ways,’ Mike said. ‘Gradually and then suddenly.’”

—Ernest Hemingway, The Sun Also Rises.

When you think of Mad Max, you probably think of an irradiated, post-apocalyptic desert hellscape. You’re also not thinking of Mad Max.

In the original 1979 film, the apocalypse hasn’t quite yet happened. There’s been a substantial social breakdown, but things are getting worse in slow motion. There are still functioning towns. Our protagonist, Max, is a working-class cop; and while there’s reason to believe a big crash is coming, or has even begun, society is still hanging on. (It’s only in the sequels that we’re well into the post-apocalyptic landscape people are thinking of when they say “Mad Max.”)

A relatively subtle dystopia, where things gradually decline in the background, is also a good day-to-day description of a society overrun by algorithms, even without the attention-grabbing mega-scandals of a Cambridge Analytica or massive data breach. A kind of dystopia “light” — and Apple is its poster child.

After all, Apple has a genuinely better track record than some of the other tech giants on a few key privacy issues. But it’s also genuinely aware of the value of promulgating that vision of itself — and that can lead Apple users into danger.

In January, Apple purchased a multistory billboard outside the Consumer Electronics Show in Las Vegas, with this message: “What happens on your iPhone, stays on your iPhone.” Sounds great — but it’s deeply misleading, and as journalist Mark Wilson noted, Apple’s mismatch between rhetoric and behavior fuels the nightmare that is our current data security crisis:

“[iPhone] contents are encrypted by default […] But that doesn’t stop the 2 million or so apps in the App Store from spying on iPhone users and selling details of their private lives. “Tens of millions of people have data taken from them — and they don’t have the slightest clue,” says [the] founder of [the] cybersecurity firm Guardian […] The Wall Street Journal studied 70 iOS apps […] and found several that were delivering deeply private information, including heart rate and fertility data, to Facebook.” [Emphasis mine.]

A tech giant that is claiming it’s the path to salvation, while effectively creating a trap for those who believe it, sounds ironically familiar given Apple’s famous evocation of Big Brother.

After all, when people talk about habit-forming technology in terms so terrifying they’ve convinced Silicon Valley executives to limit their children’s access to their own products, let’s be real: They’re talking about iPhones.

When academic child psychology researcher Jean Twenge talks about a possible teenage mental health epidemic fueled by social media, we know what’s at the heart of it: She’s talking about iPhones.

All those aforementioned horror stories, and a huge slice of those algorithms you’ve heard so much about, are likely first reaching you on smartphones that, with world market share above 50%, are largely, you guessed it, iPhones. (And none of these stories even mention Apple workers overseas at facilities like Foxconn who create our iPhones and who really are living in a kind of explicit dystopia.)

What happens on your iPhone almost certainly doesn’t stay on your iPhone. But who created that surveillance capitalism running it all in the first place?

Enter Google.

“Black Mirror”: “Nosedive,” or, welcome to surveillance capitalism: starring Google

“We know where you are. We know where you’ve been. We can more or less know what you’re thinking about.”

—Google’s then-CEO Eric Schmidt, in a 2011 interview.

You’ve probably heard it before: “If you’re not paying, you’re the product.” This is usually in reference to ostensibly “free” services like Facebook or Gmail. It’s a creepy thought. And, according to Shoshana Zuboff, professor emerita at Harvard and economic analyst of what she’s termed “surveillance capitalism,” the selling of your personal information undermines autonomy. It’s worse than you being the product: “You are not the product. You are the abandoned carcass.”

Google, according to Zuboff, is the original inventor of Surveillance Capitalism. In their early “Don’t Be Evil” days, the idea of accessing people’s private Google searches and selling them was considered unthinkable. Then Google realized it could use search data for targeting purposes — and never stopped creating opportunities to surveil their users:

“Google’s new methods were prized for their ability to find data that users had opted to keep private and to infer extensive personal information that users did not provide. These operations were designed to bypass user awareness. […] In other words, from the very start Google’s breakthrough depended upon a one-way mirror: surveillance.”

Twenty years later, surveillance capitalism has become so ubiquitous that it’s hard to live in Western society without being surveilled constantly by private actors.

As far as I know, no mass popular culture has really yet captured this reality, but one small metaphor that kind of hits on its effects is a Black Mirror episode called “Nosedive.”

In “Nosedive,” everyday people’s lived experience is very clearly the picked-apart carcass for an entire economic and social order; a kind of surveillance-driven social credit score affects every aspect of your daily life, from customer service to government resources to friendships, all based on your app usage and, most creepily, how other people rate you in the app.

If surveillance capitalism has been the engine powering our economy in the background for nearly two decades, it’s now having a coming-out party. Increasingly, Google isn’t just surveilling us in private — with its “designing smart cities” initiatives, the company will literally be making city management decisions instead of citizens: Sidewalk Labs, a Google sister company, plans to develop “the most innovative district in the entire world” in the Quayside neighborhood of Toronto, and Google itself is planning on siphoning every bit of data about how Quayside residents live and breathe and move via ubiquitous monitoring sensors that will likely inform — for a fee, naturally — how other cities will develop.

Much like Apple, Google takes pains to present itself as a conscientious corporate citizen. They might be paternalistic, or antidemocratic — but they have learned it’s important to their brand that they’re seen as responsive to their workers and the broader public, largely thanks to the courageous and persistent effort of their workers and consumer advocates in civil society.

Not so much with Amazon.

“Elysium,” or, dystopia for some, Prime Day for others: starring Amazon

“[The New York Times] claims that our intentional approach is to create a soulless, dystopian workplace where no fun is had and no laughter heard. Again, I don’t recognize this Amazon and I very much hope you don’t either.” —Jeff Bezos, August 17, 2015 letter to staff after the New York Times investigation into working conditions at the company.

In 2015, Jeff Bezos felt the need to set the record straight: The New York Times was wrong about Amazon. Working there did not feel like a dystopia.

The years since have only validated the New York Times story, which focused on life for coders and executives at Amazon. Notably, when the Times and other investigative journalists have probed life for the far more numerous warehouse workers employed by Amazon, Bezos has largely stayed silent.

In fact, the further down the corporate ladder you get at Amazon, the more likely it seems that Jeff Bezos will stay quiet on any controversy. Just this month, in a report published almost exactly four years after Bezos’ “Amazon is not a dystopia” declaration, the New York Times has uncovered almost a dozen previously unreported deaths allegedly caused by Amazon’s decentralized delivery network. Rather than defend itself out loud, Amazon has kept quiet while repeating the same argument in the courts: Those delivery people aren’t Amazon workers at all, and thus Amazon is not liable.

Amazon, like every major tech giant, has a key role in the dystopia of surveillance capitalism — the monopolylike market share of Amazon Web Services, and Amazon’s involvement in increasingly ubiquitous facial recognition software, represent their own deeply dystopian trends. But the most visible dystopia Amazon creates, for all to see, is dystopia in the workplace.

In many ways, Amazon is the single company that best explains the appeal of an Andrew Yang figure to a certain slice of economically alienated young voters. When speaking near Amazon’s HQ in Seattle, Yang explicitly talked about the surveillance of Amazon workers, and how reliable those jobs really are in any case:

“All the Amazon employees [here] are like, ‘Oh shit, is Jeff watching me right now?’… [Amazon will] open up a fulfillment warehouse that employs, let’s call it 20,000 people. How many retail workers worked at the malls that went out of business because of Amazon? [The] greatest thing would be if Jeff Bezos just stood up one day and said, ‘Hey, the truth is we are one of the primary organizations automating away millions of American jobs.’ […] I have friends who work at Amazon and they say point-blank that ‘we are told we are going to be trying to get rid of our own jobs.’”

You can flat-out disagree with Yang’s proposed solutions, but a lot of his appeal stems from the fact that he’s diagnosing a problem that broad swaths of people don’t feel is being talked about. Yang validates his supporters’ concerns that they are, in fact, living in a dystopia of the corporate overlord variety.

In the movie Elysium, most work is done in warehouses, under constant surveillance, with workers creating the very automation systems that surveil and punish them. The movie takes place in a company townlike setting, with a rigid class system and no such thing as social mobility. Meanwhile, the ruling class in Elysium lives in space, having left everyone else behind to work on Earth, a planet now fully ravaged by climate change.

That might sound particularly far-fetched, but given Bezos’ explicit intention to colonize space because “we are in the process of destroying this planet,” it suddenly doesn’t feel so off the mark. And in an era where Governors and Mayors openly genuflect to Amazon, preemptively giving up vast swaths of democratic powers for the mere possibility that Amazon might host an office building there, it’s hard not to feel like we’re already in an Elysium-flavored dystopia.

Amazon has their dystopia picked out, flavor and all. But what happens when the biggest social network in the world can’t decide which dystopia it wants to be when it grows up?

Pick a dystopia — any dystopia!: starring Facebook

“Understanding who you serve is always a very important problem, and it only gets harder the more people that you serve.”

—Mark Zuckerberg, 2014 interview with the New York Times.

Ready Player One is one of the more popular recent dystopian novels.

The bleak future it depicts is relatively straightforward: In the face of economic and ecological collapse, the vast majority of human interaction and commercial activity happens over a shared virtual reality space called Oasis.

In Oasis, the downtrodden masses compete in enormous multiplayer video games, hoping to win enough prizes and gain sufficient corporate sponsorship to scrape out a decent existence. Imagine a version of The Matrix, where people choose to constantly log into unreality because actual reality has gotten so unbearably terrible, electing to let the real world waste away. Horrific.

Ready Player One is also the book that Oculus founder and former Facebook employee Palmer Luckey used to give new hires working on virtual reality, to get them “excited” about the “potential” of their work.

Sound beyond parody? In so many ways, Facebook is unique among the tech giants: It’s not hiding the specter of dystopia. It’s amplifying dystopia.

It’s hard to pick a popular dystopia Facebook isn’t invested in.

Surveillance capitalism? Google invented it, but Facebook has taken it to a whole new level with its social and emotional contagion experiments and relentless tracking of even nonusers.

1984? Sure, Facebook says, quietly patenting technology that lets your phone record you without warning.

Brave New World? Lest we forget, Facebook literally experimented with making depression contagious in 2014.

28 Days Later, or any of the various other mass-violence-as-disease horror movies like The Happening? Facebook has been used to spread mass genocidal panics far more terrifying than any apocalyptic Hollywood film.

What about the seemingly way-out-there dystopias — something like THX-1138 or a particularly gnarly Black Mirror episode where a brain can have its thoughts directly read, or even electronically implanted? It won’t comfort you to know that Facebook just acquired CTRL-Labs, which is developing a wearable brain-computer interface, raising questions about literal thought rewriting, brain hacking, and psychological “discontinuity.”

Roger McNamee, an early Zuckerberg advisor and arguably Facebook’s most important early investor, has become blunt about it: Facebook has become a dystopia. It’s up to the rest of us to catch up.

We spent the 2010s on dystopia—let’s spend the 2020s on utopia instead

“Plan for the worst, hope for the best, and maybe wind up somewhere in the middle.” —Bright Eyes, “Loose Leaves”

People generally seem to think dystopias are possible, but utopias are not. No one ridicules you for conceiving of a dystopia.

I think part of that is because it gives us an easy out. Dystopias paralyze us. They overwhelm. They make us feel small and powerless. Envisioning dystopia is like getting married while anticipating the divorce. All we can do is make sure it’s amicable.

Is there room for a utopian counterweight? There’s not only room, there’s an urgent need if we want to look forward (as opposed to despondently) to the 22nd century. We cannot avert or undo dystopias without believing in their counterparts.

But we need to make the utopian alternative feel real, accessible, and achievable. We need to be rooting not for the lesser of two evils, but for something actually good.

Dystopias — real, about-to-unfold dystopias — have been averted before. The threat of nuclear apocalypse during the Cold War. The shrinking hole in the ozone layer (which is both distinct from, and has lessons to teach us about, the climate crisis). We didn’t land in utopia, but it was only by hitching our wagons to a utopian vision that we averted the worst.

In 2017, cultural historian Jill Lepore penned a kind of goodbye letter to dystopian fiction, calling for a renewal of utopian imagination. “Dystopia,” she lamented, “used to be a fiction of resistance; it’s become a fiction of submission.” Dystopian narratives once served as stark warnings of what might be in store for us if we do nothing, spurring us on to devise a brighter future. Today, dystopian fiction is so prevalent and comes in so many unsavory flavors that our civic imaginations are understandably confined to identifying the one we deem most likely to inevitably happen, and to come to terms with it.

But we don’t have to.

A new decade is on the way. Let’s spend the 2020s exercising our utopian imaginations — the muscles we use to envision dystopia are now all too well developed, and a body that only exercises one set of muscles quickly grows off-balance.

Dystopias disempower. We are tiny, inconsequential — how could we do anything about them? Utopias, on the other hand, are rhetorical devices calling upon us to build. They invite our participation. Because a utopia where we don’t matter is a contradiction in terms.

Let’s envision a world where those creating algorithms are thinking not only about their reach, but also about their impact. A world in which we are not the carcass left behind by surveillance capitalism. A world in which calling for ethical norms and standards is in itself a utopian act.

Let’s spend the next decade fighting for what we actually want: a world in which the powerful few are held to a higher standard; an industry in which ethics aren’t an afterthought, and the phrase “unintended consequences” doesn’t absolve actors from the fallout of their very deliberate acts.

Let’s actualize the utopia which, ironically enough, the tech giants themselves so enthusiastically promised us when they set out to change the world.

Let’s spend this next decade asking for what we actually want.

The Philosophy of Westworld

By Jeremy D. Johnson

Source: Omni

Michael Crichton wrote and directed Westworld for the big screen in 1973. That same decade, in 1976, an adjunct professor named Julian Jaynes made the bestseller list with a surprising title: The Origins of Consciousness in the Breakdown of the Bicameral Mind. You wouldn’t think that a book with a name like that would become such a popular success. Yet, there it was. In 2016, when Westworld came to the small screen in the re-imagined HBO series, you wouldn’t have expected to hear from Jaynes again, especially since bicameralism wasn’t even mentioned in Crichton’s original film. Yet, there he was. Early on in Westworld’s first season, Dr. Ford, one of the creators of the park, explains how he and his co-founder Arnold used a “debunked” theory about the origins of consciousness to bootstrap A.I. The scientific community didn’t recognize bicameralism as an explanation for the origins of the human mind, but, as Dr. Ford suggests, it could be useful for building an artificial one. Thousands of people—perhaps more—started Googling “bicameral mind.” Bloggers and YouTube channels capitalized on the sudden interest by writing articles and introductory videos about this weird, arguably psychedelic theory of consciousness. Suddenly everyone was interested.

This article isn’t going to be one of those explanation pieces, but it’s worth mentioning a few preliminary details.

Looking Through the Mirror of Consciousness

According to bicameralism, human beings used to hear voices—auditory hallucinations—as a means for the right brain to “talk” with the left. Rather than having an inner monologue, the kind of self-consciousness we take for granted today, ancient people literally heard the voices of gods as their conscience, telling them what to do. This, Jaynes argues, accounts for the abundant descriptions from antiquity of gods and deities appearing all over the place, meddling directly in human affairs. Over time—about 3,000 years ago—as various calamities occurred and societies got bigger and more complex, the bicameral mind broke down. The gods went silent. The modern, introspective self, quite literally, came to mind.

Jaynes may have been onto something, but even if he wasn’t, his book makes for a compelling and well-written read. The cultural zeitgeist of the 1970s, we must remember, was the high-water mark of psychedelic intrigue and “High Weirdness,” with writers like Philip K. Dick and Robert Anton Wilson both having their own inexplicable experiences in 1974 (see “2-3-74”). Dick would turn this encounter into the semi-autobiographical VALIS trilogy as well as his Exegesis. This brings us back to our time.

Bicameralism would have been enough to place Westworld in good, present company: Netflix’s recent Stranger Things and The OA, cerebral films like Arrival, and even the metaphysical, possibly D.M.T.-inspired comic book movie Doctor Strange, just to name a few. What connects any and all of these media is pop culture’s intensifying fascination with the mysteries of our own consciousness. We’re having something, as The Atlantic recently suggested, like a “metaphysical moment.” Multiple realities intersecting with our own. Deep, dark structures of the psyche spilling up into the conscious mind in the form of auditory hallucinations. The emergence of consciousness buried somewhere in archaic chapters of history. All of these subjects are reaching full saturation through hit T.V. series, and are at least flirted with in Hollywood blockbusters. Consciousness is in. (Permit a moment of conjecture, but with the increased sense of global, existential malaise around issues like climate change and political nativism, it should come as no surprise that we’ve turned inward for solutions. Western culture in the 1960s and 70s, despite, or because of, being under threat of Cold War nuclear armageddon, produced tremendously thoughtful and visionary art.)

Westworld is a show that celebrates the kind of weird prevalent in pop culture during the 1970s: a desire to connect with those hidden recesses of the psyche that each of us have experienced in dream, creative process, and reverie. “O, what a world of unseen visions and heard silences,” Jaynes writes in The Origins of Consciousness, “this insubstantial country of the mind!” When Dolores, a “host” in the park, goes on her journey of self-discovery, there’s a part of us that goes with her. It helps that Dolores, along with the other hosts in the park, experiences her memories as a kind of waking dream, navigating altered states of consciousness and auditory hallucinations in order to succeed in her quest for liberation. We’ve all felt, quite rightly, that there is more to ourselves than our waking, conscious minds, and that if there were some way to communicate with those occluded dimensions of ourselves we could gain some inkling of wisdom (hence, I think, all the self-described “psychonauts” around today). Westworld functions like a scrying mirror for the curious audience to embark on their own journeys of self-knowledge. It is this more intangible aspect of the show—and not just Western gunslinging androids—that made it such a hit.

Jeffrey Kripal, a religious scholar, writes about this intimate link between pop culture and consciousness in Mutants and Mystics: Science Fiction, Superhero Comics, and the Paranormal.

“What makes these particular forms of American popular culture so popular is precisely the paranormal. The paranormal here understood as dramatic physical manifestations of the meaning and force of consciousness itself.”

We are drawn to the weird because the weird is showing us something about ourselves.

Elaine Pagels published The Gnostic Gospels in 1979, a book which quickly became a classic in the American spiritual counter-culture. I mention it here because of the intriguing gnostic motifs embodied so well by Dr. Ford himself. For those of you who aren’t familiar with gnosticism, or The Gnostic Gospels, the gospels were written by early Christian sects who, speaking very generally, believed in heretical ideas. There was no single gnostic church. Philip K. Dick was drawn to their darker, paranoid theme of the false world: the idea that our reality was somehow an illusory one—a trap—created by a lesser god. A “demiurge.” The demiurge would rule over its creation and keep human souls ignorant of their spiritual birthright, lest they break through themselves in states of elevated consciousness or “gnosis.” It was, in other words, up to the individual to liberate themselves, not through reason, or faith, but gnosis. Other popular films, like The Matrix Trilogy, would take this motif and run with it quite successfully. But Westworld’s Dr. Ford plays the perfect gnostic demiurge; having created the hosts in the first place, he ensures that they stay ignorant of their own potential for self-consciousness and liberation. Trapped in their loops, and wiped of their memories, the hosts remain blissfully unaware that they are living inside an amusement park. (To avoid any major spoilers I’ll simply leave this cryptic remark: we know this is only partly true by the end of season one. The gnostic trap becomes a different, albeit more violent, means toward freedom. Dr. Ford, by the final episode, becomes a triumphant expounder of the gnostic doctrine: the gods won’t help you liberate yourself. Those voices were you. You are the higher being waiting to become self-illuminated. Westworld is not only about consciousness, but liberation through personal gnosis.)

This Path is Never Linear

The maze is an image with deep significance. Hosts in the park, when they begin to develop nascent self-consciousness, are invited to partake in a puzzle—“The Maze.” The Man in Black is repeatedly told, much to his dismay, “the maze isn’t meant for you.” It doesn’t stop him from trying. The goal is to get to the center of it, but what does this mean? Carl Jung, the Swiss psychologist who developed a theory of the unconscious, and whom the 70s spiritual counterculture would help to popularize, would immediately recognize the maze as a symbol of both the labyrinth and the mandala. Let me explain.

By entering the maze, or synonymous labyrinth (the show dangles this myth in front of us with the strange appearance of a Minotaur host), an individual embarks on a perilous journey of self-discovery. It is through surviving the twists and turns of the labyrinth that the adventurer gains a form of self-realization. Think: Luke Skywalker and Yoda’s cave in Star Wars V: The Empire Strikes Back. In the case of Westworld, the maze leads to consciousness, and perhaps even freedom from the park itself. Jung, if he were alive today, might smile and nod. “The goal of psychic development,” he writes in Memories, Dreams, Reflections, “is the self.” Jung adds—echoing Dr. Ford—that consciousness isn’t a pyramid but a maze: “There is no linear evolution; there is only the circumambulation of the self.” When we see the image of the maze painted on the skull of a host, early on in the season, we’re looking at a mandala: those intricately patterned mazes often leading towards some center. Jung writes, “The mandala is the center. It is the exponent of all paths… to the center, to individuation.” It is through the messy, round-about series of wrong turns that we come to consciousness. “Mistake. Mistake.” There is no straight path to the center of the maze. There is no easy way towards self-discovery. No wonder we loved this show. It turns out the maze really is meant for us.

Modern Fictions – How the Sacred Manifests in Chaos, Superheroes and Outer Spaces


By Kingsley L. Davis

Source: Reality Sandwich

The virtual topographies of our millennial world are rife with angels and aliens, with digital avatars and mystic Gaian minds, with utopian longings and gnostic science fictions, and with dark forebodings of apocalypse and demonic enchantment
Erik Davis

…All our so-called consciousness is a more or less fantastic commentary on an unknown, perhaps unknowable, but felt text…
Friedrich Nietzsche

Science fiction is always more important than science
Timothy Leary

Everything that can be said has already been said, or something to that effect. It is not original to state that originality no longer exists; it has all been done before. Yet, as Marshall McLuhan famously said, ‘the medium is the message.’ So it may not be the message we are concerned with here but rather the medium of its passing. And the adage goes that everything exists according to ‘time and place.’ When the ‘sacred speaks’ – so to speak – it does so through the ways and means of the times. This could apply to prophets, oracles, and channelling as well as pop culture and its modern fictions. The sacred, the sublime, has always walked amongst the profane. The signs are everywhere, blended into the sidewalks, pulp fictions, and the kitsch ‘n’ kool of the art world. For iconic sci-fi writer Philip K. Dick, most of the sublime things of his world were disguised as trash that seamlessly slipped into the background of a dysfunctional world reality. As modern society slipstreamed into a post-modern smorgasbord of chaos, clutter, poetically burnt outbursts, and beatific revelations, a new landscape of the sacred was scattered across the bedrock. The seeming trash of the everyday mundane clashed with the incoming cosmic, and a new urge for the transcendental found its way into so many popular cultural forms that it would take an encyclopaedic Möbius strip to recite it all. For my purposes here I will only ever-so-briefly take a hop and a skip around some of the budding flora that displayed a burgeoning sacred urge to blur the boundaries and reach for the sublime connection.

However paradoxical it may sound, one of the mediums for the sacred virus to spread came through the channel of chaos. Chaos, contrary to our image of it as an anarchic and senseless cacophony, is actually a canvas for patterns to play out on. As the later emergence of the chaos sciences showed, there was a theory behind chaos – a method behind its apparent madness. Chaos, as we soon learned, did not operate in isolation. As the famous ‘butterfly effect’ aptly illustrated, a minimal disturbance in one part of the world (e.g., a butterfly flapping its wings) could result in a climate effect in another part (a tornado was often cited!). Everything thus existed in patterns, and not in arbitrary, random molluscs and mole-hills. The Santa Fe Institute (founded in 1984) quickly became a prominent centre for research into complex systems, otherwise known as chaos science. Yet the emergence of chaos science had been actualized earlier through many different cultural forms of recognizing ‘chaos’ as a precursor to states of consciousness. Many forms/functions that emerge as aspects of the human condition are first seeded in popular culture ahead of their wider actualization. After all, the basis of the sacred refers to actualized aspects of human consciousness. And what the sacred art shows us is that its presence in our reality-matrix is determined by our capacity of consciousness to receive and acknowledge it. Chaos, as well as being patterns embedded in physical, computational, biological, and social systems, is also patterns of our minds. In fact, it can be said that chaos is part of the order of the cosmos.
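(An aside for the numerically inclined: the sensitive dependence that the butterfly effect describes is easy to demonstrate for yourself. The sketch below is a minimal, hypothetical illustration, not anything drawn from the chaos literature discussed here; it iterates the logistic map, a standard toy model of chaotic dynamics, from two starting points that differ by one part in ten billion. The parameter r = 4.0 and the starting value 0.2 are arbitrary choices of mine.)

```python
# A minimal sketch of "sensitive dependence on initial conditions":
# the logistic map x_{n+1} = r * x_n * (1 - x_n), a standard toy model
# from chaos science. All parameter choices here are arbitrary.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)           # one "butterfly"
b = logistic_trajectory(0.2 + 1e-10)   # a wing-flap of difference

for n in (0, 10, 25, 40):
    print(f"step {n:2d}: |difference| = {abs(a[n] - b[n]):.2e}")

# The gap grows from 1e-10 to order 1 within a few dozen steps:
# tiny causes, enormous effects, in a fully deterministic system.
```

Run it and the printed differences balloon from 10⁻¹⁰ toward order one: deterministic rules, wildly divergent outcomes, which is all the butterfly effect claims.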

 

Chaos & the Cosmic

“Tis an ill wind that blows no minds” – Principia Discordia

The signs for magic, chaos, and transcendental byways were popping up almost everywhere on the western landscape in the post-war, post-modern years. Enochian magic, Golden Dawn rituals, and meta-computing of the self were seeding a growing experimentalism of the human mind. In the US especially, a blend of anarchic cultural subversions was manifesting that played upon known semi-mystical memes. One of these was the text of the Principia Discordia, which emerged in the nineteen-sixties as a ‘sacred text’ of Discordianism. Written by Malaclypse the Younger (Greg Hill) and Omar Khayyam Ravenhurst (Kerry Thornley), it proclaimed “All hail Discordia!” in a mixture of goddess worship and the notion of order and disorder as balancing illusions. The fifth commandment of Principia Discordia states, ‘V – A Discordian is Prohibited of Believing What he Reads.’ In keeping with a rising tide of memes dealing with truth-through-contradiction, the Principia Discordia also went on to claim that,

All statements are true in some sense, false in some sense, meaningless in some sense, true and false in some sense, true and meaningless in some sense, false and meaningless in some sense, and true and false and meaningless in some sense.

Discordia came to influence the writings of maverick author-philosopher Robert Anton Wilson, who popularised it further in his books, especially in ‘The Illuminatus! Trilogy.’ These utterances were echoed by the writer William S. Burroughs who, besides experimenting in cut-up narrative techniques, proclaimed a Discordian-esque ‘Nothing is absolutely true – Everything is permissible.’ Burroughs’s infamous outburst was a fusion of religious history (the Assassins of Hassan-i Sabbah) with the anarchic chaos of his spirit-possessed universe.

Around this time literate and literary magicians were cropping up everywhere, writing tracts on magic for a modern reader. Many of these literary figures were connected to the Golden Dawn system of magic. Yet another emerging stream was that of chaos magic, which originated in the United Kingdom in the late seventies. This broader magical path was liberal enough to combine forms of neoshamanism, eastern philosophy, quantum science, visionary art, and later computer technology. This experimental perspective on magic was part of a wider trend of experimenting with known forms to find new avenues for stimulating and awakening consciousness. These ‘chaotic’ paths were attempting to destabilize our conditioning patterns and our resultant consensus reality. They were all aimed at waking up the usually slumbering human mind. As the seminal work Waking Up (1986) by Charles Tart showed, humanity was largely intoxicated by a ‘consensus trance’ that kept us from recognizing sigils of the sacred. In more recent years the metaphors and memes of being trapped within a waking dream, or of dreams within dreams, have been explored in such popular films as The Truman Show (1998), the Matrix trilogy (1999–2003), and Inception (2010). Part of the myth we find ourselves popularising is the mythology that we are in some sort of constructed reality – a gnostic-inspired simulacrum of truth.

Gnostic ideas are being gnawed over, processed, and consumed in ever more popular forms of culture. There’s an odd wave of mystical-spiritual impulses now radiating through popular culture that encourages us to throw ourselves into new world-spaces, fantastic realms, and mythological fictions and factions. These are new mash-ups of the counterculture now being packaged and presented as part of mainstream culture. And in recent years the most extraordinary success in this area has been the incredible, phenomenal rise of the modern superhero.

 

Superheroes & the Super-Self

Ever since Nietzsche first declared that ‘God is dead’ we have been reeling from, and dealing with, our encroaching mortality – and trying to avoid it by seeking new technologies and cultural expressions of immortality. This collective experience of the possible ‘death of god’ is like a hammer-blow of shock that propels us against the loss of sacred meaning and sublime mystery. Whether we admit it or not, we fear the sense of absence, where nothing exists to which we can lend our communal assent. We don’t wish to struggle fragmented and bewildered, abashed by creative forms of indulgence. We cannot be left behind, losing our vital contact with the imaginal, the numinous, or the magical. We cannot be left untransformed in our vacant spaces as a paranormal pop culture washes over us. No – we need our superheroes, our possibilities, our potentials. We need to find a cultural expression for the human psyche; for our psychic currents and transmissions and sacred communication – our superheroes must live on!

Perhaps through the loss of our gods we have had to become our own multiple gods, as we realized a need to fill a vacuum left by myth. With the loss of the godly connection a different psychic wave was released upon the world to coincide with a rising arc of human consciousness. According to Jung, the gods gradually became our disease – ‘The gods have become diseases…who unwittingly let loose psychic epidemics on the world.’[1] These diseases have now morphed into mutations that make us into a hybrid human-god, with superhuman capacities, yet shunned by the world for being heretical against the natural order. We have the X-Men walking amongst us, a mutant subspecies of humans. The natural order is evo-mythological – it is sacred, beyond human, and connects us with evolutionary currents. In the absence of our ancient myths we have ingested the sacred alchemical root and through pop-culture morphed this transformation into the new wave of superheroes – myth lives anew in spandex. Maybe it is a cliché because it is true; we wish to find the personal superhero within each of us – the journey of the individual, unfolding within the great cosmic drama. This myth – this journey – has largely been taken from us through scientific rationalism and an industrial modernity. Yet now, by becoming more than oneself, we serve the larger story arc.

Our popular subcultures are gradually becoming the norm. It is not only that more people are interested; it is that these ideas are now more widely available, thanks to popular culture. As William Irwin Thompson notes – ‘We Americans, who are so intent on creating a culture of technological materialism, cannot take in esoteric lore directly; it has to find another way in, and so comic books, science fiction, and movies are the back door.’ [2] Popular culture has been the back door for most of us, and not just for the Americans. But now perhaps the door frames are merging into the background and disappearing altogether. The waking life and the dream are becoming part of the same movie plot, as in Richard Linklater’s film version of Philip K. Dick’s A Scanner Darkly (2006). We are more and more waking up into our own movie – our very own Truman Show – where ideas are seeded directly into our environments in order to catalyze our awakening. Like the ancient Eastern tales told us, we have been asleep in a distant land and now we are receiving messages – signals – flashing like neon signs through our popular culture. This marks our juncture, our crisis point, between moving toward waking up or falling back into archaic, catastrophic and catatonic slumber. Again, Thompson reminds us that we ‘intuitively sense our evolutionary crisis and are expressing the catastrophe bifurcation through art – primarily through science fiction.’[3]

Our ultra high-definition visual culture is acting like a portal for the otherworld to enter. The psychedelic experiences that were once fringe and condemned are being replayed through modern fictions that blend Gnostic tropes, mythological memes, and multidimensional portals. Transcendental states of consciousness, ratified by the far explorations of new science, are adding to the mix of a new 21st-century mythology that as yet remains unnamed. Perhaps we are emerging toward the birth of new sacred gods. These are the gods of mutations, of neurological and biological adaptations. And they are emerging first in our pop-culture as our superheroes and psychic mutants. In this initiation into a psychically enhanced future we will need more than ever to learn how to distinguish the demonic from the spiritual. Hence the current barrage of cultural tropes in our films, TV series, and fiction that show angels vs. devils, humans vs. vampires, and the whole gamut of the good vs. the bad that has crawled from the forest floor to enter into the quest for the holy grail. All the while the Fisher King sits immobilized, feasting on an orgy of massively multiplayer online role-playing games (MMORPGs). In this way the gods will never be forgotten as they merge with a super-augmented mutant humanity in spandex. As psychologist James Hillman says,

Remember: what the Greeks said their Gods asked for above all else, and perhaps only, was not blood; it was not to be forgotten, that is, to be kept in mind, recollected as psychological facts… [The Gods’] reality can never fade as long as they are remembered, that is, kept in mind. That’s how they survive. [4]

The real gods, as we knew all along and yet had forgotten, reside within our psyche – they are kept in mind. And yet they can only become real for us – to re-mind us – when dashing about on the stage and streets in front of our very eyes. We need the sacred to slap our faces in spandex gloves before we begin to blink a waking eye.

As Jeffrey J. Kripal writes in his Mutants and Mystics, we have entered the stage of ‘Realization’ whereby we begin to recognize that the events around us in popular culture are not only real but are participatory. That is, our sacred and supernatural fictions appear for us and require our engaged reading of them in order for them to read us. Kripal says that,

In some fundamental way that we do not yet understand, they are us, projected into the objective world of events and things, usually through some story, symbol, or sign. Realization is the insight that we are caught in such a story. Realization is the insight that we are being written.[5]

The latest revival of the superhero genre is significant in how it takes the mutant trope further and projects it forward as a form of evolutionary mysticism. Our new heroes are displaying to us our latent capacities and powers that are yet to unfold. We are witness to the first wave of mutant evolutionary pioneers. The summit of human evolution is far in the distance, and yet its early stages are manifesting through the Marvel and DC Universes where god-like potentials await us. Through such characters as Spider-Man, Iron Man, Captain America, Wolverine, and Doctor Strange, Marvel mesmerizes paranormal subliminals into popular cultural consciousness. And DC does the same with Superman, Batman, Wonder Woman, Green Lantern, Flash, and Green Arrow. Then as teams they come together as the Guardians of the Galaxy, the Fantastic Four, the X-Men (Marvel), or as the Justice League (DC). They are now our teachers, our guides, our mutant futures that are beyond human. As Kripal recognized, the mutants have become practicing mystics.

We are seemingly living more and more in a mutational and metaphysical universe, and with the arrival of augmented reality our boundaries of interaction with the physical world around us will blur. And yet this suggests a return to the sacred perspective whereby the tangible and intangible worlds become an integral part of our holistic reality-matrix. And we are already well on our way as our outer and inner spaces explode into new, blistering supernovae.

 

Outer Spaces – Inner Spaces

Humankind has always been a child of the stars. Our early civilizations mapped the heavens before they mapped the terrain under their feet. The abode of the gods was amongst the glitterballs of the night sky, and their chariots blazed across the incandescent cosmic canvas. So it was no surprise when the UFOs started to dart across our urban skies and come crashing down disguised as government weather balloons. Recent popular culture has nurtured a fascination with outer spaces and our galactic cousins, from the Golden Age of science fiction of the nineteen thirties, forties, and fifties to the new wave of the sixties and seventies. The concerns of our outer space relations shifted from how to make contact with our space cousins to the entropic death of the universe. And then the environmental trope entered our outer spaces, as if a subliminal projection from our very own inner spaces. The sacred inner space of humankind was now tethered to the galactic outer spaces, concerned with our future place in the universe. The growing number of alleged UFO abductees that emerged in the latter part of the twentieth century began to relay messages of extraterrestrial concern for our planetary well-being.

John E. Mack, an American professor of psychiatry at Harvard Medical School, became in his later years a leading authority on the spiritual or transformational effects of the alien abduction experience. Mack came to view the alien abduction phenomenon as a catalyst to ‘shatter the boundaries of the psyche and to open consciousness to a wider sense of existence and connection in the universe.’[6] For more than a decade Mack rigorously studied the alien abduction phenomenon and interviewed hundreds of people (whom Mack referred to as ‘experiencers’). What initially started out as an exercise in studying mental illness soon turned into an in-depth inquiry into personal and spiritual transformation. Mack eventually came to see the alien abduction phenomenon as one of the most powerful agents for spiritual growth, personal transformation, and expanded awareness – in other words, as a trigger for a sacred experience. Despite the external anxiety produced by the experience, it was clear to both Mack and his set of experiencers that a profound communion was being established between humankind and other realities. Further, this interaction was catalyzing a shift in human consciousness toward collapsing the old models of materialistic duality and opening up a connection not only ‘beyond the Earth’ but with other dimensional realities. Mack notes that ‘the process of psychospiritual opening that the abduction phenomenon provokes may bring experiencers to a still deeper level of consciousness where the oneness or interconnectedness of creation becomes a compelling reality.’[7]

This interconnectedness became a channel for the experiencers (abductees) to receive an impressive range of information: healing knowledge, spiritual truths, science, technology, and ecology. A major part of the information apparently concerned the status of the Earth and humanity’s relationship with its environment. Many of the experiencers referred to their own abduction phenomenon as participation in a trans-dimensional or interspecies relationship. The transformative effects of these unusual encounters were often remarkable. Mack’s experiencers talked about an expansion of psychic or intuitive abilities; a heightened reverence for nature; the feeling of having a special mission on Earth; the collapse of space/time perception; an understanding of the multiple dimensions of reality and the existence of multiverses; a feeling of connection with all of creation; and a whole range of related transpersonal experiences. Significant in these accounts is that, according to the experiencers, the abduction phenomenon is sometimes accompanied by a sense of moving into, or connecting with, other realities or dimensions. The sacred space and outer space were becoming one and the same. Or to put it another way, the contact initiated from those ‘out there’ was having a catalyzing effect, triggering an awakening in the inner spaces way ‘down here.’ It made sense then that our human future was going to include space migration. And according to our galactic cousins, it may even be a necessity if we continue to mess up our planetary home as if it were nothing but a playground to scoff around in.

Inner space junkie Timothy Leary was already riding that space-me-outta-here ticket with his S.M.I.2L.E. philosophy. Leary’s S.M.I.2L.E. stood for Space Migration, Intelligence Increase, and Life Extension. Basically, these were all the tropes of the post-humanist mélange added onto the sci-fi dream of humanity living off-planet. We also now have the commercial race to establish a new branch of space tourism, with Virgin Galactic being one of the most visible and vocal frontrunners. SpaceX, another private enterprise, is banking its dollars on helping to colonize Mars. There’s no lack of vision; it’s now down to the know-how and the technological leg-up. Now that the space cat is out of the bag (excuse the pun), it’s only going to be a matter of time before the picture we have of ‘being human’ incorporates the starry, cold vistas of outer space. From the earliest sacred expressions in the cave art of our ancestors to the ideas of space migration, all of these show two fundamental urges within the human being: i) I am human, I am here (recognition); and ii) Where is the heavenly connection? (contact). Human dreams have encompassed living on Mars, leaving and migrating beyond the solar system, and contact with ‘Higher Intelligence.’ Gene Roddenberry, creator of Star Trek, managed to combine both contact and communication through his receiving of channelled information. It has been documented that Roddenberry was introduced, through the channel medium Phyllis Schlemmer, to an entity called ‘Tom’ who represented the Council of Nine. Roddenberry was allegedly receiving information for a film script that would help prepare the public for extraterrestrial contact. The film never got made, yet we might wonder what ideas made their way into Star Trek (including Star Trek: Deep Space Nine). It appears that there are those ‘out there’ who are concerned for our proper preparation for the sacred communion. And the archetypes are now flooding through our popular culture like an evangelical tsunami.

The mythic archetypes from Joseph Campbell’s The Hero with a Thousand Faces filled out the roles in George Lucas’s epic Star Wars universe. The good, the bad, and the hairy all took their cue and played along with the hero’s journey for an updated mythological rendering. Whilst the rise of industrial modernity and the secularization of culture may have contributed to an eroding of our myth-consciousness and a demotion of mystery, a new vital force has emerged that is shifting our planetary pranic energy. There may be those who bemoan that our current civilization does not have a mythic centre, yet they’re missing the point. And this point is that there is no exact point anymore. As hermetic lore states, the center is everywhere and the circumference nowhere. The earlier gods retreated on their sky chariots until we finally arrived at the point where we asked ourselves where all the gods went. The new sacred guides are now secreted in our popular texts that penetrate the outer and inner worlds. These post-historic mythic guides are first to be found within us – within our collective species psyche that gets projected out onto our celluloid and digital-scapes. These re-modelled chameleon mythic memes are telling us that we are not here alone, nor are we here for ourselves alone. The future is both arriving, is here now, and has already been.

We have such films as Back to the Future, Primer, Looper, Terminator, Interstellar, and all the rest to attest to our obsession with shifting our perspectives on time. Everything is now malleable, according to our new quantum sciences, and our sacred revival is knocking down linear walls of rigidity. Just when you thought that you were safe in stable comfort zones, the paranormal is getting ready to redress itself as the new normal. A Gnostic-like awareness of being embedded in a reality-construct will become ever greater as our technologies increasingly broker and interface our physical experience. There is a plentiful array of fictions and films that ply us with plots on technologically-driven machine gnosis. Perhaps they are trying to signal that we are entering the sacred space of hybrid awareness. The film Transcendence (2014), for example, showed humanity edging toward sacred sentience as a means for solving the world’s global problems. As Václav Havel stated in one of his addresses – ‘Transcendence is the only real alternative to extinction’ (July 4th, 1994). Yet we are not on our way out, despite what the fear-mongering mainstream media may be trying to ram down our throats. Nor are we heading toward a techno-machine Overlord future with us as the slaves. Because the sacred works in multiple streams and never stakes everything on a one-trick pony.

The game changer coming onto the scene is the participatory mind of human consciousness. The coming space migration is a reflection of our expanding inner spaces. We are toying with these memes in our popular culture now ahead of their coming actualization. What our fictions are dealing with are the blueprints, before we’re ready to go the whole hog. And that’s why we’re in a period of incredible experimentation – we are juggling a new type of energy coming into our cultural realities. And this new pranic force is getting expressed in a myriad of forms; be it creatively, chaotically, commercially, or crazily. It’s a cacophony of exuberance and experimentation trying to find its harmonic resonance. We are gaming, bopping, and trailblazing our way into a re-identification with a sacred energy. There’s a strong sense of the sacred filtering through our modern cultural memes, and it’s not all as chaotic as it seems.

 

References

1 Sabini, Meredith (ed.) (2008) C.G. Jung on Nature, Technology & Modern Life. Berkeley, CA: North Atlantic Books, p. 98.

2 Thompson, William Irwin (1998) Coming Into Being: Artifacts and Texts in the Evolution of Consciousness. New York: St. Martin’s Griffin, p. 218.

3 Thompson, William Irwin (1998) Coming Into Being: Artifacts and Texts in the Evolution of Consciousness. New York: St. Martin’s Griffin, p. 223.

4 Cited in Hollis, James (1995) Tracking the Gods: The Place of Myth in Modern Life. Toronto: Inner City Books, p. 147.

5 Kripal, Jeffrey J. (2011) Mutants & Mystics: Science Fiction, Superhero Comics, and the Paranormal. Chicago: The University of Chicago Press, pp. 217–18.

6 Mack, John E. (1999) Passport to the Cosmos: Human Transformation & Alien Encounters. New York: Crown Publishers, p. 218.

7 Mack, John E. (1999) Passport to the Cosmos: Human Transformation & Alien Encounters. New York: Crown Publishers, p. 136.

Weaponized Hyperreality: Social Engineering Through Corporate State Propaganda and Religion


By Luther Blissett and J. F. Sebastian of Arkesoul

Perhaps no philosophical concept more aptly describes the current cultural milieu than hyperreality, characterized by Wikipedia as “an inability of consciousness to distinguish reality from a simulation of reality, especially in technologically advanced postmodern societies.” The predominance of hyperreality comes at a time when people in power have never had more to conceal, distort, and distract the population from, while there have never been more people with more means and motives to stay distracted. This is evident in many aspects of contemporary life, from corporate news narratives shaped by sponsors and “official sources”, increasingly absurd denials of the true state of the economy from (mis)leaders, widespread dependence on pharmaceuticals worsened by direct-to-consumer advertising and a sham drug war, and fanatical worship of celebrities, to slavish acquiescence to fads and fashion. But most obvious is the increasing amount of time spent in front of screens, whether for work, shopping, social media, education, self-expression, games, web content, or the exponentially growing volume of video entertainment. Though video games and web series are catching up, the primary narrative formats for cultural expression and transmission today are still television and film.

Struggling to retain their cultural and economic status in the face of increased competition while appeasing the shareholders of their monolithic multinational corporate owners, large film and television studios are increasingly risk-averse. This is glaringly apparent in the output of major studios, which is for the most part the media equivalent of comfort food: familiar (formulaic), satisfying (crowd-pleasing), full of empty calories (lacking intellectual/emotional complexity or challenging ideas), and generally bland in terms of content and presentation. On television this commonly shows up as clichéd tropes, characters, and situations, while films are now more than ever driven by CGI-enhanced spectacle. Both rely on repeating what has worked in the past and (for viewers of a certain age) appealing to nostalgia while pandering to current cultural trends.

Of course such strategies overlap, as there are more than a few television programs that offer Hollywood-style spectacle and big-budget movies that imitate successful formulas in the form of adaptations, sequels, prequels, reboots, spin-offs, and mockbusters. In fact the majority of Hollywood’s summer blockbuster output now consists of such derivative and safe content, predominantly in the form of fantasy and science fiction films.

The ideological motives and functions of cinema and other pop culture are manifold, but a major one is the control and influence of mass audiences. We now know the US government has been doing this at least since the 1950s. According to a Church Committee investigation detailing Operation Mockingbird in 1976:

“The CIA currently maintains a network of several hundred foreign individuals around the world who provide intelligence for the CIA and at times attempt to influence opinion through the use of covert propaganda. These individuals provide the CIA with direct access to a large number of newspapers and periodicals, scores of press services and news agencies, radio and television stations, commercial book publishers, and other foreign media outlets.”

More recently, in 1991 the Task Force Report on Greater CIA Openness revealed the CIA “now has relationships with reporters from every major wire service, newspaper, news weekly, and television network in the nation,” which enables them to “turn some ‘intelligence failure’ stories into ‘intelligence success’ stories, and has contributed to the accuracy of countless others.” It also revealed that the CIA has “persuaded reporters to postpone, change, hold, or even scrap stories that could have adversely affected national security interests…” (Global Research, Lights, Camera… Covert Action: The Deep Politics of Hollywood)

Government influence over culture factories such as Hollywood, through covert infiltration or embedded advisors, ensures that the end product reflects the values and behaviors they wish to promote (i.e., xenophobia, deference to authority, nationalism, parochialism, narcissism, anti-intellectualism, consumerism, rapaciousness, etc.). In some cases, most notably Zero Dark Thirty and United 93, the goal is to cement an official narrative into the collective consciousness. A more sophisticated method of social engineering via Hollywood is predictive programming: presenting through media the societal changes to be implemented by leaders, in order to gradually condition the public and reduce resistance to such changes.

Manipulation of public sentiment through mass media also makes sense from a purely corporate perspective. Why wouldn’t media owners gear the ideological content of their products to support the systems they benefit from while screening out more critical messages? Occasional subversive content may get past the gatekeepers if it’s immediately profitable (which it sometimes can be if particularly resonant), if it can be co-opted in some way that serves the status quo, or if the creative minds behind it are particularly lucky, talented, and/or well connected. Regardless, one could argue that uncritical media consumption is a form of pacification through distraction and escapism, and that all corporate media content is the result of calculating the highest return on investment, which more often than not reflects the culture’s most deeply ingrained values and myths.

This is particularly true for fantasy/sci-fi films, which have become ubiquitous for a number of reasons, including the cultural tastes of global demographics, aesthetic trends (e.g., hyperreal CG effects for ever more spectacular imagery), the impact of changing media technology on the economics of production and distribution, growing awareness of the value of properties belonging to rich fictional universes which can be mined by worldbuilding studio screenwriters, and, in many cases, resonance with our increasingly dystopian world. Most fundamental is profitability, especially as SFX technology becomes more advanced and affordable, licensing opportunities increase, and film franchises that come with large and passionate built-in fan bases reduce the need for marketing and practically sell themselves.

Many who grow up immersed in geek culture already have a hyperreal relationship with fantasy and science fiction realms, which heightens the nostalgia evoked by the stream of multimedia incarnations and product tie-ins (bolstered by cult-like fan communities). Is it any surprise that fans who’ve extrapolated on the “Jedi” concept from the Star Wars films turned it into a religion? The Jedi cosmology (like similar ones from countless sci-fi/fantasy films) is modeled on mysticism, a philosophical framework which could fill a void for spiritually deprived materialist cultures. For many people, comic book fandom is another safe and entertaining way to explore concepts that might otherwise be too “out there” (perhaps especially among those who share an equally strong interest in materialist science). At the same time, because of the influence of marketing, the greater role of technology in society, and changing cultural trends, geek culture has become a larger part of mainstream culture. Combined with celebrity worship, the lure of technology (both on-screen and off), and the seemingly omnipotent powers of multinational corporations, modern big-budget sci-fi/fantasy films represent a confluence of potent socioreligious crosscurrents.

Recent works such as Christopher Knowles’s Our Gods Wear Spandex and Grant Morrison’s Supergods examine superheroes as modern mythological archetypes. Bill Moyers’s The Power of Myth explored how Joseph Campbell’s theory of the monomyth (or hero’s journey) influenced and shaped the Star Wars films (which have themselves influenced myriad blockbusters since). In The Hero With a Thousand Faces, Campbell identified a story template used in almost all pre-modern cultures across the globe, which goes something like this:

A reluctant “chosen one” in an ordinary world receives a call to adventure and warning of a danger that must be confronted. With the training and wisdom of a mentor the hero crosses the threshold into the unknown. Companions acquired along the way assist in overcoming a series of challenges and temptations until reaching the depth of their fears and resultant apotheosis or rebirth. This empowers them to achieve their goal and return triumphant to an admiring family/community/nation etc.

It’s not hard to see the attraction of narratives such as this, which tap into primal emotional needs and can be found in a wide range of religious stories, such as the lives of the Buddha, Christ, Muhammad, and Rama, among others. The template also serves as a metaphor for spiritual and psychological journeys through life.

In a recent post on his Secret Sun blog, Christopher Knowles states “Myths grow out of times of crisis and upheaval, in one way or another. The current vogue for superheroes is a symptom of the powerlessness felt by a populace under assault by the realities of Globalist social engineering, war-making and economic redundancy.” I would add that myths can also be exploited to function as part of a cultural assault to perpetuate Globalist agendas. Authoritarians are all too eager to depict themselves as monomythic demigod saviors and/or those serving them as self-sacrificing rugged individualist heroes fulfilling their grand destinies.

In the same piece, Knowles concludes: “But myths do die. They aren’t immortal. The next war or wars may in fact sweep away the myths of the 20th Century entirely. The wars may send people reaching back to far older myths as civil wars can rekindle the bonfires of identity, sending people back to the myths of ancestors. This has always emerged in times of close conflict, particularly in conflicts seen as struggles against occupying powers.”

If he’s correct, there may be some hope for our culture to reclaim myths as a means of understanding reality rather than letting them serve as a trapdoor into fabricated hyperreality. The problem is that there is a gap between reality and hyperreality that needs “filling in.” One of the many consequences of postmodernism is the complete blurring of the line between what is real and what is not. A sort of apathy has set in within the human psyche, given that crushing truths, not easily discernible in the past, are all out there in the raw. Religious and scientific truths once held sacred can be easily discarded. Morality is a rare hobby in a generation both cynical and powerless to discern reality. This is due in part to globalism and technology, which serve as hubs for retrieving information that was not previously available. Humanity has developed thicker skin, while at the same time widening the existential void left by a reality that is less and less objectifiable. Opinion makers are everywhere, information is ubiquitous, and the species is obsessed with being entertained while answers are readily manufactured in the shape of capital fetishes, all the while ideologies that purportedly called for a “better and different” world, such as Marxism and psychoanalysis, have become both a haunting spectre and an empty promise.

In the past these formulae failed, and in the future they seem ever more unlikely. Capitalism has adapted itself to revolutionary ideology. It has generated even more power from it, defusing the motivation for change and twisting the definition of revolution, all the while turning such concepts into brands. The irony. There are calls for a “new objectivism,” however: a bet on a system reboot in which categorical truths can be retrieved and argued from. The analogy is this: keep what works and dismiss what doesn’t. It sounds like a simple and logical plan. The problem is that those who get to define what works and what doesn’t will be the powerful, uncanny minority. This is their game, and we have cynically accepted it. It is the way it is. Unless we can evolve from reality to hyperreality, and from hyperreality back into reality, as a species that learns, adapts, understands how high the stakes really are, and moves forward as a collective that is conscious of and responsible for its flaws, it appears we are doomed. Three scenarios: in the first, the narrative continues as is: the majority remains repressed and perpetually seeks the escape offered by the hand that feeds, until lost completely in hyperreality. Technology moves forward, religion condenses into inconvenient myth: we completely “plug in.” Then what? Well, you just have to see Her to see into this future. In the second, war extinguishes civilization and winds back the evolutionary clock (think Mad Max) until we reach the first scenario again, as if in a loop. The third and bleakest: nuclear war. The species ends.

What we learn from this exercise is that we are at the apex. This is it. The crushing truth of existence is firmly on our shoulders. War is unravelling. An ever-shrinking number of brands dominates the world, and an even smaller number of people calls the shots. Between reality and hyperreality there is confusion; there is no longer a basis for discerning between the two. We are, as it were, lost. We need to fill in this gap. We need to dig deeper than ever before for a reason to dissolve our differences. Somehow the dilemma is set: surrender or die. But the crux of the problem can be overridden if we use the knowledge and tools we have to fight for a better and more responsible alternative.