RAD, MAD & BAD: The Analog Rebellion of Craig Baldwin and Other Cinema

By Andy Prisbylla

Source: We Are The Mutants

In our current technocratic society, it’s incredibly rare to meet someone who is genuinely free. The erosion of the Consent Decrees of 1948—which had barred the major studios from owning and controlling movie theaters—drastically altered the landscape of film and video production, further destabilizing an already unlevel playing field between corporate interests, content creators, and consumers. The trickle-down economics Reagan touted in his 1981 tax act proved only to favor the affluent, further alienating independent creators who were frozen out of a livelihood through economic blacklisting, a perpetual attack that continues to this day. Bill Clinton’s elimination of the fin-syn rules that required television networks to source 35% of their content from independent producers only helped to continue this trend into the new millennium, and soon the mainstream movie and TV-consuming public was offered a slate of hegemonic programming supplied under monopoly rule.

With traditional avenues of information exchange becoming more restricted, pockets of transgressive media resistance—inspired by the countercultural film and video collectives of the ‘60s and ‘70s—gained 501(c)(3) non-profit status in 1980s America. These artist-run community organizations championed alternative educational perspectives on media literacy and performance, such as Artists’ Television Access in the Mission District of San Francisco, California. Operating under the umbrella of this community space exists a cinematic collective with a subversive trajectory: a film screening series and analog archive curated from the margins of mainstream media and acceptable art practice. Under the stewardship of underground filmmaker and curator Craig Baldwin, Other Cinema stands as the vanguard of Baldwin’s personal artistic conviction—what he calls “cinema povera,” an anti-capitalist filmmaking creed where artists only use the materials at their disposal to create art. Combine this practice with an ethos of media archeology and mixed-media collage that predates our current remix culture, and what’s generated is an exhibition calendar of the modern avant-garde—a thirty-six-week screening schedule projecting experimental film and video to the masses. Every Saturday night, cartoons, B movies, and commercials hold equal ground with industrial, educational, documentary, personal essay, and public domain/orphan films, bringing together numerous artists and filmmakers from around the world under one cinematic ceiling for close to 40 years.

Specific details surrounding the origins of Other Cinema are hard to quantify, partially due to the vastly prolific yet oddly cryptic career of founder Craig Baldwin. Born into a self-admitted 1950s middle-class existence in the Sacramento suburb of Carmichael, California, Baldwin spent his teenage years nurturing a ravenous curiosity for subversive cultures and media. During high school, he was often at the local Towne Theater enjoying the latest midnight show of underground programming, absorbing the cinematic combustion of the ‘60s experimental scene led by filmmakers like George Kuchar and Bruce Conner—the latter of whom, as a teacher, would later kick Baldwin out of his film class at San Francisco State University. In college Baldwin also indulged in subterranean films such as Peter Watkins’s 1966 pseudo-doc The War Game and exploitation flicks like Paul Bartel’s 1975 sci-fi dystopian romp Death Race 2000. Forming a fascination with film exhibition, Baldwin worked as a projectionist at several movie houses throughout the city, navigating the film worlds through an eclecticism of arthouse, exploitation, pornography, and political activism—including contributing programming and film services to El Salvador Film and Video Projects for the Salvadoran solidarity movement of northern California. His early activism with the artistic political action collective the Urban Rats saw him and his cohorts reclaim San Francisco’s urban landscape through adverse possession or “squatter’s rights,” which allowed Baldwin to experiment with expanded cinema performance, such as projecting film in abandoned buildings and other derelict dwellings.

This direct approach towards genre and social action speaks to Baldwin’s personal opposition towards the status quo, an attitude that not only informs Other Cinema’s motion picture programming but also Baldwin’s own filmmaking forays. His early experiments with Super 8 film—such as the prototypical culture jam/situationist prank Stolen Movie—bled into his 16mm attacks on advertising, consumerism, and colonialism in Wild Gunman and RocketKitKongoKit before gaining maximum velocity with his Dexedrine-driven, countermyth conspiracy report Tribulation 99. These early works—which made up the pure found footage/collage aesthetic of his filmography until he introduced live-action performance into the mix with ¡O No Coronado!, Spectres of the Spectrum, and Mock Up on Mu—draw heavily from Baldwin’s now massive archive of analog film. Housed beneath the Artists’ Television Access property, this subterranean scroll of marginalized media is continuously rescued from the bowels of civilization’s ever evolving technological burden and given new purpose. The shift from film in the 1970s to magnetic tape in the 1980s saw major institutions overhaul their audio/visual collections in favor of more economical video formats, sending reels of celluloid to the dumpsters and landfills. Much like the Dadaists of the early 20th century avant-garde, whose use of appropriation and photomontage expressed anti-bourgeois protest through their art, Baldwin and company salvaged these bastardized works from material obscurity and celebrated their ephemeral nature through collage and remix. These hybrid works of the late 20th century serve as precursors to many of our current 21st century new media innovations, resulting in the continued radicalization of modern artistic folklore, such as the mashups and supercuts of Everything Is Terrible! and the radical anti-authoritarian statements of the sister collective Soda Jerk.

Baldwin and Other Cinema’s defense of the diminished and discarded extends not only to the physical media he interacts with but to the audience he exhibits for. Maintaining a dialectical attitude, Baldwin expresses both respect and disrespect towards film genre and classification by spinning one off the other and forming new categories. Each screening is meant to give equal weight to diverse voices and provoke participation amongst attendees—an ethos Baldwin codified with his underground screening series The RAD, The MAD & The BAD while programming film events for Artists’ Television Access during the organization’s formative years. A protean yet practical genre grouping system, it sorted films into three major categories stripped of pretense and soaked in punk rock colloquialism; each selection was designated its own time slot on Wednesdays and Saturdays, with the films represented creating a continuity across each section:

The RAD: showcasing political and social action films 

The MAD: mad genius or auteur cinema

The BAD: psychotronic themes of horror/sci-fi/exploitation

Defying the unspoken restraint behind many traditional classification systems that favor a false high-brow aesthetic over an honest low-brow sensibility, The RAD, The MAD & The BAD crossed the cultural demarcation line with an egalitarian stance towards genre representation, allowing for serious discussion about what constitutes a film’s importance and its commodification within society. More importantly, it displayed through example that poor production doesn’t always mean poor quality, and that films created on the margins of capital contain a certain cultural influence and accessibility that corporate-backed productions can only hope to afford or inspire.

The authentic response audiences gave towards the weekly film schedule at Artists’ Television Access saw the prestigious San Francisco Cinematheque approach Baldwin to bring his street sensibility to their precocious crowd with Sub-Cinema, a RAD, MAD & BAD-inspired program that ran over the course of 1985. The creation of other pop-up programs like Anti-Films and Eyes of Hell inspired Baldwin to consolidate his film selections under his own programming umbrella, and soon the ethos that fueled The RAD, The MAD & The BAD manifested itself in the physical embodiment of Baldwin’s own psyche and practice with the foundation of Other Cinema.

If the RAD, MAD & BAD helped bring acceptance to the concept of marginalization in film selection and exhibition during the 1980s, the programming behind Other Cinema built upon this provocation by introducing new alternative voices from the microcinema scene of the 1990s. At the forefront of this new cinematic experience, Other Cinema became a home for a subculture of film using and reusing old and new technologies to create future underground works, with filmmakers and exhibitors from across the country like Sam Green, Martha Colburn, Greta Snider, Bill Daniel, Orgone Cinema, 3Ton Cinema, and “others” coalescing around this space like the children of Hamelin to the Pied Piper’s whimsical flute. Many of these groups and individuals appear in Baldwin’s upcoming career monograph Avant to Live!, a 500-page treatise detailing his cinematic trajectory in the media arts.

The decline of physical media coupled with our perpetual progression towards a digital state continues to divide us, with some championing the virtual realm and its democratization of new technologies and others questioning its effect on the human experience and how we interact with each other. The popularity of streaming services continues to challenge the economic longevity of physical media, forcing film formats into a wave of obsolescence. Despite this, Craig Baldwin and Other Cinema rise against the tide with an analog assault of expanded cinema every Saturday night. Filmmakers on the fringe and maverick media archeologists with an overwhelming responsibility to film history, yet hamstrung by a lack of resources, congregate at Other Cinema to embrace the struggle in an ever evolving motion picture renaissance. It’s a form of masochism one needs to make it on this side of the art world—the “masochism of the margins,” as Baldwin often says. It takes pain and sacrifice to live here, yet the psychic rewards far outweigh the material loss. 

Craig Baldwin: Avant to Live! is a collaborative project between San Francisco Cinematheque and INCITE: Journal of Experimental Media and was released on May 30, 2023.

Andy Prisbylla is an underground filmmaker and exhibitor from the rust belt apocalypse of Steel City, PA. His screening series SUBCINEMA showcases subterranean movies and art through digital programming and live pop-up events. Find out more through Letterboxd and Mastodon.

Saturday Matinee: Kurt Vonnegut: Unstuck in Time

By Matt Zoller Seitz

Source: RogerEbert.com

“Kurt Vonnegut: Unstuck in Time” is messy in the way that wakes for dear friends are messy.

Some speakers go on too long, and there are others that you may wish you’d heard from at greater length, or at all. And the raw sentiment coursing through every moment of the affair, however heartfelt, can be overwhelming, especially if you didn’t know the deceased as well as the folks memorializing him. 

The deceased here is Kurt Vonnegut, and the person who planned, executed and hosted this cinematic wake, director Robert B. Weide (a veteran documentarian and an Emmy-winning director for “Curb Your Enthusiasm”), was a friend of Vonnegut’s throughout the final 25 years of his life. This movie, co-directed by Don Argott, runs over two hours. Thematic and structural ideas are introduced, nurtured, forgotten, then reintroduced awkwardly. Weide himself is a major character—as well he should be, considering that Vonnegut essentially made Weide his personal archivist, sending him letters and manuscripts and faxes and video and audio tapes, and this film is as much a portrait of a friendship as it is the warts-and-all record of a great writer’s life—but sometimes the proportions feel off. When Weide disappears for long stretches, I don’t know that it’s exactly a slam to say that you don’t miss him, because people are mainly here for Vonnegut, one of the most important American writers of the 20th century, and a fount of charisma even at his lowest depths of sour narcissism in the 1970s. 

Vonnegut fans know that he specialized in slim, nimbly written books, with short chapters and short paragraphs that jumped wherever Vonnegut’s consciousness happened to take him. “Unstuck in Time” lets us know that it is consciously modeling its structure on Vonnegut’s writing, in particular his widest-read work, the nonlinear novel/memoir “Slaughterhouse Five,” from whence the documentary’s subtitle is drawn (“Billy Pilgrim came unstuck in time,” it starts); and to a lesser extent, Vonnegut’s late-career bestseller “Timequake,” a fragmented, self-aware book that is partly about the difficulty of writing “Timequake.” 

There are also cinematic allusions to Vonnegut’s literary alter-ego, Kilgore Trout, in the way that Weide and Argott and three credited film editors weave together the relationship between Vonnegut and Weide. Weide first meets Vonnegut in 1982 at age 23 after writing him a fan letter inquiring about the possibility of making a documentary about his life, and he holds onto that youthful starstruck quality even in reminiscences shot long after Vonnegut’s death in 2007. Over time, the pupil gains the master’s respect, to the point where Weide writes and coproduces a feature-length adaptation of Vonnegut’s novel “Mother Night,” starring Nick Nolte and directed by actor-filmmaker Keith Gordon, who as luck would have it played Rodney Dangerfield’s son in “Back to School,” a comedy in which Vonnegut played himself.

This may all sound as if it’s articulated more cleanly and effectively than it is. The filmmakers have committed simultaneously and with equal enthusiasm to a couple of filmmaking approaches that are at odds. One is the detached, clinical-mathematical, unsentimental, science-fictional, time-tripping biography, a la “Slaughterhouse Five” and “Timequake,” represented here by inventive cutting from image to image and idea to idea, sometimes lingering on signifiers of creative self-awareness. These include closeups of the timeline on an editor’s computer screen, montages of Vonnegut doing or saying the same thing in different decades of his life, snippets of films based on Vonnegut’s writing, and animated sequences modeled on Vonnegut’s drawings, which were as distinctive as his prose.

The other approach is more straightforward: Weide and Argott are making a straightforward PBS-style documentary about an artist’s life, supervised by a director and fan who knew him intimately, and that draws on footage ranging from childhood through old age. The latter might jump around in time in terms of the years in which it was created, but it ultimately tells Vonnegut’s story in a far more conventional way than the movie promises to do in its opening minutes.

This is fine; in fact it’s more than fine, because as Vonnegut and various experts on his work point out, Vonnegut remains readable and relevant in large part because he expressed himself in a direct way, drawing upon what’s described here as a journalistic writing style. Correspondingly, the most moving scenes and moments in “Unstuck in Time” are unmannered accounts of events. These range in emotional character from elating (Vonnegut’s commercial and critical success with “Slaughterhouse Five” after years of financial struggle) to vexing (after that success, he left his first wife, Jane, who’d been by his side during the lean years, moved to Manhattan, and married his mistress) to tragic (Vonnegut’s brother-in-law dying in a train wreck just two days before Vonnegut’s older sister died of cancer) to inspirational (Vonnegut unhesitatingly raising his late sister’s four sons alongside the three kids he had with Jane).

All of this material is fascinating, and articulated in vivid detail thanks to Weide’s trove of material. There are closeups of typewritten revisions of Vonnegut classics, each alteration indicated in pencil or pen, and letters and answering machine messages covering every imaginable life event. The filmmakers lay it all out so elegantly that whenever the movie seems to forget that it’s also about Weide and suddenly interrupts the flow to insert a reference to one of Weide’s own milestone events (such as his wife’s own battle with a debilitating illness and his Emmy win for “Curb,” which seems to be in there so that he can include Vonnegut’s answering machine message congratulating him), it’s awkward because Weide is clearly still grieving, too, and the viewer is torn between wanting to bear witness to Weide’s miseries and triumphs and wanting him to get back to Kurt Vonnegut as quickly as possible.

There is, nevertheless, something to be said for a documentary that tries to do something different and perhaps impossible, even if it doesn’t quite get there. And in the end, any flaws or missed opportunities are subsumed by the movie’s sincerity and wealth of insight. Its analysis of the role that Vonnegut’s World War II experience played in his demeanor as well as his fiction is fascinating and on-point, and the editors bring it all back at the end when Vonnegut, outraged by the second Bush administration’s invasion of Iraq and weaponizing of patriotism, writes a series of columns for “In These Times” magazine that will ultimately be collected in 2005’s “A Man Without a Country,” arguably his last major work.

Weide himself comes across as a sardonic and compassionate witness and guide, often taking the piss out of his own reverence for Vonnegut just when things threaten to get a bit moist. The devotion he displays towards Vonnegut throughout the second half of the writer’s life is as inspiring as Vonnegut’s own high points as a human being. We should all be lucky enough to have a friend who will tell our story.

____________________

Watch Kurt Vonnegut: Unstuck in Time on Kanopy here: https://www.kanopy.com/en/product/13799154

Chance Encounters as the Walls Close In

By Edward Curtin

Source: Behind the Curtain

“A treasure stumbled upon, suddenly; not gradually accumulated, by adding one to one. The accumulation of learning, ‘adding to the sum-total of human knowledge’; lay that burden down, that baggage, that impediment. Take nothing for your journey; travel light.”   – Norman O. Brown, Love’s Body

These are “heavy” times, colloquially speaking.  Forebodings everywhere.  Everything broken.  People on edge, nervous, filled with anxiety about they know not what since it seems to be everything. The economy, politics, elections, endless propaganda, the war in Ukraine, censorship, the environment, nuclear war, Covid/vaccines, a massive world-wide collapse, the death of democratic possibilities, the loss of all innocence as a very weird and dangerous future creeps upon us, etc. Only the most anesthetized don’t feel it.

The anxiety has increased even as access to staggering amounts of knowledge – and falsehoods – has become available with the click of a button into the digital encyclopedia.  The CIA’s MK-Ultra mind control program has gone digital.  The more information, the more insubstantial the world seems, but it is not an insubstantiality that connects to hope or faith but to despair.  Across the world people are holding their breath.  What’s next?

Roberto Calasso, the late great Italian writer, wrote that we live in “the unnamable present,” which seems accurate.  Information technology, with its easily available marriage of accurate and fraudulent information, affects people at the fathomless depths of the mind and spirit.  Yet it is taken for granted that the availability of such technological information, as well as the ease with which one can add one’s two cents to it, is a good thing, even as those powerful deep-state forces that control the Internet pump out an endless stream of purposely dissembling and contradictory messages.  Delusions of omnipotence and chaos everywhere, but not in the service of humanity.  Such chaos plays in chords D and C – Depressing and Controlling.

In the midst of this unnamable present, all of us need to dream of beauty and liberation even as we temporarily rely on digital technology for news of the wider world.  For the local news we can step outside and walk and talk to people, but we can’t endlessly travel everywhere, so we rely on the Internet for reports from elsewhere.  Even as we exercise great effort to discern facts from fictions through digital’s magic emanations, we hunger for some deeper experiences than the ephemerality of this unnamable world.  Without them we are lost in a forest of abstractions.

While recently dawdling on a walk, I stopped to browse through tables of free books on the lawn of my local library.  I was looking for nothing but found something that startled me: a few descriptive words of a child’s experience.  I chanced to pick up an old (1942), small autobiography by the English historian, A. L. Rowse – A Cornish Childhood.  The flyleaf informed me that it was the story of his pre-World War I childhood in a little Cornish village in southwestern England.  The son of a china-clay worker and mother of very modest means, Rowse later went on to study at Oxford and became a well-known scholar and author of about a hundred books.  In other words, a man whose capacious mind was encyclopedic long before the Internet offered its wares of information about everything from A to Z.

Since my grandfather, the son of an Irish immigrant father and English mother, had spent his early years working in a bobbin factory in Bradford, England, a polluted mill town in the north, before sailing at age 11 from Liverpool to New York City aboard the Celtic with his four younger siblings sans parents, I had an interest in what life was like for poor children in England during that era.  How circumstances influenced them: two working-class boys, one who became an Oxford graduate and well-known author; the other who became a NYC policeman known only to family and friends.  The words Rowse wrote and I read echoed experiences that I had had when young; I wondered if my grandfather had experienced something similar.  Rowse writes this on pages 16-17 where I randomly opened the book:

A little group of thatched cottages in the middle of the village had a small orchard attached; and I remember well the peculiar purity of the blue sky seen through the white clusters of apple-blossom in spring. I remember being moon-struck looking at it one morning early on my way to school. It meant something for me; what I couldn’t say. It gave me an unease at heart, some reaching outwards toward perfection such as impels men into religion, some sense of the transcendence of things, of the fragility of our hold upon life . . . . I could not know then that it was an early taste of aesthetic sensation, a kind of revelation which has since become a secret touchstone of experience for me, an inner resource and consolation. . . . In time it became my creed – if that word can be used of a religion which has no dogma, no need of dogma; for which this ultimate aesthetic experience, this apprehension of the world and life as having value essentially in the moment of being apprehended qua beauty, I had no need of religion. . . . in that very moment it seemed that time stood still, that for a moment time was held up and one saw experience as through a rift across the flow of it, a shaft into the universe. But what gave such poignancy to the experience was that, in the very same moment that one felt time standing still, one knew at the back of the mind, or with another part of it, that it was moving inexorably on, carrying oneself and life with it. So that the acuity of the experience, the reason why it moved one so profoundly, was that at bottom it was a protest of the personality against the realization of its final extinction. Perhaps, therefore, it was bound up with, a reflex action from, the struggle for survival. I could get no further than that; and in fact have remained content with that.

I quote so many of Rowse’s words because they seem to contain two revelations that pertain to our current predicament. One a revelation that opens onto hope; the other a revelation of hopelessness. On the one hand, Rowse writes beautifully about how a patch of blue sky through apple blossoms (and his reading Wordsworth’s Intimations of Immortality) could open his heart and soul to deep aesthetic consolation.  Calasso, in discussing “absolute literature” and the Bhagavad Gita in Literature and the Gods, refers to this experience with the word ramaharsa or horripilation, the happiness of the hairs.  It is that feeling one has when one experiences a thrill so profound that a shiver goes down one’s spine and one experiences an epiphany.  Your hairs and other body parts stand up, whether it’s from a patch of blue, a certain spiritual or erotic/love encounter, or a line of poetry that takes your breath away.  Such a thrill often happens through a serendipitous stumbling.

For Rowse, the epiphany was bounded, like a beautiful bird with its wings clipped; it was an “aesthetic experience” that seemed to exclude something genuinely transcendent in the experiential and theological sense. Maybe it was more than that when he was young, but when this scholar described it in his 39th year, this intellectual could only say it was aesthetic.

C. S. Lewis, in the opening pages of The Abolition of Man, echoing Coleridge’s comment about two tourists at a waterfall, one who calls the waterfall pretty and the other who calls it sublime (Coleridge endorsing the latter and dismissing the former with disgust), writes, “The feelings which make a man call an object sublime are not sublime feelings but feelings of veneration.” In other words, the sublime nature of a patch of blue sky through apple blossoms in the early morn cannot be reduced to a person’s subjective feelings but is objectively true and a crack into the mystery of transcendence. To see it as a protest against one’s personal extinction and to be content to “get no further than that” is to foreclose the possibility that what the boy felt was not what the man thought; or to quote Wordsworth about what seems to have happened to Rowse: “Shades of the prison house begin to close/Upon the growing boy,” and that is that.

But we are even a longer way gone from when Rowse wrote his remembrances.  In our secular Internet age, first society and now its technology, not aesthetics or the religion of art, have replaced God for many people, who, like Rowse, have lost the ability to experience the divine.  It embarrasses them.  Something – an addiction to pseudo-knowledge? – blocks their willingness to be open to surpassing the reasoning mind.  We think we are too sophisticated to bend that low even when looking up. “The pseudomorphism between religion and society” has passed unobserved, as Calasso puts it:

It all came together not so much in Durkheim’s [French sociologist 1858-1917] claim that “the religious is the social,” but in the fact that suddenly such a claim sounded natural. What was left in the end was naked society, but invested now with all the powers inherited, or rather burgled, from religion. The twentieth century would see its triumph. The theology of society severed every tie, renounced all dependence, and flaunted the distinguishing feature: the tautological, the self-advertising. The power and impact of totalitarian regimes cannot be explained unless we accept that the very notion of society has appropriated an unprecedented power, one previously the preserve of religion. . . . Being anti-social would become the equivalent of sinning against the Holy Ghost. . . . Society became the subject above all subjects, for whose sake everything is justified.

For someone like Rowse, the Oxford scholar and bibliophile, writing in the midst of WW II about his childhood before WW I, an exquisite aesthetic explanation suffices to explain his experience, one that he concludes was perhaps part of an evolutionary reflex action connected to the struggle for survival.  Thus this epiphany of beauty is immured in sadness rather than opening out into possible hope.  Lovely as his description is, it is caged in inevitability, as if to say: Here is your bit of beauty on your way to dusty death.  It is a denial of freedom, of spiritual reality, of what Lewis refers to for brevity’s sake as ‘the Tao,’ what the Chinese have long meant as the great thing, the correspondence between the outer and the inner, a reality beyond causality and the controlling mind.

Now even beauty has been banned behind machine experiences.  But the question of beauty is secondary to the nature of reality and our connection to it.  The fate of the world depends upon it.  When the world is too much with us and doom and gloom are everywhere, where can we turn to find a way forward to find a place to stand to fight the evils of nuclear weapons, poverty, endless propaganda, and all the other assorted demons marauding through our world?

It will not be to machines or more information, for they are the essence of too-muchness.  It will not come from concepts or knowledge, which Nietzsche said made it possible to avoid pain.  I believe it will only come from what he suggested: “To make an experiment of one’s very life – this alone is freedom of the spirit, this then became for me my philosophy.”  And before you might think, “Look where it got him, stark raving mad,” let me briefly explain.  Nietzsche may seem like an odd choice to suggest as insightful when it comes to openness to a spiritual dimension to experience since he is usually but erroneously seen as someone who “killed God.”  Someone like Gandhi might seem more appropriate with his “experiments with truth.”  And of course Gandhi is very appropriate.  But so too are Emerson, Thoreau, Jung, and many others, at least in my limited sense of what I mean by experiment.  I mean experimenting-experiencing (both derived from the same Latin word, experiri, to try or test) by assuming through an act of faith or suspension of disbelief that if we stop trying to control everything and open ourselves to serendipitous stumbling, what may seem like simply beautiful aesthetic experiences may be apertures into a spiritual energy we were unaware of.  James W. Douglass explores this possibility in his tantalizing book, Lightning East to West: Jesus, Gandhi, and the Nuclear Age, when he asks and then explores this question: “Is there a spiritual reality, inconceivable to us today, which corresponds in history to the physical reality which Einstein discovered and which led to the atomic bomb?”

I like to think that my grandfather, although a man not very keen on things spiritual, might have, in his young years amidst the grime and fetid air of Bradford, chanced to look up and seen a patch of blue sky through the rising smoke and felt the “happiness of the hairs” that opened a crack in his reality to let the light in.

Roberto Calasso quotes this from Nietzsche:

That huge scaffolding and structure of concepts to which the man in need clings in order to save himself in the course of life is, for the liberated intellect, merely a support and a toy for his daring devices. And should he break it, he shuffles it around and ironically reassembles it once more, connecting what is least related and separating what is closest. By doing so he shows that those needful ploys are of no use to him and that he is no longer guided by concepts but by intuitions.

I have an intuition that there are hierophanies everywhere, treasures to be stumbled upon – by chance.  If we let them be.

My eyes already touch the sunny hill,
going far ahead of the road I have begun.
So we are grasped by what we cannot grasp;
It has its inner light, even from a distance –

And changes us, even if we do not reach it,
Into something else, which, hardly sensing it, we already are;
A gesture waves us on, answering our own wave. . .
But what we feel is the wind in our faces.

– Rainer Maria Rilke, “A Walk”

Did David Foster Wallace predict the future?

Our world is more dystopian than Infinite Jest

By Sarah Ditum

Source: UnHerd

Infinite Jest is frequently attention-repellent. David Foster Wallace’s brick-sized novel is physically challenging, an 800g book that forces you to flick back and forth to the endnotes. This is not optional. Major plot points hinge on throwaway glosses. 

I was a bratty, bookish 15-year-old when it was published in 1996. A 1,000-page-plus novel bloated with endnotes that have their own footnotes was an irresistible challenge. David Foster Wallace was not an obscurantist in his own literary taste — he taught Stephen King and Thomas Harris at Illinois State University — but Infinite Jest is a book at bloody-minded war with its own bookness. With its maddening excess of information that you must hold in your hand as best you can, it feels more like the internet.

As well as being attention-repellent, it is also sometimes just repellent. There are scenes of comedically extreme horror: a woman dying after the handbag that holds her artificial heart is snatched from her, a man dying in his own filth while obsessively watching reruns of M*A*S*H, a dog dragged behind a car until all that’s left is a leash, a collar and a “nubbin”. Before livestreamed mass shootings and animal cruelty for clicks, Wallace knew that the grisly and grotesque was what the public wanted.

He did not see the future. But he saw the forces shaping the future, and understood the ways they would deform people in turn. 

In an aside, Wallace writes about how, with the introduction of the “Teleputer” (what we would call a laptop), video calls enjoyed huge popularity, followed by dramatic decline. Users quickly discovered that being seen is enormously anxiety-inducing, partly because it means you must visibly be paying attention to the other party at all times, partly because you must also pay attention to how you look when making a call.

The answer to this anxiety is, first, “high definition masking” — a flattering composite of the user’s face digitally overlaid on the screen. Then comes actual masking — hyperreal rubber versions of the user’s face that can be quickly strapped on for calls. Eventually, in response to this “stressfully vain repulsion at their own videophonic appearance”, consumers revert to audio-only, which is now “culturally approved as a kind of chic integrity”. 

This divide between the real and the represented has been borne out by our experience of Zoom, Instagram and TikTok: filters are now so advanced that they can be applied to moving images, and you can digitally beautify yourself while livestreaming. Only instead of resorting to rubber masks, we remodel the flesh itself: “filter face” tweakments, intended to bring the human closer to the digital ideal, are on the rise. Wallace was right about the way pervasive exposure to our own image would break us. It’s just that the way we’ve responded is, somehow, even more dystopian than he imagined.

Infinite Jest’s near future is now our near past, and in 2008, Wallace killed himself after suffering decades of profound depression. By the middle of the next decade, his greatest novel had been recast as a byword for tedious white masculinity, the author himself cancelled. This was, at least in the biographical sense, deserved. In 1990, Wallace had met the poet Mary Karr. He was a resident in a halfway house, she was a volunteer, and he became obsessed with her. They dated, they broke up, then he assaulted and stalked her. In 2018, Karr tweeted that he had “tried to buy a gun. kicked me. climbed up the side of my house at night. followed my son age 5 home from school. had to change my number twice, and he still got it. months and months it went on.”

The novel includes multiple men in recovery steeping in the shame of their past violence, and it would be nice to imagine that this was Wallace examining his own conscience. On the other hand, it also includes a reciprocated love story between the large, lunkish, David-Foster-Wallace-ish character Don Gately, and the beautiful, idealised, Mary-Karr-ish Joelle van Dyne. Infinite Jest was, arguably, an implement of his ongoing harassment and should not be dishonestly mined for signs of redemption.

Still, it is a very contemporary thing to demand moral purity in artists: the kind of impulse that, perhaps, comes from seeking simplicity when far too much knowing is possible. “What do we do with the art of monstrous men?” asked Claire Dederer, as though to be an audience is inevitably to be an accomplice. Good art can be made by people who’ve done bad things, and perhaps only a monstrous man can faithfully portray the outlines of his own monstrosity. Reading is not an act of worship, although one of the problems for Infinite Jest is that certain male readers have treated it as such. 

And so, Infinite Jest has plummeted from literary touchstone to confirmed red flag. In a viral tweet from 2020 listing “Top 7 Warning Signs In a Man’s Bookshelf”, the first item was “A dog-eared copy of Infinite Jest”. The “dog-eared” was important: it was the act of having read it, rather than posing as someone who might read it, that sounded the klaxon.

But unread copies could be equally alarming: when the actor Jason Segel bought Infinite Jest in preparation for playing Wallace in a film, he recalled that the female bookseller rolled her eyes and said: “Every guy I’ve ever dated has an unread copy on his bookshelf.” Nicole Cliffe made it number four on her catalogue of “Books that Literally All White Men Own”. 

I have never run into a “DFW guy” — they’re probably more of an American campus thing. But I ran into the “Philip Roth guy” at university and recognise the type: clammy, proprietorial, forcing his literary taste on girls in lieu of forcing himself. That I had read Infinite Jest felt vaguely embarrassing. All that effort, and it turned out the most high-status option would have been to not read it and then be glibly dismissive. 

It’s perversely appropriate that Infinite Jest ended up holding such a key place in the vocabulary of this irony-bound strand of performative feminism, because irony was one of the things that Wallace was both appalled and fascinated by. In a 1993 essay, he writes that “irony and ridicule are entertaining and effective, and that at the same time they are agents of a great despair and stasis in US culture.”

Infinite Jest isn’t above irony, but it often pits itself against irony. “It’s like there’s some rule that real stuff can only get mentioned if everybody rolls their eyes or laughs in a way that isn’t happy,” thinks one character. Another feels an “aftertaste of shame after revealing passion of any belief and type when with Americans, as if he had made flatulence instead of had revealed belief” (the weird syntax is because this character is Quebecois). When sincerity is untenable, it becomes easier to engage with symbols than things. 

Over and again in the novel, the “real” gets displaced by the representation, like the rubber faces that can replace flesh ones on video calls. One of the centrepiece scenes of Infinite Jest features a geopolitical strategy game called Eschaton — a kind of Risk, but played by teenagers with balls and rackets to stand for missiles. The game comes violently undone when the players start hitting each other and the referee can’t work out how to distinguish between the territory and the map. As for the M*A*S*H-obsessive, “crucial distinctions had collapsed” between the fiction and the real.

And maybe this is connected to the novel’s weirdly well-informed interest in transsexuality. The gender ideology that makes front-page news now was a niche interest in the Nineties, confined mostly to academic papers and message boards for transitioners. Wallace’s inclusion of a young, effeminate, gay, “gender-dysphoric” character and a middle-aged, masculine, straight crossdresser suggests a hefty familiarity with the sexology literature long before any of this had crossed into the mainstream — it’s effectively a thumbnail sketch of the influential theory, developed by Ray Blanchard in the Eighties and Nineties, that male transsexuality divides into “two types”, the autogynephiliac and the homosexual.

But it also fits with the vision of an America where the signifiers that stand for “woman” hold more weight than the physical fact of femaleness. Gender as we experience it now — the idea of an “essence” or “true self” that renders the material body irrelevant — couldn’t have come to exist without the internet. Only when technology allowed people to present themselves as pure language, signifier unmoored from signified, did it become possible to believe that sex was malleable or unreal. Maybe transsexuality fascinated Wallace because he saw it as another way that humans confuse the symbol with the thing itself, the feminine with the female.

This summer, I started rereading Infinite Jest, mostly out of curiosity. It is, still, a very annoying book. But there’s something I didn’t understand about it in 1996 that I do now I’m older than Wallace was when he wrote it. He saw American culture as an exhausted force, trapped smirking in a hall of mirrors. And he saw that getting worse as screens extended their influence.

One of Wallace’s influences, Thomas Pynchon, wrote stories about the technology that made America possible: geographical surveys (Mason & Dixon), the postal service (The Crying of Lot 49). Infinite Jest is about the technology that could undo a state: a kind of entertainment so compelling that it turns consumers utterly away from reality. It asks whether the real, or something like it, might be worth recovering. 

It is, still, a difficult book — and difficult in new ways. The wheedling presence of my phone is competition that Infinite Jest never had to contend with the first time around. The disturbing fact of Wallace’s own bad acts, too, was not available to me in the Nineties, and even if it had been it probably wouldn’t have struck me as a problem for the novel. But the difficulty is, and always has been, the point. Of course Infinite Jest could be shorter, lighter, less infuriating. But if it’s heavy, it’s because it’s weighing you back down in the physical world.

Reality Tunnels: How to Control & Re-Program Your Mind

By Jack Fox-Williams

Source: Waking Times

When I was in secondary school, a teacher showed me an animated optical illusion in which a dancer appears to be spinning in one direction. I was adamant that the dancer was spinning clockwise, while my teacher insisted it was spinning counterclockwise. She then told me that you could change the direction of the dancer by focusing on the feet. I gazed with meditative fixation, and suddenly, to my amazement, the dancer started spinning counterclockwise! My teacher explained that since there are no visual cues for three-dimensional depth, your mind can determine what direction the dancer spins.

At that moment, I realised that reality is a construct of the mind, and we all potentially see the same world differently. I may have put it in less eloquent terms than that (considering I was only a teenager), but there was a fundamental shift in my understanding. The illusion made me realise that the notion of ‘objective truth’ was essentially arbitrary since our subjective beliefs mediate sensory experience.

My teacher and I could have argued for hours, days or weeks as to which direction the dancer was spinning; science couldn’t have proven either of us correct since it was a matter of perception rather than ‘truth’. In ‘reality’, the dancer was spinning in both directions, but since the brain has a natural tendency to classify, categorise and catalogue information in binary terms (up/down, left/right, black/white, clockwise/counterclockwise), the animated optical illusion appears monodirectional.

There are numerous examples of this in our day-to-day lives, as when we fail to appreciate other people’s viewpoints because we perceive the world differently. We believe we are right despite the multiple (if not infinite) possible interpretations of the nature of reality.

What are Reality Tunnels?

The countercultural guru Timothy Leary coined the term ‘reality-tunnel’ to describe our filtered perceptions of the world. Robert Anton Wilson later developed the concept to describe “pre-composed patterns of thinking which limit and distort the perception of reality by reducing complexity and options.”1 According to Wilson, reality-tunnels shape our phenomenological sense of self, editing out experiences that do not support our beliefs while focusing on those which do.2

An advocate for capitalism, for example, will gather facts to support the view that capitalism is the most effective socioeconomic model, discarding any information that runs contrary to this viewpoint. Similarly, a Marxist will construct arguments based on select information to support the view that communism is the best system, often neglecting evidence that contradicts their position.

As the psychedelic scholar Ido Hartogsohn states, “all of us harbour established ideas about minorities, religions, nationalities, the sexes, the right ways to think, act, feel, govern, eat, drink, and what not. Reality tunnels act to help us fortify these ideas against challenging information.”3 In this sense, there is a crossover between the concept of reality-tunnels and confirmation bias, the latter described as the “human tendency to notice and assign significance to observations that confirm existing beliefs while filtering out or rationalising away observations that do not fit with prior beliefs and expectations.”4 The phenomenon of confirmation bias helps explain why people who subscribe to a reality-tunnel remain oblivious to its limitations. Most people believe their worldview corresponds to the “one true objective reality”; however, Wilson emphasises that many reality tunnels are artistic creations, a culmination of biological, cultural and environmental inputs.5

The notion that reality is shaped by the conditions of the human mind is not new. The 18th-century German philosopher Immanuel Kant proposed in his Critique of Pure Reason that experience is based “on the perception of external objects and a priori knowledge.”6 We receive information about the external world through our five senses, which is then processed by the brain, allowing us to conceptualise its contents. When I look at an object, such as a chair or a table, I have no understanding of its external nature. The qualities that enable me to denote the meaning of the object, such as shape, colour, size etc., have no objective existence; they are merely by-products of the brain.

The French psychoanalyst and psychiatrist Jacques Lacan proposed his theory distinguishing between ‘The Real’ and the ‘Symbolic’. Lacan argued that ‘The Real’ is the “immanent unified reality which is mediated through symbols that allow it to be parsed into intelligible and differentiated segments.”7 However, the ‘Symbolic’, which is primarily subconscious, is “further abstracted into the imaginary (our actual beliefs and understandings of reality). These two orders ultimately shape how we come to understand reality.”8

The Harvard sociologist Talcott Parsons uses the word gloss to describe how our minds come to perceive reality. According to Parsons, we are taught how to “put the world” together by others who subscribe to a consensus reality based on shared beliefs, norms and associations.9 A gloss constitutes a total system of language and/or perception. For example, the word ‘house’ is a gloss since we lump together a series of isolated phenomena – floor, ceiling, window, lights, rugs, etc. – and turn it into a totality of meaning.

The author and anthropologist Carlos Castaneda commented on this notion, stating, “we have to be taught to put the world together in this way. A child reconnoitres the world with few preconceptions until he is taught to see things in a way that corresponds to the descriptions everybody agrees on. The world is an agreement. The system of glossing seems to be somewhat like walking… we learn we are subject to the syntax of language and the mode of perception it contains.”10

The French philosopher Jacques Derrida argued that objects (and the words which denote them) are only understood in relation to how they are contextually related to other objects (and their denoting words).11

We can break free from prescribed reality-tunnels by using objects and language in unusual or disjointed ways, thereby creating new discursive meanings, associations and connotations. This was the aim and outcome of certain art movements such as Dadaism and Surrealism, as well as Brion Gysin and William Burroughs’ cut-up method.12

The famous ethnobotanist and psychonaut Terence McKenna argued that ideology and culture are tools “which give other people control over one’s experience and identity since they lead individuals to shape their identity according to pre-conceived forms. If a person identifies with commercial brands or with popular ideas of what is beautiful, true or important, they give away their power to other people.”13 McKenna once said that you should not see “culture and ideology as your friends,” implying that you should understand reality on your own terms rather than buying into “pre-packaged ideological and cultural ideals” such as communism, capitalism, democracy or some form of totalitarianism.14 Belief in itself, argued McKenna, was “limiting to the individual, because every time you believe in something you are automatically precluded from believing its opposite. By believing something, you are virtually shutting yourself from all contradictory information, thus once again performing the sin of imposing a rigid simplified structure upon an infinitely complex reality.”15

Much like McKenna, Wilson recommends that a “fully functioning human ought to be aware of their reality tunnel and be able to keep it flexible enough to accommodate and, to some degree, empathise with different ‘game rules’, different cultures.”16 According to Wilson, constructivist thinking, which considers how social and cultural processes determine our perception of the world, constitutes an exercise in metacognition, enabling us to become aware of how reality tunnels are never truly objective, thereby decreasing the “chance that we will confuse our map of the world with the actual world.”17

How Your Reality Tunnel Is Formed

The constraints of human biology partially limit our models of reality. As Wilson states, our DNA “evolved from standard primate DNA and still has a 98% similarity to chimpanzee (and 85% similarity to the DNA of the South American Spider Monkey). We have the same gross anatomy as other primates, the same nervous system and the same sense organs. While our highly developed pre-frontal cortex enables us to perform ‘higher’, more complex mental tasks than other primates, our perceptions remain largely within the primate norm.”18

The neural apparatus produced by our genetic coding helps create what ethologists call the umwelt, or “world-field.” Birds, reptiles and insects occupy a separate umwelt or reality-tunnel to primates (ourselves included). For example, bees are able to perceive floral patterns in ultraviolet light, which we cannot (unless certain technologies are utilised). Canine, feline and primate reality-tunnels remain similar enough that friendship and communication can occur between these different species; a snake, by contrast, occupies such a different reality-tunnel that its behaviour appears entirely alien.

As Wilson argues, the belief that human umwelt reveals “reality” or “deep-reality” is as “naïve as the notion that a yardstick shows more reality than a voltmeter or that ‘my religion is better than your religion’. Neurogenetic chauvinism has no more scientific justification than national or sexual chauvinisms.”19 He goes so far as to suggest that “no animal, including the domesticated primate, can smugly assume the world created by its senses and brain equal in all respects the ‘real world’ or the ‘only real world’.”20

Reality-tunnels are also influenced by “imprint vulnerability,” periods in our lives when early childhood/adolescent experiences “bond neurons into reflex networks which remain for life.”21 The ethologists Konrad Lorenz and Nikolaas Tinbergen won a Nobel Prize in 1973 for their research into imprinting, which demonstrated that “the statistically normal snow-goose imprints its mother, as distinct from any other goose, shortly after birth. This imprint creates a ‘bond’ and the gosling attaches itself to the mother in every possible way.”22 These imprints can be imposed onto literally anything. Lorenz observed a case in which a gosling, in the absence of its mother, imprinted a ping-pong ball. It followed the ping-pong ball around and, on reaching adulthood, “attempted to mount the ball sexually.”23

Wilson estimates that the age at which we are imprinted with language determines lifelong programs of “cleverness” (verbal intelligence) and “dumbness” (verbal unintelligence), since linguistic models enable us to articulate mental processing, evaluate complex ideas and communicate with those around us.24 Furthermore, how and when our first sexual experiences are imprinted can “determine lifelong programs of heterosexuality, brash promiscuity or monogamy etc.”25 In more obscure imprints, such as celibacy, foot-fetishism and sadomasochism, the “bounded brain circuitry seems quite as mechanical as the imprint which bounded the gosling to the ping-pong ball.”26

These examples suggest that experiences during childhood, when the brain exhibits optimal ‘neuroplasticity’ (a term used to refer to malleability of neural networks in the brain), can shape our reality tunnels far into adulthood. As Sigmund Freud proposed, many “rational” thoughts and behaviours are typically the result of “repressed” memories, impulses and desires, which dwell in the murky depths of the unconscious mind.27

Furthermore, reality-tunnels are shaped by social conditioning, the “sociological process of training in a society to respond in a manner generally approved by the society in general and peer groups within society.”28 Manifestations of social conditioning are multifarious but include nationalism, education, employment, entertainment, popular culture, spirituality and family life. Unlike imprinting, which usually requires only one powerful experience to set permanently into the neural networks of the brain, conditioning requires “many repetitions of the same experience and does not set permanently.”29

The processes of social conditioning vary greatly, depending on the cultural environment to which one is exposed. For example, an individual born in a Muslim country (such as Saudi Arabia) will likely believe in the teachings of the Quran and adhere to certain religious norms, customs and traditions. However, individuals born in a Western capitalist/consumerist country, or an Eastern country with Hindu or Buddhist traditions, will adhere to different cultural and behavioural codes.

Reality-tunnels are also formed through the process of learning. Much like conditioning, learning requires repetition, but it also requires motivation. Therefore, it plays “less of a role in human perception and belief than genetics and imprinting and even less than conditioning does.”30 Learning marks a major difference between how mammals, reptiles, insects and birds perceive the world. For example, snakes share the same reality tunnel since they merely act on biologically determined reflexes, with only minor imprinted differences. Mammals show “more conditioned and learned differences in their reality tunnels.”31

Humans demonstrate a higher aptitude for learning due to our highly developed cortex and frontal lobes as well as our prolonged infancy. This variability functions as “the greatest evolutionary strength of the human race” since it enables us to pass down knowledge from one generation to the next. But it also means that we can become brainwashed and label other people who do not share our beliefs as “mad,” “anti-social,” or “blasphemous.” In fact, it could be said that the majority of all wars are the result of two (or more) opposing reality tunnels fighting for supremacy. This is particularly evident in the case of religious conflict, where people kill each other in the name of “God.”

Tunnel Vision: The Politicisation of Reality

The rise of “identity politics” in the 21st century perfectly demonstrates how reality-tunnels prevent us from considering alternative perspectives and viewpoints. During the last decade, the political domain has become increasingly polarised as the left and right engage in a battle for cultural supremacy. Such polarisation was apparent in the Brexit referendum of 2016, in which 51.9% of the British public voted to leave the European Union while 48.1% voted to remain.32 The narrow success of the ‘leave’ campaign highlighted the strong division between both sides of the political spectrum. Remainers stressed the importance of the Union in promoting social and economic stability, while Leavers emphasised the importance of national identity, sovereignty and independence.

The rhetoric employed by both the ‘leave’ and ‘remain’ campaign was so binary in its articulation that neither side engaged in meaningful dialogue; instead, the referendum became a series of baseless slogans, mottos and catchphrases – an advertising campaign designed to appeal to target demographics. The referendum was more about two separate reality-tunnels competing for ideological supremacy than a balanced analysis of benefits and risks.

The US presidential election of 2016 was a similar drama of competing reality-tunnels, shaped by masterful spin doctors and hidden persuaders who exploited modern advertising techniques to capture specific demographics based on class, age, sex, religion, geographical location and other criteria.

Donald Trump was well-known for his campaign slogan ‘Make America Great Again’ and other catchphrases that employed a lexicon of patriotism, populism and protectionism to appeal to those on the right. On the other hand, Hillary Clinton used the slogan ‘Stronger Together’ to evoke feelings of unity, compassion, and solidarity. The election became a battle between two contrasting reality-tunnels, grounded in meaningless rhetoric and hyperbole.

Another example of politicisation is the Covid-19 crisis, with the public divided into two camps – those who supported measures such as lockdowns versus those who rejected many of these same measures. Political polarisation, exacerbated and intensified by the ongoing divide across the media landscape, made a sane, balanced approach to the crisis all but impossible.

‘Echo Chambers’ & Identity Politics

Social media has fuelled identity politics by enabling groups and movements to generate an online presence and have real-world impacts. According to independent scholar and author Ilaria Bifarini, this results in the emergence of ‘echo chambers’ in which internet users “find information that validates their pre-existing opinions and activates confirmation bias.”33 This mechanism, says Bifarini, “strengthens one’s beliefs and radicalises them, without adding anything to information and knowledge. The result is the ideological extremism that we are observing today and in which we are taking part, where political debates have been replaced by supporters and verbal violence.”34

Google’s video-sharing platform YouTube, for example, uses algorithmic data to show users content similar to their prior engagements – content they are likely to engage with in the future – thus creating a feedback loop in which they are exposed to media reinforcing their political preferences.35 As media scholars Brooke E. Auxier and Jessica Vitak state, “many social media platforms structure their content-feeds based on what an algorithm determines to be the ‘top’ or most ‘relevant’ stories. While these tools may help users control their information and news environments – making consumption more manageable and mitigating information overload – it is possible that these tailoring tools will expose users to redundant information and singular viewpoints.”36
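The self-reinforcing dynamic described above can be made concrete with a toy simulation. This is a minimal sketch, not any platform’s actual recommendation algorithm: the two “camps,” the `simulate_feed` function, and all probabilities are illustrative assumptions. A feed that recommends content in proportion to past clicks, combined with a user who is only slightly likelier to click congenial content, tends to drift toward showing one camp almost exclusively:

```python
import random

random.seed(42)

def simulate_feed(rounds=500, lean=0.55):
    """Toy Polya-urn-style model of an engagement feedback loop.

    The feed recommends items from two 'camps' in proportion to the
    user's past clicks; the user is slightly likelier to click content
    matching their initial lean (camp A).
    """
    clicks = {"A": 1, "B": 1}  # smoothing so both camps start visible
    for _ in range(rounds):
        total = clicks["A"] + clicks["B"]
        # recommender surfaces camp A in proportion to past engagement
        shown = "A" if random.random() < clicks["A"] / total else "B"
        # user clicks congenial content with probability `lean`
        p_click = lean if shown == "A" else 1 - lean
        if random.random() < p_click:
            clicks[shown] += 1
    return clicks["A"] / (clicks["A"] + clicks["B"])

share = simulate_feed()
print(f"Share of engagement going to camp A after 500 rounds: {share:.2f}")
```

Even a mild initial preference is amplified by the loop, because every click tilts the next round of recommendations, which in turn harvests more clicks from the same camp – a crude analogue of the echo chamber Bifarini describes.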

Both sides of the political spectrum fail to engage in meaningful discussion when they are entrapped in a single reality-tunnel, the stability of which is threatened by competing narratives. Instead, political dialogue becomes characterised by inflammatory insults, name-calling and defamation.

Loaded language – such as ‘virtue signallers’, ‘snowflakes’, ‘racist’, ‘transphobic’, ‘Islamophobic’, ‘hetero-normative’, ‘privileged’ – enables identity groups to protect the integrity of their reality-tunnel by excluding those who hold a different opinion. In the same way that religious cult leaders isolate their members from the outside world, so too do identity groups orientate themselves around a closed belief system, which is immune to criticism, contention or challenge.

In order to facilitate a more meaningful discussion, it is important that both sides learn to break free from the constraints of their reality-tunnel.

Rising Above the Fray

In his book Prometheus Rising, Robert Anton Wilson provides various techniques for challenging dominant reality tunnels. Writing in the early 1980s, Wilson suggested that “if you are a liberal, subscribe to the [conservative magazine] National Review… Each month try to enter their reality-tunnel for a few hours while reading their articles. If you are a conservative, subscribe to New York Review of Books for a year and try to get into their headspace for a few hours a month. If you are a rationalist, subscribe to Fate Magazine for a year. If you are an occultist, join the Committee for the Scientific Investigation of Claims of the Paranormal and read their journal, The Sceptical Inquirer, for a year.”37

To put a modern ‘spin’ on this exercise, if you follow conservative thinkers online such as Jordan Peterson or Ben Shapiro, expose yourself to leftist thinkers such as Slavoj Zizek or Noam Chomsky, and do the opposite if you are on the left. Subscribe to internet channels that do not align with your reality-tunnel. By performing this exercise, you will find that you can think about political issues in a more balanced, neutral and multidimensional way, free from the constraints of ideological dogma.

You can use the same technique with religion. In one exercise, Wilson says, “become a pious Roman Catholic. Explain in three pages why the Church is still infallible and holy despite Popes like Alexander VI (the Borgia Pope), Pius XII (ally of Hitler), etc.”38 Then explain, again in three pages, why the Church is an immoral and outdated institution. If you have the time, you can perform the same exercise with other religions such as Hinduism, Buddhism, and even Satanism. Explain why these religions hold the key to the ‘true’ nature of ‘reality’ and then refute yourself by providing a counterargument.

You can use the same technique to become more conspiratorial in your thinking. In one exercise, Wilson says, “start collecting evidence that your phone is bugged. Everyone gets a letter occasionally that is slightly damaged. Assume that somebody is opening your mail and clumsily revealing it. Look around for evidence that your co-workers or neighbours think you’re a bit queer and are planning to have you committed to a mental hospital.”39 Observe how these assumptions influence your perception of other people and their behaviour – it won’t be long before you find evidence to support your paranoid thinking!

Once you have sufficiently experimented with this reality-tunnel, “try living a whole week with the program, ‘Everybody likes me and tries to help me achieve all of my goals’.”40 Then try living a whole month with the program, “I have chosen to be aware of this particular reality.”41 Then try living a day with the program, “I am God playing at being a human being. I created every reality I notice.”42 Then try living forever with the metaprogram, “Everything works out more perfectly than I plan it.”43 By adopting these different reality-tunnels, you will notice how malleable your perceptual faculties really are – the world can become a place of conspiracy and collusion or a place of benevolence and positivity, depending on how you view it.

Wilson provides another interesting exercise to expand the boundaries of consciousness, in which you “list at least 15 similarities between New York (or any large city) and an insect colony, such as a bee-hive or termite hill. Contemplate the information in the DNA loop, which created both of these enclaves of high coherence and organisation, in primate and insect societies.”44 Then, “Read the Upanishads and every time you see the word ‘Atman’ or ‘World Soul’, translate it as DNA blueprint. See if it makes sense to you that way.”45 According to Wilson, “Contemplating these issues usually triggers Jungian synchronicities. See how long after reading this chapter you encounter an amazing coincidence – e.g., seeing DNA on a license plate, having a copy of the Upanishads given to you unexpectedly…”46

Experimenting with different reality tunnels is a necessary practice if one wishes to challenge dominant narratives, perspectives and viewpoints and expand the boundaries of human consciousness. As we find ourselves in a post-modern ‘information age’, where an increasing number of political factions compete for informational authority, we are exposed to the hidden forces of propaganda more than ever before.

Every time we log into Facebook, Twitter, Instagram and YouTube, we allow ourselves to be manipulated by a complex system of algorithms that generates content based on our likes, dislikes, and even our differences. In order to escape the trappings of ideological dogma, we must become conscious of our biological, social and environmental conditioning and adopt a more multidimensional way of thinking.

An understanding of ‘reality-tunnels’ is instrumental in achieving true inner liberation, since it enables us to think about the mind as a form of software that can be continually updated and reorganised. We achieve a state of metacognition: an awareness of one’s thought processes and an understanding of the patterns behind them. It is what the pioneering mind explorer John Lilly called our capacity for “metaprogramming,” the creation, revision, and reorganisation of mental programs.47

Although we are constrained by the limitations of biological programming (to a certain extent), the creativity of human consciousness is infinite, a maze of endless possibilities and potentialities waiting to be explored. As the Buddha said, “All that we are is the result of all that we have thought. It is founded on thought. It is based on thought.”48

Footnotes

1. Hartogsohn, I. (2015). The Psychedelic Society Revisited: On Reducing Valves, Reality Tunnels and the Question of Psychedelic Culture, Psychedelic Press, 3
2. ultrafeel.tv/reality-tunnel-how-beliefs-and-expectations-create-what-you-experience-in-life
3. Op cit., Hartogsohn, I. (2015), 4
4. en.wikipedia.org/wiki/Confirmation_bias 
5. Anton Wilson, R. (1983). Prometheus Rising, Tempe, Arizona: New Falcon
6. en.wikipedia.org/wiki/Immanuel_Kant
7. en.wikipedia.org/wiki/Reality_tunnel
8. Ibid
9. Parsons, T. (1951). The Social System, Glencoe, Illinois: The Free Press
10. Sam Keen, Castaneda interview, Psychology Today, December 1972
11. Derrida, J. (1978). ‘Genesis’ and ‘Structure’ and Phenomenology, in Writing and Difference, Routledge.
12. en.wikipedia.org/wiki/Dada
13. Op cit., Hartogsohn, I. (2015), 1
14. Ibid, 2
15. Anton Wilson, R. (1990). Quantum Psychology: How Brain Software Programs You & Your World, New Falcon Publications
16. en.wikipedia.org/wiki/Reality_tunnel 
17. Quantum Psychology, 74
18. Ibid, 74
19. Ibid, 75
20-24. Ibid, 76
25-26. Ibid, 76-77
27. Wollheim, R. (1971). Freud, Fontana Press
28. en.wikipedia.org/wiki/Social_conditioning 
29. Quantum Psychology, 77
30. Ibid
32. Ibid
33. Bifarini, I. Cognitive bias and echo chambers: The social media trap, www.academia.edu/40650380/Cognitive_bias_and_echo_chambers_The_social_media_trap
34. Ibid
35. Nguyen, C. Echo Chambers and Epistemic Bubbles, www.academia.edu/36634677/Echo_Chambers_and_Epistemic_Bubbles
36. Auxier, B. and Vitak, J. (2019). Factors Motivating Customization and Echo Chamber Creation Within Digital News Environments. Social Media and Society, April-June 2019
37. Prometheus Rising, 83
38. Ibid, 159
39. Ibid, 241
40-43. Ibid, 242
44-46. Ibid, 190
47. Lilly, John C. Programming & Metaprogramming in the Human Biocomputer, New York: The Julian Press, Inc., 1967
48. The Dhammapada

Pop Culture Has Become an Oligopoly

By Adam Mastroianni

Source: Experimental History

You may have noticed that every popular movie these days is a remake, reboot, sequel, spinoff, or cinematic universe expansion. In 2021, only one of the ten top-grossing films––the Ryan Reynolds vehicle Free Guy––was an original. There were only two originals in 2020’s top 10, and none at all in 2019.

People blame this trend on greedy movie studios or dumb moviegoers or competition from Netflix or humanity running out of ideas. Some say it’s a sign of the end of movies. Others claim there’s nothing new about this at all.

Some of these explanations are flat-out wrong; others may contain a nugget of truth. But all of them are incomplete, because this isn’t just happening in movies. In every corner of pop culture––movies, TV, music, books, and video games––a smaller and smaller cartel of superstars is claiming a larger and larger share of the market. What used to be winners-take-some has grown into winners-take-most and is now verging on winners-take-all. The (very silly) word for this is oligopoly: like a monopoly, but with a few players instead of just one.

I’m inherently skeptical of big claims about historical shifts. I recently published a paper showing that people overestimate how much public opinion has changed over the past 50 years, so naturally I’m on the lookout for similar biases here. But this shift is not an illusion. It’s big, it’s been going on for decades, and it’s happening everywhere you look. So let’s get to the bottom of it.

(Data and code available here.)

Movies 

At the top of the box office charts, original films have gone extinct. 

I looked at the 20 top-grossing movies going all the way back to 1977 (source), and I coded whether each was part of what film scholars call a “multiplicity”—sequels, prequels, franchises, spin-offs, cinematic universe expansions, etc. This required some judgment calls. Lots of movies are based on books and TV shows, but I only counted them as multiplicities if they were related to a previous movie. So 1990’s Teenage Mutant Ninja Turtles doesn’t get coded as a multiplicity, but 1991’s Teenage Mutant Ninja Turtles II: The Secret of the Ooze does, and so does the 2014 Teenage Mutant Ninja Turtles remake. I also probably missed a few multiplicities, especially in earlier decades, since sometimes it’s not obvious that a movie has some connection to an earlier movie.

Regardless, the shift is gigantic. Until the year 2000, about 25% of top-grossing movies were prequels, sequels, spinoffs, remakes, reboots, or cinematic universe expansions. Since 2010, it’s been over 50% every year. In recent years, it’s been close to 100%.

Original movies just aren’t popular anymore, if they even get made in the first place.

Top movies have also recently started taking a larger chunk of the market. I extracted the revenue of the top 20 movies and divided it by the total revenue of the top 200 movies, going all the way back to 1986 (source). The top 20 movies captured about 40% of all revenue until 2015, when they started gobbling up even more.
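The concentration metric described above — the top 20 movies' share of the top 200's total revenue, computed per year — can be sketched in plain Python. The grosses used here are toy numbers, not the author's actual box office data (which is linked in the post):

```python
def top_share(grosses, top_n=20, total_n=200):
    """Fraction of the top-`total_n` box office captured by the top `top_n` films.

    `grosses` is a list of per-film revenues for one year (any order).
    """
    ranked = sorted(grosses, reverse=True)[:total_n]
    return sum(ranked[:top_n]) / sum(ranked)

# Toy example: 200 films with steadily declining grosses.
toy_year = [1000 - 4 * i for i in range(200)]
share = top_share(toy_year)  # top 20 films' share of the top 200's revenue
assert 0 < share < 1
```

Applied to each year's chart and plotted over time, this single ratio is what reveals the post-2015 jump the author describes.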

Television

Thanks to cable and streaming, there’s way more stuff on TV today than there was 50 years ago. So it would make sense if a few shows ruled the early decades of TV, and now new shows constantly displace each other at the top of the viewership charts.

Instead, the opposite has happened. I pulled the top 30 most-viewed TV shows from 1950 to 2019 (source) and found that fewer and fewer franchises rule a larger and larger share of the airwaves. In fact, since 2000, about a third of the top 30 most-viewed shows are either spinoffs of other shows in the top 30 (e.g., CSI and CSI: Miami) or multiple broadcasts of the same show (e.g., American Idol on Monday and American Idol on Wednesday). 

Two caveats to this data. First, I’m probably slightly undercounting multiplicities from earlier decades, where the connections between shows might be harder for a modern viewer like me to understand––maybe one guy hosted multiple different shows, for example. And second, the Nielsen ratings I’m using only recently started accurately measuring viewership on streaming platforms. But even in 2019, only 14% of viewing time was spent on streaming, so this data isn’t missing much.

Music

It used to be that a few hitmakers ruled the charts––The Beatles, The Eagles, Michael Jackson––while today it’s a free-for-all, right?

Nope. A data scientist named Azhad Syed has done the analysis, and he finds that the number of artists on the Billboard Hot 100 has been decreasing for decades.

And since 2000, the number of hits per artist on the Hot 100 has been increasing. 

(Azhad says he’s looking for a job––you should hire him!)

A smaller group of artists tops the charts, and they produce more of the chart-toppers. Music, too, has become an oligopoly.

Books

Literature feels like a different world than movies, TV, and music, and yet the trend is the same.

Using LiteraryHub’s list of the top 10 bestselling books for every year from 1919 to 2017, I found that the oligopoly has come to book publishing as well. There are a couple ways we can look at this. First, we can look at the percentage of repeat authors in the top 10––that is, the number of books in the top 10 that were written by an author with another book in the top 10.

It used to be pretty rare for one author to have multiple books in the top 10 in the same year. Since 1990, it’s happened almost every year. No author ever had three top 10 books in one year until Danielle Steel did it in 1998. In 2011, John Grisham, Kathryn Stockett, and Stieg Larsson each had two chart-topping books.

We can also look at the percentage of authors in the top 10 who were already famous––say, authors who had a top 10 book within the past 10 years. That has increased over time, too. 

In the 1950s, a little over half of the authors in the top 10 had been there before. These days, it’s closer to 75%.
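The already-famous measure described above (an author counts as famous in a given year if they had a top 10 book within the previous ten years) can be sketched like this. The yearly author lists below are illustrative toy data, not LiteraryHub's actual bestseller lists:

```python
def famous_share(top10_by_year, year, window=10):
    """Fraction of a year's top-10 authors who also appeared in a
    top-10 list within the previous `window` years."""
    recent = set()
    for y in range(year - window, year):
        recent.update(top10_by_year.get(y, []))
    authors = top10_by_year[year]
    return sum(a in recent for a in authors) / len(authors)

# Toy data: two of the four 1995 authors appeared earlier in the window.
toy = {
    1990: ["Steel", "Grisham"],
    1995: ["Steel", "Grisham", "King", "Crichton"],
}
assert famous_share(toy, 1995) == 0.5
```

Sweeping `year` across the whole dataset yields the trend line the author describes: roughly half of top-10 authors were repeat names in the 1950s, closer to three quarters today.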

Video games

I tracked down the top 20 bestselling video games for each year from 1995 to 2021 (sources: 1–7) and coded whether each belongs to a preexisting video game franchise. (Some games, like Harry Potter and the Sorcerer’s Stone, belong to franchises outside of video games. For these, I coded the first installment as an original and any subsequent installments as franchise games.)

The oligopoly rules video games too:

In the late 1990s, 75% or less of bestselling video games were franchise installments. Since 2005, it’s been above 75% every year, and sometimes it’s 100%. At the top of the charts, it’s all Mario, Zelda, Call of Duty, and Grand Theft Auto.

Why is this happening?

Any explanation for the rise of the pop oligopoly has to answer two questions: why have producers started producing more of the same thing, and why are consumers consuming it? I think the answers to the first question are invasion, consolidation, and innovation. I think the answer to the second question is proliferation.

Invasion

Software and the internet have made it easier than ever to create and publish content. Most of the stuff that random amateurs make is crap and nobody looks at it, but a tiny proportion gets really successful. This might make media giants choose to produce and promote stuff that independent weirdos never could, like an Avengers movie. This can’t explain why oligopolization started decades ago––YouTube only launched in 2005, for example, and most Americans didn’t have broadband until 2007––but it might explain why it’s accelerated and stuck around.

Consolidation

Big things like to eat, defeat, and outcompete smaller things. So over time, big things should get bigger and small things should die off. Indeed, movie studios, music labels, TV stations, and publishers of books and video games have all consolidated. Maybe it’s inevitable that major producers of culture will suck up or destroy everybody else, leaving nothing but superstars and blockbusters. Indeed, maybe cultural oligopoly is merely a transition state before we reach cultural monopoly.

Innovation

You may think there’s nothing left to discover in art forms as old as literature and music, and that they simply iterate as fashions change. But it took humans thousands of years to figure out how to create the illusion of depth in paintings. Novelists used to think that sentences had to be long and complicated until Hemingway came along, wrote some snappy prose, and changed everything. Even very old art forms, then, may have secrets left to discover. Maybe the biggest players in culture discovered some innovations that won them a permanent, first-mover chunk of market share. I can think of a few:

  • In books: lightning-quick plots and chapter-ending cliffhangers. Nobody thinks The Da Vinci Code is high literature, but it’s a book that really really wants you to read it. And a lot of people did!
  • In music: sampling. Musicians seem to sample more often these days. Now we not only remake songs; we franchise them too.
  • In movies, TV, and video games: cinematic universes. Studios have finally figured out that once audiences fall in love with fictional worlds, they want to spend lots of time in them. Marvel, DC, and Star Wars are the most famous, but there are also smaller universe expansions like Better Call Saul and El Camino from Breaking Bad and The Many Saints of Newark from The Sopranos. Video game developers have understood this for even longer, which is why Mario does everything from playing tennis to driving go-karts to, you know, being a piece of paper.

Proliferation

Invasion, consolidation, and innovation can, I think, explain the pop oligopoly from the supply side. But all three require a willing audience. So why might people be more open to experiencing the same thing over and over again?

As options multiply, choosing gets harder. You can’t possibly evaluate everything, so you start relying on cues like “this movie has Tom Hanks in it” or “I liked Red Dead Redemption, so I’ll probably like Red Dead Redemption II,” which makes you less and less likely to pick something unfamiliar. 

Another way to think about it: more opportunities means higher opportunity costs, which could lead to lower risk tolerance. When the only way to watch a movie is to go pick one of the seven playing at your local AMC, you might take a chance on something new. But when you’ve got a million movies to pick from, picking a safe, familiar option seems more sensible than gambling on an original.

This could be happening across all of culture at once. Movies don’t just compete with other movies. They compete with every other way of spending your time, and those ways are both infinite and increasing. There are now 60,000 free books on Project Gutenberg, Spotify says it has 78 million songs and 4 million podcast episodes, and humanity uploads 500 hours of video to YouTube every minute. So uh, yeah, the Tom Hanks movie sounds good.

What do we do about it?

Some may think that the rise of the pop oligopoly means the decline of quality. But the oligopoly can still make art: Red Dead Redemption II is a terrific game, “Blinding Lights” is a great song, and Toy Story 4 is a pretty good movie. And when you look back at popular stuff from a generation ago, there was plenty of dreck. We’ve forgotten the pulpy Westerns and insipid romances that made the bestseller lists while books like The Great Gatsby, Brave New World, and Animal Farm did not. American Idol is not so different from the televised talent shows of the 1950s. Popular culture has always been a mix of the brilliant and the banal, and nothing I’ve shown you suggests that the ratio has changed.

The problem isn’t that the mean has decreased. It’s that the variance has shrunk. Movies, TV, music, books, and video games should expand our consciousness, jumpstart our imaginations, and introduce us to new worlds and stories and feelings. They should alienate us sometimes, or make us mad, or make us think. But they can’t do any of that if they only feed us sequels and spinoffs. It’s like eating macaroni and cheese every single night forever: it may be comfortable, but eventually you’re going to get scurvy. 

We haven’t fully reckoned with what the cultural oligopoly might be doing to us. How much does it stunt our imaginations to play the same video games we were playing 30 years ago? What message does it send that one of the most popular songs in the 2010s was about how a 1970s rock star was really cool? How much does it dull our ambitions to watch 2021’s The Matrix Resurrections, where the most interesting scene is just Neo watching the original Matrix from 1999? How inspiring is it to watch tiny variations on the same police procedurals and reality shows year after year? My parents grew up with the first Star Wars movie, which had the audacity to create an entire universe. My niece and nephews are growing up with the ninth Star Wars movie, which aspires to move merchandise. Subsisting entirely on cultural comfort food cannot make us thoughtful, creative, or courageous.

Fortunately, there’s a cure for our cultural anemia. While the top of the charts has been oligopolized, the bottom remains a vibrant anarchy. There are weird books and funky movies and bangers from across the sea. Two of the most interesting video games of the past decade put you in the role of an immigration officer and an insurance claims adjuster. Every strange thing, wonderful and terrible, is available to you, but they’ll die out if you don’t nourish them with your attention. Finding them takes some foraging and digging, and then you’ll have to stomach some very odd, unfamiliar flavors. That’s good. Learning to like unfamiliar things is one of the noblest human pursuits; it builds our empathy for unfamiliar people. And it kindles that delicate, precious fire inside us––without it, we might as well be algorithms. Humankind does not live on bread alone, nor can our spirits long survive on a diet of reruns.

Alice Walker and the Price of Conscience

Alice Walker was disinvited to the Bay Area Book Festival after Zionist groups threatened to carry out protests. The public and presenters are complicit in her blacklisting if they attend.

By Chris Hedges

Source: The Chris Hedges Report

There is a steep price to pay for having a conscience and more importantly the courage to act on it. The hounds of hell pin you to the cross, hammering nails into your hands and feet as they grin like the Cheshire cat and mouth bromides about respect for human rights, freedom of expression and diversity. I have watched this happen for some time to Alice Walker, one of the most gifted and courageous writers in America. Walker, who was awarded the Pulitzer Prize for fiction for her novel The Color Purple, has felt the bitter sting of racism. She refuses to be silent about the plight of the oppressed, including the Palestinians.

“Whenever I come out with a book, or anything that will take me before the public, the world, I am assailed as this person I don’t recognize,” she said when I reached her by phone. “If I tried to keep track of all the attacks over the decades, I wouldn’t be able to keep working. I am happy people are standing up. It is all of us. Not just me. They are trying to shut us down, shut us up, erase us. That reality is what is important.”

The Bay Area Book Festival delivered the latest salvo against Walker. The organizers disinvited her from the event because she praised the writings of the New Age author David Icke and called his book And the Truth Shall Set You Free “brave.” Icke has denied critics’ charges of anti-Semitism. The festival organizers twisted themselves into contortions to say they were not charging Walker with anti-Semitism. She was banned because she lauded a controversial writer, whom I suspect few members of the committee have read. The poet and writer Honorée Fanonne Jeffers, whom Walker was to interview, withdrew from the festival in protest.

Walker, a supporter of the Boycott, Divestment and Sanctions (BDS) movement, has been a very public advocate for Palestinian rights and a critic of Israel for many years. Her friendship with Icke has long been part of the public record. She hid nothing. It is not as if the festival organizers suddenly discovered a dark secret about Walker. They sought to capitalize on her celebrity and then, when they felt the heat from the Israel lobby, capitulated to the mob to humiliate her.

“I don’t know these people,” Walker said of the festival organizers who disinvited her. “It feels like the south. You know they are out there in the community, and they have their positions, but all you see are sheets. That’s what this is. It’s like being back in the south.”

Banning writers because of books they like or find interesting nullifies the whole point of a book festival. Should I be banned because I admire Louis-Ferdinand Céline’s masterpieces Journey to the End of the Night, Death on the Installment Plan, and Castle to Castle, despite his virulent anti-Semitism, which even after World War II he refused to relinquish? Should I be banned for liking Joseph Heller’s Catch-22, which I recently reread, and which is rabidly misogynistic? Should I be banned for loving William Butler Yeats, who, like Ezra Pound, many of whose poems I have also committed to memory, was a fascist collaborator? Should I be banned because I revere Hannah Arendt, whose attitudes towards African-Americans were paternalistic, at best, and arguably racist? Should I be banned because I cherish books by C.S. Lewis, Norman Mailer and D.H. Lawrence, who were homophobic?

We might as well sweep clean library shelves if the attitudes of writers we read mean we are denied a right to speak. 

And let’s not even get started with the Bible, which I studied as a seminarian at Harvard Divinity School. God repeatedly demands righteous acts of genocide, transforming the Nile into blood so the Egyptians will suffer from thirst. God sends swarms of locusts and flies to torture the Egyptians, along with hail, fire and thunder to destroy all plants and trees. God orders the firstborn in every Egyptian household killed so all will know “that the Lord makes a distinction between Egyptians and Israel.” The killing goes on until “there was not a house where one was not dead.” 

The Bible contains much of this divinely sanctioned slaughtering of non-believers. It endorses slavery and the beating of enslaved people. It condones the execution of homosexuals and women who commit adultery. It views women as property and approves the right of fathers to sell their daughters. But the Bible also remains, with all these contradictions and moral failings, a great religious, ethical and moral document. Even the most flawed books often have something to teach us.

Organizers of the festival attacked Walker for her poem “It is Our Frightful Duty.” They accuse Walker of channeling Icke’s alleged anti-Semitism into her writing, as if Walker is unable to think for herself. The attack on the poem, which is a gross misreading of its intent, exposes the lie that Walker’s position on Israel and Palestine had nothing to do with her being disinvited.

“Unfortunately, Ms. Walker has not only promoted Icke’s ideas widely on her own blog and in interviews, but they may have influenced her own writing,” the festival wrote in a statement. “Ms. Walker’s 2017 poem “It is our (Frightful) Duty to Study the Talmud” encourages people to use Google and Youtube to “follow the trail of “The / Talmud” as its poison belatedly winds its way / Into our collective consciousness. // Some of what you find will sound / Too crazy to be true. Unfortunately those bits are likely / To be true.” A New York Magazine essay by writer Nylah Burton (who identifies as Black and Jewish) describes her reaction to Walker’s support of Icke and this poem.”

The poem calls out these hate-filled religious texts. “All of it: The Christian, the Jewish, The Muslim; even the Buddhist. All of it, without exception, At the root.” Walker reminds us in the poem that these texts have been used throughout millennia to sanctify subjugation, dehumanization and murder. Slave holders defended the enslavement of Blacks by citing numerous passages in the Old and the New Testament, including Paul’s Letter to the Ephesians where, equating slaveholders with God, Paul writes: “Slaves, be obedient to your human masters with fear and trembling, in sincerity of heart, as to Christ.”  

Israel seeks, in the same way, to legitimize its colonial-settler project by citing the Old Testament and the Talmud, the primary source of Jewish law. Never mind that Palestine was a Muslim country from the 7th century until it was seized by military force in 1948. The Old Testament, in the hands of Zionists, is a deed to Palestinian land.

Walker excoriates this religious chauvinism and mythology. She warns that theocracies, which sacralize state power, are dangerous. In the poem, she highlights passages in the Talmud used to condemn those outside the faith. Jews must repudiate these sections in the Talmud and the Old Testament, as those of us who are Christians must repudiate the hateful passages in the Bible. When these religious screeds are weaponized by zealots — Christian, Muslim or Jewish — they propagate evil.

Walker writes:

Is Jesus boiling eternally in hot excrement,

For his “crime” of throwing the bankers

Out of the Temple? For loving, standing with,

And defending

The poor? Was his mother, Mary,

A whore?

Are Goyim (us) meant to be slaves of Jews, and not only

That, but to enjoy it?

Are three year old (and a day) girls eligible for marriage and intercourse?

Are young boys fair game for rape?

Must even the best of the Goyim (us, again) be killed?

Pause a moment and think what this could mean

Or already has meant

In our own lifetime.

Walker was invited to the festival to interview Honorée Fanonne Jeffers about her work, not to give a lecture on Icke or Palestine — but no matter. She ran afoul of the thought police, who are always vigilant about catering to smear campaigns against critics of Israel but blithely ignore the virulent and overt racism of Israeli politicians, military commanders, writers and intellectuals.

Walker is not the first writer targeted by Israel. Israel banned the author Günter Grass and demanded the rescindment of his Nobel prize after he wrote a poem denouncing Germany’s decision to provide Israel with nuclear submarines, warning that Israel “could wipe out the Iranian people” if it attacked Iran. Former Israeli Defense Minister Avigdor Lieberman, who calls for the ethnic cleansing of Palestinians to create a “Greater” Israel, described the Palestinian poet Mahmoud Darwish as “someone who has written texts against Zionism — which are still used as fuel for terror attacks against Israel.” He said honoring Darwish was the equivalent of honoring Adolf Hitler for “Mein Kampf.” Israeli bookstores Steimatzky and Tzomet Sefarim purged Sally Rooney’s novels from some 200 branches and online sites because of her support for BDS. Israeli writer Yehonatan Geffen was beaten outside his home for calling the Israeli prime minister a racist.

Bay Area Book Festival founder and director Cherilyn Parsons defended the board’s decision to disinvite Walker when I requested a comment:  

Our decision to disinvite Ms. Walker had nothing to do with her position on Palestine, her voice as a Black woman writer, or her right to speak her mind freely. We honor all those things. We also do not hold that she is anti-Semitic. (To be pro-Palestinian does not mean a person is anti-Semitic, just as to be Jewish does not mean that one is anti-Palestine.) Our decision was based purely on Ms. Walker’s inexplicable, ongoing endorsement of David Icke, a conspiracy theorist who dangerously promulgates such beliefs as that Jewish people bankrolled Hitler, caused the 2008 global financial crisis, staged the 9/11 terrorist attacks, and more. (See his book “And the Truth Shall Set You Free,” available full-text on the Internet Archive.) Icke also regularly promotes “The Protocols of the Elders of Zion,” a fabricated, uber-anti-Semitic text that was widely read during the time of social upheaval in pre-WWII Germany and turned public sentiment against Jews–a truly dangerous document for a populace to embrace. Finally, we note that Ms. Walker provided financial support for, and participation in, a documentary celebrating Icke and his work.

“I do not believe he is anti-Semitic or anti-Jewish,” Walker posted on her website. “I do believe he is brave enough to ask the questions others fear to ask, and to speak his own understanding of the truth wherever it might lead. Many attempts have been made to censor and silence him. As a woman, and a person of color, as a writer who has been criticized and banned myself, I support his right to share his own thoughts.”

“I maintain that I can be friends with whoever I like,” Walker told me. “The attachment to this belief that this person is evil is strange. He’s not.”

I worked for two years as a reporter in Jerusalem. I listened to the daily filth spewed out by Israelis about Arabs and Palestinians, using racist tropes to sanctify Israeli apartheid and gratuitous violence against Palestinians. Israel routinely orders air strikes, targeted assassinations, drone attacks, artillery strikes, tank assaults and naval bombardments on the largely defenseless population in Gaza. Israel blithely dismisses those it murders, including children, as unworthy of life, drawing on poisonous religious edicts. It is risible that Israel and its US supporters can posit themselves as anti-racists, arrogating the right to cancel Walker. It is the equivalent of allowing the Klan to vet speakers lists.

Torat Ha’Melech by Rabbi Yitzhak Shapira and Rabbi Yosef Elitzur is one of innumerable examples of the deep racism embedded in Israeli culture. The book provides rabbinical advice to Israeli soldiers and officers in the occupied Palestinian territories. It describes non-Jews as “uncompassionate by nature” and justifiably exterminated to “curb their evil inclinations.” “If we kill a gentile who has violated one of the seven commandments of [Noah]…there is nothing wrong with the murder.” It assures troops that it is morally legitimate to kill Palestinian children, writing, “There is justification for killing babies if it is clear they will grow up to harm us, and in such a situation they may be harmed deliberately, and not only during combat with adults.” The Biblical prohibition on murder, Shapira and Elitzur write, “refers only to a Jew who kills a Jew, and not to a Jew who kills a gentile, even if that gentile is one of the righteous among the nations.” They even say it is “permissible” to kill Jewish dissidents. A Jewish dissident, the rabbis write, is a rodef. A rodef, according to traditional Jewish law, is someone who is “pursuing” another person to murder him or her. It is the duty of a Jew to kill a rodef if the rodef is told to cease the threatening behavior and does not. Yigal Amir, who assassinated Israeli Prime Minister Yitzhak Rabin in 1995, argued that the din rodef, or “law of the pursuer,” justified Rabin’s murder.

Walker is the best among us. She is one of our most gifted and lyrical writers. She stands unequivocally with the crucified of the earth. She sees her own pain in the pain of others. She demands justice. She pays the price.

Boycott the Bay Area Book Festival.

That is the least we owe a literary and moral titan.

The Future Is Here: Dystopian Movies Fit for a Dystopian World

By John W. Whitehead

Source: The Rutherford Institute

“The Internet is watching us now. If they want to. They can see what sites you visit. In the future, television will be watching us, and customizing itself to what it knows about us. The thrilling thing is, that will make us feel we’re part of the medium. The scary thing is, we’ll lose our right to privacy. An ad will appear in the air around us, talking directly to us.”—Director Steven Spielberg, Minority Report

We have arrived, way ahead of schedule, into the dystopian future dreamed up by such science fiction writers as George Orwell, Aldous Huxley, Margaret Atwood and Philip K. Dick.

Much like Orwell’s Big Brother in 1984, the government and its corporate spies now watch our every move.

Much like Huxley’s Brave New World, we are churning out a society of watchers who “have their liberties taken away from them, but … rather enjoy it, because they [are] distracted from any desire to rebel by propaganda or brainwashing.”

Much like Atwood’s The Handmaid’s Tale, the populace is now taught to “know their place and their duties, to understand that they have no real rights but will be protected up to a point if they conform, and to think so poorly of themselves that they will accept their assigned fate and not rebel or run away.”

And in keeping with Philip K. Dick’s darkly prophetic vision of a dystopian police state—which became the basis for Steven Spielberg’s futuristic thriller Minority Report, released 20 years ago—we are now trapped in a world in which the government is all-seeing, all-knowing and all-powerful, and if you dare to step out of line, dark-clad police SWAT teams and pre-crime units will crack a few skulls to bring the populace under control.

Minority Report is set in the year 2054, but it could just as well have taken place in 2022.

Seemingly taking its cue from science fiction, technology has moved so fast in the short time since Minority Report premiered in 2002 that what once seemed futuristic no longer occupies the realm of science fiction.

Incredibly, as the various nascent technologies employed and shared by the government and corporations alike—facial recognition, iris scanners, massive databases, behavior prediction software, and so on—are incorporated into a complex, interwoven cyber network aimed at tracking our movements, predicting our thoughts and controlling our behavior, Spielberg’s unnerving vision of the future is fast becoming our reality.

Both worlds—our present-day reality and Spielberg’s celluloid vision of the future—are characterized by widespread surveillance, behavior prediction technologies, data mining, fusion centers, driverless cars, voice-controlled homes, facial recognition systems, cybugs and drones, and predictive policing (pre-crime) aimed at capturing would-be criminals before they can do any damage.

Surveillance cameras are everywhere. Government agents listen in on our telephone calls and read our emails. Political correctness—a philosophy that discourages diversity—has become a guiding principle of modern society.

The courts have shredded the Fourth Amendment’s protections against unreasonable searches and seizures. In fact, SWAT teams battering down doors without search warrants and FBI agents acting as a secret police that investigate dissenting citizens are common occurrences in contemporary America.

We are increasingly ruled by multinational corporations wedded to the police state. Much of the population is hooked on drugs, whether illegal or prescribed by doctors. And bodily privacy and integrity have been utterly eviscerated by a prevailing view that Americans have no rights over what happens to their bodies during an encounter with government officials, who are allowed to search, seize, strip, scan, spy on, probe, pat down, taser, and arrest any individual at any time and for the slightest provocation.

All of this has come about with little more than a whimper from an oblivious American populace largely comprised of nonreaders and television and internet zombies, but we have been warned about such an ominous future in novels and movies for years.

The following 15 films may be the best representation of what we now face as a society.

Fahrenheit 451 (1966). Adapted from Ray Bradbury’s novel and directed by Francois Truffaut, this film depicts a futuristic society in which books are banned, and firemen are ironically called on to burn contraband books—451 degrees Fahrenheit being the temperature at which book paper burns. Montag is a fireman who develops a conscience and begins to question his book burning. This film is an adept metaphor for our obsessively politically correct society where virtually everyone now pre-censors speech. Here, a brainwashed people addicted to television and drugs do little to resist their governmental oppressors.

2001: A Space Odyssey (1968). The plot of Stanley Kubrick’s masterpiece, based on an Arthur C. Clarke short story, revolves around a space voyage to Jupiter. The astronauts soon learn, however, that the fully automated ship is orchestrated by a computer system—known as HAL 9000—which has become an autonomous thinking being that will even murder to retain control. The idea is that at some point in human evolution, technology in the form of artificial intelligence will become autonomous and human beings will become mere appendages of technology. In fact, we are already seeing this development in the massive databases generated and controlled by the government, administered by secretive agencies such as the National Security Agency, which sweep websites and other information devices to collect data on average citizens. We are being watched from cradle to grave.

Planet of the Apes (1968). Based on Pierre Boulle’s novel, astronauts crash on a planet where apes are the masters and humans are treated as brutes and slaves. While fleeing from gorillas on horseback, astronaut Taylor is shot in the throat, captured and housed in a cage. From there, Taylor begins a journey wherein the truth revealed is that the planet was once controlled by technologically advanced humans who destroyed civilization. Taylor’s trek to the ominous Forbidden Zone reveals the startling fact that he was on planet earth all along. Descending into a fit of rage at what he sees in the final scene, Taylor screams: “We finally really did it. You maniacs! You blew it up! Damn you.” The lesson is obvious, but will we listen? The script, although rewritten, was initially drafted by Rod Serling and retains Serling’s Twilight Zone-ish ending.

THX 1138 (1971). George Lucas’ directorial debut, this is a somber view of a dehumanized society totally controlled by a police state. The people are force-fed drugs to keep them passive, and they no longer have names but only letter/number combinations such as THX 1138. Any citizen who steps out of line is quickly brought into compliance by robotic police equipped with “pain prods”—electro-shock batons. Sound like tasers?

A Clockwork Orange (1971). Director Stanley Kubrick presents a future ruled by sadistic punk gangs and a chaotic government that cracks down on its citizens sporadically. Alex is a violent punk who finds himself in the grinding, crushing wheels of injustice. This film may accurately portray the future of western society that grinds to a halt as oil supplies diminish, environmental crises increase, chaos rules, and the only thing left is brute force.

Soylent Green (1973). Set in an overpopulated, futuristic New York City, the film follows a population dependent on synthetic foods manufactured by the Soylent Corporation. A policeman investigating a murder discovers the grisly truth about what soylent green is really made of. The theme is a chaotic world ruled by ruthless corporations whose only goal is greed and profit. Sound familiar?

Blade Runner (1982). In a 21st century Los Angeles, a world-weary cop tracks down a handful of renegade “replicants” (synthetically produced human slaves). Life is now dominated by mega-corporations, and people sleepwalk along rain-drenched streets. This is a world where human life is cheap, and where anyone can be exterminated at will by the police (or blade runners). Based upon a Philip K. Dick novel, this exquisite Ridley Scott film questions what it means to be human in an inhuman world.

Nineteen Eighty-Four (1984). The best adaptation of Orwell’s dark tale, this film visualizes the total loss of freedom in a world dominated by technology and its misuse, and the crushing inhumanity of an omniscient state. The government controls the masses by controlling their thoughts, altering history and changing the meaning of words. Winston Smith is a doubter who turns to self-expression through his diary and then begins questioning the ways and methods of Big Brother before being re-educated in a most brutal fashion.

Brazil (1985). Sharing a vision of the near future similar to that of 1984 and Franz Kafka’s novel The Trial, this is arguably director Terry Gilliam’s best work, one replete with a merging of the fantastic and stark reality. Here, a mother-dominated, hapless clerk takes refuge in flights of fantasy to escape the ordinary drabness of life. Caught within the chaotic tentacles of a police state, the longing for more innocent, free times lies behind the vicious surface of this film.

They Live (1988). John Carpenter’s bizarre sci-fi social satire action film assumes the future has already arrived. John Nada is a homeless person who stumbles across a resistance movement and finds a pair of sunglasses that enables him to see the real world around him. What he discovers is a world controlled by ominous beings who bombard the citizens with subliminal messages such as “obey” and “conform.” Carpenter manages to make an effective political point about the underclass—that is, everyone except those in power. The point: we, the prisoners of our devices, are too busy sucking up the entertainment trivia beamed into our brains and attacking each other to start an effective resistance movement.

The Matrix (1999). The story centers on computer programmer Thomas A. Anderson, secretly a hacker known by the alias “Neo,” who begins a relentless quest to learn the meaning of “The Matrix”—cryptic references that appear on his computer. Neo’s search leads him to Morpheus, who reveals the truth that the present reality is not what it seems and that Anderson is actually living in the future—2199. Humanity is at war against technology, which has taken the form of intelligent beings, and Neo is actually living in The Matrix, an illusionary world that appears to be set in the present in order to keep the humans docile and under control. Neo soon joins Morpheus and his cohorts in a rebellion against the machines, which use SWAT team tactics to keep things under control.

Minority Report (2002). Based on a short story by Philip K. Dick and directed by Steven Spielberg, the film offers a special effect-laden, techno-vision of a futuristic world in which the government is all-seeing, all-knowing and all-powerful. And if you dare to step out of line, dark-clad police SWAT teams will bring you under control. The setting is 2054 where PreCrime, a specialized police unit, apprehends criminals before they can commit the crime. Captain Anderton is the chief of the Washington, DC, PreCrime force which uses future visions generated by “pre-cogs” (mutated humans with precognitive abilities) to stop murders. Soon Anderton becomes the focus of an investigation when the precogs predict he will commit a murder. But the system can be manipulated. This film raises the issue of the danger of technology operating autonomously—which will happen eventually if it has not already occurred. To a hammer, all the world looks like a nail. In the same way, to a police state computer, we all look like suspects. In fact, before long, we all may be mere extensions or appendages of the police state—all suspects in a world commandeered by machines.

V for Vendetta (2006). This film depicts a society ruled by a corrupt and totalitarian government where everything is run by an abusive secret police. A vigilante named V dons a mask and leads a rebellion against the state. The subtext here is that authoritarian regimes through repression create their own enemies—that is, terrorists—forcing government agents and terrorists into a recurring cycle of violence. And who is caught in the middle? The citizens, of course. This film has a cult following among various underground political groups such as Anonymous, whose members wear the same Guy Fawkes mask as that worn by V.

Children of Men (2006). This film portrays a futuristic world without hope since humankind has lost its ability to procreate. Civilization has descended into chaos and is held together by a military state and a government that attempts to keep its totalitarian stronghold on the population. Most governments have collapsed, leaving Great Britain as one of the few remaining intact societies. As a result, millions of refugees seek asylum only to be rounded up and detained by the police. Suicide is a viable option, as a suicide kit called Quietus is promoted on billboards, on television, and in newspapers. But hope for a new day comes when a woman becomes inexplicably pregnant.

Land of the Blind (2006). In this dark political satire, tyrannical rulers are overthrown by new leaders who prove to be just as evil as their predecessors. Maximilian II is a demented fascist ruler of a troubled land named Everycountry who has two main interests: tormenting his underlings and running his country’s movie industry. Citizens who are perceived as questioning the state are sent to “re-education camps” where the state’s concept of reality is drummed into their heads. Joe, a prison guard, is emotionally moved by the prisoner and renowned author Thorne and eventually joins a coup to remove the sadistic Maximilian, replacing him with Thorne. But soon Joe finds himself the target of the new government.

All of these films—and the writers who inspired them—understood what many Americans, caught up in their partisan, flag-waving, zombified states, are still struggling to come to terms with: that there is no such thing as a government organized for the good of the people. Even the best intentions among those in government inevitably give way to the desire to maintain power and control at all costs.

Eventually, as I make clear in my book Battlefield America: The War on the American People and in its fictional counterpart The Erik Blair Diaries, even the sleepwalking masses (who remain convinced that all of the bad things happening in the police state—the police shootings, the police beatings, the raids, the roadside strip searches—are happening to other people) will have to wake up.

Sooner or later, the things happening to other people will start happening to us.

When that painful reality sinks in, it will hit with the force of a SWAT team crashing through your door, a taser being aimed at your stomach, and a gun pointed at your head. And there will be no channel to change, no reality to alter, and no manufactured farce to hide behind.

As George Orwell warned, “If you want a picture of the future, imagine a boot stamping on a human face forever.”