Prisons Without Walls: We’re All Inmates in the American Police State


By John W. Whitehead

Source: The Rutherford Institute

“It is perfectly possible for a man to be out of prison and yet not free—to be under no physical constraint and yet be a psychological captive, compelled to think, feel and act as the representatives of the national state, or of some private interest within the nation wants him to think, feel and act. . . . To him the walls of his prison are invisible and he believes himself to be free.”—Aldous Huxley, Brave New World Revisited

“Free worlders” is prison slang for those who are not incarcerated behind prison walls. Supposedly, those fortunate souls live in the “free world.” However, appearances can be deceiving.

“As I got closer to retiring from the Federal Bureau of Prisons,” writes former prison employee Marlon Brock, “it began to dawn on me that the security practices we used in the prison system were being implemented outside those walls.” In fact, if Brock is right, then we “free worlders” do live in a prison—albeit, one without visible walls.

In federal prisons, cameras are everywhere in order to maintain “security” and keep track of the prisoners. Likewise, the “free world” is populated with video surveillance and tracking devices. From surveillance cameras in stores and on street corners to license plate readers on police cars (able to log some 1,800 license plates per hour), our movements are being tracked virtually everywhere. And with the increasing use of iris scanners and facial recognition software—including on drones—there would seem to be nowhere to hide.

Detection and confiscation of weapons (or whatever the warden deems “dangerous”) in prison is routine. The inmates must be disarmed. Pat downs, checkpoints, and random searches are second nature in ferreting out contraband.

Sound familiar?

Metal detectors are now in virtually all government buildings. There are the TSA scanning devices and metal detectors we all must pass through in airports. Police roadblocks and checkpoints are used to perform warrantless searches for contraband, and those stopped can be searched regardless of their objections—just like in prison. And there are federal roadblocks on American roads in the southwestern United States, many of them permanent and located up to 100 miles from the border.

Stop-and-frisk searches take place daily across the country. Some of them even involve anal and/or vaginal searches. In fact, the U.S. Supreme Court has approved strip searches even for those arrested for minor offenses, such as traffic violations. Just like a prison inmate.

Prison officials open, search and read every piece of mail sent to inmates. This is true of those who reside outside prison walls, as well. In fact, “the United States Postal Service uses a ‘Mail Isolation Control and Tracking Program’ to create a permanent record of who is corresponding with each other via snail mail.” Believe it or not, each piece of physical mail received by the Postal Service is photographed and stored in a database. Approximately 160 billion pieces of mail sent out by average Americans are recorded each year and the police and other government agents have access to this information.

Prison officials also monitor inmates’ outgoing phone calls. This is similar to what the NSA, the telecommunications corporations, and various government agencies do continually to American citizens. The NSA also downloads our text messages, emails, Facebook posts, and so on while watching everything we do.

Then there are the crowd control tactics: helmets, face shields, batons, knee guards, tear gas, wedge formations, half steps, full steps, pinning tactics, armored vehicles, and assault weapons. Most of these tactics are associated with prison crowd control because they were perfected in prisons.

Finally, when a prison’s daily operations are disturbed, the result is often a lockdown. The “free world” lockdowns following the 2013 Boston Marathon bombing and the melees in Ferguson, Missouri, and Baltimore, Maryland, mirror a federal prison lockdown.

These are just some of the similarities between the worlds inhabited by locked-up inmates and those of us who roam about in the so-called “free world.”

Is there any real difference?

To those of us who see the prison that’s being erected around us, it’s a bit easier to realize what’s coming up ahead, and it’s not pretty. However, and this must be emphasized, what most Americans perceive as life in the United States of America is a far cry from reality. Real agendas and real power are always hidden.

As author Frantz Fanon notes, “Sometimes people hold a core belief that is very strong. When they are presented with evidence that works against that belief, the new evidence cannot be accepted. It would create a feeling that is extremely uncomfortable, called cognitive dissonance. And because it is so important to protect the core belief, they will rationalize, ignore and even deny anything that doesn’t fit in with the core belief.”

This state of denial and rejection of reality is the essential plot of John Carpenter’s 1988 film They Live, where a group of down-and-out homeless men discover that people have been, in effect, so hypnotized by media distractions that they do not see their prison environment and the real nature of those who control them—that is, an oligarchic elite.

Caught up in subliminal messages such as “obey” and “conform,” among others, beamed out of television and various electronic devices, billboards, and the like, people are unaware of the elite controlling their lives. As such, they exist, as media analyst Marshall McLuhan once wrote, in “prisons without walls.” And of course, any resistance is met with police aggression.

A key moment in the film occurs when John Nada, a homeless drifter, notices something strange about people hanging about a church near the homeless settlement where he lives. Nada decides to investigate. Entering the church, he sees graffiti on a door: They live, We sleep. Nada overhears two men, obviously resisters, talking about “robbing banks” and “manufacturing Hoffman lenses until we’re blue in the face.” Moments later, one of the resisters catches Nada fumbling in the church and tells him “it’s the revolution.” When Nada nervously backs off, the resister assures him, “You’ll be back.”

Rummaging through a box, Nada discovers a handful of cheap-looking sunglasses, referred to earlier as Hoffman lenses. Grabbing a pair and exiting the church, he starts walking down a busy urban street.

Sliding the sunglasses on his face, Nada is shocked to see a society bombarded and controlled by subliminal messages beamed at it from every direction. Billboards are transformed into authoritative messages: a bikini-clad woman in one ad is replaced with the words “MARRY AND REPRODUCE.” Magazine racks scream “CONSUME” and “OBEY.” A wad of dollar bills in a vendor’s hand proclaims, “THIS IS YOUR GOD.”

What’s even more disturbing than the hidden messages, however, are the ghoulish-looking creatures—the elite—who appear human until viewed through the lens of truth.

This is the subtle message of They Live, an apt analogy of our own distorted vision of life in the American police state. These things are in plain sight, but from the time we are born until the time we die, we are indoctrinated into believing that those who rule us do it for our good. The truth, far different, is that those who rule us don’t really see us as human beings with dignity and worth. They see us as if “we’re livestock.”

It’s only once Nada’s eyes have been opened that he is able to see the truth: “Maybe they’ve always been with us,” he says. “Maybe they love it—seeing us hate each other, watching us kill each other, feeding on our own cold f**in’ hearts.” Nada, disillusioned and fed up with the lies and distortions, is finally ready to fight back. “I got news for them. Gonna be hell to pay. Cause I ain’t daddy’s little boy no more.”

What about you?

As I point out in my book Battlefield America: The War on the American People, the warning signs have been cautioning us for decades. Oblivious to what lies ahead, most have ignored the obvious. We’ve been manipulated into believing that if we continue to consume, obey, and have faith, things will work out. But that’s never been true of emerging regimes. And by the time we feel the hammer coming down upon us, it will be too late.

As Rod Serling warned:

All the Dachaus must remain standing. The Dachaus, the Belsens, the Buchenwalds, the Auschwitzes—all of them. They must remain standing because they are a monument to a moment in time when some men decided to turn the earth into a graveyard. Into it they shoveled all of their reason, their logic, their knowledge, but worst of all their conscience. And the moment we forget this, the moment we cease to be haunted by its remembrance, then we become the gravediggers.

The message: stay alert.

Take the warning signs seriously. And take action because the paths to destruction are well disguised by those in control.

This is the lesson of history.

Catching Up With the Unabomber. When Does the End Justify the Means?


By Brian Whitney

Source: Disinfo.com

Ted Kaczynski, known as the Unabomber, was not a fan of technology. To expose the world to his anti-technology philosophy, from 1978 to 1995 Kaczynski sent 16 bombs to targets including universities and airlines, killing three people and injuring 23, before he was eventually caught and sent to prison, where he remains today. At one time, he was possibly the most famous criminal in the world.

He said of technology’s role:

The system does not and cannot exist to satisfy human needs. Instead, it is human behavior that has to be modified to fit the needs of the system. It is the fault of technology, because the system is guided not by ideology but by technical necessity.

In his essay Industrial Society and Its Future, Kaczynski argued that while his bombings were “a bit” extreme, they were quite necessary to attract attention to the loss of human freedom caused by modern technology. His book Technological Slavery: The Collected Writings of Theodore J. Kaczynski, a.k.a. “The Unabomber” breaks down all of his philosophies for those of us who know him only through corporate news stations.

Was the Unabomber crazy, or just so sane he was blowing our minds?

I talked to David Skrbina, confidant of Kaczynski, and philosophy professor at the University of Michigan. Skrbina wrote the intro to Technological Slavery.

Can you tell me a bit about how you and Kaczynski began to communicate? Are you still in touch with him today?

Back in 2003, I began work on a new course at the University of Michigan: Philosophy of Technology. Surprisingly, such a course had never been offered before, at any of our campuses. I wanted to remedy that deficiency.

I then began to pull together recent and relevant material for the course, focusing on critical approaches to technology. These, to me, were more insightful and more interesting, and were notably under-analyzed among current philosophers of technology. Most of them are either neutral toward modern technology, or positively embrace it, or accept its presence resignedly. As I found out, very few philosophers of the past four decades adopted anything like a critical stance. This, for me, was highly revealing.

Anyway, I was well aware of Kaczynski’s manifesto, “Industrial Society and Its Future,” which was published in late 1995 at the height of the Unabomber mania. I was very impressed with its analysis, even though most of the ideas were not new to me (many were reiterations of arguments by Jacques Ellul, for example—see his 1964 book The Technological Society). But the manifesto was clear and concise, and made a compelling argument.

After Kaczynski was arrested in 1996, and after a year-long trial process, he was stashed away in a super-max prison in Colorado. The media then decided that, in essence, the story was over. Case closed. No need to cover Kaczynski or his troubling ideas ever again.

By 2003, I suspected he was still actively researching and writing, but I had heard nothing of substance about him in years. So I decided to write to him personally, hoping to get some follow-up material that might be useful in my new course. Fortunately, he replied. That began a long string of letters, all on the problem of technology. To date, I’ve received something over 100 letters from him.

Most of the letters occurred in the few years prior to, and just after, the publication of Technological Slavery. Several of his more important and detailed replies to me were included in that book—about 100 pages worth.

We’ve had less occasion to communicate in the past couple years. My most recent letter from him was in late 2014.

You have said that his ideas “threaten to undermine the power structure of our technological order. And since the system’s defenders are unable to defeat the ideas, they choose to attack the man who wrote them.” Can you expand on that?

The present military and economic power of the US government, and governments everywhere, rests on advanced technology. Governments, by their very nature, function to manipulate and coerce people—both their own citizens, and any other non-citizens whom they declare to be of interest. Governments have a monopoly on force, and this force is manifest through technological structures and systems.

Therefore, all governments—and in fact anyone who would seek to exert power in the world—must embrace modern technology. American government, at all levels, is deeply pro-tech. So too are our corporations, universities, and other organized institutions. Technology is literally their life-blood. They couldn’t oppose it in any substantial way without committing virtual suicide.

So when a Ted Kaczynski comes along and reminds everyone of the inherent and potentially catastrophic problems involved with modern technology, “the system” doesn’t want you to hear it. It will do everything possible to distort or censor such discussion. As you may recall, during the final years of the Unabomber episode, there was very little—astonishingly little—discussion of the actual ideas of the manifesto. Now and then, little passages would be quoted in the newspapers, but that was it; no follow-up, no discussion, no analysis.

Basically, the system’s defenders had no counterarguments. The data, empirical observation, and common sense all were on the side of Kaczynski. There was no rational case to be made against him.

The only option for the defenders was an ad hominem attack: to portray Kaczynski as a sick murderer, a crazed loner, and so on. That was the only way to ‘discredit’ his ideas. Of course, as we know, the ad hominem tactic is a logical fallacy. Kaczynski’s personal situation, his mental state, or even his extreme actions, have precisely zero bearing on the strength of his arguments.

The system’s biggest fear was—and still is—that people will believe that he was right. People might begin, in ways small or large, to withdraw from, or to undermine, the technological basis of society. This cuts to the heart of the system. It poses a fundamental threat, to which the system has few options, apart from on-going propaganda efforts, or brute force.

What do you think of the fact that when our government, or any figure in authority such as a police officer, kills in the name of the established belief system, it is thought of as just, but when a guy like Kaczynski kills in the name of his belief system, he is thought of as a deranged psychopath?

As I mentioned, governmental authorities have a monopoly on force. Whenever they use it, it is, almost by definition, ‘right.’ Granted, police can be convicted of ‘excessive force.’ But such cases, as we know, are very rare. And militaries can never be so convicted.

At best, if the public is truly appalled by some lethal action of our police or military, they may vote in a more ‘pacifist’ administration. But even that rarely works. People were disgusted by the war-monger George W. Bush, and so they voted in the “anti-war” Obama. Ironically, he continued on with much the same killing. And through foreign aid and UN votes, Obama continues to support and defend murderous regimes around the world. So much for pacifism.

Let’s keep in mind: Kaczynski killed three people. This was tragic and regrettable, but still, it was just three people. American police kill that many citizens every other day, on average. The same with Obama’s drone operators. Technology kills many times that number, every day—even every hour. Let’s keep things in perspective.

Kaczynski killed in order to gain the notoriety necessary to get the manifesto into the public eye. And it worked. When it was published, the Washington Post sold something like 1.2 million copies that day—still a record. He devised a plan, executed it, and thereby caused millions of people to contemplate the problem of technology in a way they never had before.

Does the end justify the means? It’s too early to tell. If Kaczynski’s actions ultimately have some effect on averting technological disaster, there will be no doubt: his actions were justified. They may yet save millions of lives, not to mention much of the natural world. Time will tell.

You recently wrote a book, The Metaphysics of Technology. Can you tell us a little about that?

Sure. In thinking about the problem of technology, it struck me that there was very little philosophical analysis about what, exactly, technology is. We’ve had many action plans, ranging from tepid and mild (think Sherry Turkle), to Bill Joy’s thesis of “relinquishment” of key technologies, to Kaczynski’s total revolution. But if we don’t really understand what we’re dealing with, our actions are likely to be misguided and ineffectual. In short, we need a true metaphysics of technology.

On my view, technology advances with a tremendous, autonomous power. Humans are the implementers of this power, but we can’t really guide it and we certainly can’t stop it. In effect, it functions as a law of nature. It advances with an evolutionary force, and that’s why we are heading toward disaster.

I see technology much as the ancient Greeks did—as a combination of two potent entities, Technê and Logos (hence ‘techno-logy’). For them, these were quasi-divine forces. Logos was the guiding intelligence behind all order in the universe. Technê was the process by which all things—manmade and otherwise—came into being. These were not mere mythology; they were rational conclusions regarding the operation of the cosmos.

Like the Greeks, I argue that technê is a universal process. All order in the universe is a form of technê. Hence my coining of the term ‘Pantechnikon’—the universe as an orderly construction, manifesting a kind of intelligence, or Logos. Our modern, human technology is on a continuum with all order in the universe. (Harvard astrophysicist Eric Chaisson has argued for precisely the same point, incidentally; see his 2006 book Epic of Evolution.)

The net effect of all this is not good news for us. Technology is like a wave moving through the Earth, and the universe. For a long while, we were at the peak of that wave. Now we’re on the downside. Technology is rapidly heading toward true autonomy. Our opportunity to slow or redirect it is rapidly vanishing. If technology achieves true autonomy—we can take Kurzweil’s singularity date of 2045 as a rough guide—then it’s game over for us. We will likely either become more or less enslaved, or else wiped out. And then technology will continue on its merry way without us.

This is not mere speculation on my part, incidentally. Within the past year, Stephen Hawking, Elon Musk, and Bill Gates have all come out with related concerns. They don’t understand the metaphysics behind it, but they’re seeing the same trend.

How has your experience communicating with Kaczynski changed you as a person and as a philosopher?

As a philosopher, not that much. Kaczynski generally avoids philosophy and metaphysics, preferring practical issues. In a sense, we are operating on different planes, even as we are working on the same problem.

As a person, I have a greater understanding of the basis for the ‘extreme’ actions that he took. It’s not often in life that you get a chance to communicate with someone with such a total commitment to their cause. It’s impressive.

Also, the media treatment of his whole case has been enlightening. When his book, Technological Slavery, came out in 2010, I expected that there would be at least some media coverage. But there was none. The most famous “American terrorist” publishes a complete book from a super-max prison—and it’s not news? Seriously? Compare this topic to the garbage shown on our national evening news programs, and it’s a joke. NPR, 60 Minutes, Wired magazine, etc.—all decided it wasn’t newsworthy. Very telling.

One last thing: Expect to hear from Kaczynski again soon. His second book is nearing completion. The provisional title is “Anti-Tech Revolution: Why and How.” But don’t look for it on your evening news.

Things are not as simple as you think.


Buy Technological Slavery, by Ted Kaczynski, and The Metaphysics of Technology by David Skrbina. Kaczynski does not profit from his book.

Brian Whitney’s latest book is Raping the Gods.

Related articles: Ted and the CIA Part 1 & 2 by David Kaczynski

http://blog.timesunion.com/kaczynski/ted-and-the-cia-part-1/271/

http://blog.timesunion.com/kaczynski/ted-and-the-cia-part-2/285/

The Internet Doesn’t Exist


By Jacob Silverman

Source: The Baffler

The Internet has been very busy. In just the last week, Caitlyn Jenner broke the Internet, but she also united it. The FCC made war on the Internet. The Internet shamed a couple. The Internet had a dark side, Nikki Finke was barred from the Internet, the Supreme Court made the Internet less safe for women, the Internet named a famous fetus, the Internet did stuff with a superhero movie, and the Internet changed. A girl also won the Internet, Jack White had a difficult history with the Internet, and the Internet “shafted” a Canadian journalist.

“The Internet” is the universal straw man, a hero or villain for every occasion. The Internet, the Internet, the Internet—this decentralized communications network has long been granted a proper noun and practically a degree of sentience. Yet few people talk about “the Telephone” as if it were some person or place, though perhaps they once did. This eagerness to grant the Internet some degree of autonomy—to make it into an actor, an entity—stems in part from its apparent abstraction. Where does all this information come from? As Ray Bradbury famously said, “To hell with you and to hell with the Internet. It’s distracting. It’s meaningless; it’s not real. It’s in the air somewhere.”

Bradbury wasn’t just slipping into kneejerk techno-fear. He was also guilty of the same fallacy that crops up again and again in digital journalism: the assumption that the Internet is some monolithic mass, a discrete population or interest group. “It’s distracting,” Bradbury said, without specifying what “it” was.

But in another, more important way, Bradbury was absolutely right: the Internet doesn’t exist.

A couple years ago, Rachel Law, a grad student at Parsons at the time, had this to say: “The ‘Internet’ does not exist. Instead, it is many overlapping filter bubbles which selectively curate us into data objects to be consumed and purchased by advertisers.” As she also said, a bit less academically, “Browsing is now determined by your consumer profile and what you see, hear and the feeds you receive are tailored from your friends’ lists, emails, online purchases, etc.”

What we call the Internet—and what web writers so lazily draw on for their work—is less a hive mind or a throng or a gathering place and more a personalized set of online maneuvers guided by algorithmic recommendations. When we look at our browser windows, we see our own particular interests, social networks, and purchasing histories scrambled up to stare back at us. But because we haven’t found a shared discourse to talk about this complex arrangement of competing influences and relationships, we reach for a term to contain it all. Enter “the Internet.”

The Internet is a linguistic trope but also an ideology and even a business plan. If your job is to create content out of (mostly) nothing, then you can always turn to something/someone that “the Internet” is mad or excited about. And you don’t have to worry about alienating readers because “the Internet” is so general, so vast and all-encompassing, that it always has room. This form of writing is widely adaptable. Now it’s common to see stories where “Facebook” or “Twitter” stands in for the Internet, offering approval or judgment on the latest viral schlock. Choose your (anec)data carefully, and Twitter can tell any story you want.

We fall back on “the Internet” because it gives us a rhetorical life raft to hang onto amidst an overwhelming tide of information or a piece of sardonic shorthand to utter with a wink and a grimace, much like “never read the comments.” It also reflects a strange irony about today’s culture: despite being highly distributed, and despite offering an outlet for every subculture and niche interest and political quirk, what we think of the Internet often does feel rather uniform and monolithic.

This impression is partly based in fact; the tech and media industries are currently undergoing a kind of recentralization, exemplified by the rise of massive platforms like Facebook and recent mega-deals, such as Verizon buying AOL or Charter Communications (who?) snapping up Time Warner Cable. Attention is increasingly being manipulated and auctioned off by a handful of big conglomerates. The relegation of Twitter to also-ran in the social media sweepstakes—the loser to Facebook in the rush to industry monopoly—also reflects this centralization. That a company with hundreds of millions of users can seem like a failure only shows how bad the market is at apportioning value. (But there I go falling into abstractions again—as if there is anything called “the market.”)

“The Internet” is easy, a convenient reference point and an essential concept for web journalists tasked with surfacing monetizable content from this great informational morass. Digital culture, or writing about “what people are talking about on the Internet,” is considered its own beat now. But in the same way that someone born in the 1980s might not think of himself as a millennial—an arbitrary distinction crafted by demographers and marketers—a user of an online service is not necessarily from, or part of, the Internet. Even some of the subcultures often held up as part of the Internet are mostly notional. Is “Black Twitter” a specific, homogenous entity, as it’s so often described in news coverage? Or is it more something that people do, a set of social relations acted out by varying groups of mostly black Twitter users?

The more we write about what takes place online as if it occurred in some other world, the more we fail to relate this communication system, and everything that happens through it, to the society around us. To understand the Internet, we have to destroy it as an idea.

Politics as therapy: they want us to be just sick enough not to fight back


By Michael Richmond

Source: Transformation

10 October is World Mental Health Day. I used to be outgoing, but a descent into crushing depression left me housebound. After Occupy, I started asking: how does social environment shape our psychology?

I used to buy the Sun newspaper. Not just to fit in with mates at secondary school but right into my first year at university. I knew there was something to be ashamed of in this filthy habit, armed as I was with my oft-deployed excuse: “I only buy it for the crossword and the football transfers.”

This was true. I never read the news. In general, I lived a remarkably apolitical existence. This was some feat considering I have a Jewish communist great-grandfather, socialist grandparents, a union lawyer dad and an older brother who went through his Che Guevara phase at around fifteen.

I dropped out of university in early 2007, five months before the Northern Rock bank hit the skids. Who knows whether the student experience would have politicised me? Perhaps the process would have been helped along by the backdrop of the approaching financial crisis.

But something else politicised me instead: a crushing, rapid descent into depression, social wilderness and personal crisis.

I experienced anxiety and depression as a hostile takeover of my life and sense of self. I went from being outgoing and sociable to being unable to talk to people or leave the house. This was within the space of a few days. There was no discernible cause.

It was quickly clear that I couldn’t continue at university and so I moved back into my parents’ house, where I have lived ever since.

Several years of isolation, suicidal thoughts and internal struggle followed. I remained unable to escape the confines of my bullying psyche, let alone my house.

Unable to work or study, have friendships, or experience joy, reading became my true love, my source of meaning, my attempt to make sense of what had happened to me. I obsessively read classic literature, history, philosophy, political economy – I had felt a profound sense of loss at not being able to finish university. I became determined that I would instead educate myself.

But an impenetrable sense of terror and despair continued to accompany me through my every waking and sleeping hour. I began to work my way through an impressive list of psychotropic medications and psychotherapies and eventually attended an NHS psychiatric day hospital for six months.

A “service user” within the psychiatric system gains a unique insight into, and a practical education in, state discipline, as well as the lengths to which the state goes to enforce normativity. Having grown up white, straight, male and middle class, I was privileged to rarely, if ever, be told that I had to be something other than what I was.

I seldom encountered gross injustice or violence, blatant discrimination, or the kind of treatment faced from the earliest ages if you happen to be a person of colour, don’t fit a gender binary or don’t adhere to accepted ideals of sexual behaviour.

Apart from being a non-religious Jew and encountering minimal levels of playground anti-Semitism, this was the first time I found myself in a situation of social and political ostracism (as well as a self-ostracism that proved just as powerful). I discovered for myself that the experience of the personal deeply informs the political.

Leaving the psychiatric day hospital to instead attend the asylum of Occupy the London Stock Exchange at St Paul’s Cathedral was in many ways a descent into further madness. Many “occupiers” were well acquainted with psychiatric services and medications – as well as using drugs not sanctioned by the state, but often taken for similar reasons.

Chaotic, naïve, and ultimately politically problematic and ineffectual, the initial occupied space did nevertheless open up the possibility for social and political interaction that is elsewhere absent from society.

I felt that I was in crisis, but also that the crisis was much bigger than just me. Getting involved in political praxis seemed to be the best way to channel what I was experiencing.

There is a lot to be said for the practice of “politics as therapy.”

The personal account or “journey” format often proves insufficient when attempting to understand what we do and why we do it. An analysis of political subjectivity is crucial. Shifts in capitalist expansion, social environment and class composition, technological development and the onset of crises tend to precipitate political transformation on an individual and collective basis.

The advent of the printing press or the collapse of the automotive industry in mid-west America, for example, are not external factors to people’s lives or isolated moments in history. Indeed, any such upheaval is bound to lead to transformative changes in the lives and political ideation of those experiencing it.

Our social environment shapes our psychology. We must consider how the policy, ideology and debate that surrounds “mental health” or madness is framed.

The individualisation of suffering is key to the prevailing ideology and discourse surrounding mental illness. This discourse often focuses on a supposed misfiring of brain chemicals, for which a “cure” can be found in the form of pharmaceuticals – often prescribed by your GP before any contact with mental health services.

Attention may also turn to an individual’s lack of positive attitude, a problem that can supposedly be “fixed” by a six-week course of cognitive behavioural therapy. So much human suffering is pathologised and medicated when it is either “natural” (i.e. grief, or the general variety of mental experience) or is directly or indirectly linked to social, political and economic factors that remain absent from debate, let alone actively contested on this terrain.

Psychologist and author Bruce E. Levine suggests that much of today’s intervention under the auspices of “mental health” is all too political.

“What better way to maintain the status quo,” Levine asks, “than to view inattention, anger, anxiety, and depression as biochemical problems of those who are mentally ill rather than normal reactions to an increasingly authoritarian society?”

He also argues that many potential activists and “natural anti-authoritarians” are prevented from opposing power: “Some activists lament how few anti-authoritarians there appear to be in the US. One reason could be that many natural anti-authoritarians are now psychopathologised and medicated before they achieve political consciousness of society’s most oppressive authorities.”

The historical origins of madness within western culture and how it became increasingly medicalised should not be forgotten. Michel Foucault exposed how the origins of “confinement” of the “insane” in asylums and workhouses were an integral part of the violent replacement of the feudal commons way of life with capitalist work discipline during the 16th and 17th centuries.

This process is in keeping with continual “primitive accumulation”, akin to and contemporary with the conquest of the “New World” and the persecution of heretics and witches. The land and means of reproduction of the common people were stolen and appropriated, while the authorities continually oppressed them and attempted to proletarianise them.

Initially, the “Great Confinement” saw the imprisonment of the old, the unemployed, the “criminal” and the “insane.”

As Foucault explains: “Before having the medical meaning we give it, or that at least we like to suppose it has, confinement was required by something quite different from any concern with curing the sick. What made it necessary was an imperative of labour. Our philanthropy prefers to recognise the signs of a benevolence toward sickness where there is only a condemnation of idleness.”

The conflation of pejoratives like lazy, sick, unemployed and idle is more than familiar in today’s discourse surrounding welfare benefits and the imperatives of labour. And it is not just the DWP and Atos who pressure people back into work; NHS psychiatric services also seem to believe that it is work that sets you free.

The capitalist class would like us to be just sick enough not to fight back, but not so sick that we cannot work. The challenge for us is to find ways of organising and helping each other so that we can find adequate levels of social reproduction, care and support to give us a platform to engage in the therapy of class struggle.

 

Saturday Matinee: The Fuck-It Point


Synopsis by Savage Revival:

A film about civilization, why we should bring it down and why most civilized people don’t.

‘When you have had enough. When you decide to take matters into your own hands and don’t care what’s going to happen to you. When you know that from now on you will resist with whatever tactic you think is most effective.’

Now Streaming: The Plague Years


By A. S. Hamrah

Source: The Baffler

When things are very American, they are as American as apple pie. Except violence. H. Rap Brown said violence “is as American as cherry pie,” not apple pie. Brown’s maxim makes us see violence as red and gelatinous, spooned from a can.

But for Brown, in 1967, American violence was white. Explicitly casting himself as an outsider, Brown said in his cherry pie speech that “violence is a part of America’s culture” and that Americans taught violence to black people. He explained that violence is a necessary form of self-protection in a society where white people set fire to Bowery bums for fun, and where they shoot strangers from the towers of college campuses for no reason—this was less than a year after Charles Whitman had killed fourteen people that way at the University of Texas in Austin, the first mass shooting of its kind in U.S. history. Brown compared these deadly acts of violence to the war in Vietnam; President Lyndon B. Johnson, too, was burning people alive. He said the president’s wife was more his enemy than the people of Vietnam were, and that he’d rather kill her than them.

Brown, who was then a leader of the Student Nonviolent Coordinating Committee and who would soon become the Black Panther Party’s minister of justice, delivered a version of this speech, or rant, to about four hundred people in Cambridge, Maryland. When it was over, the police went looking for him and arrested him for inciting a riot. Brown’s story afterward is eventful and complicated, but this is an essay about zombie movies. Suffice it to say, Brown knows about violence. Fifty years after that speech, having changed his name to Jamil Abdullah al-Amin, he’s spending life in prison for killing a cop.

The same day Brown was giving his speech in Maryland, George A. Romero, a director of industrial films, was north of Pittsburgh in a small Pennsylvania town called Evans City. Romero was shooting his first feature film, a low-budget horror movie in black and white called Night of the Living Dead. Released in October 1968, the first modern zombie movie tells the story of a black man trying to defend himself and others from a sudden plague of lumbering corpses who feed on the living. At the film’s end, he is unceremoniously shot and killed by cops who assume he is a zombie trying to kill them. The cops quickly dispose of his body, dumping it in a fire with a heap of the undead, as a posse moves on to hunt more zombies.

Regional gore films were nothing new in themselves; a number had appeared earlier in the 1960s. Night of the Living Dead, with its shambling, open-mouthed gut-munchers dressed in business suits and housecoats, might have seemed merely gross or oddly funny in a context other than the America of 1968. But Martin Luther King Jr. had been assassinated six months before its release. The news on TV, which most people still saw in black and white, consisted largely of urban riots and war reports from Vietnam. The My Lai Massacre had occurred the month before King was shot.

Romero’s film, seen in the United States the year it came out, had more in common with Rome Open City than it did with a drive-in horror movie made for teens—it was close to a work of neorealism. And it was unfunny and dire, much like John Cassavetes’s Faces, released the same year, whose laughing drunks stopped laughing when they paused to look in the mirror. Romero was a revisionist director of horror in the same way that Peckinpah and Altman were in their career-making genres, the western and the war movie.

Romero cast an African American in the lead, and he shifted the horror genre’s dynamic, aligning it with black-and-white antiwar documentaries like Emile de Antonio’s In the Year of the Pig, also released in 1968, and distinguishing it from the lurid color horror films Roger Corman and Hammer Films had been turning out up till then. Those films made certain concessions to the film industry; Night of the Living Dead did not. This was an American horror movie, so it needed no English accents or familiar character actors. It was grim and unflinching, showing average citizens, played by average people, eating the arms and intestines of their fellow townsfolk. Romero drove home this central point—that a zombie-infested America differed from the status quo only in degree, not in kind—by ending his film with realistic-looking fake news photos depicting his characters’ banal atrocities.

Mainstream film reviewers, including Roger Ebert, were shocked and disgusted by Night of the Living Dead. They discouraged people from seeing it, but Romero’s images proved to be indelible. The film’s reputation grew. In 1978 Romero made the film’s first sequel, Dawn of the Dead, this time in color. Today, if there’s one thing every American knows, it’s that zombies can only be killed with a shot to the head. This is common knowledge, cultural literacy, a kind of historical fact, like George Washington chopping down the cherry tree. American-flag bumper stickers assert that “these colors don’t run,” but one of them does. It runs like crazy through American life, through American movies, and now TV, like a faucet left on.

Dead Reckonings

The Huffington Post has had a Zombie Apocalypse header since 2011, under which the editors file newsy blog posts chronicling our continuing fascination with zombie pop culture, alongside any nonfiction news story horrible enough to relate to zombies or cannibalism. The infamous Miami face-eater attack of May 2012, which the media gleefully heralded as the start of a “real” zombie apocalypse, contributed to America’s sense that it could happen here, provided we wished for it hard enough. Reading through the Zombie Apocalypse posts, one gets a growing sense that we want the big, self-devouring reckoning to happen because it is the one disaster we are truly mentally prepared for. It won’t be the total letdown of the Ebola scare.

The face-eating incident was initially linked to bath salts: ground-up mineral crystals everyone hoped would become the new homemade drug of choice for America’s scariest users. It turned out the perpetrator, although naked, was only high on marijuana. He was black, killed by the police as he gouged out his homeless victim’s eyes and chewed his face on a causeway over Biscayne Bay. The incident was captured on surveillance video. Here in the golden age of user-generated content, the zombie movies self-generate—much like zombies themselves. The bridge backdrop of this all-too-real zombie vignette neatly summed up both the crumbling condition of America’s infrastructure and our more generalized state of neoliberal collapse.

The zombie apocalypse, our favorite apocalypse, seems to unite the right and left. It combines the apocalypse brought about by climate change and the subsequent competition for scant resources with the one loosed by secret government experiments gone awry. Better still, both of these scenarios, as we’re typically shown in graphic detail, will necessitate increased gun-toting and firearms expertise.

More than that, the fast-approaching zombie parousia allows us to indulge our fantasies of a third apocalypse, one that only the most clueless don’t embrace: the consumerist Day of Judgment, in which we will all be punished for being fat and lazy and living by remote control, going through our daily routines questioning nothing as the world falls apart and we continue shopping. Supermarkets and shopping carts, malls and food warehouses all figure prominently in the iconography of the post–Night of the Living Dead zombie movie, reminding us that even in our quotidian consumerist daze, we are one step away from looting and cannibalism, the last two items on everyone’s bucket list.

Still, for all its galvanizing power to place all of humanity on the same side of the cosmic battlefront, the zombie apocalypse, like all ideological constructs, manages to cleave the world into two camps. One camp gets it and the other doesn’t. One is aware the apocalypse is under way, and the other is blithely oblivious to the world around it.

To confuse matters further, people move in and out of both camps, becoming inert, zombified creatures when obliviousness suits their mood. People blocking our progress on the street as they natter into their hands-free earsets stare straight ahead, refusing to admit that other people exist. At least they don’t bite us as we flatten ourselves against walls to pass them without contact. A paradox of the ubiquity of zombie-themed pop culture is how there are surely next to no people left who have not enjoyed a zombie movie, TV show, book, or videogame, yet there are more and more people shuffling around like extras in a zombie film, moving their mouths and making gnawing sounds.

The smartphone-based zombification of street life is a strange testament to Romero’s original insight, which becomes more pronounced as the wealth gap widens. The disenfranchised look ever more zombified to the rich, who in turn all look the same and act the same as they take over whole neighborhoods and wall themselves up in condo towers. This, indeed, is exactly what happens in Romero’s fourth zombie movie, 2005’s Land of the Dead, which predicted things as consequential as what happened during Hurricane Katrina in New Orleans and as minor as the rise of food trucks.

The Zombie Apocalypse is also a parable of the Protestant work ethic, come to reap vengeance at the end of days. It assures us that only very resourceful, tough-minded people will be able to hack it when the dead come back to life. If the rest had really wanted to survive—if they deserved to survive—they would have spent a little less time on the sofa. But here, too, the simple and obvious moral takes a perverse turn: the best anti-zombie combatants should be the ones who’ve watched the most zombie movies, yet by the very logic of our consumer-baiting zombie fables, they won’t be physically capable of survival because all they did was watch TV.

Selective Service

What these couch potatoes will need, inarguably, is the protection of a strong leader, one who hasn’t spent his life in the vain and sodden leisure pursuits that they’ve inertly embraced—Rick Grimes in The Walking Dead, for instance. Why such a person would want to help them is a question they don’t ask. With this search for an ultimate hero, the zombie genre has veered into the escapism of savior lust, leaving Romero’s unflinching, subversive neorealism behind. In Night of the Living Dead, a witless humanity is condemned by its own herd mentality and racism. In latter-day zombie fictions, a quasi-fascist social order is required, uniting us regardless of race, creed, or color.

The predicament of the characters (and the actors) in all the nouveau zombie movies relates to this passive consumerism. Both the characters and the actors in new zombie movies have to act like zombie films don’t already exist, even though the existence of Romero’s films is what permits the existence of the film they are in. Somehow, the characters pull their savvy out of thin air. They must pretend that they have never heard of zombies, even as they immediately and naturally know what to do once their own particular Zombie Apocalypse gets under way.

This paradox underscores the fantasy aspect of the twenty-first-century zombie infatuation, in which a fixed set of roles is available for cosplay in a repeatable drama that already took place somewhere else. The difference between Romero’s films and the new zombie movies is that the more time that passes since 1968, the more Romero’s films don’t seem like they were designed as entertainment—even as they are endlessly exploited by the zombie-themed cultural productions that copy them, and even as they remain entertaining. The new zombie films cannibalize Romero’s films in an attempt to remake them ideologically, so that we will stop looking for meaning in them and just accept the inevitable.

The Primal Hordes

A primal fantasy of the Zombie Apocalypse is that when the shit hits the fan, we will be able to kill our own children or parents. We won’t have a choice. The decision to get rid of the generation impeding us will have been made for us by the zombie plague, absolving us of responsibility. We are, after all, killing somebody who is already dead and who, in his or her current state, is a threat to our continued existence.

Against the generalized dystopian entertainment landscape that followed the economic collapse of 2008, the Zombie Apocalypse made more sense than ever. But YA action-drama dropped it in favor of promoting teen heroes who were stronger than their nice-but-loserish sad sack parents. This is the uplifting generational affirmation that imbues Suzanne Collins’s Hunger Games franchise and Veronica Roth’s Divergent trilogy.

YA comedy, on the other hand, did not ignore zombie movies. Instead, it domesticated the Zombie Apocalypse, making it friendly. Nonthreatening zom-coms showed young viewers how the opposite sex was really not that scary, that being in a couple was still the most important thing, and that dystopias gave nerds an unprecedented chance to prove they could get the girl or boy. Dystopia, it turns out, is really a best-of-all-possible-worlds scenario for starry-eyed-kids-with-a-disease, or so we learn from zom-coms like Warm Bodies and Life After Beth.

The latest iteration of this trend, which sets a zombie heroine in a marginally less dystopian world that mirrors our tentative economic comeback, is the CW TV show iZombie. The series is a brain-eating entertainment for tweens in which they learn you can be okay and have a chill job even if you’re a living corpse who’s just trying to figure things out. When a zombie gets her own tween-empowerment show on The CW, it’s a good indication that zombies don’t carry the stern, alien stigmas they used to. Zombies, much like corpses in TV commercials, are used as grotesque comic relief in things like animated Adult Swim shows. Such is the diminished status of the zombie; it is now a signifier that can be plugged in anywhere. To paraphrase the undead philosopher of capitalism’s own walking-dead demise: first time cannibalism, second time farce.

Reality Bites

The way zombie movies progress, with isolated groups splitting into factions and various elimination rounds as contestants disappear, suggests that Night of the Living Dead is also a secret source of reality TV. It makes sense, then, that 2009’s Zombieland, one of the first YA dystopian zombie entertainments, was penned by screenwriters who created The Joe Schmo Show and I’m a Celebrity . . . Get Me Out of Here!

Zombieland’s protagonist, a college-age dude played by Jesse Eisenberg, is a bundle of phobias, an OCD-style follower of rules who finds himself in a Zombie Apocalypse after an unexpected date with a hot girl out of his league (Amber Heard) goes wrong. Mentored by Woody Harrelson, who more or less reprised this same role in the Hunger Games movies, Eisenberg’s millennial character undergoes a reality-TV-scripted makeover. In expiation for his pusillanimity in the opening reel, he winds up rescuing a tough girl (Emma Stone) who also would have been out of his league in the pre-Apocalypse scheme of dating. Zombieland presents Eisenberg as gutless and Stone as ruthless, but she’s the one who ends up a hostage, and he becomes her hero. In fact, one of his rules, “Don’t be a hero,” changes on screen to “Be a hero,” as we once again learn that millennials really do have what it takes to kill zombies. Earlier in the film, Eisenberg accidentally shoots and kills a non-zombie Bill Murray, playing himself, showing that millennials can also, regretfully, take out Baby Boomers, including the cool ones who aren’t undead.

Edgar Wright’s 2004 Shaun of the Dead, the first movie zom-com, was a more intelligent version of this same storyline. An English comedy from the “Isn’t it cute how much we suck?” school, Wright’s film acquiesced to the coupling-up plot rom-coms require, but not without first presenting the routine, pointless daily life of its protagonist (Simon Pegg) as pre-zombified. Shaun of the Dead will likely remain the only sweet little comedy in which the protagonist kills his mother, a scene the film has the guts to play without flinching. The joke of Wright’s film is that it takes something as brutal as a zombie apocalypse to wake us from our stupor and to show us how good we had it all along. By the film’s end, Pegg and his girlfriend (Kate Ashfield) are in exactly the same place they were when the film started, but now at least they live together. A cover of the Buzzcocks’ song “Everybody’s Happy Nowadays” jangles over the credits, providing a zombified dose of circa-1979 irony.

Wright and Pegg’s goofy rethinking of the zombie movie proved how firmly zombies are entrenched in our consciousness, and how easy they are to manipulate for comedic effect. The same month Shaun of the Dead came out, a Hollywood remake of Romero’s Dawn of the Dead was released. It, too, cleaned up at the box office. This new Dawn of the Dead seemed like it was made by one of the nerds in the American zom-coms, a jerk desperate to prove he’s bad-ass. (The director now makes superhero movies.) Johnny Cash’s “The Man Comes Around” accompanies the opening credits, setting a high bar for artistic achievement the ensuing film does not come near to clearing. Jim Carroll’s “People Who Died” plays at the end—its placement there as repulsive as anything else in the film.

As all nouveau zombie films must, the remake starts in the suburbs, where a couple is watching American Idol in bed, underscoring the genre’s newfound connection to reality TV. The film’s CGI effects, which at the time injected a souped-up faux energy into the onscreen mayhem, dated instantly. They’re now the kind of off-the-rack effects featured in Weird Al videos when someone gets hit by a car.

The main point of this new Dawn of the Dead is that after the Zombie Apocalypse, people will spend their time barking orders at each other and calling each other “asshole.” The film nods in the direction of loving the military and the police, and totally sanitizes Romero’s use of a shopping mall as a site of consumerist critique. Like many films of the 2000s, it postulates that living in a mall wouldn’t be a Hobbesian dystopia at all; it would be rad. If the remake had been made five years later, maybe it would have had to grapple with the “dead malls” that began to adorn the American landscape with greater frequency after the economy collapsed. Instead, the mall serving as the film’s principal backdrop is spotless and fun. The remake’s island-set, sequel-ready false happy ending makes one long for the denouement of Michael Haneke’s Funny Games—a longing more unimaginable than any real-life wish-fulfillment fantasy about the Zombie Apocalypse actually coming to pass.

The American Way of Death

Fanboys liked the Dawn of the Dead remake and, inexplicably, so did many critics. Manohla Dargis, then at the Los Angeles Times, wrote that the film was “the best proof in ages that cannibalizing old material sometimes works fiendishly well,” a punny sentiment she might well walk back today.

The next year, when George A. Romero released his first new zombie film in twenty years, it did not fare as well in the suddenly crowded marketplace of the undead. While Land of the Dead (2005) is fittingly seen as something of a masterpiece now, on its initial release it puzzled genre fans, who had gotten used to the sort of “fast zombies” that were first featured in the nihilistic-with-a-happy-ending British movie 28 Days Later (2002). Romero’s new film was as trenchant as his others, but many fans weren’t having it.

IMDb user reviews provide a record of their immediate reactions. “This movie was terrible!” one wrote the month Land of the Dead premiered. “The storyline—can’t use the word plot as that would give it too much credit—was tedious! Some say it was a great perspective on class? Are you kidding me!!!” Less than a year into George W. Bush’s second term, Romero was archly depicting a society much different from the one he’d shown in Night of the Living Dead. This new society—today’s—was more class-riven, more opportunistic, more cynical. And Romero, even while moving in the direction of Hawksian classicism, was exposing these failings with radical acuity. His dark fable of two Americas at war over the control of the resources necessary to survive was concise, imaginative, and well constructed. Few at the time wanted to consider the film’s style, which seemed out of date compared to the Dawn of the Dead remake. Fewer still wanted to grapple with its implications.

Ten years later, it is clear that no American genre film from that period digests and exposes the Bush era more skillfully than Land of the Dead. Romero’s film was uncomfortably ahead of its time, and like his other zombie work, it hasn’t dated; it speaks of 2015 as much as 2005. Tightly controlled scenes avoid the pointlessness and repetition of the nouveau zombie films, limning class struggle in unexpected ways. Zombies, slowly coming to consciousness, use the tools of the trades from which they’ve been recently dispossessed to shatter the glass of fortified condos. A zombie pumps gas through the windshield of a limo. The rich commit suicide, only to come back to life as zombies and feed on their children. America, as the original-zombie-era Funkadelic LP taught us, eats its young.

As zombie fantasies go, these scenes are much richer than the random, unsatisfying mayhem of the nouveau zombie films. Romero, unlike his counterparts, does not shy away from race. He shows African Americans pushing back against the injustices and indignities of a militarized police state, thereby completing a circle that began with Duane Jones’s performance in Night of the Living Dead.

Walking Tall

For the latest generation of zombie enthusiasts, the zombie genre means just one thing: AMC’s massively popular cable series The Walking Dead. The show is so much better than any of the recent non-Romero zombie movies that it’s among the leading exhibits in the case against the cineplex. The show’s politics and implications are widely discussed, and The Walking Dead has engendered national debate about all sorts of ethical issues, including something Romero’s films raised only in the negative: America’s future. But the first problem The Walking Dead solved was how to make its own debates about these things interesting: whenever scenes get too talky, a “walker” sidles up and has to be dispatched in the time-honored fashion. At its core, the zombie drama is like playing “You’re it!” The show could be called Game of Tag.

The Walking Dead debuted in 2010, emerging from a period in U.S. history when, all of a sudden, we found ourselves in a junked, collapsed, post-American environment. New dystopian dramas, especially the YA ones, reflected this chastened reality. The Walking Dead looked at first like it might become just another placeholding entry in this cavalcade of glumness, much like TNT’s Spielberg-produced, families vs. aliens sci-fi show Falling Skies. Zombies were maybe the most dated way possible to dramatize our newly trashed world.

It was The Walking Dead’s dated qualities, however, that saved it from becoming cable TV’s Hunger Games. The show’s grunge aesthetic and majority-adult cast situated it elsewhere. And if that particular elsewhere felt like the past as much as the future, that was part of what made the show work for premium cable’s Gen X audience. Greg Nicotero, a makeup man who worked under Romero, is one of the show’s producers. His presence indicated the people behind the show took the genre seriously, unlike anyone else in Hollywood who had touched it.

Television works by imitating success, by zombifying proven formulas through a process called mimetic isomorphism. When television producers saw The Walking Dead’s ratings beating broadcast-network ratings—a first for cable drama—they took notice and began spawning. Copies of copies like Resurrection, The Last Ship, The Leftovers, and 12 Monkeys showed that plague is contagious, but it doesn’t have to be zombie plague. Meanwhile, The Walking Dead continues its success, and AMC will debut a companion series this summer, unimaginatively called Fear the Walking Dead.

If the worst zombie movies unselfconsciously imitate higher-gloss broadcast-network reality trash like Survivor, The Walking Dead succeeds by staying closer to the lowest grade of cable-network reality TV. The world of The Walking Dead is closer to Hoarders than it is to Big Brother. Hoarders presents an America engulfed in mounds of trash that its psychologically damaged possessors can’t part with. Mounds of Big Gulp cups and greeting cards and heaps of car parts and instruction manuals overwhelm their homes, spilling into their yards. Shows like Storage Wars, Pawn Stars, and American Pickers present an America of valueless junk that maybe somebody can make a buck on—if only by televising it for our own lurid delectation. These shows are the opposite of pre-collapse valuation shows like Antiques Roadshow, in which the junk people had lying around proved to be worth more than they had imagined. The detritus of Hoarders is worthless, the kind of trash that will blow around everywhere after the Zombie Apocalypse.

Hoarders vs. Horde

In his recent book 24/7, an analysis of the end of sleep and our twenty-four-hour consumption-and-work cycle, Jonathan Crary writes that “part of the modernized world we inhabit is the ubiquitous visibility of useless violence and the human suffering it causes. . . . The act of witnessing and its monotony can become a mere enduring of the night, of the disaster.” Zombies, not quite awake but never asleep, are the living-dead reminders of this condition, stumbling through our fictions. When they are not transformed by the wishful thinking of ideology into our pals, they retain this status.

Celebrated everywhere, zombies are the opposite of celebrities, who swoop into our disaster areas like gods from Olympus to rescue us from the calamities that also allow them to flourish. Zombies, far from being elevated, descend into utter undistinguishable anonymity and degradation, which is why they can be destroyed in good conscience. Brad Pitt, one of the producers of ABC’s Resurrection, also starred in World War Z, the most expensive zombie movie ever made. The last line of that odious movie—the first neoliberal zombie movie—is “Our war has just begun.”

Whatever that was supposed to mean to the audience, these fables of the plague years drive home just who the zombies are supposed to be—and who, when the plague hits, will helicopter out holding the machine guns. Col. Kurtz’s faithful devotee from Apocalypse Now, Dennis Hopper, the counterculture hero who became a Republican golf nut, plays the leader of the remaining 1 percent in Land of the Dead. “We don’t negotiate with terrorists,” he says when he’s faced with the choice between his money and our lives.

Follow the trail of facts, hints, and allegations—connect the dots


By Edward Curtin

Source: Intrepid Report

“There were incidents and accidents / there were hints and allegations.” —Paul Simon

Children love to trace, to connect the dots, to make connections, but often the connections they make frighten adults who try to ignore their points or offer some ridiculous circumlocutions. Maybe we adults are much like children in our desires to make connections, but the thought of it frightens us.

Suppose we could for a while calm those fears and concentrate long enough to trace through the dim glimmerings of a faded pattern a clarifying story that would jolt us into an awareness that could change our lives and society. I offer here an arc of history that you may consider tedious. Try patience. I could yell, I could scream, I could try all the classical argumentation and logic that comes “naturally” to me. I could be a wise guy, amuse you, try to provoke you, curse, sing a song, stomp my feet—even write post-modern gibberish. As Andre Vltchek says, it’s hard—I’m putting it nicely—to get through, to have an impact that counts. We desperately want to believe in a world where we really are children and BIG Daddy (apologies to Burl Ives) has told the truth. Obviously I have reached some stern conclusions, but I think the conclusions follow from the facts. See what you think.

  • 1957, Massachusetts Senator John Kennedy delivers a Senate speech in support of the Algerian liberation movement, in support of African liberation generally, and against colonial imperialism. The speech causes an international uproar, and Kennedy is harshly attacked by Eisenhower, Nixon, John Foster Dulles, and even liberals such as Adlai Stevenson. He is praised in the third world.
  • 1959, George H. W. Bush moves his oil company—Zapata Offshore—to Houston, Texas. One of Zapata’s drilling rigs, Scorpion, having been moved from the Gulf of Mexico the previous year, is now operating 54 miles north of Cuba.
  • 1960. On March 17, President Eisenhower approves the Bay of Pigs project.
  • 1961. On January 17, in anticipation of Kennedy’s inauguration in three days, the Belgian government in complicity with the CIA assassinates Congolese nationalist leader Patrice Lumumba. On February 13, a devastated Kennedy receives a belated phone call informing him of Lumumba’s murder.
  • 1961, April. More than a week before the CIA-led Bay of Pigs invasion of Cuba—code-named the Zapata Operation—the CIA discovers that the Soviets have learned the date of the invasion and informed Castro. Knowing the invasion is doomed in advance, CIA Director Allen Dulles doesn’t tell Kennedy. When the invasion fails, the CIA blames JFK, who angrily says he wants “to splinter the CIA into a thousand pieces and scatter it to the winds.” Kennedy fires Dulles.
  • 1962. On June 13, Lee Harvey Oswald, ex-Marine and alleged traitor, returns from the Soviet Union with a loan from the State Department that also arranges for him, together with his Russian wife, to be met at the dock in Hoboken, New Jersey by Spas T. Raikin, an official of an anti-communist organization with extensive intelligence connections. Oswald soon moves to Dallas, Texas where, at the behest of the CIA, he is chaperoned around by CIA asset and George H. W. Bush’s old friend, George de Mohrenschildt.
  • 1963, June 10. JFK delivers his famous American University address calling for an end to “a Pax Americana enforced on the world by American weapons of war.”
  • 1963. On October 11, Kennedy issues National Security Action Memorandum 263 calling for the withdrawal of 1,000 American troops from Vietnam by the end of 1963 and all of them by the end of 1965.
  • 1963, November 2. At the last minute JFK cancels his trip to Chicago to attend the Army-Air Force football game when it is learned that a four-man rifle team has plotted to assassinate him. The four are never charged or named, but an alienated ex-Marine scapegoat with CIA connections, Thomas Arthur Vallee, is arrested on a pretext. Vallee works in a building overlooking a dog-leg turn where JFK’s car was to pass.
  • 1963, November 22. JFK is shot in Dallas on a dog-leg turn at 12:30 P.M. and dies at 1 P.M. At 1:38 P.M. Walter Cronkite makes the first public announcement of the president’s death. At 1:45 P.M. George H. W. Bush, who is in Tyler, Texas, an hour and a half southeast of Dallas, telephones Houston FBI agent Graham W. Kitchel to inform him that he’s heard gossip that a Houston man, James Parrot, has been talking about killing Kennedy when he comes to Houston (JFK had been in Houston the day before). Parrot is questioned and deemed harmless. Bush tells the FBI agent that he’ll be going to Dallas in the evening, though he fails to mention that he was there the night before. At 1:50 P.M. the Dallas police arrest Lee Harvey Oswald in the Texas Theatre and charge him with the murder of Dallas police officer J. D. Tippit. A few minutes after Oswald’s arrest and his exit out the front door to waiting police cars, a second Oswald is arrested in the theatre and surreptitiously taken out the back door. Later in the day Oswald is also charged with killing President Kennedy from behind, from the sixth floor of the Texas School Book Depository. But the fatal shot to Kennedy’s head comes from the left front.
  • 1963. Two days later, Jack Ruby kills Oswald, who claimed he was a patsy, in the Dallas police building. That same afternoon LBJ tells Henry Cabot Lodge that “I am not going to lose Vietnam.”
  • 1963, November 29. LBJ announces the formation of the Warren Commission, whose key member is Allen Dulles, the former CIA Director fired by Kennedy.
  • 1963. On December 24, Johnson tells the Joint Chiefs of Staff: “Just get me elected, and then you can have your war.”
  • 1964, August. The fraudulent Tonkin Gulf Incidents and Tonkin Gulf Resolution. Johnson orders the bombing of North Vietnam. The Vietnam War starts in earnest.
  • 1964, September. The Warren Commission findings are made public. Oswald is declared the lone assassin, with the magic bullet explanation being the key.
  • 1967. Martin Luther King delivers his Riverside Church speech—“A Time to Break Silence”—denouncing the Vietnam War and calling for opposition to it, while linking it to social and economic oppression at home.
  • 1968, April 4. Martin Luther King is assassinated in Memphis. The authorities blame it on James Earl Ray, a petty criminal loner.
  • 1968. On June 6 in Los Angeles, Senator Robert Kennedy, on the cusp of becoming the Democratic nominee for president, is assassinated. The accused lone assassin, Sirhan Sirhan, was standing in front and to the left of RFK. The autopsy shows Kennedy was killed by a bullet from behind and below that entered his head behind his right ear. Sirhan is subsequently convicted as the lone crazed gunman, despite many witnesses seeing a girl in a polka dot dress, with a male companion, running down the back stairs of the hotel shouting, “We shot him! We shot him! We shot Senator Kennedy.”
  • 1972, June 17. Five CIA employees and veterans of the Bay of Pigs operation are arrested inside the Watergate offices of the Democratic National Committee. Together with E. Howard Hunt (CIA) and G. Gordon Liddy, they are later indicted. The burglars are caught by a security guard who notices that these skilled undercover operatives have taped locks open from the outside so that the tape is showing.
  • The Watergate story is primarily reported by reporters Bob Woodward and Carl Bernstein who work at the Washington Post under Editor Ben Bradlee. Woodward had earlier served in Naval Intelligence, as had Bradlee, while Bradlee and the Washington Post have deep ties to the CIA and intelligence communities.
  • 1974, August 9. Nixon is forced to resign. He is the second president in eleven years to be removed from office. Gerald Ford, a former member of the Warren Commission, assumes the presidency. Dick Cheney is named White House Chief of Staff and Donald Rumsfeld, Secretary of Defense.
  • 1976, January 30. Having been nominated by Ford, George H. W. Bush assumes the directorship of the CIA, despite critics arguing that he has no intelligence experience. He serves in that capacity for 357 days.
  • 1976. George de Mohrenschildt, Oswald’s CIA chaperone and George H. W. Bush’s old friend, writes a letter to CIA Director Bush begging for help: “we are being followed everywhere. . . .”
  • 1977, March 27. George de Mohrenschildt, about to be questioned by the House Select Committee on Assassinations, allegedly commits suicide in Florida.
  • 1979, November 4. Fifty-two Americans are taken hostage in the U.S. Embassy in Tehran.
  • 1980. Ronald Reagan is elected president and George H. W. Bush, vice-president. It is later alleged that Bush, CIA officer Robert Gates, and CIA Director William Casey met secretly with Iranian officials in Paris before the election and made a secret deal to ensure Reagan/Bush an election victory by having the hostages held until after the vote. The hostages were subsequently released a few minutes after Reagan and Bush were sworn in on January 20, 1981.
  • 1985-88. The Iran-Contra scandal plays out as it is discovered that the Reagan administration was secretly selling arms to Iran in exchange for hostages and using the proceeds to illegally arm the anti-Sandinista rebels in Nicaragua in violation of the Boland amendment. Oliver North becomes the public face of the secret machinations while Reagan and Bush plead ignorance. Many are indicted, while Bush, when running for president in 1988, claims he was “out of the loop.”
  • 1988, July 3. The USS Vincennes shoots down the civilian airliner Iran Air Flight 655 in Iranian airspace, killing 290, including 66 children. Vice President Bush says, “I will never apologize for the U.S. I don’t care what the facts are . . . I’m not an apologize-for-America kind of guy.”
  • 1988, July 16. In the midst of the presidential campaign pitting Bush against Dukakis, The Nation magazine publishes an article by Joseph McBride, “The Man Who Wasn’t There, ‘George Bush,’ CIA Operative.” The article centers on a newly discovered memo from FBI Director J. Edgar Hoover, dated November 29, 1963, concerning the JFK assassination and an oral briefing the bureau had given on November 23 regarding the assassination to “Mr. George Bush of the Central Intelligence Agency.” A Bush spokesman denies it was candidate Bush.
  • 1988. George H. W. Bush is elected president.
  • 1990-91. President Bush attacks Iraq in what is called the Gulf War. Public and congressional support is given a huge boost by the testimony of a nurse who claims she witnessed Iraqi soldiers in a Kuwait City hospital grabbing babies out of incubators and throwing them on the floor to die. It is later discovered that the “nurse” in question was the daughter of the Kuwaiti ambassador to the United States and that she hadn’t lived in Kuwait at the time. Her story had been hatched by the Hill and Knowlton public relations firm and was a lie—a successful lie.
  • 1991, May 19. A few weeks after filming had begun on Oliver Stone’s movie, JFK, the Washington Post’s national security reporter George Lardner, Jr., writes a scathing review of the film based on a stolen copy of the first draft of the screenplay.
  • 1991, December 20. Stone’s film, JFK, is released.
  • 1991, December 24. President Bush grants pardons to six former members of the Reagan/Bush administration facing prosecution in the Iran-Contra scandal.
  • 1993-2000. President Bill Clinton bombs Iraq, Yugoslavia, Afghanistan, Sudan . . . killing untold numbers of people, while maintaining economic sanctions on Iraq.
  • 1996, May 12. On CBS’s 60 Minutes, Clinton’s U.N. Ambassador Madeleine Albright says that the deaths of over 500,000 Iraqi children as a result of the sanctions are worth it.
  • 1997. The Project for the New American Century, a neoconservative enterprise, three of whose signatories are Dick Cheney, Donald Rumsfeld, and Jeb Bush, is launched. Among other things, they call for the overthrow of Saddam Hussein in Iraq. Ten signees of the statement of principles go on to serve in the George W. Bush administration.
  • 1999. On April 26, CIA headquarters is named the George Bush Center for Intelligence in honor of former president George H. W. Bush, who served as CIA Director for 357 days.
  • 1999. A jury in Memphis, Tennessee returns a verdict in a civil trial brought by Martin Luther King’s family concluding that King was killed, not by James Earl Ray, but by a conspiracy involving agencies of the U. S. government and the Memphis police.
  • 2000, September. The Project for the New American Century releases a position paper, “Rebuilding America’s Defenses,” stating that the United States will not be able to enforce its will on Iraq, Iran, Syria, and Afghanistan and maintain a Pax Americana “absent some catastrophic and catalyzing event—like a new Pearl Harbor.” The paper introduces a new word to refer to the United States of America—“the homeland.”
  • 2000, November. George W. Bush is elected president after a disputed ballot count and the intervention of the Supreme Court. Dick Cheney becomes vice-president and Donald Rumsfeld is named secretary of defense.
  • 2001, May 1. George W. Bush gives a major foreign policy speech at the National Defense University and says that the U.S.A. must be willing to “rethink the unthinkable,” giving public notice that the U. S. planned to withdraw from the ABM treaty. He warns against “weapons of mass destruction” and “weapons of terror” in the hands of rogue actors. The speech closely follows the reasoning of the PNAC paper of the previous year in urging an aggressive foreign policy. Cheney and Rumsfeld are in the audience.
  • 2001, June 22-23. Exercise Dark Winter takes place at Andrews Air Force base. The scenario involves anonymous threatening letters sent to mainstream media. The letters threaten more letters to come with anthrax. Judith Miller, author of Germs, and a notoriously deceptive Iraq war hawk for The New York Times, participates, playing Judith Miller of the New York Times.
  • 2001, September 11. The terrorist attacks in NYC and Washington, D.C. occur. The media immediately starts referring to them as another Pearl Harbor, a new Pearl Harbor. CBS News reports that before going to bed that night George W. Bush wrote in his diary, “The Pearl Harbor of the 21st century took place today.” The site of the Twin Towers is first referred to as “ground zero,” a nuclear war term, by Mark Walsh, identified as a freelancer for Fox News by the Fox News interviewer on the street of lower Manhattan. Presciently anticipating the official explanation for the buildings’ collapse, Walsh adds that the towers obviously collapsed “mostly due to structural failure since the fires were too intense.”
  • 2001, September 12. The New York Times headlines a story: “Personal Accounts of a Morning Rush that Became the Unthinkable.” Another headline under the byline of future editor Bill Keller, Iraq war hawk, reads, “America’s Emergency Line: 9/11.” The endless emergency and war on terror begin. Henceforth, for the first time in American history, a very important day is referred to by numbers, not by name—an emergency phone number.
  • 2001, September 22. Tom Ridge is named director of the newly created Office of Homeland Security and takes charge of politically motivated terror alerts.
  • 2001, September-October. Real and fake anthrax attacks occur. A sham investigation follows, with the FBI eventually accusing government scientist Bruce Ivins on little to no evidence, resulting in Ivins’s alleged suicide.
  • 2001. Throughout the first three weeks of October the major media use the word “unthinkable” repetitively, echoing its association with nuclear war, just as the World Trade Center site is similarly referred to as “ground zero,” another nuclear term. A phony “anthrax” letter containing a harmless white powder, postmarked in St. Petersburg, Florida, on September 20, is sent to Tom Brokaw of NBC. The letter, not made public until October 22, after the media’s repeated use of the word “unthinkable,” begins: “The Unthinkabel” Sample Of How It Will Look. Judith Miller of the New York Times receives an anthrax threat letter also sent from St. Petersburg.
  • 2001, October 7. The U.S. attacks Afghanistan.
  • 2001, October 26. The Patriot Act is signed into law.
  • 2001, December 4. George W. Bush says when he was outside the classroom in Florida on September 11, he “had seen this plane fly into the first building. There was a TV set on. . . .” Problem: No one saw the first plane hit the North Tower since it wasn’t televised live. Much later a tape someone had made was shown on television.
  • 2002, October 2. At the Cincinnati Museum Center President Bush gives a speech linking Saddam Hussein to the September 11 attacks and says that “we cannot wait for the final proof—the smoking gun—that could come in the form of a mushroom cloud.” He urges the disarming of Iraq.
  • 2002-10. Regular color-coded terrorist alerts.
  • 2003, February. Secretary of State Colin Powell gives false testimony at the U.N., asserting that Iraq possesses chemical and biological weapons of mass destruction and must be confronted.
  • 2003, March. The U. S. attacks Iraq based on lies.
  • 2003-8. Bush wages war on Iraq, Afghanistan, etc. Homeland “security” leads to indefinite detention, black sites, torture, spying on Americans, the loss of constitutional rights, etc.
  • 2007, February 10. Barack Obama, having been a U.S. Senator for 2 years, 1 month, announces he is running for president.
  • 2008, September. An international financial meltdown occurs. The government claims it was unforeseen. The Bush administration bails out the big banks and financial institutions.
  • 2008, November. A seriously inexperienced Senator from Illinois, Barack Obama, comes out of nowhere to be elected president on a populist platform of “hope” and “change.” He receives more backing from Wall Street than his Republican rival. Liberals and progressives go wild with joy.
  • 2009. Lawrence Summers, Treasury Secretary under Clinton, takes up his position as head of Obama’s economic team. Timothy Geithner, former head of the New York Federal Reserve, whose father, Peter Geithner, oversaw the Ford Foundation’s programs in Indonesia developed by Obama’s mother, becomes Secretary of the Treasury. And Robert Gates, former CIA Director and George W. Bush’s Secretary of Defense, continues in that position for Obama.
  • 2009, March. Obama meets with the CEOs of fifteen big banks and tells them that “my administration is the only thing between you and the pitchforks. . . . I’m not out there to go after you. I’m protecting you.”
  • 2009. Obama intensifies the war on Afghanistan.
  • 2009, October 9. Obama is given the Nobel Peace Prize.
  • 2009, December. Obama sends 30,000 more American troops to Afghanistan, saying this “will bring this war to a successful conclusion.”
  • 2010. Obama vows to carry forward the Bush tax cuts for the richest Americans.
  • 2010 and ongoing. Obama chooses his drone war kill list every Tuesday; says the killing of American citizen Anwar al-Awlaki “is an easy one.”
  • 2011. Obama and partners attack Libya and brutally kill Muammar Gaddafi. Libya descends into chaos.
  • 2009 and ongoing. Obama attacks Afghanistan, Iraq, Syria, Yemen, Libya, Sudan, Somalia, etc. Does nothing to stop the Israeli slaughter of Palestinians. Supports and arms terrorists in Syria and other countries. Engineers a coup d’etat in Ukraine and supports neo-Nazi forces attacking eastern Ukraine. Encircles Russia with NATO troops and military exercises. Starts a new Cold War. Maintains military commissions and indefinite detention. Prosecutes more whistleblowers than all previous American presidents combined, but does not prosecute any banksters or torturers. Charges Edward Snowden, Thomas Drake, Jeffrey Sterling, Chelsea Manning, John Kiriakou, et al. with violating the 1917 Espionage Act. Acquiesces in the military coup against the democratically elected leader of Egypt, Mohamed Morsi, and in his subsequent imprisonment. Spies on Americans and other countries. Maintains a national state of emergency and the Patriot Act with minor adjustments. Prosecutes “the war on terror” initiated by George W. Bush. Rules over a technological, computerized war of killing all over the globe and a technological, computerized spying apparatus here at home. And does all this and more with a smile.

It should be clear from this small portion of events over the years that there is a connecting link: a bloody thread running through them, connecting key players and pointing to the ongoing presence of a secret structure that recruits its team to maintain this oppressive system. To see it should be child’s play. It is not an issue of either/or; we can’t explain how we have come to this terrifying situation of rule by a murderous, militarized national security apparatus serving the wealthy elites by concentrating on either individuals or structures alone. People such as Barack Obama, the Bushes, et al. don’t emerge from thin air (though in Obama’s case it seems that way, and some have speculated on his CIA links). These people grow out of a system that has cultivated and nurtured them. They become spokesmen for the secretive and powerful monied forces some call the Deep State. (The scholar Peter Dale Scott sees a hidden link between the JFK assassination, Watergate, Iran-Contra, and 9/11.) Spokesmen, yes, but executive spokesmen; they are not innocent victims; they are free executioners. People and ongoing structures are intertwined. Individuals count, but so do structures. We are now living within a structure of non-stop and almost total propaganda that individuals, with the help of alternative structures of communication such as alternative media, can penetrate and understand, but only if they are willing to trudge through the history that allows for context and the connecting of dots. In the end, it takes desire and work. Many individuals concluding alike can lead to change. Connect and be outraged.

The psychiatrist Allen Wheelis once wrote a brilliant little book called How People Change. His “childish” conclusion was that they change because they want to. Simple but true.

Edward Curtin is a sociologist and writer who teaches at Massachusetts College of Liberal Arts and has published widely.

The Tyranny of Time

By Mark Rockeymoore

Source: Sacred Space in Time

The days compound, vivid and expansive, experience manifesting as desire frustrated or fulfilled. Memories build personality; the “I” that is a minuscule component of the Self projected into the illusory 3-D world takes on a solidity that further materializes error, identification with the world being the causative factor in the inevitable descent into purgatory. The past “I Was” informs the future “I Will Be,” and the present “I Am” is lost in a miasma of regrets and worries, projecting backwards and forwards in time to the detriment of the Now.

What is this remorseless construct that both fulfills and detracts from existence, creating imaginary realities that encompass the entirety of individuated existence? Perception is linear: the future seems to follow the past in successive streams of memory, which builds upon itself, solidifying pathways of thought which become the “I” that experiences the world. Faint memories of the near and distant past arise in the fathomless ocean of consciousness, informing the present moment and constricting choice along pathways of potentiality determined by previous experience. “We will be what we were” is the seemingly logical rationalization that results in the false determination that we cannot change and that the path we have always followed is the only path available now and in the future.

What is lost in this seemingly simple logical formulation is that reality encompasses all possibilities and that who we were is not necessarily who we must become. The physical manifestations of thought include words and actions that proceed from non-material reality to material reality. Thoughts, as sub-vocal verbalizations and imagery, can originate both internally and externally. Thoughts can free the mind or imprison it. The weight we apply to our thoughts determines our ability to manifest our realities, with ingrained and repetitive thought processes taking precedence over new and original ones.

As human biology provides the physical hard drive for non-physical, energetic interactions with manifest and unmanifest reality, the brain’s synapses form the wires we thicken or thin with repetitive use. Energy flows easiest and most efficiently through well-defined pathways. Habits and patterns of thought, vocalization, and behavior follow. Time acts as a structure that encompasses the unknown infinite and eternal within the finite capacity of materiality. Its dictates reinforce holographic immaterialities and further reduce clarity of mind and intention.

Timelessness, as a function of higher perception, can be considered the casting off of illusory linearity as the primary mode of interpreting existence. Even while immersed within the moment-to-moment flow of seconds, minutes, hours, days, weeks and years, it is possible to transcend the limitations time places upon potentiality by recasting thought: by realizing that perception is not bound by the past or constrained by the future; that there are no limits to manifestation; that there are no limits to each of us being, fully, who and what we are.

To release one’s self from the Tyranny of Time is to realize that the Now moment is our access to communion with the infinite and eternal. That within each second lies a gateway to unlimited potentiality that is accessible to those who make the choice to court freedom and to banish the past and future as determinative factors in their decision-making processes. No path is set in stone unless we cast it thusly ourselves. Unlimited potentiality is our birthright. Claim yours.