Why the Drug War Has Been a Forty-Year Lynching


By Bob Fitrakis and Harvey Wasserman

Source: FreePress.org

The Drug War has been a forty-year lynching: the corporate/GOP response to the peace and civil rights movements.

It’s used the Drug Enforcement Administration and other policing operations as a high-tech Ku Klux Klan, meant to gut America’s communities of youth and color.

It has never been about suppressing drugs. Quite the opposite.

And now that it may be winding down, the focus on suppressing minority votes will shift even more strongly to electronic election theft.

The Drug War was officially born on June 17, 1971 (http://www.drugpolicy.org/new-solutions-drug-policy/brief-history-drug-war), when Richard Nixon pronounced drugs to be “Public Enemy Number One.” In a nation wracked by poverty, racial tension, injustice, civil strife, ecological disaster, corporate domination, a hated Vietnam War and much more, drugs seemed an odd choice.
In fact, the Drug War’s primary target was black and young voters.

It was the second, secret leg of Nixon’s “Southern Strategy” meant to bring the former Confederacy into the Republican Party.

Part One was about the white vote.

America’s original party of race and slavery (https://zinnedproject.org/materials/a-peoples-history-of-the-united-states-updated-and-expanded-edition/) was Andrew Jackson’s Democrats (born 1828).

After the Civil War the Party’s terror wing, the KKK, made sure former slaves and their descendants “stayed in their place.”

A century of lynchings (at least 3,200 of them) (http://www.yale.edu/ynhti/curriculum/units/1979/2/79.02.04.x.html) efficiently suppressed the southern black community.

In the 1930s Franklin Roosevelt’s New Deal social programs began to attract black voters to the Democratic Party. John Kennedy and Lyndon Johnson’s support for civil and voting rights legislation, plus the 24th Amendment ending the poll tax, sealed the deal. Today blacks, who once largely supported the Party of Lincoln, vote 90% or more Democratic (http://blackdemographics.com/culture/black-politics/).

But the Democrats’ lean to civil rights angered southern whites. Though overt racist language was no longer acceptable in the 1970s, Nixon’s Republicans clearly signaled an open door to the former Confederacy (https://www.thenation.com/article/why-todays-gop-crackup-is-the-final-unraveling-of-nixons-southern-strategy/).

But recruiting angry southern whites would not be enough for the Republicans to take the South. In many southern states more than 40% of potential voters were black. If they were allowed to vote, and if their votes were actually counted, all the reconstructed Democratic Party would need to hold the South would be a sliver of moderate white support.

That’s where the Drug War came in.

Reliable exact national arrest numbers from 1970 through 1979 are hard to come by.

But according to Michelle Alexander’s superb, transformative The New Jim Crow, and according to research by Marc Mauer and Ryan King of the Sentencing Project, more than 31,000,000 Americans were arrested for drugs between 1980 and 2007 (http://newjimcrow.com).

Further, federal Uniform Crime Report statistics compiled by www.freepress.org indicate that, between 2008 and 2014, another 9,166,000 people were arrested for drug possession.
Taken together, that means well over 40,000,000 American citizens have been arrested for drugs in the four decades since Nixon’s announcement.
It is a staggering number: more than 10% of the entire United States population, nearly four times the current population of Ohio, and larger than the populations of more than 100 countries worldwide.
A number that has gutted the African-American community.  A national terror campaign far beyond the reach of even the old KKK.
Justice Department statistics indicate that half of those arrests have been for simple possession of marijuana.
According to US Bureau of Justice statistics, between 1980 and 2013, while blacks were 12% of the population, they constituted 30% of those arrested for drug law violations and nearly 40% of those incarcerated in all U.S. prisons. Thus some 20,000,000 African-American men have been sent to prison for non-violent “crimes” in the past forty years.
If the Hispanic population is added in, as much as 60% of drug arrests are of racial or ethnic minorities.
Marking 40 years of the Drug War, the Associated Press in 2010 used public records to calculate that the taxpayer cost of arresting and imprisoning all these human beings has been in excess of $1 trillion.
Sending them all to college would have been far cheaper.  It also would have allowed them to enhance and transform their communities.
Instead, they were taken from their families.  Their children were robbed of their parents.  They were assaulted by the prison culture, stripped of their right to vote and stopped from leading the kind of lives that might have moved the nation in a very different direction.
Nixon also hated hippies and the peace movement. So in addition to disenfranchising 20,000,000 African-Americans, the Drug War has imprisoned additional millions of young white and Hispanic pot smokers.
Thus the DEA has been the ultra-violent vanguard of the corporate culture war.
In 1983 Ronald Reagan took the Drug War to a new level.  Using profits from his illegal arms sales to Iran, he illegally funded the Contra thugs who were fighting Nicaragua’s duly elected Sandinista government.
The Contras were drug dealers who shipped large quantities of cocaine into the US, primarily in the Los Angeles area, where it was mostly converted to crack.
That served a double function for the GOP.
First, it decimated the inner city.
Then Reagan’s “Just Say No” assault, based on the drugs his Contra allies were injecting into our body politic, imposed penalties on crack far more severe than those aimed at the powdered cocaine used in the white community.
In 1970 the US prison population was roughly 300,000 people.  Today it’s more than 2.2 million, the largest in world history by both absolute number and percentage of the general population.  There are more people in prison in the US than in China, which has five times the population (http://www.bjs.gov/index.cfm?ty=tp&tid=11).

According to the Sentencing Project, one in seventeen white men can expect to be imprisoned in his lifetime, compared with one in six Latino men and one in three black men.
By all accounts the Drug War has had little impact on drug consumption in the US, except to make it more profitable for drug dealers (http://www.bjs.gov/index.cfm?ty=tp&tid=11).  It’s spawned a multi-billion-dollar industry in prison construction, policing, prison guards, lawyers, judges and more, all of them invested in prolonging the drug war despite its negative impacts on public health.

For them, the stream of ruined lives of non-violent offenders is just another form of cash flow.
Like the Klan since the Civil War, the Drug War has accomplished its primary political goal of suppressing the black vote and assaulting the African-American community.
It’s shifted control of the South from the Democrats back to the Republican Party. By slashing voter eligibility and suppressing black turnout, the Drug War crusade has helped the GOP take full control of both houses of the US Congress and a majority of state governments across the US.
But the repressive impacts hit everyone, and ultimately enhance the power of the corporate state.
Toward that end, the southern corporate Democrat Bill Clinton’s two terms as a Drug Warrior further broadened the official attack on grassroots America. Clinton was determined to make sure nobody appeared tougher on “crime.” He escalated the decimation of our democracy far beyond mere party politics, deepening the assault on the black community and on the basic rights of all Americans for the benefit of his Wall Street funders. Obama has been only marginally better.
In political terms, the Nixon-Reagan GOP remains the Drug War’s prime beneficiary. Today’s Republicans are poised to continue dominating our electoral process through the use of rigged electronic registration rolls and voting machines. That’s a core reality we all must face.
But no matter which party controls the White House or Congress, by prosecuting a behavior engaged in by tens of millions of Americans, the Drug War lets the corporate state arrest (and seize assets from) virtually anyone it wants at any time. It has empowered a de facto corporate police state beyond public control.

Regardless of race, we all suffer from the fear, repression and random assaults of a drug-fueled repressive police force with no real accountability.
In the end, the Drug War is not now and never has been about drugs.
Legalizing pot is just the beginning of our recovery process.
Until we end the Drug War as a whole, America will never know democracy, peace or justice.
____________________
THE SIXTH JIM CROW: ELECTRONIC ELECTION THEFT & THE 2016 SELECTION will be released by Bob Fitrakis & Harvey Wasserman by January, 2016. Their CITIZEN KASICH will follow soon thereafter. Bob’s FITRAKIS FILES are at www.freepress.org; Harvey’s ORGANIC SPIRAL OF US HISTORY will appear in 2016.

Buddhism and the Brain


Many of Buddhism’s core tenets significantly overlap with findings from modern neurology and neuroscience. So how did Buddhism come close to getting the brain right?

By David Weisman

Source: Seed Magazine

Over the last few decades many Buddhists and quite a few neuroscientists have examined Buddhism and neuroscience, with both groups reporting overlap. I’m sorry to say I have been privately dismissive. One hears this sort of thing all the time, from any religion, and I was sure in this case it would break down upon closer scrutiny. When a scientific discovery seems to support any religious teaching, you can expect members of that religion to become strict empiricists, telling themselves and the world that their belief is grounded in reality. They are always less happy to accept scientific data they feel contradicts their preconceived beliefs. No surprise here; no human likes to be wrong.

But science isn’t supposed to care about preconceived notions. Science, at least good science, tells us about the world as it is, not as some wish it to be. Sometimes what science finds is consistent with a particular religion’s wishes. But usually not.

Despite my doubts, neurology and neuroscience do not appear to profoundly contradict Buddhist thought. Neuroscience tells us the thing we take as our unified mind is an illusion, that our mind is not unified and can barely be said to “exist” at all. Our feeling of unity and control is a post-hoc confabulation and is easily fractured into separate parts. As revealed by scientific inquiry, what we call a mind (or a self, or a soul) is actually something that changes so much and is so uncertain that our pre-scientific language struggles to find meaning.

Buddhists say pretty much the same thing. They believe in an impermanent and illusory self made of shifting parts. They’ve even come up with language to address the problem between perception and belief. Their word for self is anatta, usually translated as ‘non-self.’ One might try to refer to the self, but the word itself cleverly reminds us that there is no such thing.

When considering a Buddhist contemplating his soul, one is immediately struck by a disconnect between religious teaching and perception. While meditating in the temple, the self is an illusion. But when the Buddhist goes shopping he feels like we all do: unified, in control, and unchanged from moment to moment. The way things feel becomes suspect. And that’s pretty close to what neurologists deal with every day, like the case of Mr. Logosh.

Mr. Logosh was 37 years old when he suffered a stroke. It was a month after knee surgery and we never found a real reason other than trivially high cholesterol and smoking. Sometimes medicine is like that: bad things happen, seemingly without sufficient reasons. In the ER I found him aphasic, able to understand perfectly but unable to get a single word out, and with no movement of the right face, arm, and leg. We gave him the only treatment available for stroke, tissue plasminogen activator, but there was no improvement. He went to the ICU unchanged. A follow up CT scan showed that the dead brain tissue had filled up with blood. As the body digested the dead brain tissue, later scans showed a large hole in the left hemisphere.

Although I despaired, I comforted myself by looking at the overlying cortex. Here the damage was minimal and many neurons still survived. Still, I mostly despaired. It is a tragedy for an 80-year-old to spend life’s remainder as an aphasic hemiplegic. The tragedy grows when a young man looks towards decades of mute immobility. But you can never tell with early brain injuries to the young. I was yoked to optimism. After all, I’d treated him.

The next day Mr. Logosh woke up and started talking. Not much at first, just ‘yes’ and ‘no.’ Then ‘water,’ ‘thanks,’ ‘sure,’ and ‘me.’ We eventually sent him to rehab, barely able to speak, still able to understand.

One year later he came back to the office with an odd request. He was applying to become a driver and needed my clearance, which was a formality. He walked with only a slight limp, his right foot a bit unsure of itself. His voice had a slight hitch, as though he were choosing his words carefully.

When we consider our language, it seems unified and indivisible. We hear a word, attach meaning to it, and use other words to reply. It’s effortless. It seems part of the same unified language sphere. How easily we are tricked! Mr. Logosh shows us that unity of language is an illusion. The seeming unity of language is really the work of different parts of the brain, which shift and change over time, and which fracture into receptive and expressive parts.

Consider how easily Buddhism accepts what happened to Mr. Logosh. Anatta is not a unified, unchanging self. It is more like a concert of constantly changing emotions, perceptions, and thoughts. Our minds are fragmented and impermanent. A change occurred in the band, so it follows that one expects a change in the music.

Both Buddhism and neuroscience converge on a similar point of view: The way it feels isn’t how it is. There is no permanent, constant soul in the background. Even our language about ourselves is to be distrusted (requiring the tortured negation of anatta). In the broadest strokes then, neuroscience and Buddhism agree.

How did Buddhism get so much right? I speak here as an outsider, but it seems to me that Buddhism started with a bit of empiricism. Perhaps the founders of Buddhism were pre-scientific, but they did use empirical data. They noted the natural world: the sun sets, the wind blows into a field, one insect eats another. There is constant change, shifting parts, and impermanence. They called this impermanence anicca, and it forms a central dogma of Buddhism.

This seems appropriate as far as the natural world is concerned. Buddhists don’t apply this notion to mathematical truths or moral certainties, but sometimes, cleverly, apply it to their own dogmas. Buddhism has had millennia to work out seeming contradictions, and it is only someone who was not indoctrinated who finds any of it strange. (Or at least any stranger than, say, believing God literally breathed a soul into the first human.)

Early on, Buddhism grasped the nature of worldly change and divided parts, and then applied it to the human mind. The key step was overcoming egocentrism and recognizing the connection between the world and humans. We are part of the natural world; its processes apply themselves equally to rocks, trees, insects, and humans. Perhaps building on its heritage, early Buddhism simply did not allow room for human exceptionalism.

I should note that I refuse to accept that they simply got this much right by accident; that seems improbable. Why would accident bring them to such a counterintuitive belief? Truth from subjective religious rapture is also highly suspect. Firstly, those who enter religious raptures tend to see what they already know. Secondly, if the self is an illusion, then aren’t subjective insights from meditation illusory as well?

I don’t mean to dismiss or gloss over the areas where Buddhism and neuroscience diverge. Some Buddhist dogmas deviate from what we know about the brain. Buddhism posits an immaterial thing that survives the brain’s death and is reincarnated. After a person’s death, the consciousness reincarnates. If you buy into the idea of a constantly changing immaterial soul, this isn’t as tricky and insane as it seems to the non-indoctrinated. During life, consciousness changes as mental states replace one another, so each moment can be considered a reincarnation from the moment before. The waves lap, the sand shifts. If you’re good, they might one day lap upon a nicer beach, a higher plane of existence. If you’re not, well, someone’s waves need to supply the baseline awareness of insects, worms, and other creepy-crawlies.

The problem is that there’s no evidence for an immaterial thing that gets reincarnated after death. In fact, there’s even evidence against it. Reincarnation would require an entity (even the vague, impermanent one called anatta) to exist independently of brain function. But brain function has been so closely tied to every mental function (every bit of consciousness, perception, emotion, everything self and non-self about you) that there appears to be no remainder. Reincarnation is not a trivial part of most forms of Buddhism. For example, the Dalai Lama’s followers chose him because they believe him to be the living reincarnation of a long line of respected teachers.

Why have the dominant Western religious traditions gotten their permanent, independent souls so wrong? Taking note of change was not limited to Buddhism. The same sort of thinking pops up in Western thought as well. The pre-Socratic Heraclitus said, “Nothing endures but change.” But that observation didn’t really go anywhere. It wasn’t adopted by monotheistic religions or held up as a central natural truth. Instead, pure Platonic ideals won out, perhaps because they seemed more divine.

Western thought is hardly monolithic or simple, but monotheistic religions made a simple misstep when they didn’t apply naturalism to themselves and their notions of their souls. Time and again, their prominent scholars and philosophers rendered the human soul exceptional and otherworldly, falsely elevating our species above and beyond nature. We see the effects today. When Judeo-Christian belief conflicts with science, it nearly always concerns science removing humans from a putative pedestal, a central place in creation. Yet science has shown us that we reside on the fringes of our galaxy, which itself doesn’t seem to hold a particularly precious location in the universe. Our species came from common ape-like ancestors, many of which in all likelihood possessed brains capable of experiencing and manifesting some of our most precious “human” sentiments and traits. Our own brains produce the thing we call a mind, which is not a soul. Human exceptionalism increasingly seems a vain fantasy. In its modest rejection of that vanity, Buddhism exhibits less error and less original sin, this one of pride.

How well will any religion apply the lessons of neuroscience to the soul? Mr. Logosh, like every person whose brain lesion changes their mind, challenges the Western religions. An immaterial soul cannot easily account for even a stroke associated with aphasia. Will monotheistic religions change their idea of the soul to accommodate data? Will they even try? It is doubtful. Rigid human exceptionalism is cemented firmly into dogma.

Will Buddhists allow neuroscience to render their idea of reincarnation obsolete? This is akin to asking if the Dalai Lama and his followers will decide he’s only the symbolic reincarnation of past teachers. This is also doubtful, but Buddhism’s first steps at least made it possible. Unrelated to neuroscience and neurology, in 1969 the Dalai Lama said his “office was an institution created to benefit others. It is possible that it will soon have outlived its usefulness.” Impermanence and shifting parts entail constant change, so perhaps it is no surprise that he’s lately said he may choose the next office holder before his death.

Buddhism’s success was to apply the world’s impermanence to humans and their souls. The results have carried this religion from antiquity into modernity, an impressive distance. With no fear of impermanent beliefs or constant change, how far will they go?

Follow the Money: From Paris to ISIS to Paris

By James Corbett

Source: The Corbett Report

Let us for one moment accept the whole Paris attacks narrative hook, line and sinker.

That the French government could not possibly have foreseen an attack.

That a multi-site emergency exercise planned for the same day just happened to be simulating an armed group committing attacks around the city just hours before that very scenario unfolded.

That the “mastermind” of the attacks just happens to be the latest in a string of terrorist boogeymen who manage to escape capture time and time again.

OK, fine. The question that the French people should STILL be asking themselves, even if they believe all of that, is this: How does the Islamic State, a ragtag band of jihadis who are supposedly at war with the combined military might of the US, Turkey, the Saudis, the Russians, the Iraqis, the Iranians and many others (including, of course, the Syrians), manage to fund and coordinate spectacular international terror attacks, including not only the Paris attack, but also (apparently) bombings in Turkey and Lebanon, and the downing of a Russian airliner? How is it that governments can flag and track the “suspicious” financial transactions of anyone withdrawing or transferring over $10,000 from their own bank account, but can’t seem to find a way to restrict cash flows, arms and munitions to a geographically isolated enemy that is dependent on oil sales for its financial survival?

Good question. Just don’t ask the US State Department spokesman those questions, because he doesn’t have the answers. When asked earlier this week by RT’s Gayane Chichakyan “whether the US has sanctioned any banks suspected of carrying out transactions for ISIL,” department spokesman Mark Toner responded with a resounding: “I’d have to look into that. I don’t have the answer in front of me.”

Apparently the question of how ISIS is financing its operations is of so little interest to the State Department that they haven’t bothered to look into it. So in the interest of helping them out with their homework, let’s connect a few dots, shall we?

Earlier this year it was revealed that French President François Hollande had authorized illegal shipments of arms to the Syrian terrorists in 2012. The deliveries, including cannons, machine guns, rocket launchers and anti-tank missiles, were in direct contravention of an EU embargo that was in place at the time.

In late 2012 it was revealed that one of the most prominent backers of the Syrian terrorists was the French government, who in addition to their illegal arms shipments were also delivering money directly to the terrorist opposition leaders.

Last year the French arms export industry enjoyed its best sales in 15 years, with revenues up 18%. The reason for the Merchant of Death bonanza? A spike in sales to Saudi Arabia and Qatar, two of the main funders and supporters of ISIS.

Of course, not all of the blame for the fostering, funding, arming, equipping and training of ISIS belongs to France. Much of it belongs to the United States, its Gulf allies, Turkey and Israel, as well as assorted other NATO members. But there is a line to be drawn from the arms and funds that France supplied to the “moderate” terrorists in Syria to the international operational reach of this seemingly unstoppable terrorist boogeyman group.

France is a nation in mourning. But perhaps the French people can reserve at least some of their outrage for the government which has used their own tax money to fund, supply and support the terrorists they are now at war with.


The Reason You Work So Hard to Participate in the Rat Race


By M.J. Higby

Source: Waking Times

Ralph Waldo Emerson once said, “A man in debt is so far a slave.” Money has no intrinsic value, yet we spend our days damaging our health and spirit in order to obtain it. Why do we sacrifice our well-being for it? Is it the cliché that “we just want to provide a better life for our kids than we had”? Is it just the way of the civilized world? The most important question to ask, however, is what power do we have to change this way of thinking and living? The reality is simple: money is a vehicle for social control. Debt makes us good, obedient workers and citizens.

The traditional workweek started in 1908 at a New England cotton mill, in order to allow Jewish workers to observe the Sabbath. With the passage of the Fair Labor Standards Act in 1938, the 40-hour workweek became the norm. Data from the 2013 American Community Survey show that the average commute time in America is about 26 minutes each way. According to a Gallup poll, the average workweek in America is 34.4 hours; counting only full-time workers, that average shoots up to 47 hours, or 9.4 hours per day over a 5-day workweek. Keeping those averages in mind, then, between commuting, working and figuring in an hour for lunch (usually less), the average full-time worker commits roughly 11 and a quarter hours a day. If you have a family with young kids, just add in another few hours for homework, baths, etc.

When the day is done, how much time do you have for yourself? To exercise, meditate or otherwise unwind the way that all the healthy living gurus preach? And how much of yourself, your presence of mind, is left to devote to family? We give the company the heat of our most intense mental fire while our families get the smoke. Yet Jeb Bush, the 2016 GOP presidential hopeful, says we need to work more.

The answer to why we put ourselves through this daily grind is multifaceted. The most pervasive reason is workplace and societal pressures. We are raised in a matrix of sorts. The cycle starts around the age of five when we are expected to adhere to a regimented 8-hour day of school. At this age, we don’t have the intellect to question why, so we mechanistically follow the path that’s laid out. This daily path becomes engraved in our minds and becomes as automatic as the sun’s daily journey. Our school system is adept at churning out working class individuals en masse.  We are taught along the way not to question authority, again adhering to the working class mentality.

On the opposite end of the spectrum are those in power. They are the ones who like to color outside the lines. Books with titles such as The Wisdom of Psychopaths abound, illustrating how people with psychopathic traits, the ones who don’t tend to follow rules, are often found in leadership roles, from CEOs all the way up to presidents of countries. With these rare manipulative, coldhearted personalities in place and the rest of us following like good sheeple without questioning, the stage is set for compliance.

If you have been in the working world long enough, then the following statement should ring true: if you work extra hours, you are a great worker; if you decline, you’re useless and apathetic. In the work world, there’s typically no in-between. The pressure to succeed for the pride and benefit of the company unfortunately supersedes the pressure to be a good parent, sibling, son or daughter. According to a study by the Economic Policy Institute, between 1948 and 2013 productivity grew 240% while income for non-managerial workers grew by 108%. To make up for this discordance, pride in doing what’s best for the company has been employed as a motivational tactic. The tactic works like a sharp IV needle inserted into our veins, and we have willingly absorbed whatever is injected through it. Pressure to conform to the company’s goals has overcome our will to be compensated accordingly.

The other side of this pressure comes from society as a whole, outside the education system and the workplace. A close friend of mine works for a state court and makes about $40K a year. He is also a self-employed business owner in his off hours. I estimate that he works about 70-80 hours a week. He owns a home in a well-to-do neighborhood and drives a seventy-thousand-dollar luxury car. This crystallizes the saying ‘big hat, no cattle.’ But when a lie is told over and over, the lie becomes the truth.

When we look at someone who drives a luxury car and lives in an upscale part of town, we see success, because that image has been pounded into our minds so relentlessly. We fail to see that these are nothing but symbols of success, and false ones at that. They appear real because, as a society, we have been conditioned by the advertising industry to see them this way. In the book The Millionaire Next Door, the authors annihilate this illusion. Numbers don’t lie, and the statistics show that most true millionaires, those with a net worth of over one million dollars, do not own the luxuries we typically associate with success and wealth. They view them as what they are: depreciating liabilities. According to the book, the typical millionaire owns a home in the two-to-three-hundred-thousand-dollar range and a non-luxury automobile. If something goes wrong with either, they have the cash reserves to fix it. The commonplace owner of the luxury home and car, on the other hand, can’t afford the roof or the tires, respectively, without going deeper into debt should they need replacing.

Ownership of these symbols of wealth becomes a self-perpetuating illusion to satisfy the psychological need for acceptance. Unfortunately, human behavior dictates that emotional needs often override logical thinking. It’s been said that the borrower is slave to the debt-owner and with luxury items, debt is the rule, not the exception. Debt is healthy for those in power and contributes to a needy and thus obligated worker.

The current wisdom of slave, spend and save for retirement has only one destiny. That destiny can be summed up in three sentences. Spend your healthiest and most productive years working to support a life of material things, and thus illusions of success, while elevated stress damages your health. During this time, be sure to save enough money for retirement so you can enjoy those years in the poor health that follows. And lastly, do it in the name of pride in your company and country.

I take pride in being American, as I’m sure most Americans do; however, if you’re reading this you’re likely smart enough to see the holes in the daily grind. It saps our creative potential and our physical as well as our spiritual energy. We don’t need any studies to tell us how stressed we are and, subsequently, how unhealthy we are. The physical manifestations of stress, such as obesity, hypertension, heart disease, increased risk of cancer, depression, anxiety and many others, tell us all we need to know. They tell us that we need a better work/life balance. They tell us that the pendulum has swung too far in the direction of work and away from life. Fortunately, there’s a way that we can take it back.

The most important way to restore this balance is to realize the power that we, as consumers, hold. Tyler Durden, the protagonist of the film Fight Club, said it best:

“…advertising has us chasing cars and clothes, working jobs we hate so we can buy shit we don’t need.”

The marketing and advertising industry knows, more than anyone else, what motivates the human mind and how to tap into those instinctual drives. To defend against this industry’s seductiveness, we need to journey within ourselves and bring to light what’s really important to us. What most of us will find is that experiences and time well spent, not material things, are what make us happy. In the book aptly titled Well Being, the authors Tom Rath and Jim Harter discuss how experiences have been proven to make us happier than material possessions.

We revel in the anticipation of the experience, we enjoy the experience itself, and we look back on it fondly for as long as we live. Meanwhile, the expensive car or house that we borrowed money long ago to obtain falls apart, forcing us to borrow more. If we live by the rule that everything we purchase, with the exception of a home, is paid for in cash, then we avoid becoming slaves to debt and, by extension, to work. We no longer relinquish our power to creditors.

Oscar Wilde was famously quoted as saying that anyone who lives within their means suffers from a lack of imagination. Materialistically speaking, living by this notion will bind us with shackles to a life of debt servitude. When we rip those shackles of debt from our wrists, our minds become clear and we see what truly makes us happy. We spend more time with friends and family. We focus on our passions and hobbies. In essence, we get back to the foundation of what it means to be human. After all, none of us will ever arrive upon the mountain of our last moments of existence wishing we had spent more time at the office. We will instead arrive wishing we had completed that book, that painting or that experience with those we love most. For those can be purchased not with debt, but with time. And there is no more cunning, covert and deceitful thief of time than that villain we call debt.


About the Author

M.J. Higby practices medicine in Phoenix, AZ. He is passionate about martial arts, most notably Brazilian Jiu Jitsu. He enjoys writing about mental, spiritual and physical well-being and questioning the methods by which we attain it. You can reach him on Facebook and Twitter @MJHigby.

Politics as therapy: they want us to be just sick enough not to fight back


By Michael Richmond

Source: Transformation

10 October is World Mental Health Day. I used to be outgoing, but a descent into crushing depression left me housebound. After Occupy, I started asking: how does social environment shape our psychology?

I used to buy the Sun newspaper. Not just to fit in with mates at secondary school but right into my first year at university. I knew there was something to be ashamed of in this filthy habit, armed as I was with my oft-deployed excuse: “I only buy it for the crossword and the football transfers.”

This was true. I never read the news. In general, I lived a remarkably apolitical existence. This was some feat considering I have a Jewish communist great grandfather, socialist grandparents, a union lawyer dad and an older brother who went through his Che Guevara phase at around fifteen.

I dropped out of university in early 2007, five months before Northern Rock bank hit the skids. Who knows whether the student experience would have politicised me? Perhaps the process would have been helped along by the backdrop of the approaching financial crisis?

But something else politicised me instead: a crushing, rapid descent into depression, social wilderness and personal crisis.

I experienced anxiety and depression as a hostile takeover of my life and sense of self. I went from being outgoing and sociable to being unable to talk to people or leave the house. This was within the space of a few days. There was no discernible cause.

It was quickly clear that I couldn’t continue at university and so I moved back into my parents’ house, where I have lived ever since.

Several years of isolation, suicidal thoughts and internal struggle followed. I remained unable to escape the confines of my bullying psyche, let alone my house.

Unable to work or study, have friendships, or experience joy, I turned to reading. It became my true love, my source of meaning, my attempt to make sense of what had happened to me. I obsessively read classic literature, history, philosophy, political economy – I had felt a profound sense of loss at not being able to finish university. I became determined that I would instead educate myself.

But an impenetrable sense of terror and despair continued to accompany me through my every waking and sleeping hour. I began to work my way through an impressive list of psychotropic medications and psychotherapies and eventually attended an NHS psychiatric day hospital for six months.

A “service user” within the psychiatric system gains a unique insight and practical education in state discipline, as well as in the lengths the state goes to in enforcing normativity. Having grown up white, straight, male and middle class, I was privileged to rarely, if ever, be told that I had to be something other than what I was.

I seldom encountered gross injustice or violence, blatant discrimination or the kind of treatment faced from the earliest ages if you happen to be a person of colour, don’t fit a gender binary or adhere to accepted ideals of sexual behaviour.

Apart from being a non-religious Jew and encountering minimal levels of playground anti-Semitism, this was the first time I found myself in a situation of social and political ostracism (as well as a self-ostracism that proved just as powerful). I discovered for myself that the experience of the personal deeply informs the political.

Leaving the psychiatric day hospital to instead attend the asylum of Occupy the London Stock Exchange at St Paul’s Cathedral was in many ways a descent into further madness. Many “occupiers” were well acquainted with psychiatric services and medications – as well as using drugs not sanctioned by the state, but often taken for similar reasons.

Chaotic, naïve, and ultimately politically problematic and ineffectual, the initial occupied space did nevertheless open up the possibility for social and political interaction that is elsewhere absent from society.

I felt that I was in crisis, but also that the crisis was much bigger than just me. Getting involved in political praxis seemed to be the best way to channel what I was experiencing.

There is a lot to be said for the practice of “politics as therapy.”

The personal account or “journey” format often proves insufficient when attempting to understand what we do and why we do it. An analysis of political subjectivity is crucial. Shifts in capitalist expansion, social environment and class composition, technological development and the onset of crises tend to precipitate political transformation on an individual and collective basis.

The advent of the printing press or the collapse of the automotive industry in mid-west America, for example, are not external factors to people’s lives or isolated moments in history. Indeed, any such upheaval is bound to lead to transformative changes in the lives and political ideation of those experiencing it.

Our social environment shapes our psychology. We must consider how the policy, ideology and debate that surrounds “mental health” or madness is framed.

The individualisation of suffering is key to the prevailing ideology and discourse surrounding mental illness. This will often focus on a supposed misfiring of brain chemicals, a “cure” to which can be found in the form of pharmaceuticals – often prescribed by your GP before any contact with mental health services.

Attention may also turn to an individual’s lack of positive attitude, but this problem can be “fixed” by a six-week course of cognitive behavioural therapy. So much human suffering is pathologised and medicated when it is either “natural” (i.e. grief, or the general variety of mental experience) or is directly or indirectly linked to social, political and economic factors that remain absent from debate, let alone actively contested on this terrain.

Psychologist and author Bruce E Levine suggests that much of today’s intervention under the auspices of “mental health” is all too political.

“What better way to maintain the status quo,” Levine asks, “than to view inattention, anger, anxiety, and depression as biochemical problems of those who are mentally ill rather than normal reactions to an increasingly authoritarian society?”

He also argues that many potential activists and “natural anti-authoritarians” are prevented from opposing power: “Some activists lament how few anti-authoritarians there appear to be in the US. One reason could be that many natural anti-authoritarians are now psychopathologised and medicated before they achieve political consciousness of society’s most oppressive authorities.”

The historical origins of madness within western culture and how it became increasingly medicalised should not be forgotten. Michel Foucault exposed how the origins of “confinement” of the “insane” in asylums and workhouses were an integral part of the violent replacement of the feudal commons way of life with capitalist work discipline during the 16th and 17th centuries.

This process is in keeping with continual “primitive accumulation” akin to and contemporary with the conquest of the “New World” and the persecution of heretics and witches. Their land and means of reproduction were stolen and appropriated, while authorities continually oppressed and attempted to proletarianise them.

Initially, the “Great Confinement” saw the imprisonment of the old, the unemployed, the “criminal”, the “insane.”

As Foucault explains: “Before having the medical meaning we give it, or that at least we like to suppose it has, confinement was required by something quite different from any concern with curing the sick. What made it necessary was an imperative of labour. Our philanthropy prefers to recognise the signs of a benevolence toward sickness where there is only a condemnation of idleness.”

The conflation of pejoratives like lazy, sick, unemployed and idle is more than familiar to us in today’s discourse surrounding welfare benefits and the imperatives of labour. And it is not just the DWP and Atos who pressure people back into work; NHS psychiatric services also seem to believe that it is work that sets you free.

The capitalist class would like us to be just sick enough not to fight back, but not so sick that we cannot work. The challenge for us is to find ways of organising and helping each other so that we can find adequate levels of social reproduction, care and support to give us a platform to engage in the therapy of class struggle.


Everything You Think You Know About Addiction is Wrong: Smashing the Drug War Paradigm


By Matt Agorist

Source: The Free Thought Project

“Overwhelming evidence points to not just the failure of the drug control regime to attain its stated goals but also the horrific unintended consequences of punitive and prohibitionist laws and policies,” states a study, published by the Global Commission on Drug Policy (GCDP).

“A new and improved global drug control regime is needed that better protects the health and safety of individuals and communities around the world,” the report says. “Harsh measures grounded in repressive ideologies must be replaced by more humane and effective policies shaped by scientific evidence, public health principles and human rights standards.”

This sudden onset of logic by political bodies across the globe is likely due to the realization of the cruel and inhumane way governments deal with addiction. Arbitrary substances are deemed “illegal,” and anyone caught in possession of them is then kidnapped and locked in a cage, or even killed.

The fact that this barbaric and downright immoral practice has been going on for so long speaks to the sheer ignorance of the state and its dependence upon violence to solve society’s ills.

The good news is that the drug war’s days are numbered, especially seeing that it’s reached the White House, and they are taking action, even if it is symbolic. Evidence of this is everywhere. States are defying the federal government and refusing to lock people in cages for marijuana. Colorado and Washington served as a catalyst in a seemingly exponential awakening to the government’s immoral war.

Following suit were Oregon, D.C., and Alaska. Medical marijuana initiatives are becoming a constant part of legislative debates nationwide. We’ve even seen bills that would not only completely legalize marijuana but deregulate it entirely, like corn.

As more and more states refuse to kidnap and cage marijuana users, the drug war will continue to implode.

Knowledge plays a key role in this battle against addiction tyranny, and investigative journalist Johann Hari has some vital information to share. In a recent TEDx talk, Hari smashes the paradigm of the war on drugs.

What really causes addiction — to everything from cocaine to smart-phones? And how can we overcome it? Johann Hari has seen our current methods fail firsthand, as he has watched loved ones struggle to manage their addictions. He started to wonder why we treat addicts the way we do — and if there might be a better way. As he shares in this deeply personal talk, his questions took him around the world, and unearthed some surprising and hopeful ways of thinking about an age-old problem.

Now Streaming: The Plague Years


By A. S. Hamrah

Source: The Baffler

When things are very American, they are as American as apple pie. Except violence. H. Rap Brown said violence “is as American as cherry pie,” not apple pie. Brown’s maxim makes us see violence as red and gelatinous, spooned from a can.

But for Brown, in 1967, American violence was white. Explicitly casting himself as an outsider, Brown said in his cherry pie speech that “violence is a part of America’s culture” and that Americans taught violence to black people. He explained that violence is a necessary form of self-protection in a society where white people set fire to Bowery bums for fun, and where they shoot strangers from the towers of college campuses for no reason—this was less than a year after Charles Whitman had killed eleven people that way at the University of Texas in Austin, the first mass shooting of its kind in U.S. history. Brown compared these deadly acts of violence to the war in Vietnam; president Lyndon B. Johnson, too, was burning people alive. He said the president’s wife was more his enemy than the people of Vietnam were, and that he’d rather kill her than them.

Brown, who was then a leader of the Student Nonviolent Coordinating Committee and who would soon become the Black Panther Party’s minister of justice, delivered a version of this speech, or rant, to about four hundred people in Cambridge, Maryland. When it was over, the police went looking for him and arrested him for inciting a riot. Brown’s story afterward is eventful and complicated, but this is an essay about zombie movies. Suffice it to say, Brown knows about violence. Fifty years after that speech, having changed his name to Jamil Abdullah al-Amin, he’s spending life in prison for killing a cop.

The same day Brown was giving his speech in Maryland, George A. Romero, a director of industrial films, was north of Pittsburgh in a small Pennsylvania town called Evans City. Romero was shooting his first feature film, a low-budget horror movie in black and white called Night of the Living Dead. Released in October 1968, the first modern zombie movie tells the story of a black man trying to defend himself and others from a sudden plague of lumbering corpses who feed on the living. At the film’s end, he is unceremoniously shot and killed by cops who assume he is a zombie trying to kill them. The cops quickly dispose of his body, dumping it in a fire with a heap of the undead, as a posse moves on to hunt more zombies.

Regional gore films were nothing new in themselves; a number had appeared earlier in the 1960s. Night of the Living Dead, with its shambling, open-mouthed gut-munchers dressed in business suits and housecoats, might have seemed merely gross or oddly funny in a context other than the America of 1968. But Martin Luther King Jr. had been assassinated six months before its release. The news on TV, which most people still saw in black and white, consisted largely of urban riots and war reports from Vietnam. The My Lai Massacre had occurred the month before King was shot.

Romero’s film, seen in the United States the year it came out, had more in common with Rome, Open City than it did with a drive-in horror movie made for teens—it was close to a work of neorealism. And it was unfunny and dire, much like John Cassavetes’s Faces, released the same year, whose laughing drunks stopped laughing when they paused to look in the mirror. Romero was a revisionist director of horror in the same way that Peckinpah and Altman were in their career-making genres, the western and the war movie.

Romero cast an African American in the lead, and he shifted the horror genre’s dynamic, aligning it with black-and-white antiwar documentaries like Emile de Antonio’s In the Year of the Pig, also released in 1968, and distinguishing it from the lurid color horror films Roger Corman and Hammer Films had been turning out up till then. Those films made certain concessions to the film industry; Night of the Living Dead did not. This was an American horror movie, so it needed no English accents or familiar character actors. It was grim and unflinching, showing average citizens, played by average people, eating the arms and intestines of their fellow townsfolk. Romero drove home this central point—that a zombie-infested America differed from the status quo only in degree, not in kind—by ending his film with realistic-looking fake news photos depicting his characters’ banal atrocities.

Mainstream film reviewers, including Roger Ebert, were shocked and disgusted by Night of the Living Dead. They discouraged people from seeing it, but Romero’s images proved to be indelible. The film’s reputation grew. In 1978 Romero made the film’s first sequel, Dawn of the Dead, this time in color. Today, if there’s one thing every American knows, it’s that zombies can only be killed with a shot to the head. This is common knowledge, cultural literacy, a kind of historical fact, like George Washington chopping down the cherry tree. American-flag bumper stickers assert that “these colors don’t run,” but one of them does. It runs like crazy through American life, through American movies, and now TV, like a faucet left on.

Dead Reckonings

The Huffington Post has had a Zombie Apocalypse header since 2011, under which the editors file newsy blog posts chronicling our continuing fascination with zombie pop culture, alongside any nonfiction news story horrible enough to relate to zombies or cannibalism. The infamous Miami face-eater attack of May 2012, which the media gleefully heralded as the start of a “real” zombie apocalypse, contributed to America’s sense that it could happen here, provided we wished for it hard enough. Reading through the Zombie Apocalypse posts, one gets a growing sense that we want the big, self-devouring reckoning to happen because it is the one disaster we are truly mentally prepared for. It won’t be the total letdown of the Ebola scare.

The face-eating incident was initially linked to bath salts: ground-up mineral crystals everyone hoped would become the new homemade drug of choice for America’s scariest users. It turned out the perpetrator, although naked, was only high on marijuana. He was black, killed by the police as he gouged out his homeless victim’s eyes and chewed his face on a causeway over Biscayne Bay. The incident was captured on surveillance video. Here in the golden age of user-generated content, the zombie movies self-generate—much like zombies themselves. The bridge backdrop of this all-too-real zombie vignette neatly summed up both the crumbling condition of America’s infrastructure and our more generalized state of neoliberal collapse.

The zombie apocalypse, our favorite apocalypse, seems to unite the right and left. It combines the apocalypse brought about by climate change and the subsequent competition for scant resources with the one loosed by secret government experiments gone awry. Better still, both of these scenarios, as we’re typically shown in graphic detail, will necessitate increased gun-toting and firearms expertise.

More than that, the fast-approaching zombie parousia allows us to indulge our fantasies of a third apocalypse, one that only the most clueless don’t embrace: the consumerist Day of Judgment, in which we will all be punished for being fat and lazy and living by remote control, going through our daily routines questioning nothing as the world falls apart and we continue shopping. Supermarkets and shopping carts, malls and food warehouses all figure prominently in the iconography of the post–Night of the Living Dead zombie movie, reminding us that even in our quotidian consumerist daze, we are one step away from looting and cannibalism, the last two items on everyone’s bucket list.

Still, despite its galvanizing power to place all of humanity on the same side of the cosmic battlefront, the zombie apocalypse, like all ideological constructs, nonetheless manages to cleave the world into two camps. One camp gets it and the other doesn’t. One is aware the apocalypse is under way, and the other is blithely oblivious to the world around it.

To confuse matters further, people move in and out of both camps, becoming inert, zombified creatures when obliviousness suits their mood. People blocking our progress on the street as they natter into their hands-free earsets stare straight ahead, refusing to admit that other people exist. At least they don’t bite us as we flatten ourselves against walls to pass them without contact. A paradox of the ubiquity of zombie-themed pop culture is how there are surely next to no people left who have not enjoyed a zombie movie, TV show, book, or videogame, yet there are more and more people shuffling around like extras in a zombie film, moving their mouths and making gnawing sounds.

The smartphone-based zombification of street life is a strange testament to Romero’s original insight, which becomes more pronounced as the wealth gap widens. The disenfranchised look ever more zombie-fied to the rich, who in turn all look the same and act the same as they take over whole neighborhoods and wall themselves up in condo towers. This, indeed, is exactly what happens in Romero’s fourth zombie movie, 2005’s Land of the Dead, which predicted things as consequential as what happened during Hurricane Katrina in New Orleans and as minor as the rise of food trucks.

The Zombie Apocalypse is also a parable of the Protestant work ethic, come to reap vengeance at the end of days. It assures us that only very resourceful, tough-minded people will be able to hack it when the dead come back to life. If the rest had really wanted to survive—if they deserved to survive—they would have spent a little less time on the sofa. But here, too, the simple and obvious moral takes a perverse turn: the best anti-zombie combatants should be the ones who’ve watched the most zombie movies, yet by the very logic of our consumer-baiting zombie fables, they won’t be physically capable of survival because all they did was watch TV.

Selective Service

What these couch potatoes will need, inarguably, is the protection of a strong leader, one who hasn’t spent his life in the vain and sodden leisure pursuits that they’ve inertly embraced—Rick Grimes in The Walking Dead, for instance. Why such a person would want to help them is a question they don’t ask. With this search for an ultimate hero, the zombie genre has veered into the escapism of savior lust, leaving Romero’s unflinching, subversive neorealism behind. In Night of the Living Dead, a witless humanity is condemned by its own herd mentality and racism. In latter-day zombie fictions, a quasi-fascist social order is required, uniting us regardless of race, creed, or color.

The predicament of the characters (and the actors) in all the nouveau zombie movies relates to this passive consumerism. Both the characters and the actors in new zombie movies have to act like zombie films don’t already exist, even though the existence of Romero’s films is what permits the existence of the film they are in. Somehow, the characters pull their savvy out of thin air. They must pretend that they have never heard of zombies, even as they immediately and naturally know what to do once their own particular Zombie Apocalypse gets under way.

This paradox underscores the fantasy aspect of the twenty-first-century zombie infatuation, in which a fixed set of roles is available for cosplay in a repeatable drama that already took place somewhere else. The difference between Romero’s films and the new zombie movies is that the more time that passes since 1968, the more Romero’s films don’t seem like they were designed as entertainment—even as they are endlessly exploited by the zombie-themed cultural productions that copy them, and even as they remain entertaining. The new zombie films cannibalize Romero’s films in an attempt to remake them ideologically, so that we will stop looking for meaning in them and just accept the inevitable.

The Primal Hordes

A primal fantasy of the Zombie Apocalypse is that when the shit hits the fan, we will be able to kill our own children or parents. We won’t have a choice. The decision to get rid of the generation impeding us will have been made for us by the zombie plague, absolving us of responsibility. We are, after all, killing somebody who is already dead and who, in his or her current state, is a threat to our continued existence.

Against the generalized dystopian entertainment landscape that followed the economic collapse of 2008, the Zombie Apocalypse made more sense than ever. But YA action-drama dropped it in favor of promoting teen heroes who were stronger than their nice-but-loserish sad sack parents. This is the uplifting generational affirmation that imbues Suzanne Collins’s Hunger Games franchise and Veronica Roth’s Divergent trilogy.

YA comedy, on the other hand, did not ignore zombie movies. Instead, it domesticated the Zombie Apocalypse, making it friendly. Nonthreatening zom-coms showed young viewers how the opposite sex was really not that scary, that being in a couple was still the most important thing, and that dystopias gave nerds an unprecedented chance to prove they could get the girl or boy. Dystopia, it turns out, is really a best-of-all-possible-worlds scenario for starry-eyed-kids-with-a-disease, or so we learn from zom-coms like Warm Bodies and Life After Beth.

The latest iteration of this trend, which sets a zombie heroine in a marginally less dystopian world that mirrors our tentative economic comeback, is the CW TV show iZombie. The series is a brain-eating entertainment for tweens in which they learn you can be okay and have a chill job even if you’re a living corpse who’s just trying to figure things out. When a zombie gets her own tween-empowerment show on The CW, it’s a good indication that zombies don’t carry the stern, unfamiliar stigmas they used to. Zombies, much like corpses in TV commercials, are used as grotesque comic relief in things like animated Adult Swim shows. Such is the diminished status of the zombie; it is now a signifier that can be plugged in anywhere. To paraphrase the undead philosopher of capitalism’s own walking-dead demise: first time cannibalism, second time farce.

Reality Bites

The way zombie movies progress, with isolated groups splitting into factions and various elimination rounds as contestants disappear, suggests that Night of the Living Dead is also a secret source of reality TV. It makes sense, then, that 2009’s Zombieland, one of the first YA dystopian zombie entertainments, was penned by screenwriters who created The Joe Schmo Show and I’m a Celebrity . . . Get Me Out of Here!

Zombieland’s protagonist, a college-age dude played by Jesse Eisenberg, is a bundle of phobias, an OCD-style follower of rules who finds himself in a Zombie Apocalypse after an unexpected date with a hot girl out of his league (Amber Heard) goes wrong. Mentored by Woody Harrelson, who more or less reprised this same role in the Hunger Games movies, Eisenberg’s millennial character undergoes a reality-TV-scripted makeover. In expiation for his pusillanimity in the opening reel, he winds up rescuing a tough girl (Emma Stone) who also would have been out of his league in the pre-Apocalypse scheme of dating. Zombieland presents Eisenberg as gutless and Stone as ruthless, but she’s the one who ends up a hostage, and he becomes her hero. In fact, one of his rules, “Don’t be a hero,” changes on screen to “Be a hero,” as we once again learn that millennials really do have what it takes to kill zombies. Earlier in the film, Eisenberg accidentally shoots and kills a non-zombie Bill Murray, playing himself, showing that millennials can also, regretfully, take out Baby Boomers, including the cool ones who aren’t undead.

Edgar Wright’s 2004 Shaun of the Dead, the first movie zom-com, was a more intelligent version of this same storyline. An English comedy from the “Isn’t it cute how much we suck?” school, Wright’s film acquiesced to the coupling-up plot rom-coms require, but not without first presenting the routine, pointless daily life of its protagonist (Simon Pegg) as pre-zombified. Shaun of the Dead will likely remain the only sweet little comedy in which the protagonist kills his mother, a scene the film has the guts to play without flinching. The joke of Wright’s film is that it takes something as brutal as a zombie apocalypse to wake us from our stupor and to show us how good we had it all along. By the film’s end, Pegg and his girlfriend (Kate Ashfield) are in exactly the same place they were when the film started, but now at least they live together. A cover of the Buzzcocks’ song “Everybody’s Happy Nowadays” jangles over the credits, providing a zombified dose of circa-1979 irony.

Wright and Pegg’s goofy rethinking of the zombie movie proved how firmly zombies are entrenched in our consciousness, and how easy they are to manipulate for comedic effect. The same month Shaun of the Dead came out, a Hollywood remake of Romero’s Dawn of the Dead was released. It, too, cleaned up at the box office. This new Dawn of the Dead seemed like it was made by one of the nerds in the American zom-coms, a jerk desperate to prove he’s bad-ass. (The director now makes superhero movies.) Johnny Cash’s “The Man Comes Around” accompanies the opening credits, setting a high bar for artistic achievement the ensuing film does not come near to clearing. Jim Carroll’s “People Who Died” plays at the end—its placement there as repulsive as anything else in the film.

As all nouveau zombie films must, the remake starts in the suburbs, where a couple is watching American Idol in bed, underscoring the genre’s newfound connection to reality TV. The film’s CGI effects, which at the time injected a souped-up faux energy into the onscreen mayhem, dated instantly. They’re now the kind of off-the-rack effects featured in Weird Al videos when someone gets hit by a car.

The main point of this new Dawn of the Dead is that after the Zombie Apocalypse, people will spend their time barking orders at each other and calling each other “asshole.” The film nods in the direction of loving the military and the police, and totally sanitizes Romero’s use of a shopping mall as a site of consumerist critique. Like many films of the 2000s, it postulates that living in a mall wouldn’t be a Hobbesian dystopia at all; it would be rad. If the remake had been made five years later, maybe it would have had to grapple with the “dead malls” that began to adorn the American landscape with greater frequency after the economy collapsed. Instead, the mall serving as the film’s principal backdrop is spotless and fun. The remake’s island-set, sequel-ready false happy ending makes one long for the denouement of Michael Haneke’s Funny Games—a longing more unimaginable than any real-life wish-fulfillment fantasy about the Zombie Apocalypse actually coming to pass.

The American Way of Death

Fanboys liked the Dawn of the Dead remake and, inexplicably, so did many critics. Manohla Dargis, then at the Los Angeles Times, wrote that the film was “the best proof in ages that cannibalizing old material sometimes works fiendishly well,” a punny sentiment she might well walk back today.

The next year, when George A. Romero released his first new zombie film in twenty years, it did not fare as well in the suddenly crowded marketplace of the undead. While Land of the Dead (2005) is fittingly seen as something of a masterpiece now, on its initial release it puzzled genre fans, who had gotten used to the sort of “fast zombies” that were first featured in the nihilistic-with-a-happy-ending British movie 28 Days Later (2002). Romero’s new film was as trenchant as his others, but many fans weren’t having it.

IMDb user reviews provide a record of their immediate reactions. “This movie was terrible!” one wrote the month Land of the Dead premiered. “The storyline—can’t use the word plot as that would give it too much credit—was tedious! Some say it was a great perspective on class? Are you kidding me!!!” Less than a year into George W. Bush’s second term, Romero was archly depicting a society much different from the one he’d shown in Night of the Living Dead. This new society—today’s—was more class-riven, more opportunistic, more cynical. And Romero, even while moving in the direction of Hawksian classicism, was exposing these failings with radical acuity. His dark fable of two Americas at war over the control of the resources necessary to survive was concise, imaginative, and well constructed. Few at the time wanted to consider the film’s style, which seemed out of date compared to the Dawn of the Dead remake. Fewer still wanted to grapple with its implications.

Ten years later, it is clear that no American genre film from that period digests and exposes the Bush era more skillfully than Land of the Dead. Romero’s film was uncomfortably ahead of its time, and like his other zombie work, it hasn’t dated; it speaks of 2015 as much as 2005. Tightly controlled scenes avoid the pointlessness and repetition of the nouveau zombie films, limning class struggle in unexpected ways. Zombies, slowly coming to consciousness, use the tools of the trades from which they’ve been recently dispossessed to shatter the glass of fortified condos. A zombie pumps gas through the windshield of a limo. The rich commit suicide, only to come back to life as zombies and feed on their children. America, as the original-zombie-era Funkadelic LP taught us, eats its young.

As zombie fantasies go, these scenes are much richer than the random, unsatisfying mayhem of the nouveau zombie films. Romero, unlike his counterparts, does not shy away from race. He shows African Americans pushing back against the injustices and indignities of a militarized police state, thereby completing a circle that began with Duane Jones’s performance in Night of the Living Dead.

Walking Tall

For the latest generation of zombie enthusiasts, the zombie genre means just one thing: AMC’s massively popular cable series The Walking Dead. The show is so much better than any of the recent non-Romero zombie movies that it’s among the leading exhibits in the case against the cineplex. The show’s politics and implications are widely discussed, and The Walking Dead has engendered national debate about all sorts of ethical issues, including something Romero’s films raised only in the negative: America’s future. But the first problem The Walking Dead solved was how to make its own debates about these things interesting: whenever scenes get too talky, a “walker” sidles up and has to be dispatched in the time-honored fashion. At its core, the zombie drama is like playing “You’re it!” The show could be called Game of Tag.

The Walking Dead debuted in 2010, emerging from a period in U.S. history when, all of a sudden, we found ourselves in a junked, collapsed, post-American environment. New dystopian dramas, especially the YA ones, reflected this chastened reality. The Walking Dead looked at first like it might become just another placeholding entry in this cavalcade of glumness, much like TNT’s Spielberg-produced, families vs. aliens sci-fi show Falling Skies. Zombies were maybe the most dated way possible to dramatize our newly trashed world.

It was The Walking Dead’s dated qualities, however, that saved it from becoming cable TV’s Hunger Games. The show’s grunge aesthetic and majority-adult cast situated it elsewhere. And if that particular elsewhere felt like the past as much as the future, that was part of what made the show work for premium cable’s Gen X audience. Greg Nicotero, a makeup man who worked under Romero, is one of the show’s producers. His presence indicated the people behind the show took the genre seriously, unlike anyone else in Hollywood who had touched it.

Television works by imitating success, by zombifying proven formulas through a process called mimetic isomorphism. When television producers saw The Walking Dead’s ratings beating broadcast-network ratings—a first for cable drama—they took notice and began spawning. Copies of copies like Resurrection, The Last Ship, The Leftovers, and 12 Monkeys showed that plague is contagious, but it doesn’t have to be zombie plague. Meanwhile, The Walking Dead continues its success, and AMC will debut a companion series this summer, unimaginatively called Fear the Walking Dead.

If the worst zombie movies unselfconsciously imitate higher-gloss broadcast-network reality trash like Survivor, The Walking Dead succeeds by staying closer to the lowest grade of cable-network reality TV. The world of The Walking Dead is closer to Hoarders than it is to Big Brother. Hoarders presents an America engulfed in mounds of trash that its psychologically damaged possessors can’t part with. Mounds of Big Gulp cups and greeting cards and heaps of car parts and instruction manuals overwhelm their homes, spilling into their yards. Shows like Storage Wars, Pawn Stars, and American Pickers present an America of valueless junk that maybe somebody can make a buck on—if only by televising it for our own lurid delectation. These shows are the opposite of pre-collapse valuation shows like Antiques Roadshow, in which the junk people had lying around proved to be worth more than they had imagined. The detritus of Hoarders is worthless, the kind of trash that will blow around everywhere after the Zombie Apocalypse.

Hoarders vs. Horde

In his recent book 24/7, an analysis of the end of sleep and our twenty-four-hour consumption-and-work cycle, Jonathan Crary writes that “part of the modernized world we inhabit is the ubiquitous visibility of useless violence and the human suffering it causes. . . . The act of witnessing and its monotony can become a mere enduring of the night, of the disaster.” Zombies, not quite awake but never asleep, are the living-dead reminders of this condition, stumbling through our fictions. When they are not transformed by the wishful thinking of ideology into our pals, they retain this status.

Celebrated everywhere, zombies are the opposite of celebrities, who swoop into our disaster areas like gods from Olympus to rescue us from the calamities that also allow them to flourish. Zombies, far from being elevated, descend into utter undistinguishable anonymity and degradation, which is why they can be destroyed in good conscience. Brad Pitt, one of the producers of ABC’s Resurrection, also starred in World War Z, the most expensive zombie movie ever made. The last line of that odious movie—the first neoliberal zombie movie—is “Our war has just begun.”

Whatever that was supposed to mean to the audience, these fables of the plague years drive home just who the zombies are supposed to be—and who, when the plague hits, will helicopter out holding the machine guns. Col. Kurtz’s faithful devotee from Apocalypse Now, Dennis Hopper, the counterculture hero who became a Republican golf nut, plays the leader of the remaining 1 percent in Land of the Dead. “We don’t negotiate with terrorists,” he says when he’s faced with the choice between his money and our lives.

Capture, Smear, Contaminate: The Politics Of GMOs

By Colin Todhunter

Source: RINF

When rich companies with politically-connected lobbyists and seats on public bodies bend policies for their own ends, we are in serious trouble. It is then that public institutions become hijacked and our choices, freedoms and rights are destroyed. Corporate interests have too often used their dubious ‘science’, their lobbyists, their political connections and their presence at the heart of government to subvert, for their own commercial benefit, institutions that are supposed to protect the public interest. Once their power has been established, anyone who questions them or who stands in their way can expect a very bumpy ride.

The revolving door between the private sector and government bodies has been well established. In the US, many senior figures from the genetically modified organisms (GMO) industry, especially Monsanto, have moved with ease to take up positions with the Food and Drug Administration and the Environmental Protection Agency and within the government. Writer and researcher William F Engdahl writes about a similar influence in Europe, noting the links between the GMO sector and the European Food Safety Authority. He states that over half of the scientists involved in the GMO panel which positively reviewed Monsanto’s study for GMO maize in 2009, leading to its EU-wide authorisation, had links with the biotech industry.

“Monsanto should not have to vouchsafe the safety of biotech food. Our interest is in selling as much of it as possible. Assuring its safety is the FDA’s job” – Phil Angell, Monsanto’s director of corporate communications, in “Playing God in the Garden”, New York Times Magazine, October 25, 1998.

Phil Angell’s statement raises an obvious question: who, then, should vouch for its safety, especially when the relevant public bodies have been severely compromised? Monsanto has all angles covered.

When corporate interests are able to gain access to such positions of power, it is little wonder that they have heavy-duty tools at their disposal to fend off criticism by any means necessary.

A well-worn tactic of the pro-GMO lobby is to slur and attack figures who have challenged the ‘science’ and claims of the industry. Some years ago, under threats of lawsuits and UK government pressure, the top research scientist Dr Arpad Pusztai was effectively silenced over his research concerning the dangers of GM food. A campaign was set in motion to destroy his reputation. Professor Seralini and his team’s research was also met with intense industry pressure, with Monsanto effectively targeting the heart of science to secure its commercial interests. There are numerous examples of scientists being targeted like this. A WikiLeaks cable highlighted how GMOs were being forced into European nations by the US ambassador to France, who plotted with other US officials to create a ‘retaliatory target list’ of anyone who tried to regulate GMOs. That clearly indicates the power of the industry.

What the GMO sector fails to grasp is that the onus is on it to prove that its products are safe. And it has patently failed to do this. No independent testing was done before Bush senior allowed GMOs onto the US market. The onus should not be on others to prove they are safe (or unsafe) after they are on the market, especially as public attorney Steven Druker’s book ‘Altered Genes, Twisted Truth’ shows that GMOs are on the US market due to fraudulent practices and the bypassing of scientific evidence pointing to potential health hazards.

We therefore have the right to ask whether we should trust the studies, carried out by the sector itself, that claim GM crops are safe. Let us turn to Tiruvadi Jagadisan for an answer.

He worked with Monsanto for nearly two decades, including eight years as the managing director of India operations. A few years ago, he stated that Monsanto “used to fake scientific data” submitted to government regulatory agencies to get commercial approvals for its products in India. The former Monsanto boss said the government regulatory agencies with which the company dealt in the 1980s simply depended on data supplied by the company when giving approvals to herbicides. As reported in India Today, he is on record as saying that India’s Central Insecticide Board simply accepted foreign data supplied by Monsanto and did not even have a test tube to validate the data, which at times was faked.

Now that scientists such as Professor Seralini are, in a sense, playing catch-up by independently testing previously untested GMOs, they are attacked. However, the attacks on Seralini and his study have been found to be based on little more than unscientific polemics and industry pressure. In fact, in a new study, Seralini highlights the serious flaws of industry-backed studies that were apparently slanted to distort results. It remains to be seen whether he and his team are in for another bout of smears and attacks.

But this is symptomatic of the industry: it says a product is safe, therefore it is – never mind that science is being used as little more than an ideological smokescreen. We are expected to take its claims at face value. The revolving door between top figures at Monsanto and positions at the FDA makes it difficult to see where the line between lobbying and regulation is actually drawn. People are rightly suspicious of the links between the FDA and the GMO industry in the US, and of the industry’s links with the regulatory body within the EU.

GM represents the so-called “Green Revolution’s” second coming. Agriculture has changed more over the last two generations than it did in the previous 12,000 years. Environmentalist Vandana Shiva notes that, after 1945, chemical manufacturers who had been involved in the weapons industry turned their attention to applying their chemical know-how to farming. As a result, ‘dwarf seeds’ were purposely created to respond specifically to those chemicals. Agriculture was transformed into a chemical-dependent industry that has destroyed much biodiversity. What we are left with is crop monocultures, which negatively impact food security and nutrition. In effect, modern agriculture is part of a paradigm of control based on mass standardization and dependency on corporate products.

The implications have been vast. Chemical-industrial agriculture has proved extremely lucrative for the oil and chemicals industry, courtesy of the oil-rich Rockefeller interests that were instrumental in pushing the Green Revolution throughout the world. It has also served to maintain and promote Western hegemony, not least via ‘structural adjustment’: the uprooting of traditional farming practices in favour of single-crop, export-oriented policies; dam building to supply what became a highly water-intensive industry; loans and indebtedness; and boosted demand for the US dollar.

Agriculture has been a major tool of US foreign policy since 1945 and has helped to secure its global hegemony. One need look no further than current events in Ukraine, where the strings attached to financial loans are resulting in the opening up of (GM) agriculture to Monsanto. From Africa to India and across Asia, the hijack of indigenous agriculture and food production by big corporations is a major political issue as farmers struggle for their rights to remain on the land, retain ownership of seeds, grow healthy food and protect their livelihoods.

Apart from tying poorer countries into an unequal system of global trade and reinforcing global inequalities, the corporate hijacking of food and agriculture has had many other implications, not least where health is concerned.

Dr Meryl Hammond, founder of the Campaign for Alternatives to Pesticides, told a Canadian parliament committee in 2009 that a raft of studies published in prestigious peer-reviewed journals point to strong associations between chemical pesticides and a vast range of serious life-threatening health consequences. Shiv Chopra, a top food advisor to the Canadian government, has documented how all kinds of food products that were known to be dangerous were passed by the regulatory authority and put on the market there due to the power of the food industry.

Severe anemia, permanent brain damage, Alzheimer’s, dementia, neurological disorders, reproductive problems, diminished intelligence, impaired immune systems, behavioural disorders, cancers, hyperactivity and learning disabilities are just some of the conditions that numerous studies have linked to our food.

Of course, just as with cigarettes and the tobacco industry before it, ‘proving’ the glaringly obvious link will take decades, as deceit is passed off as ‘science’ or becomes institutionalized through the hijacking of government bodies by the corporations involved in food production.

Yet anyone who questions the need for GMOs in the first place, or points to the risks they bring and the devastating impacts they have, is painted as clueless, as indulging in scaremongering and falsehoods, and as standing in the way of human progress. But can we expect much better from an industry that has a record of smearing and attempting to ruin people who criticise it? Are those of us who question the political links of big agritech and the nature of its products ready to take lessons on ethics and high-minded notions of ‘human progress’ from anyone involved with it?

This is an industry that has contaminated crops and bullied farmers with lawsuits in North America, an industry whose companies have been charged with and most often found guilty of contaminating the environment and seriously damaging health with PCBs and dioxins, an industry complicit in concealing the deadly impact of GM corn on animals, an industry where bribery seems to be second nature (Monsanto in Indonesia), an industry associated with human rights violations in Brazil and an industry that will not label its foods in the US.

A great myth put forward by the pro-GMO lobby is that governments are freely choosing to adopt GMOs. Even a brief analysis of the politics of GM shows that this is nonsense. Various pressures are applied, and agritech companies have captured policy bodies and hold strategic sway over the WTO and trade deals like the TTIP.

For instance, take the 2005 US-India nuclear deal (allowing India to develop its nuclear sector despite its not being a signatory to the Non-Proliferation Treaty, and allegedly pushed through with a cash-for-votes tactic in the Indian parliament). It was linked to the Knowledge Initiative on Agriculture, which was aimed at widening access to India’s agricultural and retail sectors. This initiative was drawn up with the full and direct participation of representatives from various companies, including Monsanto, Cargill and Walmart.

When the most powerful country comes knocking at your door seeking to gain access to your markets, there’s a good chance that once its corporate-tipped jackboot is in, you won’t be able to get it out.

And it seems you can’t. So far, Bt cotton has been the only GM crop allowed in India, but the open field trials of many GM crops are now taking place around the country despite an overwhelming consensus of official reports warning against this. The work of numerous public bodies and research institutes is now compromised as a result of Monsanto’s strategic influence within India (see this and this).

If global victory cannot be achieved by the GMO biotech sector via the hijack of public bodies and trade deals or intimidation, then the politics of another form of contamination may eventually suffice:

“The hope of the industry is that over time the market is so flooded [with GMOs] that there’s nothing you can do about it. You just sort of surrender” – Don Westfall, biotech industry consultant and vice-president of Promar International, in the Toronto Star, January 9, 2001.

Open field planting is but one way of achieving what Westfall states. Of course, there are numerous other ways too (see this).

As powerful agribusiness concerns seek to ‘consolidate the entire food chain’ with their seeds, patents and GMOs, it is clear that it’s not just the health of the nation (any nation) that is at stake but the global control of food and by implication nations.

“What you are seeing is not just a consolidation of seed companies, it’s really a consolidation of the entire food chain” – Robert Fraley, co-president of Monsanto’s agricultural sector, in the Farm Journal, 1996. Quoted in: Flint, J. (1998), “Agricultural industry giants moving towards genetic monopolism”, Telepolis, Heise.

Colin Todhunter is an independent writer: colintodhunter.com