Towards a Critical Public Pedagogy of Predatory Anthropocene


By Michael B. McDonald

Source: The Hampton Institute

In 2015, a group of scientists published "The Trajectory of the Anthropocene: The Great Acceleration". They showed that rising consumption and increasing rates of impact on Earth systems began after the Second World War. The expansion of economic activity, charged by increasing resource use, created new technologies that expanded rates of consumption further. This celebrated new socio-economic phase, called the Great Acceleration, was supposed to lead to full employment and a bright future for all. It was also the beginning of a new phase of world capitalism, accelerated by increasing urbanization. By 2008 humanity had officially entered a new urban phase in which 50% of the earth's population lives in urban spaces. More cities will be built in the next thirty years than in all previous human history. Earth System scientists have shown that all of these changes are having unprecedented impacts on the Earth. Human life is changing the Earth; they call this the Anthropocene.

But the Great Acceleration led neither to full employment nor to a bright future. In fact, it has led to massive inequality, created by a very small percentage of people controlling a staggering amount of wealth. In 2010, OECD countries had 18% of the earth's population but accounted for 74% of GDP. But only 0.1% controlled this vast wealth, through a system that I call predatory anthropocene.

The system of predatory anthropocene can be found in changes to the global economy and a fundamental shift in the way the economy works through its transformation of subjective, social and environmental ecologies, what Felix Guattari called the Three Ecologies. One aspect of this change has been called semiocapitalism, the blending of imagination, ideas, language and capital. Semiocapitalism works by capturing evolutionary life. Belonging, for instance, is now produced by the consumption of psycho-social products that gain economic value in consumption and are financed by increasing debt. The GDP of the United States is now 70% consumption.

Making community through mass consumption is eroding the anthropological basis upon which human life is built. We need a language for this. Perhaps we need to recognize that the communicative and biological systems of the human species have habitats. The biosphere sustains biological life while the ethnosphere sustains communicational life. The biosphere is quite well known, but the ethnosphere less so. Wade Davis suggests that the ethnosphere is a global quilt of local cultures, a band of cultural life functioning in tandem with the biosphere for the creation, organization, and expression of human communicational life.[1] The ethnosphere is a collection of languages, ideas, and dreams. It is the anthropological rituals that have accompanied human evolution and organized social reproduction; it is the institution of language[2] in all its complexity, but it is also beyond language.[3] When people talk about humanity in general, they mean the biosphere and ethnosphere, the cultures of the world in their physical, expressive, subjective dimensions. But now ethnosphere complexity is reduced by global commodities: unique cultures consumed by Hollywood hegemony, human imagination consumed by consumer products, dreams replaced by corporate-produced and globalized desires. A single system is producing hegemony in ways that no single system has ever before been capable of. It is necessary for us to see that our species is under threat by a monster system that we have created, a monstrous, cancerous, predatory system poisoning the Earth. Henry Giroux has argued that:

What makes American society distinct in the present historical moment are a culture and social order that have not only lost their moral bearings but produce levels of symbolic and real violence whose visibility and existence set a new standard for cruelty, humiliation, and mechanizations of a mad war machine, all of which serves the interest of the political and corporate walking dead, the new avatars of death and cruelty, that plunder the social order and wreak ecological devastation. We now live in a world overrun with flesh-eating zombies, parasites who have a ravenous appetite for global destruction and civic catastrophe. (2014, xi-xii)[4]

Because I follow Guattari's cybernetic view, I am less certain than Giroux appears to be that it is possible to tell zombies from non-zombies in a period where a) agribusiness replaces agriculture and transforms all aspects of domestic life, b) stretches of suburbs wipe out, without social discussion, the farmland that has laid the foundation of human flourishing, c) mounting debt continues without slowing and without discourse in the public sphere, and d) waves of fellow humans are dislocated every day due to military, economic, and environmental calamities. And none of this is news; we watch all of it studiously, staring at our displays, unmoved by the misery and pain we see on the faces, and hear in the cries, of fellow humans. Too many of us escape our responsibility to confront this pain by fleeing to walled-in communities whose walls are maintained not by bricks but by the capacity to carry the mortgage debt (which machinically contributes to predatory anthropocene) in the hopes of living in relative safety, while the poor (who cannot access debt) are left in decaying city centers. But as foreclosures swept across America after the housing bubble burst, suburban safety was shown to be precarious. It is important for us to take notice of the fact that we know all of this and collectively do very little to change it. We sign petitions on Facebook, but we still shop at malls that we built on farmland, and we clearly have little access to empathy. And I am not saying this to be critical of you. I am truly stuck. After many years of being inspired by Adbusters and semio-politics and culture jamming, I'm not sure what the next step is. I feel free space disappearing. I'm looking for options.

This difficulty of expressing empathy tells us something about hegemony under semiocapitalism. We now know that empathy is not something we develop, but something that we shut down. Vittorio Gallese, in "The Manifold Nature of Interpersonal Relations: The Quest for a Common Mechanism", has shown that for us to "know that another human being is suffering or rejoicing, is looking for food or shelter, is about to attack or kiss us, we do not need verbal language" (Virno 2008: 175); we only need the activation of what Gallese called mirror neurons, a "class of premotor neurons [that] was discovered in the macaque monkey brain that discharged not only when the monkey executes goal-related hand actions like grasping objects, but also when observing other individuals (monkeys or humans) executing similar actions" (Gallese: 522). Experiments successfully illustrated that mirror neurons are also present in the human brain, "positioned in the ventral part of the inferior frontal lobe, consisting of two areas, 44 and 45, both of which belong to the Broca region" (Virno: 177). Mirror neurons allow us to experience what we see. When we see someone doing something that we've never done, our brain reacts as if we were doing it, what Gallese calls "embodied simulation." This means that empathy is not something that we need to develop; it is something that is functioning in our brains whether we like it or not. But as Paolo Virno points out, humans are clearly adept at seeing other humans as not-humans in order to override "embodied simulation". We are constantly unmoved watching violent death in both fiction and non-fiction, and constantly enacting laws to restrict sexuality and eroticism in the social sphere. In this context there is little doubt that a public pedagogy of human negation is taking place, one that values violence and negates the erotic energy that produces new human life!
What this means is that "every naturalist thinker must acknowledge one given fact: the human animal is capable of not recognizing another human animal as being one of its own kind." How does this public pedagogy of negation occur? Virno argues that verbal language "distinguishes itself from other communicative codes, as well as from cognitive prelinguistic performance, because it is able to negate any type of semantic content" (176). Through language we are able to negate others as not-human, shutting down the empathy that is produced by mirror neurons. But all is not lost: as Paulo Freire points out, pedagogies of dehumanization can be countered through critical pedagogy. That we might learn to negate dehumanization is our hope, to dissolve the oppressor-oppressed binary through the creation of new anti-predatorial significations. Virno suggests that while language introduced human-negation into communication, it also provides us the technology to negate negation. In this way critical pedagogy is the negation-of-negation. But only when it is used in this way. I make one amendment to Virno's suggestion: it is necessary to go beyond the notion of linguistic negation to identify the ways that negation works in the production of subjectivity, not just in linguistic negation but in the complex existential negations that occur within complex machinic semiotics. It is necessary to see the ways that the production of aesthetic systems produces collective subjectivities that produce We'ness as well as Other'ness.

Cultural technologies produce cultural workers who reproduce subjectivity-producing systems that produce subjects who reduce the ethnosphere and pollute the biosphere. Theodor Adorno was right to be concerned about the culture industry, just as Walter Benjamin saw with clear sight the dangers of the absorption of aesthetics into politics. They both saw that the industrialization of the satisfaction of desire, what we might call affective-capitalism, has significant socio-political-economic impacts. There is a real danger when anthropological rituals developed for social life are replaced by capitalist products. The production and satisfaction of desire on the marketplace is a constant undermining of love of the local, a replacement of belonging with having the same mass-manufactured private property, the replacement of environmentally-embedded anthropological bonds with capital-resource-consuming exchange. The production of subjectivity is consumed by the factory, negating living, thus extending the contradictions of capitalism beyond the factory into all aspects of lived time. Giroux has called this a "new kind of authoritarianism that does not speak in the jingoistic discourse of empowerment, exceptionalism, or nationalism. Instead, it defines itself in the language of cruelty, suffering, and fear, and it does so with a sneer and an unbridled disdain for those considered disposable. Neoliberal society mimics the search for purity we have seen in other totalitarian societies" (2014, xvii). And it does so through the production of subjectivity, in the distribution of social subjection and the institution of machinic enslavement.
Together these form the contents of the public pedagogy of the culture industry: the negation of lived time that blocks access to mirror neurons, limits our ability to negate the negations of the neoliberal culture industry, and thus limits our ability to resist through the production of life-affirming social machines, liberatory and collectively produced social subjectivations, and life-affirming machinic enslavements.
I, Terminator

Some people, however, argue that the changes I call predatory anthropocene are a step forward for humanity. Luciano Floridi, for instance, imagines a new humanity as interconnected informational organisms (inforgs) active in "sharing with biological agents and engineered artifacts, a global environment ultimately made of information" (2011, 9). Collectively these inforgs produce an infosphere that either replaces or contributes to the ethnosphere. But Floridi does not account for political economy and therefore misses that his dreams of the infosphere are enslaved by the algorithms of capitalism.

Franco ‘Bifo’ Berardi, however, shows that inforgs are not liberated informational workers but the ‘cognitariat’ (exploited proletarians of information), controlled by the automatisms of machinic enslavement, no longer disciplined but subjectively captured within the new means of control. Machinic enslavement is not discipline, but it is nonetheless controlling. No longer is there a need for an authority to hover over your shoulder to keep you in line. Machinic enslavement works to lead you into accepting the circuits of capture and control embedded cybernetically in modes of production, exchange, and consumption. In the machinic enslavement of predatory anthropocene your only value is through economic consumption, and control is located in your desire to fulfill your consumptive role. Desire (libidinal, economic, social) is no longer a location of liberation, but a mechanism of discipline. This is power within predatory anthropocene.

Floridi’s infosphere and its cognitarians are colonizers machinically enslaving dreams and desires. Their colonization does not in fact produce the infosphere but instead a nightmarish mechanosphere. The mechanosphere converts the anthropological ethnosphere into capitalist products; cognitive capitalism “produces and domesticates the living on a scale never before seen” (Boutang 2011, 48). Felix Guattari and Franco Berardi “emphasize that entire circuits and overlapping and communicating assemblages integrate cognitive labor and the capitalistic exploitation of its content”[5] in a model they call semiocapitalism, which captures “the mind, language and creativity as its primary tools for the production of value” (Berardi 2009, 21). Our language is being transformed into capitalist value; our words, dreams, desires, and subjectivities are lost to the mechanosphere, “the authoritarian disimagination machine that affirms everyone as a consumer and reduces freedoms to unchecked self-interest while reproducing subjects who are willingly complicit with the plundering of the environment, resources, and public goods by the financial elite” (Giroux 2014, xxi).

Predatory anthropocene not only massively increases earth system impacts but creates massive inequality. In early 2015 Oxfam released Working for the Few, a terrifying document showing that “Almost half of the world’s wealth is now owned by just one percent of the population,” that “The bottom half of the world’s population owns the same as the richest 85 people in the world,” and that this already extreme economic disparity is getting worse.

But we do not tell stories of predatory anthropocene to our children. Instead we tell myths of consumption, stories of gleeful elves happily working in non-unionized factories making toys for unproblematically good children, all the while supported by a covert group of elf spies that complicit parents move around their house for weeks. This is the childhood public pedagogy of predatory anthropocene, where domestic life is machinically enslaved to global capitalism, domesticated to surveillance-of-consumption, young lives converted to effective consumer-citizens. Perhaps it’s time to start telling our children the very true story of predatory anthropocene, the killer system that we have created and released into our world but refuse to name, refuse to accept, and spend a great deal of money and words denying. There is no sense in denying predatory anthropocene. We need to talk of the monster that is killing our planet; we need to develop a critical pedagogy of predatory anthropocene, to learn to negate the negation.



[1] Davis, Wade. (2007). Light at the Edge of the World: A Journey Through the Realm of Vanishing Cultures. Vancouver, BC: Douglas & McIntyre Ltd.

[2] Virno, Paolo. (2008). Multitude: Between Innovation and Negation. Los Angeles, California: Semiotext(e) Foreign Agents Series.

[3] Here I am thinking about post-Spinozist philosophers who argue for a semiotics beyond language signification and even beyond logocentric significations, including Gilles Deleuze, Felix Guattari, Michel Foucault, Maurizio Lazzarato, and Rosi Braidotti. Most compelling is the suggestion by Deleuze and Guattari, which Lazzarato has picked up on in Signs and Machines and Governing by Debt, that there is a machinic order as well as a logocentric order. My argument here is that predatory anthropocene functions through a machinic order that is little impacted by traditional semiotics, by political sloganeering, or even by radical critique; that there must be a politics of doing, or of dropping out of predatory anthropocene in the way that Franco ‘Bifo’ Berardi suggests in After the Future.

[4] Giroux, Henry. (2014). Zombie Politics and Culture in the Age of Casino Capitalism. New York: Peter Lang Press.

[5] Genosko, Gary. (2012). Remodeling Communication: From WWII to WWW. Toronto: University of Toronto Press. (p. 150)


Everything You Think You Know About Addiction is Wrong: Smashing the Drug War Paradigm


By Matt Agorist

Source: The Free Thought Project

“Overwhelming evidence points to not just the failure of the drug control regime to attain its stated goals but also the horrific unintended consequences of punitive and prohibitionist laws and policies,” states a study, published by the Global Commission on Drug Policy (GCDP).

“A new and improved global drug control regime is needed that better protects the health and safety of individuals and communities around the world,” the report says. “Harsh measures grounded in repressive ideologies must be replaced by more humane and effective policies shaped by scientific evidence, public health principles and human rights standards.”

This sudden onset of logic by political bodies across the globe is likely due to the realization of the cruel and inhumane way governments deal with addiction. Arbitrary substances are deemed “illegal,” and anyone caught in possession of those substances is then kidnapped and locked in a cage, or even killed.

The fact that this barbaric and downright immoral practice has been going on for so long speaks to the sheer ignorance of the state and its dependence upon violence to solve society’s ills.

The good news is that the drug war’s days are numbered, especially now that the issue has reached the White House, which is taking action, even if only symbolically. Evidence of this is everywhere. States are defying the federal government and refusing to lock people in cages for marijuana. Colorado and Washington served as a catalyst in a seemingly exponential awakening to the government’s immoral war.

Following suit were Oregon, D.C., and Alaska. Medical marijuana initiatives are becoming a constant part of legislative debates nationwide. We’ve even seen bills that would not only completely legalize marijuana but deregulate it entirely, like corn.

As more and more states refuse to kidnap and cage marijuana users, the drug war will continue to implode.

Knowledge plays a key role in this battle against addiction tyranny, and investigative journalist Johann Hari has some vital information to share. In a recent TEDx talk, Hari smashes the paradigm of the war on drugs.

What really causes addiction — to everything from cocaine to smart-phones? And how can we overcome it? Johann Hari has seen our current methods fail firsthand, as he has watched loved ones struggle to manage their addictions. He started to wonder why we treat addicts the way we do — and if there might be a better way. As he shares in this deeply personal talk, his questions took him around the world, and unearthed some surprising and hopeful ways of thinking about an age-old problem.


Now Streaming: The Plague Years


By A. S. Hamrah

Source: The Baffler

When things are very American, they are as American as apple pie. Except violence. H. Rap Brown said violence “is as American as cherry pie,” not apple pie. Brown’s maxim makes us see violence as red and gelatinous, spooned from a can.

But for Brown, in 1967, American violence was white. Explicitly casting himself as an outsider, Brown said in his cherry pie speech that “violence is a part of America’s culture” and that Americans taught violence to black people. He explained that violence is a necessary form of self-protection in a society where white people set fire to Bowery bums for fun, and where they shoot strangers from the towers of college campuses for no reason—this was less than a year after Charles Whitman had killed eleven people that way at the University of Texas in Austin, the first mass shooting of its kind in U.S. history. Brown compared these deadly acts of violence to the war in Vietnam; President Lyndon B. Johnson, too, was burning people alive. He said the president’s wife was more his enemy than the people of Vietnam were, and that he’d rather kill her than them.

Brown, who was then a leader of the Student Nonviolent Coordinating Committee and who would soon become the Black Panther Party’s minister of justice, delivered a version of this speech, or rant, to about four hundred people in Cambridge, Maryland. When it was over, the police went looking for him and arrested him for inciting a riot. Brown’s story afterward is eventful and complicated, but this is an essay about zombie movies. Suffice it to say, Brown knows about violence. Fifty years after that speech, having changed his name to Jamil Abdullah al-Amin, he’s spending life in prison for killing a cop.

The same day Brown was giving his speech in Maryland, George A. Romero, a director of industrial films, was north of Pittsburgh in a small Pennsylvania town called Evans City. Romero was shooting his first feature film, a low-budget horror movie in black and white called Night of the Living Dead. Released in October 1968, the first modern zombie movie tells the story of a black man trying to defend himself and others from a sudden plague of lumbering corpses who feed on the living. At the film’s end, he is unceremoniously shot and killed by cops who assume he is a zombie trying to kill them. The cops quickly dispose of his body, dumping it in a fire with a heap of the undead, as a posse moves on to hunt more zombies.

Regional gore films were nothing new in themselves; a number had appeared earlier in the 1960s. Night of the Living Dead, with its shambling, open-mouthed gut-munchers dressed in business suits and housecoats, might have seemed merely gross or oddly funny in a context other than the America of 1968. But Martin Luther King Jr. had been assassinated six months before its release. The news on TV, which most people still saw in black and white, consisted largely of urban riots and war reports from Vietnam. The My Lai Massacre had occurred the month before King was shot.

Romero’s film, seen in the United States the year it came out, had more in common with Rome, Open City than it did with a drive-in horror movie made for teens—it was close to a work of neorealism. And it was unfunny and dire, much like John Cassavetes’s Faces, released the same year, whose laughing drunks stopped laughing when they paused to look in the mirror. Romero was a revisionist director of horror in the same way that Peckinpah and Altman were in their career-making genres, the western and the war movie.

Romero cast an African American in the lead, and he shifted the horror genre’s dynamic, aligning it with black-and-white antiwar documentaries like Emile de Antonio’s In the Year of the Pig, also released in 1968, and distinguishing it from the lurid color horror films Roger Corman and Hammer Films had been turning out up till then. Those films made certain concessions to the film industry; Night of the Living Dead did not. This was an American horror movie, so it needed no English accents or familiar character actors. It was grim and unflinching, showing average citizens, played by average people, eating the arms and intestines of their fellow townsfolk. Romero drove home this central point—that a zombie-infested America differed from the status quo only in degree, not in kind—by ending his film with realistic-looking fake news photos depicting his characters’ banal atrocities.

Mainstream film reviewers, including Roger Ebert, were shocked and disgusted by Night of the Living Dead. They discouraged people from seeing it, but Romero’s images proved to be indelible. The film’s reputation grew. In 1978 Romero made the film’s first sequel, Dawn of the Dead, this time in color. Today, if there’s one thing every American knows, it’s that zombies can only be killed with a shot to the head. This is common knowledge, cultural literacy, a kind of historical fact, like George Washington chopping down the cherry tree. American-flag bumper stickers assert that “these colors don’t run,” but one of them does. It runs like crazy through American life, through American movies, and now TV, like a faucet left on.

Dead Reckonings

The Huffington Post has had a Zombie Apocalypse header since 2011, under which the editors file newsy blog posts chronicling our continuing fascination with zombie pop culture, alongside any nonfiction news story horrible enough to relate to zombies or cannibalism. The infamous Miami face-eater attack of May 2012, which the media gleefully heralded as the start of a “real” zombie apocalypse, contributed to America’s sense that it could happen here, provided we wished for it hard enough. Reading through the Zombie Apocalypse posts, one gets a growing sense that we want the big, self-devouring reckoning to happen because it is the one disaster we are truly mentally prepared for. It won’t be the total letdown of the Ebola scare.

The face-eating incident was initially linked to bath salts: ground-up mineral crystals everyone hoped would become the new homemade drug of choice for America’s scariest users. It turned out the perpetrator, although naked, was only high on marijuana. He was black, killed by the police as he gouged out his homeless victim’s eyes and chewed his face on a causeway over Biscayne Bay. The incident was captured on surveillance video. Here in the golden age of user-generated content, the zombie movies self-generate—much like zombies themselves. The bridge backdrop of this all-too-real zombie vignette neatly summed up both the crumbling condition of America’s infrastructure and our more generalized state of neoliberal collapse.

The zombie apocalypse, our favorite apocalypse, seems to unite the right and left. It combines the apocalypse brought about by climate change and the subsequent competition for scant resources with the one loosed by secret government experiments gone awry. Better still, both of these scenarios, as we’re typically shown in graphic detail, will necessitate increased gun-toting and firearms expertise.

More than that, the fast-approaching zombie parousia allows us to indulge our fantasies of a third apocalypse, one that only the most clueless don’t embrace: the consumerist Day of Judgment, in which we will all be punished for being fat and lazy and living by remote control, going through our daily routines questioning nothing as the world falls apart and we continue shopping. Supermarkets and shopping carts, malls and food warehouses all figure prominently in the iconography of the post–Night of the Living Dead zombie movie, reminding us that even in our quotidian consumerist daze, we are one step away from looting and cannibalism, the last two items on everyone’s bucket list.

Still, despite its galvanizing power to place all of humanity on the same side of the cosmic battlefront, the zombie apocalypse, like all ideological constructs, nonetheless manages to cleave the world into two camps. One camp gets it and the other doesn’t. One is aware the apocalypse is under way, and the other is blithely oblivious to the world around it.

To confuse matters further, people move in and out of both camps, becoming inert, zombified creatures when obliviousness suits their mood. People blocking our progress on the street as they natter into their hands-free earsets stare straight ahead, refusing to admit that other people exist. At least they don’t bite us as we flatten ourselves against walls to pass them without contact. A paradox of the ubiquity of zombie-themed pop culture is that there are surely next to no people left who have not enjoyed a zombie movie, TV show, book, or videogame, yet there are more and more people shuffling around like extras in a zombie film, moving their mouths and making gnawing sounds.

The smartphone-based zombification of street life is a strange testament to Romero’s original insight, which becomes more pronounced as the wealth gap widens. The disenfranchised look ever more zombified to the rich, who in turn all look the same and act the same as they take over whole neighborhoods and wall themselves up in condo towers. This, indeed, is exactly what happens in Romero’s fourth zombie movie, 2005’s Land of the Dead, which predicted things as consequential as what happened during Hurricane Katrina in New Orleans and as minor as the rise of food trucks.

The Zombie Apocalypse is also a parable of the Protestant work ethic, come to reap vengeance at the end of days. It assures us that only very resourceful, tough-minded people will be able to hack it when the dead come back to life. If the rest had really wanted to survive—if they deserved to survive—they would have spent a little less time on the sofa. But here, too, the simple and obvious moral takes a perverse turn: the best anti-zombie combatants should be the ones who’ve watched the most zombie movies, yet by the very logic of our consumer-baiting zombie fables, they won’t be physically capable of survival because all they did was watch TV.

Selective Service

What these couch potatoes will need, inarguably, is the protection of a strong leader, one who hasn’t spent his life in the vain and sodden leisure pursuits that they’ve inertly embraced—Rick Grimes in The Walking Dead, for instance. Why such a person would want to help them is a question they don’t ask. With this search for an ultimate hero, the zombie genre has veered into the escapism of savior lust, leaving Romero’s unflinching, subversive neorealism behind. In Night of the Living Dead, a witless humanity is condemned by its own herd mentality and racism. In latter-day zombie fictions, a quasi-fascist social order is required, uniting us regardless of race, creed, or color.

The predicament of the characters (and the actors) in all the nouveau zombie movies relates to this passive consumerism. Both the characters and the actors in new zombie movies have to act like zombie films don’t already exist, even though the existence of Romero’s films is what permits the existence of the film they are in. Somehow, the characters pull their savvy out of thin air. They must pretend that they have never heard of zombies, even as they immediately and naturally know what to do once their own particular Zombie Apocalypse gets under way.

This paradox underscores the fantasy aspect of the twenty-first-century zombie infatuation, in which a fixed set of roles is available for cosplay in a repeatable drama that already took place somewhere else. The difference between Romero’s films and the new zombie movies is that the more time that passes since 1968, the more Romero’s films don’t seem like they were designed as entertainment—even as they are endlessly exploited by the zombie-themed cultural productions that copy them, and even as they remain entertaining. The new zombie films cannibalize Romero’s films in an attempt to remake them ideologically, so that we will stop looking for meaning in them and just accept the inevitable.

The Primal Hordes

A primal fantasy of the Zombie Apocalypse is that when the shit hits the fan, we will be able to kill our own children or parents. We won’t have a choice. The decision to get rid of the generation impeding us will have been made for us by the zombie plague, absolving us of responsibility. We are, after all, killing somebody who is already dead and who, in his or her current state, is a threat to our continued existence.

Against the generalized dystopian entertainment landscape that followed the economic collapse of 2008, the Zombie Apocalypse made more sense than ever. But YA action-drama dropped it in favor of promoting teen heroes who were stronger than their nice-but-loserish sad sack parents. This is the uplifting generational affirmation that imbues Suzanne Collins’s Hunger Games franchise and Veronica Roth’s Divergent trilogy.

YA comedy, on the other hand, did not ignore zombie movies. Instead, it domesticated the Zombie Apocalypse, making it friendly. Nonthreatening zom-coms showed young viewers how the opposite sex was really not that scary, that being in a couple was still the most important thing, and that dystopias gave nerds an unprecedented chance to prove they could get the girl or boy. Dystopia, it turns out, is really a best-of-all-possible-worlds scenario for starry-eyed-kids-with-a-disease, or so we learn from zom-coms like Warm Bodies and Life After Beth.

The latest iteration of this trend, which sets a zombie heroine in a marginally less dystopian world that mirrors our tentative economic comeback, is the CW TV show iZombie. The series is a brain-eating entertainment for tweens in which they learn you can be okay and have a chill job even if you're a living corpse who's just trying to figure things out. When a zombie gets her own tween-empowerment show on The CW, it's a good indication that zombies don't carry the stern stigmas they used to. Zombies, much like corpses in TV commercials, are used as grotesque comic relief in things like animated Adult Swim shows. Such is the diminished status of the zombie; it is now a signifier that can be plugged in anywhere. To paraphrase the undead philosopher of capitalism's own walking-dead demise: first time cannibalism, second time farce.

Reality Bites

The way zombie movies progress, with isolated groups splitting into factions and various elimination rounds as contestants disappear, suggests that Night of the Living Dead is also a secret source of reality TV. It makes sense, then, that 2009’s Zombieland, one of the first YA dystopian zombie entertainments, was penned by screenwriters who created The Joe Schmo Show and I’m a Celebrity . . . Get Me Out of Here!

Zombieland’s protagonist, a college-age dude played by Jesse Eisenberg, is a bundle of phobias, an OCD-style follower of rules who finds himself in a Zombie Apocalypse after an unexpected date with a hot girl out of his league (Amber Heard) goes wrong. Mentored by Woody Harrelson, who more or less reprised this same role in the Hunger Games movies, Eisenberg’s millennial character undergoes a reality-TV-scripted makeover. In expiation for his pusillanimity in the opening reel, he winds up rescuing a tough girl (Emma Stone) who also would have been out of his league in the pre-Apocalypse scheme of dating. Zombieland presents Eisenberg as gutless and Stone as ruthless, but she’s the one who ends up a hostage, and he becomes her hero. In fact, one of his rules, “Don’t be a hero,” changes on screen to “Be a hero,” as we once again learn that millennials really do have what it takes to kill zombies. Earlier in the film, Eisenberg accidentally shoots and kills a non-zombie Bill Murray, playing himself, showing that millennials can also, regretfully, take out Baby Boomers, including the cool ones who aren’t undead.

Edgar Wright’s 2004 Shaun of the Dead, the first movie zom-com, was a more intelligent version of this same storyline. An English comedy from the “Isn’t it cute how much we suck?” school, Wright’s film acquiesced to the coupling-up plot rom-coms require, but not without first presenting the routine, pointless daily life of its protagonist (Simon Pegg) as pre-zombified. Shaun of the Dead will likely remain the only sweet little comedy in which the protagonist kills his mother, a scene the film has the guts to play without flinching. The joke of Wright’s film is that it takes something as brutal as a zombie apocalypse to wake us from our stupor and to show us how good we had it all along. By the film’s end, Pegg and his girlfriend (Kate Ashfield) are in exactly the same place they were when the film started, but now at least they live together. A cover of the Buzzcocks’ song “Everybody’s Happy Nowadays” jangles over the credits, providing a zombified dose of circa-1979 irony.

Wright and Pegg’s goofy rethinking of the zombie movie proved how firmly zombies are entrenched in our consciousness, and how easy they are to manipulate for comedic effect. The same month Shaun of the Dead came out, a Hollywood remake of Romero’s Dawn of the Dead was released. It, too, cleaned up at the box office. This new Dawn of the Dead seemed like it was made by one of the nerds in the American zom-coms, a jerk desperate to prove he’s bad-ass. (The director now makes superhero movies.) Johnny Cash’s “The Man Comes Around” accompanies the opening credits, setting a high bar for artistic achievement the ensuing film does not come near to clearing. Jim Carroll’s “People Who Died” plays at the end—its placement there as repulsive as anything else in the film.

As all nouveau zombie films must, the remake starts in the suburbs, where a couple is watching American Idol in bed, underscoring the genre’s newfound connection to reality TV. The film’s CGI effects, which at the time injected a souped-up faux energy into the onscreen mayhem, dated instantly. They’re now the kind of off-the-rack effects featured in Weird Al videos when someone gets hit by a car.

The main point of this new Dawn of the Dead is that after the Zombie Apocalypse, people will spend their time barking orders at each other and calling each other “asshole.” The film nods in the direction of loving the military and the police, and totally sanitizes Romero’s use of a shopping mall as a site of consumerist critique. Like many films of the 2000s, it postulates that living in a mall wouldn’t be a Hobbesian dystopia at all; it would be rad. If the remake had been made five years later, maybe it would have had to grapple with the “dead malls” that began to adorn the American landscape with greater frequency after the economy collapsed. Instead, the mall serving as the film’s principal backdrop is spotless and fun. The remake’s island-set, sequel-ready false happy ending makes one long for the denouement of Michael Haneke’s Funny Games—a longing more unimaginable than any real-life wish-fulfillment fantasy about the Zombie Apocalypse actually coming to pass.

The American Way of Death

Fanboys liked the Dawn of the Dead remake and, inexplicably, so did many critics. Manohla Dargis, then at the Los Angeles Times, wrote that the film was “the best proof in ages that cannibalizing old material sometimes works fiendishly well,” a punny sentiment she might well walk back today.

The next year, when George A. Romero released his first new zombie film in twenty years, it did not fare as well in the suddenly crowded marketplace of the undead. While Land of the Dead (2005) is fittingly seen as something of a masterpiece now, on its initial release it puzzled genre fans, who had gotten used to the sort of “fast zombies” that were first featured in the nihilistic-with-a-happy-ending British movie 28 Days Later (2002). Romero’s new film was as trenchant as his others, but many fans weren’t having it.

IMDb user reviews provide a record of their immediate reactions. "This movie was terrible!" one wrote the month Land of the Dead premiered. "The storyline—can't use the word plot as that would give it too much credit—was tedious! Some say it was a great perspective on class? Are you kidding me!!!" Less than a year into George W. Bush's second term, Romero was archly depicting a society much different from the one he'd shown in Night of the Living Dead. This new society—today's—was more class-riven, more opportunistic, more cynical. And Romero, even while moving in the direction of Hawksian classicism, was exposing these failings with radical acuity. His dark fable of two Americas at war over the control of the resources necessary to survive was concise, imaginative, and well constructed. Few at the time wanted to consider the film's style, which seemed out of date compared to the Dawn of the Dead remake. Fewer still wanted to grapple with its implications.

Ten years later, it is clear that no American genre film from that period digests and exposes the Bush era more skillfully than Land of the Dead. Romero’s film was uncomfortably ahead of its time, and like his other zombie work, it hasn’t dated; it speaks of 2015 as much as 2005. Tightly controlled scenes avoid the pointlessness and repetition of the nouveau zombie films, limning class struggle in unexpected ways. Zombies, slowly coming to consciousness, use the tools of the trades from which they’ve been recently dispossessed to shatter the glass of fortified condos. A zombie pumps gas through the windshield of a limo. The rich commit suicide, only to come back to life as zombies and feed on their children. America, as the original-zombie-era Funkadelic LP taught us, eats its young.

As zombie fantasies go, these scenes are much richer than the random, unsatisfying mayhem of the nouveau zombie films. Romero, unlike his counterparts, does not shy away from race. He shows African Americans pushing back against the injustices and indignities of a militarized police state, thereby completing a circle that began with Duane Jones’s performance in Night of the Living Dead.

Walking Tall

For the latest generation of zombie enthusiasts, the zombie genre means just one thing: AMC’s massively popular cable series The Walking Dead. The show is so much better than any of the recent non-Romero zombie movies that it’s among the leading exhibits in the case against the cineplex. The show’s politics and implications are widely discussed, and The Walking Dead has engendered national debate about all sorts of ethical issues, including something Romero’s films raised only in the negative: America’s future. But the first problem The Walking Dead solved was how to make its own debates about these things interesting: whenever scenes get too talky, a “walker” sidles up and has to be dispatched in the time-honored fashion. At its core, the zombie drama is like playing “You’re it!” The show could be called Game of Tag.

The Walking Dead debuted in 2010, emerging from a period in U.S. history when, all of a sudden, we found ourselves in a junked, collapsed, post-American environment. New dystopian dramas, especially the YA ones, reflected this chastened reality. The Walking Dead looked at first like it might become just another placeholding entry in this cavalcade of glumness, much like TNT's Spielberg-produced, families vs. aliens sci-fi show Falling Skies. Zombies were maybe the most dated way possible to dramatize our newly trashed world.

It was The Walking Dead’s dated qualities, however, that saved it from becoming cable TV’s Hunger Games. The show’s grunge aesthetic and majority-adult cast situated it elsewhere. And if that particular elsewhere felt like the past as much as the future, that was part of what made the show work for premium cable’s Gen X audience. Greg Nicotero, a makeup man who worked under Romero, is one of the show’s producers. His presence indicated the people behind the show took the genre seriously, unlike anyone else in Hollywood who had touched it.

Television works by imitating success, by zombifying proven formulas through a process called mimetic isomorphism. When television producers saw The Walking Dead’s ratings beating broadcast-network ratings—a first for cable drama—they took notice and began spawning. Copies of copies like Resurrection, The Last Ship, The Leftovers, and 12 Monkeys showed that plague is contagious, but it doesn’t have to be zombie plague. Meanwhile, The Walking Dead continues its success, and AMC will debut a companion series this summer, unimaginatively called Fear the Walking Dead.

If the worst zombie movies unselfconsciously imitate higher-gloss broadcast-network reality trash like Survivor, The Walking Dead succeeds by staying closer to the lowest grade of cable-network reality TV. The world of The Walking Dead is closer to Hoarders than it is to Big Brother. Hoarders presents an America engulfed in mounds of trash that its psychologically damaged possessors can’t part with. Mounds of Big Gulp cups and greeting cards and heaps of car parts and instruction manuals overwhelm their homes, spilling into their yards. Shows like Storage Wars, Pawn Stars, and American Pickers present an America of valueless junk that maybe somebody can make a buck on—if only by televising it for our own lurid delectation. These shows are the opposite of pre-collapse valuation shows like Antiques Roadshow, in which the junk people had lying around proved to be worth more than they had imagined. The detritus of Hoarders is worthless, the kind of trash that will blow around everywhere after the Zombie Apocalypse.

Hoarders vs. Horde

In his recent book 24/7, an analysis of the end of sleep and our twenty-four-hour consumption-and-work cycle, Jonathan Crary writes that “part of the modernized world we inhabit is the ubiquitous visibility of useless violence and the human suffering it causes. . . . The act of witnessing and its monotony can become a mere enduring of the night, of the disaster.” Zombies, not quite awake but never asleep, are the living-dead reminders of this condition, stumbling through our fictions. When they are not transformed by the wishful thinking of ideology into our pals, they retain this status.

Celebrated everywhere, zombies are the opposite of celebrities, who swoop into our disaster areas like gods from Olympus to rescue us from the calamities that also allow them to flourish. Zombies, far from being elevated, descend into utter undistinguishable anonymity and degradation, which is why they can be destroyed in good conscience. Brad Pitt, one of the producers of ABC’s Resurrection, also starred in World War Z, the most expensive zombie movie ever made. The last line of that odious movie—the first neoliberal zombie movie—is “Our war has just begun.”

Whatever that was supposed to mean to the audience, these fables of the plague years drive home just who the zombies are supposed to be—and who, when the plague hits, will helicopter out holding the machine guns. Col. Kurtz’s faithful devotee from Apocalypse Now, Dennis Hopper, the counterculture hero who became a Republican golf nut, plays the leader of the remaining 1 percent in Land of the Dead. “We don’t negotiate with terrorists,” he says when he’s faced with the choice between his money and our lives.



Obama Accuses Russia of Going After America’s “Good Guy Terrorists”


By Prof Michel Chossudovsky

Source: Global Research

Amply documented but rarely mentioned in news reports, ISIS is a creation of US intelligence, recruited, trained, and financed by the US and its allies, including Britain, France, Saudi Arabia, Qatar, Turkey, Israel, and Jordan.

Until recently, ISIS was known as Al Qaeda in Iraq (AQI). In 2014, it was renamed the Islamic State (Islamic State of Iraq and Syria, or Islamic State of Iraq and the Levant).

Russia is Now Involved in the War on Terrorism

A major turning point in the dynamics of the Syria-Iraq war is unfolding. Russia is now directly involved in the counter-terrorism campaign in coordination with the Syrian and Iraqi governments.

While Washington has acknowledged Moscow’s resolve, Obama is now complaining that the Russians are targeting the “good guy terrorists” who are supported by Washington.

From the Horse’s Mouth

According to the Wall Street Journal:

Russian Airstrike in Syria Targeted CIA-Backed Rebels, U.S. Officials Say

One area hit was location primarily held by rebels receiving funding, arms, training from CIA and allies

One important piece of unspoken information conveyed in this WSJ report is that the CIA is supporting terrorists as a means of triggering "regime change" in Syria, implying the conduct of covert intelligence operations within Syrian territory:

“The U.S. spy agency has been arming and training rebels in Syria since 2013 to fight the Assad regime.” (WSJ, September 30, 2015; emphasis added. Author’s note: covert support to the terrorists was provided from the outset of the war in March 2011.)

The above statement is something which is known and documented but which has barely been acknowledged by the mainstream media.

Al Nusra: “Good Guy Terrorists”

While the Pentagon now candidly acknowledges that the CIA is supporting Al Qaeda affiliated groups inside Syria, including Al Nusra, it nonetheless deplores the fact that Russia is allegedly targeting the “good guy terrorists”, who are supported by Washington:

One of the [Russian] airstrikes hit an area primarily held by rebels backed by the Central Intelligence Agency and allied spy services, U.S. officials said, …

Among seven areas that Syrian state media listed as targets of Russian strikes, only one—an area east of the town of Salamiyah in Hama province—has a known presence of Islamic State fighters. The other areas listed are largely dominated by moderate rebel factions or Islamist groups such as Ahrar al-Sham and the al Qaeda-affiliated Nusra Front. (WSJ, September 30, 2015; emphasis added)

Affiliated with Al Qaeda, Al Nusra is a US-sponsored "jihadist" terrorist organization which has been responsible for countless atrocities. Since 2012, AQI and Al Nusra, both supported by US intelligence, have been working hand in glove in various terrorist undertakings within Syria.

In recent developments, the Syrian government has identified its own priority areas for the Russian counter-terrorism air campaign, which consists essentially of targeting Al Nusra. Al Nusra is described as the terrorist arm of the Free Syrian Army (FSA).

While Washington categorized Al Nusra as a terrorist organization in early 2012, it nonetheless provides support to both Al Nusra and its so-called "moderate rebels" in the form of weapons, training, logistical support, recruitment, etc. This support is channeled by America's Persian Gulf allies, including Qatar and Saudi Arabia, as well as through Turkey and Israel.

Ironically, the UN Security Council in a May 2012 decision "blacklisted Syria's al-Nusra Front as an alias of al-Qaeda in Iraq", namely the ISIL:

a decision that will subject the group to sanctions including an arms embargo, travel ban and assets freeze, diplomats said.

The US mission to the United Nations said none of the 15 council members objected to adding al-Nusra as an alias of al-Qaeda in Iraq on Thursday.

Al-Nusra, one of the most effective forces fighting President Bashar al-Assad, last month pledged allegiance to al-Qaeda leader Ayman al-Zawahri. (Al Jazeera, May 2012)

And now Russia is being blamed for targeting a terrorist entity which is not only on the UN Security Council blacklist but which has ties to the Islamic State (ISIS).

What is the significance of these accusations?

While the media narrative acknowledges that Russia has endorsed the counter-terrorism campaign, in practice Russia is indirectly fighting the US-NATO coalition by supporting the Syrian government against the terrorists, who happen to be the foot soldiers of the Western military alliance, with Western mercenaries and military advisers within their ranks. In practice, Russia is fighting terrorists who are supported by the US.

The forbidden truth is that by providing military aid to both Syria and Iraq, Russia is (indirectly) confronting America. 

Moscow will be supporting both countries in their proxy war against the ISIL, which is supported by the US and its allies.


Having Their Cake and Eating Ours Too


By Chris Lehmann

Source: The Baffler

What are billionaires for? It’s time we sussed out a plausible answer to this question, as their numbers ratchet upward across the globe, impervious to the economic setbacks suffered by mere mortals, and their “good works” ooze across the fair land. The most recent count from Forbes reports a record 1,826 of these ten-figure, market-cornering Croesuses, with familiar North American brands holding down the top three spots: Bill Gates, Carlos Slim, and Warren Buffett. Esteemed newcomers to the list include Uber kingpin Travis Kalanick, boasting $5.3 billion in net worth; gay-baiting, evangelical artery-hardeners Dan and Bubba Cathy, of Chick-fil-A fame ($3.2 billion); and Russ Weiner, impresario of the antifreeze-by-another-name energy drink Rockstar ($2.1 billion). For the first time, too, Mark Zuckerberg has cracked the elite Top 20 of global wealth; in fact, fellow Californians, most following Zuckerberg’s savvy footsteps into digital rentiership, account for 23 of the planet’s new billionaires and 131 of the total number—more than supplied by any nation apart from China and the Golden State’s host country, a quaint former republic known as the United States.

What becomes of the not-inconsiderable surplus that your average mogul kicks up in his rush to market conquest? In most cases, he (and in the vast majority of cases, it is still a “he”) parks his boodle in inflation-boosted goods like art and real estate, which neatly double as venerable monuments to his own vanity or taste.

But what happens when the super-rich turn their clever minds toward challenges beyond getting up on the right side of their well-feathered beds? Specifically, what are the likely dividends of their decisions to “give back to the community,” as the charitable mantra of the moment has it? Once upon a time, the Old World ideal of noblesse oblige might have directed their natural stirrings of conscience toward the principles of mutuality and reciprocity. But this is precisely where the new millennial model of capital-hoarding falls apart. The notion that the most materially fortunate among us actually owe the rest of us anything from their storehouses of pelf is now as unlikely as a communard plot twist in an Ayn Rand novel.

Look around at the charitable causes favored among today’s info-elite, and you’ll see the public good packaged as one continual study in billionaire self-portraiture. The Bill and Melinda Gates Foundation, endowed by a celebrated prep-school graduate and Harvard dropout, devotes the bulk of its endowment and nearly all of its intellectual firepower to laying waste to the nation’s teachers’ unions. The Eli and Edythe Broad Foundation is but the Gates operation on steroids, unleashing a shakedown syndicate of overcapitalized and chronically underperforming charter schools in the beleaguered urban centers where the democratic ideal of the common school once flourished. The Clinton Global Initiative, when it’s not furnishing vaguely agreeable alibis for Bill Clinton’s louche traveling companions, is consumed by neoliberal delusions of revolutionary moral self-improvement via the most unlikely of means—the proliferation of the very same sort of dubious financial instruments that touched off the 2008 economic meltdown. In this best of all possible investors’ worlds, swashbuckling info-moralists will teach international sex workers about the folly of their life choices by setting them up with a laptop and an extended tutorial on the genius of microloans.

This recent spike in elite self-infatuation, in other words, bespeaks a distressing new impulse among the fabulously well-to-do. While past campaigns of top-down charity focused on inculcating habits of bourgeois self-control among the lesser-born, today’s philanthro-capitalist seigneurs are seeking to replicate the conditions of their own success amid the singularly unpromising social world of the propertyless, unskilled, less educated denizens of the Global South. It’s less a matter of philanthro-capitalism than one of philanthro-imperialism. Where once the gospel of industrial success held sway among the donor class, we are witnessing the gospel of the just-in-time app, the crowdsourced startup, and the crisply leveraged microloan. This means, among other things, that the objects of mogul charity are regarded less and less as moral agents in their own right and more and more as obliging bit players in a passion play exclusively devoted to dramatizing the all-powerful, disruptive genius of our info-elite. They aren’t “giving back” so much as peering into the lower depths of the global social order and demanding, in the ever-righteous voice of privilege, “Who’s the fairest of them all?”

Noblesse Sans Oblige

There was plenty to deride in the Old World model of noblesse oblige; it dates back to the bad old days of feudal monarchy, when legacy-royal layabouts not only abjured productive labor entirely, but felt justified in the notion that they owned the souls of the peasants tethered to their sprawling estates. It’s no accident, therefore, that the idea of the rich being in receipt of any reciprocal obligation to the main body of the social order failed to make it onto the American scene. The sturdy mythology of the American self-made man didn’t really permit an arriviste material adventurer to look back to his roots at all, save to assure those within earshot that he’d definitively risen above them by the sheer force of an indomitable will-to-succeed.

But the relevant defining trait is the oblige part: the notion that the wealthy not only could elect to “give back” when it might suit their fancy, but that they had to positively let certain social goods alone—and assertively fund others—by virtue of their privileged station. Traditions such as the English commons stemmed from the idea that certain public institutions were inviolate, so far as the enfeoffing prerogatives of the landowning class went. The state church is another, altogether more problematic, legacy of this ancien régime; in addition to owning feudal souls outright, the higher orders of old had to evince some institutional concern for their ultimate destiny. There was exploitation and corruption galore woven into this social contract, of course, but for the more incendiary figures who dared to take its spiritual precepts seriously, there were also strong speculative grounds for envisioning another sort of world entirely, one in which the radical notion of spiritual equality took hold. As the Puritan Leveller John Lilburne—a noble by birth—put it in 1646, in the midst of the English Civil War:

All and every particular and individual man and woman, that ever breathed in the world . . . are by nature all equal and alike in their power, dignity, authority, and majesty, none of them having (by nature) any authority, dominion, or magisterial power, one over or above another.

Of course, the Levellers clearly were not on the winning side of British history, but this militant Puritan spirit migrated to the American colonies to supply the seedbed of our own communitarian ideal, expounded most famously in John Winthrop’s social-gospel oration “A Model of Christian Charity” aboard the Arbella in 1630. Throughout his sermon, Winthrop repeatedly exhorted his immigrant parishioners to practice extreme liberality in charity. “He that gives to the poor, lends to the Lord,” Winthrop declared in an appeal to philanthropic mutuality far less widely quoted than his fabled simile of the colonial settlement of New England as a city on a hill. “And he will repay him even in this life an hundredfold to him or his.” Citing a litany of biblical precedent, Winthrop went on to remind his mostly well-to-do Puritan flock that “the Scripture gives no caution to restrain any from being over liberal this way.” Indeed, he drove home the point much more forcefully as he highlighted the all-too-urgent imperative for these colonial adventurers to hand over the entirety of their substance for fellow settlers in material distress. “The care of the public must oversway all private respects,” Winthrop thundered—and then, sounding every bit the proto-socialist that his countryman Lilburne was: “It is a true rule that particular estates cannot subsist in the ruin of the public.”

The Accumulator As Paragon

The story of how Winthrop’s model of Christian charity degenerated into the neoliberal shibboleths of the Gates and Zuckerberg age is largely the saga of American monopoly capitalism, and far too epic to dally with here. But there is a key transitional figure in this shift: the enormously wealthy, self-made, and terminally self-serious steel-titan-cum-social reformer Andrew Carnegie. Born in rural Scotland in 1835 to an erratically employed artisan weaver, Carnegie grew up on the Chartist slogans that, amid the more secular social unrest of the industrial revolution, came to supplant the Levellers’ democratic visions of a world turned upside down. When he rose from an apprenticeship in a Pittsburgh telegraph office to true mogul status in the railroad, iron, and steel industries, Carnegie continued to cleave to the pleasing reverie that he was a worker’s kind of robber baron. Thanks to his own class background, he intoned, he had unique insight into the plight of the workmen seeking to hew their livings out of the harsh conditions of a new industrial capitalist social order. “Labor is all that the working man has to sell,” Carnegie pronounced just ahead of a series of wage cuts at his Pittsburgh works in 1883. “And he cannot be expected to take kindly to reductions of wages. . . . I think the wages paid at the seaboard of the United States are about as low as men can be expected to take.”

It was vital to Carnegie’s moral vanity to keep maintaining this self-image as the benevolent industrial noble, and he did so well past the point where his actually existing business interests dictated (as he saw it) the systematic beggaring of his workers. When the managers of Carnegie-owned firms would sell their workers short, lock them out, or bust their unions, Carnegie would typically blame the workers for not obtaining better contracts at rival iron, steel, and railroad concerns. While he might sympathize with their generally weak bargaining position, Carnegie well understood that he couldn’t have his competitors undercutting his own bottom line with cheaper labor costs—and with cheaper goods to market to Carnegie’s customers.

Carnegie’s patrician moral sentiments were genuine; throughout his career, he erected an elaborate philosophical defense of philanthropy as the only proper path for the disposition of riches, and famously spent his last years furiously trying to disperse as much of his fortune as possible to pay for charitable foundations, libraries, church organs, and the like. As he saw it, the mogul receives a sacred charge from the larger historical forces that conspire in the creation of his wealth: the rich man must act as a “trustee” for the needier members of the community.

Because the millionaire had proved his mettle as an accumulator of material rewards in the battle for business dominion, it followed that he had also been selected to be the most beneficent, and judicious, dispenser of charitable support for the lower orders as well. In Carnegie’s irenic vision of ever-advancing moral progress, all social forces were tending toward “an ideal state, in which the surplus wealth of the few will become, in the best sense, the property of the many, because administered for the common good,” as he preached in his famous 1889 essay “The Gospel of Wealth.” “And this wealth, passing through the hands of the few, can be made a much more potent force for the elevation of our race than if it had been distributed in small sums to the people themselves.” The accomplished mogul was, in Carnegie’s fanciful telling, nothing less than a dispassionate expert in the optimal disbursal of resources downward: “The man of wealth,” he wrote, became “the mere agent and trustee for his poorer brethren, bringing to their service his superior wisdom, experience, and ability to administer, doing for them better than they would or could do for themselves.”

Such blissfully un-self-aware flourishes of elite condescension—and the intolerable contradictions that called them into being—point at the tensions lurking just beneath Carnegie’s placid, controlling social muse. For as his own career as a market-cornering industrialist made painfully clear, precisely none of Carnegie’s fortune stemmed from serving out a benevolent trusteeship in the interests of the poor and working masses. Indeed, something far more perverse and unsightly impelled the business model for Carnegie’s commercial and charitable pursuits, as his biographer David Nasaw notes: Carnegie used the alibi of his own enlightened, philanthropic genius as the primary justification for denying collective bargaining rights to his workers.

Since he was clearly foreordained to serve the best interests of these workers better than they could, it was ultimately to everyone’s benefit to transform Carnegie’s business holdings into the most profitable enterprises on the planet—all the better to sluice more of the mogul’s ruthlessly extracted wealth back into the hands of a grateful hoi polloi, once it was rationalized and sanctified by the great man’s “superior wisdom, experience, and ability to administer.” In the sanctum of his New York study, where he spent the bulk of his days once his wealth disencumbered him of direct managerial duties at his Pittsburgh holdings, Carnegie found thrilling confirmation of his enlightened moral standing in the writings of social Darwinist Herbert Spencer. Yes, the wholesale suffering of workers, widows, and orphans might seem “harsh,” Spencer preached to his ardent business readership. But when viewed from the proper vantage—the end point toward which all of humanity’s evolutionary struggles were ineluctably trending—this remorseless process of deskilling, displacement, and death was actually a sacred mandate, not to be tampered with: “When regarded not separately, but in connection with the interests of universal humanity, these harsh fatalities are seen to be of the highest beneficence.”

And so, indeed, it came to pass, albeit a bit too vividly for Carnegie’s own moral preference. At the center of the Carnegie firms’ labor-bleeding business model was a landmark tragedy in American labor relations: the 1892 strike at Carnegie’s Homestead works. Carnegie’s lieutenant, Henry Clay Frick, locked out the facility’s workforce after the Amalgamated Association of Iron and Steel Workers pressed management to suspend threatened wage cuts and pare back punishing twelve-hour shifts for steel workers. Frick clumsily tried to ferry in Pinkerton forces on the Monongahela River to take control of the plant; Homestead workers, backed by their families and local business owners, fought to repel the Pinkerton thugs. Gunfire was exchanged on both sides, killing two Pinkertons and nine workers. Eventually, Frick got the state militia to disperse the crowds of workers and their supporters; with his field of action cleared, the plant’s manager proceeded to starve out the strikers, breaking the strike five months after it began. The Amalgamated Union collapsed into oblivion the following year. No union would ever again darken the door of a Carnegie-owned business, no matter what sort of lip service he continued to pay to the dignity of the workingman in public.

Homestead was a bitter rebuke to Carnegie’s self-image as the workers’ expert missionizing advocate—but tellingly, it didn’t do any lasting damage to the larger edifice of his charitable pretension. Partly, this was a function of Carnegie’s genuine generosity. More fundamentally, though, the steel mogul’s outsized moral self-regard endured in its prim, unmolested state thanks to the larger American public consensus on the proper Olympian status of men of wealth, especially when gauged against the demoralizing spectacle of industrial conflict.

Strings, Attached

The desperate intellectual acrobatics of the self-made Carnegie were never viewed as pathological, for the simple reason that they mirrored the logic by which American business interests at large pursued public favor. In this scheme of things, the lords of commerce were always to be the unquestioned possessors of a magisterial historical prerogative, and the base, petty interests of a self-organized labor movement were always the retrograde obstacle to true progress. What else could it mean, after all, for the owners of capital to always and forever be acting “in connection with the interests of universal humanity”? Following the broad contours of Carnegie’s founding efforts in this sphere, a long succession of American business leaders would proceed to claim for themselves the mantle of enlightened market despotism, from GM CEO Charlie “Engine” Wilson’s breezy midcentury conflation of his corporation’s grand good fortune with that of its host nation to the confident prognostications of today’s tech lords that we are about to efface global poverty in the swipe of a few well-designed apps.

So how does the philanthropic debauching of the public sphere unfold today, now that Carnegie’s bifurcated model of exploitation for charity’s sake has receded into the dimly remembered newsreel footage of the industrial age? Well, for one thing, it’s become a lot less genteel. Trusteeship isn’t the model any longer; it’s annexation.

Take one especially revealing case involving our own age’s pet mogul crusade of school reform. Just five years ago, Mark Zuckerberg made a splashy, Oprah-choreographed gift of $100 million to the chronically low-performing Newark public school district—an announcement also timed to coincide with the national release of the union-baiting school reform documentary Waiting for “Superman.” The idea was to enlist the Facebook wizard’s fellow philanthro-capitalists in a matching donor drive, so that the city’s schools, already staked to a $1 billion state-administered budget, would also pick up $200 million of private-sector foundation dosh, to be spent on charter schools and other totems of managerial faux-excellence. With this dramatic infusion of money from our lead innovation industries, it would be largely a formality to “turn Newark into a symbol of educational excellence for the whole nation,” as Zuckerberg told a cheerleading Oprah.

And sure enough, all the usual deep-pocketed benefactors turned out in force to meet the Zuckerberg challenge: Eli Broad, the Gates Foundation, the Walton Foundation, and even Zuckerberg’s chief operating officer, Sheryl “Lean In” Sandberg, all kicked into the kitty. At the public forums rolling out the initiative—organized for a cool $1.3 million by Tusk Strategies, a consultancy concern affiliated with erstwhile New York mayor Michael Bloomberg’s own school-privatizing fiefdom—Newark parents more concerned with securing basic protections for their kids in local schools, such as freedom from gang violence and drug trafficking, exhorted the newly parachuted reform class to focus on the mundane prerequisites of infrastructure support and student safety. But try as they might, they found their voices continually drowned out by a rising chorus of vacuous reform-speak. “It’s destiny that we become the first city in America that makes its whole district a system of excellence,” then-mayor Cory Booker burbled at one such gathering. “We want to go from islands of excellence to a hemisphere of hope.”

But for all these stirring reprises of the Spencerian catechism on “the interests of universal humanity,” the actual state of schooling in Newark was not measurably improving. The leaders of the reform effort (which was, of course, entitled “Startup:Education”) couldn’t answer the most basic questions about how the rapidly deployed battery of excellence-incubating Newark charter schools would coexist beside the shambolic wrecks of the city’s merely public schools, where a majority of Newark kids would still be enrolled—or even how parents of charter kids would get their kids to and from school, since these wise, reforming souls neglected to allot due funding for bus transportation. Not surprisingly, the new plan’s leaders were also cagey about explaining how all the individual school budgets, charter and public alike, were to be brought into line.

So in short order, the magic Zuckerberg seed money, together with the additional $100 million in matching grants, had all vanished. More than $20 million of that went to pay PR and consultancy outfits like Tusk Strategies, according to New Yorker writer Dale Russakoff, who notes that “the going rate for individual consultants in Newark was a thousand dollars a day.” Another $30 million went to pad teachers’ salaries with back pay to buy off workers’ good will—and far more important, to gain the necessary leverage to dismiss or reassign union-protected teachers who didn’t project as the privatizing Superman type. The most enduring legacy of Startup:Education appears to be a wholly unintended political one: disenchanted Newark citizens rallied behind the mayoral candidacy of Ras Baraka, former principal of Newark’s Central High School and son of the late radical poet Amiri Baraka, who was elected last year on a platform of returning Newark educational policy to the control of the community.

With all due allowances for the dramatically disparate character of the underlying social order, and the shift from an Industrial Age economy to a service-driven information one, it’s nonetheless striking to note just how little about the purblind conduct of overclass charity has changed since Carnegie’s time. Just as Carnegie’s own sentimental and imaginary identification with the workers in his employ supplied him with the indispensable rhetorical cover for beggaring said workers of their livelihoods and rights to self-determination in the workplace, so did the leaders of Startup:Education evince just enough peremptory interest in the actual living conditions of Newark school families to net optimal Oprah coverage. And once the Klieg lights dimmed, the real business plan kicked into gear: a sustained feeding frenzy for the neoliberal symbolic analysts professionally devoted to stage-managing the appearance of far-seeing school reform. These high-priced hirelings were of course less brutal and bloodthirsty than the Pinkertons Frick had unleashed on the Homestead workers, but their realpolitik charge was, at bottom, equally stark: to discredit teachers’ unions and community activists while delivering control of a vital social good into the hands of a remote investing and owning class. If the parents and kids grew restive in their appointed role as stage props for the pleasing display of patrician largess, why, they could just hire Uber drivers to dispatch themselves to the new model charter schools, or maybe scare off local gang members by assembling an artillery of firearms generated via their 3-D printers.

In truth, no magic-bullet privatization plan could begin to address the core conditions that sent the Newark schools spiraling into systemic decay: rampant white flight after the 1967 riots, which in turn drained the city of the property-tax revenues needed to sustain a quality educational system, combined with corruption within the city’s political establishment and (yes) among the leadership of its teachers’ unions. To make local education districts respond meaningfully to the needs of the communities they serve, reformers would have to begin at the very opposite end of the class divide from where Startup:Education set up shop—by giving power to the members of said communities, not their self-appointed neoliberal overseers. In other words, common schools should rightly be understood as a commons, not as playthings for bored digital barons or as little success engines, managed like startups in the pejorative sense, left to stall out indefinitely in beta-testing mode until all the money’s gone.

Andrew Carnegie, at least, had the depth of character to recognize when his vision of his world-conquering destiny had gone badly off the rails. In the last years of his life, his infatuation with the stolid charms of mere libraries and church organs seemed to fade, so he adopted a quixotic quest to recalibrate human character entirely. Starting with an ardent—and quite worthy—campaign to stem the worst excesses of American imperialism in the wake of the Spanish-American War, Carnegie then turned to the seemingly insoluble challenge of stamping out altogether the human propensity to make war. When this latter crusade ran afoul of the colossal carnage unleashed in the Great War, he became an uncharacteristically depressed, isolated, and retiring figure, barely reemerging in public life before his death in 1919.

In today’s America, however, no one learns from our mogul class’s leadership mistakes and moral disasters—we just proceed to copy them faster. So when New York’s neoliberal governor Andrew Cuomo tore a page from the Zuckerberg playbook and launched a system of lavish tax breaks for tech firms affiliated with colleges and universities—surely these educational outposts would be model incubators of just-in-time prosperity—nemesis once again beckoned. Indeed, when Cuomo’s economic savants unleashed tech money to do its own bidding in the notional public sphere, the end results proved to be no different than they had been in the Zuckerberg-funded mogul playground of Newark charter schools. Cuomo’s ballyhooed, billion-dollar, five-year plan for way-new digital job creation—called, you guessed it, “Startup New York”—yielded just seventy-six jobs in 2014, according to a report from the state’s Committee on Economic Development. This isn’t a multiplier effect so much as a subtraction one; it’s hard to see how Cuomo could have netted a less impressive return on investment if he had simply left a billion dollars lying out on the street.

Just as Newark vouchsafed us a vision of educational excellence without the messy parents, neighborhood social ills, and union-backed teachers who louse the works up, so has Cuomo choreographed a seamless model of tax breaks operating in a near-complete economic vacuum. Say what you will about the abuses of Old World wealth; a little noblesse oblige might go a long way in these absurdly predatory times.



Why Empathy Matters


By Zen Gardner


It’s not easy to stay sensitive in such a cruel, desensitized world but it’s imperative we do. That’s the beauty of empathic souls; they have open and loving hearts, even if it hurts, which is why each of us needs to generate and receive so much love and encouragement.

Loving empathy is its own reward, even if we’re not showered directly with supporting human love as much as we’d prefer. The spiritual is supreme. Connection to Source is our unfailing infinite supply line of everything we need. But I agree, it’s sure nice when that consciousness is manifest in another human being and can be shared between us.

That’s the nature of true interpersonal love, and we need to shower it upon each other.

Empathize or Cauterize

I feel strongly that if we don’t allow ourselves to have broken hearts for the lost and suffering, we become a virtually useless consciousness and a betrayal to Source. True heartfelt empathy heals and strengthens ourselves and those around us as we go through this voyage. Letting these sincere emotions course through us, whether it be sharing the pain of battered and betrayed victims of all sorts, including animals, or the sadness of the passing of a dear friend, is good for us and a wonderful opportunity to draw closer to Source.

Using these deep experiences as an energy carrier signal to piggyback other issues on our hearts and minds into the great bosom of Love is a real key. When channeled consciously from the heart, these experiences lead to much greater intuitive understandings, strength of spirit, and that deep, deep peace that passes understanding.

Those who cannot move with these fully awake empathic spiritual impulses have in effect become cauterized. The media works hard at this, bashing the collective head with desensitizing, violent images and mind-crushing propaganda constantly. That’s why they do it. Not just to promote their programs, but to shut down our conscious awareness, the all-empowering Source of love and light.

That’s what they fear the most. That we will awaken and tap into our magnificence.

Counteracting the War Against Love

This is fundamentally what this current hijack attempt is all about. Extinguishing love. Love is soft, love is kind. But it is also extremely powerful. Love is a form of creation at work. It contradicts everything we’re witnessing in today’s media driven control structure.

Express your love every chance you get. Others are starved for it just as you are. Give and it will return, but don’t do it with that motive. It just happens naturally, because that is the co-creative nature of love. So many are starved for a word of encouragement, a kind gesture, a thank you or word of appreciation. The downdraft of ugliness is so strong right now we need to support each other in any way we can.

Make yourself vulnerable. It’s the most protected space there is. Put a little love in your heart – and let it out!

Our Warfare Is Spiritual

The forces of darkness cannot overcome the Light, as hard as they may try. Any success they may seem to have at harnessing humanity for their own ends is so very temporary. While we are infinite spiritual beings, they are temporal, parasitic forces.

Keep that in mind, no matter how things may appear at times.

Let’s fully manifest and get this era done away with by letting Universe work fully through us. It happens one heart at a time, but each of us has to keep doing what we’re each meant to do and be.

Stay soft and loving, yet strong and resolved. Our weapons are spiritual – don’t let them entice you into their arena of mind games and ignorant lower vibrational reactionism. Stay where you are strong, yet engage them nonetheless. On your terms.

Thank all of you who give so much. Please know how loved and appreciated you are by so, so many.

Let it flow – we’re just getting started!

Love always, Zen


Saturday Matinee: Freeway


“Freeway” (1996) is a satisfyingly dark retelling of Little Red Riding Hood, written and directed by Matthew Bright and featuring excellent performances by Reese Witherspoon as the protagonist, a seemingly stereotypical “white trash” teen, and Kiefer Sutherland as the “wolf,” a secretly psychopathic high school guidance counselor. While the film works on the level of a traditional exploitation film, it provides welcome commentary on racism, sexism, classism, and societal hypocrisy.

