Snow, Death, and the Bewildered Herd

By Edward Curtin

Source: Behind the Curtain

Few people at this hour – and I refer to the time before the breaking out of this most grim war, which is coming to birth so strangely, as if it did not want to be born – few, I say, these days still enjoy that tranquility which permits one to choose the truth, to abstract oneself in reflection.  Almost all the world is in tumult, is beside itself, and when man is beside himself he loses his most essential attribute: the possibility of meditating, or withdrawing into himself to come to terms with himself and define what it is he believes and what it is that he does not believe; what he truly esteems and what he truly detests.  Being beside himself bemuses him, blinds him, forces him to act mechanically in a frenetic somnambulism.

– Ortega y Gasset, “The Self and the Other”

As I write these words, the house is being buried in a snowstorm. Heavy flakes fall slowly and silently as a contemplative peace muffles the frenetic agitation and speed of a world gone mad. A beautiful gift like this has no price, though there are those who would like to set one, as they do on everything.  In my mind’s eye I see Boris Pasternak’s Yurii Zhivago, sitting in the penumbra of an oil lamp in the snowy night stillness of Varykino, scratching out his poems in a state of inspired possession.  Outside the wolves howl. Inside the bedroom, his doomed lover, Lara, and her daughter sleep peacefully.  The wolves are always howling.

Then my mind’s lamp flickers, and Ignazio Silone’s rebel character, Pietro Spina (from the novel Bread and Wine), appears.  He is deep into heavy snow as he flees the Italian fascists by hiking into the mountains. There, too, howl the wolves, the omnipresent wolves, as the solitary rebel – the man who said “No” – slowly trudges in a meditative silence, disguised as a priest.

Images like these, apparitions of literary characters who never existed outside the imagination, might at first seem eccentric. But they appear to me because they are, like the silent snow that falls outside, evocative reminders of our need to stop the howling media streams long enough to set our minds on essential truths, to think and meditate on our fates – the fate of the earth and our individual fates. To resist the forces of death we need to concentrate, and that requires slow silence in solitude.  That is why the world’s archetypal arch-enemy, Mr. Death himself, aka Satan, aka Screwtape, advises his disciple Wormwood in C. S. Lewis’s The Screwtape Letters to guard people against the aberration of logic by keeping them distracted with contradictory, non-stop news reports. He tells him: “Your business is to fix his attention on the stream.  Teach him to call it ‘real life’ and don’t let him ask what he means by ‘real.’”

It is a commonplace to say that we are being buried in continuous and never-ending information. Yet it is true.  We are being snowed by this torrent of indigestible “news,” and it’s not new, just vastly increased in the last twenty-five years or so.

Writing fifty-eight years ago, C. Wright Mills argued:

It is not only information they need – in the Age of Fact, information often dominates their attention and overwhelms their capacities to assimilate it….What they need…is a quality of mind that will help them to use information and to develop reason in order to achieve lucid summations of what is going on in the world and of what may be happening within themselves….what may be called the sociological imagination.

Today, as we speed down the information superhighway, Mills’s words are truer than ever.  But how to develop an imagination suffused with reason to arrive at lucid summations?  Is it possible now that “the information bomb” (attributed to Einstein) has fallen?

Albert Camus once said that “at any street corner the feeling of absurdity can strike any man in the face.”  While that is still true today, I would add that the feeling of an agitated and distracted bewilderment is everywhere to be seen as multitudes scan their idiot boxes for the latest revelations. Beeping and peeping, they momentarily quell their nervous anxieties by being informed and simulating proximity through the ether. Permanently busy in their mediated “reality,” they watch as streaming data are instantly succeeded by streaming data in acts of digital dementia. For Camus the absurd was a starting point for a freer world of rebellion. For Walter Lippmann, the influential journalist and adviser to presidents and potentates, “the bewildered herd” – his name for regular people, the 99% – was a beginning and a wished-for end. His elites, the 1%, would bewilder the herd in order to control them. His wish has come true.

A surfeit of information, fundamental to modern propaganda, prevents people from forming considered judgments.  It paralyzes them. Jacques Ellul writes in Propaganda:

Continuous propaganda exceeds the individual’s capacity for attention or adaptation. This trait of continuity explains why propaganda can indulge in sudden twists and turns.  It is always surprising that the content of propaganda can be so inconsistent that it can approve today what it condemned yesterday.

Coherence and unity in claims aren’t necessary; contradictions work just as well.  And the more the better: more contradictions, more consistency, more complementarity – just make it more.  The system demands more.  The informed citizen craves more; craves it faster and faster as the data become dada, an absurdist joke on logical thinking.

Wherever you go in the United States these days, you sense a generalized panic and an inability to slow down and focus.  Depression, anxiety, hopelessness fill the air.  Most people sense that something is seriously wrong, but don’t know exactly what. So they rage and rant and scurry along in a frenzy. It seems so huge, so everything, so indescribable.  Minds like pointilliste canvases with thousands of data dots and no connections.

In the mid-1990s, when the electronic world of computers and the internet was being shoved down our throats by a consortium of national security state and computer company operatives (gladly swallowed then by many and now resulting in today’s total surveillance state), I became a member of The Lead Pencil Club, founded by Bill Henderson (The Pushcart Press) in honor of Thoreau’s father’s pencil factory and meant as a whimsical protest: “a pothole on the information superhighway.”  There were perhaps 37 1/3 members worldwide, no membership roll, and no dues – just a commitment to use pencils to write and think slowly.

“Why should we live with such hurry and waste of life?” Thoreau asked.  “We are determined to be starved before we are hungry.”

So I am writing these words with a pencil, an object, to paraphrase Walter Benjamin, which haunts our present electronic world by being a ruin of the past.  It is not a question of nostalgia, for we are not returning to our lost homes, despite a repressed urge for simpler times. But the pencil is an object that stands as a warning of the technological hubris that has pushed our home on earth to the brink of nuclear extinction and made mush of people’s minds in grasping the reasons why.

I think of John Berger, the great writer on art and life, as I write, erase, cross out, rewrite – roll the words over and look at them, consider them.  Berger who wrote: “Writing is an off-shoot of something deeper”; that “most mainstream political discourse today is composed of words that, separated from any creature of language, are inert….dead ‘word-mongering’ [that] wipes out memory and breeds a ruthless complacency.”

The pencil is not a fetish; it is a reminder to make haste slowly, to hear and feel my thinking on the paper, to honor the sacredness of what Berger calls the “confabulation” between words and their meaning.  I smell the pencil’s wood, the tree of life, its slow ascent, rooted in the earth, the earth our home, our beginning and our end.

Imagining our ends, while always hard, has become much harder in modern times in western industrialized nations, especially the United States, which rains death down on the rest of the world while pretending it is immortal and immune from the nuclear weapons it brandishes. Yet the need to do so has become more important. When in 1939 Ortega y Gasset warned in the epigraph of a most grim war coming to birth so strangely, as people acted “mechanically in a frenetic somnambulism,” he was writing before nuclear weapons, the ultimate technology. If today we cannot imagine our individual deaths, how can we imagine the death of the earth? In a 1944 newspaper column George Orwell made an astute observation: “I would say that the decay of the belief in personal immortality has been as important as the rise of machine civilization.” He connected this growing disbelief to the modern cult of power worship.  “I do not want the belief in life after death to return,” he added, “and in any case it is not likely to return.  What I do point out is that its disappearance has left a big hole, and that we ought to take notice of that fact.”

I think that one reason we have not taken notice of this fact of the presence of a huge absence (not to say whether this disbelief is “true”) is the internet of speed, celebrated and foreseen by the grandmaster of electronic wizardry and obscurantist celebrator of retribalized man, Marshall McLuhan, who called the electronic media our gods whom we must serve, and who argued that the extensions of human faculties through media would bring about abstract persons who would wear their brains outside their skulls and who would need an external conscience. Shall we say robots on fast forward?

Once the human body is reduced to a machine and human intercourse accepted as a “mediated reality” through so-called smart devices, we know – or should – that we are in big trouble.  John Ralston Saul, a keen observer of the way we live now, mimics George Carlin by saying, “If Marx were functioning today, he would have been hard put to avoid saying that imaginary sex is the opiate of the people.”

Saul is also one of the few thinkers to follow up on Orwell’s point.  “Inexplicable violence is almost always the sign of deep fears being released and there can be no deeper fear than mortality unchained.  With the disappearance of faith and the evaporation of all magic from the image, man’s fear of mortality has been freed to roam in a manner not seen for two millennia.”  Blind reason, amoral and in the service of expertise and power, has replaced a holistic approach to understanding that includes at its heart art, language, “spirit, appetite, faith and emotion, but also intuition, will and, most important, experience.”  People, he argues, run around today in an inner panic as if they are searching for a lost forgotten truth.

Zygmunt Bauman, the brilliant sociological thinker, is another observer who has noticed the big hole that is staring us in the face.  “The devaluation of immortality,” he writes, “cannot but augur a cultural upheaval, arguably the most decisive turning point in human cultural history.”  He too connects our refusal in the west to contemplate this fact to the constant busyness and perpetual rushed sense of emergency engendered by the electronic media with its streaming information.  To this end he quotes Nicole Aubert:

Permanent busyness, with one emergency following another, gives the security of a full life or a ‘successful career’, sole proofs of self-assertion in a world from which all references to the ‘beyond’ are absent, and where existence, with its finitude, is the only certainty…When they take action people think short-term – of things to be done immediately or in the very near future…All too often, action is only an escape from the self, a remedy from the anguish.

McLuhan’s abstract persons, who rush through the grey magic of electronic lives where flesh and blood don’t exist, not only drown in excessive data that they can’t understand, but drift through a world of ghostly images where “selves” with nothing at the core flit to and fro. Style, no substance.  Perspective, no person.  Life, having passed from humans to things and the images of things, reduced and reified.  Nothing is clear, the images come and go, fact and fiction blend, myth and history coalesce, time and space collapse in a collage of confusion, surfaces appear as depths, the person becomes a perspective, a perspective becomes a mirror, a mirror reflects an image, and the individual is left dazed and lost, wondering what world he is in and what personality he should don. In McLuhan’s electronic paradise that is ours, people don’t live or die, people just float through the ether and pass away, just as the victims of America’s non-stop wars of aggression simply evaporate into statistics that float down the stream, while the delusional believe the world will bloodlessly evaporate in a nuclear war that they can’t imagine coming and won’t see gone. Who in this flow can hear the words of Federico Garcia Lorca: “Beneath all the totals, a river of warm blood/A river that goes singing/past the bedrooms…”?

If you shower the public with the thousands of items that occur in the course of a day or a week, the average person, even if he tries hard, will simply retain thousands of items which mean nothing to him.  He would need a remarkable memory to tie some event to another that happened three weeks or three months ago….To obtain a rounded picture one would have to do research, but the average person has neither the desire nor the time for it.  As a result, he finds himself in a kind of kaleidoscope in which thousands of unconnected images follow each other rapidly….To the average man who tries to keep informed, a world emerges that is astonishingly incoherent, absurd, and irrational, which changes rapidly and constantly for reasons he cannot understand.

Jacques Ellul wrote that in 1965. Lucid summations are surely needed now.

Here’s one from Roberto Calasso from The Forty-Nine Steps: “The new society is an agnostic theocracy based on nihilism.”

Anyone who sits silently and does a modicum of research while honestly contemplating the current world situation will have no trouble in noticing that there is one country in the world – the U.S.A. – that has used nuclear weapons, is modernizing its vast obscene arsenal, and has announced that it will use it as a first-strike weapon. A quick glance at a map will reveal the positioning of U.S. NATO troops and weapons right up to Russia’s borders and the aggressive movement of U.S. forces close to China.  Hiroshima and Nagasaki make no difference.  The fate of the earth makes no difference. Nothing makes a difference. Obama started this aggressiveness, but will this change under Trump?  That’s very unlikely. We are talking about puppets for the potentates. It’s easy to note that the U.S. has 1,000,000 troops stationed in 175 countries because they advertise that during college basketball games, and of course you know of all the countries upon which the U.S. is raining down death and destruction in the name of peace and freedom.  That’s all you need to know.  Meditate on that and that hole that has opened up in western culture, and perhaps in your heart.

“If you are acquainted with the principle,” wrote Thoreau, “what do you care for myriad instances and applications?”  Simplify, simplify, simplify.

But you may prefer complexity, following the stream.

The snow is still falling, night has descended, and the roads are impassable.  The beautiful snow has stopped us in our tracks. Tomorrow we can resume our frantic movements, but for now we must simply stay put and wonder.

Eugene Ionesco, known for his absurdist plays, including Rhinoceros, puts it thus:

In all the cities of the world, it is the same.  The universal and modern man is the man in a rush (i.e. a rhinoceros), a man who has no time, who is a prisoner of necessity, who cannot understand that a thing might be without usefulness; nor does he understand that, at bottom, it is the useful that may be a useless and back-breaking burden.  If one does not understand the usefulness of the useless and the uselessness of the useful, one cannot understand art.  And a country where art is not understood is a country of slaves and robots.

Ionesco emphasized the literal insanity of everyday life, comparing people to rhinoceroses that think and act with a herd mentality because they are afraid of the solitude and slowness necessary for lucid thought. They rush at everything with their horns.  Behind this lies the fear of freedom, whose inner core is the fear of death.  Doing nothing means being nothing, so being busy means being someone.  And today being busy means being “plugged into the stream” of information meant to confound, which it does.

I return to the artist Pasternak, since the snowy night can’t keep me away. Or has he returned to me? I hear Yurii Zhivago’s uncle Nikolai speaking:

Only individuals seek the truth, and they shun those whose sole concern is not the truth.  How many things in the world deserve our loyalty?  Very few indeed.  I think one should be loyal to immortality, which is another word for life, a stronger word for it ….What you don’t understand is that it is possible to be an atheist, it is possible to not know whether God exists, or why, and yet believe that man does not live in a state of nature but in history….Now what is history?  It is the centuries of systematic explorations of the riddle of death, with a view to overcoming death. That’s why people discover mathematical infinity and electromagnetic waves, that’s why they write symphonies.  Now, you can’t advance in this direction without a certain faith.  You can’t make such discoveries without spiritual equipment.  And the basic elements of this equipment are in the Gospels.  What are they?  To begin with, love of one’s neighbor, which is the supreme form of vital energy.  Once it fills the heart of man it has to overflow and spend itself.  And then the two basic ideals of modern man – without them he is unthinkable – the idea of free personality and the idea of life as sacrifice.  Mind you, all of this is still extraordinarily new….Man does not die in a ditch like a dog – but at home in history, while the work toward the conquest of death is in full swing; he dies sharing in this work.  Ouf!  I got quite worked up, didn’t I?  But I might as well be talking to a blank wall.

I look outside and see the snow has stopped.  It is time to sleep.  Early tomorrow the plows will grind up the roads and the rush will ensue.  Usefulness will flow.

But for now the night is beautiful and slow. A work of art.

The Corporate Liberal in America


By Jason Hirthler

Source: CounterPunch

“I have almost reached the regrettable conclusion that the Negro’s greatest stumbling block in his stride toward freedom is not the White Citizen’s Counciler or the Ku Klux Klanner, but the white moderate, who is more devoted to ‘order’ than to justice.”

— Martin Luther King, Jr., Letter from a Birmingham Jail

Whether seated in Congress or exiting a voting booth, a corporate liberal is someone who supports anything progressive that does not challenge corporate power. In practice, this means corporate liberals will fight for progressive identity politics. If it has to do with race, sexual orientation, or gender, it generally doesn’t challenge corporate power. Major corporations support progressive positions on those issues, too. Corporate liberals march for gay rights and for the larger LGBTQ community. They support feminism. They support reproductive rights. They support African-American protests against police brutality—up to the point where they become threatening to the establishment. (Bill Clinton did initiate the prison industrial complex that unduly incarcerates huge numbers of minorities.)

This support is all to the good. Tremendous progress has been made by popular protest of the devastating prejudices that have for years denied individual rights. But when their elected Democrats undermine economic justice, promote imperial warfare, and refuse to seriously address climate change, corporate liberals just look the other way. As Joe Clifford noted in his piece on Bernie Sanders, being a corporate liberal also means rejecting “…a ban on fracking, a proposal to oppose TPP, the $15 per hour minimum wage proposal, a call for single-payer health care, and a statement of opposition to the illegal Israeli occupation.” These proposals, courageously put forward by James Zogby, Bill McKibben, Cornel West and the rest of the Sanders contingent at the recent Democratic Platform Committee meeting in Washington, were all struck down. In a beautiful expression of moral courage, West refused to back the platform in its final iteration, saying,

“[If] we can’t say a word about [Trans-Pacific Partnership], if we can’t talk about Medicare for all explicitly, if the greatest prophetic voice dealing with impending ecological catastrophe can hardly win a vote and if we can’t even acknowledge occupation as something that’s real in the lives of a slice of humanity … it just seems to me there’s no way in good conscience I can say take it to the next stage.”

Yada Yada Yada

Words like these have no effect on the corporate liberal. If there’s a centimeter’s difference between their Democratic platform and the diseased corpus of Republican anarchism, the corporate conscience is salved. A corporate liberal is the one who puts “occupation” in quotes. A corporate liberal never makes the perfect the enemy of the good. A corporate liberal believes in reform, in humanitarian warfare, in the responsibility to protect, and in The New York Times front page. A corporate liberal supports all of this, though reform may be glacial, though good wars may slay millions, though interventions may undermine sovereignty, and though The Times may be rife with half-truths. It makes no difference, so long as reform is better than rollback, Barack’s slaughter is numerically less than Dubya’s, and The Washington Post is marginally more truthful than FOX News. As long as you can trust Erin Burnett more than Bill O’Reilly, it makes no difference that we will move further and further to the right, picking up steam until we barrel straight into corporate fascism. So long as the corporate liberal sits to the left of the patrician publican, he has some claim on the progressive mandate. Or so he says. Yet the best way to repel fascism, and realize that progressive mandate, is by joining a movement headed left, rather than a party moving right.

Nothing Forbidden

As Alan Nasser elucidates, there is nothing intolerable in the lesser evilism of the corporate liberal. He will endure—or more likely, watch others endure—intolerable realities while maintaining the unblinking rectitude of the blind ideologue. Author Chris Hedges writes that capitalism is “plunging us into a state of neo-feudalism, perpetual war, ecological disaster, and a dystopian nightmare.” But this, too, is not intolerable. We must accept it in order to ensure that the real nightmare—whoever happens to be running on the Republican ticket—is barred from the White House forever.

We must tolerate whatever Democrats do because they are better than Republicans. Even if that means, as it surely has and surely will, for all the identity groups corporate liberals support, a deteriorating quality of life. Lower incomes, higher unemployment. Bigger debts, bullshit jobs. Higher infant mortality, higher heart disease. More inequality, less social support. Less social support, more incarceration. More suicide, more alcoholism, more drug abuse, more debt, more stress, more unhappiness. And, if one is aware enough, the consciousness of having voted—perhaps unwittingly at the time—for more slaughter of brown people abroad, and the deliberate aggression against nuclear powers that will raise the prospect of nuclear extermination for millions. The Democrats have no such mandate, but the corporate liberal gives them the power to pretend they do. These are the wages of neoliberalism and imperialism, enabled by the logic of the lesser evil.

Like Dr. King, Karl Marx understood the major threat was not the fanatic on the fringe, but the moderate in the middle. The real threat is not the extremist, who will burn out by necessity if not already burned down by the moderate herd. It is the moderate herd that threatens to permit the intolerable through gradualism. Incremental genocide. Slow-motion regime change. The soft coup. The generational heist of millions of working class jobs. The decade-long liquidation of working class home equity. The century-long evisceration of labor rights. The hidden decades of disinformation campaigns that conflate freedom with free markets. Marx said, “Our task is that of ruthless criticism, and much more against ostensible friends than against open enemies.” He understood what King did, which is part of the reason why they are two of the most revolutionary figures of the last couple of centuries of Western civilization.

Too Much to Lose

Corporate liberals rehearse Manichean pieties about good and evil locked in a dualistic embrace, fighting to the death. There are no third parties in this vision. It is a necessary dualism. Hence the occasional need to undermine democracy to save it, as Hillary’s campaign demonstrated through repeated voting irregularities and financial chicanery engineered through her DNC front. It’s just simpler that way. For a political party of millionaires backed by billionaires, it just doesn’t do to disturb the status quo, rock the boat, upset the apple cart, shake the foundations, incite protest, disturb our creature comforts, move us out of our comfort zone, spark rebellion, overthrow the system, or change the world.

Is lesser evilism an elaborate rationale for preserving the status quo? Lenin said you can’t make a revolution in white gloves, and there are plenty of corporate liberals paying lip service to progress while glad-handing its well-heeled antagonists. That is why, in the end, corporate liberals are anti-revolutionaries. They would rather save capitalism than endure a potentially messy transition to socialism. Leave the revolution to Universal Studios and stubble-cheeked Third World rebels in hand-knitted berets. Social reforms in capitalist countries seem to happen like they did in South Africa, where identity politics achieve astounding successes, and calls for economic justice are swallowed up in the celebratory din. This is because corporate power cares deeply about economic power, but couldn’t care less about your sexual identity. For corporations—even if the executive board morally supports it—the gay community is ultimately another target market, a rich source of disposable income to be mined.

The least oppressed in any electorate always seem to be the greatest obstacle to change. Always willing to put justice on layaway. Always arguing for incrementalism, which strikes me as a luxury of the leisure class. Social progress will have little impact on them anyway, but paying lip service to its values will burnish their reputation. The discomfiting appearance of Bernie Sanders disturbed the polished script rehearsed by the Hillary camp for years. It was her turn, the first female president, upholding the rights of the vulnerable and achieving hard-won incremental gains through patience, hard work, and political acumen. For a moment, the Hillary faithful looked harried, wrong-footed, and exposed to the will of the mob. But now that the dodgy primaries are done, and Bernie has scampered back to the warmth of the herd, we can return to the language of compromise and the lesser evil. Had Bernie broken with the party he refused to technically join for 40 years, joined Jill Stein on the Green ticket, garnered support from voices like Kshama Sawant and movements like Socialist Alternative and Black Lives Matter, he could have founded a serious alternative to the mercenary duopoly. But he fell for the ruse of internal reform. Not everyone does. King continued in his Birmingham letter to discuss the white moderate, saying he was the one,

“… who prefers a negative peace which is the absence of tension to a positive peace which is the presence of justice; who constantly says: “I agree with you in the goal you seek, but I cannot agree with your methods of direct action”; who paternalistically believes he can set the timetable for another man’s freedom; who lives by a mythical concept of time and who constantly advises the Negro to wait for a “more convenient season.” Shallow understanding from people of good will is more frustrating than absolute misunderstanding from people of ill will. Lukewarm acceptance is much more bewildering than outright rejection.”

Sound familiar? King’s white moderate and Marx’s ostensible friend is our corporate liberal. Same spin, different decade. The corporate liberal is an embodiment of the idea that political parties are the graveyards of movements. Hedges himself wrote a book called Death of the Liberal Class five years ago. It should’ve been the elegy before the interment of the Democratic Party as a serious option in electoral politics. Yet here we are, about to anoint another corporate liberal to the highest seat in the land. In that case, consider this article yet another epitaph awaiting its headstone. Let’s hope it’s not a long wait. Voices like Sawant’s and the momentum of movements like BLM give us reason to think it won’t be.

Jason Hirthler is a veteran of the communications industry and author of The Sins of Empire: Unmasking American Imperialism. He lives in New York City and can be reached at jasonhirthler@gmail.com.

The Weird Politics Of Aspartame: Conspiracy Theory Or Startling Truth?


By Paanii Powell Cleaver

Source: Inquisitr

Earlier this month, news wires and Twitter feeds were abuzz with info about the potential danger of certain artificial sweeteners.

In reality, recent reports about the potential perils of low-calorie sweeteners are not exactly breaking news. Almost 20 years ago, on December 29, 1996, Mike Wallace conducted an eye-opening segment about aspartame, also known as NutraSweet, on 60 Minutes.

The segment aired in response to a flurry of reports noting a dramatic increase in brain tumors and other serious health issues following the approval of aspartame for use in dry foods in 1981. Fifteen years after its somewhat dubious approval (the most controversial FDA decision to date), more than 7,000 consumer reports of adverse reactions to aspartame had been delivered to the FDA. As reported by 60 Minutes, the litany of consumer complaints included severe headaches, dizziness, respiratory issues, and seizures. The FDA countered with a statement that aspartame was the most tested product in FDA history.

Dr. Virginia V. Weldon, a pediatrician from Missouri and Vice President of Public Policy for the Monsanto Company from 1989 through 1998, told 60 Minutes that aspartame is “one of the safest food ingredients ever approved by the Food and Drug Administration.”

Dr. John W. Olney, a neuroscientist at Washington University School of Medicine, vehemently disagreed with Dr. Weldon’s assessment of the controversial super sweetener. Notable for his discovery of the brain-harming effects of an amino acid called glutamate, Dr. Olney was instrumental in getting MSG removed from baby food. At the time of the 60 Minutes segment, he had been studying the effects of aspartame and other compounds on brain health for more than two decades.

Olney told 60 Minutes’ Mike Wallace that since the approval of aspartame, there had been “a striking increase in the incidence of malignant brain tumors.” The doctor did not directly blame aspartame for the increase. He did, however, state that there was enough questionable evidence to merit reevaluating the chemical compound. He said that the FDA should reassess aspartame and that “this time around, they should do it right.”

Dr. Erik Millstone, Professor of Science Policy at the University of Sussex, told 60 Minutes that Searle’s testing procedures in the early 1970s were so flawed that there was no way to know for certain if aspartame was safe for human consumption. Millstone claimed that the company’s failure to dissect a test animal that died during an aspartame experiment was merely one example of “deficiencies” in Searle’s conduct. He also noted that when test mice presented with tumors, the tumors were “cut out and discarded and not reported.” In addition, Dr. Millstone told Mike Wallace that G.D. Searle and contractors hired by the drug company administered antibiotics to some test animals yet neglected to reveal this information in official reports.

In 1974, after the G.D. Searle company had already manufactured a significant quantity of aspartame, then-commissioner of the FDA Alexander Schmidt came very close to approving the chemical for use in human food. Relying solely on evidence provided by Searle, the FDA allowed a mere 30 days for the public to respond before putting its seal of approval on the now-controversial food additive. Dr. John Olney wasted no time in joining forces with James Turner, a public interest attorney who also worked with consumer advocate Ralph Nader. Just before the allotted time for public response ran out, Dr. Olney and his attorney petitioned the FDA with data that indicated dangerous similarities between aspartame and glutamate.

In his 1970 best seller, The Chemical Feast, Turner detailed numerous ways the FDA shirked its obligation to protect the American people. At the time of its publication, Time magazine described the tome as “the most devastating critique of a U.S. government agency ever issued.”

In response to Dr. Olney’s allegations that aspartame was potentially as brain-damaging as glutamate, the FDA called for a task force to investigate the matter. By late 1975, the FDA found that Searle’s own research into the safety of aspartame was so flawed that they stayed the approval process, citing “a pattern of conduct which compromises the scientific integrity of the studies.”

Former U.S. Senator Howard Metzenbaum told 60 Minutes that when Searle presented information to the FDA in 1974, the drug company “willfully misrepresented” and omitted facts that may have halted approval of what would soon become its best-selling product. Metzenbaum went on to say that the FDA was so disturbed by its findings, it forwarded a file to the U.S. Attorney’s Chicago office in 1975 in the interest of calling a grand jury to determine whether criminal indictments against Searle were warranted.

When did the grand jury convene? Never. According to 60 Minutes, U.S. Attorney Samuel Skinner requested a grand jury investigation in 1977 but recused himself from the case when he was offered a job at the Sidley & Austin law firm, which happened to be the Chicago firm that represented the G.D. Searle company. The investigation stalled until the statute of limitations ran its course, and no grand jury ever heard the case against Searle’s questionable research standards. Skinner, by the way, did eventually take the job at Sidley & Austin.

In 1977, a new FDA task force convened in Skokie, Illinois, with the sole purpose of investigating the research methods employed by G.D. Searle in its effort to gain FDA approval of aspartame. The task force examined raw data from 15 studies that Searle used to back up its uniformly positive claims about aspartame. According to journalist Andrew Cockburn, the task force noted numerous “falsifications and omissions” in Searle’s research reports.

In 1980, at the tail end of the Carter administration, the FDA convened two panels to investigate claims that aspartame caused brain tumors. Led by scientific and medical experts, each panel concluded that more tests were needed to prove the safety of the sweetener. Both panels concluded that aspartame should not be approved at that time.

So, how does the fellow in the cover picture figure into this equation?

If you do not recognize the face in the feature photo, here is a memory refresher: The man in the pic — and up to his ears in the aspartame controversy — is Donald Rumsfeld. Perhaps best known as Secretary of Defense during the George W. Bush presidency, Donald Rumsfeld also happened to be CEO of the G.D. Searle drug company when Ronald Reagan was sworn in as President of the United States in January 1981.

Regardless of its safety or potential peril, the fact remains that without the clout and political influence of Donald Rumsfeld, aspartame might never have been approved for human use at all.

Donald Rumsfeld, who at one time aspired to be Reagan’s running mate, was a member of the new president’s transition team. Part of the team’s duties involved the selection of a new FDA Commissioner. Rumsfeld and his colleagues chose a pharmacist with no experience in food additive science to lead the agency.

On January 21, 1981, Ronald Reagan’s first full day as president, the G.D. Searle company, headed by Donald Rumsfeld, reapplied for FDA approval of aspartame. That same day, in one of his first official acts, President Reagan issued an executive order that rescinded much of the FDA commissioner’s power.

In April 1981, newly appointed FDA commissioner Arthur Hull Hayes, Jr., put together a five-person panel tasked with reevaluating the agency’s 1975 decision not to approve aspartame as a food additive. At first, the panel voted 3-2 to uphold non-approval of the chemical sweetener. Hayes then added another member to the official FDA panel, and the vote was retaken. The panel deadlocked, and Hayes cast his own vote to break the tie. Two months later, the product the FDA had refused to approve for seven long years was suddenly approved for human consumption.

Four years later, in 1985, the Monsanto corporation bought G.D. Searle and established a separate division, The NutraSweet Company, to manage the sales and public relations of one of its best-selling and most profitable products. It may be worth noting that when Monsanto purchased Searle and the patent on aspartame, Donald Rumsfeld reportedly received a fat $12 million bonus.

Before reading this, how much did you know about the origins of aspartame? If you’re like most Americans, the answer is “not much.” And, if you’re like many Americans, your interest in a story such as this one will wane as soon as the next hot topic comes along. Perhaps this is the reason that there has been little if any public outcry regarding aspartame or the weird way that it received FDA approval.

Updates:

In 1987, UPI investigative journalist Gregory Gordon reported that Dr. Richard Wurtman, a neuroscientist at the Massachusetts Institute of Technology and a die-hard supporter of aspartame during its 1981 rush to approval, had reversed his position on the sweetener. Gordon quoted the once-ardent supporter as saying his views had evolved along with scientific studies and his increased skepticism of industry research standards.

In 1997, the U.K. government obliged makers of sweetened food to prominently include the words “with sweeteners” on product labels. Ten years later, U.K. supermarket chain Marks & Spencer announced the end of artificial sweeteners and coloring in its chilled goods and bakery departments, according to the Daily Mail.

In his well-received 2007 book, Rumsfeld: His Rise, Fall and Catastrophic Legacy, author Andrew Cockburn described the results of the 1977 FDA task force that found “falsifications and omissions” in Searle’s research data. The New York Times called Cockburn’s biography “quite persuasive.”

In 2009, Woolworths, a South African retailer, announced that it would no longer brand products containing aspartame.

On February 28, 2010, Dr. Arthur Hull Hayes, Jr., the FDA Commissioner who hurried aspartame to market and later squelched public fear of Tylenol during the 1982 poisoning scare, died in Connecticut. According to his New York Times obituary, Hayes was employed as president of E. M. Pharmaceuticals after his term at the FDA. He succumbed to leukemia at age 76.

A 2011 report in the Huffington Post noted that 10,000 American consumers notified the FDA about the ill effects of aspartame between 1981 and 1995. According to the article, the use of aspartame elicited more complaints than any other product in history, comprising 75 percent of complaints received by the U.S. Food and Drug Administration.

In 2013, the EFSA (the European Union’s equivalent of the FDA) reiterated its claim that aspartame is harmless. Professor Erik Millstone responded with his own reevaluation of aspartame, in which he noted that every study the EFSA used to approve aspartame was funded by the same industry that manufactures and profits from the controversial sweetener.

Dr. John W. Olney passed away at the age of 83 on April 14, 2015. In addition to his campaigns against aspartame and glutamate, the doctor devoted half a century to seeking a cure for multiple sclerosis, the crippling neurological disease that claimed his own sister when she was 16. According to the St. Louis Post-Dispatch, the pioneering brain researcher died of complications of ALS, a neurological disorder more commonly known as Lou Gehrig’s Disease.

Those interested in learning more about the approval of aspartame are invited to read the FDA Commissioner’s Final Report, published by the Department of Health and Human Services on July 24, 1981. A detailed version of the aspartame timeline is available at Rense.com.

Mother Earth May Have Good Reason to Slaughter Us


By Jack Balkwill

Source: Dissident Voice

Decades ago James Lovelock constructed a principle called the Gaia hypothesis, contending that a biosphere teeming with life works together with inorganic matter to self-regulate conditions for maintaining a livable planet.

The oxygen level of our air is maintained, as is the salinity of the seas – everything that’s needed to keep conditions within the zones which nurture life on the planet. Many deep environmentalists embrace this theory because it offers hope for the future of life forms on the planet.

When one creature (such as man) gets to be so out of control that it threatens the other life forms, Gaia, or Mother Earth, pushes back toward a healthy balance, according to some theorists (the Gaia principle has many variations).

In the ancient Greek religion, before Zeus was king of the gods in the classical period, or Zeus’ father Cronus was king of the gods, or Cronus’ father Uranus was king of the gods, there was Gaia, the earth mother, who created the heavens, the various gods, and man.  Gaia regulated the growing of crops, healed the sick, and was the earth itself to her followers.

Many of the most ancient religions around the world had a female as their chief deity, and my guess is that this is because they reasoned that, since it is the female who gives birth, a creator must be female.

The universe within us

Each of us humans is a microcosm of the Gaia principle.  Within us, we have about a hundred trillion unique creatures which do not share our DNA.  Cells containing our DNA only number about ten trillion, so they are vastly outnumbered.  The microbes within us are in many forms — bacteria, fungi, archaea and viruses.

When our microbes are out of balance, it can be life-threatening, so a major function of our immune system is to regulate them, to keep one species from over-reproducing, just as, in the Gaia theory, life forms are regulated within the massive biosphere.

If, for example, Candida reproduces to a high level, our immune system will try to destroy enough of it to get back to a balance.  Candida at normal levels may actually be beneficial, and is thought to attack some harmful invaders. At extreme levels of overgrowth Candida may become deadly to us.

Most of the life forms within us are friendly, and we would die without them.  They have a great many functions, working together to keep us alive.  In the end, if we die, they no longer have a home.

And most of the life forms outside of us are also beneficial, aiding Mother Nature in maintaining a delicate balance.

Symbioses

Oak trees have dropped their heavy acorns for millions of years, right beside their trunks. In such a place, the acorn has little chance of growing, with no sunlight under the canopy of the mother tree. But squirrels are happy to carry the acorns away from the tree and bury them in case they are needed for food during an extreme winter. Most years the squirrels don’t eat all of what they bury, giving the oak an opportunity to spread its genetic material.

In return the oak provides a home for the squirrel, which builds nests in oak trees and eats the acorns.  There are interactions between species all over the planet with which we are not yet familiar, but it is clear that species depend on one another for survival, just as the microbes within us are maintained in a balance that sustains life.

A flower may provide pollen to the bee, and in return the bee pollinates other flowers, benefiting both species.

But sometimes man gets in the way

Who would think a massive animal like a moose would rely on the lowly beaver for its well-being? When beaver hats were a popular fad, beavers were killed off in such large numbers that moose began to starve. One of the moose’s favorite foods is the shoots that grow in wetlands, and without beavers damming streams to create wetlands, moose went hungry and started feeding on tree bark, killing trees.

Of course, no one intended for a human fad to start killing off moose. But we’ve done such things throughout our history, and we have more control over nature than we realize.

When sperm whales were slaughtered to near extinction, giant squid began to rise up to the surface in the oceans, no longer having to fear their primary enemy, the sperm whales that fed upon them.  Giant squid previously stayed in deep parts of the ocean to avoid sperm whales.  We have no idea what happens in the long term when a creature like the giant squid, with a ravenous appetite, begins feeding in a part of the biosphere from which it was banned for millions of years, but certainly it must upset the food chain.

It is thought that some animals, such as mammoths, became extinct at the hand of man.  Such creatures disappeared in North America about the time it was populated by humans.

Whether directly or indirectly, we are responsible for the extinction of a great many species.

Intelligence, whatever that is

Many people seem to think that humans are somehow superior creatures. We have a formula for estimating intelligence which holds that a species is intelligent when its brain is large enough to take care of all the functions of its body, with something left over. That something left over is intelligence. So it’s largely brain size in proportion to body size that suggests degrees of intelligence.
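The brain-size-in-proportion-to-body-size idea described above is usually formalized as the encephalization quotient (EQ), after Harry Jerison: actual brain mass divided by the brain mass “expected” for an animal of that body size, with expected mass scaling roughly as body mass to the two-thirds power. A minimal sketch, using commonly cited approximate masses (the specific figures below are illustrative assumptions, not numbers from this essay):

```python
# Encephalization quotient (EQ), after Jerison: the ratio of actual
# brain mass to the brain mass "expected" for an animal of that body
# size. Expected brain mass scales roughly with body_mass^(2/3).
def encephalization_quotient(brain_g: float, body_g: float) -> float:
    expected_brain_g = 0.12 * body_g ** (2 / 3)  # Jerison's constant for mammals
    return brain_g / expected_brain_g

# Rough, commonly cited masses in grams (illustrative assumptions):
animals = {
    "human":              (1350, 65_000),     # ~1.35 kg brain, ~65 kg body
    "bottlenose dolphin": (1600, 160_000),    # brain larger than a human's
    "elephant":           (4800, 5_000_000),  # huge brain, but a huge body too
}

for name, (brain, body) in animals.items():
    print(f"{name:18s} EQ = {encephalization_quotient(brain, body):.1f}")
```

On these numbers the human EQ comes out around 7, the dolphin’s around 4 to 5, and the elephant’s between 1 and 2 – consistent with the point below that the elephant’s large brain is mostly “spent” running its large body.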

There is an old belief that elephants have a pea brain, but it is not true.  An elephant has a large brain, but needs most of it for maintaining its massive bodily functions, so what’s left over may not be great intelligence, but the elephant is certainly an intelligent animal.

The cetaceans, the large toothed whales, all have brains larger than human brains.  Some scientists have speculated that they may be more intelligent than humans.

When people say, “But cetaceans haven’t invented nuclear weapons,” they are showing, perhaps, a flaw in the human being, not a comparative virtue.

Those who support the theory that cetaceans are more intelligent theorize that they may understand that being more in harmony with nature is the intelligent thing to do for long term survival, rather than making automobiles which pollute the planet and the many other destructive things humans do.

At any rate the other creatures appear to help maintain the balance of life within the biosphere, interrelating in complex ways, while humans have reproduced out of control, crowding out other life forms, taking more than our share of resources, and polluting the planet.

So another way to look at the Gaia theory is to describe it as a kind of immune system for the biosphere.  When it has an organism that is overpopulating and causing other organisms to die, that organism must be regulated, just as for a Candida overgrowth or cancer within a human.

The traditional way that Mother Nature has regulated the human population is with disease.  It worked well up to the twentieth century, when humans began to poison their drinking water with chlorine or other agents to kill off water-borne diseases, which had previously wiped out the populations of entire cities.

Will humans be brought under control by Mother Nature?

In the 1970s there was a movement to reduce the human population, and it was quite popular with many. I donated to that cause and was surprised to see it vanish. I suspect it was killed by the capitalists, who have a vision that the population must continue to grow so there will be more consumers and, hence, more profit. Capitalists insist that “growth” continue without considering the finite limits imposed by the size of the planet.

So how will Gaia maintain the delicate balance with the human organism out of control?  She might introduce a new disease for which we have no antidote.  It was the first thing I thought of when the AIDS epidemic began decades ago.  A perfect killer, to destroy the immune function, allowing almost anything to then kill the host.  But mankind seems to now have that disease under control.

Or Mother Nature might allow us to commit suicide by climate change from our nasty habit of spewing carbon emissions, and other anti-environmental things we are doing in destroying our little blue planet. We are releasing massive toxins into the environment in the form of dioxins from paper and plastic making, radiation from nuclear power plants and bomb making, insecticides, herbicides, and other dangerous chemicals.

A recent report by The World Economic Forum and Ellen MacArthur Foundation stated that at the current rate, the weight of plastic in the oceans will exceed the weight of the fish.  When I heard this a few weeks ago I posted on Facebook, “The epitaph for human beings will read ‘they thought they were an intelligent species.’”

As an old man I take heart that young people seem to be far more aware of the degradation of the planet’s environment, giving me hope that they will find a solution and assist Mother Gaia in her quest for purification and renewal.

The alternative is to leave her no choice but to see us as a cancer that must be eliminated for the good of the whole.

 

Jack Balkwill is an activist in Virginia. He can be reached at libertyuv@hotmail.com Read other articles by Jack.

Tap runs brown in Louisiana’s impoverished northeast


With infrastructure funds going unspent, residents lack safe drinking water as iron sediment flows from aging pipes

By

Source: Al Jazeera

ST. JOSEPH, Louisiana – “I’m scared to take a bath,” said Ethel Strum, who lives in St. Joseph, a community of barely 1,000 people in northeast Louisiana.

She turned on the tap in the bathroom sink of her tidy one-story home and the water flowed clear for a second before fading to a milky brown. In her kitchen, a few cases of bottled water, which she uses for everything from brushing her teeth to cooking, are stacked on the table.

“I drive to Newellton to shower. It’s a 20-minute drive but I can do it in 12,” she laughed.

But talking about the town’s water issues makes her visibly upset. Among the many problems are frequent outages, water thick with iron sediment from the aging pipes, and poorly communicated boil-water advisories. “Water should be free until it’s fixed,” she said.

Strum can’t even wash her car with town water because it leaves a rust coating. Despite minimal use, she says she has received high bills and has to buy 20 cases of bottled water every month.

While state officials and the EPA have deemed the water safe to drink, virtually no one risks it. Most here do not even use tap water to cook or brush their teeth, and many, especially children, bathe with bottled water. Many residents spend several hundred dollars a month on store-bought water.

To add to the mounting frustration, $6 million of state funds allocated to St. Joseph for water line repairs in 2013 are still being withheld because the town’s mayor, Edward Brown, has failed several times to turn in a mandatory financial audit on time. New Governor John Bel Edwards said this week his office was working with the town of St. Joseph and the Department of Health and Hospitals (DHH) to fast track the allocation of at least some of that money to start system repair work. Mayor Brown said he expects to file the overdue audit by the end of February.

“What we’ve experienced here is policy failures that have allowed these communities to fall through the cracks,” said St. Joseph resident Garrett Boyte, drawing a comparison to the disaster in Flint, Michigan.

Boyte, along with colleagues in the Servant Leadership Corps of the Episcopal Diocese of Western Louisiana, have sought to attract public attention on the issue through a recent social media push and an online petition to the Obama administration for federal assistance to St. Joseph. The effort has just over one-tenth of the 100,000 signatures it needs by Feb. 19 to reach its goal. The group notes, however, that with no evidence of lead contamination, the situation in St. Joseph is considerably less dire than in Flint.

Nevertheless, the town’s water woes illustrate a more slow-moving and commonplace catastrophe: failing infrastructure in small, impoverished communities that cannot afford to replace their systems, leaving residents with limited resources to cope on their own.

Established in 1834, St. Joseph lies squarely in the so-called Black Belt, a term coined by Booker T. Washington at the turn of the century to refer to the swath of dark, fertile soil that spans much of the American south. It has evolved to describe the largely black, poor communities that have been in decline since the mechanization of farming drove out small farmers, with no industry to replace it.

The town is the seat of Tensas Parish — one of the state’s poorest, with 34 percent of residents living in poverty and a median household income of around $27,000 (compared to $45,000 statewide). It is the least populous parish in the state, and over half of its population is African-American. Unemployment in St. Joseph is likely higher than the official parish average of 9 percent, and the town’s poverty is written in its shoddy roads and houses in dire need of repair.

St. Joseph’s decaying water distribution system, installed in the 1930s, is the main cause of the town’s water problems. “Over time, these old cast iron pipes that convey the water, they deteriorate and start to crack and leak,” said Davis Cole, a Baton Rouge-based civil engineer working on the redesign of St. Joseph’s water system. Leaks cost the town money; according to Mayor Brown, the system loses 50 percent of its water. And with resources already stretched thin, long-term repairs are out of reach. “This is typical of communities probably all over the U.S., especially poor communities,” Cole said.

The water’s rusty tint comes from naturally occurring iron and manganese sediment in the underground well that has built up in the water lines over the years. Every time the system has to be shut down for repairs, and then restarted, sediment is injected into the water flow. The problem started to become obvious over a decade ago, according to residents, but has gotten progressively worse. The water main reportedly broke four times last month alone.

While the state does monthly bacterial tests, the last detailed analysis of St. Joseph’s water was in 2013. It showed 32 times the EPA-recommended level of iron and 9 times that of manganese, according to an analysis by the local Sierra Club. But the EPA considers these contaminants to have merely “aesthetic” effects on the water. They are not regulated by the EPA or the state.
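To put those multiples in concrete terms: the EPA’s secondary (aesthetic) standards are 0.3 mg/L for iron and 0.05 mg/L for manganese. The article reports only the multiples, not raw concentrations, so the back-of-the-envelope arithmetic below treats those standards as the baseline:

```python
# EPA secondary maximum contaminant levels (aesthetic standards), mg/L.
# Used here as the assumed baseline; the article gives only multiples.
EPA_SECONDARY = {"iron": 0.3, "manganese": 0.05}

# Multiples of the recommended level reported in the 2013 analysis.
reported_multiples = {"iron": 32, "manganese": 9}

for metal, multiple in reported_multiples.items():
    implied_mg_per_l = multiple * EPA_SECONDARY[metal]
    print(f"{metal}: {multiple}x the {EPA_SECONDARY[metal]} mg/L standard "
          f"= {implied_mg_per_l:.2f} mg/L")
```

That would put iron near 9.6 mg/L and manganese near 0.45 mg/L – levels high enough to discolor water, though, as noted above, neither is federally regulated as a health hazard.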

The state’s official approval of the water quality is of little comfort to most residents here. “We don’t know what’s in that water. They say it’s rust but there are so many ‘what if’s’,” said Marie, a grandmother and lifelong resident of St. Joseph who declined to give her real name because of the sensitivities around local politics.

“Who’s more important: the people or the paperwork?” asked Marie, implying that the audit issue was to blame. “Get the water straight and then work on that. It’s not the community’s fault, don’t make it hard on us.”

Iron contamination and aging infrastructure are not uncommon problems in the region. Dr. Jimmy Guidry, state health officer at the Department of Health and Hospitals, confirms that water discoloration caused by iron is a common complaint across the state. “There are probably several hundred water systems that deal with this,” Guidry said, out of approximately 1,360 local water systems in Louisiana.

“As a physician, I’m not going to tell you a lot of iron in your system is not going to affect your health,” Guidry said, when asked about the long-term effects of high iron exposure. “But that’s not something we regulate.” In light of renewed attention on the issue, he said the DHH would be looking again at an iron and manganese water rule that was legislated in 2014.

Of more pressing concern to Guidry was replacing St. Joseph’s decrepit water pipes, which pose a risk of bacterial contamination every time they break. Many Louisianans have been on high alert for water contamination after a brain-eating amoeba was found in four separate water systems last year.

Guidry also noted that frequent leaks in the St. Joseph water main present another threat — they have started to erode the nearby levee. The Mississippi River has swelled to record heights in some areas this winter following heavy rainfall.

“I understand their urgency to [fix the water color]. That will take time,” said Dr. Guidry. “The urgency to get their system back to where they are not at risk of contamination is most important to me.” He said that if they receive approval for two grants totaling around $600,000, the most-needed repairs on the town’s storage tank and water pipes could begin in a matter of weeks.

However, he noted that even with a full revamp of the system, which is planned once an additional $2 million in funding is secured, the brown water of St. Joseph might not disappear. “It may not get rid of it completely, because it doesn’t address the treatment part,” he said. While there are expensive chemicals and filters that can get rid of iron discoloration, “for a poor community, it’s not an easy option to address the iron,” he added.

“If you were born after 2000 in this parish, you were always taught not to drink that water,” said Rosalie Bouobda, who moved to St. Joseph two years ago as a consultant for the Servant Leadership Corps of the Episcopal Diocese of Western Louisiana. That group has been meeting with state and local officials about the water issue for the past two years. “Children are taught not to deal with [the water]. It’s a reflex to them…They’ve never known clean water.”

Bouobda, along with Garrett Boyte, who started posting pictures of the muddy water on Twitter, has been instrumental in garnering attention beyond social media. But many lifelong residents of St. Joseph have been reluctant to speak out, either because of a fatalistic sense that nothing will change, or out of deference to local politics in a town where so many people are related to each other.

“They just accept it as a fact of life, their water is dirty and there is nothing they can do about it,” Boyte says of people’s perceptions.

For St. Joseph, dirty water may be as much a fact of life as high unemployment and failing schools.

“It’s like nobody cares. That’s how people in this town feel,” Michael Thomas, Jr., a 25-year-old father of two, said. His apartment in a small subsidized housing complex stands across from the overgrown ruins of the long-abandoned Tensas Rosenwald High School, one of many schools built in the early 20th century with funds from Julius Rosenwald, the president of Sears, Roebuck and Company, to educate African-Americans in the South.

“We gotta boil the water just to wash the babies,” he said. “If I could afford it, I’d move.” He said he spends as much as $300 every month on bottled water.

Sitting nearby, Kristi McWilliams, 23, and John Jackson, 24, echoed Thomas’ frustration. “I think about moving all the time, but we don’t have the jobs or the money,” said McWilliams.

“There is more they could be doing,” said Jackson of state officials. “They could drop the water bill. Water should be cheaper in the stores.”

Some residents, like Ann South, an elderly woman who recently suffered a stroke and a heart attack, were frustrated by the disparities with other communities. “Around the lake there is no water problem,” she said, referring to the more affluent, largely white area around Lake Bruin nearby. “Who in the world do we have to talk to about helping my people in St. Joseph?”

Others, like Ethel Strum, were more sharply critical of Mayor Brown’s role in the water crisis. “The Mayor – he’s doing nothing!” she said. Valerie and Chip Sloan, a white couple who own a large house by the levee, claim that Mayor Brown has kicked them out of multiple town hall meetings for asking questions about the water, and that he has ignored their Freedom of Information Act requests for data on town finances.

“There are residents who have spoken out and they are retaliated against, either by the community at large or someone from the city,” added Boyte.

For his part, Brown, the town’s first African-American mayor, says his hands are largely tied on the water issue. “The state of Louisiana is testing this water and is saying that it is safe. And for me to overrule, [the Department of] Health and Hospitals or EPA … I don’t have that expertise, and no one in this town does,” he told Al Jazeera.

He says that with a budget of around $1.5 million – including a $500,000 general fund and around $1 million in water and gas sales – he is left with around $50,000 a year. That would leave little cash for infrastructure upgrades.

As for the lapsed financial audits that have delayed water system funding, Brown says that after an audit firm pulled out mid-audit in 2013, the town was forced to file late. With various lawsuits looming over the town since then, he says he has struggled to find an audit firm that will take on his case.

Brown also points to tight oversight from legislative auditors closely monitoring the town, in light of its failure to comply with financial audit rules. He says state law requires him to cut off utility connections after two months of missed payments and that, in the past, he has been issued a citation by auditors for granting extensions to individuals. Now, he turns off about 10-15 utility connections per month, including houses where he knows there are children and elderly people.

“You tell me what the state cares about the people in this town,” said Brown. “Did they care about people in Flint? They care about them now.”

Now Streaming: The Plague Years


By A. S. Hamrah

Source: The Baffler

When things are very American, they are as American as apple pie. Except violence. H. Rap Brown said violence “is as American as cherry pie,” not apple pie. Brown’s maxim makes us see violence as red and gelatinous, spooned from a can.

But for Brown, in 1967, American violence was white. Explicitly casting himself as an outsider, Brown said in his cherry pie speech that “violence is a part of America’s culture” and that Americans taught violence to black people. He explained that violence is a necessary form of self-protection in a society where white people set fire to Bowery bums for fun, and where they shoot strangers from the towers of college campuses for no reason—this was less than a year after Charles Whitman had killed fourteen people that way at the University of Texas in Austin, the first mass shooting of its kind in U.S. history. Brown compared these deadly acts of violence to the war in Vietnam; President Lyndon B. Johnson, too, was burning people alive. He said the president’s wife was more his enemy than the people of Vietnam were, and that he’d rather kill her than them.

Brown, who was then a leader of the Student Nonviolent Coordinating Committee and who would soon become the Black Panther Party’s minister of justice, delivered a version of this speech, or rant, to about four hundred people in Cambridge, Maryland. When it was over, the police went looking for him and arrested him for inciting a riot. Brown’s story afterward is eventful and complicated, but this is an essay about zombie movies. Suffice it to say, Brown knows about violence. Fifty years after that speech, having changed his name to Jamil Abdullah al-Amin, he’s spending life in prison for killing a cop.

The same day Brown was giving his speech in Maryland, George A. Romero, a director of industrial films, was north of Pittsburgh in a small Pennsylvania town called Evans City. Romero was shooting his first feature film, a low-budget horror movie in black and white called Night of the Living Dead. Released in October 1968, the first modern zombie movie tells the story of a black man trying to defend himself and others from a sudden plague of lumbering corpses who feed on the living. At the film’s end, he is unceremoniously shot and killed by cops who assume he is a zombie trying to kill them. The cops quickly dispose of his body, dumping it in a fire with a heap of the undead, as a posse moves on to hunt more zombies.

Regional gore films were nothing new in themselves; a number had appeared earlier in the 1960s. Night of the Living Dead, with its shambling, open-mouthed gut-munchers dressed in business suits and housecoats, might have seemed merely gross or oddly funny in a context other than the America of 1968. But Martin Luther King Jr. had been assassinated six months before its release. The news on TV, which most people still saw in black and white, consisted largely of urban riots and war reports from Vietnam. The My Lai Massacre had occurred the month before King was shot.

Romero’s film, seen in the United States the year it came out, had more in common with Rome Open City than it did with a drive-in horror movie made for teens—it was close to a work of neorealism. And it was unfunny and dire, much like John Cassavetes’s Faces, released the same year, whose laughing drunks stopped laughing when they paused to look in the mirror. Romero was a revisionist director of horror in the same way that Peckinpah and Altman were in their career-making genres, the western and the war movie.

Romero cast an African American in the lead, and he shifted the horror genre’s dynamic, aligning it with black-and-white antiwar documentaries like Emile de Antonio’s In the Year of the Pig, also released in 1968, and distinguishing it from the lurid color horror films Roger Corman and Hammer Films had been turning out up till then. Those films made certain concessions to the film industry; Night of the Living Dead did not. This was an American horror movie, so it needed no English accents or familiar character actors. It was grim and unflinching, showing average citizens, played by average people, eating the arms and intestines of their fellow townsfolk. Romero drove home this central point—that a zombie-infested America differed from the status quo only in degree, not in kind—by ending his film with realistic-looking fake news photos depicting his characters’ banal atrocities.

Mainstream film reviewers, including Roger Ebert, were shocked and disgusted by Night of the Living Dead. They discouraged people from seeing it, but Romero’s images proved to be indelible. The film’s reputation grew. In 1978 Romero made the film’s first sequel, Dawn of the Dead, this time in color. Today, if there’s one thing every American knows, it’s that zombies can only be killed with a shot to the head. This is common knowledge, cultural literacy, a kind of historical fact, like George Washington chopping down the cherry tree. American-flag bumper stickers assert that “these colors don’t run,” but one of them does. It runs like crazy through American life, through American movies, and now TV, like a faucet left on.

Dead Reckonings

The Huffington Post has had a Zombie Apocalypse header since 2011, under which the editors file newsy blog posts chronicling our continuing fascination with zombie pop culture, alongside any nonfiction news story horrible enough to relate to zombies or cannibalism. The infamous Miami face-eater attack of May 2012, which the media gleefully heralded as the start of a “real” zombie apocalypse, contributed to America’s sense that it could happen here, provided we wished for it hard enough. Reading through the Zombie Apocalypse posts, one gets a growing sense that we want the big, self-devouring reckoning to happen because it is the one disaster we are truly mentally prepared for. It won’t be the total letdown of the Ebola scare.

The face-eating incident was initially linked to bath salts: ground-up mineral crystals everyone hoped would become the new homemade drug of choice for America’s scariest users. It turned out the perpetrator, although naked, was only high on marijuana. He was black, killed by the police as he gouged out his homeless victim’s eyes and chewed his face on a causeway over Biscayne Bay. The incident was captured on surveillance video. Here in the golden age of user-generated content, the zombie movies self-generate—much like zombies themselves. The bridge backdrop of this all-too-real zombie vignette neatly summed up both the crumbling condition of America’s infrastructure and our more generalized state of neoliberal collapse.

The zombie apocalypse, our favorite apocalypse, seems to unite the right and left. It combines the apocalypse brought about by climate change and the subsequent competition for scant resources with the one loosed by secret government experiments gone awry. Better still, both of these scenarios, as we’re typically shown in graphic detail, will necessitate increased gun-toting and firearms expertise.

More than that, the fast-approaching zombie parousia allows us to indulge our fantasies of a third apocalypse, one that only the most clueless don’t embrace: the consumerist Day of Judgment, in which we will all be punished for being fat and lazy and living by remote control, going through our daily routines questioning nothing as the world falls apart and we continue shopping. Supermarkets and shopping carts, malls and food warehouses all figure prominently in the iconography of the post–Night of the Living Dead zombie movie, reminding us that even in our quotidian consumerist daze, we are one step away from looting and cannibalism, the last two items on everyone’s bucket list.

Still, despite its galvanizing power to place all of humanity on the same side of the cosmic battlefront, the zombie apocalypse, like all ideological constructs, nonetheless manages to cleave the world into two camps. One camp gets it and the other doesn’t. One is aware the apocalypse is under way, and the other is blithely oblivious to the world around it.

To confuse matters further, people move in and out of both camps, becoming inert, zombified creatures when obliviousness suits their mood. People blocking our progress on the street as they natter into their hands-free earsets stare straight ahead, refusing to admit that other people exist. At least they don’t bite us as we flatten ourselves against walls to pass them without contact. A paradox of the ubiquity of zombie-themed pop culture is how there are surely next to no people left who have not enjoyed a zombie movie, TV show, book, or videogame, yet there are more and more people shuffling around like extras in a zombie film, moving their mouths and making gnawing sounds.

The smartphone-based zombification of street life is a strange testament to Romero’s original insight, which becomes more pronounced as the wealth gap widens. The disenfranchised look ever more zombie-fied to the rich, who in turn all look the same and act the same as they take over whole neighborhoods and wall themselves up in condo towers. This, indeed, is exactly what happens in Romero’s fourth zombie movie, 2005’s Land of the Dead, which predicted things as consequential as what happened during Hurricane Katrina in New Orleans and as minor as the rise of food trucks.

The Zombie Apocalypse is also a parable of the Protestant work ethic, come to reap vengeance at the end of days. It assures us that only very resourceful, tough-minded people will be able to hack it when the dead come back to life. If the rest had really wanted to survive—if they deserved to survive—they would have spent a little less time on the sofa. But here, too, the simple and obvious moral takes a perverse turn: the best anti-zombie combatants should be the ones who’ve watched the most zombie movies, yet by the very logic of our consumer-baiting zombie fables, they won’t be physically capable of survival because all they did was watch TV.

Selective Service

What these couch potatoes will need, inarguably, is the protection of a strong leader, one who hasn’t spent his life in the vain and sodden leisure pursuits that they’ve inertly embraced—Rick Grimes in The Walking Dead, for instance. Why such a person would want to help them is a question they don’t ask. With this search for an ultimate hero, the zombie genre has veered into the escapism of savior lust, leaving Romero’s unflinching, subversive neorealism behind. In Night of the Living Dead, a witless humanity is condemned by its own herd mentality and racism. In latter-day zombie fictions, a quasi-fascist social order is required, uniting us regardless of race, creed, or color.

The predicament of the characters (and the actors) in all the nouveau zombie movies relates to this passive consumerism. Both the characters and the actors in new zombie movies have to act like zombie films don’t already exist, even though the existence of Romero’s films is what permits the existence of the film they are in. Somehow, the characters pull their savvy out of thin air. They must pretend that they have never heard of zombies, even as they immediately and naturally know what to do once their own particular Zombie Apocalypse gets under way.

This paradox underscores the fantasy aspect of the twenty-first-century zombie infatuation, in which a fixed set of roles is available for cosplay in a repeatable drama that already took place somewhere else. The difference between Romero’s films and the new zombie movies is that the more time that passes since 1968, the more Romero’s films don’t seem like they were designed as entertainment—even as they are endlessly exploited by the zombie-themed cultural productions that copy them, and even as they remain entertaining. The new zombie films cannibalize Romero’s films in an attempt to remake them ideologically, so that we will stop looking for meaning in them and just accept the inevitable.

The Primal Hordes

A primal fantasy of the Zombie Apocalypse is that when the shit hits the fan, we will be able to kill our own children or parents. We won’t have a choice. The decision to get rid of the generation impeding us will have been made for us by the zombie plague, absolving us of responsibility. We are, after all, killing somebody who is already dead and who, in his or her current state, is a threat to our continued existence.

Against the generalized dystopian entertainment landscape that followed the economic collapse of 2008, the Zombie Apocalypse made more sense than ever. But YA action-drama dropped it in favor of promoting teen heroes who were stronger than their nice-but-loserish sad sack parents. This is the uplifting generational affirmation that imbues Suzanne Collins’s Hunger Games franchise and Veronica Roth’s Divergent trilogy.

YA comedy, on the other hand, did not ignore zombie movies. Instead, it domesticated the Zombie Apocalypse, making it friendly. Nonthreatening zom-coms showed young viewers how the opposite sex was really not that scary, that being in a couple was still the most important thing, and that dystopias gave nerds an unprecedented chance to prove they could get the girl or boy. Dystopia, it turns out, is really a best-of-all-possible-worlds scenario for starry-eyed-kids-with-a-disease, or so we learn from zom-coms like Warm Bodies and Life After Beth.

The latest iteration of this trend, which sets a zombie heroine in a marginally less dystopian world that mirrors our tentative economic comeback, is the CW TV show iZombie. The series is a brain-eating entertainment for tweens in which they learn you can be okay and have a chill job even if you’re a living corpse who’s just trying to figure things out. When a zombie gets her own tween-empowerment show on The CW, it’s a good indication that zombies don’t carry the stern, unfamiliar stigmas they used to. Zombies, much like corpses in TV commercials, are used as grotesque comic relief in things like animated Adult Swim shows. Such is the diminished status of the zombie; it is now a signifier that can be plugged in anywhere. To paraphrase the undead philosopher of capitalism’s own walking-dead demise: first time cannibalism, second time farce.

Reality Bites

The way zombie movies progress, with isolated groups splitting into factions and various elimination rounds as contestants disappear, suggests that Night of the Living Dead is also a secret source of reality TV. It makes sense, then, that 2009’s Zombieland, one of the first YA dystopian zombie entertainments, was penned by screenwriters who created The Joe Schmo Show and I’m a Celebrity . . . Get Me Out of Here!

Zombieland’s protagonist, a college-age dude played by Jesse Eisenberg, is a bundle of phobias, an OCD-style follower of rules who finds himself in a Zombie Apocalypse after an unexpected date with a hot girl out of his league (Amber Heard) goes wrong. Mentored by Woody Harrelson, who more or less reprised this same role in the Hunger Games movies, Eisenberg’s millennial character undergoes a reality-TV-scripted makeover. In expiation for his pusillanimity in the opening reel, he winds up rescuing a tough girl (Emma Stone) who also would have been out of his league in the pre-Apocalypse scheme of dating. Zombieland presents Eisenberg as gutless and Stone as ruthless, but she’s the one who ends up a hostage, and he becomes her hero. In fact, one of his rules, “Don’t be a hero,” changes on screen to “Be a hero,” as we once again learn that millennials really do have what it takes to kill zombies. Earlier in the film, Eisenberg accidentally shoots and kills a non-zombie Bill Murray, playing himself, showing that millennials can also, regretfully, take out Baby Boomers, including the cool ones who aren’t undead.

Edgar Wright’s 2004 Shaun of the Dead, the first movie zom-com, was a more intelligent version of this same storyline. An English comedy from the “Isn’t it cute how much we suck?” school, Wright’s film acquiesced to the coupling-up plot rom-coms require, but not without first presenting the routine, pointless daily life of its protagonist (Simon Pegg) as pre-zombified. Shaun of the Dead will likely remain the only sweet little comedy in which the protagonist kills his mother, a scene the film has the guts to play without flinching. The joke of Wright’s film is that it takes something as brutal as a zombie apocalypse to wake us from our stupor and to show us how good we had it all along. By the film’s end, Pegg and his girlfriend (Kate Ashfield) are in exactly the same place they were when the film started, but now at least they live together. A cover of the Buzzcocks’ song “Everybody’s Happy Nowadays” jangles over the credits, providing a zombified dose of circa-1979 irony.

Wright and Pegg’s goofy rethinking of the zombie movie proved how firmly zombies are entrenched in our consciousness, and how easy they are to manipulate for comedic effect. The same month Shaun of the Dead came out, a Hollywood remake of Romero’s Dawn of the Dead was released. It, too, cleaned up at the box office. This new Dawn of the Dead seemed like it was made by one of the nerds in the American zom-coms, a jerk desperate to prove he’s bad-ass. (The director now makes superhero movies.) Johnny Cash’s “The Man Comes Around” accompanies the opening credits, setting a high bar for artistic achievement the ensuing film does not come near to clearing. Jim Carroll’s “People Who Died” plays at the end—its placement there as repulsive as anything else in the film.

As all nouveau zombie films must, the remake starts in the suburbs, where a couple is watching American Idol in bed, underscoring the genre’s newfound connection to reality TV. The film’s CGI effects, which at the time injected a souped-up faux energy into the onscreen mayhem, dated instantly. They’re now the kind of off-the-rack effects featured in Weird Al videos when someone gets hit by a car.

The main point of this new Dawn of the Dead is that after the Zombie Apocalypse, people will spend their time barking orders at each other and calling each other “asshole.” The film nods in the direction of loving the military and the police, and totally sanitizes Romero’s use of a shopping mall as a site of consumerist critique. Like many films of the 2000s, it postulates that living in a mall wouldn’t be a Hobbesian dystopia at all; it would be rad. If the remake had been made five years later, maybe it would have had to grapple with the “dead malls” that began to adorn the American landscape with greater frequency after the economy collapsed. Instead, the mall serving as the film’s principal backdrop is spotless and fun. The remake’s island-set, sequel-ready false happy ending makes one long for the denouement of Michael Haneke’s Funny Games—a longing more unimaginable than any real-life wish-fulfillment fantasy about the Zombie Apocalypse actually coming to pass.

The American Way of Death

Fanboys liked the Dawn of the Dead remake and, inexplicably, so did many critics. Manohla Dargis, then at the Los Angeles Times, wrote that the film was “the best proof in ages that cannibalizing old material sometimes works fiendishly well,” a punny sentiment she might well walk back today.

The next year, when George A. Romero released his first new zombie film in twenty years, it did not fare as well in the suddenly crowded marketplace of the undead. While Land of the Dead (2005) is fittingly seen as something of a masterpiece now, on its initial release it puzzled genre fans, who had gotten used to the sort of “fast zombies” that were first featured in the nihilistic-with-a-happy-ending British movie 28 Days Later (2002). Romero’s new film was as trenchant as his others, but many fans weren’t having it.

IMDb user reviews provide a record of their immediate reactions. “This movie was terrible!” one wrote the month Land of the Dead premiered. “The storyline—can’t use the word plot as that would give it too much credit—was tedious! Some say it was a great perspective on class? Are you kidding me!!!” Less than a year into George W. Bush’s second term, Romero was archly depicting a society much different from the one he’d shown in Night of the Living Dead. This new society—today’s—was more class-riven, more opportunistic, more cynical. And Romero, even while moving in the direction of Hawksian classicism, was exposing these failings with radical acuity. His dark fable of two Americas at war over the control of the resources necessary to survive was concise, imaginative, and well constructed. Few at the time wanted to consider the film’s style, which seemed out of date compared to the Dawn of the Dead remake. Fewer still wanted to grapple with its implications.

Ten years later, it is clear that no American genre film from that period digests and exposes the Bush era more skillfully than Land of the Dead. Romero’s film was uncomfortably ahead of its time, and like his other zombie work, it hasn’t dated; it speaks of 2015 as much as 2005. Tightly controlled scenes avoid the pointlessness and repetition of the nouveau zombie films, limning class struggle in unexpected ways. Zombies, slowly coming to consciousness, use the tools of the trades from which they’ve been recently dispossessed to shatter the glass of fortified condos. A zombie pumps gas through the windshield of a limo. The rich commit suicide, only to come back to life as zombies and feed on their children. America, as the original-zombie-era Funkadelic LP taught us, eats its young.

As zombie fantasies go, these scenes are much richer than the random, unsatisfying mayhem of the nouveau zombie films. Romero, unlike his counterparts, does not shy away from race. He shows African Americans pushing back against the injustices and indignities of a militarized police state, thereby completing a circle that began with Duane Jones’s performance in Night of the Living Dead.

Walking Tall

For the latest generation of zombie enthusiasts, the zombie genre means just one thing: AMC’s massively popular cable series The Walking Dead. The show is so much better than any of the recent non-Romero zombie movies that it’s among the leading exhibits in the case against the cineplex. The show’s politics and implications are widely discussed, and The Walking Dead has engendered national debate about all sorts of ethical issues, including something Romero’s films raised only in the negative: America’s future. But the first problem The Walking Dead solved was how to make its own debates about these things interesting: whenever scenes get too talky, a “walker” sidles up and has to be dispatched in the time-honored fashion. At its core, the zombie drama is like playing “You’re it!” The show could be called Game of Tag.

The Walking Dead debuted in 2010, emerging from a period in U.S. history when, all of a sudden, we found ourselves in a junked, collapsed, post-American environment. New dystopian dramas, especially the YA ones, reflected this chastened reality. The Walking Dead looked at first like it might become just another placeholding entry in this cavalcade of glumness, much like TNT’s Spielberg-produced, families vs. aliens sci-fi show Falling Skies. Zombies were maybe the most dated way possible to dramatize our newly trashed world.

It was The Walking Dead’s dated qualities, however, that saved it from becoming cable TV’s Hunger Games. The show’s grunge aesthetic and majority-adult cast situated it elsewhere. And if that particular elsewhere felt like the past as much as the future, that was part of what made the show work for premium cable’s Gen X audience. Greg Nicotero, a makeup man who worked under Romero, is one of the show’s producers. His presence indicated the people behind the show took the genre seriously, unlike anyone else in Hollywood who had touched it.

Television works by imitating success, by zombifying proven formulas through a process called mimetic isomorphism. When television producers saw The Walking Dead’s ratings beating broadcast-network ratings—a first for cable drama—they took notice and began spawning. Copies of copies like Resurrection, The Last Ship, The Leftovers, and 12 Monkeys showed that plague is contagious, but it doesn’t have to be zombie plague. Meanwhile, The Walking Dead continues its success, and AMC will debut a companion series this summer, unimaginatively called Fear the Walking Dead.

If the worst zombie movies unselfconsciously imitate higher-gloss broadcast-network reality trash like Survivor, The Walking Dead succeeds by staying closer to the lowest grade of cable-network reality TV. The world of The Walking Dead is closer to Hoarders than it is to Big Brother. Hoarders presents an America engulfed in mounds of trash that its psychologically damaged possessors can’t part with. Mounds of Big Gulp cups and greeting cards and heaps of car parts and instruction manuals overwhelm their homes, spilling into their yards. Shows like Storage Wars, Pawn Stars, and American Pickers present an America of valueless junk that maybe somebody can make a buck on—if only by televising it for our own lurid delectation. These shows are the opposite of pre-collapse valuation shows like Antiques Roadshow, in which the junk people had lying around proved to be worth more than they had imagined. The detritus of Hoarders is worthless, the kind of trash that will blow around everywhere after the Zombie Apocalypse.

Hoarders vs. Horde

In his recent book 24/7, an analysis of the end of sleep and our twenty-four-hour consumption-and-work cycle, Jonathan Crary writes that “part of the modernized world we inhabit is the ubiquitous visibility of useless violence and the human suffering it causes. . . . The act of witnessing and its monotony can become a mere enduring of the night, of the disaster.” Zombies, not quite awake but never asleep, are the living-dead reminders of this condition, stumbling through our fictions. When they are not transformed by the wishful thinking of ideology into our pals, they retain this status.

Celebrated everywhere, zombies are the opposite of celebrities, who swoop into our disaster areas like gods from Olympus to rescue us from the calamities that also allow them to flourish. Zombies, far from being elevated, descend into utter undistinguishable anonymity and degradation, which is why they can be destroyed in good conscience. Brad Pitt, one of the producers of ABC’s Resurrection, also starred in World War Z, the most expensive zombie movie ever made. The last line of that odious movie—the first neoliberal zombie movie—is “Our war has just begun.”

Whatever that was supposed to mean to the audience, these fables of the plague years drive home just who the zombies are supposed to be—and who, when the plague hits, will helicopter out holding the machine guns. Col. Kurtz’s faithful devotee from Apocalypse Now, Dennis Hopper, the counterculture hero who became a Republican golf nut, plays the leader of the remaining 1 percent in Land of the Dead. “We don’t negotiate with terrorists,” he says when he’s faced with the choice between his money and our lives.

The Golden Era Of Knowledge Resistance Is Upon Us


By Bernie Suarez

Source: Truth and Art TV

With the recent Virginia shooting false flag incident now blown wide open, it occurred to me that humanity is in the middle of an era of unprecedented abundance of knowledge and information. We are swimming in knowledge thanks to the internet and technology, and this knowledge is easily available to almost anyone in a moderately developed country. Even children have access to it. Used properly, the technology and information available to us today give humanity the tools it needs to overcome and even replace the dying new world order empire seeking control of it.

That’s the good side of today’s technology and of the knowledge and information available to humanity. Here is the dark side, however. With much knowledge comes much responsibility, and much is expected of the species intellectually. Given how much knowledge is available to humanity, we must face objectively what is transpiring before our very eyes. A large percentage of humans are still blinded by the controllers because they deliberately resist this otherwise easily available knowledge, and this first-of-its-kind level of resistance seems to be at an all-time high.

I say “resistance” because, more than ever before in the history of mankind, humans can rapidly educate themselves about almost anything. Anyone can easily get past the information bottleneck of mainstream media, do their own research, and discover how real the globalist new world order plans are. After all, they are not even hiding their intentions. Ask yourself: why would they publish the Project for the New American Century documents BEFORE 9/11, telling the world what their intentions would be? How is it that we have access to the Operation Northwoods documents, which clearly outline the controllers’ plans for staging false flag terror attacks throughout the U.S. as a pretext for war, yet people still think this is not possible? They’ve admitted Vietnam was a war based on a lie, but does anyone care? Crisis actors are now regularly used to stage sloppy shootings, yet many refuse to look into this too deeply for fear of what the new knowledge will require of them. Thus, today it takes more mental energy to fend off the knowledge at everyone’s fingertips than to absorb it. People are apparently willing to put enormous energy into resisting this knowledge just to preserve their paradigms.

Consider that today’s generation has easier, faster, and more accurate access to concrete proof of false flags than at any point in history. Much of this information can be obtained in almost real time. Yet, unlike at any point in history, people who are privileged to this surreal flow of information simply ignore it, as if it were not there. This engineered modern-day complacency and resistance to easily available knowledge is, I believe, unique, and it must be pointed out if we are to understand where we are and which solutions will work best.

Think about that the next time you hear people still clinging to government-engineered “official story” lies, false flag events, planned agendas, and propaganda. One reason people cling to government lies and resist easily available truth is that they don’t want to believe things are that bad. They would rather believe that government is looking out for the greater good of the people. They don’t want to believe anything that would compromise their view of the world and their paradigm about life. Paradigms are very real, and many refuse to change how they think to accommodate a new one. They will ignore logic, avoid tough questions, even ridicule the messenger to protect their belief systems; all of these are forms of resistance to knowledge.

Cognitive dissonance is at an all-time high. More than ever, people prefer to keep the conversation light. They just don’t want to discuss “conspiracies” or government corruption for fear of getting into an unwanted debate, so to avoid conflict over a political issue it’s better to avoid the topic altogether. Today many workplaces expect their employees to avoid controversial topics because they are not good for the workplace. This is an institutionalized resistance to knowledge, and it is deliberate and engineered. It is cognitive dissonance on a mass scale.

So, although many people are waking up to the nature and true meaning of current political events both domestically and globally, a great sifting is going on. More than ever, people are resisting simple truths that would easily have persuaded and awakened the average person of centuries past, had the same information been available then. I believe this is widening the knowledge gap within the species, and that this widening is part of an intellectual and consciousness extinction phenomenon now under way. Sadly, many humans are so deeply tangled in the matrix of lies that they will likely die in it, never having understood how or why they died, like sheep led to the slaughter.

You may wonder how we got here. Several factors are likely at work, including the fact that today’s propaganda is much more sophisticated and thus more effective. We have sensationalized mass media, advertisements, billboards, radio, TV, Hollywood movies, video games, newspapers, magazines, social media, cell phone apps, and more. All of these sources of information are well equipped to manipulate the human mind, especially given that people are subjected to these devices and distractions every day.

Solutions

You may be thinking: what can we do about the mass resistance to knowledge we see today? For one, keep the information circulating. Don’t get too worked up every time they execute another false flag. Instead, see the false flag in terms of the usual script they implement almost every time, and do your best to share the information with those still living in the matrix of lies. Then move on and keep focusing on viable solutions to the problems we face.

Still running into people clinging to government “official stories” on one false flag event after another? Realize who you are. Remind yourself that truth exists on its own, and it’s not up to you to convince anyone; just show people the truth and let them recognize it for themselves. Realize we are living in an era of unusual resistance to knowledge because this resistance is engineered. Realize that it’s not about winning an argument or convincing others of your personal point of view. It’s about putting out the objective truth and hoping people see the bigger picture for their own survival. You can lead a horse to water, but you cannot make it drink.

I believe understanding this engineered resistance to common knowledge is a big part of the puzzle when it comes to staying mentally healthy and focused on solutions. It’s all about keeping things in the right perspective and maintaining your own mental, emotional, spiritual and physiological balance.

Realize that as much as people resist knowledge and information freely and widely available to humanity today, there is a constant hope and positive outlook burning in our hearts, reminding us that we may be part of an inevitable and important mass human awakening.

Also, let’s consider that this knowledge resistance is like the diver who is too afraid to jump from the high diving board for fear of what will happen if he does. All those people afraid of giving up their current paradigms wouldn’t be afraid if they were assured that everything was going to be all right.

Finally, realize that as things get worse, many more people will be inclined to exchange their paradigms for better ones. That is why we need to offer hope to others. It’s not enough to point out false flags and focus on fear and government corruption. We must answer back with some kind of solution and hopeful strategizing. Deliberate knowledge resistance is best met with the fewest words. Show someone the truth; don’t debate with them so much. Allow them to believe whatever they want and move on after telling them what they need to hear. It’s not about convincing these people or winning an argument; it’s about being mentally stronger, more assured and wiser as you expose them to objective truth. It’s about staying focused on the bigger picture, not about burning energy trying to fix someone else’s resistance to truth.

Bernie Suarez
Creator of Truth and Art TV Project

Bernie is a revolutionary writer with a background in medicine, psychology, and information technology. He has written numerous articles over the years about freedom, government corruption and conspiracies, and solutions. A former host of the 9/11 Freefall radio show, Bernie is also the creator of the Truth and Art TV project, where he shares articles and videos about issues that raise our consciousness and offer solutions to our current problems. His efforts are designed to encourage others to joyfully stand for truth, to expose government tactics of propaganda, fear and deception, and to address the psychology of dealing with the rising new world order. He is also a former U.S. Marine who believes it is our duty to stand for and defend the U.S. Constitution against all enemies foreign and domestic. A peace activist, he believes information and awareness are the first steps toward freedom from the globalist control system that now threatens humanity. He believes love conquers all fear, and that it is up to each and every one of us to manifest the solutions and the change we want to see in this world, because doing so is the very thing that will ensure victory and the restoration of the human race from the rising global enslavement system, and will offer hope to future generations.