Media, Mind-Control, & Meditation: Plato’s E-Cave Panopticon and Beyond


By Mankh (Walter E. Harris III)

Source: Axis of Logic

“Relax,” said the night man,
”We are programmed to receive.
You can check-out any time you like, 
But you can never leave!”

 – The Eagles, from “Hotel California”

In Plato’s Allegory of the Cave, people saw a shadow play on the wall and perceived it as reality. Today that Cave has morphed to provide umpteen TV channels and mini-screen gadgets; and the once sanctified living room cave has expanded into a free-range bubble of consciousness – heads bowed before an electronic altar, seemingly oblivious to the outside world.

“The ‘panopticon’ refers to an experimental laboratory of power in which behaviour could be modified, and Foucault viewed the panopticon as a symbol of the disciplinary society of surveillance.”[1]

In Plato’s E-Cave, not only are the people watching a shadow play, they, and the shadows they watch, are being watched.

HyperNormalization
With a veneer of calm aplomb, the masses communicate 24-7, often with a frenetic urgency about the mundane – hence one interpretation of HyperNormalization. In all the years of overhearing cell-phone conversations in public, I can’t recall one snippet of philosophy or practical advice, rather stuff like, ‘yeah OMG I’ll get the chips!’ and ‘I’ll be there in like 30 seconds!’; if you’re old enough, you remember the days when you got there when you got there! That said, a cell-phone can be a helpful, even life-saving, device.

The USEmpire election appears to have both proven and disproven the main theory of Adam Curtis’ fascinating new documentary “HyperNormalization.”

“Curtis argues that since the 1970s, governments, financiers, and technological utopians have given up on the complex “real world” and built a simple “fake world” that is run by corporations and kept stable by politicians.”[2]

The California cyber-tech-boom was a love-child of LSD-consciousness that found refuge in E-wizardry all the while looking to escape repressive politics. In his book “2030” Pepe Escobar describes it as: “Digital network capitalism would then shape post-modern globalization, from the New Economy before the end of the millennium to every digital wall to be broken beyond. California cosmology forged our world.”

According to Wikipedia:

“The term “hypernormalisation” is taken from Alexei Yurchak’s 2006 book Everything was Forever, Until it was No More: The Last Soviet Generation, about the paradoxes of life in the Soviet Union during the 20 years before it collapsed. A professor of anthropology at the University of California, Berkeley, he argues that everyone knew the system was failing, but as no one could imagine any alternative to the status quo, politicians and citizens were resigned to maintaining a pretence of a functioning society. Over time, this delusion became a self-fulfilling prophecy and the “fakeness” was accepted by everyone as real, an effect which Yurchak termed “hypernormalisation”.”[3]

First off, I see HyperNormalization as a half-truth because while the self-referential, gossipy world is fake, it is a veneer for a very real world of resource extraction and the violence perpetrated to maintain the status quo – and that is what the fakers ignore. As for the election, many expected the Clinton dynasty to prevail as the corrupt, business-as-usual lesser of two evils. Breaking snooze: Where’s the headline news that Melania Trump plans to focus on helping women and children? I saw that mentioned on CNN’s Anderson Cooper 360 – such a perfect antidote for Hillary’s loss and The Donald’s anti-feminist track record.

Even though WikiLeaks revealed that the DNC (Democratic National Committee) was rigged against Sanders, and leaks of Podesta/Clinton e-mails may have been proverbial straws for Hillary’s campaign camel, the prevailing media sentiment was that she would win – despite the fact that the day before the election some polls indicated a shrinking 3-4 point lead, which, given the typical margin of error, meant the race was, in effect, tied. Yet the mass hyper-surprise!; and all that after the corporate media and comedy shows had simultaneously bashed Trump and given him more air-time than a kite on a windy day – both disdaining and elevating him, a media-mindfuck if ever there was one.

And there was little uproar when Gary Johnson and Jill Stein were excluded from debates and virtually banned from all corporate news, thus proving (if you didn’t know already) that it’s not a truly democratic election. Yet carry on we must and do the best we can with what we’ve got, was the general sentiment; in other words, “the ‘fakeness’ was accepted by everyone as real,” aka HyperNormalization.

Yet the non-coastal, poor and middle America White working class (with sides of KKK and minorities!), which reportedly was the key demographic for The Don’s victory, was voting out of bare-bones needs; in effect they were not HyperNormalized. Then again, if Trump, who promises to challenge the status quo, caves in to Deep State pressure, we will have an answer to an interesting article’s titular question: “President Trump: big liar going to Washington or Tribune of the People?”[4] And if that answer is the former, then the mostly White working poor will also have been duped/HyperNormalized.

You see, it’s hard to know what’s what; which all seems to prove HyperNormalization as the dominant societal charade; which screen is real, if any?

What you perceive is what you get
Another key phrase in Curtis’ documentary is “managed perception” – akin to Chomsky and Herman’s  “Manufacturing Consent: The Political Economy of the Mass Media” (1988) which showed how opinions and desensitized agreement to the status quo became a product to be mined and marketed. Witness CBS Chairman Les Moonves’ comment from February 2016:

“Man, who would have expected the ride we’re all having right now? … The money’s rolling in and this is fun. I’ve never seen anything like this, and this [is] going to be a very good year for us. Sorry. It’s a terrible thing to say. But, bring it on, Donald. Keep going.” [5] And, if I might satirically add, like an orangey-topped Duracell battery, The Don does seem to have a lot of energy.

HyperNormalization is also akin to Sheldon Wolin’s theory of “inverted totalitarianism,” where the faceless corporate machine wields so much entertainment and bureaucratic power that the people barely notice they are being played hook, line, and sinking feelings.

Some of Curtis’ info, however, is inaccurate or questionable. For example, he blames the demise of the Occupy Movement on a lack of vision and planning yet neglects to mention the FBI-coordinated, systematic crackdown on the ‘camps’ across the country. [6]

About Syria he says that President Bashar al-Assad retaliated with a “vengeful fury,” whereas Assad has defended his actions as self-defense for his people; numerous journalists back the Syrian President’s claim.

Post-election, I wonder: Why were so many liberals, lefties, and women more surprised that Trump won than that the DNC/HRC rigged it against Sanders? Were they simply feeling impotent due to HyperNormalized shadow play? Whatever the case, his victory revealed a crack in the smooth screen of HyperNormalization.

According to Patrick Caddell’s survey before the election, the populace seems to be hip to what’s happening:

“Powerful interests from Wall Street banks to corporations, unions and political interest groups have used campaign and lobbying money to rig the system for them. They are looting the national treasury of billions of dollars at the expense of every man, woman and child. AGREE = 81%; DISAGREE = 13%…

“The country is run by an alliance of incumbent politicians, media pundits, lobbyists and other powerful money interests for their own gain at the expense of the American people. AGREE = 87%; DISAGREE = 10%

“The real struggle for America is not between Democrats and Republicans but between mainstream America and the ruling political elites. AGREE = 67%; DISAGREE = 24%.” [7]

This makes it seem that it is the ‘how to bring about change’ that is the deeper conundrum of the HyperNormalized world. Then again, people too often seem to have only their own best interests at heart. Approximately 90% of Americans want their food labeled so as to know whether it contains GMOs etc., yet there was hardly a peep out of anyone when climate change and the “environment” aka Mother Earth & Nature-beings were deliberately excluded as a main topic of the presidential debates. It seems that, in general, Americans want clean food for themselves, and they want a clean environment so they can go ‘play in the park’ by themselves. And that brings us to the Great Disconnect and Standing Rock.

Flow like water, be steady as a rock
Part of HyperNormalization is a disconnect from Nature and from being self- and community-guided, and therefore a disconnect from the Original Peoples of Turtle Island. While many people are certainly aware of what’s happening with Standing Rock, the corporate media’s lack of attention, let alone empathy, fosters a lack of concern for the outcome and a lack of outrage at the violent and racist treatment of those in prayer, peacefully protecting the main source of water for the Standing Rock Sioux and other Natives as well as for approximately 17 million people downstream. The real-fake outrage is over the outcome of a fake election.

On many Native reservations, where poverty and PTSD are prevalent, the people are anything but HyperNormalized; they are struggling to survive. As with Standing Rock, they are not asking for much: clean water and the space to live their lives with ancient traditions timelessly connected with the land, the water, the stones…

Plato, imagining a prisoner getting outside of the Cave, wrote: “Slowly, his eyes adjust to the light of the sun. First he can only see shadows. Gradually he can see the reflections of people and things in water and then later see the people and things themselves. Eventually, he is able to look at the stars and moon at night until finally he can look upon the sun itself.” [8]

Standing Rock is bringing people together, raising consciousness at many levels. As one example from a little over a week ago:

“In a “historic” show of interfaith solidarity, 500 clergy members prayed along the banks of North Dakota’s Cannonball River on Thursday where they “bore witness with the Standing Rock Sioux Nation,” which has faced intimidation, violence, and arrests for protecting their sacred land and water supply from the threats of a massive oil pipeline. According to the Episcopal News Service, “The interfaith group spent more than five hours on site, marching, singing hymns, sharing testimony, and calling others to join them in standing with the more than [300] tribes who have committed their support to the Sioux Nation as they protest the route of the Dakota Access Pipeline (DAPL).” [9]

This solidarity is in stark contrast with the election, where a typically 51/49 system determines an outcome for a society literally programmed for exciting close-call winners – think sporting events, reality shows, awards ceremonies.

Outside in and inside out
There are many ways to get outside the Cave. One of those is by going within. When I first started to meditate, as is common, I became aware of the constant chatter in my head and soon realized that the mind left unattended will rattle on endlessly. Meditation then showed me that once the chatter quiets, one has access to other frequencies, other ‘channels’ – what happens then is a very personal matter yet also impersonal, because one becomes connected with another source of thinking-seeing-feeling. Chatter, like cell-phones, has its usefulness, say, if you forgot to turn the stove off. My point in sharing the personal anecdote is that the corporate media is too much chatter, endlessly looping itself. The gadgets, too, though useful, seem hard-wired for HyperNormalization.

Some of the chatter we must live with, yet by learning to follow our intuitions, listening to and connecting with Nature, and talking with elders and little children, we can better tune out from the shadow-play; we can find new and ancient ways for more people – and that includes trees, rivers, etc. – “to look at the stars and moon at night” and “look upon the sun itself.”

“Sin filo ya y derrotada se quejó: ‘Soy más fuerte que ella, pero no le puedo hacer daño y ella a mí, sin pelear, me ha vencido.’”

“Without sharpness and defeated, it [the sword] complained: ‘I am stronger than the water, but I cannot harm her. And the water, without fighting, has conquered me.’” [10]

NOTES:
1.    Panopticism here and here.
2.    Ibid.
3.    HyperNormalization
4.    See here.
5.    “Les Moonves: Trump’s run is ‘damn good for CBS’”. See here.
6.    “Revealed: how the FBI coordinated the crackdown on Occupy”
7.    “Patrick Caddell; The Pollster Who ‘Got it Right’”
8.    “Allegory of the Cave”
9.    “A Prayer for People and Planet: 500 Clergy Hold ‘Historic’ Mass Gathering for Standing Rock”
10.    “Questions & Swords: Folktales of the Zapatista Revolution” as told by Subcomandante Marcos, Cinco Puntos Press, El Paso, Texas, 2001, p. 82.

To watch “HyperNormalization”, click here.

Mankh (Walter E. Harris III) is an essayist and resident poet on Axis of Logic. In addition to his work as a writer, he is a small press publisher and Turtle Islander. His new book of genre-bending poetic-nonfiction is “Musings With The Golden Sparrow.” You can contact him via his literary website.

  

Soylent Burgers and Cockroach Milk


Source: The Hipcrime Vocab

“The profitability of production cannot expand indefinitely. Any increase in the quantity of soil, water, minerals, or plants put into a particular production process per unit of time constitutes intensification. It has been the burden of this book to show that intensification inevitably leads to declining efficiencies. That declining efficiencies have adverse effects upon the average standard of living cannot be doubted.”
-MARVIN HARRIS, ‘Cannibals and Kings’

This comment made me chuckle: “The futurology future is starting to look worse than the collapse future.” This was on Reddit in response to an article about cockroaches providing the “milk of the future”:

Scientists think cockroach milk could be the superfood of the future (Science Alert)

This really does seem like The Onion at this point. Someone suggested that Reddit’s collapse and futurology boards should merge at some point. Believe it or not, they aren’t all that far apart.

We’ve already been treated to an endless litany of articles about how insect ranching will provide the protein of the future. Then there’s the meat grown in a petri-dish, and the nutrition shake cheekily named Soylent scarfed down by the Silicon Valley crowd so they can cram in a few more hours of work after popping their Ritalin. Now people are questioning whether the government should step in and force us to eat less meat.

And yet we are still simultaneously told that overpopulation and resource depletion are not a problem, and that more growth is good.

This is progress???

One of the things I’ve written about over the years is this idea that technological innovations are inherently good. But it’s clear what’s really going on: desperately trying to maintain the status quo in the face of increasing population pressure and declining resources. There’s a technical term for this: intensification.

Marvin Harris, whose works serve as a guidepost for this blog, warned us that intensification always leads to lower living standards for the majority of people in the long run, while only benefiting a tiny handful. This is a law of history. Over the years, I’ve tried to point out the difference between true innovation which solves problems or allows us to do things we could not do before, and intensification, which is essentially squeezing blood from a stone. In the former category are things like antibiotics and radio, which solve problems (killer infections) or allow us to do new things (communicate globally). In the latter category are things like electric cars (attempting to keep the unsustainable automobile infrastructure alive) and aquaculture (to make up for stripping the oceans bare of wild fish).

For the majority of people, there is no difference, since both are “growth” and growth is always good, full stop. GDP, the yardstick by which we measure progress in the modern world (which even its creator warned us against) is agnostic as to the source of growth, whether it is producing more food to feed hungry people or asthma inhalers to deal with the lung irritants from air pollution.

People tend to forget we’ve been here before.

Back during the Ice Age (late Pleistocene), we H. sapiens lived primarily off of herds of large fauna, especially reindeer, mammoth and bison. This was supplemented with wild salmon in season. The fattiest parts of the animal were the most prized and sought after. Bones were cracked and boiled to extract the grease. Most calories came from nutrient-dense meat and fat, while plants were consumed for their beneficial vitamins and minerals (plants are less calorie dense).

Then the large fauna started to die off. They died off due to a double-blow of a changing climate and increasing human predation. Scientists debate about which was the primary cause, but it’s pretty clear that whenever humans showed up in a pristine environment, the large animals went extinct shortly thereafter. Many of these animals had survived previous climatic changes, so it’s doubtful that climate change alone was responsible. Skeletons riddled with spear points provide more damning evidence for our species.

In response, we launched a broad spectrum revolution – using our omnivorous diet to exploit a wider variety of foodstuffs, particularly plant foods. This began with acorns and pistachios, but soon moved to grass seeds, sedges and pulses. Meanwhile, the prey animals got smaller and smaller, from reindeer and bison, to gazelles and fallow deer, to hares and waterfowl. Instead of the nutritious and diverse food sources of their ancestors, we became more and more dependent upon eating pulverized grass seeds, obtained at the cost of backbreaking labor for harvesting, threshing and grinding.

The human population became mostly vegetarian by necessity, and remained so for roughly the next 8,000 years. The problem is, a vegetarian diet doesn’t provide a lot of the vitamins, minerals and nutrients necessary for optimal health. Today’s vegetarians can choose from a plethora of foods year round that simply weren’t available to ancient people. They don’t have to worry about what is in season and have the entire world as their larder. In the past, however, the vast majority of people ended up subsisting on a diet of weak beer and gruel.

Regular meat consumption became a privilege restricted to the wealthy upper classes, while everyone else went begging. Hunting, an activity once done by all humans everywhere since time immemorial, became the exclusive province of kings and princes – society’s rulers. While it is true that too much meat can be detrimental to health, too little is perhaps even more damaging. Humans are meat-eaters, and a certain level of fat and protein is required for optimal health. The protein in grains and legumes is incomplete (the body needs 22 different types of amino acids to function properly; adults can synthesize 13 of those internally, but the other 9 must be obtained from food), and there is little fat (the human brain is over 60 percent fat). Grains produce an over-abundance of omega-6 fatty acids, contain poisonous lectins that deter their consumption, have low nutrient density, and are highly acidic. They are actually a terrible thing to base a primate diet around. But we had no other choice, thanks to intensification.

And this is dramatically reflected by the skeletons of ancient peoples, who show major signs of malnutrition, disease, and stunted growth. At the same time, arthritis and other signs of wear and tear make their appearance on the bones of people who now have to spend hours a day grinding grain in a saddle quern rather than fishing and chasing after wild animals. This gruel also breaks down into simple sugars in the mouth during digestion, meaning that cavities and premature tooth decay became endemic as well.

As population pressure grew, grains, pulses and sedges, once “unpalatable” dietary supplements cultivated by hunter-gatherers for times of extreme scarcity or fermentation into medicinal beverages, became the chief dietary staple for most people. At the same time, humans found themselves preyed upon by a new class of predator: their own kind, which continues unabated to this day.

In order to keep large herbivores from going totally extinct, we embarked upon what Harris called “the greatest conservation project in history”: animal domestication. Meanwhile, cheap carbohydrates from grain are what kept most of the human population alive from day-to-day for thousands of years, such that “bread” is synonymous in all ancient cultures with “food.”

All this came from attempting to exploit resources more intensively from our environment in the face of increasing population pressure.

This sad tale, memorably spun by Jared Diamond some years ago, reflects Harris’ principle: intensification inevitably leads to benefits for the few; misery and oppression for the many.

During periods of deintensification, we actually recovered some of the losses. This was due to either 1.) a reduced population or 2.) new lands and resources opened up for exploitation. For example, signs of health improved for the survivors after the Black Death in Europe, due to the reduced population pressure. There were more resources to go around per head. Also, the opening up of new lands through colonization (and the dieoff of the native peoples) brought vast new areas of virgin land under cultivation. This led to more wealth, as well as political freedoms. Serfdom waned after the Black Death, and the American Revolution put Enlightenment principles of representative democracy and justice into practice. Perhaps the most dramatic result came from the harnessing of millions of years of stored sunlight in fossil fuels, combined with the scientific method. This allowed many more people a higher standard of living, even in the face of increasing population and intensifying resource use. It was during this period that “economics” became the guiding principle of our civilization, and it chalked up all benefits to “institutions”–typically capitalist market institutions–rather than a temporary superabundance of energy and resources.

Thomas Jefferson once noted that the Americans in the room were all a head taller than their European counterparts. That’s what happens when you have plenty for everybody. The first Europeans in North America also noted how much taller the Native Americans were. As this article notes, in the past, Americans ate more meat than today, and were healthier as well:

How Americans Used to Eat (The Atlantic)

Eventually, the Malthusian cycle kicked in again. Population grew, the empty spaces filled up, and the frontier was closed. Increasing competition caused wages and purchasing power to drop. People gradually lost what self-sufficiency they had, allowing the elites to consolidate power. People once again began working longer, harder, for less. Sound familiar?

We intensified again – in order to keep up with the demand for meat, we crowded animals together into feedlots in unsanitary conditions and fed them cheap corn (maize), which they are not adapted to eat. To cope with the inevitable sickness which resulted, we pumped the animals full of antibiotics (which has a side effect of increasing growth). It is these miserable and tortured animals which most of us are forced to eat now, thanks to intensification.

However, domesticated meat is less nutritious than the wild variety. The Omega-3/Omega-6 profile is altered, and there are fewer antioxidants. Omega-3 fatty acids reduce inflammation, which is increasingly being pinpointed as the root cause of just about every disease you care to name, from autoimmune diseases, to Alzheimer’s, to arthritis, to chronic pain, depression, and cancer. At the same time, it’s been shown that grains actually increase inflammation, and are implicated in a host of metabolic diseases:

This Is Your Brain on Gluten (The Atlantic)

While grass-fed, hormone-free beef is still available, it costs more, meaning it is restricted to those with high incomes, just like in the past. And hunting is still primarily an elite sport for the rich in many places (especially outside North America). Just like in the past, the poor people trapped in “food deserts” feed themselves with cheap carbohydrates, now in the form of processed corn and sugar products made by the industrial food system, while the wealthy can purchase boutique ‘lifestyle’ products at Whole Paycheck Foods. Malnutrition now takes the form of obesity as well as starvation, although much of the non-industrialized world still deals with empty bellies, stunted growth and vitamin deficiencies, including many of those who produce export crops for the West. That’s on top of poverty and pollution.

When we scraped the oceans clean of fish and poisoned our air and waterways with industrial pollutants (e.g. mercury ash as a side effect of coal power generation), we turned to fish farming (aquaculture) – one of the favorite high-tech “innovations” of the futurist crowd. But farmed fish are nutritionally inferior to wild ones. Wild fish travel widely and get their food from a great variety of sources. This means that they have a much better Omega-3 fatty acid profile (which prevents inflammation and helps brain growth). But farmed fish have to be fed. This means their diet is far more restricted, and hence their meat less nutritious (more Omega-6’s). In fact, farmed salmon have to be fed a pigment supplement to turn their flesh pink so that consumers will buy them, since their meat does not develop its natural color from their diet. As Spencer Wells notes in Pandora’s Seed, we’re now doing for fish what we did for ungulates some 8,000 years ago: a desperate attempt to preserve what remains. Farmed fish is replacing wild fish in supermarkets. As with grass-fed meat, the wild variety is now sold at a premium affordable only to those with high incomes (sound familiar?).

In each and every case, intensification has led to far more work for ultimately inferior products. This is always the result of intensification in the long run.

We are constantly told we can’t go back to hunting and gathering (even if we wanted to). Why is that? What’s left unsaid is the reason: too many people and too much environmental degradation as the result of 6-8,000 years of intensification, which also brought about disease, governments, wars, taxes, poverty, inequality, and so on. Now we’re told we’ve got to eat less meat (which means more grains), live in small, tightly sealed houses, use less water, take shorter showers, and so forth. In essence, that we will “innovate” our way to success. But all of these are signs of lower living standards. And no wonder: seven billion-plus people, all quarters of the earth occupied and brought under the plow, rain forests being chopped down, the most easily accessible fossil fuels plateauing, toxic pollution of the air, land and water, overpumping of ground water, and the stable climate of the Holocene threatened by carbon levels. Intensification caused all of these things; it is not the solution. The next phase of intensification isn’t going to lead to better living standards any more than the last few rounds. Yet we’ve been tricked into thinking it will, because we don’t realize that fossil fuels are what are ultimately responsible for our current living standards (us Westerners, that is), not intensification. And even then, given the levels of stress, overwork, social dysfunction, health maladies and mental disease in industrialized societies, we might be tempted to wonder if even our living standards are all that great to begin with.

Furthermore, we are told that a healthy diet centered around pastured meat, plants and nuts is just not possible because it’s too damaging to the environment, or too “expensive.” That is, “we” need to “feed the world!” But according to the elites (the ones who benefit from intensification, remember) the answer isn’t less people, or curtailing economic growth. No, instead it’s new “innovations” that are profitable to the parasitical corporate owners of this planet: lab-grown meat, hydroponics, vertical gardens, meal-replacement shakes, protein powder from ground-up crickets, steel-and-glass human anthills. “The futurology future is starting to look worse than the collapse future.” Maybe that’s because the collapse future has more room to grow actual real food, live in a house you built yourself with your friends and family, spend time in nature, work less, play more, and get in touch with what we really are, deep down, instead of what industrial society wants to mold us to be.

Now, for the record, I have no problem with eating bugs. The Permaculturist in me says we should exploit all sources for sustenance in our environment such that they work together in a sustainable, harmonious way in line with the earth’s natural ecosystems. Raising insects, as we now do with bees, makes sense. And, yes, the overconsumption of Americans is grotesque and makes us unhappy, and we’d be better off ditching it (which I already do voluntarily). So to be clear: what I am criticizing is not eating insects or deriving milk from cockroaches per se. Nor am I defending the overconsumption produced by status-driven consumer capitalism. Rather, I am critiquing the idea that these futurology trends are signs of progress rather than collapse. Which is why r/collapse and r/futurology increasingly appear to be turning into the same thing.

P.S. This comment nails it.

 

Will Robots Take Your Job?

Walmart Robots

By Nick Srnicek and Alex Williams

Source: ROAR

In recent months, a range of studies has warned of an imminent job apocalypse. The most famous of these—a study from Oxford—suggests that up to 47 percent of US jobs are at high risk of automation over the next two decades. Its methodology—assessing likely developments in technology, and matching them up to the tasks typically deployed in jobs—has been replicated since then for a number of other countries. One study finds that 54 percent of EU jobs are likely automatable, while the chief economist of the Bank of England has argued that 45 percent of UK jobs are similarly under threat.

This is not simply a rich-country problem, either: low-income economies look set to be hit even harder by automation. As low-skill, low-wage and routine jobs have been outsourced from rich capitalist countries to poorer economies, these jobs are also highly susceptible to automation. Research by Citi suggests that for India 69 percent of jobs are at risk, for China 77 percent, and for Ethiopia a full 85 percent of current jobs. It would seem that we are on the verge of a mass job extinction.

Nothing New?

For many economists, however, there is nothing to worry about. If we look at the history of technology and the labor market, past experience would suggest that automation has not caused mass unemployment. Automation has always changed the labor market. Indeed, one of the primary characteristics of the capitalist mode of production has been to revolutionize the means of production—to really subsume the labor process and reorganize it in ways that more efficiently generate value. The mechanization of agriculture is an early example, as is the use of the cotton gin and spinning jenny. With Fordism, the assembly line turned complex manufacturing jobs into a series of simple and efficient tasks. And with the era of lean production, we have had the computerized management of long commodity chains turn the production process into a more and more heavily automated system.

In every case, we have not seen mass unemployment. Instead we have seen some jobs disappear, while others have been created, not only replacing the lost jobs but also providing the new jobs necessary for a growing population. The only times we see massive unemployment tend to be the result of cyclical factors, as in the Great Depression, rather than some secular trend towards higher unemployment resulting from automation. On the basis of these considerations, most economists believe that the future of work will likely be the same as the past: some jobs will disappear, but others will be created to replace them.

In typical economist fashion, however, these thoughts neglect the broader social context of earlier historical periods. Capitalism may not have seen a massive upsurge in unemployment, but this is not a necessary outcome. Rather, it was dependent upon unique circumstances of earlier moments—circumstances that are missing today. In the earliest periods of automation, there was a major effort by the labor movement to reduce the working week. It was a successful project that reduced the week from around 60 hours at the turn of the century, down to 40 hours during the 1930s, and very nearly even down to 30 hours. In this context, it was no surprise that Keynes would famously extrapolate to a future where we all worked 15 hours a week. He was simply looking at the existing labor movement. With reduced work per person, however, this meant that the remaining work would be spread around more evenly. The impact of technology at that time was therefore heavily muted by a 33 percent reduction in the amount of work per person.

Today, by contrast, we have no such movement pushing for a reduced working week, and the effects of automation are likely to be much more serious. Similar issues hold for the postwar era. With most Western economies left in ruins, and massive American support for the revitalization of these economies, the postwar era saw incredibly high levels of economic growth. With the further addition of full employment policies, this period also saw incredibly high levels of job growth and a compact between trade unions and capital to maintain a sufficient amount of good jobs. This led to healthy wage growth and, subsequently, healthy growth in aggregate demand to stimulate the economy and keep jobs coming. Moreover, this was a period where nearly 50 percent of the potential labor force was constrained to the household.

Under these unique circumstances, it is no wonder that capitalism was able to create enough jobs even as automation continued to transform the labor process. Today, we have sluggish economic growth, no commitments to full employment (even as we have commitments to harsh welfare policies), stagnant wage growth, and a major influx of women into the labor force. The context for a wave of automation is drastically different from the way it was before.

Likewise, the types of technology that are being developed and potentially introduced into the labor process are significantly different from earlier technologies. Whereas earlier waves of automation affected what economists call “routine work” (work that can be laid out in a series of explicit steps), today’s technology is beginning to affect non-routine work. The difference is between a factory job on an assembly line and driving a car in the chaotic atmosphere of the modern urban environment. Research from economists like David Autor and Maarten Goos shows that the decline of routine jobs in the past 40 years has played a significant role in increased job polarization and rising inequality. While these jobs are gone, and highly unlikely to come back, the next wave of automation will affect the remaining sphere of human labor. An entire range of low-wage jobs are now potentially automatable, involving both physical and mental labor.

Given that it is quite likely that new technologies will have a larger impact on the labor market than earlier waves of technological change, what is likely to happen? Will robots take your job? While one side of the debate warns of imminent apocalypse and the other yawns from the historical repetition, both tend to neglect the political economy of automation—particularly the role of labor. Put simply, if the labor movement is strong, we are likely to see more automation; if the labor movement is weak, we are likely to see less automation.

Workers Fight Back

In the first scenario, a strong labor movement is able to push for higher and higher wages (particularly relative to globally stagnant productivity growth). But the rising cost of labor means that machines become relatively cheap in comparison. We can already see this in China, where real wages have been surging for more than 10 years, thereby making Chinese labor increasingly less cheap. The result is that China has become the world’s biggest investor in industrial robots, and numerous companies—most famously Foxconn—have all stated their intentions to move towards increasingly automated factories.

This is the archetype of a highly automated world, but in order to be achievable under capitalism it requires that the power of labor be strong, given that the relative costs of labor and machines are key determinants for investment. What then happens under these circumstances? Do we get mass unemployment as robots take all the jobs? The simple answer is no. Rather than mass decimation of jobs, most workers who have their jobs automated end up moving into new sectors.

In the advanced capitalist economies this has been happening over the past 40 years, as workers move from routine jobs to non-routine jobs. As we saw earlier, the next wave of automation is different, and therefore its effects on the labor market are also different. Some job sectors are likely to take heavy hits under this scenario. Jobs in retail and transport, for instance, will likely be heavily affected. In the UK, there are currently 3 million retail workers, but estimates by the British Retail Consortium suggest this may decrease by a million over the next decade. In the US, there are 3.4 million cashiers alone—nearly all of whose work could be automated. The transport sector is similarly large, with 3.7 million truck drivers in the US, most of whose jobs could be incrementally automated as self-driving trucks become viable on public roads. Large numbers of workers in such sectors are likely to be pushed out of their jobs if mass automation takes place.

Where will they go? The story that Silicon Valley likes to tell us is that we will all become freelance programmers and software developers and that we should all learn how to code to succeed in their future utopia. Unfortunately they seem to have bought into their own hype and missed the facts. In the US, 1.8 percent of all jobs require knowledge of programming. This compares to the agricultural sector, which creates about 1.5 percent of all American jobs, and to the manufacturing sector, which employs 8.1 percent of workers in this deindustrialized country. Perhaps programming will grow? The facts here are little better. The Bureau of Labor Statistics (BLS) projects that by 2024 jobs involving programming will be responsible for a tiny 2.2 percent of the jobs available. If we look at the IT sector as a whole, according to Citi, it is expected to take up less than 3 percent of all jobs.

What about the people needed to take care of the robots? Will we see a massive surge in jobs here? Presently, robot technicians and engineers take up less than 0.1 percent of the job market—by 2024, this will dwindle even further. We will not see a major increase in jobs taking care of robots or in jobs involving coding, despite Silicon Valley’s best efforts to remake the world in its image.

This continues a long trend of new industries being very poor job creators. We all know about how few employees worked at Instagram and WhatsApp when they were sold for billions to Facebook. But the low levels of employment are a widespread sectoral problem. Research from Oxford has found that in the US, only 0.5 percent of the labor force moved into new industries (like streaming sites, web design and e-commerce) during the 2000s. The future of work does not look like a bunch of programmers or YouTubers.

In fact, the fastest growing job sectors are not for jobs that require high levels of education at all. The belief that we will all become high-skilled and well-paid workers is ideological mystification at its purest. The fastest growing job sector, by far, is the healthcare industry. In the US, the BLS estimates this sector to create 3.8 million new jobs between 2014 and 2024. This will increase its share of employment from 12 percent to 13.6 percent, making it the biggest employing sector in the country. The jobs of “healthcare support” and “healthcare practitioner” alone will contribute 2.3 million jobs—or 25 percent of all new jobs expected to be created.

There are two main reasons why this sector will be such a magnet for workers forced out of other sectors. In the first place, the demographics of high-income economies all point towards a significantly growing elderly population. Fewer births and longer lives (typically with chronic conditions rather than infectious diseases) will put more and more pressure on our societies to take care of the elderly, and force more and more people into care work. Yet this sector is not amenable to automation; it is one of the last bastions of human-centric skills like creativity, knowledge of social context and flexibility. This means the demand for labor is unlikely to decrease in this sector, as productivity remains low, skills remain human-centric, and demographics make it grow.

In the end, under the scenario of a strong labor movement, we are likely to see wages rise, which will cause automation to rapidly proceed in certain sectors, while workers are forced to struggle for jobs in a low-paying healthcare sector. The result is the continued elimination of middle-wage jobs and the increased polarization of the labor market as more and more are pushed into the low-wage sectors. On top of this, a highly educated generation that was promised secure and well-paying jobs will be forced to find lower-skilled jobs, putting downward pressure on wages—generating a “reserve army of the employed”, as Robert Brenner has put it.

Workers Fall Back

Yet what happens if the labor movement remains weak? Here we have an entirely different future of work awaiting us. In this case, we end up with stagnant wages, and workers remain relatively cheap compared to investment in new equipment. The consequences of this are low levels of business investment, and subsequently, low levels of productivity growth. Absent any economic reason to invest in automation, businesses fail to increase the productivity of the labor process. Perhaps unexpectedly, under this scenario we should expect high levels of employment as businesses seek to maximize the use of cheap labor rather than investing in new technology.

This is more than a hypothetical scenario, as it rather accurately describes the situation in the UK today. Since the 2008 crisis, real wages have stagnated and even fallen. Real average weekly earnings have started to rise since 2014, but even after eight years they have yet to return to their pre-crisis levels. This has meant that businesses have had incentives to hire cheap workers rather than invest in machines—and the low levels of investment in the UK bear this out. Since the crisis, the UK has seen long periods of decline in business investment—the most recent being a 0.4 percent decline between Q1 2015 and Q1 2016. The result of low levels of investment has been virtually zero growth in productivity: from 2008 to 2015, growth in output per worker has averaged 0.1 percent per year. Almost all of the UK’s recent growth has come from throwing more bodies into the economic machine, rather than improving the efficiency of the economy. Even relative to slow productivity growth across the world, the UK is particularly struggling.

With cheap wages, low investment and low productivity, we see that companies have instead been hiring workers. Indeed, employment levels in the UK have reached the highest levels on record—74.2 percent as of May 2016. Likewise, unemployment is low at 5.1 percent, especially when compared to the UK’s neighbors in Europe, who average nearly double that level. So, somewhat surprisingly, an environment with a weak labor movement leads here to high levels of employment.

What is the quality of these jobs, however? We have already seen that wages have been stagnant, and that two-thirds of net job creation since 2008 has been in self-employed jobs. Yet there has also been a major increase in zero-hour contracts (employment situations that do not guarantee any hours to workers). Estimates are that up to 5 percent of the labor force is in such situations, with over 1.7 million zero-hour contracts out. Full-time employment is down as well: as a percentage of all jobs, its pre-crisis levels of 65 percent have been cut to 63 percent and have refused to budge even as the economy grows (slowly). The percentage of involuntary part-time workers—those who would prefer a full-time job but cannot find one—more than doubled after the crisis, and has barely begun to recover since.

Likewise with temporary employees: involuntary temporary workers as a percentage of all temporary workers rose from below 25 percent to over 40 percent during the crisis, only partly recovering to around 35 percent today. There is a vast number of workers who would prefer to work in more permanent and full-time jobs, but who can no longer find them. The UK is increasingly becoming a low-wage and precarious labor market—or, in the Tories’ view, a competitive and flexible labor market. This, we would argue, is the future that obtains with a weak labor movement: low levels of automation, perhaps, but at the expense of wages (and aggregate demand), permanent jobs and full-time work. We may not get a fully automated future, but the alternative looks just as problematic.

These are therefore the two poles of possibility for the future of work. On the one hand, a highly automated world where workers are pushed out of much low-wage non-routine work and into lower-wage care work. On the other hand, a world where humans beat robots but only through lower wages and more precarious work. In either case, we need to build up the social systems that will enable people to survive and flourish in the midst of these significant changes. We need to explore ideas like a Universal Basic Income, we need to foster investment in automation that could eliminate the worst jobs in society, and we need to recover that initial desire of the labor movement for a shorter working week.

We must reclaim the right to be lazy—which is neither a demand to be lazy nor a belief in the natural laziness of humanity, but rather the right to refuse domination by a boss, by a manager, or by a capitalist. Will robots take our jobs? We can only hope so.

Note: All uncited figures either come directly from, or are based on authors’ calculations of, data from the Bureau of Labor Statistics, O*NET and the Office for National Statistics.

Fort Lauderdale Shooting: FBI Involvement in Another Act of Violence


By James Henry

Source: Who.What.Why.

Two months before Esteban Santiago opened fire with a semi-automatic pistol at Fort Lauderdale’s airport Friday, killing five and injuring six, he underwent an “assessment” by the Federal Bureau of Investigation (FBI).

This procedure, which can involve intrusive investigations and interrogation, ended with the Bureau finding that Santiago had committed no crimes and had no ties to terrorism.

A growing number of these incidents exhibit the same disturbing feature: the FBI and/or other federal agencies had prior knowledge of the perpetrators. And there’s another common thread: the FBI’s ex post facto explanations of those interactions do not make a lot of sense. What is never raised is the possibility that the government’s actions are actually pushing already unstable people over the edge.

The phenomenon has become so common that even mainstream outlets like Fox News have taken to calling people like Santiago “Known Wolves.” However, the problem is usually framed as one of law-enforcement agencies “hamstrung” by “politically correct” culture and outdated “civil liberties” limits placed upon investigators. Issues of who should and who should not be given access to guns inevitably tops the discussion.

Despite all the focus on “known wolves” like Santiago, one line of questioning is seldom pursued: What exactly took place during their interactions with government investigators, and how likely is it that these government actions made violence more probable in the future?

Soon after the shooting, the FBI told reporters that two months earlier Santiago had walked into the Anchorage FBI office and made “disturbing” remarks about hearing voices, and being forced to watch ISIS videos. He seemed “agitated and incoherent,” while maintaining “that his mind was being controlled by a US intelligence agency.” They confiscated his gun, which was registered to him.

The FBI, after deciding he had broken no laws and had no terrorist ties, turned him over to the local police who had him hospitalized briefly.

Anchorage police Chief Chris Tolley said “Santiago was having terroristic thoughts and believed he was being influenced by ISIS.” Nevertheless, after undergoing some sort of psychiatric evaluation, he “was not adjudicated mentally ill” — and they returned his 9mm Walther to him.

Federal law-enforcement sources told NBC News that they believe it was the same gun he allegedly used in the airport shootings.

After the FBI’s “assessment” was complete, Santiago flew from Anchorage last week, ultimately ending his trip at the Fort Lauderdale, Florida, airport.

Mind Control

While the very mention of “mind control” being conducted by a “US intelligence agency” conjures images of wild-eyed paranoia, and is thus discounted out of hand, there is in fact a long and sordid history of efforts by national security agencies to manipulate individuals for various reasons. “Psychological manipulation” may be a more apt term.

Indeed, there appears to be a pattern emerging: more and more disturbed individuals who commit mass atrocities had many prior interactions with national security agents.

Ted Kaczynski, infamously known as the “UnaBomber,” was the victim of a CIA-funded MK-ULTRA psychological experiment when he was an undergraduate at Harvard University. Part of the experiment involved abusive and humiliating interrogations. Understandably, many familiar with the case have wondered whether this abuse led him to later commit acts of anonymous terror.

Similarly, is it possible that Santiago’s interactions with the FBI or some other federal agency pushed him to the tipping point?

The record shows that various federal agencies have taken investigative interest in Santiago over the last few years. He was investigated by “Homeland Security Investigations” for child pornography in either 2011 or 2012, law-enforcement sources told a local CBS affiliate in Miami. Three weapons and a computer were seized, but there was not enough evidence to prosecute.

A “US military official” also told NBC Nightly News that Santiago, a veteran who served during the war in Iraq, was “being tracked” by Army Criminal Investigation Command because of “psychological issues.”

The FBI, for its part, claims to have conducted an “assessment” of him after its interaction with Santiago in Anchorage in November.

We don’t know — and likely never will know — what those investigations looked like. The agencies involved almost never divulge “sources and methods.” We do know that as a result of his interaction with the FBI, Santiago was sent to an as yet unnamed mental health facility where he underwent some kind of “psychological treatment.” Since he was an Army veteran, it’s likely the Veterans Administration was involved.

An assessment, usually cited by the FBI as the “least intrusive” level of investigation done by the Bureau, can nonetheless be very intrusive. According to an ACLU fact sheet, FBI assessments can include:

collecting information from online sources, including commercial databases.

recruiting and tasking informants to gather information about you.

using FBI agents to surreptitiously gather information from you or your friends and neighbors without revealing their true identity or true purpose for asking questions.

having FBI agents follow you day and night for as long as they want.

The FBI can also conduct an assessment on an individual just to see if he or she would make a good informant — regardless of whether that person is suspected of a crime.

Could these government intrusions push an already unstable person further into paranoia or delusion? Conscientious investigators would surely take care not to “set off” paranoid individuals who have been targeted for investigation. But it is not hard to imagine careless or unscrupulous investigators pushing too hard — particularly if the investigation involved anything touching on “national security.”

Assessing What, Exactly?

It’s worth noting that the FBI had also conducted an assessment of Tamerlan Tsarnaev, the “mastermind” behind the Boston Marathon bombing who died in a gunfight with police.

Attorneys for his younger brother, Dzhokhar, who was convicted and sentenced to death in 2015, wrote in court documents that Tamerlan Tsarnaev’s interactions with the Bureau “were among the precipitating events for Tamerlan’s actions during the week of April 15, 2013.” Family members and “other sources” told Dzhokhar’s defense team that the FBI tried to pressure Tamerlan into becoming an informant.

Dzhokhar’s lawyers suggested that Tamerlan Tsarnaev’s interactions with the Bureau could have “increased his paranoia and distrust.”

We also know that an undercover FBI agent goaded Elton Simpson to “tear up Texas” shortly before he and his roommate, Nadir Soofi, shot up a “Draw Mohammed” contest in Garland, Texas, on May 3, 2015. Hours before the event, the FBI sent a bulletin to local police warning that Simpson was “interested in the event.”

Even more troubling, there was an undercover FBI agent at the event communicating about security measures with a third individual, whom agents knew had been in contact with one of the shooters.

All this information was only made public because some of the agent’s text messages were quoted in court documents.

Arun Kundnani, lecturer on terrorism studies at New York University, told The Intercept about the incident:

The FBI uses informants and undercover agents to pressure suspected ISIS sympathizers into committing acts of violence, so that they can then be prosecuted. The Garland shooter case is the most striking illustration yet of the dangers of this approach. Essentially, it suggests the government may be manufacturing the very threat it is supposed to be countering.

The list goes on: Omar Mateen, the Pulse nightclub shooter; Ahmad Khan Rahimi, the NY/NJ bomber; Usaama Rahim, shot dead after he went after police with a knife in Boston; Army Maj. Nidal Hasan, the Fort Hood shooter; and Wasil Farooqui, who attacked two random people with a knife in Virginia — all had interactions with the Bureau before seemingly going berserk.

FBI CYA

“The FBI failed there… The federal government already knew about [Santiago’s claim that the CIA was making him watch ISIS videos] for months, they had been evaluating him for a while, but they didn’t do anything,” the accused shooter’s brother Bryan Santiago told the Associated Press.

In what has become almost a boilerplate description of these assessments, the FBI told reporters that it had investigated Santiago, conducted “interagency checks” and done “database reviews.”

“During our initial investigation we found no ties to terrorism,” Special Agent Ritzman told reporters. “He broke no laws when he came into our [Anchorage] office making disjointed comments about mind control.”

But as we’ve seen time and again, it’s the FBI’s statements about its interactions with a soon-to-be-violent perpetrator that are disjointed. (Read this for an in-depth analysis and comparison of the FBI’s explanation of its interactions with one of the “Boston bombers” and the more recent “NY/NJ Bomber.”)

Note the specific reference to terrorism in the FBI statement. The implication is that Santiago could not have been investigated further because no direct link to terrorism was found. But he told them he had been watching ISIS videos, so there was a link.

In fact, the FBI routinely goes after people for similar activity. Since 9/11 the Bureau has been repeatedly accused of creating elaborate, time-consuming stings to entrap individuals who, the agency believes, might commit an act of violence in the future — on no more evidence than social media rants and the like.

Another curious discrepancy in the FBI report: the agency claims that Santiago said in November he didn’t want to hurt anyone, but since he had recently been arrested for domestic violence, there was reason to suspect he was capable of such action.

Maybe “mind control” is too strong a term to describe what these individuals experienced at the hands of government investigators. But whatever is going on in the shadows, it is not ending well for the rest of us.

In a highly indebted world, austerity is a permanent state of affairs


By Mark Blyth

Source: Aeon

By 2010, everyone had heard the ‘austerity’ rallying cry. Immediately following the 2008 financial crisis, especially in Europe, it resounded: ‘Stimulate no more, now is the time for all to tighten!’ And tighten governments did, cutting public expenditure across continental Europe, and in the United Kingdom and the United States.

The logic behind ‘austerity’ holds that ‘the market’ – which the public had just bailed out – did not like the debt incurred when states everywhere rescued and recapitalised their banking systems. Unsurprisingly, tax revenues fell as the economy slowed and state expenditures rose. And what were once private debts on the balance sheets of banks became public debt on the balance sheet of states. Given this sorry state of affairs, states (policymakers and business leaders argued) had to take action to restore ‘business confidence’ – which is apparently always and everywhere created by cutting government spending. So governments cut.

Public debt, however, grew, because economies got smaller and grew more slowly the more they cut. The ‘confidence fairy’, as Paul Krugman named the expected effect, simply failed to show up. Why?

The reason is simple – and it is surprising anyone thought that anything else would happen. Imagine an economy as a fraction, with a numerator and a denominator. Make total debt 100 and stick that on the top (the numerator). Make Gross Domestic Product (GDP) 100 and stick that on the bottom (the denominator) to give us a 100 per cent debt-to-GDP ratio. If you cut total spending by 20 per cent to restore ‘confidence’, the economy is ‘balanced’ at 100/80. That means the debt-to-GDP ratio of the country just went up to 125 per cent, all without the government issuing a single cent of new debt.
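To make that arithmetic concrete, here is a minimal sketch in Python using the same toy figures as the paragraph above (illustrative numbers, not real data):

```python
# Toy debt-to-GDP arithmetic from the paragraph above (illustrative figures only).
debt = 100.0  # total public debt: the numerator
gdp = 100.0   # Gross Domestic Product: the denominator

print(f"before cuts: {debt / gdp:.0%}")             # 100% debt-to-GDP

gdp_after_cuts = gdp * 0.80                          # spending cuts shrink GDP by 20 per cent
print(f"after cuts:  {debt / gdp_after_cuts:.0%}")   # 125% debt-to-GDP, no new debt issued
```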

In short, cuts to spending in a recession make the underlying economy contract. After all, government workers have lost jobs or income, and government workers not shopping has the same effect as private sector workers not shopping. So the debt goes up as the economy shrinks further. States respond by cutting spending further. The pattern continues.
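The same logic, iterated, is what drives the spiral just described. A rough, purely hypothetical sketch, assuming each new round of cuts knocks a further 10 per cent off GDP while the debt stock stays fixed:

```python
# Hypothetical austerity spiral: the debt stock never changes,
# but each round of cuts shrinks GDP, so the debt-to-GDP ratio keeps climbing.
debt, gdp = 100.0, 100.0
for round_of_cuts in range(1, 4):
    gdp *= 0.90  # assume every round of cuts shrinks GDP by another 10 per cent
    print(f"round {round_of_cuts}: debt-to-GDP = {debt / gdp:.0%}")
# round 1: debt-to-GDP = 111%
# round 2: debt-to-GDP = 123%
# round 3: debt-to-GDP = 137%
```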

Having a common currency among different countries actually aggravates the problem, because cuts in one state reverberate through many states, depressing them all. In 2008, euro area government debt as a share of the economy, including that of the already profligate Greeks, averaged around 65 per cent of GDP. Following budget cuts and monetary tightening (the European Central Bank twice pushed up interest rates in 2011), euro area government debt had risen, by 2014, to 92 per cent of GDP.

Greece is the poster child for this ‘denominator effect’. Under the auspices of ‘bailouts’ from the IMF and the EU, Greece cut more than 20 per cent of GDP in spending. It lost nearly 30 per cent in final consumption. Yet its debt ratio increased from 103 per cent in 2006 to more than 180 per cent by 2014. That’s an increase of nearly 80 percentage points in the debt ratio while spending was being cut.

Let’s look at the originating question again: how is destroying a third of the economy supposed to inspire consumer and business confidence? It won’t – unless you are a creditor – and that’s where the politics comes in.

If you are a holder of government debt (a creditor), three things hurt the value of your asset: if the inflation rate goes above the interest rate on your bond; if the exchange rate moves against you so that what the bond is worth vis-à-vis other currencies falls; and, of course, default – if the government takes the money and runs.

In the post-crisis world, despite major central banks putting trillions of dollars into the global money supply, there is almost no inflation anywhere in the developed world. Exchange rates (Brexit effects apart) are comparatively stable, and in any case currencies can only move relative to one another, so that’s not a huge worry. If the country whose debt you hold can have elections, and the public dares to vote against more budget cuts, the European Central Bank will shut down their banking system to make them revisit their choices. That’s what it did to Greece in the summer of 2015.

In this world, our present world, creditors will get paid and debtors will get squeezed. Budgets will be cut to make sure that bondholders get their money. And, in a highly indebted world, austerity – introduced as an ‘emergency’ measure to save the economy, to right the fiscal ship – becomes a permanent state of affairs.

As Britain’s former prime minister David Cameron said (standing beside a throne in a white bow-tie and tails) in 2013: ‘We need to do more with less. Not just now, but permanently.’ But here’s the question hidden in that blithe statement – are you and I part of the ‘we’ here?

Let’s go back to the huge jump in public debt that occurred when governments, i.e. the people, bailed out the banks. To those who hold it, that debt was not, and is not, a liability. As difficult as it can be to make this reality part of the political conversation, public debt is an asset. Even at today’s low rates, it earns interest and retains value. No one is forced to invest in public debt, but every time bonds are issued investors show up and buy them by the truckload. By market criteria, public debt is a great investment.

But who pays for it? The taxpayer; more generally, those who effectively pay for the debt by going without the government services that have been cut. Basically, in most countries, this means that the bottom 70 per cent of the income distribution bears the cost of servicing the public debt.

Over the past 25 years, to make up for chronically low wage growth, that same 70 per cent of the population has increased its personal indebtedness. Massively. Which means that in an economy deformed by austerity, they are the ones paying out – twice. With stagnant or declining wages, they have to service both the massive private debt they have accumulated to live and the public debt issued in their name.

Meanwhile, those whose assets the public bailed out – those with investible wealth, those who hold ‘all that debt’ and make money from it – do not suffer from the decline in public spending. Since they are net lenders, the hike in personal indebtedness does not trouble them either.

The result, and the situation in which we find ourselves, is a classic bad equilibrium. Those who can’t pay, and don’t earn enough, are being asked to pay the most to service debt, from which they do not and will not benefit. Those who can pay, and earn almost all the income, both contribute the least and benefit the most from ‘all that debt’.

Strip away all the electoral politics at the moment in the US, the UK, Italy, Spain and elsewhere, and that’s the underlying political economy. It’s a creditor/debtor stand-off where the creditors have the whip hand.

And yet, the more they crack the whip, the more the backlash against austerity, in all its forms, gains strength. Donald Trump, Jeremy Corbyn, Marine Le Pen, Pablo Iglesias: Left or Right, they are all riding debtor anger against creditor strength. It might be expressed as anger against, variously, ‘trade’ or ‘the elite’ or the ‘EU’. But what’s underneath all that is the politics of debt.

This is the ‘new normal’. It’s not about flat interest rates or anaemic growth rates. They are the consequences of austerity, not its causes. The new normal is the new politics of debtors versus creditors. It’s here to stay. As we already can see, it’s going to be anything but normal.

How Pure is Your Hate?


By Paul Street

Source: CounterPunch

Fellow workers and citizens, how pure is your hatred? It’s easy to hate on openly authoritarian, loathsome, right-wing political personalities and institutions like Ronald Reagan, George W. Bush, Donald Trump, the Koch brothers, Paul Ryan, the Republican Party, the Heritage Foundation, the American Enterprise Institute, Breitbart News, and FOX News. There’s no serious mystery over what those malicious people and entities are about: the ever upward distribution of wealth and power.

The bigger tests are supposedly liberal and progressive personalities and institutions like Barack Obama, the Clintons, Nancy Pelosi, the Democratic Party, George Soros, the Brookings Institution, the Center for American Progress, the “Public” Broadcasting System (“P”BS), the Washington Post, MSNBC, and the New York Times.

These people and organizations are no less committed than their more transparently right-wing counterparts to the nation’s unelected deep state dictatorships of money, empire, and white supremacy, but their allegiance and service to the nation’s reigning oppression structures and ideologies is cloaked by outwardly multicultural, liberal, and even progressive concern for the poor and nonwhite.

“What’s the Something Much Better?”

I was reminded of this distinction for the five thousandth time last Thursday while watching Council on Foreign Relations (CFR) member and PBS NewsHour host Judy Woodruff interview the longtime Senior Obama Advisor and intimate Obama family mentor and confidant Valerie Jarrett.

Read the following passage from the interview last week and then tell me, please, to quote Alexander Cockburn, “is your hate pure?”

Judy Woodruff, CFR and “P”BS:  Just last night, the United States Senate took another step toward repeal of Obamacare, the Affordable Care Act. There was a budget vote, which is going to lead to other steps, which will lead to repeal. Just yesterday, the president-elect called Obamacare a complete and total disaster.

Valerie Jarrett, White House: I think it’s very easy to say repeal and replace, but we have been encouraging the Republicans, since the president first started embarking on this effort, to put in place a plan for affordable care to come up with their best ideas. And they have had, what, 50, 60 votes to repeal, and not a single replacement plan. So…

Woodruff: Well, they say that’s what they’re going to do. They’re going to get rid of what’s there now and replace it with something much better.

Jarrett: Well, what’s the something much better? That’s my question. That’s the question the president has been asking for eight years right now. So, if there is a something better, let’s hear it. What’s the secret?

Obama, 2003: “What I’d Like to See”

After this exchange, Woodruff moved off the health care topic, with no follow up. That was a statement in itself.  Surely any reasonably informed “public” media journalist would be aware that national Canadian-style single-payer health insurance – Improved Medicare for All – has long been backed by most Americans.  Such a journalist would know that single-payer would provide comprehensive coverage to all the nation’s many millions of uninsured and under-insured while retaining free choice in doctor selection and being the most cost-effective way to go thanks to the elimination of private for-profit insurance corporations’ parasitic control over the system.

A knowledgeable “public” journalist might even know that then state senator Barack Obama endorsed single payer on these very grounds as late as the summer of 2003, when he said the following to the Illinois AFL-CIO:

“I happen to be a proponent of a single payer universal health care program. I see no reason why the United States of America, the wealthiest country in the history of the world, spending 14 percent of its Gross National Product on health care cannot provide basic health insurance to everybody. And that’s what Jim is talking about when he says everybody in, nobody out. A single payer health care plan, a universal health care plan. And that’s what I’d like to see.”

Obama would quickly drop those sentiments in the interest of getting campaign backing from the nation’s giant insurance and drug companies and their Wall Street investors on his path to the U.S. Senate and the presidency.

Right after he entered the White House, Obama set up a health care reform task force chock-full of big insurance company representatives. Not one of the more than 80 members of the U.S. House of Representatives who had endorsed single payer – not even the veteran Black Congressman John Conyers, author of a House single payer bill – was invited to participate.

A Sicko Game

The outcome was the so-called Affordable Care Act (later dubbed “Obamacare”), a complicated and corporatist bill based on a Republican plan drawn up by the right-wing Heritage Foundation.  Since it left the price- and premium-gouging and profit-taking power of the big insurance and drug syndicates intact, the ACA condemned a vast swath of the nation to continuing inadequate and unaffordable coverage – this while the right-wing noise machine has absurdly railed against “socialized health care.”

Along the way, the new neoliberal president played a sicko (yes, Michael Moore) game to sell his Heritage Foundation bill, promising citizens that his plan would include a public option while having already traded that policy away to get for-profit hospitals to back the ACA. As Miles Mogulescu reported on Huffington Post and as the New York Times confirmed, “Obama made a backroom deal…with the for-profit hospital lobby that he would make sure there would be no national public option in the final health reform legislation…Even while President Obama was saying that he thought a public option was a good idea and encouraging supporters to believe his healthcare plan would include one,” Mogulescu noted, “he had promised for-profit hospital lobbyists that there would be no public option in the final bill.”

We can be certain that the veteran agent of neoliberal mendacity Valerie Jarrett advised Obama to take this deeply duplicitous path.

The Memory Hole

It’s quite remarkable how completely the dominant “mainstream” media-politics culture manages to throw majority-supported social-democratic policy proposals down George Orwell’s memory hole.

Listening to the Woodruff-Jarrett conversation, you’d think Bernie Sanders had never spoken to giant and enthusiastic crowds on behalf of single payer last year.

You’d think Conyers had never drafted single-payer legislation backed by a considerable number of U.S. Congressmen.

You’d think that Canada and most of the industrialized world had never successfully implemented widely popular nationwide systems of universal governmental health insurance.

You’d think single-payer didn’t have millions of citizen backers – including many thousands of doctors and National Nurses United – from coast to coast.

You wouldn’t imagine that even Donald Trump has mused that single-payer might be the best way to fund health insurance for all.

“So, if there is a something better, let’s hear it. What’s the secret?”

Unreal.

It reminds me of Hillary Clinton’s response as head of newly elected U.S. President Bill Clinton’s health care task force when Dr. David Himmelstein, the head of Physicians for a National Health Program, told her about the incredible possibilities of a comprehensive, single payer “Canadian style” health plan, supported by more than two-thirds of the U.S. public and certified by the Congressional Budget Office as “the most cost-effective plan on offer.”

“David,” Hillary (Michael Moore’s heart throb) commented with fading patience before sending him away in 1993, “tell me something interesting.”

That’s right: tell me something interesting.

Along with the big insurance companies the Clintons deceptively railed against, the co-presidents Bill and Hill decided from the start to exclude the popular health care alternative – single payer – from the national health care “discussion.”  What she advanced instead of the system that bored her was a hopelessly complex and secretly developed program called “managed competition.” Interesting. Obama would have more success with his Heritage Foundation-developed update in 2009 and 2010.

And they wonder why Trump won.

Paul Street’s latest book is They Rule: The 1% v. Democracy (Paradigm, 2014)

Phantom Democracy in the Age of the Internet


By Nozomi Hayase

Source: Dissident Voice

After the Electoral College vote, the Trump presidency is now official. As denial and blame games continue, it becomes clear this was not a foreign government coup d’état. The truth is that democracy in America has been rotten to the core for decades. It is meddled with by corporate lobbyists, Big Pharma, Big Oil and Wall Street – those who are addicted to money and power.

American democracy is hollowed out, veiled with a loud media echo chamber, bringing feigned solidity to its emptiness. Out of this vacuum emerges a madness for power. U.S. politics is a contest of those who are driven by insatiable hunger – the most callous, cunning and manipulative people in society.

In this system, only people who lack empathy and advance self-serving agendas without concerns for others can rise to the top. The results of this year’s presidential election may mean that this person who many saw as ‘unfit to be president’ was better suited to play this dirty game than his opponent, Hillary Clinton.

Ascent of Trump

Donald Trump, a perceived outsider, seemed to appear out of nowhere. The former producer of the American reality show The Apprentice sniffed out the vulnerability of disenfranchised Americans who are continually betrayed by the establishment. He then quickly moved in for the kill, turning the electoral arena into a new Reality TV show.

With social media as a hunting ground, this new Republican contender made direct connection with his audience, pouring out charm and grooming them with fake promises. By deploying words as weapons of control, he managed to garner favorable reactions from his followers. His language cast a magic spell where contradictory remarks and lies bypassed critical examination. Emotions triumphed over reason and under the grip of irrational logic, facts no longer seemed to matter. With a chameleon-like ability to shape-shift and say whatever voters wanted to hear, he was able to create a mirage and ensnare the populace into a grandiose fantasy.

What was the press, as a supposed watchdog of power, doing during Trump’s uncanny rise in popularity? Mainstream media did nothing to prevent it and instead facilitated the process. His bombastic comments hit the ratings jackpot in the corporate media, and rhetoric not bound by facts was not only tolerated but actively promoted, in keeping with a shortsighted mentality of profit at any cost.

WikiLeaks and the Democratizing Power of the Internet

This same corporate media also buried a few important facts regarding the 2016 U.S. presidential election. This year’s election was an unprecedented phenomenon. This is not only because the lesser evil game was fought between two of the historically most disliked candidates, but also because of the role played by a new actor from outside of the U.S. electoral arena. Days before the election, a Forbes article acknowledged the significance of WikiLeaks’ DNC emails, calling them a “Holy Grail of understanding of U.S. electoral politics.” It noted how “few understand the importance of WikiLeaks in the eventual writing of the history of presidential politics.”

WikiLeaks has shown that, in the presence of a truly free press, elections will never be the same as before. U.S. politics sponsored by corporate masters creates a milieu of deception, lies and fraud that is fraught with corruption. These power-driven politicians can only thrive in secrecy. When their actions are exposed, like Hillary’s highly paid Goldman Sachs speeches, the crafted public images that suck the masses into their illusions of grandeur tend to shatter. Contrary to hysterical rants of ‘Russia hacked the election!’, the defeat of the Clinton dynasty was a testimony to the power of transparency.

WikiLeaks, the world’s first global 4th estate, which operates outside of any government, was birthed on the Internet. It showed the potential for emancipation unleashed by the Net. Much of the force of democratization on the Internet is being subverted to create mass surveillance and censorship. Yet at the same time, its effect of empowering ordinary people cannot be denied.

In fact, Bernie Sanders’ campaign was built on social media’s grassroots organizing. With independent campaign funding, this virtually unknown senator from Vermont successfully sparked the idea of socialism and raised issues of Wall Street corruption, economic injustice and poverty at a national level. Sanders’ largest support came from millennials. It was these natives of the Internet that galvanized his political revolution.

Fake News and Fake Authority

Democrats appear to be disconnected from this new reality of the Internet’s bottom-up, spontaneous crowd gathering, or, even worse, to be adversaries to it. This was shown in their reaction to the corruption revealed in the DNC email database and to Trump’s winning of the election.

On the second day of the Democratic National Convention, hundreds of Sanders delegates who learned about the DNC’s rigging of the primary walked out in protest. Chanting “This is what Democracy looks like!”, they vowed not to go with Hillary. This crisis of the American political system opened up an opportunity for real democracy. But then, Bernie turned away, urging his supporters to nominate Hillary, and sided with the corrupted Democratic Party. His failure to seize this historical moment helped throw the election to Trump, whom the Clinton campaign had portrayed as a ‘pied piper candidate’.

After all this came the Fake News explosion. Some established liberal media, freaked out by the country quickly turning red in this Republican takeover, created a new red scare. On November 24, an article in The Washington Post made wild accusations that Russia had engaged in propaganda during the election to spread ‘fake news’ in favor of Trump. The anonymous site cited in the article, which claimed to have identified these fake news outlets, was shown to be nothing but a blacklist that labels anyone who challenges the official narrative as untrustworthy, or even insinuates that they are Russian agents, spies or traitors.

Despite U.S. Intelligence Chief James Clapper’s claim that intelligence agencies lacked strong evidence for WikiLeaks’ connection with an alleged Russian cyberattack, it was way easier for progressives to ignore facts and spread paranoia, blaming the loss of Clinton on anyone but themselves.

In the age of the Internet, fake news can easily be manufactured and spread. Yet, at the same time, it can also be shut down by the countering views that surround it. In this new environment, traditional media is also losing its monopoly on disseminating information; it can no longer claim to be the sole purveyor of truth. In the case of the Washington Post’s fake news scandal, The Intercept and Matt Taibbi of Rolling Stone quickly denounced and challenged its claims, stopping the ‘fake news’ story in its tracks. Social media networks also countered the gatekeepers who tried to dictate what is real by filtering out views that challenge the official narrative. In the end, the fake news article was debunked, with WaPo issuing a correction shortly after its publication. What this showed is the publisher’s false authority and the establishment’s desperate attempt to reassert its shrinking legitimacy and keep people within its sphere of influence.

From Regime Change to Game Changer

The election is over and liberals’ hope to stop the rise of demagoguery is fading. The president-elect began recruiting his rich buddies into his cabinet. Recently, he convened a group of Silicon Valley tech leaders to invite them into his new ‘construction project to rebuild America’. As this void of American democracy is being filled with more blatant patronage networks, new insurgencies of civic power are also arising. The potent and creative power of the Internet is already here. Those who have experienced it will not easily succumb to the reality being handed down to them from the teetering Trump Tower.

Just as the power of the Internet can be used by the oligarchic class to corral the masses, it can also be used to empower the people, through its open network. When the liberating force of a free net is claimed by citizens to create movements across borders, linking diverse struggles, it can give all a chance to not only change a regime, but to change the game altogether.

One game changer is WikiLeaks. With the creative use of technology, this Internet of the media built a robust network that is resistant to censorship, making it possible for the organization to be free from state and corporate influence, allowing it to truly serve the interests of the people. It has gained its own credibility through a perfect record of authentication of documents and rigorous scientific journalism that publishes full and verifiable archives. Despite corporate media’s smearing of the organization, public opinion polls indicate that Americans strongly approve of WikiLeaks’ Podesta leaks.

Another democratic tool that is available to people everywhere is cryptocurrency like Bitcoin. With this new invention, ordinary people now have power to create their own money and peer-to-peer networks that are not intermediated by any governments, banks or corporations. Just as WikiLeaks distributes free speech beyond borders and lets truth be discovered through each individual’s participation, with Bitcoin, free speech becomes an app that can be downloaded from anywhere by anyone and values are created through people transacting freely, verified by a consensus of equal peers.

In Their Nothingness, We Find Our Power

On January 20, 2017, Trump will be sworn in with the Oath of Office. The White House will become his new executive boardroom. With this United States Incorporated, the Constitution may be slowly shredded off from his business contract. With the president-elect’s proposal on Twitter of penalties, including jail time or loss of citizenship, for burning the American flag, coupled with his recent call for the expansion of nuclear weapons, many are rightfully fearful of the future.

Yet, wars and destruction of civil society are already happening around the world. Crackdowns on cash and schemes of demonetization are taking place in countries like Venezuela and India. When faced with the reality of their national currencies quickly disappearing or losing value, people are waking up to the fact that these claimed values are fake and that they are not backed by real economic activity or anything of true value. More and more people are seeing bubbles pumped up by toxic assets and fraud of financial engineering that rent-seeks earnings of hard working people and creates money out of thin air.

In his speech “Currency Wars and Bitcoin’s Neutrality”, technologist and author Andreas Antonopoulos spoke of how “cash is being eradicated around the world as a scourge.” He then pointed out how governments are waging currency wars against other countries and their own people in order to benefit from a crisis they artificially created. He emphasized how governments and central banks can’t win this game, because “cash is something that we can create, electronic cash, self sovereign cash, digital cash – Bitcoin.” He then noted how this math-based ‘Internet of money’ offers an exit from this old world of currency wars. He alerted the Bitcoin community that as the battle intensifies, those who create a new infrastructure as an exit from nation-state gated economies, and those who point to this exit, will be called traitors, criminals, thugs and terrorists.

This war on cash, and the censorship waged under cover of Fake News memes, are attacks on our fundamental freedoms. It is a battle for truth, involving the question of who will define our human reality. This war is now full on, yet mostly brewing beneath the radar. Just before Christmas, President Obama quietly signed into law the 2017 National Defense Authorization Act. This included the ‘Countering Disinformation and Propaganda Act’, which was presented as a way to counter foreign enemy propaganda, yet is actually a McCarthy-era-style censorship law.

We live in a time when traditional authority and leaders have failed us and there is vacuousness in this space where a center used to hold. In the story of Faust, Goethe wrote about a universal man following his thirst for knowledge. In this journey, Dr. Faust meets Mephisto (the devil) who tried to trick and tempt him to come under his control. In the scene A Dark Gallery, Faust told Mephisto, “In your Nothingness I hope to find my All”. He then took the key and entered into this mysterious unknown.

Our quest for real democracy invokes this thirst for knowledge. It invites us all to enter into the realm of Nothingness. We no longer want to believe; we want to know. We no longer blindly accept a world conceived by a few elites. Now, in this chaos and abyss we are descending into, we may be able to find the real source of our own legitimacy. With knowledge that springs from deep within, we are able to penetrate the deception of those who seek to control us and recognize their actual emptiness. In their nothingness, we can find the creative power that has always been there, power that can bring life back to this phantom of democracy.

 

Nozomi Hayase, Ph.D., is a writer who has been covering issues of freedom of speech, transparency, and decentralized movements. Her work is featured in many publications. Find her on twitter @nozomimagine. Read other articles by Nozomi.

We Have Met the Alien and He Is Us


By William Astore

Source: TomDispatch.com

We Are The Empire
Of U.S. Military Interventions, Alien Disaster Movies, and Star Wars
By William J. Astore

Perhaps you’ve heard the expression: “We have met the enemy and he is us.” Cartoonist Walt Kelly’s famed possum, Pogo, first uttered that cry. In light of alien disaster movies like the recent sequel Independence Day: Resurgence and America’s disastrous wars of the twenty-first century, I’d like to suggest a slight change in that classic phrase: we have met the alien and he is us.

Allow me to explain. I grew up reading and watching science fiction with a fascination that bordered on passion. In my youth, I also felt great admiration for the high-tech, futuristic nature of the U.S. military. When it came time for college, I majored in mechanical engineering and joined the U.S. Air Force. On graduating, I would immediately be assigned to one of the more high-tech, sci-fi-like (not to say apocalyptic) military settings possible: Air Force Space Command’s Cheyenne Mountain.

For those of you who don’t remember the looming, end-of-everything atmosphere of the Cold War era, Cheyenne Mountain was a nuclear missile command center tunneled out of solid granite inside an actual mountain in Colorado. In those days, I saw myself as one of the good guys, protecting America from “alien” invasions and the potential nuclear obliteration of the country at the hands of godless communists from the Soviet Union. The year was 1985 and back then my idea of an “alien” invasion movie was Red Dawn, a film in which the Soviets and their Cuban allies invade the U.S., only to be turned back by a group of wolverine-like all-American teen rebels. (Think: the Vietcong, American-style, since the Vietnam War was then just a decade past.)

Strange to say, though, as I progressed through the military, I found myself growing increasingly uneasy about my good-guy stature and about who exactly was doing what to whom. Why, for example, did we invade Iraq in 2003 when that country had nothing to do with the attacks of 9/11? Why were we so focused on dominating the Earth’s resources, especially its oil? Why, after declaring total victory over the “alien” commies in 1991 and putting the Cold War to bed for forever (or so it seemed then), did our military continue to strive for “global reach, global power” and what, with no sense of overreach or irony, it liked to call “full-spectrum dominance”?

Still, whatever was simmering away inside me, only when I retired from the Air Force in 2005 did I fully face what had been staring back at me all those years: I had met the alien, and he was me.

The Alien Nature of U.S. Military Interventions

The latest Independence Day movie, despite earning disastrous reviews, is probably still rumbling its way through a multiplex near you. The basic plot hasn’t changed: ruthless aliens from afar (yet again) invade, seeking to exploit our precious planet while annihilating humanity (something that, to the best of our knowledge, only we are actually capable of). But we humans, in such movies as in reality, are a resilient lot. Enough of the plucky and the lucky emerge from the rubble to organize a counterattack. Despite being outclassed by the aliens’ shockingly superior technology and awe-inspiring arsenal of firepower, humanity finds a way to save the Earth while — you won’t be surprised to know — thoroughly thrashing said aliens.

Remember the original Independence Day from two decades ago? Derivative and predictable it may have been, but it was also a campy spectacle — with Will Smith’s cigar-chomping military pilot, Bill Pullman’s kickass president in a cockpit, and the White House being blown to smithereens by those aliens. That was 1996. The Soviet Union was half-a-decade gone and the U.S. was the planet’s “sole superpower.” Still, who knew that seven years later, on the deck of an aircraft carrier, an all-too-real American president would climb out of a similar cockpit in a flight suit, having essentially just blown part of the Middle East to smithereens, and declare his very own “mission accomplished” moment?

In the aftermath of the invasion of Afghanistan and the “shock and awe” assault on Iraq, the never-ending destructiveness of the wars that followed, coupled with the U.S. government’s deployment of deadly robotic drones and special ops units across the globe, alien invasion movies aren’t — at least for me — the campy fun they once were, and not just because the latest of them is louder, dumber, and more cliché-ridden than ever. I suspect that there’s something else at work as well, something that’s barely risen to consciousness here: in these years, we’ve morphed into the planet’s invading aliens.

Think about it. Over the last half-century, whenever and wherever the U.S. military “deploys,” often to underdeveloped towns and villages in places like Vietnam, Afghanistan, or Iraq, it arrives very much in the spirit of those sci-fi aliens. After all, it brings with it dazzlingly destructive futuristic weaponry and high-tech gadgetry of all sorts (known in the military as “force-multipliers”). It then proceeds to build mothership-style bases that are often like American small towns plopped down in a new environment. Nowadays in such lands, American drones patrol the skies (think: the Terminator films), blast walls accented with razor wire and klieg lights provide “force protection” on the ground, and the usual attack helicopters, combat jets, and gunships hover overhead like so many alien craft. To designate targets to wipe out, U.S. forces even use lasers!

In the field, American military officers emerge from high-tech vehicles to bark out commands in a harsh “alien” tongue. (You know: English.) Even as American leaders offer reassuring words to the natives (and to the public in “the homeland”) about the U.S. military being a force for human liberation, the message couldn’t be more unmistakable if you happen to be living in such countries: the “aliens” are here, and they’re planning to take control, weapons loaded and ready to fire.

Other U.S. military officers have noticed this dynamic. In 2004, near Samarra in Iraq’s Salahuddin province, for instance, then-Major Guy Parmeter recalled asking a farmer if he’d “seen any foreign fighters” about. The farmer’s reply was as simple as it was telling: “Yes, you.” Parmeter noted, “You have a bunch of epiphanies over the course of your experience here [in Iraq], and it made me think: How are we perceived, who are we to them?”

Americans may see themselves as liberators, but to the Iraqis and so many other peoples Washington has targeted with its drones, jets, and high-tech weaponry, we are the invaders.

Do you recall what the aliens were after in the first Independence Day movie? Resources. In that film, they were compared to locusts, traveling from planet to planet, stripping them of their valuables while killing their inhabitants. These days, that narrative should sound a lot less alien to us. After all, would Washington have committed itself quite so fully to the Greater Middle East if it hadn’t possessed all that oil so vital to our consumption-driven way of life? That’s what the Carter Doctrine of 1980 was about: it defined the Persian Gulf as a U.S. “vital interest” precisely because, to quote former Deputy Secretary of Defense Paul Wolfowitz’s apt description of Iraq, it “floats on a sea of oil.”

Of Cold War Memories and Imperial Storm Troopers

Whether anyone notices or not, alien invasion flicks offer a telling analogy when it comes to the destructive reality of Washington’s global ambitions; so, too, do “space operas” like Star Wars. I’m a fan of George Lucas’s original trilogy, which appeared in my formative years. When I saw them in the midst of the Cold War, I never doubted that Darth Vader’s authoritarian Empire in a galaxy far, far away was the Soviet Union. Weren’t the Soviets, whom President Ronald Reagan would dub “the evil empire,” bent on imperial domination? Didn’t they have the equivalent of storm troopers, and wasn’t it our job to “contain” that threat?

Like most young Americans then, I saw myself as a plucky rebel, a mixture of the free-wheeling, wisecracking Han Solo and the fresh-faced, idealistic Luke Skywalker. Of course, George Lucas had a darker, more complex vision in mind, one in which President Richard Nixon, not some sclerotic Soviet premier, provided a model for the power-mad emperor, while the lovable Ewoks in The Return of the Jedi — with their simple if effective weaponry and their anti-imperial insurgent tactics — were clearly meant to evoke Vietnamese resistance forces in an American war that Lucas had loathed. But few enough Americans of the Cold War-era thought in such terms. (I didn’t.) It went without question that we weren’t the heartless evil empire. We were the Jedi! And metaphorically speaking, weren’t we the ones who, in the end, blew up the Soviet Death Star and won the Cold War?

How, then, did an increasingly gargantuan Pentagon become the Death Star of our moment? We even had our own Darth Vader in Dick Cheney, a vice president who actually took pride in the comparison.

Think for a moment, dear reader, about the optics of a typical twenty-first-century U.S. military intervention. As our troops deploy to places that for most Americans might as well be in a galaxy far, far away, with all their depersonalizing body armor and high-tech weaponry, they certainly have the look of imperial storm troopers.

I’m hardly the first person to notice this. As Iraq war veteran Roy Scranton recently wrote in the New York Times, “I was the faceless storm trooper, and the scrappy rebels were the Iraqis.” Ouch.

American troops in that country often moved about in huge MRAPs (mine-resistant, ambush-protected vehicles) described to me by an Army battalion commander as “ungainly” and “un-soldier like.” Along with M1 Abrams tanks and Bradley fighting vehicles, those MRAPs were the American equivalents of the Imperial Walkers in Star Wars. Such vehicles, my battalion commander friend noted drolly, were “not conducive to social engagements with Iraqis.”

It’s not the fault of the individual American soldier that, in these years, he’s been outfitted like a Star Wars storm trooper. His equipment is designed to be rugged and redundant, meaning difficult to break, but it comes at a cost. In Iraq, U.S. troops were often encased in 80 to 100 pounds of equipment, including a rifle, body armor, helmet, ammunition, water, radio, batteries, and night-vision goggles. And, light as they are, let’s not forget the ominous dark sunglasses meant to dim the glare of Iraq’s foreign sun.

Now, think how that soldier appeared to ordinary Iraqis — or Afghans, Yemenis, Libyans, or almost any other non-Western people. Wouldn’t he or she seem both intimidating and foreign, indeed, hostile and “alien,” especially while pointing a rifle at you and jabbering away in a foreign tongue? Of course, in Star Wars terms, it went both ways in Iraq. A colleague told me that during her time there, she heard American troops refer to Iraqis as “sand people,” the vicious desert raiders and scavengers of Star Wars. If “they” seem like vicious aliens to us, should we be surprised that we just might seem that way to them?

Meanwhile, consider the American enemy, whether the Taliban, al-Qaeda, or any of our other opponents of this era. Typically unburdened by heavy armor and loads of equipment, they move around in small bands, improvising as they go. Such “terrorists” — or “freedom fighters,” take your pick — more closely resemble (optically, at least) the plucky human survivors of Independence Day or the ragtag yet determined rebels of Star Wars than heavy patrols of U.S. troops do.

Now, think of the typical U.S. military response to the nimbleness and speed of such “rebels.” It usually involves deploying yet more and bigger technologies. The U.S. has even sent its version of Imperial Star Destroyers (we call them B-52s) to Syria and Iraq to take out “rebels” riding their version of Star Wars “speeders” (i.e. Toyota trucks).

To navigate and negotiate the complex “human terrain” (actual U.S. Army term) of “planets” like Iraq and Afghanistan, U.S. troops call on a range of space-age technologies, including direction-finding equipment, signal intercept, terrain modeling, and satellite navigation using GPS. The enemy, being part of that “human terrain,” has little need for such technology to “master” it. Since understanding alien cultures and their peculiar “human terrains” is not its forte, the U.S. military has been known to hire anthropologists to help it try to grasp the strange behaviors of the peoples of Planet Iraq and Planet Afghanistan.

Yet unlike the evil empire of Star Wars or the ruthless aliens of Independence Day, the U.S. military never claimed to be seeking total control (or destruction) of the lands it invaded, nor did it claim to desire the total annihilation of their populations (unless you count the “carpet bombing” fantasies of wannabe Sith Lord Ted Cruz). Instead, it promised to leave quickly once its liberating mission was accomplished, taking its troops, attack craft, and motherships with it.

After 15 years and counting on Planet Afghanistan and 13 on Planet Iraq, tell me again how those promises have played out.

In a Galaxy Far, Far Away

Consider it an irony of alien disaster movies that they manage to critique U.S. military ambitions vis-à-vis the “primitive” natives of far-off lands (even if none of us and few of the filmmakers know it). Like it or not, as the world’s sole superpower, dependent on advanced technology to implement its global ambitions, the U.S. provides a remarkably good model for the imperial and imperious aliens of our screen life.

We Americans, proud denizens of the land of the gun and of the only superpower left standing, don’t, of course, want to think of ourselves as aliens. Who does? We go to movies like Independence Day or Star Wars to identify with the outgunned rebels. Evidence to the contrary, we still think of ourselves as the underdogs, the rebels, the liberators. And so — I still believe — we once were, a long time ago in a galaxy far, far away.

We need to get back to that time and that galaxy. But we don’t need a high-tech time machine or sci-fi wormhole to do so. Instead, we need to take a long hard look at ourselves. Like Pogo, we need to be willing to see the evidence of our own invasive nature. Only then can we begin to become the kind of land we say we want to be.

 

A TomDispatch regular, William Astore is a retired lieutenant colonel (USAF) and history professor. He blogs at Bracing Views.