America’s Dystopian Future

By

Source: CounterPunch

Imagine a privatized America where rugged individualism reigns supreme within a vast network of corporate America, Inc., reminiscent of the 19th-century Wild West: no Social Security, no Medicare, no Medicaid, no public law enforcement, as individuals stand their own ground. Read all about it in Scott Erickson’s History of the Decline and Fall of America (Azaria Press, 2018).

Erickson’s newly released semi-fictional satire of American history and its subsequent decline into deepening pits of despair is a sure-fire treasure trove of Americana at its best. It’s a page-turner par excellence, rich in accurate, textured American history and jam-packed with imagery of a dystopian future that is simply unavoidable given America’s character and development over the past two centuries. The die was cast long before the onset of dystopian existence.

The History of the Decline and Fall of America highlights and exposes inherent limitations of democratic capitalism whilst explaining in full living color a future American dystopia that is fully expected based upon America’s beginnings from the time of Captain John Smith at historic Jamestown (1607). The history lesson therein is superb, not missing a beat of what shaped America up to the final tipping point of neoliberal dogma and beyond into a deep dark world order.

This beautifully written and conceived historical fiction is a witty tour de force of America past, present, and future, weaving all of the historical elements into one coherent story, from the widely accepted version of American “business success” in the early period to its wistful morphing, over time, into abject failure!

That process of failure, and its root causes, is what intrigues. For example: “Americans were not only inventing a country but inventing what it meant to be an American.” Indeed, America came into being as a brand-new experiment in capitalistic democracy. Within that quest for a new way forward, inclusive of equality and fraternity amongst equals, Erickson discovers and reveals unique American traits that belie that mission, leading to a neoliberal/privatization hellhole that goes horribly wrong.

That fascinating pathway is explained via enchanting quips, for example, de Tocqueville’s remarkably astute comment: “I know of no country, indeed, where the love of money has taken a stronger hold on the affections of men.” This one isolated statement from the 1830s tells a tale of American character molded by the artificiality of wealth creation simply for the sake of possessing it. America’s pursuit of happiness was the “pursuit of affluence” and remained its dominant trait for the “remaining 200-plus years of American history.”

Indeed, those predominant American character traits are fleshed out and analyzed in the context of eventual failure, of a dystopian world order emanating out of America’s clumsy experimentation with empire-building and its constant striving for the pot of gold at the end of the rainbow, meaning economic growth above all else. It was a frontier spirit that fed into elusive goals of preeminence: “The frontier resulted in Americans being doers rather than thinkers….”

Real scenes of real American cocksureness, as well as the clumsiness associated with raw ignorance, come to life, e.g., during one of the presidential races between Ike and Adlai Stevenson in the 1950s: “A revealing incident occurred while Stevenson was campaigning for president. A citizen shouted to Stevenson that he ‘had the vote of every thinking person.’ Stevenson replied, ‘That’s not enough. We need a majority!’”

This is excellent history, comparable to a textbook, as well as a peek into a future shaped by trends rooted throughout Americana. Erickson’s lessons in American history are genuine and accurate, which gives the book depth and a powerful sense of significance well beyond similar treatises that try to lay the challenging groundwork explaining how a nation sours into a dystopian society.

He weaves the path of Manifest Destiny all the way from the 1840s to the planting of the American flag on the surface of the moon, until the 1970s, when American pre-eminence tipped downward, humiliated in Vietnam in what future generations came to know as “the Vietnam Syndrome,” the psychological attempt to live with the unacceptable reality that it was possible for America to not win.

Not only was America no longer a winner in war, its “unparalleled level of affluence… began to decline.” The 1970s marked the high point; thereafter it was forever downward into a bottomless septic tank, a cloaca of messy foul shit earmarking America’s final destiny: third-world status within a realm of excessive pretense of wealth glistening behind spiked electronic gates.

The signs of decline were easy to spot by the early-mid 2000s: “… the situation had declined dramatically. According to statistics from 2015, among industrialized nations, America was notable for having the highest poverty rate, the lowest score on the UN index of ‘material well-being of children,’ the highest health care expenditures, the highest infant mortality rate, the highest prevalence of mental health problems, the highest obesity rate, the highest consumption of antidepressants per capita, the highest homicide rate, and the largest prison population per capita. By international standards, the rural counties of southern West Virginia and eastern Kentucky qualified as developing countries, as did large sections of American cities such as Detroit, Cleveland, Gary, and many others.” (Pg. 112)

Thereafter, America’s youth no longer embraced the long-standing belief that they would have more than their parents. No, they knew it would be less and less. America entered a “permanent recession” cycle.

By the late 2030s America experiences a series of extreme crises. A number of cities declare bankruptcy. Houston, America’s fourth-largest city, goes bankrupt. Cleveland goes bankrupt. The head of the Federal Reserve quits and becomes a banjo player in a bluegrass group. America’s banking system collapses under the weight of fishy loans and massive, crazed derivatives, all permitted by increasingly hands-off regulation. The brutal hand of libertarianism smears a once-proud republic.

Regular citizens, entire families, carry torches and surround Wall Street in protest of lost savings, ATMs not functioning, banks closed. An economic death spiral is unleashed. The Save America Act follows, consisting of pure right-wing neoliberal fix-its to save corporate America, to save Wall Street, turning to America, Inc. as the only answer to all that ails.

And, as the financial markets unravel in the face of nationwide bankruptcies, the government convincingly informs the public: “We need to defy the Constitution in order to preserve it… Americans were so thoroughly confused about the relationship between government and economics that most of them thought that the terms democracy, free-enterprise, and capitalism were the same thing.” (Pg. 165)

As time progresses, America’s Labor Day is changed to Management Day, and the Catholic Church is permitted to rename the Statue of Liberty “Our Lady of Perpetual Economic Growth.” America the nation turns into America, Inc. It is the only way the establishment knows to drive the country out of its doldrums. As such, The Star-Spangled Banner is changed to The Free Market Ramble.

Privatization of the entire country arrives in harmony with massive tax cuts and the elimination of Social Security, Medicare, Medicaid, public education, law enforcement, the postal service, and maintenance of roads and infrastructure; thereafter, people take care of themselves from birth to death, alone, with only family backing. Self-directed medical care becomes a beacon of survival of the fittest. Those who participated as youngsters in the Boy/Girl Scouts have a leg up in a society that increasingly places emphasis on rugged individualism. The many, many weaklings, however, stumble into row after row of slimy gutters.

In the end, America’s 2008-09 financial collapse proves to have been only a warm-up for much bigger things to come: “The decisive trigger, the one that pushed America beyond the point of no return, was the total collapse of the economy. It had been something of a miracle that the doomed economy had not collapsed long before. Toward the end it had been sustained by little more than momentum, since according to all economic indicators it should not have been functioning at all. The economic system based on infinite growth had reached the point where it could grow no more. American banks could not pay off previous debt by making further loans to generate more money. The pyramid scheme was over… An eerie calm descended upon all those involved in economics and finance.”

“One Long Discomfort”: The Legacy and Future of David Lindsay’s ‘A Voyage to Arcturus’

By Ben Schwartz

Source: We Are the Mutants

Ballantine “Adult Fantasy” edition, 1973, with cover art by Bob Pepper

David Lindsay’s masterpiece A Voyage to Arcturus was first published in London in 1920 by Methuen & Co. It came dressed in a simple red cloth cover; no dust jacket, just the title and author’s name debossed into the front. This first printing sold fewer than 600 copies, and so Arcturus didn’t come to the US until Macmillan brought it out in 1964. In 1968, Ballantine picked it up after the massive success of the publisher’s Lord of the Rings paperbacks, and, for the first time ever, the cover featured bespoke art, painted by Bob Pepper. The printing predated Ballantine’s influential Adult Fantasy series, edited by Lin Carter, but was eventually given honorary membership, with later printings carrying the unicorn stamp and benefiting from the cachet the series possessed.

With the late-1960s Lord of the Rings phenomenon leading the charge, speculative fiction, and Arcturus with it, rode into the public consciousness on about as high a tide as it has ever had. Lindsay’s biographer Bernard Sellin notes that Ballantine’s edition “[had]… overtaken all the accumulated efforts of forty years” in terms of circulating Lindsay’s first novel. But he’s quick to point out that Lindsay’s audience is still limited, and that “The average, sensual reader is in serious danger of being disappointed in Lindsay.” Sellin wrote this in 1981 and, with a weird choice of words, envisions a “‘superior race’ of readers, anxious to go beyond the plot” of Arcturus and grasp what it’s really about. Today, in 2018, Lindsay’s potential audience, superior or otherwise, struggles against a vanishing text.

In the UK, Gollancz brought out an Arcturus reissue in the ’40s (the “novel… is regarded by some of those who have read it as a work of genius,” the cover read), which was subsequently routed into their “Rare Works of Imaginative Fiction” reissues in the early ’60s. Today, the label keeps it alive in its “Fantasy Masterworks” series as an affordable paperback. A high-quality limited edition from Savoy Books was the high point of its publication history, but that small batch is fifteen years gone now.

In the states, the novel languishes in Print on Demand Hell. Most readily available copies are ill-starred editions from nebulous outfits bearing names like CreateSpace and Wilder Publications, featuring non sequitur cover images that look like refugees from a Windows ME screensaver folder: a field of wheat, a macro of autumn leaves, an anonymous, slightly-out-of-focus Roman ruin. Even outside of PoD territory there are some seriously janky efforts, leprous with typos: the first printing of Arcturus from Bison Press misspelled the word “Commemorative” on its own cover, and newer printings still contain fistfuls of errors.

And this is a book that counts Clive Barker, Alan Moore, Michael Moorcock, and Jeff VanderMeer among its admirers. C.S. Lewis called it the “real father” of his Space Trilogy. Pathological anti-genre lit critic Harold Bloom’s sole piece of published fiction—ever—is a pseudo-sequel to Arcturus called The Flight to Lucifer. Colin Wilson, who became a literary sensation with the publication of The Outsider in 1956, put it in his curriculum while teaching and wrote multiple essays about Lindsay. These and other enthusiasts have tended the flame over the years, keeping the book visible to the small cadre of readers that are likely to respond to it. But will Arcturus ever grow beyond that niche audience?

It may be helpful to explain what readers find when they pick up the novel. On a superficial level, A Voyage to Arcturus is a spacefaring adventure of a strong, competent hero, same as you’d find in any number of time-yellowed pulp paperbacks. After a few strange chapters spent on earth, our hero, Maskull, and his two companions, Nightspore and Krag, journey to Tormance, a planet orbiting Arcturus, which in the book is a binary star with two suns, Branchspell and Alppain. Maskull wakes alone in a fantastical desert on Tormance, and quickly becomes embroiled in this new world. There are rocket ships, tentacle arms, dreamlike landscapes—Tormance is prodigious when it comes to landscapes: like Ifdawn Marest, a place of crags and mountains that are constantly sinking and shooting up in fatal, vertiginous thousand-foot shifts; or Matterplay, a valley so replete with life energy that new beings literally pop into existence, fully formed; or the Sinking Sea, whose water varies in density from place to place and which Maskull navigates by riding a giant, semi-living treelike creature. The evocative names of places and people have a distinctly Amazing Stories vibe: Disscourn, Panawe, Corpang, the Lusion Plain.

Maskull sets out ostensibly looking for Nightspore and Krag. But as he proceeds, it becomes clear that his purpose on Tormance is tied to that of a being called Surtur, who draws Maskull northward with a slow, insistent drumbeat that only he can hear. Every chapter sees Maskull enter a new region of Tormance, each with its own particular landscape and specific philosophical culture—a sort of Gulliver’s Travels recast as a troubling, darkly symbolic dream. Ifdawn Marest lives violently, crudely, simply—its residents engage in contests of mind control to dominate, torture, and kill one another. The land of Sant houses vain ascetics who have renounced all the physical pleasures of the world. In Matterplay, Maskull encounters the last of the phaen, an ancient race composed not of men or women but a third, primordial gender. Names of other supreme beings are revealed: some mention Muspel, but many talk of Crystalman, possibly another god, or maybe just another name for Surtur—the Tormancians’ accounts vary. But when people die on Tormance, their faces twist into a nauseating smile known as Crystalman’s grin. The precise cosmology always remains just out of focus, however, and this refusal to resolve comes to drive Maskull forward more than the thought of finding his companions. And through this driving impetus, Maskull finds each place, each philosophy, exposed as limited, false, incomplete. This falseness usually results in an explosion of ugly violence, and Maskull, often as not, is perpetrating it.

And so the book proceeds, like some dark, cosmic picaresque, until Maskull reaches Surtur’s Ocean, the northernmost ocean of Tormance. He reunites with Krag, who seems to be expecting him. Krag takes the physically failing Maskull on a raft out to sea, on a journey to Muspel, which Maskull learns is the name of the “true world,” the world outside the corruption of illusory things. As they sail along, Maskull, exhausted and spent, dies, which somehow releases Nightspore back into being. Then Krag lets Nightspore off at a lone edifice in the sea. As he ascends through it, Nightspore stops at a succession of windows that show him the nature of reality: there is Muspel, Surtur’s world, the impartial, pure, true world that most are prevented from seeing by the illusory world of Crystalman, who is not an aspect of Surtur but an embodiment of deceit and distraction. Violence, art, love, talk, work, play—all of these are tools Crystalman uses to ensnare the spark of Muspel contained in each living thing, preventing that life from returning to the world it came from. All the inhabitants of Tormance and their multifarious philosophies were blinded to this truth by Crystalman—and that’s why, when they died, their faces contorted into Crystalman’s Grin, the signature of his triumph over their souls.

Arcturus ends with the resurrected/transmogrified/newborn Nightspore descending the tower and meeting up with Krag again, who reveals that he is Surtur, and that his name on earth is Pain. Nightspore steps back onto the raft and the two sail away into the darkness, presumably to continue their struggle against Crystalman, on earth or elsewhere. It’s a powerful, striking, triumphless ending—a metaphysical cliffhanger that opens up long avenues of thought.

Anybody reading with their internal aerial up and receiving would have noticed something going on with Arcturus before the final chapters, but they are only the biggest among many clues that make it clear the novel is more than a weightless adventure yarn. Maskull is an off-putting protagonist. He’s animated less by personality and more by some psychic decree outside of his control (authorial or otherwise). He’s got the wrong proportions for a standard hero: Lindsay describes him as “a kind of giant, but of broader and more robust physique than most giants,” with a full beard, short bristling hair, and features that are “thick and heavy, coarsely modeled, like those of a wooden carving”—and yet with eyes sparkling with “intelligence and audacity.” He’s impulsive, driven, and violent—and key to the dark energy that propels Arcturus away from genre pulp into deeper, thornier territory.

Much early speculative fiction created vistas of longing; they showed better worlds, nobler peoples, purer ways of living. The Lord of the Rings set the standard in this regard but it was hardly alone, and not the first. The Worm Ouroboros, Lud-in-the-Mist, Time and the Gods are others—all committed to beauty and magic and bravery as antidotes to our own world. They didn’t deny their correlation to accepted reality, but they actively opposed aspects of that reality by showing us better versions. Arcturus, rather than look outward over the hills of faerie, turns inward, drills down until it exposes its fundamental vision of existence, and that vision is a searing one. Its aspect is fire, and whereas most speculative fiction is aspirational, Arcturus is agonized; reality is, like the unearthly wound Maskull receives from Krag, “one long discomfort,” a galaxy of damnation:

Millions of grotesque, vulgar, ridiculous, sweetened individuals – once Spirit – were calling out from their degradation and agony for salvation from Muspel…

Arcturus the planet isn’t meant to be “real” like Minas Tirith or Lud-in-the-Mist or Witchland are meant to be real. Instead of creating another world, Lindsay showed us our own; refracted through the alien metaphors of Tormance, yes, but nevertheless recognizable. As anthropologist Loren Eiseley notes in his introduction to the Ballantine edition, Arcturus is really “a long earth journey.” There’s a dystopia in Lindsay’s novel, though the dystopia is not political or societal, but metaphysical. It’s not a nightmare city, but a nightmare world; not a corrupt government, but a corrupt soul. Maskull’s vicious, driving nature allows him to open that final door for readers.

Naturally, this dark, anguished, philosophical heart impacted Arcturus’ initial sales. In 1920, science fiction seemed impossibly far from literary “respectability.” There was a strong undercurrent of literary speculative fiction at the time, but it wasn’t universally popular and certainly not accepted by the establishment. Arcturus came blazing fully-formed into the world, subverting tropes that had barely been established. And you can imagine potential readers either avoiding Arcturus because of those tropes, or dropping it because it didn’t thoroughly conform to nascent genre conventions. Arcturus did itself no commercial favors by tapping SF in the name of art. It made itself a black sheep among black sheep.

Sellin ends his ’81 overview of Lindsay’s life and work as all essays on Arcturus and Lindsay end: with hope for a wider readership in the future. But I predict Arcturus will continue to be preserved by a small but vocal readership—no more. I think it has already assumed the strange, somewhat sour mantle of an “influential” classic, one whose most visible legacy will always be the way it presaged so much that came after. Once you read Arcturus, you’re always finding chunks of it here and there, like burning fragments of an exploded spaceship smoldering in a field. Its Mariana Trench pessimism turns up in Harlan Ellison and, with a paranoiac twist, in Philip K. Dick. Its deep exploration of reality through violence and sexuality brings to mind A Clockwork Orange and Dhalgren; and Maskull’s surrender into a metaphysical system vaster than himself hits on core conceits in much of Pynchon. And most obviously, science fiction as metaphor for our own world, our own souls, was a shocking and (to some) ugly experiment in Arcturus—but today it’s as common as grass.

I think the novel’s admirers want recognition for Arcturus because Lindsay’s life is always painted as one of frustration, where recognition for his accomplishments was continually withheld. And that’s true. But he also created a masterwork, and it seems weird to quibble with immortality, no matter how it comes. Even today, Lindsay’s first novel stands out in any literary landscape, casting a long shadow: an architecture phased in from a parallel dimension both alien and familiar.

The Pentagon Budget as Corporate Welfare for Weapons Makers

By William Hartung (with introduction by TomDispatch)

Source: TomDispatch.com

What company gets the most money from the U.S. government? The answer: the weapons maker Lockheed Martin. As the Washington Post recently reported, of its $51 billion in sales in 2017, Lockheed took in $35.2 billion from the government, or close to what the Trump administration is proposing for the 2019 State Department budget. And which company is in second place when it comes to raking in the taxpayer dollars? The answer: Boeing with a mere $26.5 billion. And mind you, that’s before the good times even truly begin to roll, as TomDispatch regular and weapons industry expert William Hartung makes clear today in a deep dive into the (ir)realities of the Pentagon budget.  When it comes to the Department of Defense, though, perhaps we should retire the term “budget” altogether, given its connotation of restraint. Can’t we find another word entirely? Like the Pentagon cornucopia?

Sometimes, it’s hard to believe that perfectly sober reportage about Pentagon funding issues isn’t satire in the style of the New Yorker’s Andy Borowitz.  Take, for instance, a recent report in the Washington Examiner that Army Secretary Mark Esper and other Pentagon officials are now urging Congress to release them from a September 30th deadline for fully disbursing their operation and maintenance funds (about 40% of the department’s budget).  In translation, they’re telling Congress that they have more money than even they can spend in the time allotted.

It’s hard to be forced to spend vast sums in a rush when, for instance, you’re launching a nuclear arms “race” of one by “modernizing” what’s already the most advanced arsenal on the planet over the next 30 years for a mere trillion-plus dollars (a sum that, given the history of Pentagon budgeting, is sure to rise precipitously).  In that context, let Hartung usher you into the wondrous world of what, in the age of The Donald, might be thought of (with alliteration in mind) as the Plutocratic Pentagon. Tom

How the Pentagon Devours the Budget
Normalizing Budgetary Bloat
By William D. Hartung

Imagine for a moment a scheme in which American taxpayers were taken to the cleaners to the tune of hundreds of billions of dollars and there was barely a hint of criticism or outrage.  Imagine as well that the White House and a majority of the politicians in Washington, no matter the party, acquiesced in the arrangement.  In fact, the annual quest to boost Pentagon spending into the stratosphere regularly follows that very scenario, assisted by predictions of imminent doom from industry-funded hawks with a vested interest in increased military outlays.

Most Americans are probably aware that the Pentagon spends a lot of money, but it’s unlikely they grasp just how huge those sums really are.  All too often, astonishingly lavish military budgets are treated as if they were part of the natural order, like death or taxes.

The figures contained in the recent budget deal that kept Congress open, as well as in President Trump’s budget proposal for 2019, are a case in point: $700 billion for the Pentagon and related programs in 2018 and $716 billion the following year.  Remarkably, such numbers far exceeded even the Pentagon’s own expansive expectations.  According to Donald Trump, admittedly not the most reliable source in all cases, Secretary of Defense Jim Mattis reportedly said, “Wow, I can’t believe we got everything we wanted” — a rare admission from the head of an organization whose only response to virtually any budget proposal is to ask for more.

The public reaction to such staggering Pentagon budget hikes was muted, to put it mildly. Unlike last year’s tax giveaway to the rich, throwing near-record amounts of tax dollars at the Department of Defense generated no visible public outrage.  Yet those tax cuts and Pentagon increases are closely related.  The Trump administration’s pairing of the two mimics the failed approach of President Ronald Reagan in the 1980s — only more so.  It’s a phenomenon I’ve termed “Reaganomics on steroids.”  Reagan’s approach yielded oceans of red ink and a severe weakening of the social safety net.  It also provoked such a strong pushback that he later backtracked by raising taxes and set the stage for sharp reductions in nuclear weapons.

Donald Trump’s retrograde policies on immigration, women’s rights, racial justice, LGBT rights, and economic inequality have spawned an impressive and growing resistance.  It remains to be seen whether his generous treatment of the Pentagon at the expense of basic human needs will spur a similar backlash.

Of course, it’s hard to even get a bead on what’s being lavished on the Pentagon when much of the media coverage failed to drive home just how enormous these sums actually are. A rare exception was an Associated Press story headlined “Congress, Trump Give the Pentagon a Budget the Likes of Which It Has Never Seen.” This was certainly far closer to the truth than claims like that of Mackenzie Eaglen of the conservative American Enterprise Institute, which over the years has housed such uber-hawks as Dick Cheney and John Bolton.  She described the new budget as a “modest year-on-year increase.” If that’s the case, one shudders to think what an immodest increase might look like.

The Pentagon Wins Big

So let’s look at the money.

Though the Pentagon’s budget was already through the roof, it will get an extra $165 billion over the next two years, thanks to the congressional budget deal reached earlier this month.  To put that figure in context, it was tens of billions of dollars more than Donald Trump had asked for last spring to  “rebuild” the U.S. military (as he put it).  It even exceeded the figures, already higher than Trump’s, Congress had agreed to last December.  It brings total spending on the Pentagon and related programs for nuclear weapons to levels higher than those reached during the Korean and Vietnam wars in the 1950s and 1960s, or even at the height of Ronald Reagan’s vaunted military buildup of the 1980s. Only in two years of Barack Obama’s presidency, when there were roughly 150,000 U.S. troops in Iraq and Afghanistan, or about seven times current levels of personnel deployed there, was spending higher.

Ben Freeman of the Center for International Policy put the new Pentagon budget numbers in perspective when he pointed out that just the approximately $80 billion annual increase in the department’s top line between 2017 and 2019 will be double the current budget of the State Department; higher than the gross domestic products of more than 100 countries; and larger than the entire military budget of any country in the world, except China’s.

Democrats signed on to that congressional budget as part of a deal to blunt some of the most egregious Trump administration cuts proposed last spring.  The deal, for example, kept the State Department’s budget from being radically slashed and reauthorized the imperiled Children’s Health Insurance Program (CHIP) for another 10 years.  In the process, however, the Democrats also threw millions of young immigrants under the bus by dropping an insistence that any new budget protect the Deferred Action for Childhood Arrivals, or “Dreamers,” program.  Meanwhile, the majority of Republican fiscal conservatives were thrilled to sign off on a Pentagon increase that, combined with the Trump tax cut for the rich, funds ballooning deficits as far as the eye can see — a total of $7.7 trillion worth of them over the next decade.

While domestic spending fared better in the recent congressional budget deal than it would have if Trump’s draconian plan for 2018 had been enacted, it still lags far behind what Congress is investing in the Pentagon.  And calculations by the National Priorities Project indicate that the Department of Defense is slated to be an even bigger winner in Trump’s 2019 budget blueprint. Its share of the discretionary budget, which includes virtually everything the government does other than programs like Medicare and Social Security, will mushroom to a once-unimaginable 61 cents on the dollar, a hefty boost from the already startling 54 cents on the dollar in the final year of the Obama administration.

The skewed priorities in Trump’s latest budget proposal are fueled in part by the administration’s decision to embrace the Pentagon increases Congress agreed to last month, while tossing that body’s latest decisions on non-military spending out the window.  Although Congress is likely to rein in the administration’s most extreme proposals, the figures are stark indeed — a proposed cut of $120 billion in the domestic spending levels both parties agreed to. The biggest reductions include a 41% cut in funding for diplomacy and foreign aid; a 36% cut in funding for energy and the environment; and a 35% cut in housing and community development.  And that’s just the beginning.  The Trump administration is also preparing to launch full-scale assaults on food stamps, Medicaid, and Medicare.  It’s war on everything except the U.S. military.

Corporate Welfare

The recent budget plans have brought joy to the hearts of one group of needy Americans: the top executives of major weapons contractors like Lockheed Martin, Boeing, Northrop Grumman, Raytheon, and General Dynamics. They expect a bonanza from the skyrocketing Pentagon expenditures. Don’t be surprised if the CEOs of these five firms give themselves nice salary boosts, something to truly justify their work, rather than the paltry $96 million they drew as a group in 2016 (the most recent year for which full statistics are available).

And keep in mind that, like all other U.S.-based corporations, those military-industrial behemoths will benefit richly from the Trump administration’s slashing of the corporate tax rate.  According to one respected industry analyst, a good portion of this windfall will go towards bonuses and increased dividends for company shareholders rather than investments in new and better ways to defend the United States.  In short, in the Trump era, Lockheed Martin and its cohorts are guaranteed to make money coming and going.

Items that snagged billions in new funding in Trump’s proposed 2019 budget included Lockheed Martin’s overpriced, underperforming F-35 aircraft, at $10.6 billion; Boeing’s F-18 “Super Hornet,” which was in the process of being phased out by the Obama administration but is now written in for $2.4 billion; Northrop Grumman’s B-21 nuclear bomber at $2.3 billion; General Dynamics’ Ohio-class ballistic missile submarine at $3.9 billion; and $12 billion for an array of missile-defense programs that will redound to the benefit of… you guessed it: Lockheed Martin, Raytheon, and Boeing, among other companies.  These are just a few of the dozens of weapons programs that will be feeding the bottom lines of such companies in the next two years and beyond.  For programs still in their early stages, like that new bomber and the new ballistic missile submarine, their banner budgetary years are yet to come.

In explaining the flood of funding that enables a company like Lockheed Martin to reap $35 billion per year in government dollars, defense analyst Richard Aboulafia of the Teal Group noted that “diplomacy is out; air strikes are in… In this sort of environment, it’s tough to keep a lid on costs. If demand goes up, prices don’t generally come down. And, of course, it’s virtually impossible to kill stuff. You don’t have to make any kind of tough choices when there’s such a rising tide.”

Pentagon Pork Versus Human Security

Loren Thompson is a consultant to many of those weapons contractors.  His think tank, the Lexington Institute, also gets contributions from the arms industry.  He caught the spirit of the moment when he praised the administration’s puffed-up Pentagon proposal for using the Defense Department budget as a jobs creator in key states, including the crucial swing state of Ohio, which helped propel Donald Trump to victory in 2016.  Thompson was particularly pleased with a plan to ramp up General Dynamics’s production of M-1 tanks in Lima, Ohio, in a factory whose production line the Army had tried to put on hold just a few years ago because it was already drowning in tanks and had no conceivable use for more of them.

Thompson argues that the new tanks are needed to keep up with Russia’s production of armored vehicles, a dubious assertion with a decidedly Cold War flavor to it.  His claim is backed up, of course, by the administration’s new National Security Strategy, which targets Russia and China as the most formidable threats to the United States.  Never mind that the likely challenges posed by these two powers — cyberattacks in the Russian case and economic expansion in the Chinese one — have nothing to do with how many tanks the U.S. Army possesses.

Trump wants to create jobs, jobs, jobs he can point to, and pumping up the military-industrial complex must seem like the path of least resistance to that end in present-day Washington.  Under the circumstances, what does it matter that virtually any other form of spending would create more jobs and not saddle Americans with weaponry we don’t need?

If past performance offers any indication, none of the new money slated to pour into the Pentagon will make anyone safer.  As Todd Harrison of the Center for Strategic and International Studies has noted, there is a danger that the Pentagon will just get “fatter not stronger” as its worst spending habits are reinforced by a new gusher of dollars that relieves its planners of making any reasonably hard choices at all.

The list of wasteful expenditures is already staggeringly long and early projections are that bureaucratic waste at the Pentagon will amount to $125 billion over the next five years.  Among other things, the Defense Department already employs a shadow work force of more than 600,000 private contractors whose responsibilities overlap significantly with work already being done by government employees.  Meanwhile, sloppy buying practices regularly result in stories like the recent ones on the Pentagon’s Defense Logistics Agency losing track of how it spent $800 million and how two American commands were unable to account for $500 million meant for the war on drugs in the Greater Middle East and Africa.

Add to this the $1.5 trillion slated to be spent on F-35s that the nonpartisan Project on Government Oversight has noted may never be ready for combat and the unnecessary “modernization” of the U.S. nuclear arsenal, including a new generation of nuclear-armed bombers, submarines, and missiles at a minimum cost of $1.2 trillion over the next three decades.  In other words, a large part of the Pentagon’s new funding will do much to fuel good times in the military-industrial complex but little to help the troops or defend the country.

Most important of all, this flood of new funding, which could crush a generation of Americans under a mountain of debt, will make it easier to sustain the seemingly endless seven wars that the United States is fighting in Afghanistan, Pakistan, Syria, Iraq, Libya, Somalia, and Yemen.  So call this one of the worst investments in history, ensuring as it does failed wars to the horizon.

It would be a welcome change in twenty-first-century America if the reckless decision to throw yet more unbelievable sums of money at a Pentagon already vastly overfunded sparked a serious discussion about America’s hyper-militarized foreign policy.  A national debate about such matters in the run-up to the 2018 and 2020 elections could determine whether it continues to be business-as-usual at the Pentagon or whether the largest agency in the federal government is finally reined in and relegated to an appropriately defensive posture.

 

William D. Hartung, a TomDispatch regular, is the director of the Arms and Security Project at the Center for International Policy and the author of Prophets of War: Lockheed Martin and the Making of the Military-Industrial Complex.

 

Is the U.S. Government Evil? You Tell Me

By John W. Whitehead

Source: The Rutherford Institute

“The greatest evil is not now done … in concentration camps and labour camps. In those we see its final result. But it is conceived and ordered (moved, seconded, carried, and minuted) in clean, carpeted, warmed and well-lighted offices, by quiet men with white collars and cut fingernails and smooth-shaven cheeks who do not need to raise their voices. Hence, naturally enough, my symbol for Hell is something like the bureaucracy of a police state or the office of a thoroughly nasty business concern.” ― C.S. Lewis, The Screwtape Letters

Is the U.S. government evil?

You tell me.

This is a government that treats its citizens like faceless statistics and economic units to be bought, sold, bartered, traded, tracked, tortured, and eventually eliminated once they’ve outgrown their usefulness.

This is a government that treats human beings like lab rats to be caged, branded, experimented upon, and then discarded and left to suffer from the after-effects.

This is a government that repeatedly lies, cheats, steals, spies, kills, maims, enslaves, breaks the laws, overreaches its authority, and abuses its power at almost every turn.

This is a government that wages wars for profit, jails its own people for profit, and then turns a blind eye and a deaf ear while its henchmen rape and kill and pillage.

No, this is not a government that can be trusted to do what is right or moral or humane or honorable but instead seems to gravitate towards corruption, malevolence, misconduct, greed, cruelty, brutality and injustice.

This is not a government you should trust with your life, your loved ones, your livelihood or your freedoms.

This is the face of evil, disguised as a democracy, sold to the people as an institution that has their best interests at heart.

Don’t fall for the lie.

The government has never had our best interests at heart.

Endless wars. The government didn’t have our best interests at heart when it propelled us into endless oil-fueled wars and military occupations in the Middle East that wreaked havoc on our economy, stretched thin our military resources and subjected us to horrific blowback.

A police state. There is no way the government had our best interests at heart when it passed laws subjecting us to all manner of invasive searches and surveillance, censoring our speech and stifling our expression, rendering us anti-government extremists for daring to disagree with its dictates, locking us up for criticizing government policies on social media, encouraging Americans to spy and snitch on their fellow citizens, and allowing government agents to grope, strip, search, taser, shoot and kill us.

Battlefield America. Certainly the government did not have our best interests at heart when it turned America into a battlefield, transforming law enforcement agencies into extensions of the military, conducting military drills on domestic soil, distributing “free” military equipment and weaponry to local police, and desensitizing Americans to the menace of the police state with active shooter drills, color-coded terror alerts, and randomly conducted security checkpoints at “soft” targets such as shopping malls and sports arenas.

School-to-prison pipeline. It would be a reach to suggest that the government had our best interests at heart when it locked down the schools, installing metal detectors and surveillance cameras, adopting zero tolerance policies that punish childish behavior as harshly as criminal actions, and teaching our young people that they have no rights, that being force-fed facts is education rather than indoctrination, that they are not to question governmental authority, that they must meekly accept a life of censorship, round-the-clock surveillance, roadside blood draws, SWAT team raids and other indignities.

Secret human experimentation. One would also be hard-pressed to suggest that the American government had our best interests at heart when it conducted secret experiments on an unsuspecting populace—citizens and noncitizens alike—making healthy people sick by spraying them with chemicals, injecting them with infectious diseases and exposing them to airborne toxins. The government reasoned that it was legitimate (and cheaper) to experiment on people who did not have full rights in society such as prisoners, mental patients, and poor blacks.

As the Associated Press reports, “The late 1940s and 1950s saw huge growth in the U.S. pharmaceutical and health care industries, accompanied by a boom in prisoner experiments funded by both the government and corporations. By the 1960s, at least half the states allowed prisoners to be used as medical guinea pigs … because they were cheaper than chimpanzees.”

In Alabama, for example, 600 black men with syphilis were allowed to suffer without proper medical treatment so that the government could study the natural progression of untreated syphilis. In California, older prisoners were implanted with testicles from livestock and executed convicts so the government could test their virility.

In Connecticut, mental patients were injected with hepatitis so the government could study the disease. In Maryland, sleeping prisoners had a pandemic flu virus sprayed up their noses so the government could monitor their symptoms. In Georgia, two dozen “volunteering” prison inmates had gonorrhea bacteria pumped directly into their urinary tracts through the penis so the government could work on a cure.

In Michigan, male patients at an insane asylum were exposed to the flu so the government could experiment with a flu vaccine. In Minnesota, 11 public service employee “volunteers” were injected with malaria, then starved for five days, so the government could study the impact.

In New York, prisoners at a reformatory prison were split into two groups to determine how a deadly stomach virus was spread: the first group was made to swallow an unfiltered stool suspension, while the second group merely breathed in germs sprayed into the air. In Staten Island, children with mental retardation were given hepatitis orally and by injection to see if they could then be cured.

Unfortunately, these incidents are just the tip of the iceberg when it comes to the atrocities the government has inflicted on an unsuspecting populace in the name of secret experimentation.

For instance, there was the U.S. military’s secret race-based testing of mustard gas on more than 60,000 enlisted men (African-Americans, Japanese-Americans, Hispanics, etc.). As NPR reports, “All of the World War II experiments with mustard gas were done in secret and weren’t recorded on the subjects’ official military records. Most do not have proof of what they went through. They received no follow-up health care or monitoring of any kind. And they were sworn to secrecy about the tests under threat of dishonorable discharge and military prison time, leaving some unable to receive adequate medical treatment for their injuries, because they couldn’t tell doctors what happened to them.”

And then there was the CIA’s Cold War-era program, MKULTRA, in which the government began secretly experimenting on hundreds of unsuspecting American civilians and military personnel by dosing them with LSD, some having the hallucinogenic drug secretly slipped into their drinks, so that the government could explore its uses in brainwashing and controlling targets. The CIA spent nearly $20 million on its MKULTRA program, reportedly as a means of programming people to carry out assassinations and, to a lesser degree, inducing anxieties and erasing memories, before it was supposedly shut down.

Similarly, the top-secret Montauk Project, the inspiration for the hit Netflix series Stranger Things, allegedly was working to develop mind-control techniques that would then be tested out on locals in a nearby village, triggering crime waves or causing teenagers to congregate.

Sounds like the stuff of conspiracy theorists, I know, but the government’s track record of treating Americans like lab rats has been well-documented, including its attempts to expose whole communities to various toxins as part of its efforts to develop lethal biological weapons and study their impact and delivery methods on unsuspecting populations.

In 1949, for instance, the government sprayed bacteria into the Pentagon’s air handling system, then the world’s largest office building. In 1950, special ops forces sprayed bacteria from Navy ships off the coast of Norfolk and San Francisco, in the latter case exposing all of the city’s 800,000 residents.

In 1953, government operatives staged “mock” anthrax attacks on St. Louis, Minneapolis, and Winnipeg using generators placed on top of cars. Local governments were reportedly told that “‘invisible smokescreen[s]’ were being deployed to mask the city on enemy radar.” Later experiments covered territory as wide-ranging as Ohio to Texas and Michigan to Kansas.

In 1965, the government’s experiments in bioterror took aim at Washington’s National Airport, followed by a 1966 experiment in which army scientists exposed a million NYC subway passengers to airborne bacteria that cause food poisoning.

Now one might argue that this is all ancient history and that the government today is different from the government of yesteryear, but has the U.S. government really changed?

Ask yourself: Has the government become any more humane, any more respectful of the rights of the citizenry? Has it become any more transparent or willing to abide by the rule of law? Has it become any more truthful about its activities? Has it become any more cognizant of its appointed role as a guardian of our rights?

Or, having mastered the Orwellian art of Doublespeak and followed the Huxleyan blueprint for distraction and diversion, has the government simply gotten craftier and more conniving, better able to hide its nefarious acts and dastardly experiments under layers of secrecy, legalism and obfuscations?

Consider this: after revelations about the government’s experiments spanning the 20th century spawned outrage, the government began looking for human guinea pigs in other countries, where “clinical trials could be done more cheaply and with fewer rules.”

In Guatemala, prisoners and patients at a mental hospital were infected with syphilis, “apparently to test whether penicillin could prevent some sexually transmitted disease.” More recently, U.S.-funded doctors “failed to give the AIDS drug AZT to all the HIV-infected pregnant women in a study in Uganda even though it would have protected their newborns.” Meanwhile, in Nigeria, children with meningitis were used to test an antibiotic named Trovan. Eleven children died and many others were left disabled.

What kind of government perpetrates such horrific acts on human beings, whether or not they are American citizens?

Is there any difference between a government mindset that justifies experimenting on prisoners because they’re “cheaper than chimpanzees” and a government that sanctions jailhouse strip searches of individuals charged with minor infractions simply because it’s easier on a jail warden’s workload?

John Lennon was right: “We’re being run by maniacs for maniacal ends.”

Unfortunately, the more things change, the more they stay the same.

Just recently, for example, a Fusion Center in Washington State (a Dept. of Homeland Security-linked data collection clearinghouse that shares information between state, local and federal agencies) inadvertently released records on remote mind control tactics (the use of “psycho-electronic” weapons to control people from a distance or subject them to varying degrees of pain).

Mind you, there is no clear evidence to suggest that these particular documents were created by a government agency. Then again, the government—no stranger to diabolical deeds or shady experiments carried out on an unsuspecting populace—has done it before.

After all, this is a government that has become almost indistinguishable from the evil it claims to be fighting, whether that evil takes the form of terrorism, torture, drug trafficking, sex trafficking, murder, violence, theft, pornography, scientific experimentation or some other diabolical means of inflicting pain, suffering and servitude on humanity.

For too long now, the American people have been persuaded to barter their freedoms for phantom promises of security and, in the process, have rationalized turning a blind eye to all manner of government wrongdoing—asset forfeiture schemes, corruption, surveillance, endless wars, SWAT team raids, militarized police, profit-driven private prisons, and so on—because they were the so-called lesser of two evils.

No matter how you rationalize it, the lesser of two evils is still evil.

There’s a scene in The Third Man, Carol Reed’s influential 1949 film starring Joseph Cotten and Orson Welles, in which a rogue war profiteer (Harry Lime) views human carnage with a callous indifference, unconcerned that the diluted penicillin he’s been trafficking underground has resulted in the tortured deaths of young children.

Challenged by his old friend Holly Martins to consider the consequences of his actions, Lime responds, “In these days, old man, nobody thinks in terms of human beings. Governments don’t, so why should we?”

“Have you ever seen any of your victims?” asks Martins.

“Victims?” responds Lime, as he looks down from the top of a Ferris wheel onto a populace reduced to mere dots on the ground. “Look down there. Tell me. Would you really feel any pity if one of those dots stopped moving forever? If I offered you twenty thousand pounds for every dot that stopped, would you really, old man, tell me to keep my money, or would you calculate how many dots you could afford to spare?”

Lime’s callous indifference is no different from the U.S. government’s calculating cost-benefit analyses.

In the eyes of the government, “we the people” are chump change.

So why do Americans keep believing the government has their best interests at heart?

Why do Americans keep trusting the government?

Why do Americans pretend not to know what is so obvious to anyone with eyes and ears and a conscience?

As Carl Sagan recognized, “If we’ve been bamboozled long enough, we tend to reject any evidence of the bamboozle. We’re no longer interested in finding out the truth. The bamboozle has captured us. It’s simply too painful to acknowledge, even to ourselves, that we’ve been taken. Once you give a charlatan power over you, you almost never get it back.”

We should never have trusted the government in the first place.

That’s why the Founders came up with a Bill of Rights. They recognized that without binding legal protections affirming the rights of the people, the newly instituted American government would be no better than the old British despot.

It was Thomas Jefferson who warned, “In questions of power then, let no more be heard of confidence in man, but bind him down from mischief by the chains of the Constitution.”

Unfortunately, we didn’t heed the warning.

As I make clear in my book Battlefield America: The War on the American People, the government has ripped the Constitution to shreds and left us powerless in the face of its power grabs, greed and brutality.

So how do you fight back?

How do you fight injustice? How do you push back against tyranny? How do you vanquish evil?

You don’t fight it by hiding your head in the sand.

Stop being apathetic. Stop being neutral. Stop being accomplices.

Start recognizing evil and injustice and tyranny for what they are. Demand government transparency. Vote with your feet (i.e., engage in activism, not just politics). Refuse to play politics with your principles. Don’t settle for the lesser of two evils.

As British statesman Edmund Burke warned, “The only thing necessary for the triumph of evil is for good men [and women] to do nothing.”

It’s time for good men and women to do something. And soon.

Social Media Behemoths Sweep Alternative News into the Memory Hole

By Kurt Nimmo

Source: Another Day in the Empire

The squabbling between self-identified progressives and conservatives continues as social media transforms itself into a news, information, and opinion gatekeeper.

All information that contradicts the establishment narrative will either be downgraded into obscurity or excluded outright on social media.

Take for instance ThinkProgress, the Soros-financed news website, a project of the Center for American Progress Action Fund welded to the infrastructure of the Democrat party. On May 2, it complained that a bias study at Facebook will be run by conservatives, that is to say establishment Republicans, notably former Arizona Congress critter Jon Kyl.

ThinkProgress believes there is no such thing as bias aimed at conservatives—it’s the liberals who are routinely downgraded at Facebook while so-called conservatives are free to post what progressives characterize as an evil and poisonous ideology.

According to Libby Watson at Splinter News, conservatives are involved in “grift,” flimflamming poor Mark Zuckerberg with untrue claims of bias against the likes of Breitbart News.

It’s all part of a never-ending and hugely counterproductive “culture war” that has raged between the ostensible right and left for going on thirty years now. Ms. Watson manages to squeeze identity politics into her screed.

“The conservative movement has done a remarkable job over the last half century to bellow and bully its way into having its most ridiculous and reality-divorced concerns taken seriously,” she writes. “It lies about and distorts everything: about tax cuts, about Benghazi and her emails, about immigration, about healthcare, about Diamond and Silk. The further Facebook descends down the path of letting that screaming white face of faux outrage dictate how they run their platform, the harder it’s going to be for them to get away from them.”

The progressive news website Common Dreams complains it has weathered “significant drops in traffic since Google and Facebook began changing algorithms and talking openly about their new attempts to control the kind of news content users see. According to internal data and Google Analytics, traffic to Common Dreams from Google searches fell by 34 percent after the powerful search giant unveiled its new search protocol in April 2017.”

Meanwhile, on the other side of the yawning divide, Brent Bozell, founder of the Media Research Center, rallied around 60 conservatives and fired off an open letter to the social media giants demanding transparency, clarity on the definition of hate speech, equality for conservatives, and respect for the First Amendment.

“Social media censorship and online restriction of conservatives and their organizations have reached a crisis level,” the open letter states. “Facebook CEO Mark Zuckerberg’s hearings on Capitol Hill only served to draw attention to how widespread this problem has become. Conservative leaders now have banded together to call for equal treatment on tech and social media.”

Both liberals and conservatives are missing the point.

Facebook and Google will continue and enlarge the effort to gatekeep information that does not jibe with the establishment narrative, be it from the right or left.

The internet and web upended the establishment’s carefully constructed propaganda machine—the CIA’s “Mighty Wurlitzer” under its Operation Mockingbird beginning in the early 1950s—deeply embedded within corporate media.

Beginning with Friendster, MySpace, and like projects in the early 2000s and eventually morphing into the corporate behemoths Facebook, YouTube, and Twitter, social media platforms have extended the reach of alternative media, much to the displeasure of the establishment. Its preferred propaganda conduits have withered and this has seriously hampered its ability to control the narrative.

Both the right and left need to nurture their own social media platforms and drive traffic there.

Of course, this will not be as effective as plugging into the massive matrix of social connectivity provided by the corporate tech giants, but the alternative is to be marginalized and eventually swept into the memory hole as the definition of “extremism” broadens, constricting expression and excluding all but the most token disagreement with the establishment narrative.

However, I’m not sure we’re up to it.

The elite has done a remarkable job of using the time-tested divide-and-conquer concept, endlessly pitting the so-called right against the amorphously defined left and vice versa. Liberals and conservatives continue to fight over frivolous ideological points as the funny-money, asset-driven economy prepares to implode and the mission of infinity war expands to the point where it endangers life on planet Earth.

Disarming the Weapons of Mass Distraction

By Madeleine Bunting

Source: Rise Up Times

“Are you paying attention?” The phrase still resonates with a particular sharpness in my mind. It takes me straight back to my boarding school, aged thirteen, when my eyes would drift out the window to the woods beyond the classroom. The voice was that of the math teacher, the very dedicated but dull Miss Ploughman, whose furrowed grimace I can still picture.

We’re taught early that attention is a currency—we “pay” attention—and much of the discipline of the classroom is aimed at marshaling the attention of children, with very mixed results. We all have a history here, of how we did or did not learn to pay attention and all the praise or blame that came with that. It used to be that such patterns of childhood experience faded into irrelevance. As we reached adulthood, how we paid attention, and to what, was a personal matter and akin to breathing—as if it were automatic.

Today, though, as we grapple with a pervasive new digital culture, attention has become an issue of pressing social concern. Technology provides us with new tools to grab people’s attention. These innovations are dismantling traditional boundaries of private and public, home and office, work and leisure. Emails and tweets can reach us almost anywhere, anytime. There are no cracks left in which the mind can idle, rest, and recuperate. A taxi ad offers free wifi so that you can remain “productive” on a cab journey.

Even those spare moments of time in our day—waiting for a bus, standing in a queue at the supermarket—can now be “harvested,” says the writer Tim Wu in his book The Attention Merchants. In this quest to pursue “those slivers of our unharvested awareness,” digital technology has provided consumer capitalism with its most powerful tools yet. And our attention fuels it. As Matthew Crawford notes in The World Beyond Your Head, “when some people treat the minds of other people as a resource, this is not ‘creating wealth,’ it is transferring it.”

There’s a whiff of panic around the subject: the story that our attention spans are now shorter than a goldfish’s attracted millions of readers on the web; it’s still frequently cited, despite its questionable veracity. Rates of diagnosis of attention deficit hyperactivity disorder in children have soared, creating an $11 billion global market for pharmaceutical companies. Every glance of our eyes is now tracked for commercial gain as ever more ingenious ways are devised to capture our attention, if only momentarily. Our eyeballs are now described as capitalism’s most valuable real estate. Both our attention and its deficits are turned into lucrative markets.

There is also a domestic economy of attention; within every family, some get it and some give it. We’re all born needing the attention of others—our parents’, especially—and from the outset, our social skills are honed to attract the attention we need for our care. Attention is woven into all forms of human encounter from the most brief and transitory to the most intimate. It also becomes deeply political: who pays attention to whom?

Social psychologists have researched how the powerful tend to tune out the less powerful. One study with college students showed that even in five minutes of friendly chat, wealthier students showed fewer signs of engagement when in conversation with their less wealthy counterparts: less eye contact, fewer nods, and more checking the time, doodling, and fidgeting. Discrimination by race and gender, too, plays out through attention. Anyone who’s spent any time in an organization will be aware of how attention is at the heart of office politics. A suggestion is ignored in a meeting, but is then seized upon as a brilliant solution when repeated by another person.

What is political is also ethical. Matthew Crawford argues that this is the essential characteristic of urban living: a basic recognition of others.

And then there’s an even more fundamental dimension to the politics of attention. At a primary level, all interactions in public space require a very minimal form of attention, an awareness of the presence and movement of others. Without it, we would bump into each other, frequently.

I had a vivid demonstration of this point on a recent commute: I live in East London and regularly use the narrow canal paths for cycling. It was the canal rush hour—lots of walkers with dogs, families with children, joggers as well as cyclists heading home. We were all sharing the towpath with the usual mixture of give and take, slowing to allow passing, swerving around and between each other. Only this time, a woman was walking down the center of the path with her eyes glued to her phone, impervious to all around her. This went well beyond a moment of distraction. Everyone had to duck and weave to avoid her. She’d abandoned the unspoken contract that avoiding collision is a mutual obligation.

This scene is now a daily occurrence for many of us, in shopping centers, station concourses, or on busy streets. Attention is the essential lubricant of urban life, and without it, we’re denying our co-existence in that moment and place. The novelist and philosopher Iris Murdoch writes that the most basic requirement for being good is that a person “must know certain things about his surroundings, most obviously the existence of other people and their claims.”

Attention is what draws us out of ourselves to experience and engage in the world. The word is often accompanied by a verb—attention needs to be grabbed, captured, mobilized, attracted, or galvanized. Reflected in such language is an acknowledgement of how attention is the essential precursor to action. The founding father of psychology William James provided what is still one of the best working definitions:

It is the taking possession by the mind, in clear and vivid form, of one out of what seem several simultaneously possible objects or trains of thought. Focalization, concentration, of consciousness are of its essence. It implies withdrawal from some things in order to deal effectively with others.

Attention is a limited resource and has to be allocated: to pay attention to one thing requires us to withdraw it from others. There are two well-known dimensions to attention, explains Willem Kuyken, a professor of psychology at Oxford. The first is “alerting”— an automatic form of attention, hardwired into our brains, that warns us of threats to our survival. Think of when you’re driving a car in a busy city: you’re aware of the movement of other cars, pedestrians, cyclists, and road signs, while advertising tries to grab any spare morsel of your attention. Notice how quickly you can swerve or brake when you spot a car suddenly emerging from a side street. There’s no time for a complicated cognitive process of decision making. This attention is beyond voluntary control.

The second form of attention is known as “executive”—the process by which our brain selects what to foreground and focus on, so that there can be other information in the background—such as music when you’re cooking—but one can still accomplish a complex task. Crucially, our capacity for executive attention is limited. Contrary to what some people claim, none of us can multitask complex activities effectively. The next time you write an email while talking on the phone, notice how many typing mistakes you make or how much you remember from the call. Executive attention can be trained, and needs to be for any complex activity. This was the point James made when he wrote: “there is no such thing as voluntary attention sustained for more than a few seconds at a time… what is called sustained voluntary attention is a repetition of successive efforts which bring back the topic to the mind.”

Attention is a complex interaction between memory and perception, in which we continually select what to notice, thus finding the material which correlates in some way with past experience. In this way, patterns develop in the mind. We are always making meaning from the overwhelming raw data. As James put it, “my experience is what I agree to attend to. Only those items which I notice shape my mind—without selective interest, experience is an utter chaos.”

And we are constantly engaged in organizing that chaos, as we interpret our experience. This is clear in the famous Gorilla Experiment in which viewers were told to watch a video of two teams of students passing a ball between them. They had to count the number of passes made by the team in white shirts and ignore those of the team in black shirts. The experiment is deceptively complex because it involves three forms of attention: first, scanning the whole group; second, ignoring the black T-shirt team to keep focus on the white T-shirt team (a form of inhibiting attention); and third, remembering to count. In the middle of the experiment, someone in a gorilla suit ambles through the group. Afterward, when asked, half the viewers hadn’t spotted the gorilla and couldn’t even believe it had been there. We can be blind not only to the obvious, but to our blindness.

There is another point in this experiment which is less often emphasized. Ignoring something—such as the black T-shirt team in this experiment—requires a form of attention. It costs us attention to ignore something. Many of us live and work in environments that require us to ignore a huge amount of information—that flashing advert, a bouncing icon or pop-up.

In another famous psychology experiment, Walter Mischel’s Marshmallow Test, four-year-olds had a choice of eating a marshmallow immediately or two in fifteen minutes. While filmed, each child was put in a room alone in front of the plate with a marshmallow. They squirmed and fidgeted, poked the marshmallow and stared at the ceiling. A third of the children couldn’t resist the marshmallow and gobbled it up, a third nibbled cautiously, but the last third figured out how to distract themselves. They looked under the table, sang… did anything but look at the sweet. It’s a demonstration of the capacity to reallocate attention. In a follow-up study some years later, those who’d been able to wait for the second marshmallow had better life outcomes, such as academic achievement and health. One New Zealand study of 1,000 children found that this form of self-regulation was a more reliable predictor of future success and wellbeing than even a good IQ or comfortable economic status.

What, then, are the implications of how digital technologies are transforming our patterns of attention? In the current political anxiety about social mobility and inequality, more weight needs to be put on this most crucial and basic skill: sustaining attention.

*

I learned to concentrate as a child. Being a bookworm helped. I’d be completely absorbed in my reading as the noise of my busy family swirled around me. It was good training for working in newsrooms; when I started as a journalist, they were very noisy places with the clatter of keyboards, telephones ringing and fascinating conversations on every side. What has proved much harder to block out is email and text messages.

The digital tech companies know a lot about this widespread habit; many of them have built a business model around it. They’ve drawn on the work of the psychologist B.F. Skinner who identified back in the Thirties how, in animal behavior, an action can be encouraged with a positive consequence and discouraged by a negative one. In one experiment, he gave a pigeon a food pellet whenever it pecked at a button and the result, as predicted, was that the pigeon kept pecking. Subsequent research established that the most effective way to keep the pigeon pecking was “variable-ratio reinforcement.” Give the pigeon a food pellet sometimes, and you have it well and truly hooked.

We’re just like the pigeon pecking at the button when we check our email or phone. It’s a humiliating thought. Variable reinforcement ensures that the customer will keep coming back. It’s the principle behind one of the most lucrative US industries: slot machines, which generate more profit than baseball, films, and theme parks combined. Gambling was once tightly restricted for its addictive potential, but most of us now have the attentional equivalent of a slot machine in our pocket, beside our plate at mealtimes, and by our pillow at night. Even during a meal out, a play at the theater, a film, or a tennis match. Almost nothing is now experienced uninterrupted.

Anxiety about the exponential rise of our gadget addiction and how it is fragmenting our attention is sometimes dismissed as a Luddite reaction to a technological revolution. But that misses the point. The problem is not the technology per se, but the commercial imperatives that drive the new technologies and, unrestrained, colonize our attention by fundamentally changing our experience of time and space, saturating both in information.

In much public space, wherever your eye lands—from the back of the toilet door, to the handrail on the escalator, or the hotel key card—an ad is trying to grab your attention, and does so by triggering the oldest instincts of the human mind: fear, sex, and food. Public places become dominated by people trying to sell you something. In his tirade against this commercialization, Crawford cites advertisements on the backs of school report cards and on debit machines where you swipe your card. Before you enter your PIN, that gap of a few seconds is now used to show adverts. He describes silence and ad-free experience as “luxury goods” that only the wealthy can afford. Crawford has invented the concept of the “attentional commons,” free public spaces that allow us to choose where to place our attention. He draws the analogy with environmental goods that belong to all of us, such as clean air or clean water.

Some legal theorists are beginning to conceive of our own attention as a human right. One former Google employee warned that “there are a thousand people on the other side of the screen whose job it is to break down the self-regulation you have.” They use the insights into human behavior derived from social psychology—the need for approval, the need to reciprocate others’ gestures, the fear of missing out. Your attention ceases to be your own, pulled and pushed by algorithms. Attention is referred to as the real currency of the future.

*

In 2013, I embarked on a risky experiment in attention: I left my job. In the previous two years, it had crept up on me. I could no longer read beyond a few paragraphs. My eyes would glaze over and, even more disastrously for someone who had spent their career writing, I seemed unable to string together my thoughts, let alone write anything longer than a few sentences. When I try to explain the impact, I can only offer a metaphor: it felt like my imagination and use of language were vacuum packed, like a slab of meat coated in plastic. I had lost the ability to turn ideas around, see them from different perspectives. I could no longer draw connections between disparate ideas.

At the time, I was working in media strategy. It was a culture of back-to-back meetings from 8:30 AM to 6 PM, and there were plenty of advantages to be gained from continuing late into the evening if you had the stamina. Commitment was measured by emails with a pertinent weblink. Meetings were sometimes as brief as thirty minutes and frequently ran through lunch. Meanwhile, everyone was sneaking time to battle with the constant emails, eyes flickering to their phone screens in every conversation. The result was a kind of crazy fog, a mishmash of inconclusive discussions.

At first, it was exhilarating, like being on those crazy rides in a theme park. By the end, the effect was disastrous. I was almost continuously ill, battling migraines and unidentifiable viruses. When I finally made the drastic decision to leave, my income collapsed to a fraction of its previous level and my family’s lifestyle had to change accordingly. I had no idea what I was going to do; I had lost all faith in my ability to write. I told friends I would have to return the advance I’d received to write a book. I had to try to get back to the skills of reflection and focus that had once been ingrained in me.

The first step was to teach myself to read again. I sometimes went to a café, leaving my phone and computer behind. I had to slow down the racing incoherence of my mind so that it could settle on the text and its gradual development of an argument or narrative thread. The turning point in my recovery was a five-week research trip to the Scottish Outer Hebrides. On the journey north of Glasgow, my mobile phone lost its Internet connection. I had cut myself loose with only the occasional text or call to family back home. Somewhere on the long Atlantic beaches of these wild and dramatic islands, I rediscovered my ability to write.

I attribute that in part to a stunning exhibition I came across in the small harbor town of Lochboisdale, on the island of South Uist. Vija Celmins is an acclaimed Latvian-American artist whose work is famous for its astonishing patience. She can take a year or more to make a woodcut that portrays in minute detail the surface of the sea. A postcard of her work now sits above my desk, a reminder of the power of slow thinking.

Just as we’ve had a slow eating movement, we need a slow thinking campaign. Its manifesto could be the poet Rainer Maria Rilke’s beautiful “Letters to a Young Poet”:

To let every impression and the germ of every feeling come to completion inside, in the dark, in the unsayable, the unconscious, in what is unattainable to one’s own intellect, and to wait with deep humility and patience for the hour when a new clarity is delivered.

Many great thinkers attest that they have their best insights in moments of relaxation, the proverbial brainwave in the bath. We actually need what we most fear: boredom.

When I left my job (and I was lucky that I could), friends and colleagues were bewildered. Why give up a good job? But I felt that here was an experiment worth trying. Crawford frames it well as “intellectual biodiversity.” At a time of crisis, we need people thinking in different ways. If we all jump to the tune of Facebook or Instagram and allow ourselves to be primed by Twitter, the danger is that we lose the “trained powers of concentration” that allow us, in Crawford’s words, “to recognize that independence of thought and feeling is a fragile thing, and requires certain conditions.”

I also took to heart the insights of the historian Timothy Snyder, who concluded from his studies of twentieth-century European totalitarianism that the way to fend off tyranny is to read books, make an effort to separate yourself from the Internet, and “be kind to our language… Think up your own way of speaking.” Dropping out and going offline enabled me to get back to reading, voraciously, and to writing; beyond that, it’s too early to announce the results of my experiment with attention. As Rilke said, “These things cannot be measured by time, a year has no meaning, and ten years are nothing.”

*

A recent column in The New Yorker cheekily suggests that all the fuss about the impact of digital technologies on our attention is nothing more than writers’ worrying about their own working habits. Is all this anxiety about our fragmenting minds a moral panic akin to those that swept Victorian Britain about sexual behavior? Patterns of attention are changing, but perhaps it doesn’t much matter?

My teenage children read much less than I did. One son used to play chess online with a friend, text on his phone, and do his homework all at the same time. I was horrified, but he got a place at Oxford. At his interview, he met a third-year history undergraduate who told him he hadn’t yet read any books in his time at university. But my kids are considerably more knowledgeable about a vast range of subjects than I was at their age. There’s a small voice suggesting that the forms of attention I was brought up with could be a thing of the past; the sustained concentration required to read a whole book will become an obscure niche hobby.

And yet, I’m haunted by a reflection: the magnificent illuminations of the eighth-century Book of Kells have intricate patterning that no one has ever been able to copy, such is the fineness of the tight spirals. Lines are a millimeter apart. They indicate a steadiness of hand and mind—a capability most of us have long since lost. Could we be trading our capacity for focus in exchange for breadth of reference? Some might argue that’s not a bad trade. But we would lose depth: artist Paul Klee wrote that he would spend a day in silent contemplation of something before he painted it. Paul Cézanne was similarly known for his trance-like attention to his subject. Madame Cézanne recollected how her husband would gaze at the landscape and tell her, “The landscape thinks itself in me, and I am its consciousness.” The philosopher Maurice Merleau-Ponty describes a contemplative attention in which one steps outside of oneself and immerses oneself in the object of attention.

It’s not just artists who require such depth of attention. Nearly two decades ago, a doctor teaching medical students at Yale was frustrated at their inability to distinguish between types of skin lesions. Their gaze seemed restless and careless. He took his students to an art gallery and told them to look at a picture for fifteen minutes. The program is now used in dozens of US medical schools.

Some argue that losing the capacity for deep attention presages catastrophe. It is the building block of “intimacy, wisdom, and cultural progress,” argues Maggie Jackson in her book Distracted, in which she warns that “as our attentional skills are squandered, we are plunging into a culture of mistrust, skimming, and a dehumanizing merging between man and machine.” Significantly, her research began with a curiosity about why so many Americans were deeply dissatisfied with life. She argues that losing the capacity for deep attention makes it harder to make sense of experience and to find meaning—from which comes wonder and fulfillment. She fears a new “dark age” in which we forget what makes us truly happy.

Strikingly, the epicenter of this wave of anxiety over our attention is the US. All the authors I’ve cited are American. It’s been argued that this debate represents an existential crisis for America because it exposes the flawed nature of its greatest ideal, individual freedom. The commonly accepted notion is that to be free is to make choices, and no one can challenge that expression of autonomy. But if our choices are actually engineered by thousands of very clever, well-paid digital developers, are we free? The former Google employee Tristan Harris confessed in an article in 2016 that technology “gives people the illusion of free choice while architecting the menu so that [tech giants] win, no matter what you choose.”

Despite my children’s multitasking, I maintain that vital human capacities—depth of insight, emotional connection, and creativity—are at risk. I’m intrigued as to what the resistance might look like. There are stirrings of protest with the recent establishment of initiatives such as the Time Well Spent movement, founded by tech industry insiders who have become alarmed at the efforts invested in keeping people hooked. But collective action is elusive; the emphasis is repeatedly on the individual to develop the necessary self-regulation, but if that is precisely what is being eroded, we could be caught in a self-reinforcing loop.

One of the most interesting responses to our distraction epidemic is mindfulness. Its popularity is evidence that people are trying to find a way to protect and nourish their minds. Jon Kabat-Zinn, who pioneered the development of secular mindfulness, draws an analogy with jogging: just as keeping your body fit is now well understood, people will come to realize the importance of looking after their minds.

I’ve meditated regularly for twenty years, but curious as to how this is becoming mainstream, I went to an event in the heart of high-tech Shoreditch in London. In a hipster workspace with funky architecture, excellent coffee, and an impressive range of beards, a soft-spoken retired Oxford professor of psychology, Mark Williams, was talking about how multitasking has a switching cost in focus and concentration. Our unique human ability to remember the past and to think ahead brings a cost; we lose the present. To counter this, he advocated a daily practice of mindfulness: bringing attention back to the body—the physical sensations of the breath, the hands, the feet. Williams explained how fear and anxiety inhibit creativity. In time, the practice of mindfulness enables you to acknowledge fear calmly and even to investigate it with curiosity. You learn to place your attention in the moment, noticing details such as the sunlight or the taste of the coffee.

On a recent retreat, I was beside a river early one morning and a rower passed. I watched the boat slip by and enjoyed the beauty in a radically new way. The moment was sufficient; there was nothing I wanted to add or take away—no thought of how I wanted to do this every day, or how I wanted to learn to row, or how I wished I was in the boat. Nothing but the pleasure of witnessing it. The busy-ness of the mind had stilled. Mindfulness can be a remarkable bid to reclaim our attention and to claim real freedom, the freedom from our habitual reactivity that makes us easy prey for manipulation.

But I worry that the integrity of mindfulness is fragile, vulnerable both to commercialization by employers who see it as a form of mental performance enhancement and to consumer commodification, rather than contributing to the formation of ethical character. Mindfulness as a meditation practice originates in Buddhism, and without that tradition’s ethics, there is a high risk of it being hijacked and misrepresented.

Back in the Sixties, the countercultural psychologist Timothy Leary rebelled against the conformity of the new mass media age and called for, in Crawford’s words, an “attentional revolution.” Leary urged people to take control of the media they consumed as a crucial act of self-determination; pay attention to where you place your attention, he declared. The social critic Herbert Marcuse believed Leary was fighting the struggle for the ultimate form of freedom, which Marcuse defined as the ability “to live without anxiety.” These were radical prophets whose words have an uncanny resonance today. Distraction has become a commercial and political strategy, and it amounts to a form of emotional violence that cripples people, leaving them unable to gather their thoughts and overwhelmed by a sense of inadequacy. It’s a powerful form of oppression dressed up in the language of individual choice.

The stakes could hardly be higher, as William James knew a century ago: “The faculty of voluntarily bringing back a wandering attention, over and over again, is the very root of judgment, character, and will.” And what are we humans without these three?

How To Recognize When Your Society Is Suffering A Dramatic Decline

 

By Brandon Smith

Source: Alt-Market.com

When historians and analysts look at the factors surrounding the collapse of a society, they often focus on the larger events and indicators — the moments of infamy. However, I think it’s important to consider the reality that large scale societal decline is built upon a mixture of elements, prominent as well as small. Collapse is a process, not a singular event. It happens over time, not overnight. It is a spectrum of moments and terrible choices, set in motion in most cases by people in positions of power, but helped along by useful idiots among the masses. The decline of a nation or civilization requires the complicity of a host of saboteurs.

So, instead of focusing on the top down approach, which is rather common, let’s start from the foundations of our culture to better understand why there is clear and definable destabilization.

Declining Moral Compass

There is always a conflict between personal gain and personal conscience — this is the nature of being human. But in a stable society, these two things tend to balance out. Not so during societal decline, as personal gain (and even personal comfort and gratification) tends to greatly outweigh the checks and balances of moral principles.

People often mistake the term “morality” to be a religious creation, but this is not what I am necessarily referring to. The concepts of “good” and “evil” are archetypal — that is to say they are psychologically inherent in most human beings from the moment of birth. This is not a matter of faith, but a matter of fact, observed by those in the field of psychology and anthropology over the course of a century of study.  How we relate to these concepts can be affected by our environment and upbringing, but for the most part, our moral compass is psychologically ingrained. It is up to us to either follow it or not follow it.

Watching how people handle this choice is a bit of a hobby of mine, and I do take notes. You can learn a lot about the state of your environment by observing what people around you tend to do when faced with the conflict of personal gain versus personal conscience. It is saddening to admit that even though I live in rural America, where you are more likely to find self-reliance and cultural stability, I can still see a faltering nation bleeding through.

I have seen supposedly good people act dishonestly in business agreements. I have seen local institutions scam hardworking citizens. I have seen a court system rife with bias and a “good old boy” attitude of favoritism. I have seen local companies pretend to be benevolent contributors to the community while at the same time running constant frauds and rackets. I have even seen a few people within the liberty movement itself put the movement at risk with their own avarice, gluttony, narcissism and sociopathy.

Again, it is important to make a note of such people and institutions, for as the system continues its downward spiral it is these people that will present the greatest threat to the innocent.

As Carl Jung notes in his book The Undiscovered Self, there is always a contingent of latent sociopaths and psychopaths within any culture, usually about 10% of the population. In normal times, most of them are forced into moral acclimation by the rest of the populace. But in times of decline, they seem to leak out of the woodwork like a slimy fungus. During heightened collapse, they no longer have to pretend to be upstanding and they show their true colors.

Most dangerous is when latent sociopaths or full blown sociopaths assume roles of leadership or power during the worst of times. With everyone distracted by their own plight, these people can become a cancer, infecting everything with their narcissistic pursuits and causing destruction in their wake.

Disinterest In Rewarding Conscience

During wider cultural collapse, it can become “fashionable” to see acts of principle as something to be scoffed at or ridiculed, or even to see them as threats to the status quo. The concept of “going along to get along” takes precedence over doing what is right even when it is hard, and this attitude is not confined to the less honest people within society.

As a system collapses, a fog of apathy can result. Good people can become passive, scrambling to their individual corner of the world and hoping evil times will simply pass them by. The phrase “I just want to put all this behind me” is spoken regularly; but as we ignore the trespasses of terrible men and women, we also enable them. How? Because by doing nothing we allow them to continue their criminality, and we subject future persons and generations to victimization.

When doing the right thing is treated as laughable or “crazy” by what seems like a majority in the midst of widespread corruption, you are truly in the middle of a great decline.

In Christian circles, the idea of “the remnant” is sometimes spoken of. In Christian terms, this usually represents a minority of true believers surviving a tumultuous and immoral era. I see “the remnant” not so much as a contingent of Christians alone, but as a contingent of people that continue to maintain their principles and conscience when faced with unprecedented adversity. In the worst of times, these people remain stalwart, even if they are ridiculed for it.

Disinterest In Independent Effort

It is said that in this world there are two kinds of people — leaders and followers.  I’m not so sure about that, but I can see why this philosophy is promoted; it helps evil people in power stay in power by encouraging passive acceptance.

I would say that there are in fact two kinds of people in this world — people who want to control others and the people that just want to be left alone. In life sometimes we are both leaders and followers; we just have to be sure that when we lead we lead by example and not by force, and when we follow, we follow someone worth a damn.

In any case, passivity is not a solution to determining our roles in society. In most situations, independent action is required by every person to make the world a better place. Yet, in an era of systemic crisis, it is usually independent effort that is the first thing to go out the window. Millions upon millions of people wait around for someone, anyone, to tell them what they should be doing and how they should be doing it. In this way, society finds itself in stasis, frozen in a position of inaction.  Poisonous collectivism wins through mass aggression, but also through mass passivity.

In fact, when individualists do take action they can be admonished for it during times of societal breakdown, even if their actions have the potential to solve a problem. The idea that one man or woman (or a small group of people) could do anything about anything is sneered at as “fantasy” or “delusion.”  But mass movements of citizens working towards a practical goal are rare, and even more rare is when these movements are not controlled or manipulated to benefit the established order. It is not mass movements that change the world for the better, but individual people and small organizations of the dedicated, acting without permission and without administration.

It is these individuals and small groups that, over time and through relentless effort, inspire a majority to do what is necessary and right. It is these people that inspire others to finally take leadership in their own lives.

Individual Self-Isolation

I write often on the plight of the individual and individual rights within society, and I continue to see the factor of the individual as the most important element in any culture. A culture based on protecting and nurturing individualism and voluntarism is the only culture, in my view, that will ever be successful at avoiding full spectrum collapse. That said, the downside to overt individualism is the danger of self isolation. That is to say, when true individuals only concern themselves with their personal circumstances and ignore the circumstances of the rest of the world, they eventually set themselves up to be crushed by that world.

Organization on a voluntary basis is not only healthy but vital to the longevity of a society. The more people turn in on themselves and only care about their own general conditions, the easier it is for evil people to do evil things unnoticed. Also, self-isolation in the wake of collapse sets individuals up for failure, as no one is capable of surviving without at least some help from a wider pool of knowledge and talents.

In a system based on corruption, the establishment will encourage self isolation as a means to control the populace. Or, they will offer a false choice, between self isolation versus mindless collectivism. The truth is there is always a middle ground. Voluntary organization and individualism are not mutually exclusive. I call this the “difference between community and collectivism.” A community does not supplant the individual, while a collective requires the complete erasure of individual pursuits and thought.

If you find yourself surrounded by people who refuse any organization, even practical and voluntary organization in the face of instability, then your society may be in the latter stages of a collapse.

Disaster Denial

Even as a crisis or collapse unfolds, if a society actually reels or reacts to it and takes note of the problem, there is hope for that society. If, however, that society willfully ignores the danger and denies it exists when presented with overwhelming evidence, then that society will likely suffer complete disintegration and will probably have to start all over from scratch — hopefully with a set of principles and ideals based on conscience and honor.

The strength of a culture can be measured by its willingness to self reflect. Its survival can be determined by its willingness to accept its flaws when they arise and its willingness to repair the damage done. Self-aware societies are difficult to corrupt or control. Only in denial can people be easily manipulated and enslaved.

If you cannot accept the reality of the abyss, you cannot move to avoid it or prepare yourself to survive the fall. I see this issue as perhaps the single most important element in the fight to save the portions of our society worth saving. Educating people on the blatant facts behind our own national decline can dissolve the wall of denial, and perhaps we will find when disaster strikes that there are far more awake and aware individuals ready to act than we originally thought.

US Technological Transformations and the Narcotic-Fueled Genocide of American Workers

By James Petras

Source: The Unz Review

Introduction

During his recent visit to New Hampshire on 3/20/18, President Trump declared once again that the US is facing a ‘drug epidemic’. This time he advocated the death penalty for criminal drug dealers as the solution to a national crisis that has killed over 1 million Americans since the 1990s (when the blockbuster prescription opiate OxyContin was first released on the market). Trump promised that the Justice Department would develop the most severe penalties for criminal drug traffickers, by which he meant foreigners. He argued that his proposed “Wall” along the US-Mexico border would cut the flow of drugs responsible for the ongoing addiction of millions of US citizens – as though the prescription opiate addiction epidemic resulted from a foreign invasion, and not corporate decisions from Big Pharma.

President Trump’s claim that 116 ‘drug deaths’ occur every day (42,000 a year) is a major underestimate. In 2017 alone, over 64,000 drug overdose deaths were reported in official statistics (with many unreported cases signed off as natural or undetermined, especially in counties too poor to afford autopsies and expensive forensic toxicology). Another 4 million Americans, at least, are currently addicted to opioids and at risk for overdose.

In comparative terms, more American workers were killed or devastated by narcotics (mostly via prescription) in 2017 alone than in the entire decade of the Vietnam War, with its 58,000 dead and 500,000 wounded. In 2017, 40,000 Americans died in motor vehicle accidents and another 39,000 by gun violence – and these statistics are not broken down to include vehicular accidents due to drug intoxication or gun violence over drugs. Prescription or illegal opiates, alone or mixed with other sedative drugs, like Valium, or alcohol, are the most prominent and preventable cause of premature death in the United States today.

This pattern is unique to the United States, where the irresponsible medical prescription of highly addicting narcotics has been the primary portal of entry into the degrading life of addiction for millions. Despite President Trump’s claims, the addiction crisis is not a product of urban Afro-American street dealers or Mexican narco-traffickers: This uniquely American crisis has been created and fueled by billionaire-owned US pharmaceutical corporations, which produced, distributed and wildly profited from legal narcotics. They were aided by the irresponsible prescription practice of tens of thousands of doctors and other ‘providers’ who introduced millions of vulnerable patients to the world of narcotic dependency – including youngsters with sports injuries and workers with job-related pain. These are physicians and medical providers who rarely stopped to examine their own responsibility, even when their otherwise healthy patients overdosed or were destroyed by addiction. It is especially outrageous that doctors and ‘Big Pharma’ worked hand in hand for over 20 years to create this epidemic, enjoying wild profits and almost total legal immunity. Few have dared to openly question their irresponsibility and greed. In the poorest and most vulnerable areas of this country, the most irresponsible and unaccountable incompetence has replaced real medical care and created a health care apartheid.

The Food and Drug Administration (FDA) and the Drug Enforcement Administration (DEA) have protected the corporate drug traffickers and ensured the manicured and cultured narco-bosses the highest rates of return on their products. These polished pushers have their names engraved on the walls of museums and opera houses around the country.

The majority of Presidential, Federal, State and municipal candidates from both major parties have received millions of dollars in electoral campaign funds from these huge legal narcotic manufacturers and distributors, as well as from physicians and other representatives of the ‘pain-treatment industry’. Over the past decades, politicians have openly or secretly opposed or weakened legislation designed to address this crisis.

Why not just ask President Trump to direct his Justice Department to impose the death penalty on the board of directors of the big corporate narcotic manufacturers or distributors or on the CEOs of major ‘pain clinics’ or on the owners of local rural ‘health centers’ that drove the villagers of West Virginia into their life-destroying downward spirals?

When will the DEA finally storm the medical centers to arrest the over-prescribing ‘providers’ of narcotics and benzodiazepine tranquilizers (a very common deadly combination)?

When will the SWAT teams seize the vacation homes of the CEOs of major US hospitals where the convenient and fake ideology of promising a ‘pain-free’ experience (‘make it Zero on the Pain Scale’) led to the generalized promotion of highly addicting narcotics for minor injuries, arthritic pain, or chronic back discomfort due to work or obesity? Responsible alternatives existed and were used in the rest of the world – largely untouched by this prescription-fueled crisis.

No doubt what President Trump has in mind is something else: the expulsion of Latin American workers under the pretext of going after the drug dealers and the even more massive incarceration of petty street dealers in the African American community.

Trump will then turn to further monitoring and arresting small-scale American marijuana farmers, who earn a basic income from growing a product that many believe is safe, non-addicting, and significantly reduces demand for dangerous narcotics.

As ugly as this all seems, the complicity of the political, economic and medical elites in exponentially spreading deadly narcotics among the poor, working class and downwardly mobile middle class points to a deeper and more sinister policy goal: the systematic elimination of millions of American workers made redundant in the new economy. This is a ‘gentler genocide’, where millions of workers die prematurely seeking an escape from pain as they have been replaced by a new technology and a new ideology: Robots, artificial intelligence and digitalization have rendered them disposable, while the out-sourcing of work to low-paid overseas laborers and immigrants has guaranteed unimaginable profits for the elite decision makers.

This highly profitable process, benefiting the political, pharmaceutical, financial, police and judicial elites, conveniently blames the victims, a significant proportion of whom come from the poor and working class in this country, including white rural and small town addicts, especially youth, stuck at minimum-wage jobs with no prospects of a decent future – injured construction workers, 15% of whom abuse prescription narcotics for work-related injuries, as well as the marginalized petty drug dealers from the urban slums and desperate Latino immigrants forced to accommodate the cartels. These people have few rights and are easily monitored, incarcerated, expelled and written off in one-line obituaries.

The narcotic-fueled genocide has grown out of a calculated corporate strategy meant to cull and subdue a huge population of potentially restive marginalized workers and their families, blaming the overdosing victims for their own ‘irresponsible’ choices, their reliance on prescription opiates, their lack of access to competent medical care, and their untimely deaths as though this were all a collective suicide as the great nation marches forward.

The higher the death toll among marginalized Americans, the greater the reliance on political distractions and racist deceptions. President Trump loudly blames street-level retail distributors, while ignoring the links between tax-exempt mega-billionaires who have profited from the shortened life-expectancies of addicted workers (scores of billions of dollars already saved in future pension and health care expenses) and the millions fired for addiction and denied jobless benefits and treatment. Trump has yet to even mention the actions of the legal pharma-medical industry that set this in motion.

Meanwhile, the Democratic Party leaders denounce the worker-victims of addiction and their communities as ‘irresponsible and racist’, for having believed the populist rhetoric of candidate Trump. Trump’s most intense rural areas of support coincided with areas of the worst opioid addiction and suicide rates. Trump’s rival, Hillary Clinton wrote off scores of millions of vulnerable Americans as ‘deplorables’ and never once addressed the addiction crisis that grew exponentially during her husband’s administration.

Since the implementation of NAFTA during the 1990s, scores of millions of American workers have been relegated to unstable, low-paid jobs, deprived of health benefits and subject to grueling work, prone to physical and mental injuries. Workplace injuries set the stage for the prescription narcotic crisis. Even worse, today workers are constantly distracted by electronic gadgets at the workplace, with their orders from above arriving digitally. These highly profitable gadgets have created enormous distractions and contributed to workplace death and injuries. The plaything of choice for the masses, the iPhone, has added to the addiction crisis by increasing the rate of injury. This mind-numbing distraction, produced abroad at incredible profit, has played an unexplored role in the increase in premature death in the US.

The corporate narcotic elites, like the ultra-cultured Sackler clan, owners of Purdue Pharma, and their allies in the finance sector, support the diverse ideological distractions fashioned by their politician pawns: Eager to please her donor-owners, Hillary Clinton and the Democrats blame the working class for their backwardness and genetic propensity to addiction and degradation. Meanwhile, President Trump and the Republicans blame ‘outside’ suppliers and distributors including Mexican narco-cartels, illegal immigrant traffickers, black urban street dealers and now point to overseas Chinese fentanyl labs – as though the entire crisis came from the outside. Trump’s approach flies in the face of the unquestionable source of most narcotic addiction in the US: Irresponsible prescribing of highly addicting legal narcotics.

No other industrialized country is experiencing this scale of addiction and premature death. No other industrialized country relies on a private, for-profit, unregulated system of delivering medical care to its citizens. Only the US.

Both elite political parties avoid the basic issue of the long-term, large-scale structural imperatives underlying the transformation of the US work places. They refuse to address the marginalization of tens of millions of American workers and their families, made disposable by corporate economic and political decisions.

The US corporate elite are completely incapable of developing, let alone favoring, any policy that addresses the needs of millions of surplus office and factory workers and their family members replaced by new technology and ‘global’ economic policies. The American financial and political elite is not about to support an economic, political and cultural ‘GI’ bill to save the scores of millions shoved to the wayside in their rush to obscene wealth and power.

The unstated, but clearly implemented, ‘final solution’ is a Social Darwinian policy of active and passive neglect, the unleashing of profitable prescription narcotics into the population of vulnerable disposable workers, offering them a convenient, painless way out – the opioid solution to the over-population problem of redundant rural and small town ‘Helots’. The political elite’s willing complicity with Big Pharma, the medical profession, the financial oligarchs and the prison-industrial complex has transformed the country in many ways. Shortened lives and depopulation of rural and small town communities translates into lower demand for public services, such as schools, health care, pensions and housing. This is guaranteeing a greater concentration of national wealth in the hands of a tiny elite. The financial press has openly celebrated the projected decrease in pension liabilities as a result of the drop in worker life expectancy.

The ongoing mass genocide by opioids may have started to arouse popular discontent among working people who do not want to continue dying young and miserable! Social services and child protective services for the millions of orphaned or abandoned children of this crisis have been demanding real policies. Unfortunately, the usual platitudes and failed policies prevail. Drug education and ‘opioid addiction treatment’ programs (currently among the largest expenses in some union health plans) are pointless Band-Aids when confronted by the larger policy decisions fueling this crisis. Nevertheless, thousands of health care professionals are beginning to resist corporate pressure to prescribe cheap opioids – and fight for more expensive, but less dangerous, alternatives for addressing their patients’ pain. Even if all medical providers stopped over-prescribing narcotics today, there are still millions of addicts already created by past practice, who seek the most deadly street drugs, like fentanyl, to feed their addiction.

Politicians now publicly denounce ‘Big Pharma’, while privately winking at the lobbyists and accepting millions from their ‘donor-owners’.

Public critics in the corporate media are quick to condemn the workers’ susceptibility to narcotic addiction but not the underlying causative imperatives of global capitalism.

Mainstream academics celebrate corporate technological advances with occasional neo-Malthusian warnings about the dangers of millions of redundant workers, while ignoring the profit-driven role of narcotics in reducing the social threat of excess workers!

Finally, the role of an elite and respected profession must be re-evaluated in historical context: In the 1930s, German doctors helped develop an ideology of ‘racial hygiene’ and a technology to demonize and eliminate millions of human beings deemed redundant and inferior, through overwork in slave camps, starvation and active genocide – serving the ambitions of Nazi expansionism and deriving significant profit for select individuals and corporations. US physicians and the broader medical community have less consciously assisted in the ongoing ‘culling of the herd’ of American laborers and rural residents rendered superfluous and undesirable by the decisions of a global oligarchy increasingly unwilling to share public wealth with its masses. There are similarities.

Once-prosperous industrial cities and towns, as well as rural villages, in the US have seen marked declines in population and a premature death crisis among those who remain.

This must be reversed.