America’s ‘Unlimited Imperialists’

 

By Francis Boyle

Source: Consortium News

Historically, the latest eruption of American militarism in the 21st Century is akin to America’s opening of the 20th Century by means of the U.S.-instigated Spanish-American War of 1898.

The then-Republican administration of President William McKinley grabbed its colonial empire from Spain in Cuba, Puerto Rico, Guam, and the Philippines; inflicted a near-genocidal war on the Filipino people; and at the same time illegally annexed the Kingdom of Hawaii, subjecting the Native Hawaiian people (who call themselves the Kanaka Maoli) to genocidal conditions.

Additionally, McKinley’s military and colonial expansion into the Pacific was designed to secure America’s economic exploitation of China pursuant to the euphemistic rubric of the “open door” policy. But over the next four decades America’s aggressive presence, policies, and practices in the so-called “Pacific” Ocean would ineluctably pave the way for Japan’s attack at Pearl Harbor on Dec. 7, 1941, and thus precipitate America’s entry into the ongoing Second World War.

Today, a century later, the serial imperial aggressions launched, waged, and menaced by the neoconservative Republican Bush Junior administration, then the neoliberal Democratic Obama administration, and now the reactionary Trump administration threaten to set off World War III.

This Time the Stakes are Higher

By shamelessly exploiting the terrible tragedy of September 11, the Bush Junior administration set forth to steal a hydrocarbon empire from the Muslim states and peoples of color living in Central Asia, the Middle East and Africa under the bogus pretexts of (a) fighting a war against “international terrorism” or “Islamic fundamentalism”; and/or (b) eliminating weapons of mass destruction; and/or (c) promoting democracy; and/or (d) self-styled humanitarian intervention and its avatar “responsibility to protect” (R2P).

Only this time the geopolitical stakes are infinitely greater than they were a century ago:  control and domination of the world’s hydrocarbon resources and thus the very fundaments and energizers of the global economic system – oil and gas.

The Bush Junior and Obama administrations targeted the remaining hydrocarbon reserves of Africa, Latin America (e.g., the Pentagon’s reactivating the Fourth Fleet in 2008), and Southeast Asia for further conquest and domination, together with the strategic choke-points at sea and on land required for their transportation (e.g., Syria, Yemen, Somalia, Djibouti).  Today the U.S. Fourth Fleet threatens oil-rich Venezuela and Ecuador, along with Cuba.

Toward accomplishing that first objective, in 2007, the neoconservative Bush Junior administration announced the establishment of the U.S. Pentagon’s Africa Command (AFRICOM) in order to better control, dominate, steal, and exploit both the natural resources and the variegated peoples of the continent of Africa, the very cradle of our human species.

In 2011 Libya and the Libyans proved to be the first victims to succumb to AFRICOM under the neoliberal Obama administration, thus demonstrating the truly bi-partisan and non-partisan nature of U.S. imperial foreign policy decision-making. Let us put aside, as beyond the scope of this article, the American conquest, extermination, and ethnic cleansing of the Native Americans from the face of the continent.

Since America’s instigation of the Spanish-American War in 1898, U.S. foreign policy decision-making has been alternately conducted by reactionary imperialists, conservative imperialists, and liberal imperialists for the past 120 years and counting.

Trump is just another representative of U.S. imperialism and neoliberal capitalism. He forthrightly and proudly admitted that the United States is in the Middle East in order to steal its oil. At least he was honest about it, unlike his predecessors, who lied about the matter going back to President George Bush Sr. and his 1991 war against Iraq for Persian Gulf oil. Just recently, Trump publicly threatened illegal U.S. military intervention against oil-rich Venezuela, and now he’s poised to strike Syria.

It’s About Power, Not Just Resources

But oil and other resources are not the only U.S. motive. Enforcing its global power and undermining or removing leaders who defy it are also at play.

As my teacher, mentor, and friend, the late, great Professor Hans Morgenthau, who developed the concept of “unlimited imperialism” in his seminal 1948 treatise on international politics, Politics Among Nations, put it:

          “The outstanding historic examples of unlimited imperialism are the expansionist policies of Alexander the Great, Rome, the Arabs in the seventh and eighth centuries, Napoleon I, and Hitler. They all have in common an urge toward expansion which knows no rational limits, feeds on its own successes and, if not stopped by a superior force, will go on to the confines of the political world. This urge will not be satisfied so long as there remains anywhere a possible object of domination–a politically organized group of men which by its very independence challenges the conqueror’s lust for power. It is, as we shall see, exactly the lack of moderation, the aspiration to conquer all that lends itself to conquest, characteristic of unlimited imperialism, which in the past has been the undoing of the imperialistic policies of this kind…”

Since September 11, 2001, it is the Unlimited Imperialists along the lines of Alexander, Rome, Napoleon, and Hitler who have been in charge of conducting American foreign policy decision-making.  The factual circumstances surrounding the outbreaks of both the First World War and the Second World War currently hover like twin Swords of Damocles over the heads of all humanity.

 

Francis Boyle is a professor of international law at the University of Illinois Urbana-Champaign. Among his many books is “Destroying World Order.”

The ‘Values,’ ‘Vision,’ and ‘Democracy’ of an Inauthentic Opposition

Average Americans, whose economic survival is threatened, have no political party to represent them, including deceptive Democrats who claim to be their champions and blame others when their deception fails, says Paul Street.

By Paul Street

Source: Consortium News

Never underestimate the capacity of the United States’ Inauthentic Opposition Party, the corporate Democrats, for self-congratulatory delusion and the externalization of blame.

Look, for example, at the Democratic National Committee’s (DNC) recently filed 66-page lawsuit against Russia, WikiLeaks, and the 2016 Donald Trump campaign. The document accuses Russia of “mount[ing] a brazen attack on the American democracy,” “destabiliz[ing] the U.S. political environment” on Trump’s (and Russia’s) behalf, and “interfering with our democracy….”

“The [RussiaGate] conspiracy,” the DNC Complaint says, “undermined and distorted the DNC’s ability to communicate the [Democratic] party’s values and vision to the American electorate” and “sowed discord within the Democratic Party at a time when party unity was essential…”

Yes, Russia, like numerous other nations living under the global shadow of the American Superpower, may well have tried to have some surreptitious say in the 2016 U.S. presidential election. (Why wouldn’t the Kremlin have done that, given the very real and grave threats Washington and its Western NATO allies have posed for many years to post-Soviet-era Russian security and peace in Eastern Europe?)

Still, charging Russia with interfering with US-“American democracy” is like me accusing the Washington Capitals’ star left winger Alex Ovechkin of interfering with my potential career as a National Hockey League player (I’m middle-aged and can’t skate backwards). The U.S. doesn’t have a functioning democracy to undermine, as numerous careful studies have shown.

We have, rather, a corporate and financial oligarchy, an open plutocracy. U.S.-Americans get to vote, yes, but the nation’s “unelected dictatorship of money” reigns nonetheless in the United States, where, as leading liberal political scientists Benjamin Page (Northwestern) and Martin Gilens (Princeton) find, “government policy…reflects the wishes of those with money, not the wishes of the millions of ordinary citizens who turn out every two years to choose among the preapproved, money-vetted candidates for federal office.”

Our Own Oligarchs

Russia and WikiLeaks “destabilized the U.S. political environment”? Gee, how about the 20 top oligarchic U.S. mega-donors who invested more than $500 million combined in disclosed campaign contributions (we can only guess at how much “dark,” that is, undisclosed, money they gave) to candidates and political organizations in the 2016 election cycle? The 20 largest organizational donors also gave a total of more than $500 million. The foremost plutocratic election investors included hard right-wing billionaires like casino owner Sheldon Adelson ($83 million disclosed to Republicans and right-wing groups), hedge-fund manager Paul Singer ($26 million to Republicans and the right), hedge-fund manager Robert Mercer ($26 million), and packaging mogul Richard Uihlein ($24 million).

How about the multi-billionaire Trump’s own real estate fortune, which, combined with the remarkable free attention the corporate media oligopoly granted him, helped catapult the orange-tinted fake-populist beast past his more traditional Republican primary opponents? And what about the savagely unequal distribution of wealth and income in Barack Obama’s America, so extreme in the wake of the Great Recession that Hillary’s primary campaign rival Bernie Sanders could credibly report that the top tenth of the upper U.S. 1% possessed nearly as much wealth as the nation’s bottom 90%? Such extreme disparity helped doom establishment, Wall Street- and Goldman Sachs-embroiled candidates like Jeb Bush, Marco Rubio, and Mrs. Clinton in 2016. Russia and WikiLeaks did not create that deep, politically- and neoliberal-policy-generated socioeconomic imbalance.

Double Vision

And just what were the Democratic Party “values and vision” that Russia, Trump, and WikiLeaks supposedly prevented the DNC and the Clinton team from articulating in 2016? As the distinguished political scientist and money-politics expert Thomas Ferguson and his colleagues Paul Jorgensen and Jie Chen noted in an important study released three months ago, the Clinton campaign “emphasized candidate and personal issues and avoided policy discussions to a degree without precedent in any previous election for which measurements exist…. [I]t deliberately deemphasized issues in favor of concentrating on what the campaign regarded as [Donald] Trump’s obvious personal weaknesses as a candidate.” Strangely enough, the Twitter-addicted reality television star Trump had a lot more to say about policy than the former First Lady, U.S. Senator, and Secretary of State Hillary Clinton, a wonkish Yale Law graduate.

The Democrats’ “values and vision” in 2016 amounted pretty much to the accurate but hardly inspiring or mass-mobilizing notion that Donald Trump was an awful person who was unqualified for the White House. Clinton ran almost completely on candidate character and quality. This was a blunder of historic proportions, given Clinton’s own highly problematic character brand. Any campaign needs a reasonably strong policy platform to stand on in case of candidate difficulties.

By Ferguson, Jorgensen, and Chen’s account, Hillary’s peculiar policy silence was about U.S. oligarchs’ campaign money. Thanks to candidate Trump’s bizarre nature and his declared isolationism and nationalism, Clinton achieved remarkable campaign finance success with normally Republican-affiliated capitalist sectors, which are less disposed than their more liberal counterparts to abide the standard, progressive-sounding policy rhetoric of Democratic Party candidates.

One ironic but “fateful consequence” of her curious connection to conservative business interests was her “strategic silence about most important matters of public policy. … Misgivings of major contributors who worried that the Clinton campaign message lacked real attractions for ordinary Americans were rebuffed. The campaign,” Ferguson, Jorgensen, and Chen wrote, “sought to capitalize on the angst within business by vigorously courting the doubtful and undecideds there, not in the electorate.”

Other Clinton mistakes included failing to purchase television ads in Michigan, failing to set foot in Wisconsin after the Democratic National Convention, and getting caught telling wealthy New York City campaign donors that Trump’s white supporters were “a basket of” racist, sexist, nativist, and homophobic “deplorables.” This last misstep was a Freudian slip of the neoliberal variety. It reflected and advanced the corporate Democrats’ longstanding alienation of and from the nation’s rural, industrial, and ex-industrial “heartland.”

Fake Progressives

As left historian Nancy Fraser noted after Trump was elected, the Democrats, since at least the Bill Clinton administration, had joined outwardly progressive forces like feminism, antiracism, multiculturalism, and LGBTQ rights to “financial capitalism.” This imparted liberal “charisma” and “gloss” to “policies that …devastated…what were once middle-class lives” by wiping out manufacturing, weakening unions, slashing wages, and increasing the “precarity of work.”

To make matters worse, Fraser rightly added, the “progressive neoliberal” blue- and digital-zone Democrats “compounded” the “injury of deindustrialization” with “the insult of progressive moralism,” which rips red- and analog-zone whites as culturally retrograde (recall candidate Obama’s problematic 2008 reflection on how rural and small-town whites “cling to religion and guns”) and yet privileged by the simple color of their skin.

Such insults from elite, uber-professional class neo-liberals like Obama (Harvard Law) and the Clintons (Yale Law) would sting less in the nation’s “flyover zones” if those uttering them had not spent their sixteen years in the White House governing blatantly in accord with the wishes of Wall Street, Silicon Valley, and the leading multinational corporations. Like Bill Clinton’s two terms, the Obama years were richly consistent with Sheldon Wolin’s early 2008 description of the Democrats as an “inauthentic opposition” whose dutiful embrace of “centrist precepts” meant they would do nothing to “substantially revers[e] the drift rightwards” or “significantly alter the direction of society.”

The fake-“progressive” Obama presidency opened with the expansion of Washington’s epic bailout of the very parasitic financial elites who recklessly sparked the Great Recession (this with no remotely concomitant expansion of federal assistance to the majority middle- and working-class victims), the abandonment of campaign pledges to restore workers’ right to organize (through the immediately forgotten Employee Free Choice Act), and the kicking of Single Payer health care advocates to the curb as Obama worked with the big drug and insurance syndicates to craft a corporatist, profit-friendly health insurance reform. Obama’s second term ended with him doggedly (if unsuccessfully) championing the arch-authoritarian global-corporatist Trans-Pacific Partnership.

This Goldman Sachs- and Citigroup-directed policy record was no small part of what demobilized the Democrats’ mass electoral base in ways that “destabilized the U.S. political environment” to the benefit of the reactionary populist Trump, whose Mercer family-backed proto-fascistic strategist and Svengali Steve Bannon was smartly attuned to the Democrats’ elitist class problem.

There was a major 2016 presidential candidate who ran with genuinely progressive “values and vision” – Bernie Sanders. The most remarkable finding in Ferguson, Jorgensen, and Chen’s study is that the self-declared “democratic socialist” Sanders came tantalizingly close to winning the Democratic presidential nomination with no support from Big Business. The small-donor Sanders campaign was “without precedent in American politics not just since the New Deal, but across virtually the whole of American history … a major presidential candidate waging a strong, highly competitive campaign whose support from big business was essentially zero.”

Sanders was foiled by the big-money candidate Clinton’s advance control of the Democratic National Committee and convention delegates. Under a formal funding arrangement worked up with the DNC in late September 2015, the campaign of the depressing “lying neoliberal warmonger” Hillary was granted advance control of all the DNC’s “strategic decisions.” The Democratic Party’s presidential caucuses and primaries were rigged against Sanders in ugly ways that provoked a different lawsuit last year – a class-action suit against the DNC on behalf of Sanders’ supporters. The complaint was dismissed by a federal judge who ruled on the side of DNC lawyers by agreeing that the DNC was within its rights to violate the party’s charter and bylaws by selecting its candidate in advance of the primaries.

How was that for the noble “values and vision” that “American democracy” inspires atop the not-so-leftmost of the nation’s two major and electorally viable political parties?

Under Cover of RussiaGate

That’s what “sowed discord within the Democratic Party at a time when party unity was essential…” Russia didn’t do it. Neither did WikiLeaks or the Trump campaign. The Clinton campaign and the Democratic Party establishment – themselves funded by major U.S. oligarchs like San Francisco hedge-fund billionaire Tom Steyer – did that on their own.

Could Sanders – the most popular politician in the U.S. (something rarely reported in a “mainstream” corporate media that could barely bring itself to cover his giant campaign rallies even as it obsessed over Trump’s every bizarre tweet) – have defeated the orange-tinted beast in the general election? Perhaps, though much of the oligarchic funding Hillary got would have gone to Trump if “socialist” Bernie had been the Democratic nominee. It is unlikely that Sanders could have accomplished much as president in a nation long controlled by the capitalist oligarchy in numerous ways that go far beyond campaign finance alone.

Meanwhile, under the cover of RussiaGate, the still-dismal and dollar-drenched corporate-imperial Democrats seem content to continue tilting to the center-right, purging Sanders-style progressives from the party’s leadership and citing the party’s special election victories (Doug Jones and Conor Lamb) against deeply flawed and Trump-backed Republicans in two bright-red voting districts (the state of Alabama and a fading Pennsylvania canton) as proof that tepid neoliberal centrism is still (even after Hillary’s stunning defeat) the way to go.

Along the way, the Inauthentic Opposition’s candidate roster for the upcoming Congressional mid-term election is loaded with an extraordinary number of contenders with U.S. military and intelligence backgrounds, consistent with Congressional Democrats’ repeated votes to give massive military and surveillance-state funds and power to a president they consider (accurately enough) unbalanced and dangerous.

The trick, the neoliberal “CIA Democrats” think, is to run conservative, Wall Street-backed imperial and National Security State veterans who pretend (see Eric Draitser’s recent piece on “How Clintonites Are Manufacturing Faux Progressive Congressional Campaigns”) to be aligned with majority-progressive left-of-center policy sentiments and values. It’s still very much their party.

Whatever happens during the next biennial electoral extravaganza, “the crucial fact” remains, in Wolin’s words nine years ago, “that for the poor, minorities, the working class and anti-corporatists there is no opposition party working on their behalf” in the United States – the self-declared homeland and headquarters of global democracy.

 

Paul Street is an independent radical-democratic policy researcher, journalist, historian, author and speaker based in Iowa City, Iowa, and Chicago, Illinois. He is the author of seven books. His latest is They Rule: The 1% v. Democracy (Paradigm, 2014).

Disarming the Weapons of Mass Distraction

By Madeleine Bunting

Source: Rise Up Times

“Are you paying attention?” The phrase still resonates with a particular sharpness in my mind. It takes me straight back to my boarding school, aged thirteen, when my eyes would drift out the window to the woods beyond the classroom. The voice was that of the math teacher, the very dedicated but dull Miss Ploughman, whose furrowed grimace I can still picture.

We’re taught early that attention is a currency—we “pay” attention—and much of the discipline of the classroom is aimed at marshaling the attention of children, with very mixed results. We all have a history here, of how we did or did not learn to pay attention and all the praise or blame that came with that. It used to be that such patterns of childhood experience faded into irrelevance. As we reached adulthood, how we paid attention, and to what, was a personal matter and akin to breathing—as if it were automatic.

Today, though, as we grapple with a pervasive new digital culture, attention has become an issue of pressing social concern. Technology provides us with new tools to grab people’s attention. These innovations are dismantling traditional boundaries of private and public, home and office, work and leisure. Emails and tweets can reach us almost anywhere, anytime. There are no cracks left in which the mind can idle, rest, and recuperate. A taxi ad offers free wifi so that you can remain “productive” on a cab journey.

Even those spare moments of time in our day—waiting for a bus, standing in a queue at the supermarket—can now be “harvested,” says the writer Tim Wu in his book The Attention Merchants. In this quest to pursue “those slivers of our unharvested awareness,” digital technology has provided consumer capitalism with its most powerful tools yet. And our attention fuels it. As Matthew Crawford notes in The World Beyond Your Head, “when some people treat the minds of other people as a resource, this is not ‘creating wealth,’ it is transferring it.”

There’s a whiff of panic around the subject: the story that our attention spans are now shorter than a goldfish’s attracted millions of readers on the web; it’s still frequently cited, despite its questionable veracity. Rates of diagnosis of attention deficit hyperactivity disorder in children have soared, creating an $11 billion global market for pharmaceutical companies. Every glance of our eyes is now tracked for commercial gain as ever more ingenious ways are devised to capture our attention, if only momentarily. Our eyeballs are now described as capitalism’s most valuable real estate. Both our attention and its deficits are turned into lucrative markets.

There is also a domestic economy of attention; within every family, some get it and some give it. We’re all born needing the attention of others—our parents’, especially—and from the outset, our social skills are honed to attract the attention we need for our care. Attention is woven into all forms of human encounter from the most brief and transitory to the most intimate. It also becomes deeply political: who pays attention to whom?

Social psychologists have researched how the powerful tend to tune out the less powerful. One study with college students showed that even in five minutes of friendly chat, wealthier students showed fewer signs of engagement when in conversation with their less wealthy counterparts: less eye contact, fewer nods, and more checking the time, doodling, and fidgeting. Discrimination by race and gender, too, plays out through attention. Anyone who’s spent any time in an organization will be aware of how attention is at the heart of office politics. A suggestion is ignored in a meeting, but is then seized upon as a brilliant solution when repeated by another person.

What is political is also ethical. Matthew Crawford argues that this is the essential characteristic of urban living: a basic recognition of others.

And then there’s an even more fundamental dimension to the politics of attention. At a primary level, all interactions in public space require a very minimal form of attention, an awareness of the presence and movement of others. Without it, we would bump into each other, frequently.

I had a vivid demonstration of this point on a recent commute: I live in East London and regularly use the narrow canal paths for cycling. It was the canal rush hour—lots of walkers with dogs, families with children, joggers as well as cyclists heading home. We were all sharing the towpath with the usual mixture of give and take, slowing to allow passing, swerving around and between each other. Only this time, a woman was walking down the center of the path with her eyes glued to her phone, impervious to all around her. This went well beyond a moment of distraction. Everyone had to duck and weave to avoid her. She’d abandoned the unspoken contract that avoiding collision is a mutual obligation.

This scene is now a daily occurrence for many of us, in shopping centers, station concourses, or on busy streets. Attention is the essential lubricant of urban life, and without it, we’re denying our co-existence in that moment and place. The novelist and philosopher Iris Murdoch writes that the most basic requirement for being good is that a person “must know certain things about his surroundings, most obviously the existence of other people and their claims.”

Attention is what draws us out of ourselves to experience and engage in the world. The word is often accompanied by a verb—attention needs to be grabbed, captured, mobilized, attracted, or galvanized. Reflected in such language is an acknowledgement of how attention is the essential precursor to action. The founding father of psychology William James provided what is still one of the best working definitions:

It is the taking possession by the mind, in clear and vivid form, of one out of what seem several simultaneously possible objects or trains of thought. Focalization, concentration, of consciousness are of its essence. It implies withdrawal from some things in order to deal effectively with others.

Attention is a limited resource and has to be allocated: to pay attention to one thing requires us to withdraw it from others. There are two well-known dimensions to attention, explains Willem Kuyken, a professor of psychology at Oxford. The first is “alerting”— an automatic form of attention, hardwired into our brains, that warns us of threats to our survival. Think of when you’re driving a car in a busy city: you’re aware of the movement of other cars, pedestrians, cyclists, and road signs, while advertising tries to grab any spare morsel of your attention. Notice how quickly you can swerve or brake when you spot a car suddenly emerging from a side street. There’s no time for a complicated cognitive process of decision making. This attention is beyond voluntary control.

The second form of attention is known as “executive”—the process by which our brain selects what to foreground and focus on, so that there can be other information in the background—such as music when you’re cooking—but one can still accomplish a complex task. Crucially, our capacity for executive attention is limited. Contrary to what some people claim, none of us can multitask complex activities effectively. The next time you write an email while talking on the phone, notice how many typing mistakes you make or how much you remember from the call. Executive attention can be trained, and needs to be for any complex activity. This was the point James made when he wrote: “there is no such thing as voluntary attention sustained for more than a few seconds at a time… what is called sustained voluntary attention is a repetition of successive efforts which bring back the topic to the mind.”

Attention is a complex interaction between memory and perception, in which we continually select what to notice, thus finding the material which correlates in some way with past experience. In this way, patterns develop in the mind. We are always making meaning from the overwhelming raw data. As James put it, “my experience is what I agree to attend to. Only those items which I notice shape my mind—without selective interest, experience is an utter chaos.”

And we are constantly engaged in organizing that chaos, as we interpret our experience. This is clear in the famous Gorilla Experiment in which viewers were told to watch a video of two teams of students passing a ball between them. They had to count the number of passes made by the team in white shirts and ignore those of the team in black shirts. The experiment is deceptively complex because it involves three forms of attention: first, scanning the whole group; second, ignoring the black T-shirt team to keep focus on the white T-shirt team (a form of inhibiting attention); and third, remembering to count. In the middle of the experiment, someone in a gorilla suit ambles through the group. Afterward, when asked, half the viewers hadn’t spotted the gorilla and couldn’t even believe it had been there. We can be blind not only to the obvious, but to our blindness.

There is another point in this experiment which is less often emphasized. Ignoring something—such as the black T-shirt team in this experiment—requires a form of attention. It costs us attention to ignore something. Many of us live and work in environments that require us to ignore a huge amount of information—that flashing advert, a bouncing icon or pop-up.

In another famous psychology experiment, Walter Mischel’s Marshmallow Test, four-year-olds had a choice: eat one marshmallow immediately or get two in fifteen minutes. Each child was filmed alone in a room, in front of a plate with a marshmallow. They squirmed and fidgeted, poked the marshmallow and stared at the ceiling. A third of the children couldn’t resist the marshmallow and gobbled it up, a third nibbled cautiously, but the last third figured out how to distract themselves. They looked under the table, sang… did anything but look at the sweet. It’s a demonstration of the capacity to reallocate attention. In a follow-up study some years later, those who’d been able to wait for the second marshmallow had better life outcomes, such as academic achievement and health. One New Zealand study of 1,000 children found that this form of self-regulation was a more reliable predictor of future success and wellbeing than even a good IQ or comfortable economic status.

What, then, are the implications of how digital technologies are transforming our patterns of attention? In the current political anxiety about social mobility and inequality, more weight needs to be put on this most crucial and basic skill: sustaining attention.

*

I learned to concentrate as a child. Being a bookworm helped. I’d be completely absorbed in my reading as the noise of my busy family swirled around me. It was good training for working in newsrooms; when I started as a journalist, they were very noisy places with the clatter of keyboards, telephones ringing and fascinating conversations on every side. What has proved much harder to block out is email and text messages.

The digital tech companies know a lot about this widespread habit; many of them have built a business model around it. They’ve drawn on the work of the psychologist B.F. Skinner who identified back in the Thirties how, in animal behavior, an action can be encouraged with a positive consequence and discouraged by a negative one. In one experiment, he gave a pigeon a food pellet whenever it pecked at a button and the result, as predicted, was that the pigeon kept pecking. Subsequent research established that the most effective way to keep the pigeon pecking was “variable-ratio reinforcement.” Give the pigeon a food pellet sometimes, and you have it well and truly hooked.

We’re just like the pigeon pecking at the button when we check our email or phone. It’s a humiliating thought. Variable reinforcement ensures that the customer will keep coming back. It’s the principle behind one of the most lucrative US industries: slot machines, which generate more profit than baseball, films, and theme parks combined. Gambling was once tightly restricted for its addictive potential, but most of us now have the attentional equivalent of a slot machine in our pocket, beside our plate at mealtimes, and by our pillow at night. Even during a meal out, a play at the theater, a film, or a tennis match. Almost nothing is now experienced uninterrupted.

Anxiety about the exponential rise of our gadget addiction and how it is fragmenting our attention is sometimes dismissed as a Luddite reaction to a technological revolution. But that misses the point. The problem is not the technology per se, but the commercial imperatives that drive the new technologies and, unrestrained, colonize our attention by fundamentally changing our experience of time and space, saturating both in information.

In much public space, wherever your eye lands—from the back of the toilet door, to the handrail on the escalator, or the hotel key card—an ad is trying to grab your attention, and does so by triggering the oldest instincts of the human mind: fear, sex, and food. Public places become dominated by people trying to sell you something. In his tirade against this commercialization, Crawford cites advertisements on the backs of school report cards and on debit machines where you swipe your card. Before you enter your PIN, that gap of a few seconds is now used to show adverts. He describes silence and ad-free experience as “luxury goods” that only the wealthy can afford. Crawford has invented the concept of the “attentional commons,” free public spaces that allow us to choose where to place our attention. He draws the analogy with environmental goods that belong to all of us, such as clean air or clean water.

Some legal theorists are beginning to conceive of our own attention as a human right. One former Google employee warned that “there are a thousand people on the other side of the screen whose job it is to break down the self-regulation you have.” They use the insights into human behavior derived from social psychology—the need for approval, the need to reciprocate others’ gestures, the fear of missing out. Your attention ceases to be your own, pulled and pushed by algorithms. Attention is referred to as the real currency of the future.

*

In 2013, I embarked on a risky experiment in attention: I left my job. In the previous two years, it had crept up on me. I could no longer read beyond a few paragraphs. My eyes would glaze over and, even more disastrously for someone who had spent their career writing, I seemed unable to string together my thoughts, let alone write anything longer than a few sentences. When I try to explain the impact, I can only offer a metaphor: it felt like my imagination and use of language were vacuum-packed, like a slab of meat coated in plastic. I had lost the ability to turn ideas around, see them from different perspectives. I could no longer draw connections between disparate ideas.

At the time, I was working in media strategy. It was a culture of back-to-back meetings from 8:30 AM to 6 PM, and there were plenty of advantages to be gained from continuing late into the evening if you had the stamina. Commitment was measured by emails with a pertinent weblink. Meetings were sometimes as brief as thirty minutes and frequently ran through lunch. Meanwhile, everyone was sneaking time to battle with the constant emails, eyes flickering to their phone screens in every conversation. The result was a kind of crazy fog, a mishmash of inconclusive discussions.

At first, it was exhilarating, like being on those crazy rides in a theme park. By the end, the effect was disastrous. I was almost continuously ill, battling migraines and unidentifiable viruses. When I finally made the drastic decision to leave, my income collapsed to a fraction of its previous level and my family’s lifestyle had to change accordingly. I had no idea what I was going to do; I had lost all faith in my ability to write. I told friends I would have to return the advance I’d received to write a book. I had to try to get back to the skills of reflection and focus that had once been ingrained in me.

The first step was to teach myself to read again. I sometimes went to a café, leaving my phone and computer behind. I had to slow down the racing incoherence of my mind so that it could settle on the text and its gradual development of an argument or narrative thread. The turning point in my recovery was a five-week research trip to the Scottish Outer Hebrides. On the journey north of Glasgow, my mobile phone lost its Internet connection. I had cut myself loose with only the occasional text or call to family back home. Somewhere on the long Atlantic beaches of these wild and dramatic islands, I rediscovered my ability to write.

I attribute that in part to a stunning exhibition I came across in the small harbor town of Lochboisdale, on the island of South Uist. Vija Celmins is an acclaimed Latvian-American artist whose work is famous for its astonishing patience. She can take a year or more to make a woodcut that portrays in minute detail the surface of the sea. A postcard of her work now sits above my desk, a reminder of the power of slow thinking.

Just as we’ve had a slow eating movement, we need a slow thinking campaign. Its manifesto could be the German-language poet Rainer Maria Rilke’s beautiful “Letters to a Young Poet”:

To let every impression and the germ of every feeling come to completion inside, in the dark, in the unsayable, the unconscious, in what is unattainable to one’s own intellect, and to wait with deep humility and patience for the hour when a new clarity is delivered.

Many great thinkers attest that they have their best insights in moments of relaxation, the proverbial brainwave in the bath. We actually need what we most fear: boredom.

When I left my job (and I was lucky that I could), friends and colleagues were bewildered. Why give up a good job? But I felt that here was an experiment worth trying. Crawford frames it well as “intellectual biodiversity.” At a time of crisis, we need people thinking in different ways. If we all jump to the tune of Facebook or Instagram and allow ourselves to be primed by Twitter, the danger is that we lose the “trained powers of concentration” that allow us, in Crawford’s words, “to recognize that independence of thought and feeling is a fragile thing, and requires certain conditions.”

I also took to heart the insights of the historian Timothy Snyder, who concluded from his studies of twentieth-century European totalitarianism that the way to fend off tyranny is to read books, make an effort to separate yourself from the Internet, and “be kind to our language… Think up your own way of speaking.” Dropping out and going offline enabled me to get back to reading, voraciously, and to writing; beyond that, it’s too early to announce the results of my experiment with attention. As Rilke said, “These things cannot be measured by time, a year has no meaning, and ten years are nothing.”

*

A recent column in The New Yorker cheekily suggests that all the fuss about the impact of digital technologies on our attention is nothing more than writers’ worrying about their own working habits. Is all this anxiety about our fragmenting minds a moral panic akin to those that swept Victorian Britain about sexual behavior? Patterns of attention are changing, but perhaps it doesn’t much matter?

My teenage children read much less than I did. One son used to play chess online with a friend, text on his phone, and do his homework all at the same time. I was horrified, but he got a place at Oxford. At his interview, he met a third-year history undergraduate who told him he hadn’t yet read any books in his time at university. But my kids are considerably more knowledgeable about a vast range of subjects than I was at their age. There’s a small voice suggesting that the forms of attention I was brought up with could be a thing of the past; the sustained concentration required to read a whole book will become an obscure niche hobby.

And yet, I’m haunted by a reflection: the magnificent illuminations of the eighth-century Book of Kells have intricate patterning that no one has ever been able to copy, such is the fineness of the tight spirals. Lines are a millimeter apart. They indicate a steadiness of hand and mind—a capability most of us have long since lost. Could we be trading in capacities for focus in exchange for a breadth of reference? Some might argue that’s not a bad trade. But we would lose depth: the artist Paul Klee wrote that he would spend a day in silent contemplation of something before he painted it. Paul Cézanne was similarly known for his trance-like attention on his subject. Madame Cézanne recollected how her husband would gaze at the landscape and tell her, “The landscape thinks itself in me, and I am its consciousness.” The philosopher Maurice Merleau-Ponty describes a contemplative attention in which one steps outside of oneself and immerses oneself in the object of attention.

It’s not just artists who require such depth of attention. Nearly two decades ago, a doctor teaching medical students at Yale was frustrated at their inability to distinguish between types of skin lesions. Their gaze seemed restless and careless. He took his students to an art gallery and told them to look at a picture for fifteen minutes. The program is now used in dozens of US medical schools.

Some argue that losing the capacity for deep attention presages catastrophe. It is the building block of “intimacy, wisdom, and cultural progress,” argues Maggie Jackson in her book Distracted, in which she warns that “as our attentional skills are squandered, we are plunging into a culture of mistrust, skimming, and a dehumanizing merging between man and machine.” Significantly, her research began with a curiosity about why so many Americans were deeply dissatisfied with life. She argues that losing the capacity for deep attention makes it harder to make sense of experience and to find meaning—from which comes wonder and fulfillment. She fears a new “dark age” in which we forget what makes us truly happy.

Strikingly, the epicenter of this wave of anxiety over our attention is the US. All the authors I’ve cited are American. It’s been argued that this debate represents an existential crisis for America because it exposes the flawed nature of its greatest ideal, individual freedom. The commonly accepted notion is that to be free is to make choices, and no one can challenge that expression of autonomy. But if our choices are actually engineered by thousands of very clever, well-paid digital developers, are we free? The former Google employee Tristan Harris confessed in an article in 2016 that technology “gives people the illusion of free choice while architecting the menu so that [tech giants] win, no matter what you choose.”

Despite my children’s multitasking, I maintain that vital human capacities—depth of insight, emotional connection, and creativity—are at risk. I’m intrigued as to what the resistance might look like. There are stirrings of protest with the recent establishment of initiatives such as the Time Well Spent movement, founded by tech industry insiders who have become alarmed at the efforts invested in keeping people hooked. But collective action is elusive; the emphasis is repeatedly on the individual to develop the necessary self-regulation, but if that is precisely what is being eroded, we could be caught in a self-reinforcing loop.

One of the most interesting responses to our distraction epidemic is mindfulness. Its popularity is evidence that people are trying to find a way to protect and nourish their minds. Jon Kabat-Zinn, who pioneered the development of secular mindfulness, draws an analogy with jogging: just as keeping your body fit is now well understood, people will come to realize the importance of looking after their minds.

I’ve meditated regularly for twenty years, but curious as to how this is becoming mainstream, I went to an event in the heart of high-tech Shoreditch in London. In a hipster workspace with funky architecture, excellent coffee, and an impressive range of beards, a soft-spoken retired Oxford professor of psychology, Mark Williams, was talking about how multitasking has a switching cost in focus and concentration. Our unique human ability to remember the past and to think ahead brings a cost; we lose the present. To counter this, he advocated a daily practice of mindfulness: bringing attention back to the body—the physical sensations of the breath, the hands, the feet. Williams explained how fear and anxiety inhibit creativity. In time, the practice of mindfulness enables you to acknowledge fear calmly and even to investigate it with curiosity. You learn to place your attention in the moment, noticing details such as the sunlight or the taste of the coffee.

On a recent retreat, I was beside a river early one morning and a rower passed. I watched the boat slip by and enjoyed the beauty in a radically new way. The moment was sufficient; there was nothing I wanted to add or take away—no thought of how I wanted to do this every day, or how I wanted to learn to row, or how I wished I was in the boat. Nothing but the pleasure of witnessing it. The busy-ness of the mind had stilled. Mindfulness can be a remarkable bid to reclaim our attention and to claim real freedom, the freedom from our habitual reactivity that makes us easy prey for manipulation.

But I worry that the integrity of mindfulness is fragile, vulnerable both to commercialization by employers who see it as a form of mental performance enhancement and to consumer commodification, rather than contributing to the formation of ethical character. Mindfulness as a meditation practice originates in Buddhism, and without that tradition’s ethics, there is a high risk of it being hijacked and misrepresented.

Back in the Sixties, the countercultural psychologist Timothy Leary rebelled against the conformity of the new mass media age and called for, in Crawford’s words, an “attentional revolution.” Leary urged people to take control of the media they consumed as a crucial act of self-determination; pay attention to where you place your attention, he declared. The social critic Herbert Marcuse believed Leary was fighting the struggle for the ultimate form of freedom, which Marcuse defined as the ability “to live without anxiety.” These were radical prophets whose words have an uncanny resonance today. Distraction has become a commercial and political strategy, and it amounts to a form of emotional violence that cripples people, leaving them unable to gather their thoughts and overwhelmed by a sense of inadequacy. It’s a powerful form of oppression dressed up in the language of individual choice.

The stakes could hardly be higher, as William James knew a century ago: “The faculty of voluntarily bringing back a wandering attention, over and over again, is the very root of judgment, character, and will.” And what are we humans without these three?

Saturday Matinee: Hi, Mom!

“Hi, Mom!” (1970) is a dark comedy directed by Brian De Palma and starring Robert De Niro. It’s a sequel to Greetings (1968); De Niro reprises his role of John Rubin, now a voyeuristic filmmaker. John later falls in with a group of militant black activists, leading to the film’s most memorable scene, in which he participates in an experimental theater performance. The show is called “Be Black, Baby” and requires its white audience to don blackface and be subjected to an escalating series of abuses by black actors in whiteface, until John arrives as a seemingly real NYPD officer and proceeds to put the white audience members under arrest. De Palma and De Niro’s next collaboration would come 17 years later, with The Untouchables.

Big Pharma, Big Oil and Big Banks Meet the Definition of Terrorists

Common threads persist throughout definitions of terrorism: violence, injury or death, intimidation, intentionality, multiple targets and political motivation. Big pharma, big oil and big banks meet them all.

By Paul Buchheit

Source: Mint Press News

Various definitions of terrorism have been proposed in recent years, by organizations such as the FBI, the State Department, Homeland Security, and the ACLU. Some common threads persist throughout the definitions: violence, injury or death, intimidation, intentionality, multiple targets, political motivation. All the criteria are met by pharmaceutical, oil, and financial companies. They have all injured and intimidated the American public, and caused people to die, with intentionality shown by their refusal to acknowledge evidence of their misdeeds, and political motives clear in their lobbying efforts, where among all U.S. industries Big Pharma is #1, Big Oil is #5, and Securities/Investment is #8.

The terror inflicted on Americans is real, and is documented by the facts to follow.

Big Pharma: Qualifying for Trump’s Call for Capital Punishment for Drug Dealers

In a Time Magazine article a young man named Chad Colwell says “I got prescribed painkillers, Percocet and Oxycontin, and then it just kind of took off from there.” Time adds: “Prescriptions gave way to cheaper, stronger alternatives. Why scrounge for a $50 pill of Percocet when a tab of heroin can be had for $5?” About 75% of heroin addicts used prescription opioids before turning to heroin.

Any questions about Big Pharma’s role in violence and death in America have been answered by the Centers for Disease Control and the American Journal of Public Health. Any doubts about Big Pharma’s intentions to intimidate the public have been put to rest by the many occasions of outrageous price gouging. And any uncertainty about political pressure is removed by its #1 lobbying ranking.

As for malicious intentions, Bernie Sanders noted, “We know that pharmaceutical companies lied about the addictive impacts of opioids they manufactured.” Purdue Pharma knew all about the devastating addictive effects of its painkiller Oxycontin, and even pleaded guilty in 2007 to misleading regulators, doctors, and patients about the drug’s risk. Now Purdue and other drug companies are facing a lawsuit for “deceptively marketing opioids” and ignoring the misuse of their drugs.

No jail for the opioid pushers, though, just slap-on-the-wrist fines that can be made up with a few price increases. But partly as a result of Pharma-related violence, Americans are suffering “deaths of despair” — deaths by drugs, alcohol and suicide. Suicide is at its highest level in 30 years.

Big Oil: Decades of Terror

Any doubts about the ecological terror caused by fossil fuel companies have been dispelled by the World Health Organization, the American Lung Association, the United Nations, the Pentagon, cooperating governments, and independent research groups, all of whom agree that human-induced climate change is killing people.

The oil industry’s intentionality and political motives have been demonstrated by their refusal to admit the known truth, starting with Exxon, which has covered up its own climate research for 40 years, and continuing through multi-million dollar lobbying efforts by Amoco, the US Chamber of Commerce, General Motors, Koch Industries, and other corporations in their effort to dismantle the Kyoto Protocol against global warming.

Big Banks: Leaving Suicidal Former Homeowners Behind

Any doubts about the violence stemming from the 2008 mortgage crisis have been resolved by studies of recession-caused suicides. Both the British Journal of Psychiatry and the National Institutes of Health found definite links between the recession and the rate of suicides.

As with Big Pharma and Big Oil, intentionality and political motives are evident in the banking industry’s lobbying efforts on behalf of deregulation — leading to the same conditions that threatened American homeowners in 2008. There has also been a surge in the number of non-bank lenders, who are less subject to regulation.

Making it all worse are private developers, who make most of their profits by building fancy homes for the rich and by avoiding affordable housing. Since the recession, Blackstone and other private equity firms — with government subsidies — have been buying up foreclosed houses, holding them till prices appreciate, and in the interim renting them back at exorbitant prices.

This is leaving more and more Americans out in the cold — literally. A head of household in the U.S. needs to make $21.21 an hour to afford a two-bedroom apartment at HUD standards, much more than the $16.38 they actually earn. Since the recession, the situation has continually worsened. From 2010 to 2016 the number of housing units priced for very low-income families plummeted 60 percent.

Here’s the big picture: Since the 1980s there’s been a massive redistribution of wealth from middle-class housing to the investment portfolios of people with an average net worth of $75 million. It’s not hard to understand the “deaths of despair” caused by the terror inflicted on people losing their homes.

 

Russia’s Seattle Consulate Broken Into: The US Openly Flouts Its International Obligations

By Alex Gorka

Source: Strategic Culture Foundation

They did it again. On April 25, US inspectors broke into the Russian consulate in Seattle, which had been shuttered and vacated at the order of the American government in response to the Skripal case. The “inspection” was actually a break-in, since the locks had to be forced. The Russian staff had closed the mansion on April 24 but kept the keys, as the house is still the property of the Russian government. Officially, the Russian Federation (RF) still owns the mansion and its flag still flies from the roof, but the US owns the land, and consular activities will no longer be authorized on that site.

The forced entry into the consulate was a flagrant violation of international law. True, the US government has the right to declare that the mission has been stripped of its diplomatic immunity. But it takes two to tango, and Russia never agreed to lift that immunity. That declaration has no validity without Russia’s consent.

The Vienna Convention on Diplomatic Relations of 1961 protects embassy and consulate property abroad by bestowing upon it the status of inviolability (Article 22). No unauthorized entry is allowed. Moreover, the host country is responsible for protecting all foreign missions from intrusions, damage, and similar events. Diplomatic sites cannot be searched. No document or property can be seized.

The 1963 Vienna Convention on Consular Relations states that consulates, along with their property, are always to be protected by their host, even during an armed conflict. No entrance without permission is allowed (Article 31).

According to the US-USSR Consular Convention of 1968, the diplomatic properties on each other’s soil are sacrosanct. The consulates enjoy diplomatic immunity. Like it or not, the US has just violated that document by entering the Seattle consulate.

As one can see, all the relevant international conventions state, by and large, the same thing – there is no entrance without permission. This is a hard-and-fast rule, but now all of these conventions have just been breached in broad daylight!

The question arises — what’s the use of signing agreements with someone who flouts them? Today they enter foreign compounds, tomorrow they unilaterally pull out of the Iran deal, and then what? The US can walk away from any major arms-control agreement, just like it abandoned the 1972 ABM Treaty in 2002. Washington signs agreements in order to force others to comply with them, while the US enjoys the freedom to interpret them at will. Nothing is binding upon that “shining city on a hill.”

The relevant domestic law in the US, the 1982 Foreign Missions Act, states that the secretary of state may demand that any foreign mission be stripped of its property if such a move is needed to protect US interests. This can be done provided that one year has passed from the date on which that foreign mission ceased its diplomatic or consular functions. In this case, one year has not passed. What’s more, no clear explanation is offered as to what exactly is meant by “US interests.” And in fact, this act is contradicted by international law. Why should a foreign mission comply with it, if all the conventions listed above are very explicit about property rights and the US is a party to all of them? Anyway, the US law is not relevant in this case, unlike the binding accords America has signed.

It is true that the two nations are engaged in an ongoing “diplomatic war.” That’s a process that’s easy to start and extremely difficult to end. It was not Moscow that started this folly. But even wars have their rules. The US actions are unprecedented and are doing serious damage to the country’s international image. The Seattle consulate’s closure has greatly complicated the lives of many people who have nothing to do with politics.

What about gains? There have been hardly any, especially taking into consideration that the US mission in St. Petersburg, which is going to be closed in response, is much more important for Americans than the consulate office in Seattle was for Russians. The expulsions and closures may go on until the ambassadors are the only ones left, but no one will win. “Tit-for-tat” expulsions are a game with no winners, only losers. They are meaningless and doomed to ineffectuality.

The ongoing Russian-US “diplomatic war” cannot continue forever. The day will inevitably come when Washington will have to reach some new agreements with Moscow about consulate offices. It’s highly likely that Russia will demand additional guarantees of the safety of its property on American soil. Other nations may follow suit.

Gross violations of international law inflict great damage. The US will not be trusted. It will be viewed as a state that can reject its commitments at any time it chooses. From now on, all nations will know that their embassies and consulates in the US are not protected by the international agreements the American government flouts so easily.

 

What Lies Beyond Capitalism and Socialism?

By Charles Hugh Smith

Source: Of Two Minds

The status quo, in all its various forms, is dominated by incentives that strengthen the centralization of wealth and power.

As longtime readers know, my work aims to 1) explain why the status quo — the socio-economic-political system we inhabit — is unsustainable, divisive, and doomed to collapse under its own weight and 2) sketch out an alternative Mode of Production/way of living that is sustainable, consumes far fewer resources while providing for the needs of the human populace — not just for our material daily bread but for positive social roles, purpose, hope, meaning and opportunity, needs that are by and large ignored or marginalized in the current system.

One cognitive/emotional roadblock I encounter is the nearly universal assumption that there are only two systems: the State (government) or the Market (free trade/ free enterprise). This divide plays out politically as the Right (capitalism, favoring markets) and the Left (socialism, favoring the state). Everything from Communism to Libertarianism can be placed on this spectrum.

But what if the State and the Market are the sources of our unsustainability? What if they are intrinsically incapable of fixing what’s broken?

The roadblock here is that adherents of one camp or the other are emotionally attached to their ideological choice, to the point that these attachments take on a quasi-religious character.

Believers in the market as the solution to virtually any problem refuse to accept any limits on the market’s efficacy, and believers in greater state power/control refuse to accept any limits on the state’s efficacy.

I often feel like I’ve been transported back to the Thirty Years’ War between Catholics and Protestants in the 1600s.

I’ve written numerous books that (in part) cover the inherent limits of markets and the state, so I’ll keep this brief. Markets are based on two premises: 1) profits are the key motivator of human activity and 2) whatever is scarce can be replaced by something that is abundant (for example, when we’ve wiped out all the wild bluefin tuna, we can substitute farmed catfish).

But what about work that creates value but isn’t profitable? This simply doesn’t compute in the market mentality. Neither does the fact that wiping out the wild fisheries disrupts an ecosystem that is essentially impossible to value in terms that markets understand: in a market, the supply and the demand in this moment set the price and thus the value of everything.

But ecosystems simply cannot be valued by the price set in the moment by current supply and demand.

As for the state, its ontological imperative is to concentrate power, and since wealth is power, this means concentrating political and financial power. Once bureaucracies have concentrated power, insiders focus on securing budgets and benefits, and limiting transparency and accountability, as these endanger the insiders’ power, security and perquisites.

Both of these systems share a single quasi-religious ideology: a belief that endless economic growth is an intrinsic good, for it is the ultimate foundation of all human prosperity. In other words, we can only prosper and become more secure if we’re consuming more of everything: resources, credit, energy, and so on.

The second shared ideological faith is that centralizing wealth and power is not just inevitable but good. In other words, Left and Right share a single quasi-religious belief in centralization; the only difference is over who should hold the concentrated wealth/power, private owners or the state.

This ideology assumes a winner-take-most structure, with the winnings concentrated in the hands of the few at the top. Thus rising inequality and divisiveness are assumed to be the natural state of any economy.

This ideology underpins the entire status quo spectrum. The “growth at any cost is good” part of the single ideology underpinning the status quo is captured by the 1960 Soviet-era film Letter Never Sent; in its haunting, surreal final scene, a character envisions a grand wilderness untouched by human hands transformed into an industrial wasteland of belching chimneys and sprawling factories. This was not a nightmare–this was the Soviet dream, and indeed, the dream of the “growth at any cost is good” West.

Simply put, the status quo of markets and states is incapable of DeGrowth, i.e. consuming less of everything, including credit, “money”, profits, taxes—everything that fuels both the state and the market. As I have taken pains to explain, it doesn’t matter if a factory is owned by private owners or the state: the mandate of capital is to grow. If capital doesn’t grow, the resulting losses will sink the enterprise—including the state itself.

What lies beyond “growth at any cost” capitalism and socialism? My answer is the self-funded community economy, a system that is self-funded (i.e. no need for a central bank or Treasury) with a digital currency that is created and distributed for the sole purpose of funding work that addresses scarcities in local communities.

I outline this system in my book A Radically Beneficial World: Automation, Technology and Creating Jobs for All.

Rather than concentrate power in the hands of state insiders, this system distributes power to communities and their participants. Rather than concentrate the power to create currency for the benefit of banks and the state, this system distributes the power to create currency for the sole benefit of those working on behalf of the community, on projects prioritized by the community.

This community economy recognizes that some work is valuable but not profitable. The profit-driven market will never do this work, and the central state is (to use Peter Drucker’s term) the wrong unit size to ascertain each community’s needs and scarcities.
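To make this concrete, here is a minimal toy sketch of how such a community currency might work: new money is issued only as payment for hours worked on projects the community has prioritized, so no central bank or Treasury is required to create it. The names (CommunityLedger, Project, log_work) and the flat pay rate are illustrative assumptions, not details taken from the book.

```python
# Hypothetical sketch of a self-funded community currency.
# Names and the flat pay rate are illustrative, not from the book.
from dataclasses import dataclass, field

@dataclass
class Project:
    name: str                 # work the community has prioritized
    approved: bool = True     # only community-approved projects can mint currency

@dataclass
class CommunityLedger:
    rate_per_hour: float = 10.0               # currency units minted per hour of work
    balances: dict = field(default_factory=dict)
    total_issued: float = 0.0

    def log_work(self, worker: str, project: Project, hours: float) -> float:
        """Mint new currency only as payment for work on an approved project."""
        if not project.approved or hours <= 0:
            return 0.0
        pay = hours * self.rate_per_hour
        self.balances[worker] = self.balances.get(worker, 0.0) + pay
        self.total_issued += pay              # issuance is tied one-to-one to work performed
        return pay

# Example: currency enters circulation only through community-prioritized work,
# including work that is valuable but not profitable.
ledger = CommunityLedger()
ledger.log_work("alice", Project("neighborhood flood repair"), hours=8)
ledger.log_work("bob", Project("elder care visits"), hours=5)
print(ledger.balances, ledger.total_issued)
```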

Clearly, we need a socio-economic-political system that has the structure to not just grasp the necessity of DeGrowth and positive social roles (work benefiting the greater community) but to embrace these goals as its raison d’être (reason to exist).

Human activity is largely guided by incentives, both chemical incentives in our brains and incentives presented by the society/economy we inhabit. In the current system, concentrating power and wealth in the hands of the few at the expense of the many and wasting resources / destroying ecosystems are incentivized if the activity is profitable to some enterprise or deemed necessary by the state.

In the current system, the state incentivizes protecting its wealth and power and the security/benefits of its insiders, and markets incentivize maximizing profits by any means available.

As I have explained many times in the blog and my books, we inhabit a state-cartel economy: the most profitable form of enterprise is the quasi-monopoly or cartel that limits supply and competition in order to extract the maximum profit from its customers.

Monopolies (or quasi-monopolies such as Google, which holds a majority share of global search revenues, excluding China) and cartels quickly amass profits, which they then use to buy state protection of their cartel via lobbying, campaign contributions, etc. The elites controlling the state benefit from this arrangement, and so the system inevitably becomes a state-cartel system dominated by the state, by private-sector cartels, and by incentives that serve the wealth and power of both sets of institutions.

Once we understand the inevitability of this marriage of state and cartel, we understand socialism and capitalism–the State and Markets–are the yin and yang of one system. Reformers may recognize some of the inherent limits of the state and the market, but they believe these problems can be solved by tweaking policies–in systems-speak, modifying the parameters of the existing subsystems of lawmaking, the judiciary, regulatory agencies, and so on.

But as Donella Meadows explained in her classic paper, Leverage Points: Places to Intervene in a System, tweaking the parameters doesn’t actually change the system. For that, we must add a new feedback loop.
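A toy numerical illustration of the distinction (the growth model and numbers are my assumptions, not taken from Meadows’ paper): tweaking a parameter such as the growth rate still yields unbounded growth, while adding a negative feedback loop tied to a resource limit changes the system’s behavior altogether.

```python
# Toy model: parameter tweak vs. adding a feedback loop (illustrative assumptions only).
def run(years=100, growth=0.03, feedback=False, capacity=1000.0):
    """Simulate consumption that grows each year; optionally damp growth as
    consumption approaches a finite resource capacity (a new feedback loop)."""
    consumption = 100.0
    for _ in range(years):
        rate = growth
        if feedback:
            # negative feedback: growth slows as the resource limit is approached
            rate = growth * (1 - consumption / capacity)
        consumption += consumption * rate
    return consumption

# Tweaking the parameter (growth rate) still yields ever-expanding consumption...
print(round(run(growth=0.03)), round(run(growth=0.02)))
# ...while adding the feedback loop changes the system: it levels off near capacity.
print(round(run(growth=0.03, feedback=True)))
```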

The status quo, in all its various forms, is dominated by incentives that strengthen the centralization of wealth and power, increase inequality and divisiveness, and drive the permanent expansion of consumption and credit. That this path leads to implosion / collapse does not compute, because the status quo is constructed on the fundamental assumption that permanent growth/expansion of consumption, credit, wealth and state power is not just possible but necessary.

As many of us have labored to show, the financial system has been pushed to unprecedented extremes to maintain the illusion that rapid growth of consumption and credit can be maintained essentially forever.

We need an alternative system that’s built on sustainable incentives and feedback loops so we have a new blueprint to follow as the current arrangement unravels in the next decade or two.

Security and prosperity are worthy goals, but the means to achieve them, as well as the definition of security / prosperity, must be reworked from the ground up. We need to include positive social roles and meaningful work as essential components of security/prosperity.

My conception of a Third / Community Economy does not replace either the state or the free-enterprise market; rather, it does what neither of the existing structures can do. It adds opportunity, purpose, positive social roles and earned income for those left out of the state/cartel/market economy.