Conflict Theory and Biosphere Annihilation

By Robert J. Burrowes

In a recent article titled ‘Challenges for Resolving Complex Conflicts’, I pointed out that existing conflict theory pays little attention to the extinction-causing conflict continually generated by human over-consumption in the finite planetary biosphere (a conflict that, among other outcomes, is currently resulting in 200 species extinctions daily). I also mentioned that this conflict is sometimes inadequately identified as a conflict caused by capitalism’s drive for unending economic growth in a finite environment.

I would like to explain the psychological origin of this biosphere-annihilating conflict and how this origin has nurtured the incredibly destructive aspects of capitalism (and socialism, for that matter) from the beginning. I would also like to explain what we can do about it.

Before I do, however, let me briefly illustrate why this particular conflict configuration is so important by offering you a taste of the most recent research evidence in relation to the climate catastrophe and biosphere annihilation and why the time to resolve this conflict is rapidly running out (assuming, problematically, that we can avert nuclear war in the meantime).

In an article reporting a recent speech by Professor James G. Anderson of Harvard University, whose research led to the Montreal Protocol in 1987 to mitigate CFC damage to the Ozone Layer, environmental journalist Robert Hunziker summarizes Anderson’s position as follows: ‘the chance of permanent ice remaining in the Arctic after 2022 is zero. Already, 80% is gone. The problem: Without an ice shield to protect frozen methane hydrates in place for millennia, the Arctic turns into a methane nightmare.’ See ‘There Is No Time Left’.

But if you think that sounds drastic, other recent research has drawn attention to the fact that the ‘alarming loss of insects will likely take down humanity before global warming hits maximum velocity…. The worldwide loss of insects is simply staggering with some reports of 75% up to 90%, happening much faster than the paleoclimate record rate of the past five major extinction events’. Without insects ‘burrowing, forming new soil, aerating soil, pollinating food crops…’ and providing food for many bird species, the biosphere simply collapses. See ‘Insect Decimation Upstages Global Warming’.

So, if we are in the process of annihilating Earth’s biosphere, which will precipitate human extinction in the near term, why aren’t we paying much more attention to the origin of this fundamental conflict? And then developing a precisely focused strategy for transcending it?

The answer to these two questions is simply this: the origin of this conflict is particularly unpalatable and, from my careful observation, most people, including conflict theorists, are not eager to focus on it.

So why are human beings over-consuming in the finite planetary biosphere? Or more accurately, why are human beings who have the opportunity to do so (which doesn’t include those impoverished people living in Africa, Asia, Central/South America or anywhere else) over-consuming in the finite planetary biosphere?

They are doing so because they were terrorized into unconsciously equating consumption with a meaningful life by parents and other adults who had already internalized this same ‘learning’.

Let me explain how this happens.

At the moment of birth, a baby is genetically programmed to feel and express their feelings in response to the stimuli, both internal and external, that the baby registers. For example, as soon after birth as a baby feels hungry, they will signal that need, usually by crying or screaming. An attentive parent (or other suitable adult) will usually respond to this need by feeding the baby and the baby will express their satisfaction with this outcome, perhaps with a facial expression, in a way that most aware parents and adults will have no difficulty identifying. Similarly, if the baby is cold, in pain or experiencing any other stimulus, the baby will express their need, probably by making a loud noise. Given that babies cannot immediately use a cultural language, they use the language that was given to them by evolution: particularly audibly expressed noise of various types that an aware adult will quickly learn to interpret.

Of course, from the initial moments after birth and throughout the next few months, a baby will experience an increasing range of stimuli. These include internal stimuli, such as the needs for listening, understanding and love, as well as external stimuli ranging from a wet nappy to a diverse set of parental, social, climate and environmental stimuli. In response, the baby will develop a diverse and expanding range of ways of expressing their reactions, including satisfaction and enjoyment where appropriate: at first a widening range of emotional expression, and eventually spoken language as well.

At some vital point, however, and certainly within the child’s first eighteen months, the child’s parents and the other significant adults in the child’s life will start to routinely and actively interfere with the child’s emotional expression (and thus deny them satisfaction of the unique needs being expressed in each case) in order to compel the child to do as the parent or adult wishes. Of course, this is essential if you want the child to be obedient – a socially compliant slave – rather than to follow their own Self-will.

One of the critically important ways in which this denial of emotional expression occurs seems benign enough: children who are crying, angry or frightened are scared into not expressing their feelings and offered material items – such as food or a toy – to distract them instead. Unfortunately, these distracting items become addictive drugs. Unable to have their emotional needs met, the child learns to seek relief by acquiring the material substitutes offered by the parent. And because the child has been denied the listening, understanding and love needed to develop the capacity to listen to, love and understand themself, this emotional deprivation endlessly expands – and so too does the ‘need’ for material acquisition.

As an aside, this explains why most violence is overtly directed at gaining control of material, rather than emotional, resources. The material resource becomes a dysfunctional and quite inadequate replacement for satisfaction of the emotional need. And, because the material resource cannot ‘work’ to meet an emotional need, the individual is most likely to keep using direct and/or structural violence to gain control of more material resources in an unconscious and utterly futile attempt to meet unidentified emotional needs. In essence, no amount of money and other assets can replace the love denied a child that would allow them to feel and act on their feelings.

Of course, the individual who consumes more than they need, and who uses direct violence or simply takes advantage of structural violence to do so, is never aware of their deeply suppressed emotional needs or of the functional ways of having these needs met. Meeting those needs is, I admit, not easy, given that listening, understanding and love are not readily available from others who have themselves been denied them. Consequently, with their emotional needs now unconsciously ‘hidden’ from them, the individual will endlessly project that the needs they want met are, in fact, material.

This is the reason why members of the Rothschild family, Jeff Bezos, Bill Gates, Warren Buffett, Amancio Ortega, Mark Zuckerberg, Carlos Slim, the Walton family and the Koch brothers, as well as the world’s other billionaires and millionaires, seek material wealth and are willing to do so by taking advantage of structures of exploitation held in place by the US military. They are certainly wealthy in the material sense; unfortunately, they are emotional voids who were never loved and do not know how to love themselves or others now.

Tragically, however, this fate is not exclusive to the world’s wealthy even if they illustrate the point most graphically. As indicated above, virtually all people who live in material cultures have suffered this fate and this is readily illustrated by their ongoing excessive consumption – especially their meat-eating, fossil-fueled travel and acquisition of an endless stream of assets – in a planetary biosphere that has long been signaling ‘Enough!’

As an aside, governments that use military violence to gain control of material resources are simply governments composed of many individuals with this dysfunctionality, which is very common in industrialized countries that promote materialism. Thus, cultures that unconsciously allow and encourage this dysfunctional projection (that an emotional need is met by material acquisition) are the most violent both domestically and internationally. This also explains why industrialized (material) countries use military violence to maintain political and economic structures that allow ongoing exploitation of non-industrialized countries in Africa, Asia and Central/South America.

In summary, the individual who has all of their emotional needs met requires only the intellectual and few material resources necessary to maintain this fulfilling life: anything beyond this is not only useless, it is a burden.

If you want to read (a great deal) more detail of the explanation presented above, you will find it in ‘Why Violence?’ and ‘Fearless Psychology and Fearful Psychology: Principles and Practice’.

So what can we do?

Well, I would start by profoundly changing our conception of sound parenting by emphasizing the importance of nisteling to children – see ‘Nisteling: The Art of Deep Listening’ – and making ‘My Promise to Children’.

For those adults who feel incapable of nisteling or living out such a promise, I encourage you to consider doing the emotional healing necessary by ‘Putting Feelings First’.

If you already feel capable of responding powerfully to this extinction-threatening conflict between human consumption and the Earth’s biosphere, you are welcome to consider joining those who are participating in the fifteen-year strategy to reduce consumption and achieve self-reliance explained in ‘The Flame Tree Project to Save Life on Earth’ and/or to consider using sound nonviolent strategy to conduct your climate or environment campaign. See ‘Nonviolent Campaign Strategy’.

You are also welcome to consider signing the online pledge of ‘The People’s Charter to Create a Nonviolent World’.

As the material simplicity of Mohandas K. Gandhi demonstrated: Consumption is not life.

If you are not able to emulate Gandhi (at least ‘in spirit’) by living modestly, it is your own emotional dysfunctionality – particularly unconscious fear – that is the problem that needs to be addressed.

Biodata: Robert J. Burrowes has a lifetime commitment to understanding and ending human violence. He has done extensive research since 1966 in an effort to understand why human beings are violent and has been a nonviolent activist since 1981. He is the author of ‘Why Violence?’ (http://tinyurl.com/whyviolence). His email address is flametree@riseup.net and his website is at http://robertjburrowes.wordpress.com

Robert J. Burrowes
P.O. Box 68
Daylesford, Victoria 3460
Australia

Email: flametree@riseup.net

Websites:
Nonviolence Charter
Flame Tree Project to Save Life on Earth
‘Why Violence?’
Feelings First
Nonviolent Campaign Strategy
Nonviolent Defense/Liberation Strategy
Anita: Songs of Nonviolence
Robert Burrowes
Global Nonviolence Network

Overcoming the Myth of Authority

By Gary Z McGee

Source: The Mind Unleashed

“For thousands hacking at the branches there is one striking the root.” ~Henry David Thoreau

If, as Albert Einstein said, “unthinking respect for authority is the greatest enemy of truth,” then it stands to reason that we should think critically toward, rather than blindly believe in, authority. No matter who or what that authority might be.

Whether it’s an eccentric physicist with wild hair or an authoritarian president demanding respect without giving it. Whether it’s a flat-earther challenging the very foundations of physics, or an overreaching cop high on false power. Belief in authority is a huge psychological hang-up for our species. It’s an evolutionary impediment of monumental proportions.

Even as we daily self-overcome, so too should we daily overcome the myth of authority. It’s a myth because it’s foremost a story. It’s a story we’ve all fallen for – hook, line, and sinker. It’s a story that most of us were culturally conditioned to believe in. It’s a story that most of us take as a given, but certainly should not. For, ultimately, “it’s just the way things are” is a cowardly copout.

Rather than cowardice, rather than willful ignorance, complacency, and intellectual laziness, we should challenge the myth of authority – across the board. We should be ruthless with our skepticism, like a scientist regarding his own hypothesis, like peer-reviewed interrogators keeping the science of others honest.

Because the art of life, especially an examined life that’s well-lived, is scientific, logical, and reasonable. It strikes at the heart of the orthodoxy, whatever that may be. It undermines the Powers That Be, whoever they may be. And that’s likely to upset more than a few blind worshippers, myopic rule-followers, and willfully ignorant law-abiding citizens. So be it. Upset their precious apple-cart anyway. Especially if that apple-cart is outdated, violent, and based upon parochial reasoning and fear. As Oscar Wilde stated, “Disobedience was man’s original virtue.”

Overcoming authoritarianism:

“As soon as the generals and the politicos can predict the motions of your mind, lose it. Leave it as a sign to mark the false trail, the way you didn’t go. Be like the fox who makes more tracks than necessary, some in the wrong direction. Practice resurrection.” ~Wendell Berry

The problem with belief in authority is that it leads to the idea that we need to give a group of people permission to control us. And, as Lord Acton taught us, power given to an authority tends to corrupt.

The problem with power is not the intent behind it. The problem with power is that it tends to corrupt the one wielding it regardless of their intent. So, since power tends to corrupt whether one’s intentions are good or bad, and since we will all seek power anyway, it behooves us to be mercilessly circumspect both about our own power and about the power of others.

It stands to reason that we should not ignorantly give power to an authority by blindly believing it. We should instead challenge authority first, and trust it second, if at all. The best way to use our power is to use it against authority by ruthlessly questioning it. It’s a social leveling mechanism par excellence. As a wise, young sixth grader once said, “Question authority, including the authority that told you to question authority.”

Otherwise, people will fight and murder and commit genocide and ecocide for the so-called authority that they “believe” in. But they might not have fought so violently and thoughtlessly had they simply taken the power dynamic into deep consideration, nonviolently challenged that perceived dynamic, and then moved on smartly with their lives.

The best way to maintain a healthy skepticism, and not devolve into an ignorant, sycophantic, violent mess, is to take things into consideration and question them rather than blindly believe in them.

Overcoming tribalism:

“To be modern is to let imagination and invention do a lot of the work once done by tradition and ritual.” ~Adam Gopnik

By becoming worldly patriots instead of patriotic nationalists, we turn the tables on xenophobia, apathy, and blind nationalism, and we become more compassionate and empathetic towards other cultures. When we celebrate diversity instead of trying to cram the square peg of cultural affiliation into the round hole of colonialism, we turn the tables on the monkey-mind’s one-dimensional moral tribalism and we usher in Joshua Greene’s multi-dimensional concept of metamorality.

By reinforcing global citizenry rather than nationalism, we turn the tables on both our lizard brains and the Powers That Be. Like Joshua Greene says in Moral Tribes, “We need a kind of thinking that enables groups with conflicting moralities to live together and prosper. In other words, we need a metamorality. We need a moral system that resolves disagreements among groups with different moral ideals, just as ordinary first-order morality resolves disagreements among individuals with different selfish interests.”

Going Meta with morality launches us into a big-picture perspective. We’re shot out of the box of outdated tribal thinking and into a realm of higher consciousness, where our inherent tribalism gets countered by an updated logic and reasoning. We gain the holistic vision of “over eyes” (like the astronaut Overview Effect), where societal delusions and cultural abstractions dissolve into interconnectedness and interdependence.

Overcoming magical thinking:

“Every fact of science was once damned. Every invention was considered impossible. Every discovery was a nervous shock to some orthodoxy. Every artistic innovation was denounced as fraud and folly. The entire web of culture and ‘progress,’ everything on earth that is man-made and not given to us by nature, is the concrete manifestation of some man’s refusal to bow to Authority. We would own no more, know no more, and be no more than the first apelike hominids if it were not for the rebellious, the recalcitrant, and the intransigent.” ~Robert Anton Wilson

Overcoming magical thinking is vital for the healthy and progressive evolution of our species. Healthy progress depends upon courageous individuals capable of challenging authority. Especially authorities that are based in magical thinking.

If we don’t have the courage to challenge an authority that preaches magical thinking, then we are doomed to become a victim to their magical thinking. It’s for this reason, above all, that authority should be challenged.

Refusing to bow to an authority is not without its consequences. But upsetting an authority should not be avoided at the expense of progress. Progress should be embraced at the risk of upsetting an authority.

Otherwise, there would be no progress. We would remain stuck in parochial, magical thinking. We would become a stagnant – or worse, devolving – species. To avoid unhealthy stagnation and entropic devolution, we need courageous individuals who refuse to bow to authority and instead choose to ruthlessly question and nonviolently challenge that authority.

Without those who are willing to disobey, we are lost. Without them, we are left with cowardly conformists, xenophobic nationalists, complacent pacifists, dogmatic believers relying upon blind faith, and tyrannical powermongers using their power to control others. In short: we are left with magical thinking over logic and reasoning.

So, I implore you, if you would be courageous, reasonable, healthy, progressive human beings: challenge Authority. Strategically disobey. Nonviolently revolt. Lovingly crush out. Tenderly recondition the cultural conditioning of others lest they collapse in upon their own cognitive dissonance. Dare to pull the blindfold from your brother’s eyes lest they unwittingly force the blindfold back upon you.

Above all, practice self-overcoming. Otherwise, power – either yours or someone else’s – will overcome you. Be just as circumspect with your own power as you are toward the power of others.

Authorities will come and go. As they should. Your own authority will wax and wane. As it should. The balance of power within the human condition is vital for the healthy and progressive evolution of our species. And nothing balances out power better than the courage to challenge authority. The biblical courage of David pales in comparison to the individual who bravely challenges the modern-day Goliath of entrenched authority.

Identity Theft and the Body’s Disappearance

By Robert Bohm

Source: The Hampton Institute

“What sphinx of cement and aluminum bashed open their skulls and ate up their brains and imagination?”

– Allen Ginsberg, from his poem “Howl”

Identity theft, at least the most familiar type, is possible because today the individual exists not merely as flesh and blood, but as flesh and blood spliced with bank account numbers, user names, passwords, credit card chips, etc. These added parts aren’t secondary to the individual’s overall identity, they’re central to it. Sometimes they’re all there is of it, as in many banking and purchasing transactions. In such instances, the data we’ve supplied to the relevant institutions doesn’t merely represent us, it is us. Our bodies alone can’t complete transactions without the account numbers, user names, passwords, credit card numbers, and ID cards which have become our identity’s essence. Without them, in many ways, we don’t exist.

In a worst case scenario, if someone gets hold of this private data, they can become us by possessing the data that is us. Following this, who or what we are is no longer a question. We don’t exist, except in the form of a stolen dataset now under someone else’s control.

In such a case, an unknown proxy has eliminated us and become who we once were.

Although problematic, the above form of identity theft is relatively minor. A worse form is one we all know about, yet chronically underestimate because we think of ourselves as too canny to be conned. Nonetheless, this other form of identity theft frames and limits everything we do. In the process, it fleeces us of the fullness of our identities and subjects our lives to a type of remote control. This remote control consists of the combined influence on us, from childhood onward, of society’s major institutions and dominant activities, which seed us with a variety of parameters for how to acceptably navigate society and its particular challenges.

This process is usually called “socialization.” However, it’s better seen as a sorting procedure in which society sifts us through a citizenship sieve in order to eliminate supposed defects, thereby guaranteeing that, despite each of us possessing unique characteristics, we share an underlying uniformity. Ultimately, this process is a kind of identity eugenics which strives to purify the population by eliminating or weakening troublesome qualities – e.g., an overly questioning attitude, chronic boundary-testing, a confrontational stance toward authority, a fierce protectiveness toward whatever space the body inhabits, etc. Such traits are frowned upon because they’re seen by the status quo as a likely threat to society’s stability.

Such indoctrination is much subtler yet, in many ways, more pervasive than outright propaganda. Its theater of operations is everywhere, taking place on many fronts. Public and private education, advertising, mass culture, government institutions, the prevailing ideas of how to correct socioeconomic wrongs (this is a “good” form of protest, this a “bad” one), the methods by which various slangs are robbed of their transgressive nature through absorption into the mainstream, the social production of substitute behaviors for nonconformity and rebellion – each of these phenomena and others play a role in generating the so-called “acceptable citizen,” a trimmed-down version of her or his original personality, one with reduced potential.

Make no mistake about it: this trimming of the personality is a form of identity theft. It is, in fact, the ultimate form. Take as an example the African slave in the U.S.: abducted from her or his homeland, forbidden from learning to read or write, denied legal standing in the courts, given no say over whether offspring would be sold to another owner or remain with them. The slave was robbed of her/his most essential identity: their status as a human being.

In his book The Souls of Black Folk, W.E.B. Du Bois described this theft in terms of how slavery reduces the slave to a person with “no true self-consciousness” – that is, with no stable knowledge of self, no clear sense of who she or he is in terms of culture, preceding generations, rituals for bringing to fruition one’s potential to create her or his own fate. As Du Bois correctly argued, this left the slave, and afterwards the freed Black, with a “longing to attain self-conscious manhood,” to know who she or he was, to see oneself through one’s own eyes and not through the eyes of one’s denigrators – e.g., white supremacists, confederate diehards, “good” people who nonetheless regarded Blacks as “lesser,” etc. Du Bois understood that from such people’s perspectives, Blacks possessed only one identity: the identity of being owned, of possessing no value other than what its owner could extract from them. Without an owner to extract this value, the slave was either identity-less or possessed an identity so slimmed and emaciated as to be a nothing.

The point here isn’t that today socialization enslaves the population in the same way as U.S. slavery once enslaved Blacks, but rather that identity theft is, psychologically and culturally speaking, a key aspect of disempowering people and has been for centuries. Today, because of mass culture and new technologies, the methods of accomplishing it are far more sophisticated than during other eras.

How disempowerment/identity theft occurs in contemporary society is inseparable from capitalism’s current state of development. We long ago passed the moment (after the introduction of assembly line production in the early 20th century) when modern advertising started its trek toward becoming one of the most powerful socialization forces in the U.S. As such, it convinces consumers not only to purchase individual products but, even more importantly, sells us on the idea that buying in general and all the time, no matter what we purchase, is proof of one’s value as a person.

To accomplish this end, modern advertising was molded by its creators into a type of PSYOP designed for destabilizing individuals’ adherence to old saws like “a penny saved is a penny earned” and “without frugality none can be rich, and with it very few would be poor.” Once this happened, the United States’ days of puritan buying restraint were over. However, modern advertising was never solely about undermining personal fiscal restraint. It was also about manipulating feelings of personal failure – e.g., dissatisfaction with lifestyle and income, a sense of being trapped, fear of being physically unappealing, etc. – and turning them not into motives for self-scrutiny or social critiques, but into a spur for commodity obsession. This wasn’t simply about owning the product or products, but an obsessive hope that buying one or more commodities would trigger relief from momentary or long-term anxiety and frustration related to one’s life-woes: job, marriage, lack of money, illness, etc.

Helen Woodward, a leading advertising copywriter of the early decades of the 20th century, described how this was done in her book Through Many Windows, published in 1926. One example she used focused on women as consumers:

The restless desire for a change in fashions is a healthy outlet. It is normal to want something different, something new, even if many women spend too much time and too much money that way. Change is the most beneficent medicine in the world to most people. And to those who cannot change their whole lives or occupations, even a new line in a dress is often a relief. The woman who is tired of her husband or her home or a job feels some lifting of the weight of life from seeing a straight line change into a bouffant, or a gray pass into a beige. Most people do not have the courage or understanding to make deeper changes.

Woodward’s statement reveals not only the advertising industry’s PSYOP characteristic of manipulating people’s frustrations in order to lure them into making purchases, but also the industry’s view of the people to whom it speaks through its ads. As indicated by Woodward’s words, this view is one of condescension, of viewing most consumers as unable to bring about real socioeconomic change because they lack the abilities – “the courage or understanding” – necessary to do so. Consequently, their main purpose in life, it is implied, is to exist as a consumer mass constantly gorging on capitalism’s products in order to keep the system running smoothly. In doing this, Woodward writes, buyers find in the act of making purchases “a healthy outlet” for troubled emotions spawned in other parts of their lives.

Such advertising philosophies in the early 20th century opened a door for the industry, one that would never again be closed. Through that door (or window), one could glimpse the future: a world with an ever greater supply of commodities to sell and an advertising industry ready to make sure people bought them. To guarantee this, advertisers set about creating additional techniques for reshaping public consciousness into one persuaded that owning as many of those commodities as possible was an existential exercise of defining who an individual was.

In his book The Consumer Society, philosopher Jean Baudrillard deals with precisely this process. He writes that such a society is driven by:

the contradiction between a virtually unlimited productivity and the need to dispose of the product. It becomes vital for the system at this stage to control not only the mechanism of production, but also consumer demand.

“To control … consumer demand.” This is the key phrase here. Capitalist forces not only wanted to own and control the means of production in factories; they also wanted to control consumers in such a way that consumers had no choice but to buy, then buy more. In other words, capitalism was in quest of a strategy engineered to make us synch our minds to a capitalism operating in overdrive (“virtually unlimited” production).

The way this occurs, Baudrillard argues, is by capitalism transforming (through advertising) the process of buying an individual product from merely being a response to a “this looks good” or “that would be useful around the house” attitude to something more in line with what psychologists call “ego integration” – that part of human development in which an individual’s various personality characteristics (viewpoints, goals, physical desires, etc.) are organized into a balanced whole. What advertising basically did for capitalism was develop a reconfigured ego integration process in which the personality is reorganized to view its stability as dependent on its life as a consumer.

Advertisers pulled this off because the commodity, in an age of commodity profusion, isn’t simply a commodity but is also an indicator or sign referring to a particular set of values or behavior, i.e. a particular type of person. It is this which is purchased: the meaning, or constellation of meanings, which the commodity indicates.

In this way, the commodity, once bought, becomes a signal to others that “I, the owner, am this type of person.” Buy an Old Hickory J143 baseball bat and those in the know grasp that you’re headed for the pros. Sling on some Pandora bling and all the guys’ eyes are on you as you hip-swing into the Groove Lounge. Even the NY Times is hip to what’s up. If you want to be a true Antifa activist, the newspaper informed its readers on Nov. 29, 2017, this is the attire you must wear:

Black work or military boots, pants, balaclavas or ski masks, gloves and jackets, North Face brand or otherwise. Gas masks, goggles and shields may be added as accessories, but the basics have stayed the same since the look’s inception.

After you dress up, it’s not even necessary to attend a protest and fight fascists to be full-blown Antifa. You’re a walking billboard (or signification) proclaiming your values everywhere. Dress the part and you are the part.

Let’s return to Baudrillard, though. In The System of Objects, another of his books, he writes about how the issue of signification, and the method by which individuals purchase particular commodities in order to refine their identity for public consumption, becomes the universal mass experience:

To become an object of consumption, an object must first become a sign. That is to say: it must become external, in a sense, to a relationship that it now merely signifies … Only in this context can it be ‘personalized’, can it become part of a series, and so on; only thus can it be consumed, never in its materiality, but in its difference.

This “difference” is what the product signifies. That is, the product isn’t just a product anymore. It isn’t only its function. It has transitioned into an indicator of a unique personality trait, or of membership in a certain lifestyle grouping or social class, or of subscribing to a particular political persuasion, Republican, anarchist, whatever. In this way, choosing which commodities to purchase is essential to one’s self-construction, one’s effort to make sure the world knows exactly who one is.

The individual produced by this citizen-forming process is a reduced one, the weight of her/his full personality pared down by cutting away the potentials and inclinations perceived as “not a good fit” for a citizen at this stage of capitalism. Such a citizen, however, isn’t an automaton. She or he makes choices, indulges her or his unique appetites, even periodically rebels against bureaucratic inefficiency or a social inequity perceived to be particularly stupid or unfair. Yet after a few days or a few months of this activity, this momentary rebel fades back into the woodwork, satisfied by their sincere but token challenge to the mainstream. The woodwork into which they fade is, of course, their home or another favorite location (a lover’s apartment, a bar, a ski resort cabin, a pool hall, etc.).

From this point on, or at least for the foreseeable future, such a person isn’t inclined to look at the world with a sharp political eye, except possibly within the confines of their private life. In this way, they turn whatever criticism of the mainstream they may have into a petty gripe, voiced with no intention of joining with others to fight for any specific change in the political, socioeconomic or cultural phenomenon against which the complaint has been lodged. Instead, all the complainer wants is congratulations from her or his listeners about how passionate, on-target, and right the complaint was.

This is the sieve process, identity eugenics, in action. Far more subtle and elastic than previous methods of social control, it narrows what we believe to be our options and successfully maneuvers us into a world where advertising shapes us more than schools do. In this mode, it teaches us that life’s choices aren’t so much about justice or morality, but more about what choosing between commodities is like: which is more useful to me in my private life, which one better defines me as a person, which one makes me look cooler, chicer, brainier, hunkier, more activist to those I know.

It is in this context that a young, new, “acceptable” citizen enters society as a walking irony. Raised to be a cog in a machine in a time of capitalistic excess, the individual arrives on the scene as a player of no consequence in a game in which she or he has been deluded that they’re the game’s star. But far from being a star, this person, weakened beyond repair by the surrender of too much potential, is so without ability that she or he has no impact whatsoever on the game. Consequently, this individual is, for all practical purposes, an absence. The ultimate invisible person, a nothing in the midst of players who don’t take note of this absence at all. And why should they? The full-of-potential individual who eventually morphed into this absence is long gone, remembered by no one, except as a fading image of what once was.

This process of reducing a potentially creative person into a virtual non-presence is a form of ideological anorexia. Once afflicted, an individual refuses nourishment until they’re nothing but skin and bones. However, the “weight” they’ve lost doesn’t consist of actual pounds. Instead, it involves a loss of the psychological heftiness and mental bulk necessary to be a full human being.

One can’t lose more weight than that.

Human life as we once knew it is gone, replaced by the ritual of endless purchasing. This is existence in what used to be called “the belly of the beast.” Our role in life has become to nourish capitalism by being at its disposal, by giving of ourselves. Such giving frequently entails self-mutilation: the debt, credit card and otherwise, that bludgeons to death the dreams of many individuals and families.

This quasi-religious self-sacrifice replicates, in another form, the Dark Ages practice of fanatical monks and other flagellants who lashed themselves with whips made from copper wires, ripping their flesh and bleeding until they descended into a state of religious hysteria. The more we give of ourselves in this way, the thinner and more weightless we become. Meanwhile, the god whom Allen Ginsberg called Moloch grows more obese day after day, its belly filled with:

Robot apartments! invisible suburbs! skeleton treasuries! blind capitals! demonic industries! spectral nations! invincible madhouses! granite cocks! monstrous bombs!…

Dreams! adorations! illuminations! religions! the whole boatload of sensitive bullshit!

What capitalism wants from us, of course, isn’t merely self-sacrifice, it’s surrender. Hunger for life is viewed negatively by the status quo because it nourishes the self, making it stronger and more alert and, therefore, better prepared to assert itself. The fact that such an empowered self is more there (possesses more of a presence) than its undersized counterpart makes the healthier self unacceptable to the powers that be. This is because there-ness is no longer an option in our national life. Only non-there-ness is. If you’re not a political anorexic, you’re on the wrong side.

Wherever we look, we see it. Invisibility, or at least as much of it as possible, is the individual’s goal. It’s the new real. Fashion reveals this as well as anything. It does so by disseminating an ideal of beauty that fetishizes the body’s anorexic wilting away. Not the body’s presence but its fade to disappearance is the source of its allure. The ultimate fashion model hovers fragilely on the brink of absence so as not to distract from the only thing which counts in capitalism: the commodity to be sold – e.g., the boutique bomber jacket, the shirt, the pantsuit, the earrings, the shawl, the stilettos, the iPhone, the Ferrari. And, possibly most of all, the political passivity intrinsic to spending our lives acquiring things in order to prove to others and to ourselves that we’ve discovered in these things something more useful than Socrates’ injunction to know thyself or Emma Goldman’s warning: “The most unpardonable sin in society is independence of thought.”

What is true on the fashion runway is also true in politics. Just as the best model is one thin enough to fade into non-presence, so our democracy, supposedly ruled “by and for the people,” has thinned down so much that “the people” can’t even be seen (except as stage props), let alone get their hands on democracy except in token ways. No matter how often we the people are praised rhetorically by politicians, we aren’t allowed as a group to get in the way of the capitalist system’s freedom to do whatever it wants in order to sustain commodity worship and guarantee capital’s right to permanent rule. If the military-industrial complex needs another war in order to pump out more profits, then so be it. We have no say in the matter. The identity theft built into society’s structure makes sure of this. It’s stripped us of our “weight” – our creativity, our willingness to take political risks, our capacity to choose action over posturing. After this forced weight loss, what’s left of us is a mess. Too philosophically and psychologically anemic to successfully challenge our leaders’ decisions, we, for all practical purposes, disappear.

As a reward for our passivity, we’re permitted a certain range of freedom – as long as “a certain range” is defined as “varieties of buying” and doesn’t include behavior that might result in the population’s attainment of greater political power.

So it continues: the only good citizen is the absent citizen. Which is to say, a citizen who has dieted himself or herself into a state of political anorexia – i.e., that level of mental weightlessness necessary for guaranteeing a person’s permanent self-exclusion from the machinery of power.

***

Our flesh no longer exists in the way it once did. A new evolutionary stage has arrived.

In this new stage, the flesh isn’t merely what it seems to be: flesh, pure and simple. Instead, it’s a hybrid. It’s what exists after the mind oversees its passage through the sieve of mass culture.

After this passage, what the flesh is now are the poses it adopts from studying movies, rappers, punk rockers, fashionistas of all kinds, reality TV stars, football hunks, whomever. It’s also what it wears, skinny jeans or loose-fitting chinos, short skirt or spandex, Hawaiian shirt or muscle tank top, pierced bellybutton, dope hiking boots, burgundy eyeliner. Here we come, marching, strolling, demon-eyed, innocent as Johnny Appleseed. Everybody’s snapping pics with their phones, selfies and shots of others (friends, strangers, the maimed, the hilarious, the so-called idiotic). The flesh’s pictures are everywhere. In movie ads, cosmetic ads, suppository ads, Viagra ads. This is the wave of the already-here but still-coming future. The actual flesh’s replacement by televised, printed, digitalized and Photoshopped images of it produces the ultimate self-bifurcation.

Increasingly cut off from any unmediated life of its own, the flesh now exists mostly as a natural resource for those (including ourselves) who need it for a project; to photograph it, dress it up, pose it in a certain way, put it on a diet, commodify/objectify it in any style ranging from traditional commodification to the latest avant-garde objectification.

All these stylings/makeovers, although advertised as a form of liberation for the flesh (a “freeing” of your flesh so you can be what you want to be), are in fact not that. Instead, they are part of the process of distancing ourselves from the flesh by always doing something to it rather than simply being it.

When we are it, we feel what the flesh feels, the pain, the joy, the satisfaction, the terror, the disgust, the hints of hope, a sense of irreparable loss, whatever.

When we objectify it, it is a mannequin, emotionless, a thing that uses up a certain amount of space. As such we can do what we want with it: decorate it, pull it apart, vent our frustrations on it, starve it, practice surgical cuts on it, put it to whatever use we like. It isn’t a person. It is separate from our personhood and we own it.

In fact we own all the world’s flesh.

We live, after all, in the American Empire, and the Empire owns everything. As the Empire’s citizens, we own everything it owns. Except for one thing: ourselves.

***

The flesh is both here and not here. Increasingly, it is more an object that we do things to – e.g., bulk it up, change its hair color, mass-kill it from a hotel window on the 32nd floor, view in a porno flick – than a presence in its own right (i.e., self-contained, a force to be reckoned with). In this sense, it is a growing absence, each day losing more of its self-determination and becoming more a thing lost than something that exists fully, on its own, in the here and now. Given this, the proper attitude to have toward the flesh is one of nostalgia.

Of course, the flesh hasn’t really disappeared. What has disappeared is what it once was, a meat-and-bones reality, a site of pleasure and injury. Now, however, it’s not so valuable in itself as in its role as a starting-off point for endless makeovers.

These makeover options are arrayed before the consumer everywhere: online, in big box stores, in niche markets and so on. Today, it is in these places, not at birth, that the flesh starts its trek toward maturation. It does this by offering itself up as a sacrifice to be used, as they see fit, by the fashion industry, the gym industry, the addiction-cure industry, the diet industry, the pharmaceutical industry, the education industry, etc. Each body in the nation reaches its fullest potential only when it becomes a testing site to be used by these industries as they explore more and better ways to establish themselves as indispensable to capitalism’s endless reproduction.

In the end, the flesh, the target of all this competition for its attention, has less of a life on its own than it does as the object of advertisers’ opinions about what can be done to improve it or to reconstruct it. Only to the extent that the flesh can transcend or reconstitute itself can it be said to be truly alive.

This last fact – about aliveness – represents the culmination of a process. This process pertains to the visualization and digitalization of everything and the consequent disappearance of everything behind a wall of signification.

A televised or computerized image, discussion, commentary, conjecture, etc., becomes the thing it meditates on, depicts or interprets. This happens by virtue of the fact that the thing itself (the real flesh behind the televised or computerized image, discussion, commentary, conjecture, etc.) has disappeared into the discussion or into the image of it presented on the computer or TV screen.

In the same way, an anorexic model (her/his flesh and blood presence) disappears into the fashions she or he displays for the public.

In each instance the thing (the flesh) now no longer exists except in other people’s meditations on it; it has become those other people’s meditations. The ultimate anorexic, it (the thing) has lost so much weight it’s no longer physically there except as an idea in someone else’s mind or in a series of binary codings inside computers.

This is the final victory of absence over there-ness, of the anorexic ideal over the idea of being fully human (i.e., “bulging with existence,” “fat with life”). The self has been successfully starved to the point of such a radical thinness that it can no longer stand up to a blade of grass, let alone make itself felt by the powers that be.

The Four Pillars of Disaster Shamanism

By Gary Z McGee

Source: The Mind Unleashed

“We need shamans, and if society doesn’t provide them, the universe will.” ~Joe Lewels

In the article The Archetypal Path to Getting Your Shit Together, I wrote about the power of archetypes. A Disaster Shaman is an example of using an archetype to make the world a better place; to become the change we want to see in the world.

The Disaster Shaman recipe combines aspects of The Shadow with aspects of The Hero and then mixes in a little Trickster tomfoolery. The combination of these archetypes creates a particular flavor of nontraditional shamanism that spearheads healthy Cosmic Law through the heart of unhealthy lawfulness.

A Disaster Shaman is a force of nature first and a person second, sowing radical revolution in order to reap progressive evolution. Healthy balance is primary, even if it comes at the expense of comfort and security. Disaster Shamanism (otherwise known as Apocalyptic Shamanism) is the interdependent shamanic response to apocalyptic times.

The primary tasks of a Disaster Shaman are as follows:

–Heal disaster situations through shamanic cosmology and ecopsychology.

–Listen to Universal Laws to discern the difference between healthy and unhealthy.

–Live moderately so that others may moderately live.

–Diagnose and heal Nature Deprivation.

–Transform weaponry into livingry.

–Bring “water” to the Wasteland.

The sub-archetypes of the Disaster Shaman archetype are The New Hero, The Fifth Horseman of the Apocalypse, The Sacred Clown, and The New Oracle. The four pillars of Disaster Shamanism subsume these sub-archetypes. Let’s break them down…

Hero-expiation (New Hero or Cosmic Hero):

“Our sole responsibility is to produce something smarter than we are; any problems beyond that are not ours to solve.” ~Ray Kurzweil

The average person is not heroic (courageous). Likewise, the average hero is not prestigious (provident). A New Hero, as opposed to a typical hero, is a hero with prestige. A New Hero has gone Meta with the concept of heroism itself.

The New Hero sub archetype is the first pillar of Disaster Shamanism. New heroism is gained through hero-expiation. Hero-expiation is the voluntary, wise, honorable, moral, and compassionate distribution of power so that power doesn’t get to the point that it corrupts.

It is through the wise distribution of power that a typical hero becomes a New Hero (cosmic hero, next-level hero) with skill, power, honor, and prestige, as opposed to just a typical hero with only skill and power.

Hero-expiation is all about getting power over power, whether one’s own power or the overreaching power of others. It’s about turning the tables on entrenched power by “counting coup” on it.

A Disaster Shaman as New Hero is a social leveling mechanism par excellence. They count coup on power through strategic humiliation (shaming) so that power never has the chance to become absolute.

When a New Hero comes to town, fear-filled blue pills are swapped with wisdom-filled red pills. The Disaster Shaman has come to set the record straight. To recondition the culture that has been conditioned into believing in competition over cooperation and narrow-minded one-upmanship over open-minded compassion. They teach that competition has always been secondary to cooperation; otherwise we wouldn’t have survived as a species (Darwin).

Eco-consciousness (Fifth Horseman of the Apocalypse):

“Extreme positions are not succeeded by moderate ones, but by contrary extreme positions.” ~Friedrich Nietzsche

Eco-consciousness is about uncontrolled order overcoming controlled chaos. Whether the culture (tribe) has become too obese, too greedy, too violent or some other unhealthy excess, the Disaster Shaman arrives to set the record straight; to plant a seed of overt moderation within the covert immoderation.

The Fifth Horseman is the one who cleans up after the original Four Horsemen (a metaphor for anyone caught up, aware or not, in any kind of unsustainable, unhealthy, violent, immoral, or mass-destructive social system). The Fifth Horseman of the Apocalypse is an archetype representing rebirth and renewal in the face of conquest, war, famine, and death.

The Fifth Horseman is ruthless with her healing powers. Her name is Providence. Phoenix-like, she rises up from the ashes of war & decay to spread self-actualized love, open-mindedness, and progressive sustainability by digging up the decay and unsustainable residue of past and present civilizations and then using it all as compost in cultivating and growing a healthier, more balanced future.

In that capacity she has devoted herself to planting gardens of eco-centric heroism in the humus of war, hate, close-mindedness and greed, and anything else left behind by the original four horsemen. She is dedicated to, as Buckminster Fuller said, transforming weaponry into livingry.

She subsumes the original four horsemen by teaching them that the new definition of right & wrong must be derived from the universal dictation of healthy & unhealthy rather than the human opinion of good & evil.

The Fifth Horseman is the Goddess of Recompense. She is the Verdant Force. She is the soft hammer of evolution. She has come to blur the false boundaries that have been erected between nature and the human soul.

She is Gaia. She is Lady Justice. She is the return of the Sacred Feminine. She is all of us, men and women, realizing that we are nature first and humans second, that we are soul first and ego second. She is weighing the worth of the human world with the Scales of Justice. With or without us, she will not fail to bring water to the wasteland.

High humor (Sacred Clown):

“What is a tragedy but a misunderstood comedy.” ~Shakespeare

Most of us are familiar with the prototypical clowns: red-nosed clowns, court jesters, and Tarot fools. But sacred clowns take clowning to a whole other level.

Almost all types of sacred clowns combine trickster spirit with shamanic wisdom to create a kind of sacred tomfoolery that keeps the zeitgeist in check. Their methods are unconventional and typically antithetical to the status quo, but extremely effective. They indirectly reinforce societal customs by directly injecting their own powerful sense of humor into the social dynamic.

The main function of a sacred clown is to deflate the ego of power by reminding those in power of their own fallibility, while also reminding those who are not in power that power has the potential to become corrupt if it’s not balanced with other forces, namely with humor. But sacred clowns don’t outright deride things. They’re not comedians, per se, though they can be. They are more like personified trickster gods, poking holes in things that people take too seriously.

Through acts of satire and showy displays of blasphemy, sacred clowns create a cultural dissonance born from their Crazy Wisdom, from which serious anxiety is free to collapse on itself into sincere laughter.

The high humor of Sacred Clowns leads to a higher courage and the audacity to speak truth to power. And they do so with silver-tongued proficiency. There exists no perceived construct of power that’s above their enlightened rebellion. No idol too golden. No high horse too high. No pedestal too revered. No “wizard” too disguised. No God too godly. No title too contrived.

Nothing is immune to the exactness of a Sacred Clown’s rebellion. It’s all merely procrastinating compost. It’s all just well-arranged armor waiting to rust. It’s all an illusion within a delusion. And the Sacred Clown has the enlightened sense of humor to reveal that absolute fact.

Lest we write our lives off to unhealthy stagnation and devolving inertia, we must become something that has the power to perpetually overcome itself. The sacred clown has this power. Paraphrasing William Blake, “If the fool (Sacred Clown) would persist in his folly, he would become wise (New Oracle).”

Self-overcoming (New Oracle):

“To attain knowledge, add things every day. To attain wisdom, remove things every day.” ~Lao Tzu

A Disaster Shaman is a New Oracle who has come to inform the old oracles that they have failed. The self-centered “culture” has been declared a wasteland, and the unhealthy surroundings dubbed unworthy for healthy humans attempting to evolve into a more robust species.

The New Oracle teaches this, above all: Pain should not be avoided at the expense of love. Love should be embraced at the risk of pain.

As such, a master with high humor is needed to resolve the disaster of the self; to usher in an eco-centric, as opposed to an ego-centric, perspective. This master lies dormant inside us all. It can only be found by having the out-of-mind experience of no-mind, in the courageous throes of self-overcoming. There, in the stillness, the master is meditating. The master is connected to the source of all things, his thousand-petalled lotus spinning like a galaxy above his head.

He is radiating inside of you, bursting with wisdom and nth-degree-questions. She pirouettes like Shakti. He foxtrots like Shiva. He/she is the all-dancing, all-laughing oracle of the primordial self, interconnected with all things. And it can only be found there in the silence, between inhale and exhale, between being and non-being, between mind and no-mind, between finitude and infinity.

After disaster, but just before mastery, the Disaster Shaman as New Oracle persistently self-actualizes toward enlightenment.

There, above thought, is the source of human creativity: the place where artists, poets, musicians, and even scientists have discovered the secrets of the universe. Like Leonard Cohen said, “You lose your grip, and then you slip into the masterpiece.”

The mind of Everything holds the heart of Nothingness; the void of Nothingness holds the core of Everything. The masterpiece is the working, self-overcoming, canvas of the New Oracle’s life. The New Oracle is forever in the process of seizing the moment in order to seize the day in order to seize the life.

In the end, the Disaster Shaman is a healthy response to our unhealthy times. Wielding the courage of the New Hero, brandishing the mettle of the Fifth Horseman, harnessing the humor of the Sacred Clown, and channeling the wisdom of the New Oracle, the Disaster Shaman is determined to tonalize an otherwise atonal world.

“One Long Discomfort”: The Legacy and Future of David Lindsay’s ‘A Voyage to Arcturus’

By Ben Schwartz

Source: We Are the Mutants

Ballantine “Adult Fantasy” edition, 1973, with cover art by Bob Pepper

David Lindsay’s masterpiece A Voyage to Arcturus was first published in London in 1920 by Methuen & Co. It came dressed in a simple red cloth cover; no dust jacket, just the title and author’s name debossed into the front. This first printing sold fewer than 600 copies, and so Arcturus didn’t come to the US until Macmillan brought it out in 1964. In 1968, Ballantine picked it up after the massive success of the publisher’s Lord of the Rings paperbacks, and, for the first time ever, the cover featured bespoke art, painted by Bob Pepper. The printing predated Ballantine’s influential Adult Fantasy series, edited by Lin Carter, but was eventually given honorary membership, with later printings carrying the unicorn stamp and benefiting from the cachet the series possessed.

With the late-1960s Lord of the Rings phenomenon leading the charge, speculative fiction, and Arcturus with it, rode into the public consciousness on about as high a tide as it has ever had. Lindsay’s biographer Bernard Sellin notes that Ballantine’s edition “[had]… overtaken all the accumulated efforts of forty years” in terms of circulating Lindsay’s first novel. But he’s quick to point out that Lindsay’s audience is still limited, and that “The average, sensual reader is in serious danger of being disappointed in Lindsay.” Sellin wrote this in 1981 and, with a weird choice of words, envisions a “‘superior race’ of readers, anxious to go beyond the plot” of Arcturus and grasp what it’s really about. Today, in 2018, Lindsay’s potential audience, superior or otherwise, struggles against a vanishing text.

In the UK, Gollancz brought out an Arcturus reissue in the ’40s (the “novel… is regarded by some of those who have read it as a work of genius,” the cover read), which was subsequently routed into their “Rare Works of Imaginative Fiction” reissues in the early ’60s. Today, the label keeps it alive in its “Fantasy Masterworks” series as an affordable paperback. A high-quality limited edition from Savoy Books was the high point of its publication history, but that small batch is fifteen years gone now.

In the States, the novel languishes in Print on Demand Hell. Most readily available copies are ill-starred editions from nebulous outfits bearing names like CreateSpace and Wilder Publications, featuring non sequitur cover images that look like refugees from a Windows ME screensaver folder: a field of wheat, a macro of autumn leaves, an anonymous, slightly-out-of-focus Roman ruin. Even outside of PoD territory there are some seriously janky efforts, leprous with typos: the first printing of Arcturus from Bison Press misspelled the word “Commemorative” on its own cover, and newer printings still contain fistfuls of errors.

And this is a book that counts Clive Barker, Alan Moore, Michael Moorcock, and Jeff VanderMeer among its admirers. C.S. Lewis called it the “real father” of his Space Trilogy. Pathological anti-genre lit critic Harold Bloom’s sole piece of published fiction—ever—is a pseudo-sequel to Arcturus called The Flight to Lucifer. Colin Wilson, who became a literary sensation with the publication of The Outsider in 1956, put it in his curriculum while teaching and wrote multiple essays about Lindsay. These and other enthusiasts have tended the flame over the years, keeping the book visible to the small cadre of readers that are likely to respond to it. But will Arcturus ever grow beyond that niche audience?

It may be helpful to explain what readers find when they pick up the novel. On a superficial level, A Voyage to Arcturus is a spacefaring adventure of a strong, competent hero, same as you’d find in any number of time-yellowed pulp paperbacks. After a few strange chapters spent on earth, our hero, Maskull, and his two companions, Nightspore and Krag, journey to Tormance, a planet orbiting Arcturus, which in the book is a binary star with two suns, Branchspell and Alppain. Maskull wakes alone in a fantastical desert on Tormance, and quickly becomes embroiled in this new world. There are rocket ships, tentacle arms, dreamlike landscapes—Tormance is prodigious when it comes to landscapes: like Ifdawn Marest, a place of crags and mountains that are constantly sinking and shooting up in fatal, vertiginous thousand-foot shifts; or Matterplay, a valley so replete with life energy that new beings literally pop into existence, fully formed; or the Sinking Sea, whose water varies in density from place to place and which Maskull navigates by riding a giant, semi-living treelike creature. The evocative names of places and people have a distinctly Amazing Stories vibe: Disscourn, Panawe, Corpang, the Lusion Plain.

Maskull sets out ostensibly looking for Nightspore and Krag. But as he proceeds, it becomes clear that his purpose on Tormance is tied to that of a being called Surtur, who draws Maskull northward with a slow, insistent drumbeat that only he can hear. Every chapter sees Maskull enter a new region of Tormance, each with its own particular landscape and specific philosophical culture—a sort of Gulliver’s Travels recast as a troubling, darkly symbolic dream. Ifdawn Marest lives violently, crudely, simply—its residents engage in contests of mind control to dominate, torture, and kill one another. The land of Sant houses vain ascetics who have renounced all the physical pleasures of the world. In Matterplay, Maskull encounters the last of the phaen, an ancient race composed not of men or women but a third, primordial gender. Names of other supreme beings are revealed: some mention Muspel, but many talk of Crystalman, possibly another god, or maybe just another name for Surtur—the Tormancians’ accounts vary. But when people die on Tormance, their faces twist into a nauseating smile known as Crystalman’s grin. The precise cosmology always remains just out of focus, however, and this refusal to resolve comes to drive Maskull forward more than the thought of finding his companions. And through this driving impetus, Maskull finds each place, each philosophy, exposed as limited, false, incomplete. This falseness usually results in an explosion of ugly violence, and Maskull, often as not, is perpetrating it.

And so the book proceeds, like some dark, cosmic picaresque, until Maskull reaches Surtur’s Ocean, the northernmost ocean of Tormance. He reunites with Krag, who seems to be expecting him. Krag takes the physically failing Maskull on a raft out to sea, on a journey to Muspel, which Maskull learns is the name of the “true world,” the world outside the corruption of illusory things. As they sail along, Maskull, exhausted and spent, dies, which somehow releases Nightspore back into being. Then Krag lets Nightspore off at a lone edifice in the sea. As he ascends through it, Nightspore stops at a succession of windows that show him the nature of reality: there is Muspel, Surtur’s world, the impartial, pure, true world that most are prevented from seeing by the illusory world of Crystalman, who is not an aspect of Surtur but an embodiment of deceit and distraction. Violence, art, love, talk, work, play—all of these are tools Crystalman uses to ensnare the spark of Muspel contained in each living thing, preventing that life from returning to the world it came from. All the inhabitants of Tormance and their multifarious philosophies were blinded to this truth by Crystalman—and that’s why, when they died, their faces contorted into Crystalman’s Grin, the signature of his triumph over their souls.

Arcturus ends with the resurrected/transmogrified/newborn Nightspore descending the tower and meeting up with Krag again, who reveals that he is Surtur, and that his name on earth is Pain. Nightspore steps back onto the raft and the two sail away into the darkness, presumably to continue their struggle against Crystalman, on earth or elsewhere. It’s a powerful, striking, triumphless ending—a metaphysical cliffhanger that opens up long avenues of thought.

Anybody reading with their internal aerial up and receiving would have noticed something going on with Arcturus before the final chapters, but they are only the biggest among many clues that make it clear the novel is more than a weightless adventure yarn. Maskull is an off-putting protagonist. He’s animated less by personality and more by some psychic decree outside of his control (authorial or otherwise). He’s got the wrong proportions for a standard hero: Lindsay describes him as “a kind of giant, but of broader and more robust physique than most giants,” with a full beard, short bristling hair, and features that are “thick and heavy, coarsely modeled, like those of a wooden carving”—and yet with eyes sparkling with “intelligence and audacity.” He’s impulsive, driven, and violent—and that violence is key to the dark energy that propels Arcturus away from genre pulp into deeper, thornier territory.

Much early speculative fiction created vistas of longing; it showed better worlds, nobler peoples, purer ways of living. The Lord of the Rings set the standard in this regard, but it was hardly alone, and not the first. The Worm Ouroboros, Lud-in-the-Mist, and Time and the Gods are others—all committed to beauty and magic and bravery as antidotes to our own world. They didn’t deny their correlation to accepted reality, but they actively opposed aspects of that reality by showing us better versions. Arcturus, rather than look outward over the hills of faerie, turns inward, drills down until it exposes its fundamental vision of existence, and that vision is a searing one. Its aspect is fire, and whereas most speculative fiction is aspirational, Arcturus is agonized; reality is, like the unearthly wound Maskull receives from Krag, “one long discomfort,” a galaxy of damnation:

Millions of grotesque, vulgar, ridiculous, sweetened individuals – once Spirit – were calling out from their degradation and agony for salvation from Muspel…

Arcturus the planet isn’t meant to be “real” like Minas Tirith or Lud-in-the-Mist or Witchland are meant to be real. Instead of creating another world, Lindsay showed us our own; refracted through the alien metaphors of Tormance, yes, but nevertheless recognizable. As anthropologist Loren Eiseley notes in his introduction to the Ballantine edition, Arcturus is really “a long earth journey.” There’s a dystopia in Lindsay’s novel, though the dystopia is not political or societal, but metaphysical. It’s not a nightmare city, but a nightmare world; not a corrupt government, but a corrupt soul. Maskull’s vicious, driving nature allows him to open that final door for readers.

Naturally, this dark, anguished, philosophical heart impacted Arcturus’ initial sales. In 1920, science fiction seemed impossibly far from literary “respectability.” There was a strong undercurrent of literary speculative fiction at the time, but it wasn’t universally popular and certainly not accepted by the establishment. Arcturus came blazing fully-formed into the world, subverting tropes that had barely been established. And you can imagine potential readers either avoiding Arcturus because of those tropes, or dropping it because it didn’t thoroughly conform to nascent genre conventions. Arcturus did itself no commercial favors by tapping SF in the name of art. It made itself a black sheep among black sheep.

Sellin ends his ’81 overview of Lindsay’s life and work as all essays on Arcturus and Lindsay end: with hope for a wider readership in the future. But I predict Arcturus will continue to be preserved by a small but vocal readership—no more. I think it has already assumed the strange, somewhat sour mantle of an “influential” classic, one whose most visible legacy will always be the way it presaged so much that came after. Once you read Arcturus, you’re always finding chunks of it here and there, like burning fragments of an exploded spaceship smoldering in a field. Its Mariana Trench pessimism turns up in Harlan Ellison and, with a paranoiac twist, in Philip K. Dick. Its deep exploration of reality through violence and sexuality brings to mind A Clockwork Orange and Dhalgren; and Maskull’s surrender into a metaphysical system vaster than himself hits on core conceits in much of Pynchon. And most obviously, science fiction as metaphor for our own world, our own souls, was a shocking and (to some) ugly experiment in Arcturus—but today it’s as common as grass.

I think the novel’s admirers want recognition for Arcturus because Lindsay’s life is always painted as one of frustration, where recognition for his accomplishments was continually withheld. And that’s true. But he also created a masterwork, and it seems weird to quibble with immortality, no matter how it comes. Even today, Lindsay’s first novel stands out in any literary landscape, casting a long shadow: an architecture phased in from a parallel dimension both alien and familiar.

Disarming the Weapons of Mass Distraction

By Madeleine Bunting

Source: Rise Up Times

“Are you paying attention?” The phrase still resonates with a particular sharpness in my mind. It takes me straight back to my boarding school, aged thirteen, when my eyes would drift out the window to the woods beyond the classroom. The voice was that of the math teacher, the very dedicated but dull Miss Ploughman, whose furrowed grimace I can still picture.

We’re taught early that attention is a currency—we “pay” attention—and much of the discipline of the classroom is aimed at marshaling the attention of children, with very mixed results. We all have a history here, of how we did or did not learn to pay attention and all the praise or blame that came with that. It used to be that such patterns of childhood experience faded into irrelevance. As we reached adulthood, how we paid attention, and to what, was a personal matter and akin to breathing—as if it were automatic.

Today, though, as we grapple with a pervasive new digital culture, attention has become an issue of pressing social concern. Technology provides us with new tools to grab people’s attention. These innovations are dismantling traditional boundaries of private and public, home and office, work and leisure. Emails and tweets can reach us almost anywhere, anytime. There are no cracks left in which the mind can idle, rest, and recuperate. A taxi ad offers free wifi so that you can remain “productive” on a cab journey.

Even those spare moments of time in our day—waiting for a bus, standing in a queue at the supermarket—can now be “harvested,” says the writer Tim Wu in his book The Attention Merchants. In this quest to pursue “those slivers of our unharvested awareness,” digital technology has provided consumer capitalism with its most powerful tools yet. And our attention fuels it. As Matthew Crawford notes in The World Beyond Your Head, “when some people treat the minds of other people as a resource, this is not ‘creating wealth,’ it is transferring it.”

There’s a whiff of panic around the subject: the story that our attention spans are now shorter than a goldfish’s attracted millions of readers on the web; it’s still frequently cited, despite its questionable veracity. Rates of diagnosis of attention deficit hyperactivity disorder in children have soared, creating an $11 billion global market for pharmaceutical companies. Every glance of our eyes is now tracked for commercial gain as ever more ingenious ways are devised to capture our attention, if only momentarily. Our eyeballs are now described as capitalism’s most valuable real estate. Both our attention and its deficits are turned into lucrative markets.

There is also a domestic economy of attention; within every family, some get it and some give it. We’re all born needing the attention of others—our parents’, especially—and from the outset, our social skills are honed to attract the attention we need for our care. Attention is woven into all forms of human encounter from the most brief and transitory to the most intimate. It also becomes deeply political: who pays attention to whom?

Social psychologists have researched how the powerful tend to tune out the less powerful. One study with college students showed that even in five minutes of friendly chat, wealthier students showed fewer signs of engagement when in conversation with their less wealthy counterparts: less eye contact, fewer nods, and more checking the time, doodling, and fidgeting. Discrimination of race and gender, too, plays out through attention. Anyone who’s spent any time in an organization will be aware of how attention is at the heart of office politics. A suggestion is ignored in a meeting, but is then seized upon as a brilliant solution when repeated by another person.

What is political is also ethical. Matthew Crawford argues that a basic recognition of others is the essential characteristic of urban living.

And then there’s an even more fundamental dimension to the politics of attention. At a primary level, all interactions in public space require a very minimal form of attention, an awareness of the presence and movement of others. Without it, we would bump into each other, frequently.

I had a vivid demonstration of this point on a recent commute: I live in East London and regularly use the narrow canal paths for cycling. It was the canal rush hour—lots of walkers with dogs, families with children, joggers as well as cyclists heading home. We were all sharing the towpath with the usual mixture of give and take, slowing to allow passing, swerving around and between each other. Only this time, a woman was walking down the center of the path with her eyes glued to her phone, impervious to all around her. This went well beyond a moment of distraction. Everyone had to duck and weave to avoid her. She’d abandoned the unspoken contract that avoiding collision is a mutual obligation.

This scene is now a daily occurrence for many of us, in shopping centers, station concourses, or on busy streets. Attention is the essential lubricant of urban life, and without it, we’re denying our co-existence in that moment and place. The novelist and philosopher Iris Murdoch writes that the most basic requirement for being good is that a person “must know certain things about his surroundings, most obviously the existence of other people and their claims.”

Attention is what draws us out of ourselves to experience and engage in the world. The word is often accompanied by a verb—attention needs to be grabbed, captured, mobilized, attracted, or galvanized. Reflected in such language is an acknowledgement of how attention is the essential precursor to action. The founding father of psychology William James provided what is still one of the best working definitions:

It is the taking possession by the mind, in clear and vivid form, of one out of what seem several simultaneously possible objects or trains of thought. Focalization, concentration, of consciousness are of its essence. It implies withdrawal from some things in order to deal effectively with others.

Attention is a limited resource and has to be allocated: to pay attention to one thing requires us to withdraw it from others. There are two well-known dimensions to attention, explains Willem Kuyken, a professor of psychology at Oxford. The first is “alerting”— an automatic form of attention, hardwired into our brains, that warns us of threats to our survival. Think of when you’re driving a car in a busy city: you’re aware of the movement of other cars, pedestrians, cyclists, and road signs, while advertising tries to grab any spare morsel of your attention. Notice how quickly you can swerve or brake when you spot a car suddenly emerging from a side street. There’s no time for a complicated cognitive process of decision making. This attention is beyond voluntary control.

The second form of attention is known as “executive”—the process by which our brain selects what to foreground and focus on, so that there can be other information in the background—such as music when you’re cooking—but one can still accomplish a complex task. Crucially, our capacity for executive attention is limited. Contrary to what some people claim, none of us can multitask complex activities effectively. The next time you write an email while talking on the phone, notice how many typing mistakes you make or how much you remember from the call. Executive attention can be trained, and needs to be for any complex activity. This was the point James made when he wrote: “there is no such thing as voluntary attention sustained for more than a few seconds at a time… what is called sustained voluntary attention is a repetition of successive efforts which bring back the topic to the mind.”

Attention is a complex interaction between memory and perception, in which we continually select what to notice, thus finding the material which correlates in some way with past experience. In this way, patterns develop in the mind. We are always making meaning from the overwhelming raw data. As James put it, “my experience is what I agree to attend to. Only those items which I notice shape my mind—without selective interest, experience is an utter chaos.”

And we are constantly engaged in organizing that chaos, as we interpret our experience. This is clear in the famous Gorilla Experiment in which viewers were told to watch a video of two teams of students passing a ball between them. They had to count the number of passes made by the team in white shirts and ignore those of the team in black shirts. The experiment is deceptively complex because it involves three forms of attention: first, scanning the whole group; second, ignoring the black T-shirt team to keep focus on the white T-shirt team (a form of inhibiting attention); and third, remembering to count. In the middle of the experiment, someone in a gorilla suit ambles through the group. Afterward, when asked, half the viewers hadn’t spotted the gorilla and could scarcely believe it had been there. We can be blind not only to the obvious, but to our blindness.

There is another point in this experiment which is less often emphasized. Ignoring something—such as the black T-shirt team in this experiment—requires a form of attention. It costs us attention to ignore something. Many of us live and work in environments that require us to ignore a huge amount of information—that flashing advert, a bouncing icon or pop-up.

In another famous psychology experiment, Walter Mischel’s Marshmallow Test, four-year-olds were given a choice: eat one marshmallow immediately, or wait fifteen minutes and receive two. Each child was left alone in a room, on camera, in front of a plate with a marshmallow. They squirmed and fidgeted, poked the marshmallow and stared at the ceiling. A third of the children couldn’t resist the marshmallow and gobbled it up, a third nibbled cautiously, but the last third figured out how to distract themselves. They looked under the table, sang… did anything but look at the sweet. It’s a demonstration of the capacity to reallocate attention. In a follow-up study some years later, those who’d been able to wait for the second marshmallow had better life outcomes, such as academic achievement and health. One New Zealand study of 1,000 children found that this form of self-regulation was a more reliable predictor of future success and wellbeing than even a good IQ or comfortable economic status.

What, then, are the implications of how digital technologies are transforming our patterns of attention? In the current political anxiety about social mobility and inequality, more weight needs to be put on this most crucial and basic skill: sustaining attention.

*

I learned to concentrate as a child. Being a bookworm helped. I’d be completely absorbed in my reading as the noise of my busy family swirled around me. It was good training for working in newsrooms; when I started as a journalist, they were very noisy places with the clatter of keyboards, telephones ringing and fascinating conversations on every side. What has proved much harder to block out is email and text messages.

The digital tech companies know a lot about this widespread habit; many of them have built a business model around it. They’ve drawn on the work of the psychologist B.F. Skinner, who identified back in the Thirties how, in animal behavior, an action can be encouraged with a positive consequence and discouraged by a negative one. In one experiment, he gave a pigeon a food pellet whenever it pecked at a button and the result, as predicted, was that the pigeon kept pecking. Subsequent research established that the most effective way to keep the pigeon pecking was “variable-ratio reinforcement.” Give the pigeon a food pellet only sometimes, at unpredictable intervals, and you have it well and truly hooked.

We’re just like the pigeon pecking at the button when we check our email or phone. It’s a humiliating thought. Variable reinforcement ensures that the customer will keep coming back. It’s the principle behind one of the most lucrative US industries: slot machines, which generate more profit than baseball, films, and theme parks combined. Gambling was once tightly restricted for its addictive potential, but most of us now have the attentional equivalent of a slot machine in our pocket, beside our plate at mealtimes, and by our pillow at night; it is there even during a meal out, a play at the theater, a film, or a tennis match. Almost nothing is now experienced uninterrupted.

Anxiety about the exponential rise of our gadget addiction and how it is fragmenting our attention is sometimes dismissed as a Luddite reaction to a technological revolution. But that misses the point. The problem is not the technology per se, but the commercial imperatives that drive the new technologies and, unrestrained, colonize our attention by fundamentally changing our experience of time and space, saturating both in information.

In much public space, wherever your eye lands—from the back of the toilet door, to the handrail on the escalator, or the hotel key card—an ad is trying to grab your attention, and does so by triggering the oldest instincts of the human mind: fear, sex, and food. Public places become dominated by people trying to sell you something. In his tirade against this commercialization, Crawford cites advertisements on the backs of school report cards and on debit machines where you swipe your card. Before you enter your PIN, that gap of a few seconds is now used to show adverts. He describes silence and ad-free experience as “luxury goods” that only the wealthy can afford. Crawford has invented the concept of the “attentional commons,” free public spaces that allow us to choose where to place our attention. He draws the analogy with environmental goods that belong to all of us, such as clean air or clean water.

Some legal theorists are beginning to conceive of our own attention as a human right. One former Google employee warned that “there are a thousand people on the other side of the screen whose job it is to break down the self-regulation you have.” They use the insights into human behavior derived from social psychology—the need for approval, the need to reciprocate others’ gestures, the fear of missing out. Your attention ceases to be your own, pulled and pushed by algorithms. Attention is referred to as the real currency of the future.

*

In 2013, I embarked on a risky experiment in attention: I left my job. In the previous two years, it had crept up on me. I could no longer read beyond a few paragraphs. My eyes would glaze over and, even more disastrously for someone who had spent their career writing, I seemed unable to string together my thoughts, let alone write anything longer than a few sentences. When I try to explain the impact, I can only offer a metaphor: it felt like my imagination and use of language were vacuum packed, like a slab of meat coated in plastic. I had lost the ability to turn ideas around, see them from different perspectives. I could no longer draw connections between disparate ideas.

At the time, I was working in media strategy. It was a culture of back-to-back meetings from 8:30 AM to 6 PM, and there were plenty of advantages to be gained from continuing late into the evening if you had the stamina. Commitment was measured by emails with a pertinent weblink. Meetings were sometimes as brief as thirty minutes and frequently ran through lunch. Meanwhile, everyone was sneaking time to battle with the constant emails, eyes flickering to their phone screens in every conversation. The result was a kind of crazy fog, a mishmash of inconclusive discussions.

At first, it was exhilarating, like being on those crazy rides in a theme park. By the end, the effect was disastrous. I was almost continuously ill, battling migraines and unidentifiable viruses. When I finally made the drastic decision to leave, my income collapsed to a fraction of its previous level and my family’s lifestyle had to change accordingly. I had no idea what I was going to do; I had lost all faith in my ability to write. I told friends I would have to return the advance I’d received to write a book. I had to try to get back to the skills of reflection and focus that had once been ingrained in me.

The first step was to teach myself to read again. I sometimes went to a café, leaving my phone and computer behind. I had to slow down the racing incoherence of my mind so that it could settle on the text and its gradual development of an argument or narrative thread. The turning point in my recovery was a five-week research trip to the Scottish Outer Hebrides. On the journey north of Glasgow, my mobile phone lost its Internet connection. I had cut myself loose with only the occasional text or call to family back home. Somewhere on the long Atlantic beaches of these wild and dramatic islands, I rediscovered my ability to write.

I attribute that in part to a stunning exhibition I came across in the small harbor town of Lochboisdale, on the island of South Uist. Vija Celmins is an acclaimed Latvian-American artist whose work is famous for its astonishing patience. She can take a year or more to make a woodcut that portrays in minute detail the surface of the sea. A postcard of her work now sits above my desk, a reminder of the power of slow thinking.

Just as we’ve had a slow eating movement, we need a slow thinking campaign. Its manifesto could be the German poet Rainer Maria Rilke’s beautiful “Letters to a Young Poet”:

To let every impression and the germ of every feeling come to completion inside, in the dark, in the unsayable, the unconscious, in what is unattainable to one’s own intellect, and to wait with deep humility and patience for the hour when a new clarity is delivered.

Many great thinkers attest that they have their best insights in moments of relaxation, the proverbial brainwave in the bath. We actually need what we most fear: boredom.

When I left my job (and I was lucky that I could), friends and colleagues were bewildered. Why give up a good job? But I felt that here was an experiment worth trying. Crawford frames it well as “intellectual biodiversity.” At a time of crisis, we need people thinking in different ways. If we all jump to the tune of Facebook or Instagram and allow ourselves to be primed by Twitter, the danger is that we lose the “trained powers of concentration” that allow us, in Crawford’s words, “to recognize that independence of thought and feeling is a fragile thing, and requires certain conditions.”

I also took to heart the insights of the historian Timothy Snyder, who concluded from his studies of twentieth-century European totalitarianism that the way to fend off tyranny is to read books, make an effort to separate yourself from the Internet, and “be kind to our language… Think up your own way of speaking.” Dropping out and going offline enabled me to get back to reading, voraciously, and to writing; beyond that, it’s too early to announce the results of my experiment with attention. As Rilke said, “These things cannot be measured by time, a year has no meaning, and ten years are nothing.”

*

A recent column in The New Yorker cheekily suggests that all the fuss about the impact of digital technologies on our attention is nothing more than writers’ worrying about their own working habits. Is all this anxiety about our fragmenting minds a moral panic akin to those that swept Victorian Britain about sexual behavior? Patterns of attention are changing, but perhaps it doesn’t much matter?

My teenage children read much less than I did. One son used to play chess online with a friend, text on his phone, and do his homework all at the same time. I was horrified, but he got a place at Oxford. At his interview, he met a third-year history undergraduate who told him he hadn’t yet read any books in his time at university. But my kids are considerably more knowledgeable about a vast range of subjects than I was at their age. There’s a small voice suggesting that the forms of attention I was brought up with could be a thing of the past; the sustained concentration required to read a whole book will become an obscure niche hobby.

And yet, I’m haunted by a reflection: the magnificent illuminations of the eighth-century Book of Kells have intricate patterning that no one has ever been able to copy, such is the fineness of the tight spirals. Lines are a millimeter apart. They indicate a steadiness of hand and mind—a capability most of us have long since lost. Could we be trading away our capacity for focus in exchange for breadth of reference? Some might argue that’s not a bad trade. But we would lose depth: the artist Paul Klee wrote that he would spend a day in silent contemplation of something before he painted it. Paul Cézanne was similarly known for his trance-like attention to his subject. Madame Cézanne recollected how her husband would gaze at the landscape; he told her, “The landscape thinks itself in me, and I am its consciousness.” The philosopher Maurice Merleau-Ponty describes a contemplative attention in which one steps outside of oneself and immerses oneself in the object of attention.

It’s not just artists who require such depth of attention. Nearly two decades ago, a doctor teaching medical students at Yale was frustrated at their inability to distinguish between types of skin lesions. Their gaze seemed restless and careless. He took his students to an art gallery and told them to look at a single picture for fifteen minutes, training the slow, sustained observation that diagnosis requires. The program is now used in dozens of US medical schools.

Some argue that losing the capacity for deep attention presages catastrophe. It is the building block of “intimacy, wisdom, and cultural progress,” argues Maggie Jackson in her book Distracted, in which she warns that “as our attentional skills are squandered, we are plunging into a culture of mistrust, skimming, and a dehumanizing merging between man and machine.” Significantly, her research began with a curiosity about why so many Americans were deeply dissatisfied with life. She argues that losing the capacity for deep attention makes it harder to make sense of experience and to find meaning—from which comes wonder and fulfillment. She fears a new “dark age” in which we forget what makes us truly happy.

Strikingly, the epicenter of this wave of anxiety over our attention is the US. All the authors I’ve cited are American. It’s been argued that this debate represents an existential crisis for America because it exposes the flawed nature of its greatest ideal, individual freedom. The commonly accepted notion is that to be free is to make choices, and no one can challenge that expression of autonomy. But if our choices are actually engineered by thousands of very clever, well-paid digital developers, are we free? The former Google employee Tristan Harris confessed in an article in 2016 that technology “gives people the illusion of free choice while architecting the menu so that [tech giants] win, no matter what you choose.”

Despite my children’s multitasking, I maintain that vital human capacities—depth of insight, emotional connection, and creativity—are at risk. I’m intrigued as to what the resistance might look like. There are stirrings of protest with the recent establishment of initiatives such as the Time Well Spent movement, founded by tech industry insiders who have become alarmed at the efforts invested in keeping people hooked. But collective action is elusive; the emphasis is repeatedly on the individual to develop the necessary self-regulation, but if that is precisely what is being eroded, we could be caught in a self-reinforcing loop.

One of the most interesting responses to our distraction epidemic is mindfulness. Its popularity is evidence that people are trying to find a way to protect and nourish their minds. Jon Kabat-Zinn, who pioneered the development of secular mindfulness, draws an analogy with jogging: just as keeping your body fit is now well understood, people will come to realize the importance of looking after their minds.

I’ve meditated regularly for twenty years, but curious as to how this is becoming mainstream, I went to an event in the heart of high-tech Shoreditch in London. In a hipster workspace with funky architecture, excellent coffee, and an impressive range of beards, a soft-spoken retired Oxford professor of psychology, Mark Williams, was talking about how multitasking has a switching cost in focus and concentration. Our unique human ability to remember the past and to think ahead brings a cost; we lose the present. To counter this, he advocated a daily practice of mindfulness: bringing attention back to the body—the physical sensations of the breath, the hands, the feet. Williams explained how fear and anxiety inhibit creativity. In time, the practice of mindfulness enables you to acknowledge fear calmly and even to investigate it with curiosity. You learn to place your attention in the moment, noticing details such as the sunlight or the taste of the coffee.

On a recent retreat, I was beside a river early one morning and a rower passed. I watched the boat slip by and enjoyed the beauty in a radically new way. The moment was sufficient; there was nothing I wanted to add or take away—no thought of how I wanted to do this every day, or how I wanted to learn to row, or how I wished I was in the boat. Nothing but the pleasure of witnessing it. The busy-ness of the mind had stilled. Mindfulness can be a remarkable bid to reclaim our attention and to claim real freedom, the freedom from our habitual reactivity that makes us easy prey for manipulation.

But I worry that the integrity of mindfulness is fragile, vulnerable both to commercialization by employers who see it as a form of mental performance enhancement and to consumer commodification, rather than contributing to the formation of ethical character. Mindfulness as a meditation practice originates in Buddhism, and without that tradition’s ethics, there is a high risk of it being hijacked and misrepresented.

Back in the Sixties, the countercultural psychologist Timothy Leary rebelled against the conformity of the new mass media age and called for, in Crawford’s words, an “attentional revolution.” Leary urged people to take control of the media they consumed as a crucial act of self-determination; pay attention to where you place your attention, he declared. The social critic Herbert Marcuse believed Leary was fighting the struggle for the ultimate form of freedom, which Marcuse defined as the ability “to live without anxiety.” These were radical prophets whose words have an uncanny resonance today. Distraction has become a commercial and political strategy, and it amounts to a form of emotional violence that cripples people, leaving them unable to gather their thoughts and overwhelmed by a sense of inadequacy. It’s a powerful form of oppression dressed up in the language of individual choice.

The stakes could hardly be higher, as William James knew a century ago: “The faculty of voluntarily bringing back a wandering attention, over and over again, is the very root of judgment, character, and will.” And what are we humans without these three?

MySpace Tom beat Facebook in the long run

Wouldn’t you rather be a rich nobody than whatever Mark Zuckerberg is?

By Jeremy Gordon

Source: The Outline

I abandoned my MySpace profile when, at the ripe age of 18, I decided it was just a little too juvenile — the glittering GIFs affixed to every page, the garish customized designs, the pressure of maintaining your top 8. By 2006, Facebook offered a cleaner social experience; by 2009, Twitter offered a more casual one. MySpace was a complete relic by this point, even though only a few years had passed since its launch.

Back in 2005, though, long before MySpace burned out, its founder, Tom Anderson — whose grinning face greeted every new user as their first “friend” — sold the site for $580 million to Rupert Murdoch’s News Corporation. While his site was becoming a punchline during the rise of Facebook, Twitter, Instagram, and the other social media networks we now use every day, Anderson disappeared entirely from the tech scene. Now, he travels the world, documenting his visits to exotic locations.

Contrast that with what’s currently happening to Facebook’s Mark Zuckerberg, who’s on day two of being grilled by a Senate committee for Facebook’s role in haphazardly collecting all of our personal data, and possibly swinging the 2016 presidential election toward Donald Trump. What was supposed to be a basic networking tool has now become one of the chief mediators of how people interact with each other and the world around them, and how information is absorbed and disseminated on the internet. It’s now apparent that Facebook and Zuckerberg didn’t really consider any of this when aggressively pursuing growth, and now we’re all screwed as we try to untangle the consequences.

MySpace Tom? His most recent Instagram post from seven days ago is a giveaway for a stay at an Iceland hotel. He doesn’t have to issue any terse statements about his company’s commitment to fostering a healthy society; he doesn’t have to sit on a booster seat for seven hours and take dipshit questions from a procession of Senate ghouls. He isn’t worth as much money as Zuckerberg, of course, but unless you’re an oil baron, $580 million is enough to tide you over for the length of your lifetime, and your children’s lifetime, and your children’s children’s lifetime, and so on. (Even after taxes!) And yes, yes, being that rich is good for nobody, but without getting into an argument about the perils of capitalism, we can agree that personally speaking, Anderson is having a much better go of things.

It puts MySpace’s failure to evolve in a new light, as perhaps the healthy thing is for a platform to die and for everyone to move on. Its aesthetic and form, back when everyone had emo bangs and listened to Hawthorne Heights, couldn’t change without altering the meaning of the site altogether, and by that time, everyone was gone. Had Facebook not gotten so good at inserting itself between human users, there’s no way it would’ve run into its current problems at such a scale. The suspicious CEO is not the one who cashes out; it’s the one who sticks around and creates a behemoth.

Zuckerberg could have sold off his share and avoided becoming literally one of the most disliked people in the present moment. I never thought we’d declare MySpace the winner over Facebook, but then again, I never thought a lot of things about the moment we’re in.

 

Challenges for Resolving Complex Conflicts

By Robert J. Burrowes

While conflict theories and resolution processes advanced dramatically during the second half of the 20th century, particularly thanks to the important work of several key scholars such as Professor Johan Galtung – see ‘Conflict Transformation by Peaceful Means (the Transcend Method)’ – significant gaps remain in the conflict literature on how to deal with particular conflict configurations. Notably, these include the following four.

First, existing conflict theory does not adequately explain, emphasize and teach how to respond in those circumstances in which parties cannot be brought to the table to deeply consider a conflict and the measures necessary to resolve it. This particularly applies in cases where one or more parties is violently defending (often using a combination of direct and structural violence) substantial interrelated (material and non-material) interests. The conflict between China and Tibet over the Chinese-occupied Tibetan plateau, the many conflicts between western corporations and indigenous peoples over exploitation of the natural environment, and the conflict between the global elite and ‘ordinary’ people over resource allocation in the global economy are obvious examples of a vast number of conflicts in this category. As one of the rare conflict theorists who addresses this question, Galtung notes that structural violence ‘is not only evil, it is obstinate and must be fought’, and his preferred strategy is nonviolent revolution. See The True Worlds: A Transnational Perspective p. 140. But how?

Second, existing conflict theory does not explain how to respond in those circumstances in which one or more parties to the conflict are insane. The conflict between Israel and Palestine over Israeli-occupied Palestine classically illustrates this problem, particularly notable in the insanity of Israeli Prime Minister Benjamin Netanyahu, Defense Minister Avigdor Lieberman and Justice Minister Ayelet Shaked. But it is also readily illustrated by the insanity of the current political/military leadership in the USA and the insanity of the political, military and Buddhist leaders in Myanmar engaged in a genocidal assault on the Rohingya. For a brief discussion of the meaning and cause of this insanity see ‘The Global Elite is Insane Revisited’.

As an aside, there is little point deluding ourselves that insanity is not a problem or even ‘diplomatically’ not mentioning the insanity (if this is indeed the case) of certain parties in particular conflicts. The truth enables us to fully understand a conflict so that we can develop and implement a strategy to deal with all aspects of that truth. Any conflict strategy that fails to accurately identify and address all key aspects of the conflict, including the insanity of any of the parties, will virtually certainly fail.

Third, and more fundamentally, existing conflict theory does not take adequate account of the critical role that several unconscious emotions play in driving conflict in virtually all contexts, often preventing its resolution. This particularly applies in the case of (but is not limited to) suppressed terror, self-hatred and anger which are often unconsciously projected as fear of, hatred for and anger at an opponent or even an innocent third-party (essentially because this individual/group feels ‘safe’ to the person who is projecting). See ‘The Psychology of Projection in Conflict’.

While any significant ongoing conflict would illustrate this point adequately, the incredibly complex and interrelated conflicts being conducted in the Middle East, the prevalent Islamophobia in some western countries, and the conflicts over governance and exploitation of resources in the Democratic Republic of Congo are superlative examples. Ignoring suppressed (and projected) emotions can stymie conflict resolution in any context, interpersonally and geopolitically, and it does so frequently.

Fourth, existing conflict theory pays little attention to the extinction-causing conflict being ongoingly generated by human over-consumption in the finite planetary biosphere (and currently resulting in 200 species extinctions daily) which is sometimes inadequately identified as a conflict caused by capitalism’s drive for unending economic growth in a finite environment.

So what can we do?

Well, to begin, in all four categories of cases mentioned above, I would use Gandhian nonviolent strategy to compel violent opponents to participate in a conflict transformation process such as Galtung’s. Why nonviolent and why Gandhian? Nonviolent because our intention is to process the conflict to achieve a higher level of need satisfaction for all parties, and violence against any or all participants is inconsistent with that intention. And Gandhian because only Gandhi’s version of nonviolence has this conflict intention built into it. See ‘Conception of Nonviolence’.

‘But isn’t this nonviolent strategy simply coercion by another name?’ you might ask. Well, according to the Norwegian philosopher, Professor Arne Naess, it is not. In his view, if a change of will follows the scrutiny of norms in the context of new information while one is ‘in a state of full mental and bodily powers’, this is an act of personal freedom under optimal conditions. Naess highlights this point with the following example: Suppose that one person carries another against their will into the streets where there is a riot and, as a result of what they see, the carried person changes some of their attitudes and opinions. Was the change coerced? According to Naess, while the person was coerced into seeing something that caused the change, the change itself was not coerced. The distinction is important, Naess argues, because satyagraha (Gandhian nonviolent struggle) is incompatible with changes of attitudes or opinions that are coerced. See Gandhi and Group Conflict: An Exploration of Satyagraha pp. 91-92.

To elaborate this point: Unlike other conceptions of nonviolence, Gandhi’s nonviolence is based on certain premises, including the importance of the truth, the sanctity and unity of all life, and the unity of means and end, so his strategy is always conducted within the framework of his desired political, social, economic and ecological vision for society as a whole and not limited to the purpose of any immediate campaign. It is for this reason that Gandhi’s approach to strategy is so important. He is always taking into account the ultimate end of all nonviolent struggle – a just, peaceful and ecologically sustainable society of self-realized human beings – not just the outcome of this campaign. He wants each campaign to contribute to the ultimate aim, not undermine vital elements of the long-term and overarching struggle to create a world without violence.

Consequently, given his conception of nonviolence, Gandhi’s intention is to reach a conflict outcome that recognizes the sanctity and unity of all life which, obviously, includes the lives (but also the physical and emotional well-being) of his opponents. His nonviolent strategy is designed to compel participation in a conflict process but not to impose his preferred outcome unilaterally. See Nonviolent Campaign Strategy and Nonviolent Defense/Liberation Strategy.

This can apply in the geopolitical context or in relation to ordinary individuals ‘merely’ participating in the violence of overconsumption. Using nonviolent strategy to campaign on the climate catastrophe or other environmental issues can include mobilizing individuals and communities to emulate Gandhi’s asceticism in a modest way by participating in the fifteen-year strategy outlined in The Flame Tree Project to Save Life on Earth which he inspired.

But even if we can use nonviolent strategy effectively to get the conflicting parties together, the reality is that suppressed and projected emotions – particularly fear, self-hatred and anger as mentioned above – or even outright insanity on the part of one or more parties may still make efforts to effectively transform the conflict impossible. So for conflict resolution to occur, we need individuals who are willing and able to participate with at least minimal goodwill in designing a superior conflict outcome beneficial to everyone concerned.

Hence, I would do one more thing in connection with this process. Prior to, and then also in parallel with, the ‘formal’ conflict process, I would provide opportunities for all individuals engaged in the process (or otherwise critical to it because of their ‘background’ role, perhaps as a leader not personally present at the formal conflict process) to explore in a private setting with a skilled ‘nisteler’ (who is outside the conflict process), the unconscious emotions that are driving their particular approach to the conflict. See ‘Nisteling: The Art of Deep Listening’. The purpose of this nisteling is to allow each participant in the conflict process to bring a higher level of self-awareness to it. See ‘Human Intelligence or Human Awareness?’

I am not going to pretend that this would necessarily be possible, quick, easy or even work in every context. Insane individuals are obviously the last to know they have a psychological problem and the least likely to participate in a process designed to uncover and remove the roots of their insanity. However, those who are trapped in a dysfunctional psychological state short of insanity may be willing to avail themselves of the opportunity. In time, the value of this aspect of the conflict resolution process should become apparent, particularly because delusions and projections are exposed by the person themself (as an outcome of the expertise of the person nisteling).

Obviously, I am emphasizing the psychological aspects of the conflict process because my own considerable experience as a nonviolent activist together with my research convinces me that understanding violence requires an understanding of the psychology that drives it. If you are interested, you can read about the psychology of violence, including the 23 psychological characteristics in the emotional profile of archetype perpetrators of violence, in the documents Why Violence? and Fearless Psychology and Fearful Psychology: Principles and Practice.

Ideally, I would like to see the concept of nistelers operating prior to, and then parallel with, focused attention on the conflict itself normalized as an inherent part of the conflict resolution process. Clearly, we need teams of people equipped to perform this service, a challenge in itself in the short-term.

If, however, conflicting parties cannot be convinced to participate in this process with reasonable goodwill, we can always revert to using nonviolent strategy to compel them to do so. And, if all attempts to conduct a reasonable conflict process fail (particularly in a circumstance in which insanity is the cause of this failure), to impose a nonviolent solution which nevertheless takes account of the insane party’s legitimate needs. (Yes, on just that one detail, I diverge from Gandhi.)

Having stated that, however, I acknowledge that only a rare individual has the capacity to think, plan and act strategically in tackling a violent conflict nonviolently, so considerable education in nonviolent strategy will be necessary and is a priority.

Given what is at stake, however – a superior strategy for tackling and resolving violent geopolitical conflicts including those (such as the threat of nuclear war, the climate catastrophe and decimation of the biosphere) that threaten human extinction – any resources devoted to improving our capacity to deliver this outcome would be well spent.

Provided, of course, that reducing (and ultimately eliminating) violence and resolving conflict is your aim.

In addition to the above, I would do something else more generally (that is, outside the conflict process).

Given that dysfunctional parenting is ultimately responsible for the behaviour of those individuals who generate and perpetuate violent conflicts, I would encourage all parents to consider making ‘My Promise to Children’ so that we start to produce a higher proportion of functional individuals who know how to powerfully resolve conflicts in their lives without resort to violence. If any parent feels unable to make this promise, then they have the option of tackling this problem at its source by ‘Putting Feelings First’.

If we do not dramatically and quickly improve our individual and collective capacity to resolve conflicts nonviolently, including when we are dealing with individuals who are insane, then one day relatively soon we will share the fate of those 200 species of life we drove to extinction today.


Biodata: Robert J. Burrowes has a lifetime commitment to understanding and ending human violence. He has done extensive research since 1966 in an effort to understand why human beings are violent and has been a nonviolent activist since 1981. He is the author of Why Violence? His email address is flametree@riseup.net and his website is here.

Robert J. Burrowes
P.O. Box 68
Daylesford, Victoria 3460
Australia

Email: flametree@riseup.net

Websites:
Nonviolence Charter
Flame Tree Project to Save Life on Earth
‘Why Violence?’
Feelings First
Nonviolent Campaign Strategy
Nonviolent Defense/Liberation Strategy
Anita: Songs of Nonviolence
Robert Burrowes
Global Nonviolence Network