Saturday Matinee: The Occupation of the American Mind

From OccupationMovie.org:

Israel’s ongoing military occupation of Palestinian territory and its repeated invasions of the Gaza Strip have triggered a fierce backlash against Israeli policies virtually everywhere in the world — except the United States. The Occupation of the American Mind takes an eye-opening look at this critical exception, zeroing in on pro-Israel public relations efforts within the U.S. Narrated by Roger Waters and featuring leading observers of the Israeli–Palestinian conflict, the film explores how the Israeli government, the U.S. government, and the pro-Israel lobby have joined forces, often with very different motives, to shape American media coverage of the conflict in Israel’s favor. The Occupation of the American Mind provides a sweeping analysis of Israel’s decades-long battle for the hearts, minds, and tax dollars of the American people — a battle that has only intensified over the past few years in the face of widening international condemnation of Israel’s increasingly right-wing policies. (Running time: 85 mins)

If you are a member of a major library system, you can watch the full film at Kanopy.

Still Waiting for Evidence of a Russian Hack

More than two years after the allegation of Russian hacking of the 2016 U.S. presidential election was first made, conclusive proof is still lacking and may never be produced, says Ray McGovern.

By Ray McGovern

Source: Consortium News

If you are wondering why so little is heard these days of accusations that Russia hacked into the U.S. election in 2016, it could be because those charges could not withstand close scrutiny. It could also be because special counsel Robert Mueller appears never to have investigated what was once the central alleged crime in Russia-gate: no one associated with WikiLeaks has ever been questioned by his team.

Veteran Intelligence Professionals for Sanity — including two “alumni” who were former National Security Agency technical directors — have long since concluded that Julian Assange did not acquire what he called the “emails related to Hillary Clinton” via a “hack” by the Russians or anyone else. They found, rather, that he got them from someone with physical access to Democratic National Committee computers who copied the material onto an external storage device — probably a thumb drive. In December 2016 VIPS explained this in some detail in an open Memorandum to President Barack Obama.

On January 18, 2017, President Obama admitted that the “conclusions” of U.S. intelligence regarding how the alleged Russian hacking got to WikiLeaks were “inconclusive.” Even the vapid FBI/CIA/NSA “Intelligence Community Assessment of Russian Activities and Intentions in Recent U.S. Elections” of January 6, 2017, which tried to blame Russian President Vladimir Putin for election interference, contained no direct evidence of Russian involvement. That did not prevent the “handpicked” authors of that poor excuse for intelligence analysis from expressing “high confidence” that Russian intelligence “relayed material it acquired from the Democratic National Committee … to WikiLeaks.” Handpicked analysts, of course, say what they are handpicked to say.

Never mind. The FBI/CIA/NSA “assessment” became bible truth for partisans like Rep. Adam Schiff (D-CA), ranking member of the House Intelligence Committee, who was among the first off the blocks to blame Russia for interfering to help Trump. It simply could not have been that Hillary Clinton was quite capable of snatching defeat from the jaws of victory all by herself. No, it had to have been the Russians.

Five days into the Trump presidency, I had a chance to challenge Schiff personally on the gaping disconnect between the Russians and WikiLeaks. Schiff still “can’t share the evidence” with me … or with anyone else, because it does not exist.

WikiLeaks

It was on June 12, 2016, just six weeks before the Democratic National Convention, that Assange announced the pending publication of “emails related to Hillary Clinton,” throwing the Clinton campaign into panic mode, since the emails would document strong bias in favor of Clinton and successful attempts to sabotage the campaign of Bernie Sanders.  When the emails were published on July 22, just three days before the convention began, the campaign decided to create what I call a Magnificent Diversion, drawing attention away from the substance of the emails by blaming Russia for their release.

Clinton’s PR chief Jennifer Palmieri later admitted that she golf-carted around to various media outlets at the convention with instructions “to get the press to focus on something even we found difficult to process: the prospect that Russia had not only hacked and stolen emails from the DNC, but that it had done so to help Donald Trump and hurt Hillary Clinton.” The diversion worked like a charm. Mainstream media kept shouting “The Russians did it,” and gave little, if any, play to the DNC skullduggery revealed in the emails themselves. And like Br’er Fox, Bernie didn’t say nothin’.

Meanwhile, highly sophisticated technical experts were hard at work fabricating “forensic facts” to “prove” the Russians did it. Here’s how it played out:

June 12, 2016: Assange announces that WikiLeaks is about to publish “emails related to Hillary Clinton.”

June 14, 2016: DNC contractor CrowdStrike (which has a dubious professional record and multiple conflicts of interest) announces that malware has been found on the DNC server and claims there is evidence it was injected by Russians.

June 15, 2016: “Guccifer 2.0” affirms the DNC statement; claims responsibility for the “hack”; claims to be a WikiLeaks source; and posts a document that the forensics show was synthetically tainted with “Russian fingerprints.”

The June 12, 14, and 15 timing was hardly a coincidence. Rather, it was the start of a pre-emptive move to associate Russia with anything WikiLeaks might have been about to publish and to “show” that it came from a Russian hack.

Enter Independent Investigators

A year ago independent cyber-investigators completed the kind of forensic work that, for reasons best known to then-FBI Director James Comey, neither he nor the “handpicked analysts” who wrote the Jan. 6, 2017 assessment bothered to do. The independent investigators found verifiable evidence in the metadata of an alleged Russian hack of July 5, 2016, showing that the “hack” of the DNC by Guccifer 2.0 that day was not a hack, by Russia or anyone else.

Rather, it originated with a copy (onto an external storage device – a thumb drive, for example) by an insider — the same process used by the DNC insider/leaker before June 12, 2016 for an altogether different purpose. (Once the metadata was found and the “fluid dynamics” principle of physics applied, disproving the claim that Russia was responsible was not difficult.)
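The arithmetic behind that metadata analysis is worth making concrete. The Forensicator’s widely cited finding was an effective transfer rate of roughly 23 megabytes per second, computed from file sizes and timestamps in the July 5, 2016 files: too fast, he argued, for a 2016-era transatlantic internet exfiltration, but routine for a local copy to a USB storage device. Below is a minimal sketch of that style of calculation; the file records and the speed threshold are illustrative stand-ins, not the actual forensic data.

```python
from datetime import datetime

# Illustrative (size_in_bytes, last_modified) records of the kind recovered
# from archive metadata; these values are invented for the sketch.
files = [
    (180_000_000, datetime(2016, 7, 5, 18, 39, 2)),
    (220_000_000, datetime(2016, 7, 5, 18, 39, 14)),
    (150_000_000, datetime(2016, 7, 5, 18, 39, 26)),
]

total_bytes = sum(size for size, _ in files)
elapsed_s = (max(t for _, t in files) - min(t for _, t in files)).total_seconds()
rate_mb_s = total_bytes / elapsed_s / 1e6

print(f"effective transfer rate: {rate_mb_s:.1f} MB/s")  # ~22.9 MB/s here

# Assumed reference point, order of magnitude only: sustained transatlantic
# pulls in 2016 rarely exceeded a few MB/s, while a local USB 2.0 copy
# typically runs in the 20-30 MB/s range.
LOCAL_COPY_THRESHOLD_MB_S = 15
if rate_mb_s > LOCAL_COPY_THRESHOLD_MB_S:
    print("more consistent with a local copy than a remote hack")
```

The calculation itself is mechanical; the real dispute is over whether the metadata is genuine, which is where the fabricated “forensic facts” above and the Vault 7 tools discussed below come in.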

One of these independent investigators, writing under the name The Forensicator, published new evidence on May 31 that the Guccifer 2.0 persona uploaded a document from the West Coast of the United States, and not from Russia.

In our July 24, 2017 Memorandum to President Donald Trump we stated, “We do not know who or what the murky Guccifer 2.0 is. You may wish to ask the FBI.”

Our July 24 Memorandum continued: “Mr. President, the disclosure described below may be related. Even if it is not, it is something we think you should be made aware of in this general connection. On March 7, 2017, WikiLeaks began to publish a trove of original CIA documents that WikiLeaks labeled ‘Vault 7.’ WikiLeaks said it got the trove from a current or former CIA contractor and described it as comparable in scale and significance to the information Edward Snowden gave to reporters in 2013.

“No one has challenged the authenticity of the original documents of Vault 7, which disclosed a vast array of cyber warfare tools developed, probably with help from NSA, by CIA’s Engineering Development Group. That Group was part of the sprawling CIA Directorate of Digital Innovation – a growth industry established by John Brennan in 2015. [VIPS warned President Obama of some of the dangers of that basic CIA reorganization at the time.]

Marbled

“Scarcely imaginable digital tools – that can take control of your car and make it race over 100 mph, for example, or can enable remote spying through a TV – were described and duly reported in the New York Times and other media throughout March. But the Vault 7, part 3 release on March 31 that exposed the ‘Marble Framework’ program apparently was judged too delicate to qualify as ‘news fit to print’ and was kept out of the Times at the time, and has never been mentioned since.

“The Washington Post’s Ellen Nakashima, it seems, ‘did not get the memo’ in time. Her March 31 article bore the catchy (and accurate) headline: ‘WikiLeaks’ latest release of CIA cyber-tools could blow the cover on agency hacking operations.’

“The WikiLeaks release indicated that Marble was designed for flexible and easy-to-use ‘obfuscation,’ and that Marble source code includes a ‘de-obfuscator’ to reverse CIA text obfuscation.

“More important, the CIA reportedly used Marble during 2016. In her Washington Post report, Nakashima left that out, but did include another significant point made by WikiLeaks; namely, that the obfuscation tool could be used to conduct a ‘forensic attribution double game’ or false-flag operation because it included test samples in Chinese, Russian, Korean, Arabic and Farsi.”
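For readers unfamiliar with the term, “obfuscation” here simply means masking the telltale strings in malware source code so investigators cannot read them, while a paired “de-obfuscator” lets the tool’s owners reverse the masking. The toy sketch below illustrates only the general concept; it is not drawn from the Marble Framework, which WikiLeaks described as a source-code obfuscation framework for CIA malware, and the domain and decoy strings are invented.

```python
KEY = 0x5A  # single-byte XOR mask; real obfuscators use far stronger schemes

def obfuscate(text: str) -> bytes:
    # Mask a string so it never appears in cleartext inside the binary.
    return bytes(b ^ KEY for b in text.encode("utf-8"))

def deobfuscate(blob: bytes) -> str:
    # The paired "de-obfuscator": applying the same XOR restores the original.
    return bytes(b ^ KEY for b in blob).decode("utf-8")

# Telltale English-language indicators get hidden...
hidden = obfuscate("c2.example-server.net")
assert deobfuscate(hidden) == "c2.example-server.net"

# ...while decoy strings in another language could be left readable as bait,
# nudging an analyst toward a false attribution (the "forensic attribution
# double game" described above). The Russian text is an invented example.
decoy = "Ошибка соединения"  # Russian for "connection error"
print(decoy, hidden.hex())
```

This is why test samples in Chinese, Russian, Korean, Arabic and Farsi matter: whoever controls the obfuscation layer also controls which linguistic fingerprints an investigator finds.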

A few weeks later William Binney, a former NSA technical director, and I commented on Vault 7 Marble, and were able to get a shortened op-ed version published in The Baltimore Sun.

The CIA’s reaction to the WikiLeaks disclosure of the Marble Framework tool was neuralgic. Then-Director Mike Pompeo lashed out two weeks later, calling Assange and his associates “demons,” and insisting: “It’s time to call out WikiLeaks for what it really is, a non-state hostile intelligence service, often abetted by state actors like Russia.”

Our July 24 Memorandum continued: “Mr. President, we do not know if CIA’s Marble Framework, or tools like it, played some kind of role in the campaign to blame Russia for hacking the DNC. Nor do we know how candid the denizens of CIA’s Digital Innovation Directorate have been with you and with Director Pompeo. These are areas that might profit from early White House review.” [President Trump then directed Pompeo to invite Binney, one of the authors of the July 24, 2017 VIPS Memorandum to the President, to discuss all this. Binney and Pompeo spent an hour together at CIA Headquarters on October 24, 2017, during which Binney briefed Pompeo with his customary straightforwardness.]

“We also do not know if you have discussed cyber issues in any detail with President Putin. In his interview with NBC’s Megyn Kelly he seemed quite willing – perhaps even eager – to address issues related to the kind of cyber tools revealed in the Vault 7 disclosures, if only to indicate he has been briefed on them. Putin pointed out that today’s technology enables hacking to be ‘masked and camouflaged to an extent that no one can understand the origin’ [of the hack] … ‘And, vice versa, it is possible to set up any entity or any individual that everyone will think that they are the exact source of that attack.’

“‘Hackers may be anywhere,’ he said. ‘There may be hackers, by the way, in the United States who very craftily and professionally passed the buck to Russia. Can’t you imagine such a scenario? … I can.’

New attention has been drawn to these issues after I discussed them in a widely published 16-minute interview last Friday.

In view of the highly politicized environment surrounding these issues, I believe I must append here the same notice that VIPS felt compelled to add to our key Memorandum of July 24, 2017:

“Full Disclosure: Over recent decades the ethos of our intelligence profession has eroded in the public mind to the point that agenda-free analysis is deemed well nigh impossible. Thus, we add this disclaimer, which applies to everything we in VIPS say and do: We have no political agenda; our sole purpose is to spread truth around and, when necessary, hold to account our former intelligence colleagues.

“We speak and write without fear or favor. Consequently, any resemblance between what we say and what presidents, politicians and pundits say is purely coincidental.” The fact that we find it necessary to include that reminder speaks volumes about these highly politicized times.

Identity Theft and the Body’s Disappearance

By Robert Bohm

Source: The Hampton Institute

“What sphinx of cement and aluminum bashed open their skulls and ate up their brains and imagination?”

– Allen Ginsberg, from his poem “Howl”

Identity theft, at least the most familiar type, is possible because today the individual exists not merely as flesh and blood, but as flesh and blood spliced with bank account numbers, user names, passwords, credit card chips, etc. These added parts aren’t secondary to the individual’s overall identity, they’re central to it. Sometimes they’re all there is of it, as in many banking and purchasing transactions. In such instances, the data we’ve supplied to the relevant institutions doesn’t merely represent us, it is us. Our bodies alone can’t complete transactions without the account numbers, user names, passwords, credit card numbers, and ID cards which have become our identity’s essence. Without them, in many ways, we don’t exist.

In a worst case scenario, if someone gets hold of this private data, they can become us by possessing the data that is us. Following this, who or what we are is no longer a question. We don’t exist, except in the form of a stolen dataset now under someone else’s control.

In such a case, an unknown proxy has eliminated us and become who we once were.

Although problematic, the above form of identity theft is relatively minor. A worse form is one we all know about, yet chronically underestimate because we think of ourselves as too canny to be conned. Nonetheless, this other form of identity theft frames and limits everything we do. In the process, it fleeces us of the fullness of our identities and subjects our lives to a type of remote control. This remote control consists of the combined influence on us, from childhood onward, of society’s major institutions and dominant activities, which seed us with a variety of parameters for how to acceptably navigate society and its particular challenges.

This process is usually called “socialization.” However, it’s better seen as a sorting procedure in which society sifts us through a citizenship sieve in order to eliminate supposed defects, thereby guaranteeing that, despite each of us possessing unique characteristics, we share an underlying uniformity. Ultimately, this process is a kind of identity eugenics which strives to purify the population by eliminating or weakening troublesome qualities – e.g., an overly questioning attitude, chronic boundary-testing, a confrontational stance toward authority, a fierce protectiveness toward whatever space the body inhabits, etc. Such traits are frowned upon because they’re seen by the status quo as a likely threat to society’s stability.

Such indoctrination is much subtler yet, in many ways, more pervasive than outright propaganda. Its theater of operations is everywhere, taking place on many fronts. Public and private education, advertising, mass culture, government institutions, the prevailing ideas of how to correct socioeconomic wrongs (this is a “good” form of protest, this a “bad” one), the methods by which various slangs are robbed of their transgressive nature through absorption into the mainstream, the social production of substitute behaviors for nonconformity and rebellion – each of these phenomena and others play a role in generating the so-called “acceptable citizen,” a trimmed-down version of her or his original personality, one with reduced potential.

Make no mistake about it, this trimming of the personality is a form of identity theft. It is, in fact, the ultimate form. Take as an example the African slave in the U.S.: abducted from her or his homeland, forbidden from learning to read or write, denied legal standing in the courts, given no say over whether offspring would be sold to another owner or remain with them. The slave was robbed of her/his most essential identity, their status as a human being.

In his book The Souls of Black Folk, W.E.B. Du Bois described this theft in terms of how slavery reduces the slave to a person with “no true self-consciousness” – that is, with no stable knowledge of self, no clear sense of who she or he is in terms of culture, preceding generations, rituals for bringing to fruition one’s potential to create her or his own fate. As Du Bois correctly argued, this left the slave, and afterwards the freed Black, with a “longing to attain self-conscious manhood,” to know who she or he was, to see oneself through one’s own eyes and not through the eyes of one’s denigrators – e.g., white supremacists, confederate diehards, “good” people who nonetheless regarded Blacks as “lesser,” etc. Du Bois understood that from such people’s perspectives, Blacks possessed only one identity: the identity of being owned, of possessing no value other than what its owner could extract from them. Without an owner to extract this value, the slave was either identity-less or possessed an identity so slimmed and emaciated as to be a nothing.

The point here isn’t that today socialization enslaves the population in the same way as U.S. slavery once enslaved Blacks, but rather that identity theft is, psychologically and culturally speaking, a key aspect of disempowering people and has been for centuries. Today, because of mass culture and new technologies, the methods of accomplishing it are far more sophisticated than during other eras.

How disempowerment/identity theft occurs in contemporary society is inseparable from capitalism’s current state of development. We long ago passed the moment (after the introduction of assembly line production in the early 20th century) when modern advertising started its trek toward becoming one of the most powerful socialization forces in the U.S. As such, it convinces us not only to purchase individual products but, even more importantly, sells us on the idea that buying in general, and all the time, no matter what we purchase, is proof of our value as persons.

To accomplish this end, modern advertising was molded by its creators into a type of PSYOP designed for destabilizing individuals’ adherence to old saws like “a penny saved is a penny earned” and “without frugality none can be rich, and with it very few would be poor.” Once this happened, the United States’ days of puritan buying restraint were over. However, modern advertising was never solely about undermining personal fiscal restraint. It was also about manipulating feelings of personal failure – e.g., dissatisfaction with lifestyle and income, a sense of being trapped, fear of being physically unappealing, etc. – and turning them not into motives for self-scrutiny or social critiques, but into a spur for commodity obsession. This wasn’t simply about owning the product or products, but an obsessive hope that buying one or more commodities would trigger relief from momentary or long-term anxiety and frustration related to one’s life-woes: job, marriage, lack of money, illness, etc.

Helen Woodward, a leading advertising copywriter of the early decades of the 20th century, described how this was done in her book Through Many Windows, published in 1926. One example she used focused on women as consumers:

The restless desire for a change in fashions is a healthy outlet. It is normal to want something different, something new, even if many women spend too much time and too much money that way. Change is the most beneficent medicine in the world to most people. And to those who cannot change their whole lives or occupations, even a new line in a dress is often a relief. The woman who is tired of her husband or her home or a job feels some lifting of the weight of life from seeing a straight line change into a bouffant, or a gray pass into a beige. Most people do not have the courage or understanding to make deeper changes.

Woodward’s statement reveals not only the advertising industry’s PSYOP characteristic of manipulating people’s frustrations in order to lure them into making purchases, but also the industry’s view of the people to whom it speaks through its ads. As indicated by Woodward’s words, this view is one of condescension, of viewing most consumers as unable to bring about real socioeconomic change because they lack the abilities – “the courage or understanding” – necessary to do so. Consequently, their main purpose in life, it is implied, is to exist as a consumer mass constantly gorging on capitalism’s products in order to keep the system running smoothly. In doing this, Woodward writes, buyers find in the act of making purchases “a healthy outlet” for troubled emotions spawned in other parts of their lives.

Such advertising philosophies in the early 20th century opened a door for the industry, one that would never again be closed. Through that door (or window), one could glimpse the future: a world with an ever greater supply of commodities to sell and an advertising industry ready to make sure people bought them. To guarantee this, advertisers set about creating additional techniques for reshaping public consciousness into one persuaded that owning as many of those commodities as possible was an existential exercise of defining who an individual was.

In his book The Consumer Society, philosopher Jean Baudrillard deals with precisely this process. He writes that such a society is driven by:

the contradiction between a virtually unlimited productivity and the need to dispose of the product. It becomes vital for the system at this stage to control not only the mechanism of production, but also consumer demand.

“To control … consumer demand.” This is the key phrase here. Capitalist forces not only wanted to own and control the means of production in factories; they also wanted to control consumers in such a way that they had no choice but to buy, then buy more. In other words, capitalism was in quest of a strategy engineered to make us synch our minds to a capitalism operating in overdrive (“virtually unlimited” production).

The way this occurs, Baudrillard argues, is by capitalism transforming (through advertising) the process of buying an individual product from merely being a response to a “this looks good” or “that would be useful around the house” attitude to something more in line with what psychologists call “ego integration.” It refers to that part of human development in which an individual’s various personality characteristics (viewpoints, goals, physical desires, etc.) are organized into a balanced whole. At that point, what advertising basically did for capitalism was develop a reconfigured ego integration process in which the personality is reorganized to view its stability as dependent on its life as a consumer.

Advertisers pulled this off because the commodity, in an age of commodity profusion, isn’t simply a commodity but is also an indicator or sign referring to a particular set of values or behavior, i.e. a particular type of person. It is this which is purchased: the meaning, or constellation of meanings, which the commodity indicates.

In this way, the commodity, once bought, becomes a signal to others that “I, the owner, am this type of person.” Buy an Old Hickory J143 baseball bat and those in the know grasp that you’re headed for the pros. Sling on some Pandora bling and all the guys’ eyes are on you as you hip-swing into the Groove Lounge. Even the NY Times is hip to what’s up. If you want to be a true Antifa activist, the newspaper informed its readers on Nov. 29, 2017, this is the attire you must wear:

Black work or military boots, pants, balaclavas or ski masks, gloves and jackets, North Face brand or otherwise. Gas masks, goggles and shields may be added as accessories, but the basics have stayed the same since the look’s inception.

After you dress up, it’s not even necessary to attend a protest and fight fascists to be full-blown Antifa. You’re a walking billboard (or signification) proclaiming your values everywhere. Dress the part and you are the part.

Let’s return to Baudrillard, though. In The System of Objects, another of his books, he writes about how the issue of signification, and the method by which individuals purchase particular commodities in order to refine their identity for public consumption, becomes the universal mass experience:

To become an object of consumption, an object must first become a sign. That is to say: it must become external, in a sense, to a relationship that it now merely signifies … Only in this context can it be ‘personalized’, can it become part of a series, and so on; only thus can it be consumed, never in its materiality, but in its difference.

This “difference” is what the product signifies. That is, the product isn’t just a product anymore. It isn’t only its function. It has transitioned into an indicator of a unique personality trait, or of being a member of a certain lifestyle grouping or social class, or of subscribing to a particular political persuasion, Republican, anarchist, whatever. In this way, choosing the commodities to purchase is essential to one’s self-construction, one’s effort to make sure the world knows exactly who they are.

The individual produced by this citizen-forming process is a reduced one, the weight of her/his full personality pared down by cutting away potentials and inclinations perceived as “not a good fit” for a citizen at this stage of capitalism. Such a citizen, however, isn’t an automaton. She or he makes choices, indulges her or his unique appetites, even periodically rebels against bureaucratic inefficiency or a social inequity perceived to be particularly stupid or unfair. Yet after a few days or few months of this activity, this momentary rebel fades back into the woodwork, satisfied by their sincere but token challenge to the mainstream. The woodwork into which they fade is, of course, their home or another favorite location (a lover’s apartment, a bar, a ski resort cabin, a pool hall, etc.).

From this point on, or at least for the foreseeable future, such a person isn’t inclined to look at the world with a sharp political eye, except possibly within the confines of their private life. In this way, they turn whatever criticism of the mainstream they may have into a petty gripe, voiced with no intention of joining with others to fight for any specific change regarding the political, socioeconomic or cultural phenomenon against which the complaint has been lodged. Instead, all the complainer wants is congratulations from her or his listener(s) about how passionate, on-target, and right the complaint was.

This is the sieve process, identity eugenics, in action. Far more subtle and elastic than previous methods of social control, it narrows what we believe to be our options and successfully maneuvers us into a world where advertising shapes us more than schools do. In this mode, it teaches us that life’s choices aren’t so much about justice or morality, but more about what choosing between commodities is like: which is more useful to me in my private life, which one better defines me as a person, which one makes me look cooler, chicer, brainier, hunkier, more activist to those I know.

It is in this context that a young, new, “acceptable” citizen enters society as a walking irony. Raised to be a cog in a machine in a time of capitalistic excess, the individual arrives on the scene as a player of no consequence in a game in which she or he has been deluded that they’re the game’s star. But far from being a star, this person, weakened beyond repair by the surrender of too much potential, is so without ability that she or he has no impact whatsoever on the game. Consequently, this individual is, for all practical purposes, an absence. The ultimate invisible person, a nothing in the midst of players who don’t take note of this absence at all. And why should they? The full-of-potential individual who eventually morphed into this absence is long gone, remembered by no one, except as a fading image of what once was.

This process of reducing a potentially creative person into a virtual non-presence is a form of ideological anorexia. Once afflicted, an individual refuses nourishment until they’re nothing but skin and bones. However, the “weight” they’ve lost doesn’t consist of actual pounds. Instead, it involves a loss of the psychological heftiness and mental bulk necessary to be a full human being.

One can’t lose more weight than that.

Human life as we once knew it is gone, replaced by the ritual of endless purchasing. This is existence in what used to be called “the belly of the beast.” Our role in life has become to nourish capitalism by being at its disposal, by giving of ourselves. Such giving frequently entails self-mutilation: the debt, credit card and otherwise, that bludgeons to death the dreams of many individuals and families.

This quasi-religious self-sacrifice replicates, in another form, the Dark Ages practice of fanatical monks and other flagellants who lashed themselves with whips made from copper wires, thereby ripping their flesh and bleeding until they descended into a state of religious hysteria. The more we give of ourselves in this way, the thinner and more weightless we become. Meanwhile, the god whom Allen Ginsberg called Moloch grows more obese day after day, its belly filled with:

Robot apartments! invisible suburbs! skeleton treasuries! blind capitals! demonic industries! spectral nations! invincible madhouses! granite cocks! monstrous bombs!…

Dreams! adorations! illuminations! religions! the whole boatload of sensitive bullshit!

What capitalism wants from us, of course, isn’t merely self-sacrifice, it’s surrender. Hunger for life is viewed negatively by the status quo because it nourishes the self, making it stronger and more alert and, therefore, better prepared to assert itself. The fact that such an empowered self is more there (possesses more of a presence) than its undersized counterpart makes the healthier self unacceptable to the powers that be. This is because there-ness is no longer an option in our national life. Only non-there-ness is. If you’re not a political anorexic, you’re on the wrong side.

Wherever we look, we see it. Invisibility, or at least as much of it as possible, is the individual’s goal. It’s the new real. Fashion reveals this as well as anything. It does so by disseminating an ideal of beauty that fetishizes the body’s anorexic wilting away. Not the body’s presence but its fade to disappearance is the source of its allure. The ultimate fashion model hovers fragilely on the brink of absence in order not to distract from the only thing which counts in capitalism: the commodity to be sold – e.g., the boutique bomber jacket, the shirt, the pantsuit, the earrings, the shawl, the stilettos, the iPhone, the Ferrari, and, possibly most of all, the political passivity intrinsic to spending our lives acquiring things in order to prove to others and to ourselves that we’ve discovered in these things something more useful than Socrates’ goal of knowing thyself or Emma Goldman’s warning, “The most unpardonable sin in society is independence of thought.”

What is true on the fashion runway is also true in politics. Just as the best model is one thin enough to fade into non-presence, so our democracy, supposedly ruled “by and for the people,” has thinned down so much that “the people” can’t even be seen (except as stage props), let alone get their hands on democracy except in token ways. No matter how often we the people are praised rhetorically by politicians, we aren’t allowed as a group to get in the way of the capitalist system’s freedom to do whatever it wants in order to sustain commodity worship and guarantee capital’s right to permanent rule. If the military-industrial complex needs another war in order to pump out more profits, then so be it. We have no say in the matter. The identity theft built into society’s structure makes sure of this. It’s stripped us of our “weight” – our creativity, our willingness to take political risks, our capacity to choose action over posturing. After this forced weight loss, what’s left of us is a mess. Too philosophically and psychologically anemic to successfully challenge our leaders’ decisions, we, for all practical purposes, disappear.

As a reward for our passivity, we’re permitted a certain range of freedom – as long as “a certain range” is defined as “varieties of buying” and doesn’t include behavior that might result in the population’s attainment of greater political power.

So it continues: the only good citizen is the absent citizen. Which is to say, a citizen who has dieted him or herself into a state of political anorexia – i.e., that level of mental weightlessness necessary for guaranteeing a person’s permanent self-exclusion from the machinery of power.

***

Our flesh no longer exists in the way it once did. A new evolutionary stage has arrived.

In this new stage, the flesh isn’t merely what it seems to be: flesh, pure and simple. Instead, it’s a hybrid. It’s what exists after the mind oversees its passage through the sieve of mass culture.

After this passage, the flesh becomes the poses it adopts from studying movies, rappers, punk rockers, fashionistas of all kinds, reality TV stars, football hunks, whomever. It’s also what it wears: skinny jeans or loose-fitting chinos, short skirt or spandex, Hawaiian shirt or muscle tank top, pierced bellybutton, dope hiking boots, burgundy eyeliner. Here we come, marching, strolling, demon-eyed, innocent as Johnny Appleseed. Everybody’s snapping pics with their phones, selfies and shots of others (friends, strangers, the maimed, the hilarious, the so-called idiotic). The flesh’s pictures are everywhere. In movie ads, cosmetic ads, suppository ads, Viagra ads. This is the wave of the already-here but still-coming future. The actual flesh’s replacement by televised, printed, digitalized and Photoshopped images of it produces the ultimate self-bifurcation.

Increasingly cut off from any unmediated life of its own, the flesh now exists mostly as a natural resource for those (including ourselves) who need it for a project: to photograph it, dress it up, pose it in a certain way, put it on a diet, commodify/objectify it in any style ranging from traditional commodification to the latest avant-garde objectification.

All these stylings/makeovers, although advertised as a form of liberation for the flesh (a “freeing” of your flesh so you can be what you want to be), are in fact not that. Instead, they are part of the process of distancing ourselves from the flesh by always doing something to it rather than simply being it.

When we are it, we feel what the flesh feels, the pain, the joy, the satisfaction, the terror, the disgust, the hints of hope, a sense of irreparable loss, whatever.

When we objectify it, it is a mannequin, emotionless, a thing that uses up a certain amount of space. As such we can do what we want with it: decorate it, pull it apart, vent our frustrations on it, starve it, practice surgical cuts on it, put it to whatever use we like. It isn’t a person. It is separate from our personhood and we own it.

In fact we own all the world’s flesh.

We live, after all, in the American Empire, and the Empire owns everything. As the Empire’s citizens, we own everything it owns. Except for one thing: ourselves.

***

The flesh is both here and not here. Increasingly, it is more an object that we do things to – e.g., bulk it up, change its hair color, mass-kill it from a hotel window on the 32nd floor, view it in a porno flick – than a presence in its own right (i.e., self-contained, a force to be reckoned with). In this sense, it is a growing absence, each day losing more of its self-determination and becoming more a thing lost than something that exists fully, on its own, in the here and now. Given this, the proper attitude to have toward the flesh is one of nostalgia.

Of course, the flesh hasn’t really disappeared. What has disappeared is what it once was, a meat-and-bones reality, a site of pleasure and injury. Now, however, it’s not so valuable in itself as it is in its role as a starting-off point for endless makeovers.

These makeover options are arrayed before the consumer everywhere: online, in big box stores, in niche markets and so on. Today, it is in these places, not at birth, that the flesh starts its trek toward maturation. It does this by offering itself up as a sacrifice to be used as they see fit by the fashion industry, the gym industry, the addiction-cure industry, the diet industry, the pharmaceutical industry, the education industry, etc. Each body in the nation reaches its fullest potential only when it becomes a testing site to be used by these industries as they explore more and better ways to establish themselves as indispensable to capitalism’s endless reproduction.

In the end, the flesh, the target of all this competition for its attention, has less of a life on its own than it does as the object of advertisers’ opinions about what can be done to improve it or to reconstruct it. Only to the extent that the flesh can transcend or reconstitute itself can it be said to be truly alive.

This last fact – about aliveness – represents the culmination of a process. This process pertains to the visualization and digitalization of everything and the consequent disappearance of everything behind a wall of signification.

A televised or computerized image, discussion, commentary, conjecture, etc., becomes the thing it meditates on, depicts or interprets. This happens by virtue of the fact that the thing itself (the real flesh behind the televised or computerized image, discussion, commentary, conjecture, etc.) has disappeared into the discussion or into the image of it presented on the computer or TV screen.

In the same way, an anorexic model (her/his flesh and blood presence) disappears into the fashions she or he displays for the public.

In each instance the thing (the flesh) now no longer exists except in other people’s meditations on it; it has become those other people’s meditations. The ultimate anorexic, it (the thing) has lost so much weight it’s no longer physically there except as an idea in someone else’s mind or in a series of binary codings inside computers.

This is the final victory of absence over there-ness, of the anorexic ideal over the idea of being fully human (i.e., “bulging with existence,” “fat with life”). The self has been successfully starved to the point of such a radical thinness that it can no longer stand up to a blade of grass, let alone make itself felt by the powers that be.

Convenient Tales About Riches Within Reach

By Sam Pizzigati

Source: OpEdNews.com

The world at large knew virtually nada about Sylvia Bloom for 96 years. Then she died in 2016. Now, just a little too late, Sylvia Bloom is getting her belated — yet richly deserved — 15 minutes of worldwide fame.

The New York Times has just published a heart-warming story of the caring, upright life Sylvia Bloom lived, and the remarkable — and hidden — fortune she quietly accumulated over the course of her 67-year career as a Manhattan legal secretary.

That fortune totaled, in the end, over $9 million. The bulk of that wealth, the Times account reveals, is going — per Bloom’s wishes — to help students from poor families advance their educations.

None of Bloom’s surviving relatives or law firm colleagues or fellow volunteers at the Henry Street Settlement, the social services agency set to get $6.24 million from her bequest, had any idea that their unassuming loved one and friend had saved anything remotely close to multiple millions.

Counting Pennies

Bloom lived frugally all her life in Brooklyn and commuted, by subway, to her job. The “high life” never interested her in the least. She led a simple existence. She counted her pennies. In the end, she put them all to good use.

Stories like Bloom’s have been popping up regularly over recent years. Leonard Gigowski, a Wisconsin shopkeeper, died three years ago at age 90, and left behind a “secret $13 million fortune” that’s currently funding scholarships. Grace Groner passed away in 2010 at age 100. She spent most of her life in a one-bedroom Illinois home, shopped at thrift stores, and left $9 million for her alma mater.

Convenient Tales

Our popular culture can’t seem to get enough of these life-affirming tales of modest multi-millionaire seniors. These stories make us feel good. They also, unfortunately, reinforce a message that our society’s richest — and their cheerleaders — find enormously convenient.

You don’t have to be money-hungry, commit vile acts or have remarkable talents to become wealthy, the tellers of all these stories of hidden millions suggest. You just have to be frugal; almost anybody, in other words, can become rich.

And if you don’t happen to become rich, the media coverage of these stories not so subtly hints, just look in the mirror for the reason why. You, too, could have resisted temptation and counted your pennies.

You, too, could have built a huge personal fortune. Shame on you. You chose not to.

The Millionaire Next Door

A couple of decades ago, two academic researchers — Thomas Stanley and William Danko — made themselves not insignificant personal fortunes by wrapping up that same theme in reams of statistics. Their 1996 book, The Millionaire Next Door, has so far sold over 4 million copies.

That thrifty fellow down the block with a six-year-old Ford, The Millionaire Next Door related, could well be worth millions. And those millions, the book stressed, all begin with frugality.

Conservative pundits have always loved this basic frugality-pays thesis. Stanley and Danko, the argument goes, have served up the ultimate secret to getting rich. “Hardly any” of the self-made rich the pair profiled in The Millionaire Next Door, as one commentator noted a few years ago, “had expensive tastes.” Instead, these millionaires avoided “new homes and expensive clothes” and “often invested 15 to 20 percent of their net income.”

Any of us could follow that lead, this analyst would add, so long as we understand “that building wealth takes discipline, sacrifice, and hard work.”
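The frugality-pays arithmetic itself is ordinary compound growth: the future value of a fixed annual contribution C at return r over n years is FV = C((1 + r)^n - 1)/r. What that formula quietly demands is decades of uninterrupted investing. A quick sketch, using invented figures rather than anything from the book:

```python
def future_value(contribution: float, rate: float, years: int) -> float:
    # Future value of a fixed annual contribution compounding at `rate`:
    # FV = C * ((1 + rate)**years - 1) / rate
    return contribution * (((1 + rate) ** years - 1) / rate)

# Invented assumptions: $5,000 invested every year at a 7% average return.
for years in (30, 45, 60):
    print(f"{years} years: ${future_value(5_000, 0.07, years):,.0f}")
# 30 years:   ~$472,000
# 45 years: ~$1,429,000
# 60 years: ~$4,067,000
```

Sixty years at 7 percent yields roughly eight times what thirty years does, which is why the length of a career like Sylvia Bloom’s matters as much as the thrift.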

Reaping Rewards

But if “discipline, sacrifice, and hard work” build wealth, why do so many millions of disciplined, sacrificing, and hard-working Americans today have so little of it? Why is the “millionaire next door” — especially for our millennial generation — becoming a vanishing species?

Sylvia Bloom’s life offers some clues. Yes, Bloom lived frugally, sacrificed, and worked hard. But she also matured in a society — mid-20th century America — that endeavored to help disciplined, sacrificing, and hard-working people.

That help came in many different forms. Sylvia Bloom attended Hunter College, part of a system of free public higher education in New York City. She and her husband, a firefighter and later teacher, lived in a rent-controlled apartment. She commuted, for just a few dimes per day, on the world’s most extensive public transit system.

Sylvia Bloom’s young adult counterparts today? They confront a totally different reality. The sky-high costs of attending college have turned 21st-century young adults into life-long debtors. To find an affordable place to live, they squeeze into tiny apartments close to their jobs or plop themselves in distant exurbs, fighting traffic jams all the way to work — if not paying big bucks daily for scarce transit options.

Austerity Trumps Frugality

These millennials aren’t living the frugal life. They’re living the austere life — and not by choice. Our elected leaders have thrust this austerity upon them, with decades of public policies that have rewarded the rich with tax cuts at every turn and whittled away public services at every opportunity.

If Sylvia Bloom had been born a millennial, she’d be pinching pennies today to pay off her college debts. She’d be looking forward to years of hard work and sacrifice, with no hope of ever saving up enough to become a significant investor.

In her actual life, Sylvia Bloom had the good fortune to live her early adult years in a society much more caring than ours. She cared back — and chose to devote her own financial good fortune to giving others the same support that so helped her.

Sylvia Bloom’s life does indeed offer up inspiration. Let’s not let our rich turn that life into a rationalization for their riches.

How False Flag Operations Are Carried Out Today

By Philip M. Giraldi

Source: Intrepid Report

The false flag is a concept that goes back centuries. It was considered to be a legitimate ploy by the Greeks and Romans, where a military force would pretend to be friendly to get close to an enemy before dropping the pretense and raising its banners to reveal its own affiliation just before launching an attack. In the sea battles of the eighteenth century among Spain, France and Britain, hoisting an enemy flag instead of one’s own to confuse the opponent was considered to be a legitimate ruse de guerre, but it was only “honorable” if one reverted to one’s own flag before engaging in combat.

Today’s false flag operations are generally carried out by intelligence agencies and non-government actors including terrorist groups, but they are only considered successful if the true attribution of an action remains secret. There is nothing honorable about them as their intention is to blame an innocent party for something that it did not do. There has been a lot of such activity lately and it was interesting to learn by way of a leak that the Central Intelligence Agency (CIA) has developed a capability to mimic the Internet fingerprints of other foreign intelligence services. That means that when the media is trumpeting news reports that the Russians or Chinese hacked into U.S. government websites or the sites of major corporations, it could actually have been the CIA carrying out the intrusion and making it look like it originated in Moscow or Beijing. Given that capability, there has been considerable speculation in the alternative media that it was actually the CIA that interfered in the 2016 national elections in the United States.

False flags can be involved in other sorts of activity as well. The past year’s two major alleged chemical attacks carried out against Syrian civilians that resulted in President Donald Trump and associates launching 160 cruise missiles are pretty clearly false flag operations carried out by the rebels and terrorist groups that controlled the affected areas at the time. The most recent reported attack on April 7 might not have occurred at all, according to doctors and other witnesses who were actually in Douma. Because the rebels succeeded in convincing much of the world that the Syrian government had carried out the attacks, one might consider their false flag efforts to have been extremely successful.

The remedy against false flag operations such as the recent one in Syria is, of course, to avoid taking the bait and instead waiting until a thorough and objective inspection of the evidence has taken place. The United States, Britain and France did not do that, preferring instead to respond to hysterical press reports by “doing something.” If the U.N. investigation of the alleged attack turns up nothing, a distinct possibility, it is unlikely that they will apologize for having committed a war crime.

The other major false flag that has recently surfaced is the poisoning of Sergei Skripal and his daughter Yulia in Salisbury, England, on March 4. Russia had no credible motive to carry out the attack and had, in fact, good reasons not to do so. The allegations made by British Prime Minister Theresa May about the claimed nerve agent being “very likely” Russian in origin have been debunked, in part through examination by the U.K.’s own chemical weapons lab. May, under attack even within her own party, needed a good story and a powerful enemy to solidify her own hold on power so false flagging something to Russia probably appeared to be just the ticket as Moscow would hardly be able to deny the “facts” being invented in London. Unfortunately, May proved wrong and the debate ignited over her actions, which included the expulsion of twenty-three Russian diplomats, has done her severe damage. Few now believe that Russia actually carried out the poisoning and there is a growing body of opinion suggesting that it was actually a false flag executed by the British government or even by the CIA.

The lesson that should be learned from Syria and Skripal is that if “an incident” looks like it has no obvious motive behind it, there is a high probability that it is a false flag. A bit of caution in assigning blame is appropriate given that the alternative would be a precipitate and likely disproportionate response that could easily escalate into a shooting war.

Social Media Behemoths Sweep Alternative News into the Memory Hole

By Kurt Nimmo

Source: Another Day in the Empire

The squabbling between self-identified progressives and conservatives continues as social media transforms itself into a news, information, and opinion gatekeeper.

All information that contradicts the establishment narrative will either be downgraded into obscurity or excluded outright on social media.

Take for instance ThinkProgress, the Soros-financed news website, a project of the Center for American Progress Action Fund welded to the infrastructure of the Democrat party. On May 2, it complained that a bias study at Facebook will be run by conservatives, that is to say establishment Republicans, notably former Arizona Congress critter Jon Kyl.

ThinkProgress believes there is no such thing as bias aimed at conservatives—it’s the liberals who are routinely downgraded at Facebook while so-called conservatives are free to post what progressives characterize as an evil and poisonous ideology.

According to Libby Watson at Splinter News, conservatives are involved in “grift,” flimflamming poor Mark Zuckerberg with untrue claims of bias against the likes of Breitbart News.

It’s all part of a never-ending and hugely counterproductive “culture war” that has raged between the ostensible right and left for going on thirty years now. Ms. Watson manages to squeeze identity politics into her screed.

“The conservative movement has done a remarkable job over the last half century to bellow and bully its way into having its most ridiculous and reality-divorced concerns taken seriously,” she writes. “It lies about and distorts everything: about tax cuts, about Benghazi and her emails, about immigration, about healthcare, about Diamond and Silk. The further Facebook descends down the path of letting that screaming white face of faux outrage dictate how they run their platform, the harder it’s going to be for them to get away from them.”

The progressive news website Common Dreams complains it has weathered “significant drops in traffic since Google and Facebook began changing algorithms and talking openly about their new attempts to control the kind of news content users see. According to internal data and Google Analytics, traffic to Common Dreams from Google searches fell by 34 percent after the powerful search giant unveiled its new search protocol in April 2017.”

Meanwhile, on the other side of the yawning divide, Brent Bozell, founder of the Media Research Center, rallied around 60 conservatives and fired off an open letter to the social media giants demanding transparency, clarity on the definition of hate speech, equality for conservatives, and respect for the First Amendment.

“Social media censorship and online restriction of conservatives and their organizations have reached a crisis level,” the open letter states. “Facebook CEO Mark Zuckerberg’s hearings on Capitol Hill only served to draw attention to how widespread this problem has become. Conservative leaders now have banded together to call for equal treatment on tech and social media.”

Both liberals and conservatives are missing the point.

Facebook and Google will continue and enlarge the effort to gatekeep information that does not jibe with the establishment narrative, be it from the right or left.

The internet and web upended the establishment’s carefully constructed propaganda machine—the CIA’s “Mighty Wurlitzer” under its Operation Mockingbird beginning in the early 1950s—deeply embedded within corporate media.

Beginning with Friendster, MySpace, and like projects in the early 2000s and eventually morphing into the corporate behemoths Facebook, YouTube, and Twitter, social media platforms have extended the reach of alternative media, much to the displeasure of the establishment. Its preferred propaganda conduits have withered and this has seriously hampered its ability to control the narrative.

Both the right and left need to nurture their own social media platforms and drive traffic there.

Of course, this will not be as effective as plugging into the massive matrix of social connectivity provided by the corporate tech giants, but the alternative is to be marginalized and eventually swept into the memory hole as the context of “extremism” narrows and constricts expression, excluding all but the most token disagreement with the establishment narrative.

However, I’m not sure we’re up to it.

The elite has done a remarkable job of using the time-tested divide-and-conquer concept, endlessly pitting the so-called right against the amorphously defined left and vice versa. Liberals and conservatives continue to fight over frivolous ideological points as the funny-money asset-driven economy prepares to implode and the mission of infinity war expands to the point where it endangers life on planet Earth.

US Collapse – the Spectacle of Our Time

By Finnian Cunningham

Source: Axis of Logic

May you live in interesting times, goes the Chinese proverb. Few can doubt that we are indeed living in such an interesting time. Big changes are afoot in the world, it seems.

None more so than the collapsing of the American Empire.

The US is going through an historic “correction” in the same way that the Soviet Union did some 30 years ago when the latter was confronted with the reality of its unsustainable political and economic system. (That’s not meant to imply, however, that socialism is unviable, because arguably the Soviet Union had fatally strayed from its genuine socialist project into something more akin to unwieldy state capitalism.)

In any case, all empires come to an end eventually. History is littered with the debris of countless empires. Why should the American Empire be any different? It’s not. Only arrogant “American exceptionalism” deludes itself from the reality.

The notable thing is just how in denial the political class and the US news media are about the unfolding American crisis.

This is partly where the whole “Russiagate” narrative comes into play. Blaming Russia for allegedly destabilizing US politics and society is a cover for denial over the internal rot facing the US.

Some may scoff at the very idea of an “American Empire”. That’s something Europeans did, not us, goes the apologist for US power. The quick retort to that view is to point out that the US has over 1,000 military bases in more than 100 countries around the world. If that is not a manifestation of empire then what is?

For seven decades since the Second World War, “Pax Americana” was the grandiose name given to the US imperial design for the global order. The period was far from the peace the vainglorious name suggests. Dozens of wars, proxy conflicts and violent subversions were carried out by the US on every continent in order to maintain its empire. The so-called “global policeman” was more often a “global thug”.

That US empire is now teetering at the cusp of an emerging multipolar world order led by China, Russia and other rising powers.

When US leaders complain about China and Russia “reshaping the global order” to reflect their interests, what the American leaders are tacitly admitting is the coming end of Washington’s presumed hegemony.

Rather than accepting the fate of demise, the US is aggressively resisting by denigrating China and Russia’s power as somehow illegitimate. It’s the classic denial reaction of a sore loser.

So, what are the telltale signs that the US is indeed undergoing a seminal “correction” — or collapse?

The heyday of American capitalism is well past. The once awesome productive system is a skeleton of its former self. The rise of massive social poverty alongside obscene wealth among a tiny elite is a sure sign that the once mighty American economy is chronically moribund. The country’s soaring $20 trillion national debt is another symptom of that atrophy.

Recent self-congratulatory whooping by President Trump about “economic recovery” is like joy at the sight of a mirage. The roaring stock market is an elite phenomenon which can just as easily slump overnight.

What the champagne bubbles can’t disguise is the structural failure of US capitalism to reverse exploding inequality and endemic poverty across America. The national prowess of US capitalism has been superseded by a global capitalism in which American corporations, among others, scour the planet for cheap labor and tax havens. There is no going back to a supposed golden age, no matter how much Trump crows about “America First”.

The other side of the coin of this historic economic demise is a concomitant rise in US militarism, a way of compensating for the overall loss of power.

It is no coincidence that since the end of the Cold War and the dissolution of the Soviet Union, US military interventions around the world have erupted with increased frequency and duration. The US is in a veritable permanent state of war, actively deploying its forces simultaneously in several countries, particularly in the oil-rich Middle East.

Washington of course gives itself a fig leaf of cover by calling its surge in militarism a “war on terror” or “defending allies”. But, increasingly, US war conduct is seen for what it plainly is: violation of international law and of the sovereignty of nations in pursuit of American imperial interests.

In short, the US is patently lashing out as a rogue regime. There’s no disguising that fiendish fact.

In addition to waging wars, bombing countries, sponsoring terrorist proxies and assassinating enemies at will with drones, Washington is increasingly threatening others with military aggression. In recent months, North Korea and Iran have been openly threatened based on spurious claims. Russia and China have also been explicitly warned of American aggression in several strategic documents published by the Trump administration.

The grounds cited for American belligerence are spurious. As noted, the real motive is to compensate for the country’s inherent political, economic and social crises. That amounts to American leaders inciting conflicts and wars, which is in itself a grave violation of international law: a crime against peace, according to the Nuremberg principles.

The American Empire is failing and flailing. This is the spectacle of our time. The Western mainstream news media are either blind, ignorant or complicit in denying the historic collapse. Such media indulge the reckless fantasies of the US political class to distract from the potential internal implosion. Casting around for scapegoats to “explain” the deep inherent problems, the political class is using Russia and alleged Russian “interference” as a pretext.

World history has reached a foreboding crossroads with the collapse of the American Empire. Can we navigate a safe path forward, avoiding the catastrophic war that so often accompanies the demise of empires?

A lot, it seems, depends on ordinary American people becoming politically organized enough to challenge a dysfunctional system run by and for the elites. If the American people cannot hold their elites to account and break their corrupt rule, replacing it with something more equitable and democratic, then the world is in peril of being plunged into total war. We can only wish our American brothers and sisters solidarity and success.

Disarming the Weapons of Mass Distraction

By Madeleine Bunting

Source: Rise Up Times

“Are you paying attention?” The phrase still resonates with a particular sharpness in my mind. It takes me straight back to my boarding school, aged thirteen, when my eyes would drift out the window to the woods beyond the classroom. The voice was that of the math teacher, the very dedicated but dull Miss Ploughman, whose furrowed grimace I can still picture.

We’re taught early that attention is a currency—we “pay” attention—and much of the discipline of the classroom is aimed at marshaling the attention of children, with very mixed results. We all have a history here, of how we did or did not learn to pay attention and all the praise or blame that came with that. It used to be that such patterns of childhood experience faded into irrelevance. As we reached adulthood, how we paid attention, and to what, was a personal matter and akin to breathing—as if it were automatic.

Today, though, as we grapple with a pervasive new digital culture, attention has become an issue of pressing social concern. Technology provides us with new tools to grab people’s attention. These innovations are dismantling traditional boundaries of private and public, home and office, work and leisure. Emails and tweets can reach us almost anywhere, anytime. There are no cracks left in which the mind can idle, rest, and recuperate. A taxi ad offers free wifi so that you can remain “productive” on a cab journey.

Even those spare moments of time in our day—waiting for a bus, standing in a queue at the supermarket—can now be “harvested,” says the writer Tim Wu in his book The Attention Merchants. In this quest to pursue “those slivers of our unharvested awareness,” digital technology has provided consumer capitalism with its most powerful tools yet. And our attention fuels it. As Matthew Crawford notes in The World Beyond Your Head, “when some people treat the minds of other people as a resource, this is not ‘creating wealth,’ it is transferring it.”

There’s a whiff of panic around the subject: the story that our attention spans are now shorter than a goldfish’s attracted millions of readers on the web; it’s still frequently cited, despite its questionable veracity. Rates of diagnosis of attention deficit hyperactivity disorder in children have soared, creating an $11 billion global market for pharmaceutical companies. Every glance of our eyes is now tracked for commercial gain, as ever more ingenious ways are devised to capture our attention, if only momentarily. Our eyeballs are now described as capitalism’s most valuable real estate. Both our attention and its deficits have been turned into lucrative markets.

There is also a domestic economy of attention; within every family, some get it and some give it. We’re all born needing the attention of others—our parents’, especially—and from the outset, our social skills are honed to attract the attention we need for our care. Attention is woven into all forms of human encounter from the most brief and transitory to the most intimate. It also becomes deeply political: who pays attention to whom?

Social psychologists have researched how the powerful tend to tune out the less powerful. One study of college students found that even in five minutes of friendly chat, wealthier students displayed fewer signs of engagement when in conversation with their less wealthy counterparts: less eye contact, fewer nods, and more checking the time, doodling, and fidgeting. Discrimination by race and gender, too, plays out through attention. Anyone who has spent time in an organization will be aware of how attention sits at the heart of office politics. A suggestion is ignored in a meeting, then seized upon as a brilliant solution when repeated by another person.

What is political is also ethical. Matthew Crawford argues that a basic recognition of others is the essential characteristic of urban living.

And then there’s an even more fundamental dimension to the politics of attention. At a primary level, all interactions in public space require a very minimal form of attention: an awareness of the presence and movement of others. Without it, we would frequently bump into each other.

I had a vivid demonstration of this point on a recent commute: I live in East London and regularly use the narrow canal paths for cycling. It was the canal rush hour—lots of walkers with dogs, families with children, joggers as well as cyclists heading home. We were all sharing the towpath with the usual mixture of give and take, slowing to allow passing, swerving around and between each other. Only this time, a woman was walking down the center of the path with her eyes glued to her phone, impervious to all around her. This went well beyond a moment of distraction. Everyone had to duck and weave to avoid her. She’d abandoned the unspoken contract that avoiding collision is a mutual obligation.

This scene is now a daily occurrence for many of us, in shopping centers, station concourses, or on busy streets. Attention is the essential lubricant of urban life; without it, we deny our co-existence in that moment and place. The novelist and philosopher Iris Murdoch writes that the most basic requirement for being good is that a person “must know certain things about his surroundings, most obviously the existence of other people and their claims.”

Attention is what draws us out of ourselves to experience and engage with the world. The word is often accompanied by a verb: attention needs to be grabbed, captured, mobilized, attracted, or galvanized. Reflected in such language is an acknowledgement that attention is the essential precursor to action. The founding father of psychology, William James, provided what is still one of the best working definitions:

It is the taking possession by the mind, in clear and vivid form, of one out of what seem several simultaneously possible objects or trains of thought. Focalization, concentration, of consciousness are of its essence. It implies withdrawal from some things in order to deal effectively with others.

Attention is a limited resource and has to be allocated: to pay attention to one thing requires us to withdraw it from others. There are two well-known dimensions to attention, explains Willem Kuyken, a professor of psychology at Oxford. The first is “alerting”—an automatic form of attention, hardwired into our brains, that warns us of threats to our survival. Think of driving a car in a busy city: you’re aware of the movement of other cars, pedestrians, cyclists, and road signs, while advertising tries to grab any spare morsel of your attention. Notice how quickly you can swerve or brake when you spot a car suddenly emerging from a side street. There’s no time for a complicated cognitive process of decision making. This attention is beyond voluntary control.

The second form of attention is known as “executive”—the process by which our brain selects what to foreground and focus on, so that there can be other information in the background—such as music when you’re cooking—but one can still accomplish a complex task. Crucially, our capacity for executive attention is limited. Contrary to what some people claim, none of us can multitask complex activities effectively. The next time you write an email while talking on the phone, notice how many typing mistakes you make or how much you remember from the call. Executive attention can be trained, and needs to be for any complex activity. This was the point James made when he wrote: “there is no such thing as voluntary attention sustained for more than a few seconds at a time… what is called sustained voluntary attention is a repetition of successive efforts which bring back the topic to the mind.”

Attention is a complex interaction between memory and perception, in which we continually select what to notice, thus finding the material which correlates in some way with past experience. In this way, patterns develop in the mind. We are always making meaning from the overwhelming raw data. As James put it, “my experience is what I agree to attend to. Only those items which I notice shape my mind—without selective interest, experience is an utter chaos.”

And we are constantly engaged in organizing that chaos as we interpret our experience. This is clear in the famous Gorilla Experiment, in which viewers were told to watch a video of two teams of students passing a ball between them. They had to count the number of passes made by the team in white shirts and ignore those of the team in black shirts. The experiment is deceptively complex because it involves three forms of attention: first, scanning the whole group; second, ignoring the black T-shirt team to keep focus on the white T-shirt team (a form of inhibiting attention); and third, remembering to count. In the middle of the experiment, someone in a gorilla suit ambles through the group. Afterward, when asked, half the viewers said they hadn’t spotted the gorilla and could hardly believe it had been there. We can be blind not only to the obvious, but also to our own blindness.

There is another point in this experiment which is less often emphasized. Ignoring something—such as the black T-shirt team in this experiment—requires a form of attention. It costs us attention to ignore something. Many of us live and work in environments that require us to ignore a huge amount of information—that flashing advert, a bouncing icon or pop-up.

In another famous psychology experiment, Walter Mischel’s Marshmallow Test, four-year-olds were given a choice: eat one marshmallow immediately, or wait fifteen minutes and get two. While filmed, each child was left alone in a room in front of the plate with a marshmallow. They squirmed and fidgeted, poked the marshmallow and stared at the ceiling. A third of the children couldn’t resist the marshmallow and gobbled it up, a third nibbled cautiously, but the last third figured out how to distract themselves. They looked under the table, sang… did anything but look at the sweet. It’s a demonstration of the capacity to reallocate attention. In a follow-up study some years later, those who’d been able to wait for the second marshmallow had better life outcomes, such as academic achievement and health. One New Zealand study of 1,000 children found that this form of self-regulation was a more reliable predictor of future success and wellbeing than even a good IQ or a comfortable economic status.

What, then, are the implications of how digital technologies are transforming our patterns of attention? In the current political anxiety about social mobility and inequality, more weight needs to be put on this most crucial and basic skill: sustaining attention.

*

I learned to concentrate as a child. Being a bookworm helped. I’d be completely absorbed in my reading as the noise of my busy family swirled around me. It was good training for working in newsrooms; when I started as a journalist, they were very noisy places with the clatter of keyboards, telephones ringing and fascinating conversations on every side. What has proved much harder to block out is email and text messages.

The digital tech companies know a lot about this widespread habit; many of them have built a business model around it. They’ve drawn on the work of the psychologist B.F. Skinner who identified back in the Thirties how, in animal behavior, an action can be encouraged with a positive consequence and discouraged by a negative one. In one experiment, he gave a pigeon a food pellet whenever it pecked at a button and the result, as predicted, was that the pigeon kept pecking. Subsequent research established that the most effective way to keep the pigeon pecking was “variable-ratio reinforcement.” Give the pigeon a food pellet sometimes, and you have it well and truly hooked.

We’re just like the pigeon pecking at the button when we check our email or phone. It’s a humiliating thought. Variable reinforcement ensures that the customer keeps coming back. It’s the principle behind one of the most lucrative US industries: slot machines, which generate more profit than baseball, films, and theme parks combined. Gambling was once tightly restricted because of its addictive potential, but most of us now carry the attentional equivalent of a slot machine in our pocket; it sits beside our plate at mealtimes and by our pillow at night, and comes along to a meal out, a play at the theater, a film, or a tennis match. Almost nothing is now experienced uninterrupted.

Anxiety about the exponential rise of gadget addiction and how it fragments our attention is sometimes dismissed as a Luddite reaction to a technological revolution. But that misses the point. The problem is not the technology per se, but the commercial imperatives that drive it and that, unrestrained, colonize our attention, fundamentally changing our experience of time and space by saturating both with information.

In much public space, wherever your eye lands—from the back of the toilet door, to the handrail on the escalator, or the hotel key card—an ad is trying to grab your attention, and does so by triggering the oldest instincts of the human mind: fear, sex, and food. Public places become dominated by people trying to sell you something. In his tirade against this commercialization, Crawford cites advertisements on the backs of school report cards and on debit machines where you swipe your card. Before you enter your PIN, that gap of a few seconds is now used to show adverts. He describes silence and ad-free experience as “luxury goods” that only the wealthy can afford. Crawford has invented the concept of the “attentional commons,” free public spaces that allow us to choose where to place our attention. He draws the analogy with environmental goods that belong to all of us, such as clean air or clean water.

Some legal theorists are beginning to conceive of our own attention as a human right. One former Google employee warned that “there are a thousand people on the other side of the screen whose job it is to break down the self-regulation you have.” They use the insights into human behavior derived from social psychology—the need for approval, the need to reciprocate others’ gestures, the fear of missing out. Your attention ceases to be your own, pulled and pushed by algorithms. Attention is referred to as the real currency of the future.

*

In 2013, I embarked on a risky experiment in attention: I left my job. Over the previous two years, the trouble had crept up on me. I could no longer read beyond a few paragraphs. My eyes would glaze over and, even more disastrously for someone who had spent their career writing, I seemed unable to string together my thoughts, let alone write anything longer than a few sentences. When I try to explain the impact, I can only offer a metaphor: it felt like my imagination and use of language were vacuum-packed, like a slab of meat coated in plastic. I had lost the ability to turn ideas around, to see them from different perspectives. I could no longer draw connections between disparate ideas.

At the time, I was working in media strategy. It was a culture of back-to-back meetings from 8:30 AM to 6 PM, and there were plenty of advantages to be gained from continuing late into the evening if you had the stamina. Commitment was measured by emails with a pertinent weblink. Meetings were sometimes as brief as thirty minutes and frequently ran through lunch. Meanwhile, everyone was sneaking time to battle with the constant emails, eyes flickering to their phone screens in every conversation. The result was a kind of crazy fog, a mishmash of inconclusive discussions.

At first, it was exhilarating, like being on those crazy rides in a theme park. By the end, the effect was disastrous. I was almost continuously ill, battling migraines and unidentifiable viruses. When I finally made the drastic decision to leave, my income collapsed to a fraction of its previous level and my family’s lifestyle had to change accordingly. I had no idea what I was going to do; I had lost all faith in my ability to write. I told friends I would have to return the advance I’d received to write a book. I had to try to get back to the skills of reflection and focus that had once been ingrained in me.

The first step was to teach myself to read again. I sometimes went to a café, leaving my phone and computer behind. I had to slow down the racing incoherence of my mind so that it could settle on the text and its gradual development of an argument or narrative thread. The turning point in my recovery was a five-week research trip to the Scottish Outer Hebrides. On the journey north from Glasgow, my mobile phone lost its Internet connection. I had cut myself loose, with only the occasional text or call to family back home. Somewhere on the long Atlantic beaches of those wild and dramatic islands, I rediscovered my ability to write.

I attribute that in part to a stunning exhibition I came across in the small harbor town of Lochboisdale, on the island of South Uist. Vija Celmins is an acclaimed Latvian-American artist whose work is famous for its astonishing patience. She can take a year or more to make a woodcut that portrays in minute detail the surface of the sea. A postcard of her work now sits above my desk, a reminder of the power of slow thinking.

Just as we’ve had a slow-eating movement, we need a slow-thinking campaign. Its manifesto could be drawn from the German poet Rainer Maria Rilke’s beautiful “Letters to a Young Poet”:

To let every impression and the germ of every feeling come to completion inside, in the dark, in the unsayable, the unconscious, in what is unattainable to one’s own intellect, and to wait with deep humility and patience for the hour when a new clarity is delivered.

Many great thinkers attest that they have their best insights in moments of relaxation, the proverbial brainwave in the bath. We actually need what we most fear: boredom.

When I left my job (and I was lucky that I could), friends and colleagues were bewildered. Why give up a good job? But I felt that here was an experiment worth trying. Crawford frames it well as “intellectual biodiversity.” At a time of crisis, we need people thinking in different ways. If we all jump to the tune of Facebook or Instagram and allow ourselves to be primed by Twitter, the danger is that we lose the “trained powers of concentration” that allow us, in Crawford’s words, “to recognize that independence of thought and feeling is a fragile thing, and requires certain conditions.”

I also took to heart the insights of the historian Timothy Snyder, who concluded from his studies of twentieth-century European totalitarianism that the way to fend off tyranny is to read books, make an effort to separate yourself from the Internet, and “be kind to our language… Think up your own way of speaking.” Dropping out and going offline enabled me to get back to reading, voraciously, and to writing; beyond that, it’s too early to announce the results of my experiment with attention. As Rilke said, “These things cannot be measured by time, a year has no meaning, and ten years are nothing.”

*

A recent column in The New Yorker cheekily suggests that all the fuss about the impact of digital technologies on our attention is nothing more than writers’ worrying about their own working habits. Is all this anxiety about our fragmenting minds a moral panic akin to those that swept Victorian Britain about sexual behavior? Patterns of attention are changing, but perhaps it doesn’t much matter?

My teenage children read much less than I did. One son used to play chess online with a friend, text on his phone, and do his homework all at the same time. I was horrified, but he got a place at Oxford. At his interview, he met a third-year history undergraduate who told him he hadn’t yet read any books in his time at university. But my kids are considerably more knowledgeable about a vast range of subjects than I was at their age. There’s a small voice suggesting that the forms of attention I was brought up with could be a thing of the past; the sustained concentration required to read a whole book will become an obscure niche hobby.

And yet, I’m haunted by a reflection: the magnificent illuminations of the eighth-century Book of Kells have intricate patterning that no one has ever been able to copy, such is the fineness of the tight spirals. Lines sit a millimeter apart. They indicate a steadiness of hand and mind, a capability most of us have long since lost. Could we be trading depth of focus for breadth of reference? Some might argue that’s not a bad trade. But we would lose depth: the artist Paul Klee wrote that he would spend a day in silent contemplation of something before he painted it. Paul Cézanne was similarly known for his trancelike attention to his subject. Madame Cézanne recalled how her husband would gaze at the landscape and tell her, “The landscape thinks itself in me, and I am its consciousness.” The philosopher Maurice Merleau-Ponty describes a contemplative attention in which one steps outside of oneself and immerses oneself in the object of attention.

It’s not just artists who require such depth of attention. Nearly two decades ago, a doctor teaching medical students at Yale was frustrated at their inability to distinguish between types of skin lesions. Their gaze seemed restless and careless. He took his students to an art gallery and told them to look at a picture for fifteen minutes. The program is now used in dozens of US medical schools.

Some argue that losing the capacity for deep attention presages catastrophe. It is the building block of “intimacy, wisdom, and cultural progress,” argues Maggie Jackson in her book Distracted, in which she warns that “as our attentional skills are squandered, we are plunging into a culture of mistrust, skimming, and a dehumanizing merging between man and machine.” Significantly, her research began with a curiosity about why so many Americans were deeply dissatisfied with life. She argues that losing the capacity for deep attention makes it harder to make sense of experience and to find meaning—from which comes wonder and fulfillment. She fears a new “dark age” in which we forget what makes us truly happy.

Strikingly, the epicenter of this wave of anxiety over our attention is the US. All the authors I’ve cited are American. It’s been argued that this debate represents an existential crisis for America because it exposes the flawed nature of its greatest ideal, individual freedom. The commonly accepted notion is that to be free is to make choices, and no one can challenge that expression of autonomy. But if our choices are actually engineered by thousands of very clever, well-paid digital developers, are we free? The former Google employee Tristan Harris confessed in an article in 2016 that technology “gives people the illusion of free choice while architecting the menu so that [tech giants] win, no matter what you choose.”

Despite my children’s multitasking, I maintain that vital human capacities—depth of insight, emotional connection, and creativity—are at risk. I’m intrigued as to what the resistance might look like. There are stirrings of protest with the recent establishment of initiatives such as the Time Well Spent movement, founded by tech industry insiders who have become alarmed at the efforts invested in keeping people hooked. But collective action is elusive; the emphasis is repeatedly on the individual to develop the necessary self-regulation, but if that is precisely what is being eroded, we could be caught in a self-reinforcing loop.

One of the most interesting responses to our distraction epidemic is mindfulness. Its popularity is evidence that people are trying to find a way to protect and nourish their minds. Jon Kabat-Zinn, who pioneered the development of secular mindfulness, draws an analogy with jogging: just as keeping your body fit is now well understood, people will come to realize the importance of looking after their minds.

I’ve meditated regularly for twenty years but, curious as to how the practice is becoming mainstream, I went to an event in the heart of high-tech Shoreditch in London. In a hipster workspace with funky architecture, excellent coffee, and an impressive range of beards, a soft-spoken retired Oxford professor of psychology, Mark Williams, was talking about how multitasking carries a switching cost in focus and concentration. Our unique human ability to remember the past and to think ahead brings a cost: we lose the present. To counter this, he advocated a daily practice of mindfulness: bringing attention back to the body, to the physical sensations of the breath, the hands, the feet. Williams explained how fear and anxiety inhibit creativity. In time, the practice of mindfulness enables you to acknowledge fear calmly and even to investigate it with curiosity. You learn to place your attention in the moment, noticing details such as the sunlight or the taste of the coffee.

On a recent retreat, I was beside a river early one morning and a rower passed. I watched the boat slip by and enjoyed the beauty in a radically new way. The moment was sufficient; there was nothing I wanted to add or take away—no thought of how I wanted to do this every day, or how I wanted to learn to row, or how I wished I was in the boat. Nothing but the pleasure of witnessing it. The busy-ness of the mind had stilled. Mindfulness can be a remarkable bid to reclaim our attention and to claim real freedom, the freedom from our habitual reactivity that makes us easy prey for manipulation.

But I worry that the integrity of mindfulness is fragile, vulnerable both to commercialization by employers who see it as a form of mental performance enhancement and to consumer commodification, rather than contributing to the formation of ethical character. Mindfulness as a meditation practice originates in Buddhism, and without that tradition’s ethics, there is a high risk of it being hijacked and misrepresented.

Back in the Sixties, the countercultural psychologist Timothy Leary rebelled against the conformity of the new mass media age and called for, in Crawford’s words, an “attentional revolution.” Leary urged people to take control of the media they consumed as a crucial act of self-determination; pay attention to where you place your attention, he declared. The social critic Herbert Marcuse believed Leary was fighting the struggle for the ultimate form of freedom, which Marcuse defined as the ability “to live without anxiety.” These were radical prophets whose words have an uncanny resonance today. Distraction has become a commercial and political strategy, and it amounts to a form of emotional violence that cripples people, leaving them unable to gather their thoughts and overwhelmed by a sense of inadequacy. It’s a powerful form of oppression dressed up in the language of individual choice.

The stakes could hardly be higher, as William James knew a century ago: “The faculty of voluntarily bringing back a wandering attention, over and over again, is the very root of judgment, character, and will.” And what are we humans without these three?