Social Media Behemoths Sweep Alternative News into the Memory Hole

By Kurt Nimmo

Source: Another Day in the Empire

The squabbling between self-identified progressives and conservatives continues as social media transforms itself into a news, information, and opinion gatekeeper.

All information that contradicts the establishment narrative will either be downgraded into obscurity or excluded outright on social media.

Take for instance ThinkProgress, the Soros-financed news website, a project of the Center for American Progress Action Fund welded to the infrastructure of the Democrat party. On May 2, it complained that a bias study at Facebook will be run by conservatives, that is to say establishment Republicans, notably former Arizona Congress critter Jon Kyl.

ThinkProgress believes there is no such thing as bias aimed at conservatives—it’s the liberals who are routinely downgraded at Facebook while so-called conservatives are free to post what progressives characterize as an evil and poisonous ideology.

According to Libby Watson at Splinter News, conservatives are involved in “grift,” flimflamming poor Mark Zuckerberg with untrue claims of bias against the likes of Breitbart News.

It’s all part of a never-ending and hugely counterproductive “culture war” that has raged between the ostensible right and left for going on thirty years now. Ms. Watson manages to squeeze identity politics into her screed.

“The conservative movement has done a remarkable job over the last half century to bellow and bully its way into having its most ridiculous and reality-divorced concerns taken seriously,” she writes. “It lies about and distorts everything: about tax cuts, about Benghazi and her emails, about immigration, about healthcare, about Diamond and Silk. The further Facebook descends down the path of letting that screaming white face of faux outrage dictate how they run their platform, the harder it’s going to be for them to get away from them.”

The progressive news website Common Dreams complains it has weathered “significant drops in traffic since Google and Facebook began changing algorithms and talking openly about their new attempts to control the kind of news content users see. According to internal data and Google Analytics, traffic to Common Dreams from Google searches fell by 34 percent after the powerful search giant unveiled its new search protocol in April 2017.”

Meanwhile, on the other side of the yawning divide, Brent Bozell, founder of the Media Research Center, rallied around 60 conservatives and fired off an open letter to the social media giants demanding transparency, clarity on the definition of hate speech, equality for conservatives, and respect for the First Amendment.

“Social media censorship and online restriction of conservatives and their organizations have reached a crisis level,” the open letter states. “Facebook CEO Mark Zuckerberg’s hearings on Capitol Hill only served to draw attention to how widespread this problem has become. Conservative leaders now have banded together to call for equal treatment on tech and social media.”

Both liberals and conservatives are missing the point.

Facebook and Google will continue and enlarge the effort to gatekeep information that does not jibe with the establishment narrative, be it from the right or left.

The internet and web upended the establishment’s carefully constructed propaganda machine—the CIA’s “Mighty Wurlitzer” under its Operation Mockingbird beginning in the early 1950s—deeply embedded within corporate media.

Beginning with Friendster, MySpace, and like projects in the early 2000s and eventually morphing into the corporate behemoths Facebook, YouTube, and Twitter, social media platforms have extended the reach of alternative media, much to the displeasure of the establishment. Its preferred propaganda conduits have withered and this has seriously hampered its ability to control the narrative.

Both the right and left need to nurture their own social media platforms and drive traffic there.

Of course, this will not be as effective as plugging into the massive matrix of social connectivity provided by the corporate tech giants, but the alternative is to be marginalized and eventually swept into the memory hole as the elastic definition of “extremism” narrows the space for expression, excluding all but the most token disagreement with the establishment narrative.

However, I’m not sure we’re up to it.

The elite has done a remarkable job of using the time-tested divide-and-conquer strategy, endlessly pitting the so-called right against the amorphously defined left and vice versa. Liberals and conservatives continue to fight over frivolous ideological points as the funny-money, asset-driven economy prepares to implode and the mission of infinity war expands to the point where it endangers life on planet Earth.

Saturday Matinee: Horns and Halos

Review by Underground Film Journal

Horns and Halos, which opened the 9th New York Underground Film Festival, is a documentary by married filmmakers Suki Hawley and Michael Galinsky about the intrigue surrounding the publication of the controversial book Fortunate Son, a biography of George W. Bush. The book was originally published by St. Martin’s Press in 1999, pulled from bookstore shelves after controversy arose over a passage accusing Bush of being a convicted drug user, and then re-published by a little artsy boutique outfit in New York City called Soft Skull Press.

What makes Horns and Halos a successful documentary is that the filmmakers did an excellent job of remaining amazingly unbiased towards the subject matter. While watching the movie, I got the impression that Bush supporters would dismiss everyone involved in the book’s publication as a complete wacko and reject any criticisms made against the president in the film; and that anti-Bush activists would find the issues brought up by the book to be damning evidence against him. Personally, I think the truth lies, as the saying goes, somewhere in-between.

The author of Fortunate Son, J.H. Hatfield of Arkansas, despite appearing slightly off-kilter, seems like an intensely earnest man who just wanted to be taken seriously as an author. Prior to his infamous work, Hatfield wrote unauthorized biographies of Ewan McGregor and Patrick Stewart, as well as guides to TV shows like Star Trek, Lost in Space and The X-Files.

The main focus of the film, however, is Sander Hicks, the garrulous and determined CEO of Soft Skull Press and re-publisher of Fortunate Son. It makes sense that Horns and Halos would spotlight Sander over Hatfield, though, since the movie seems like a very low-budget affair and Hawley, Galinsky and Hicks are all NYC residents while Hatfield lived all the way in Arkansas.

But what’s interesting about all four participants — the filmmakers and their subjects — is that they all seem to be people who have stumbled onto a subject that’s bigger than themselves. There’s a lot of information presented in, and lurking around the fringes of, Horns and Halos that I really think would have been better served by someone with a bigger budget: Michael Moore, for example, who can afford a team of researchers, can travel freely around the country, and might even have the balls to charge the White House and demand interviews with Bush and Karl Rove, whose name figures prominently in Hatfield’s research but whom the filmmakers don’t explore in much detail.

But that doesn’t mean that this film shouldn’t be seen, or that Galinsky and Hawley’s approach isn’t successful. The real winner of the movie, even though he isn’t in the film as much as I would have liked, is Hatfield. I think the film, at the very least, redeems his character, which was so maligned in the public forum that the ordeal eventually led him to commit suicide in 2001.

It is true that Hatfield was an ex-felon. He served five years in prison after being convicted of conspiracy to murder in 1988. But he also may have been a victim of a greedy publisher who forced him to include in his book the unsubstantiated rumor that Bush was convicted of cocaine possession in 1972.

The drug charge story is a complicated one, and rather than recount it here, I’ll point to a good overview of it included in a new preface by Sander Hicks in Fortunate Son, which is also available to read on Soft Skull Press’s website. While the preface is interesting, it does make one or two slips, especially in not footnoting key passages, e.g. the statement, “[Bush] blurted out at a press conference that he hadn’t done drugs since 1974.” Little details like that can bug me and prevent me from agreeing with a story 100%. (Alas, since the writing of this review, Hicks’ preface is no longer available, but there is a new foreword by Mark Crispin Miller.)

The same goes for all of Horns and Halos. I do think having a little bit more of Hatfield in the flick would have made things a lot clearer, especially considering the scope of the subject. After the NYUFF screening of the film, Hicks and Galinsky did a brief Q&A session together (Hawley was absent as she had just given birth to a daughter), and I thought it really sad that Hatfield couldn’t be there to see the finished film and accept the applause from the audience that would have greeted him. I think he would have been the hit of the festival.

Most library system members can watch the full film on Kanopy.

US Collapse – the Spectacle of Our Time

By Finnian Cunningham

Source: Axis of Logic

May you live in interesting times, goes the Chinese proverb. Few can doubt that we are indeed living in such an interesting time. Big changes are afoot in the world, it seems.

None more so than the collapsing of the American Empire.

The US is going through an historic “correction” in the same way that the Soviet Union did some 30 years ago when the latter was confronted with the reality of its unsustainable political and economic system. (That’s not meant to imply, however, that socialism is unviable, because arguably the Soviet Union had fatally strayed from its genuine socialist project into something more akin to unwieldy state capitalism.)

In any case, all empires come to an end eventually. History is littered with the debris of countless empires. Why should the American Empire be any different? It’s not. Only arrogant “American exceptionalism” deludes itself from the reality.

The notable thing is just how in denial the political class and the US news media are about the unfolding American crisis.

This is partly where the whole “Russiagate” narrative comes into play. Blaming Russia for allegedly destabilizing US politics and society is a cover for denial over the internal rot facing the US.

Some may scoff at the very idea of an “American Empire”. That’s something Europeans did, not us, goes the apologist for US power. The quick retort to that view is to point out that the US has over 1,000 military bases in more than 100 countries around the world. If that is not a manifestation of empire then what is?

For seven decades since the Second World War, “Pax Americana” was the grandiose name given to the US imperial design for the global order. The period was hardly as peaceful as the vainglorious name suggests. Dozens of wars, proxy conflicts and violent subversions were carried out by the US on every continent in order to maintain its empire. The so-called “global policeman” was more often a “global thug”.

That US empire is now teetering at the cusp of an emerging multipolar world order led by China, Russia and other rising powers.

When US leaders complain about China and Russia “reshaping the global order” to reflect their interests, what the American leaders are tacitly admitting is the coming end of Washington’s presumed hegemony.

Rather than accepting the fate of demise, the US is aggressively resisting by denigrating China and Russia’s power as somehow illegitimate. It’s the classic denial reaction of a sore loser.

So, what are the telltale signs that the US is indeed undergoing a seminal “correction” — or collapse?

The heyday of American capitalism is well past. The once awesome productive system is a skeleton of its former self. The rise of massive social poverty alongside obscene wealth among a tiny elite is a sure sign that the once mighty American economy is chronically moribund. The country’s soaring $20 trillion national debt is another symptom of chronic atrophy.

Recent self-congratulatory whooping by President Trump about “economic recovery” is like the joy felt from looking at a mirage. The roaring stock market is an elite phenomenon which can just as easily slump overnight.
What the champagne bubbles can’t disguise is the structural failing of US capitalism to reverse exploding inequality and endemic poverty across America. The national prowess of US capitalism has been superseded by global capitalism where American corporations among others scour the planet for cheap labor and tax havens. There is no going back to a supposed golden age, no matter how much Trump crows about “America First”.

The other side of the coin from historic US economic demise is the concomitant rise in its militarism as a way to compensate for its overall loss of power.

It is no coincidence that since the end of the Cold War following the dissolution of the Soviet Union, US military interventions around the world have erupted with increased frequency and duration. The US is in a veritable permanent state of war, actively deploying its forces simultaneously in several countries, particularly in the oil-rich Middle East.

Washington of course gives itself a fig leaf cover by calling its surge in militarism a “war on terror” or “defending allies”. But, increasingly, US war conduct is seen for what it plainly is — violation of international law and the sovereignty of nations for the pursuit of American imperial interests.

In short, the US is patently lashing out as a rogue regime. There’s no disguising that fiendish fact.

In addition to waging wars, bombing countries, sponsoring terrorist proxies and assassinating enemies at will with drones, Washington is increasingly threatening others with military aggression. In recent months, North Korea and Iran have been openly threatened based on spurious claims. Russia and China have also been explicitly warned of American aggression in several strategic documents published by the Trump administration.

The stated grounds for American belligerence are baseless. As noted, the real motive is to do with compensating for its own inherent political, economic and social crises. That then amounts to American leaders inciting conflicts and wars, which is in itself a grave violation of international law — a crime against peace, according to Nuremberg principles.

The American Empire is failing and flailing. This is the spectacle of our time. The Western mainstream news media are either blind, ignorant or complicit in denying the historic collapse. Such media are indulging reckless fantasies of the US political class to distract from the potential internal implosion. Casting around for scapegoats to “explain” the deep inherent problems, the political class are using Russia and alleged Russian “interference” as a pretext.

World history has reached a foreboding crossroads due to the collapsing of the American Empire. Can we navigate a safe path forward, avoiding the catastrophic war that often accompanies the demise of empires?

A lot, it seems, depends on ordinary American people becoming politically organized to challenge their dysfunctional system run by and for the elites. If the American people cannot hold their elites to account and break their corrupt rule, overhauling it with something more equitable and democratic, then the world is in peril of being plunged into total war. We can but wish our American brothers and sisters solidarity and success.

Fake News Is Fake Amerika

The ‘Values,’ ‘Vision,’ and ‘Democracy’ of an Inauthentic Opposition

Average Americans, whose economic survival is threatened, have no political party to represent them, least of all the deceptive Democrats, who claim to be their champions and blame others when their deception fails, says Paul Street.

By Paul Street

Source: Consortium News

Never underestimate the capacity of the United States’ Inauthentic Opposition Party, the corporate Democrats, for self-congratulatory delusion and the externalization of blame.

Look, for example, at the Democratic National Committee’s (DNC) recently filed 66-page lawsuit against Russia, WikiLeaks, and the 2016 Donald Trump campaign. The document accuses Russia of “mount[ing] a brazen attack on the American democracy,” “destabiliz[ing] the U.S. political environment” on Trump’s (and Russia’s) behalf, and “interfering with our democracy….”

“The [RussiaGate] conspiracy,” the DNC Complaint says, “undermined and distorted the DNC’s ability to communicate the [Democratic] party’s values and vision to the American electorate” and “sowed discord within the Democratic Party at a time when party unity was essential…”

Yes, Russia, like numerous other nations living under the global shadow of the American Superpower, may well have tried to have some surreptitious say in the 2016 U.S. presidential election. (Why wouldn’t the Kremlin have done that, given the very real and grave threats Washington and its Western NATO allies have posed for many years to post-Soviet-era Russian security and peace in Eastern Europe?)

Still, charging Russia with interfering with US-“American democracy” is like me accusing the Washington Capitals’ star left winger Alex Ovechkin of interfering with my potential career as a National Hockey League player (I’m middle-aged and can’t skate backwards). The U.S. doesn’t have a functioning democracy to undermine, as numerous careful studies have shown.

We have, rather, a corporate and financial oligarchy, an open plutocracy. U.S.-Americans get to vote, yes, but the nation’s “unelected dictatorship of money” reigns nonetheless in the United States, where, as leading liberal political scientists Benjamin Page (Northwestern) and Martin Gilens (Princeton) find, “government policy…reflects the wishes of those with money, not the wishes of the millions of ordinary citizens who turn out every two years to choose among the preapproved, money-vetted candidates for federal office.”

Our Own Oligarchs

Russia and WikiLeaks “destabilized the U.S. political environment”? Gee, how about the 20 top oligarchic U.S. mega-donors, who invested more than $500 million combined in disclosed campaign contributions (we can only guess at how much “dark,” that is, undisclosed, money they gave) to candidates and political organizations in the 2016 election cycle? The 20 largest organizational donors also gave a total of more than $500 million. The foremost plutocratic election investors included hard right-wing billionaires like casino owner Sheldon Adelson ($83 million disclosed to Republicans and right-wing groups), hedge-fund manager Paul Singer ($26 million to Republicans and the right), hedge-fund manager Robert Mercer ($26 million) and packaging mogul Richard Uihlein ($24 million).

How about the multi-billionaire Trump’s own real estate fortune, which, combined with the remarkable free attention the corporate media oligopoly granted him, helped catapult the orange-tinted fake-populist beast past his more traditional Republican primary opponents? And what about the savagely unequal distribution of wealth and income in Barack Obama’s America, so extreme in the wake of the Great Recession that Hillary’s primary campaign rival Bernie Sanders could credibly report that the top tenth of the upper U.S. 1% possessed nearly as much wealth as the nation’s bottom 90%? Such extreme disparity helped doom establishment, Wall Street- and Goldman Sachs-embroiled candidates like Jeb Bush, Marco Rubio, and Mrs. Clinton in 2016. Russia and WikiLeaks did not create that deep socioeconomic imbalance, which politics and neoliberal policy generated.

Double Vision

And just what were the Democratic Party “values and vision” that Russia, Trump, and WikiLeaks supposedly prevented the DNC and the Clinton team from articulating in 2016? As the distinguished political scientist and money-politics expert Thomas Ferguson and his colleagues Paul Jorgensen and Jie Chen noted in an important study released three months ago, the Clinton campaign “emphasized candidate and personal issues and avoided policy discussions to a degree without precedent in any previous election for which measurements exist… it deliberately deemphasized issues in favor of concentrating on what the campaign regarded as [Donald] Trump’s obvious personal weaknesses as a candidate.” Strangely enough, the Twitter-addicted reality television star Trump had a lot more to say about policy than the former First Lady, U.S. Senator, and Secretary of State Hillary Clinton, a wonkish Yale Law graduate.

The Democrats’ “values and vision” in 2016 amounted pretty much to the accurate but hardly inspiring or mass-mobilizing notion that Donald Trump was an awful person who was unqualified for the White House. Clinton ran almost completely on candidate character and quality. This was a blunder of historic proportions, given Clinton’s own highly problematic character brand. Any campaign needs a reasonably strong policy platform to stand on in case of candidate difficulties.

By Ferguson, Jorgensen, and Chen’s account, Hillary’s peculiar policy silence was about U.S. oligarchs’ campaign money. Thanks to candidate Trump’s bizarre nature and his declared isolationism and nationalism, Clinton achieved remarkable campaign-finance success with normally Republican-affiliated capitalist sectors, ones less disposed than their more liberal counterparts to abide the standard, progressive-sounding policy rhetoric of Democratic Party candidates.

One ironic but “fateful consequence” of her curious connection to conservative business interests was her “strategic silence about most important matters of public policy. … Misgivings of major contributors who worried that the Clinton campaign message lacked real attractions for ordinary Americans were rebuffed. The campaign,” Ferguson, Jorgensen, and Chen wrote, “sought to capitalize on the angst within business by vigorously courting the doubtful and undecideds there, not in the electorate.”

Other Clinton mistakes included failing to purchase television ads in Michigan, failing to set foot in Wisconsin after the Democratic National Convention, and getting caught telling wealthy New York City campaign donors that Trump’s white supporters were “a basket of” racist, sexist, nativist, and homophobic “deplorables.” This last misstep was a Freudian slip of the neoliberal variety. It reflected and advanced the corporate Democrats’ longstanding alienation of and from the nation’s rural and industrial and ex-industrial “heartland.”

Fake Progressives

As left historian Nancy Fraser noted after Trump was elected, the Democrats, since at least the Bill Clinton administration, had joined outwardly progressive forces like feminism, antiracism, multiculturalism, and LGBTQ rights to “financial capitalism.” This imparted liberal “charisma” and “gloss” to “policies that …devastated…what were once middle-class lives” by wiping out manufacturing, weakening unions, slashing wages, and increasing the “precarity of work.”

To make matters worse, Fraser rightly added, the “progressive neoliberal” blue- and digital-zone Democrats “compounded” the “injury of deindustrialization” with “the insult of progressive moralism,” which rips red- and analog-zone whites as culturally retrograde (recall candidate Obama’s problematic 2008 reflection on how rural and small-town whites “cling to religion and guns”) and yet privileged by the simple color of their skin.

Such insults from elite, uber-professional-class neoliberals like Obama (Harvard Law) and the Clintons (Yale Law) would sting less in the nation’s “flyover zones” if those uttering them had not spent their sixteen years in the White House governing blatantly in accord with the wishes of Wall Street, Silicon Valley, and the leading multinational corporations. Like Bill Clinton’s two terms, the Obama years were richly consistent with Sheldon Wolin’s early 2008 description of the Democrats as an “inauthentic opposition” whose dutiful embrace of “centrist precepts” meant they would do nothing to “substantially revers[e] the drift rightwards” or “significantly alter the direction of society.”

The fake-“progressive” Obama presidency opened with the expansion of Washington’s epic bailout of the very parasitic financial elites who recklessly sparked the Great Recession (this with no remotely concomitant expansion of federal assistance to the majority middle- and working-class victims), the abandonment of campaign pledges to restore workers’ right to organize (through the immediately forgotten Employee Free Choice Act), and the kicking of Single Payer health care advocates to the curb as Obama worked with the big drug and insurance syndicates to craft a corporatist, profit-friendly health insurance reform. Obama’s second term ended with him doggedly (if unsuccessfully) championing the arch-authoritarian global-corporatist Trans Pacific Partnership.

This Goldman Sachs and Citigroup-directed policy record was no small part of what demobilized the Democrats’ mass electoral base in ways that “destabilized the U.S. political environment” to the benefit of the reactionary populist Trump, whose Mercer family-backed proto-fascistic strategist and Svengali Steve Bannon was smartly attuned to the Democrats’ elitist class problem.

There was a major 2016 presidential candidate who ran with genuinely progressive “values and vision” – Bernie Sanders. The most remarkable finding in Ferguson, Jorgensen, and Chen’s study is that the self-declared “democratic socialist” Sanders came tantalizingly close to winning the Democratic presidential nomination with no support from Big Business. The small-donor Sanders campaign was “without precedent in American politics not just since the New Deal, but across virtually the whole of American history … a major presidential candidate waging a strong, highly competitive campaign whose support from big business was essentially zero.”

Sanders was foiled by the big-money candidate Clinton’s advance control of the Democratic National Committee and convention delegates. Under a formal funding arrangement worked up with the DNC in late September of 2015, the campaign of the depressing “lying neoliberal warmonger” Hillary was granted advance control of all the DNC’s “strategic decisions.” The Democratic Party’s presidential caucuses and primaries were rigged against Sanders in ugly ways that provoked a different lawsuit last year – a class-action suit against the DNC on behalf of Sanders’ supporters. The complaint was dismissed by a federal judge who sided with the DNC’s lawyers, agreeing that the DNC was within its rights to violate its own charter and bylaws by selecting its candidate in advance of the primaries.

How was that for the noble “values and vision” that “American democracy” inspires atop the not-so-leftmost of the nation’s two major and electorally viable political parties?

Under Cover of Russia-gate

That’s what “sowed discord within the Democratic Party at a time when party unity was essential…” Russia didn’t do it. Neither did WikiLeaks or the Trump campaign. The Clinton campaign and the Democratic Party establishment – themselves funded by major U.S. oligarchs like San Francisco hedge-fund billionaire Tom Steyer – did that on their own.

Could Sanders – the most popular politician in the U.S. (something rarely reported in a “mainstream” corporate media that barely covered his giant campaign rallies even as it obsessed over Trump’s every bizarre Tweet) – have defeated the orange-tinted beast in the general election? Perhaps, though much of the oligarchic funding Hillary got would have gone to Trump if “socialist” Bernie had been the Democratic nominee. It is unlikely that Sanders could have accomplished much as president in a nation long controlled by the capitalist oligarchy in numerous ways that go far beyond campaign finance alone.

Meanwhile, under the cover of RussiaGate, the still-dismal and dollar-drenched corporate-imperial Democrats seem content to continue tilting to the center-right, purging Sanders-style progressives from the party’s leadership and citing the party’s special election victories (Doug Jones and Conor Lamb) against deeply flawed and Trump-backed Republicans in two bright-red voting districts (the state of Alabama and a fading Pennsylvania canton) as proof that tepid neoliberal centrism is still (even after Hillary’s stunning defeat) the way to go.

Along the way, the Inauthentic Opposition’s candidate roster for the upcoming Congressional mid-term election is loaded with an extraordinary number of contenders with U.S. military and intelligence backgrounds, consistent with Congressional Democrats’ repeated votes to give massive military and surveillance-state funds and power to a president they consider (accurately enough) unbalanced and dangerous.

The trick, the neoliberal “CIA Democrats” think, is to run conservative, Wall Street-backed imperial and National Security State veterans who pretend (see Eric Draitser’s recent piece on “How Clintonites Are Manufacturing Faux Progressive Congressional Campaigns”) to be aligned with majority-progressive left-of-center policy sentiments and values. It’s still very much their party.

Whatever happens during the next biennial electoral extravaganza, “the crucial fact” remains, in Wolin’s words nine years ago, “that for the poor, minorities, the working class and anti-corporatists there is no opposition party working on their behalf” in the United States – the self-declared homeland and headquarters of global democracy.

 

Paul Street is an independent radical-democratic policy researcher, journalist, historian, author and speaker based in Iowa City, Iowa, and Chicago, Illinois. He is the author of seven books. His latest is They Rule: The 1% v. Democracy (Paradigm, 2014).

Disarming the Weapons of Mass Distraction

By Madeleine Bunting

Source: Rise Up Times

“Are you paying attention?” The phrase still resonates with a particular sharpness in my mind. It takes me straight back to my boarding school, aged thirteen, when my eyes would drift out the window to the woods beyond the classroom. The voice was that of the math teacher, the very dedicated but dull Miss Ploughman, whose furrowed grimace I can still picture.

We’re taught early that attention is a currency—we “pay” attention—and much of the discipline of the classroom is aimed at marshaling the attention of children, with very mixed results. We all have a history here, of how we did or did not learn to pay attention and all the praise or blame that came with that. It used to be that such patterns of childhood experience faded into irrelevance. As we reached adulthood, how we paid attention, and to what, was a personal matter and akin to breathing—as if it were automatic.

Today, though, as we grapple with a pervasive new digital culture, attention has become an issue of pressing social concern. Technology provides us with new tools to grab people’s attention. These innovations are dismantling traditional boundaries of private and public, home and office, work and leisure. Emails and tweets can reach us almost anywhere, anytime. There are no cracks left in which the mind can idle, rest, and recuperate. A taxi ad offers free wifi so that you can remain “productive” on a cab journey.

Even those spare moments of time in our day—waiting for a bus, standing in a queue at the supermarket—can now be “harvested,” says the writer Tim Wu in his book The Attention Merchants. In this quest to pursue “those slivers of our unharvested awareness,” digital technology has provided consumer capitalism with its most powerful tools yet. And our attention fuels it. As Matthew Crawford notes in The World Beyond Your Head, “when some people treat the minds of other people as a resource, this is not ‘creating wealth,’ it is transferring it.”

There’s a whiff of panic around the subject: the story that our attention spans are now shorter than a goldfish’s attracted millions of readers on the web; it’s still frequently cited, despite its questionable veracity. Rates of diagnosis of attention deficit hyperactivity disorder in children have soared, creating an $11 billion global market for pharmaceutical companies. Every glance of our eyes is now tracked for commercial gain as ever more ingenious ways are devised to capture our attention, if only momentarily. Our eyeballs are now described as capitalism’s most valuable real estate. Both our attention and its deficits are turned into lucrative markets.

There is also a domestic economy of attention; within every family, some get it and some give it. We’re all born needing the attention of others—our parents’, especially—and from the outset, our social skills are honed to attract the attention we need for our care. Attention is woven into all forms of human encounter from the most brief and transitory to the most intimate. It also becomes deeply political: who pays attention to whom?

Social psychologists have researched how the powerful tend to tune out the less powerful. One study with college students showed that even in five minutes of friendly chat, wealthier students showed fewer signs of engagement when in conversation with their less wealthy counterparts: less eye contact, fewer nods, and more checking the time, doodling, and fidgeting. Discrimination by race and gender, too, plays out through attention. Anyone who’s spent any time in an organization will be aware of how attention is at the heart of office politics. A suggestion is ignored in a meeting, but is then seized upon as a brilliant solution when repeated by another person.

What is political is also ethical. Matthew Crawford argues that this is the essential characteristic of urban living: a basic recognition of others.

And then there’s an even more fundamental dimension to the politics of attention. At a primary level, all interactions in public space require a very minimal form of attention, an awareness of the presence and movement of others. Without it, we would bump into each other, frequently.

I had a vivid demonstration of this point on a recent commute: I live in East London and regularly use the narrow canal paths for cycling. It was the canal rush hour—lots of walkers with dogs, families with children, joggers as well as cyclists heading home. We were all sharing the towpath with the usual mixture of give and take, slowing to allow passing, swerving around and between each other. Only this time, a woman was walking down the center of the path with her eyes glued to her phone, impervious to all around her. This went well beyond a moment of distraction. Everyone had to duck and weave to avoid her. She’d abandoned the unspoken contract that avoiding collision is a mutual obligation.

This scene is now a daily occurrence for many of us, in shopping centers, station concourses, or on busy streets. Attention is the essential lubricant of urban life, and without it, we’re denying our co-existence in that moment and place. The novelist and philosopher Iris Murdoch writes that the most basic requirement for being good is that a person “must know certain things about his surroundings, most obviously the existence of other people and their claims.”

Attention is what draws us out of ourselves to experience and engage in the world. The word is often accompanied by a verb—attention needs to be grabbed, captured, mobilized, attracted, or galvanized. Reflected in such language is an acknowledgement of how attention is the essential precursor to action. The founding father of psychology William James provided what is still one of the best working definitions:

It is the taking possession by the mind, in clear and vivid form, of one out of what seem several simultaneously possible objects or trains of thought. Focalization, concentration, of consciousness are of its essence. It implies withdrawal from some things in order to deal effectively with others.

Attention is a limited resource and has to be allocated: to pay attention to one thing requires us to withdraw it from others. There are two well-known dimensions to attention, explains Willem Kuyken, a professor of psychology at Oxford. The first is “alerting”— an automatic form of attention, hardwired into our brains, that warns us of threats to our survival. Think of when you’re driving a car in a busy city: you’re aware of the movement of other cars, pedestrians, cyclists, and road signs, while advertising tries to grab any spare morsel of your attention. Notice how quickly you can swerve or brake when you spot a car suddenly emerging from a side street. There’s no time for a complicated cognitive process of decision making. This attention is beyond voluntary control.

The second form of attention is known as “executive”—the process by which our brain selects what to foreground and focus on, so that there can be other information in the background—such as music when you’re cooking—but one can still accomplish a complex task. Crucially, our capacity for executive attention is limited. Contrary to what some people claim, none of us can multitask complex activities effectively. The next time you write an email while talking on the phone, notice how many typing mistakes you make or how much you remember from the call. Executive attention can be trained, and needs to be for any complex activity. This was the point James made when he wrote: “there is no such thing as voluntary attention sustained for more than a few seconds at a time… what is called sustained voluntary attention is a repetition of successive efforts which bring back the topic to the mind.”

Attention is a complex interaction between memory and perception, in which we continually select what to notice, thus finding the material which correlates in some way with past experience. In this way, patterns develop in the mind. We are always making meaning from the overwhelming raw data. As James put it, “my experience is what I agree to attend to. Only those items which I notice shape my mind—without selective interest, experience is an utter chaos.”

And we are constantly engaged in organizing that chaos, as we interpret our experience. This is clear in the famous Gorilla Experiment in which viewers were told to watch a video of two teams of students passing a ball between them. They had to count the number of passes made by the team in white shirts and ignore those of the team in black shirts. The experiment is deceptively complex because it involves three forms of attention: first, scanning the whole group; second, ignoring the black T-shirt team to keep focus on the white T-shirt team (a form of inhibiting attention); and third, remembering to count. In the middle of the experiment, someone in a gorilla suit ambles through the group. When asked afterward, half the viewers hadn’t spotted the gorilla and couldn’t even believe it had been there. We can be blind not only to the obvious, but to our blindness.

There is another point in this experiment which is less often emphasized. Ignoring something—such as the black T-shirt team in this experiment—requires a form of attention. It costs us attention to ignore something. Many of us live and work in environments that require us to ignore a huge amount of information—that flashing advert, a bouncing icon or pop-up.

In another famous psychology experiment, Walter Mischel’s Marshmallow Test, four-year-olds were given a choice: eat one marshmallow immediately, or wait fifteen minutes and get two. Each child, filmed throughout, was left alone in a room in front of a plate holding a marshmallow. They squirmed and fidgeted, poked the marshmallow and stared at the ceiling. A third of the children couldn’t resist the marshmallow and gobbled it up, a third nibbled cautiously, but the last third figured out how to distract themselves. They looked under the table, sang… did anything but look at the sweet. It’s a demonstration of the capacity to reallocate attention. In a follow-up study some years later, those who’d been able to wait for the second marshmallow had better life outcomes, such as academic achievement and health. One New Zealand study of 1,000 children found that this form of self-regulation was a more reliable predictor of future success and wellbeing than even a good IQ or comfortable economic status.

What, then, are the implications of how digital technologies are transforming our patterns of attention? In the current political anxiety about social mobility and inequality, more weight needs to be put on this most crucial and basic skill: sustaining attention.

*

I learned to concentrate as a child. Being a bookworm helped. I’d be completely absorbed in my reading as the noise of my busy family swirled around me. It was good training for working in newsrooms; when I started as a journalist, they were very noisy places with the clatter of keyboards, telephones ringing and fascinating conversations on every side. What has proved much harder to block out is email and text messages.

The digital tech companies know a lot about this widespread habit; many of them have built a business model around it. They’ve drawn on the work of the psychologist B.F. Skinner, who identified back in the Thirties how, in animal behavior, an action can be encouraged with a positive consequence and discouraged by a negative one. In one experiment, he gave a pigeon a food pellet whenever it pecked at a button and the result, as predicted, was that the pigeon kept pecking. Subsequent research established that the most effective way to keep the pigeon pecking was “variable-ratio reinforcement”: give the pigeon a food pellet only sometimes, at unpredictable intervals, and you have it well and truly hooked.
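A minimal sketch in Python makes the mechanism concrete: rewarding each response with a fixed probability is a standard way to approximate a variable-ratio schedule. This is a toy illustration, not anything from Skinner’s own apparatus; the 25 percent reward probability and the peck count are arbitrary assumptions of mine.

    import random

    def variable_ratio_schedule(pecks, reward_probability=0.25, seed=42):
        """Reward each peck with a fixed probability, so that payoffs
        arrive on an unpredictable, variable-ratio schedule."""
        rng = random.Random(seed)
        rewards = 0
        for _ in range(pecks):
            if rng.random() < reward_probability:
                rewards += 1  # an intermittent, unpredictable payoff
        return rewards

    # On average a reward arrives about once every 1 / 0.25 = 4 pecks, but
    # never predictably; it is precisely that unpredictability which keeps
    # the pigeon (or the phone-checker) coming back.
    print(variable_ratio_schedule(100))

The point of the sketch is that nothing about any single peck tells the pigeon whether it will pay off; only the long-run average is fixed, which is why the behavior is so hard to extinguish.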

We’re just like the pigeon pecking at the button when we check our email or phone. It’s a humiliating thought. Variable reinforcement ensures that the customer will keep coming back. It’s the principle behind one of the most lucrative US industries: slot machines, which generate more profit than baseball, films, and theme parks combined. Gambling was once tightly restricted for its addictive potential, but most of us now have the attentional equivalent of a slot machine in our pocket, beside our plate at mealtimes, and by our pillow at night; it comes out even during a meal out, a play at the theater, a film, or a tennis match. Almost nothing is now experienced uninterrupted.

Anxiety about the exponential rise of our gadget addiction and how it is fragmenting our attention is sometimes dismissed as a Luddite reaction to a technological revolution. But that misses the point. The problem is not the technology per se, but the commercial imperatives that drive the new technologies and, unrestrained, colonize our attention by fundamentally changing our experience of time and space, saturating both in information.

In much of public space, wherever your eye lands—from the back of the toilet door to the handrail on the escalator to the hotel key card—an ad is trying to grab your attention, and does so by triggering the oldest instincts of the human mind: fear, sex, and food. Public places become dominated by people trying to sell you something. In his tirade against this commercialization, Crawford cites advertisements on the backs of school report cards and on debit machines where you swipe your card: before you enter your PIN, that gap of a few seconds is now used to show adverts. He describes silence and ad-free experience as “luxury goods” that only the wealthy can afford. Crawford has invented the concept of the “attentional commons,” free public spaces that allow us to choose where to place our attention. He draws the analogy with environmental goods that belong to all of us, such as clean air or clean water.

Some legal theorists are beginning to conceive of our own attention as a human right. One former Google employee warned that “there are a thousand people on the other side of the screen whose job it is to break down the self-regulation you have.” They use the insights into human behavior derived from social psychology—the need for approval, the need to reciprocate others’ gestures, the fear of missing out. Your attention ceases to be your own, pulled and pushed by algorithms. Attention is referred to as the real currency of the future.

*

In 2013, I embarked on a risky experiment in attention: I left my job. Over the previous two years, an inability to concentrate had crept up on me. I could no longer read beyond a few paragraphs. My eyes would glaze over and, even more disastrously for someone who had spent their career writing, I seemed unable to string together my thoughts, let alone write anything longer than a few sentences. When I try to explain the impact, I can only offer a metaphor: it felt like my imagination and use of language were vacuum-packed, like a slab of meat coated in plastic. I had lost the ability to turn ideas around, see them from different perspectives. I could no longer draw connections between disparate ideas.

At the time, I was working in media strategy. It was a culture of back-to-back meetings from 8:30 AM to 6 PM, and there were plenty of advantages to be gained from continuing late into the evening if you had the stamina. Commitment was measured by emails with a pertinent weblink. Meetings were sometimes as brief as thirty minutes and frequently ran through lunch. Meanwhile, everyone was sneaking time to battle with the constant emails, eyes flickering to their phone screens in every conversation. The result was a kind of crazy fog, a mishmash of inconclusive discussions.

At first, it was exhilarating, like being on those crazy rides in a theme park. By the end, the effect was disastrous. I was almost continuously ill, battling migraines and unidentifiable viruses. When I finally made the drastic decision to leave, my income collapsed to a fraction of its previous level and my family’s lifestyle had to change accordingly. I had no idea what I was going to do; I had lost all faith in my ability to write. I told friends I would have to return the advance I’d received to write a book. I had to try to get back to the skills of reflection and focus that had once been ingrained in me.

The first step was to teach myself to read again. I sometimes went to a café, leaving my phone and computer behind. I had to slow down the racing incoherence of my mind so that it could settle on the text and its gradual development of an argument or narrative thread. The turning point in my recovery was a five-week research trip to the Scottish Outer Hebrides. On the journey north of Glasgow, my mobile phone lost its Internet connection. I had cut myself loose, with only the occasional text or call to family back home. Somewhere on the long Atlantic beaches of these wild and dramatic islands, I rediscovered my ability to write.

I attribute that in part to a stunning exhibition I came across in the small harbor town of Lochboisdale, on the island of South Uist. Vija Celmins is an acclaimed Latvian-American artist whose work is famous for its astonishing patience. She can take a year or more to make a woodcut that portrays in minute detail the surface of the sea. A postcard of her work now sits above my desk, a reminder of the power of slow thinking.

Just as we’ve had a slow eating movement, we need a slow thinking campaign. Its manifesto could be the German-language poet Rainer Maria Rilke’s beautiful “Letters to a Young Poet”:

To let every impression and the germ of every feeling come to completion inside, in the dark, in the unsayable, the unconscious, in what is unattainable to one’s own intellect, and to wait with deep humility and patience for the hour when a new clarity is delivered.

Many great thinkers attest that they have their best insights in moments of relaxation, the proverbial brainwave in the bath. We actually need what we most fear: boredom.

When I left my job (and I was lucky that I could), friends and colleagues were bewildered. Why give up a good job? But I felt that here was an experiment worth trying. Crawford frames it well as “intellectual biodiversity.” At a time of crisis, we need people thinking in different ways. If we all jump to the tune of Facebook or Instagram and allow ourselves to be primed by Twitter, the danger is that we lose the “trained powers of concentration” that allow us, in Crawford’s words, “to recognize that independence of thought and feeling is a fragile thing, and requires certain conditions.”

I also took to heart the insights of the historian Timothy Snyder, who concluded from his studies of twentieth-century European totalitarianism that the way to fend off tyranny is to read books, make an effort to separate yourself from the Internet, and “be kind to our language… Think up your own way of speaking.” Dropping out and going offline enabled me to get back to reading, voraciously, and to writing; beyond that, it’s too early to announce the results of my experiment with attention. As Rilke said, “These things cannot be measured by time, a year has no meaning, and ten years are nothing.”

*

A recent column in The New Yorker cheekily suggests that all the fuss about the impact of digital technologies on our attention is nothing more than writers’ worrying about their own working habits. Is all this anxiety about our fragmenting minds a moral panic akin to those that swept Victorian Britain about sexual behavior? Patterns of attention are changing, but perhaps it doesn’t much matter?

My teenage children read much less than I did. One son used to play chess online with a friend, text on his phone, and do his homework all at the same time. I was horrified, but he got a place at Oxford. At his interview, he met a third-year history undergraduate who told him he hadn’t yet read any books in his time at university. But my kids are considerably more knowledgeable about a vast range of subjects than I was at their age. There’s a small voice suggesting that the forms of attention I was brought up with could be a thing of the past; the sustained concentration required to read a whole book will become an obscure niche hobby.

And yet, I’m haunted by a reflection: the magnificent illuminations of the eighth-century Book of Kells have intricate patterning that no one has ever been able to copy, such is the fineness of the tight spirals. Lines are a millimeter apart. They indicate a steadiness of hand and mind—a capability most of us have long since lost. Could we be trading in capacities for focus in exchange for a breadth of reference? Some might argue that’s not a bad trade. But we would lose depth: the artist Paul Klee wrote that he would spend a day in silent contemplation of something before he painted it. Paul Cézanne was similarly known for his trance-like attention to his subject. Madame Cézanne recollected how her husband would gaze at the landscape and tell her, “The landscape thinks itself in me, and I am its consciousness.” The philosopher Maurice Merleau-Ponty describes a contemplative attention in which one steps outside of oneself and immerses oneself in the object of attention.

It’s not just artists who require such depth of attention. Nearly two decades ago, a doctor teaching medical students at Yale was frustrated at their inability to distinguish between types of skin lesions. Their gaze seemed restless and careless. He took his students to an art gallery and told them to look at a picture for fifteen minutes. The exercise sharpened their observational skills, and the program is now used in dozens of US medical schools.

Some argue that losing the capacity for deep attention presages catastrophe. It is the building block of “intimacy, wisdom, and cultural progress,” argues Maggie Jackson in her book Distracted, in which she warns that “as our attentional skills are squandered, we are plunging into a culture of mistrust, skimming, and a dehumanizing merging between man and machine.” Significantly, her research began with a curiosity about why so many Americans were deeply dissatisfied with life. She argues that losing the capacity for deep attention makes it harder to make sense of experience and to find meaning—from which comes wonder and fulfillment. She fears a new “dark age” in which we forget what makes us truly happy.

Strikingly, the epicenter of this wave of anxiety over our attention is the US. All the authors I’ve cited are American. It’s been argued that this debate represents an existential crisis for America because it exposes the flawed nature of its greatest ideal, individual freedom. The commonly accepted notion is that to be free is to make choices, and no one can challenge that expression of autonomy. But if our choices are actually engineered by thousands of very clever, well-paid digital developers, are we free? The former Google employee Tristan Harris confessed in an article in 2016 that technology “gives people the illusion of free choice while architecting the menu so that [tech giants] win, no matter what you choose.”

Despite my children’s multitasking, I maintain that vital human capacities—depth of insight, emotional connection, and creativity—are at risk. I’m intrigued as to what the resistance might look like. There are stirrings of protest with the recent establishment of initiatives such as the Time Well Spent movement, founded by tech industry insiders who have become alarmed at the efforts invested in keeping people hooked. But collective action is elusive; the emphasis is repeatedly on the individual to develop the necessary self-regulation, but if that is precisely what is being eroded, we could be caught in a self-reinforcing loop.

One of the most interesting responses to our distraction epidemic is mindfulness. Its popularity is evidence that people are trying to find a way to protect and nourish their minds. Jon Kabat-Zinn, who pioneered the development of secular mindfulness, draws an analogy with jogging: just as keeping your body fit is now well understood, people will come to realize the importance of looking after their minds.

I’ve meditated regularly for twenty years, but curious as to how this is becoming mainstream, I went to an event in the heart of high-tech Shoreditch in London. In a hipster workspace with funky architecture, excellent coffee, and an impressive range of beards, a soft-spoken retired Oxford professor of psychology, Mark Williams, was talking about how multitasking exacts a switching cost in focus and concentration. Our unique human ability to remember the past and to think ahead brings a cost; we lose the present. To counter this, he advocated a daily practice of mindfulness: bringing attention back to the body—the physical sensations of the breath, the hands, the feet. Williams explained how fear and anxiety inhibit creativity. In time, the practice of mindfulness enables you to acknowledge fear calmly and even to investigate it with curiosity. You learn to place your attention in the moment, noticing details such as the sunlight or the taste of the coffee.

On a recent retreat, I was beside a river early one morning and a rower passed. I watched the boat slip by and enjoyed the beauty in a radically new way. The moment was sufficient; there was nothing I wanted to add or take away—no thought of how I wanted to do this every day, or how I wanted to learn to row, or how I wished I was in the boat. Nothing but the pleasure of witnessing it. The busy-ness of the mind had stilled. Mindfulness can be a remarkable bid to reclaim our attention and to claim real freedom, the freedom from our habitual reactivity that makes us easy prey for manipulation.

But I worry that the integrity of mindfulness is fragile, vulnerable both to commercialization by employers who see it as a form of mental performance enhancement and to consumer commodification, rather than contributing to the formation of ethical character. Mindfulness as a meditation practice originates in Buddhism, and without that tradition’s ethics, there is a high risk of it being hijacked and misrepresented.

Back in the Sixties, the countercultural psychologist Timothy Leary rebelled against the conformity of the new mass media age and called for, in Crawford’s words, an “attentional revolution.” Leary urged people to take control of the media they consumed as a crucial act of self-determination; pay attention to where you place your attention, he declared. The social critic Herbert Marcuse believed Leary was fighting the struggle for the ultimate form of freedom, which Marcuse defined as the ability “to live without anxiety.” These were radical prophets whose words have an uncanny resonance today. Distraction has become a commercial and political strategy, and it amounts to a form of emotional violence that cripples people, leaving them unable to gather their thoughts and overwhelmed by a sense of inadequacy. It’s a powerful form of oppression dressed up in the language of individual choice.

The stakes could hardly be higher, as William James knew a century ago: “The faculty of voluntarily bringing back a wandering attention, over and over again, is the very root of judgment, character, and will.” And what are we humans without these three?

Why America’s Major News-Media Must Change Their Thinking

By Eric Zuesse

Source: Strategic Culture Foundation

America’s ‘news’-media possess the mentality that characterizes a dictatorship, not a democracy. This will be documented in the empirical data linked to and discussed below. But first, here is what those data will document, and what will make sense of them:

In a democracy, the public perceive their country to be improving, in accord with that nation’s values and priorities. Consequently, they trust their government, and especially they approve of the job-performance of their nation’s leader. In a dictatorship, they don’t. In a dictatorship, the government doesn’t really represent them at all. It represents the rulers, typically a national oligarchy, an aristocracy of the richest 0.1% or even of only the richest 0.01%. No matter how much the government ‘represents’ the public in law (or “on paper”), it’s not representing them in reality; and so the public don’t trust their government, and the public’s job-rating of their national leader, the head-of-state, is poor, perhaps even showing more disapproval than approval. So, whereas in a democracy the public widely approve of both the government and the head-of-state, in a dictatorship they don’t.

In a dictatorship, the ‘news’-media hide reality from the public, in order to serve the government — not the public. But the quality of government that the regime delivers to its public cannot be hidden as the lies continually pile up, and as the promises remain unfulfilled, and as the public find that despite all of the rosy promises, things are no better than before, or are even becoming worse. Trust in such a government falls, no matter how much the government lies and its media hide the fact that it has been lying. Though a ‘democratic’ election might not retain in power the same leaders, it retains in power the same regime (be it the richest 0.1%, or the richest 0.01%, or The Party, or whatever the dictatorship happens to be). That’s because it’s a dictatorship: it represents the same elite of power-holding insiders, no matter what. It does not represent the public. That elite — whatever it is — is referred to as the “Deep State,” and the same Deep State can control more than one country, in which case there is an empire, which nominally is headed by the head-of-state of its leading country (this used to be called an “Emperor”), but which actually consists of an alliance between the aristocracies within all these countries; and, sometimes, the nominal leading country is actually being led, in its foreign policies, by wealthier aristocrats in the supposedly vassal nations. But no empire can be a democracy, because the residents in no country want to be governed by any foreign power: the public, in every land, want their nation to be free — they want democracy, no dictatorship at all, especially no dictatorship from abroad.

In order for the elite to change, a revolution is required, even if it’s only to a different elite, instead of to a democracy. So, if there is no revolution, then certainly it’s the same dictatorship as before. The elite has changed (and this happens at least as often as generations change), but the dictatorship has not. And in order to change from a dictatorship to a democracy, a revolution also is required, but it will have to be a revolution that totally removes from power the elite (and all their agents) who had been ruling. If this elite had been the nation’s billionaires and its centi-millionaires who had also been billionaire-class donors to political campaigns (such as has been proven to be the case in the United States), then those people, who until the revolution had been behind the scenes producing the bad government, need to be dispossessed of their assets, because their assets were being used as their weapons against the public, and those weapons need (if there is to be a democracy) to be transferred to the public as represented by the new and authentically democratic government. If instead the elite had been a party, then all of those individuals need to be banned from every sort of political activity in the future. But, in either case, there will need to be a new constitution, and a consequent new body of laws, because the old order (the dictatorship) no longer reigns — it’s no longer in force after a revolution. That’s what “revolution” means. It doesn’t necessarily mean “democratic,” but sometimes it does produce a democracy where there wasn’t one before.

The idea that every revolution is democratic is ridiculous, though it’s often assumed in ‘news’-reports. In fact, coups (which the U.S. Government specializes in like no other) often are revolutions that replace a democracy with a dictatorship (as the U.S. Government did to Ukraine in 2014, for example, and, most famously before that, to Iran in 1953). Any country that perpetrates a coup anywhere is a dictatorship over the residents there, just as when an invasion and occupation are perpetrated upon a country. The imposed stooges are stooges, just the same. No country that imposes coups and/or invasions/occupations upon any government that has not posed an existential threat to the perpetrating country’s residents supports democracy; to the exact contrary, that country unjustifiably imposes dictatorships; it spreads its own dictatorship, which is of the imperialistic type, and any government that spreads its dictatorship is evil and needs to be replaced — revolution is certainly justified there.

This is how to identify which countries are democracies, and which ones are not: In a democracy, the public are served by the government, and thus are experiencing improvement in their lives and consequently approve of the job-performance of their head-of-state, and they trust the government. But in a dictatorship, none of these things is true.

In 2014, a Japanese international marketing-research firm polled citizens in each of ten countries, asking whether they approved or disapproved of the job-performance of their nation’s head-of-state. Harvard provided an English-translated version online for a few years, then eliminated that translation from its website; fortunately, the translation had been web-archived and so is permanent here (though with no information regarding methodology or sampling). It shows the following percentages approving of the job-performance of their President or other head-of-state in each of the given countries at that time:

China (Xi)          90%
Russia (Putin)      87%
India (Modi)        86%
South Africa (Zuma) 70%
Germany (Merkel)    67%
Brazil (Rousseff)   63%
U.S. (Obama)        62%
Japan (Abe)         60%
UK (Cameron)        55%
France (Hollande)   48%

In January 2018, the global PR firm Edelman came out with the latest in its annual series of scientifically polled surveys in more than two dozen countries throughout the world, tapping into some of the major criteria within each nation that indicate whether the given nation is closer to the dictatorship model or to the democracy model. The 2018 Edelman Trust Barometer survey showed that “Trust in Government” (scored and ranked on page 39) was 44% in Russia, and only 33% in the United States. Trust in Government was the highest in China: 84%. The U.S. and Russia are the nuclear super-powers; and the U.S. and China are the two economic super-powers; so, these are the world’s three leading powers; and, on that single measure of whether or not a country is democratic, China is the global leader (#1 of 28), Russia is in the middle (#13 of 28), and the U.S. ranks at the bottom of the three, and near the bottom of the entire lot (#21 of 28). (#28 of 28 is South Africa, which thus — clearly in retrospect — had a failed revolution when it transitioned out of its apartheid dictatorship. That’s just a fact, which cannot reasonably be denied, given this extreme finding. Though the nation’s leader, Zuma, was, according to the 2014 Japanese study, widely approved by South Africans, his Government was overwhelmingly distrusted. This distrust indicates that the public don’t believe that the head-of-state actually represents the Government. If the head-of-state doesn’t represent the Government, the country cannot possibly be a democracy: the leader might represent the people, but the Government doesn’t.)

When the government is trusted but the head-of-state is not, or vice-versa, there cannot be a functioning democracy. In other words: if either the head-of-state, or the Government, is widely distrusted, there’s a dictatorship at that time, and the only real question regarding it, is: What type of dictatorship is this?

These figures — the numbers reported here — contradict the ordinary propaganda; and, so, Edelman’s trust-barometer on each nation’s ‘news’-media (scored and ranked on page 40) might also be considered, because the natural question now is whether unreliable news-media might have caused this rank-order, which is counter-intuitive in Western countries. However, a major reason why this media-trust question is actually of only dubious relevance to whether or not the given nation is a democracy is that assuming it is relevant presumes that trust in the government can be easily manipulated — and it actually can’t. Media and PR can’t do that; they can’t achieve it. Contrary to a widespread misconception, trust in government results not from the media but from a government’s having fulfilled its promises, and from the public’s experiencing and seeing all around themselves that those promises clearly have been fulfilled; lying ‘news’-media can’t cover up that reality, which is constantly and directly being experienced by the public.

However, even if trust in the ‘news’-media doesn’t bear on democracy the way trust in the government does, here are those Edelman findings regarding the media, for whatever they’re worth regarding the question of democracy-versus-dictatorship: Trust in Media is the highest, #1, in China, at 71%; it is 42% in the #15 U.S., and 35% in #20 Russia. (A July 2017 Marist poll, however, found that only 30% of Americans trust the media, a stunning 12 percentage points lower than the Edelman survey found.) In other words: Chinese people find that what they encounter in their news-media is borne out in retrospect as having been true, but only half that percentage of Russians experience this; and the U.S. scores nearer to Russia than to China on this matter. (Interestingly, Turkey, which scores #7 on trust-in-government, scores #28 on trust-in-media. Evidently, Turks find that their government delivers well on its promises, but that their ‘news’-media often deceive them. A contrast this extreme within the Edelman findings is unique; Turkey is a special case in this regard.)

I have elsewhere reported regarding other key findings in that 2018 Edelman study.

According to all of these empirical findings, the United States is clearly not more of a democracy than it is a dictatorship. This particular finding has already been confirmed, even more overwhelmingly, by the world’s only in-depth empirical scientific study of whether or not a given country is a “democracy”: the classic Gilens and Page study, which found, incontrovertibly, that the U.S. is a dictatorship — specifically an aristocracy, otherwise commonly called an “oligarchy” — and that it’s specifically a dictatorship by the richest, against the public.

Consequently, whenever the U.S. Government argues that it intends to “spread democracy” (as it claims in regard to Syria, and to Ukraine), it is most flagrantly lying; and any ‘news’-medium that reports such a claim without documenting its clear and already-proven falsehood (such as by linking to this article, which documents that falsehood more fully than has yet been done anywhere, since the Gilens and Page study is here further confirmed by these international data) is no real ‘news’-medium at all, but is, instead, a propaganda-vehicle for the U.S. Government — a propaganda-arm of a nation that has been overwhelmingly shown to be a dictatorship, not a democracy.

MySpace Tom beat Facebook in the long run

Wouldn’t you rather be a rich nobody than whatever Mark Zuckerberg is?

By Jeremy Gordon

Source: The Outline

My MySpace profile was abandoned when, at the ripe age of 18, I decided it was just a little too juvenile — the glittering GIFs affixed to every page, the garish customized designs, the pressure of maintaining your top 8. By 2006, Facebook offered a cleaner social experience; by 2009, Twitter offered a more casual one. MySpace was a complete relic by this point, even though only a few years had passed since its launch.

Back in 2005, though, long before MySpace burned out, its founder, Tom Anderson — whose grinning face greeted every new user as their first “friend” — sold the site for $580 million to Rupert Murdoch’s News Corporation. While his site was becoming a punchline during the rise of Facebook, Twitter, Instagram, and the other social media networks we now use every day, Anderson disappeared entirely from the tech scene. Now he travels the world, documenting his visits to exotic locations.

Contrast that with what’s currently happening to Facebook’s Mark Zuckerberg, who’s on day two of being grilled by a Senate committee for Facebook’s role in haphazardly collecting all of our personal data, and possibly swinging the 2016 presidential election toward Donald Trump. What was supposed to be a basic networking tool has become one of the chief mediators of how people interact with each other and the world around them, and of how information is absorbed and disseminated on the internet. It’s now apparent that Facebook and Zuckerberg didn’t really consider any of this when aggressively pursuing growth, and we’re all screwed as we try to untangle the consequences.

MySpace Tom? His most recent Instagram post from seven days ago is a giveaway for a stay at an Iceland hotel. He doesn’t have to issue any terse statements about his company’s commitment to fostering a healthy society; he doesn’t have to sit on a booster seat for seven hours and take dipshit questions from a procession of Senate ghouls. He isn’t worth as much money as Zuckerberg, of course, but unless you’re an oil baron, $580 million is enough to tide you over for the length of your lifetime, and your children’s lifetime, and your children’s children’s lifetime, and so on. (Even after taxes!) And yes, yes, being that rich is good for nobody, but without getting into an argument about the perils of capitalism, we can agree that personally speaking, Anderson is having a much better go of things.

It puts MySpace’s failure to evolve in a new light; perhaps the healthy thing is for a platform to die and for everyone to move on. Its aesthetic and form, back when everyone had emo bangs and listened to Hawthorne Heights, couldn’t change without altering the meaning of the site altogether, and by that time, everyone was gone. Had Facebook not gotten so good at inserting itself between human users, there’s no way it would’ve run into its current problems on such a scale. The suspicious CEO is not the one who cashes out; it’s the one who sticks around and creates a behemoth.

Zuckerberg could have sold off his stake and avoided becoming one of the most disliked people of the present moment. I never thought we’d declare MySpace the winner over Facebook, but then again, I never thought a lot of things about the moment we’re in.