Foxes Guard Facebook Henhouse


By F. William Engdahl

Source: New Eastern Outlook

Since the “Pizzagate” leaks of data alleging that Hillary Clinton campaign chairman John Podesta and other highly influential political figures in Washington were connected to Comet Ping Pong, an unusual pizza place near the White House run by 41-year-old James Achilles Alefantis, the latest mantra of CIA-linked media has been the need to crack down on (i.e. censor) what is being called “Fake News.” The latest step in this internet censorship drive is a decision by the murky social media organization called Facebook to hire outside organizations to determine whether Facebook messages are pushing Fake News. Now it comes out that the private “fact check” organizations used by Facebook are tied to the CIA and to CIA-related NGOs, including George Soros’ Open Society Foundations.

In the last weeks of the US Presidential campaign, Wikileaks released a huge number of emails linked to Clinton campaign chairman John Podesta. Thousands of these emails revealed detailed exchanges between Podesta, the oddly influential Comet Ping Pong owner Alefantis, and the Clinton campaign, which held fundraisers at Comet Ping Pong.

The Pizzagate scandal exploded in the final weeks of the US campaign as teams of private researchers documented and posted Facebook, Instagram and other data suggesting that Alefantis and Comet Ping Pong were at the heart of a pedophilia ring that implicated some of the most prominent politicians in Washington and beyond.

The New York Times and Washington Post moved swiftly to assert that the Pizzagate revelations were Fake News, quoting “anonymous sources” who supposedly said the CIA “believed” Russia was behind the hackers who exposed emails and documents from the Democratic National Committee (DNC) and Hillary Clinton’s campaign chairman John Podesta. Former NSA senior intelligence expert William Binney claimed the Podesta and Clinton campaign data were leaked, not hacked. The NSA, he pointed out, would immediately identify a hack, especially a foreign hack, and it has remained silent.

The uncovering and release to Wikileaks of the Podesta emails were immediately blamed on Russian intelligence by the CIA, and now by the US President, without a shred of proof, and despite the fact that the NSA, which would be able to identify such a hack, has remained silent. Wikipedia, whose content is often manipulated by US intelligence agencies, rapidly posted a page with the curious title, “Pizzagate (Conspiracy Theory).”

To make certain the neutral interested reader gets the message, the first line reads, “Pizzagate is a debunked conspiracy theory which emerged during the 2016 United States presidential election cycle, alleging that John Podesta’s emails, which were leaked by WikiLeaks, contain coded messages referring to human trafficking, and connecting a number of pizzerias in Washington, D.C. and members of the Democratic Party to a child-sex ring.”

‘Fake News’ Mantra Begins

My purpose in mentioning Pizzagate details is not to demonstrate the authenticity of the Pizzagate allegations; others are doing that with far more resources. Rather, it is to point out how the explosive Pizzagate email releases by Julian Assange’s Wikileaks coincided in time with the launch of a massive mainstream media and political campaign against what is now being called “Fake News.”

The New York Times article that Wikipedia cites as “debunking” the Pizzagate allegations states, “None of it was true. While Mr. Alefantis has some prominent Democratic friends in Washington and was a supporter of Mrs. Clinton, he has never met her, does not sell or abuse children, and is not being investigated by law enforcement for any of these claims. He and his 40 employees had unwittingly become real people caught in the middle of a storm of fake news.” The article contains not one concrete proof that the allegations are false; it merely quotes Alefantis as the poor victim of malicious Fake News.

That New York Times story was accompanied by a series of articles with headlines such as “How Fake News Goes Viral: A Case Study” and “Obama, With Angela Merkel in Berlin, Assails Spread of Fake News.” Then, on November 19, the strong Clinton supporter and Facebook billionaire Mark Zuckerberg was quoted in a prominent article titled “Facebook Considering Ways to Combat Fake News, Mark Zuckerberg Says.”

Facebook uses CIA Censors

Zuckerberg, founder and CEO of the world-leading social media site Facebook.com and the world’s fifth-wealthiest man with an estimated $50 billion, has now established a network of “Third-Party Fact Checkers” whose job is to flag messages from any of the site’s estimated one billion users with a prominent warning that reads, “Disputed by Third-Party Fact Checkers.”

Facebook has announced that it is taking its censorship cues from something called the International Fact-Checking Network (IFCN). The IFCN, a new creation, has drafted a code of five principles for news websites to accept, and Facebook will work with “third-party fact checking organizations” that are signatories to that code.

If we search under the name International Fact-Checking Network, we find ourselves at the homepage of something called the Poynter Institute for Media Studies in St. Petersburg, Florida.

OK. If we look a bit deeper, we find that the Poynter Institute’s International Fact-Checking Network, in turn, as its website states, gets money from the Bill & Melinda Gates Foundation, Google, the National Endowment for Democracy, the Omidyar Network and the Open Society Foundations of George Soros.

Oh my, oh my! The Bill & Melinda Gates Foundation, which partners with Soros in numerous nasty projects such as convincing African countries to accept Genetically Modified or GMO seeds? Google, whose origins date back to funding by the CIA and NSA as part of what intelligence researcher Nafeez Ahmed describes as a “plethora of private sector start-ups co-opted by US intelligence to retain ‘information superiority’”?

The Omidyar Network is the foundation of eBay founder and multibillionaire Pierre Omidyar; among other projects, it finances the online digital publication The Intercept, launched in 2014 by Glenn Greenwald, Laura Poitras and Jeremy Scahill.

And the National Endowment for Democracy (NED), the US Government-financed “private” NGO behind every CIA regime-change operation from the Ukraine Color Revolutions to the Arab Spring? The NED was a CIA project created in the 1980s during the Reagan Administration as part of privatizing US intelligence dirty operations. As Allen Weinstein, who drafted the Congressional legislation to establish the NED, put it in a candid 1991 Washington Post interview, “A lot of what we do today was done covertly 25 years ago by the CIA.”

And if we dig even deeper we find, lo and behold, the name of George Soros, convicted hedge fund insider trader, tax-exempt philanthropist and giga-billionaire, who has funded not only Hillary Clinton but also, through his network of Open Society Foundations, virtually every CIA and US State Department Color Revolution from Russia to China to Iran, as well as the 1990s Jeffrey Sachs “Shock Therapy” plunder of Russia and most of formerly Communist Eastern Europe.

Another of the media organizations working with Zuckerberg’s Facebook on its censorship of Fake News is the Washington Post, today owned by Amazon’s billionaire founder Jeff Bezos. Bezos is a major business partner of none other than the US Central Intelligence Agency, a fact he neglected to disclose after taking over ownership of the most important newspaper in Washington.

Bezos’ Washington Post recently published a bizarre list of 200 websites it claimed generated Fake News, refusing to identify who gave it the list. Veteran Washington investigative reporter Wayne Madsen exposed the source of the McCarthy-style blacklist of so-called Fake News: a “website called PropOrNot.com that has links to the CIA and George Soros.”

It’s not merely the Pizzagate revelations that have triggered such a massive attack on independent Internet websites. Control of information on the Internet was already a top item of discussion back in January 2014 at the Davos World Economic Forum. At the time, Madsen noted, “With the impending demise of World Wide Web ‘net neutrality,’ which has afforded equal access for website operators to the Internet, the one percent of billionaire investors are busy positioning themselves to take over total control of news reporting on the Internet.”

It’s not even the foxes who are guarding the Internet henhouse. It’s the werewolves of CIA and US Government censorship. Whether or not the explosive Pizzagate Podesta revelations merely triggered a dramatic acceleration in the timetable for the CIA’s planned “Fake News” operation, the successor to its 1980s “conspiracy theory” linguistic discrediting operation, it’s clear this is no unbiased, objective, transparent public service to protect the Internet public from harmful content.

And, besides, who are they to tell you or me what we are allowed to read, digest and form our own independent ideas about? This is a 21st Century reincarnation of the Spanish Inquisition, one run by the real fake newsmakers: the Washington Post, AP, ABC News, Snopes.com, FactCheck.org, the CIA and friends. I would call it an alarming development in cyber warfare, waged not by Russia but by those CIA-run networks that are fomenting Fake News to demonize anyone and everyone who opposes Washington intelligence propaganda.

 

F. William Engdahl is a strategic risk consultant and lecturer. He holds a degree in politics from Princeton University and is a best-selling author on oil and geopolitics, writing exclusively for the online magazine “New Eastern Outlook.”

 

The world wide cage


Technology promised to set us free. Instead it has trained us to withdraw from the world into distraction and dependency

By Nicholas Carr

Source: Aeon

It was a scene out of an Ambien nightmare: a jackal with the face of Mark Zuckerberg stood over a freshly killed zebra, gnawing at the animal’s innards. But I was not asleep. The vision arrived midday, triggered by the Facebook founder’s announcement – in spring 2011 – that ‘The only meat I’m eating is from animals I’ve killed myself.’ Zuckerberg had begun his new ‘personal challenge’, he told Fortune magazine, by boiling a lobster alive. Then he dispatched a chicken. Continuing up the food chain, he offed a pig and slit a goat’s throat. On a hunting expedition, he reportedly put a bullet in a bison. He was ‘learning a lot’, he said, ‘about sustainable living’.

I managed to delete the image of the jackal-man from my memory. What I couldn’t shake was a sense that in the young entrepreneur’s latest pastime lay a metaphor awaiting explication. If only I could bring it into focus, piece its parts together, I might gain what I had long sought: a deeper understanding of the strange times in which we live.

What did the predacious Zuckerberg represent? What meaning might the lobster’s reddened claw hold? And what of that bison, surely the most symbolically resonant of American fauna? I was on to something. At the least, I figured, I’d be able to squeeze a decent blog post out of the story.

The post never got written, but many others did. I’d taken up blogging early in 2005, just as it seemed everyone was talking about ‘the blogosphere’. I’d discovered, after a little digging on the domain registrar GoDaddy, that ‘roughtype.com’ was still available (an uncharacteristic oversight by pornographers), so I called my blog Rough Type. The name seemed to fit the provisional, serve-it-raw quality of online writing at the time.

Blogging has since been subsumed into journalism – it’s lost its personality – but back then it did feel like something new in the world, a literary frontier. The collectivist claptrap about ‘conversational media’ and ‘hive minds’ that came to surround the blogosphere missed the point. Blogs were crankily personal productions. They were diaries written in public, running commentaries on whatever the writer happened to be reading or watching or thinking about at the moment. As Andrew Sullivan, one of the form’s pioneers, put it: ‘You just say what the hell you want.’ The style suited the jitteriness of the web, that needy, oceanic churning. A blog was critical impressionism, or impressionistic criticism, and it had the immediacy of an argument in a bar. You hit the Publish button, and your post was out there on the world wide web, for everyone to see.

Or to ignore. Rough Type’s early readership was trifling, which, in retrospect, was a blessing. I started blogging without knowing what the hell I wanted to say. I was a mumbler in a loud bazaar. Then, in the summer of 2005, Web 2.0 arrived. The commercial internet, comatose since the dot-com crash of 2000, was up on its feet, wide-eyed and hungry. Sites such as MySpace, Flickr, LinkedIn and the recently launched Facebook were pulling money back into Silicon Valley. Nerds were getting rich again. But the fledgling social networks, together with the rapidly inflating blogosphere and the endlessly discussed Wikipedia, seemed to herald something bigger than another gold rush. They were, if you could trust the hype, the vanguard of a democratic revolution in media and communication – a revolution that would change society forever. A new age was dawning, with a sunrise worthy of the Hudson River School.

Rough Type had its subject.

The greatest of the United States’ homegrown religions – greater than Jehovah’s Witnesses, greater than the Church of Jesus Christ of Latter-Day Saints, greater even than Scientology – is the religion of technology. John Adolphus Etzler, a Pittsburgher, sounded the trumpet in his testament The Paradise Within the Reach of All Men (1833). By fulfilling its ‘mechanical purposes’, he wrote, the US would turn itself into a new Eden, a ‘state of superabundance’ where ‘there will be a continual feast, parties of pleasures, novelties, delights and instructive occupations’, not to mention ‘vegetables of infinite variety and appearance’.

Similar predictions proliferated throughout the 19th and 20th centuries, and in their visions of ‘technological majesty’, as the critic and historian Perry Miller wrote, we find the true American sublime. We might blow kisses to agrarians such as Jefferson and tree-huggers such as Thoreau, but we put our faith in Edison and Ford, Gates and Zuckerberg. It is the technologists who shall lead us.

Cyberspace, with its disembodied voices and ethereal avatars, seemed mystical from the start, its unearthly vastness a receptacle for the spiritual yearnings and tropes of the US. ‘What better way,’ wrote the philosopher Michael Heim in ‘The Erotic Ontology of Cyberspace’ (1991), ‘to emulate God’s knowledge than to generate a virtual world constituted by bits of information?’ In 1999, the year Google moved from a Menlo Park garage to a Palo Alto office, the Yale computer scientist David Gelernter wrote a manifesto predicting ‘the second coming of the computer’, replete with gauzy images of ‘cyberbodies drift[ing] in the computational cosmos’ and ‘beautifully laid-out collections of information, like immaculate giant gardens’.

The millenarian rhetoric swelled with the arrival of Web 2.0. ‘Behold,’ proclaimed Wired in an August 2005 cover story: we are entering a ‘new world’, powered not by God’s grace but by the web’s ‘electricity of participation’. It would be a paradise of our own making, ‘manufactured by users’. History’s databases would be erased, humankind rebooted. ‘You and I are alive at this moment.’

The revelation continues to this day, the technological paradise forever glittering on the horizon. Even money men have taken sidelines in starry-eyed futurism. In 2014, the venture capitalist Marc Andreessen sent out a rhapsodic series of tweets – he called it a ‘tweetstorm’ – announcing that computers and robots were about to liberate us all from ‘physical need constraints’. Echoing Etzler (and Karl Marx), he declared that ‘for the first time in history’ humankind would be able to express its full and true nature: ‘we will be whoever we want to be.’ And: ‘The main fields of human endeavour will be culture, arts, sciences, creativity, philosophy, experimentation, exploration, adventure.’ The only thing he left out was the vegetables.

Such prophecies might be dismissed as the prattle of overindulged rich guys, but for one thing: they’ve shaped public opinion. By spreading a utopian view of technology, a view that defines progress as essentially technological, they’ve encouraged people to switch off their critical faculties and give Silicon Valley entrepreneurs and financiers free rein in remaking culture to fit their commercial interests. If, after all, the technologists are creating a world of superabundance, a world without work or want, their interests must be indistinguishable from society’s. To stand in their way, or even to question their motives and tactics, would be self-defeating. It would serve only to delay the wonderful inevitable.

The Silicon Valley line has been given an academic imprimatur by theorists from universities and think tanks. Intellectuals spanning the political spectrum, from Randian right to Marxian left, have portrayed the computer network as a technology of emancipation. The virtual world, they argue, provides an escape from repressive social, corporate and governmental constraints; it frees people to exercise their volition and creativity unfettered, whether as entrepreneurs seeking riches in the marketplace or as volunteers engaged in ‘social production’ outside the marketplace. As the Harvard law professor Yochai Benkler wrote in his influential book The Wealth of Networks (2006):

This new freedom holds great practical promise: as a dimension of individual freedom; as a platform for better democratic participation; as a medium to foster a more critical and self-reflective culture; and, in an increasingly information-dependent global economy, as a mechanism to achieve improvements in human development everywhere.

Calling it a revolution, he said, is no exaggeration.

Benkler and his cohort had good intentions, but their assumptions were bad. They put too much stock in the early history of the web, when the system’s commercial and social structures were inchoate, its users a skewed sample of the population. They failed to appreciate how the network would funnel the energies of the people into a centrally administered, tightly monitored information system organised to enrich a small group of businesses and their owners.

The network would indeed generate a lot of wealth, but it would be wealth of the Adam Smith sort – and it would be concentrated in a few hands, not widely spread. The culture that emerged on the network, and that now extends deep into our lives and psyches, is characterised by frenetic production and consumption – smartphones have made media machines of us all – but little real empowerment and even less reflectiveness. It’s a culture of distraction and dependency. That’s not to deny the benefits of having easy access to an efficient, universal system of information exchange. It is to deny the mythology that shrouds the system. And it is to deny the assumption that the system, in order to provide its benefits, had to take its present form.

Late in his life, the economist John Kenneth Galbraith coined the term ‘innocent fraud’. He used it to describe a lie or a half-truth that, because it suits the needs or views of those in power, is presented as fact. After much repetition, the fiction becomes common wisdom. ‘It is innocent because most who employ it are without conscious guilt,’ Galbraith wrote in 1999. ‘It is fraud because it is quietly in the service of special interest.’ The idea of the computer network as an engine of liberation is an innocent fraud.

I love a good gizmo. When, as a teenager, I sat down at a computer for the first time – a bulging, monochromatic terminal connected to a two-ton mainframe processor – I was wonderstruck. As soon as affordable PCs came along, I surrounded myself with beige boxes, floppy disks and what used to be called ‘peripherals’. A computer, I found, was a tool of many uses but also a puzzle of many mysteries. The more time you spent figuring out how it worked, learning its language and logic, probing its limits, the more possibilities it opened. Like the best of tools, it invited and rewarded curiosity. And it was fun, head crashes and fatal errors notwithstanding.

In the early 1990s, I launched a browser for the first time and watched the gates of the web open. I was enthralled – so much territory, so few rules. But it didn’t take long for the carpetbaggers to arrive. The territory began to be subdivided, strip-malled and, as the monetary value of its data banks grew, strip-mined. My excitement remained, but it was tempered by wariness. I sensed that foreign agents were slipping into my computer through its connection to the web. What had been a tool under my own control was morphing into a medium under the control of others. The computer screen was becoming, as all mass media tend to become, an environment, a surrounding, an enclosure, at worst a cage. It seemed clear that those who controlled the omnipresent screen would, if given their way, control culture as well.

‘Computing is not about computers any more,’ wrote Nicholas Negroponte of the Massachusetts Institute of Technology in his bestseller Being Digital (1995). ‘It is about living.’ By the turn of the century, Silicon Valley was selling more than gadgets and software: it was selling an ideology. The creed was set in the tradition of US techno-utopianism, but with a digital twist. The Valley-ites were fierce materialists – what couldn’t be measured had no meaning – yet they loathed materiality. In their view, the problems of the world, from inefficiency and inequality to morbidity and mortality, emanated from the world’s physicality, from its embodiment in torpid, inflexible, decaying stuff. The panacea was virtuality – the reinvention and redemption of society in computer code. They would build us a new Eden not from atoms but from bits. All that is solid would melt into their network. We were expected to be grateful and, for the most part, we were.

Our craving for regeneration through virtuality is the latest expression of what Susan Sontag in On Photography (1977) described as ‘the American impatience with reality, the taste for activities whose instrumentality is a machine’. What we’ve always found hard to abide is that the world follows a script we didn’t write. We look to technology not only to manipulate nature but to possess it, to package it as a product that can be consumed by pressing a light switch or a gas pedal or a shutter button. We yearn to reprogram existence, and with the computer we have the best means yet. We would like to see this project as heroic, as a rebellion against the tyranny of an alien power. But it’s not that at all. It’s a project born of anxiety. Behind it lies a dread that the messy, atomic world will rebel against us. What Silicon Valley sells and we buy is not transcendence but withdrawal. The screen provides a refuge, a mediated world that is more predictable, more tractable, and above all safer than the recalcitrant world of things. We flock to the virtual because the real demands too much of us.

‘You and I are alive at this moment.’ That Wired story – under the headline ‘We Are the Web’ – nagged at me as the excitement over the rebirth of the internet intensified through the fall of 2005. The article was an irritant but also an inspiration. During the first weekend of October, I sat at my Power Mac G5 and hacked out a response. On Monday morning, I posted the result on Rough Type – a short essay under the portentous title ‘The Amorality of Web 2.0’. To my surprise (and, I admit, delight), bloggers swarmed around the piece like phagocytes. Within days, it had been viewed by thousands and had sprouted a tail of comments.

So began my argument with – what should I call it? There are so many choices: the digital age, the information age, the internet age, the computer age, the connected age, the Google age, the emoji age, the cloud age, the smartphone age, the data age, the Facebook age, the robot age, the posthuman age. The more names we pin on it, the more vaporous it seems. If nothing else, it is an age geared to the talents of the brand manager. I’ll just call it Now.

It was through my argument with Now, an argument that has now careered through more than a thousand blog posts, that I arrived at my own revelation, if only a modest, terrestrial one. What I want from technology is not a new world. What I want from technology are tools for exploring and enjoying the world that is – the world that comes to us thick with ‘things counter, original, spare, strange’, as Gerard Manley Hopkins once described it. We might all live in Silicon Valley now, but we can still act and think as exiles. We can still aspire to be what Seamus Heaney, in his poem ‘Exposure’, called inner émigrés.

A dead bison. A billionaire with a gun. I guess the symbolism was pretty obvious all along.

Stripping the veneer off America’s propaganda menagerie


By Wayne Madsen

Source: Intrepid Report

National Security Agency whistleblower Edward Snowden and WikiLeaks founder Julian Assange have doubled down recently on the games being played in cyberspace by America’s cyberwarriors. Snowden suggests that many of the NSA’s most damaging malware programs are now in the hands of America’s opponents, thanks to enterprising foreign counterintelligence hackers known as the Shadow Brokers. Snowden believes that the malware, including destructive programs such as Stuxnet, is being auctioned off, via Bitcoin payments, by the Shadow Brokers. Snowden stated that the malware was obtained by hacking a murky NSA operation called the “Equation Group.”

Assange, fearful that a new Ecuadorian president will hand him over to a Clinton administration in 2017, claims to have more hacked bombshells to drop on Team Clinton, courtesy of weak security in Democratic National Committee and Clinton campaign computer systems.

We have entered a new phase of cyberwarfare, one in which America’s (and Israel’s) most damaging computer hacking and disruption programs are available to anyone willing to pay in Bitcoins on the cyber black market. The Democratic Party’s leaked emails, coupled with the leaked State Department cables, have Hillary Clinton outraged. These disclosures, along with the Snowden revelations that illustrate how America spies on friend and foe, have stripped the veneer off America’s propaganda menagerie. Two of the three culprits Mrs. Clinton would like to see in prison for the rest of their lives are, for the time being, outside of Gulag America. Snowden is enjoying political asylum in Russia and Assange has asylum at the Ecuadorian embassy in London. The third, Chelsea Manning, is serving a 35-year prison term at Fort Leavenworth in Kansas and allegedly recently attempted suicide.

Paul Ceglia, who claims to have been the co-founder of Facebook, says he is on the run from the CIA after he filed suit against Facebook and its owner Mark Zuckerberg. Although Zuckerberg admits to having a past business relationship with Ceglia, the US Justice Department criminally charged Ceglia for trying to defraud Facebook after the former associate of Zuckerberg brought a civil suit in federal court in Buffalo against the company. Interestingly, Facebook has donated more money to Hillary Clinton than any other presidential candidate. But what is really at issue in the bizarre case is that Ceglia claims that Facebook’s seed money came from the CIA’s venture capital firm IN-Q-TEL, a charge to which WMR can attest after compiling a massive list of CIA front companies and proprietaries in the soon-to-be-published book: “The Almost Classified Guide to CIA Front Companies, Proprietaries and Contractors.”

The CIA and its partners at Facebook, Google, Yahoo, Microsoft, and other social media firms have striven to control the new media in the same manner that the CIA controlled the “old media” through operations like MOCKINGBIRD. During the Cold War era, the CIA claimed that all the world’s ills were due to Communist front organizations that influenced the media. The truth is that the so-called “fronts” often provided actual accounts of the misdeeds of the CIA and other Western intelligence agencies. However, with U.S. newspapers, magazines, and broadcast networks carrying the water for the CIA, it was Langley’s interpretation of the news that made Western headlines. The “Communist” reports were relegated to the nether regions of “Soviet disinformation” campaigns and “active measures.” The CIA laughably put out a periodical report on such “disinformation” tactics. In reality, what was called “disinformation” was actually bona fide news.

Today, when the CIA wants to debase a news article, it uses such operations as Snopes.com and Wikipedia to engage in CIA disinformation tactics. Uncomfortable truthful news items are quickly dispatched with the term “conspiracy theory.” There is little doubt that Facebook, Wikipedia, and Snopes are part of a “new MOCKINGBIRD” designed for the digital age. Like them or not, Snowden, Assange, Manning, Ceglia, and others have pulled the veil off of the new MOCKINGBIRD.

A formerly secret February 1987 CIA report on Soviet disinformation tactics illustrates that what was described then as “propaganda” was, in fact, the truth.

  • The CIA called baseless the charges in a Soviet book that Jonestown, Guyana was a CIA behavioral control operation. It was.
  • The Soviets accused Ronald Reagan’s Strategic Defense Initiative (SDI) of having the goal of weaponizing outer space. Not only was that the goal then, but it remains the goal of present incarnations of SDI.
  • The Soviets and the Afghan president, Najibullah of the Afghan Communist Party, said that they had reached out to 50,000 Afghan mujaheddin in Afghanistan and Pakistan, who agreed to lay down their arms and join a coalition government, with eight opposition parties joining the Communists. The CIA and Western media called the news bogus. It was true, as television footage of Afghan refugees returning to their homeland from India showed. The Soviets wanted an internationally-guaranteed neutral Afghanistan before withdrawing their troops. The CIA wanted a radical Islamic Afghanistan from which to launch attacks on the southern Soviet Union. That decision came back to bite the United States on September 11, 2001.
  • The CIA accused the Sandinista government of Nicaragua and the Soviets of being behind the Christian “Evangelical Committee for Development Aid,” branding it a Communist front group. If so, it would have been the first time Communists and Christian evangelicals broke bread together. The CIA’s charge was fatuously false.
  • The Soviets accused the U.S. of using Africans as test subjects for a new AIDS vaccine. This charge has since been borne out, with Africans used as “guinea pigs” for various new vaccines in programs funded by the Bill and Melinda Gates Foundation, the Clinton Global Initiative, Pfizer Corporation, and other entities linked to CIA biological and genetic warfare operations.
  • Articles in two Bolivian newspapers stating that the U.S. Information Service in La Paz was trying to recruit Bolivian journalists to write pro-Pentagon articles were deemed by the CIA to be bogus. The CIA’s charge was false, and it included smearing the Federation of Bolivian Press Workers as a Communist front. That is the CIA’s usual practice when it is caught red-handed.
  • The Soviet news agency Novosti was accused of running a false article, titled “The Relationship Between Journalists and the CIA: Hundreds of Them in International Press.” The article was spot on.
  • The CIA dismissed as Soviet disinformation the charges that the CIA killed nine nonaligned leaders, including Indian Prime Ministers Indira and Rajiv Gandhi. In fact, the CIA has killed many more than nine nonaligned leaders.

In the digital world of YouTube, Facebook, Google, and other social and news media sites, the CIA continues its game of disinformation while accusing others of conducting the same game plan. Some three decades after the Cold War, the CIA’s charges of Soviet disinformation can now be seen as disinformation in their own right.

 

Wayne Madsen is a Washington, DC-based investigative journalist and nationally-distributed columnist. He is the editor and publisher of the Wayne Madsen Report (subscription required).

Having Their Cake and Eating Ours Too


By Chris Lehmann

Source: The Baffler

What are billionaires for? It’s time we sussed out a plausible answer to this question, as their numbers ratchet upward across the globe, impervious to the economic setbacks suffered by mere mortals, and their “good works” ooze across the fair land. The most recent count from Forbes reports a record 1,826 of these ten-figure, market-cornering Croesuses, with familiar North American brands holding down the top three spots: Bill Gates, Carlos Slim, and Warren Buffett. Esteemed newcomers to the list include Uber kingpin Travis Kalanick, boasting $5.3 billion in net worth; gay-baiting, evangelical artery-hardeners Dan and Bubba Cathy, of Chick-fil-A fame ($3.2 billion); and Russ Weiner, impresario of the antifreeze-by-another-name energy drink Rockstar ($2.1 billion). For the first time, too, Mark Zuckerberg has cracked the elite Top 20 of global wealth; in fact, fellow Californians, most following Zuckerberg’s savvy footsteps into digital rentiership, account for 23 of the planet’s new billionaires and 131 of the total number—more than supplied by any nation apart from China and the Golden State’s host country, a quaint former republic known as the United States.

What becomes of the not-inconsiderable surplus that your average mogul kicks up in his rush to market conquest? In most cases, he (and in the vast majority of cases, it is still a “he”) parks his boodle in inflation-boosted goods like art and real estate, which neatly double as venerable monuments to his own vanity or taste.

But what happens when the super-rich turn their clever minds toward challenges beyond getting up on the right side of their well-feathered beds? Specifically, what are the likely dividends of their decisions to “give back to the community,” as the charitable mantra of the moment has it? Once upon a time, the Old World ideal of noblesse oblige might have directed their natural stirrings of conscience toward the principles of mutuality and reciprocity. But this is precisely where the new millennial model of capital-hoarding falls apart. The notion that the most materially fortunate among us actually owe the rest of us anything from their storehouses of pelf is now as unlikely as a communard plot twist in an Ayn Rand novel.

Look around at the charitable causes favored among today’s info-elite, and you’ll see the public good packaged as one continual study in billionaire self-portraiture. The Bill and Melinda Gates Foundation, endowed by a celebrated prep-school graduate and Harvard dropout, devotes the bulk of its endowment and nearly all of its intellectual firepower to laying waste to the nation’s teachers’ unions. The Eli and Edythe Broad Foundation is but the Gates operation on steroids, unleashing a shakedown syndicate of overcapitalized and chronically underperforming charter schools in the beleaguered urban centers where the democratic ideal of the common school once flourished. The Clinton Global Initiative, when it’s not furnishing vaguely agreeable alibis for Bill Clinton’s louche traveling companions, is consumed by neoliberal delusions of revolutionary moral self-improvement via the most unlikely of means—the proliferation of the very same sort of dubious financial instruments that touched off the 2008 economic meltdown. In this best of all possible investors’ worlds, swashbuckling info-moralists will teach international sex workers about the folly of their life choices by setting them up with a laptop and an extended tutorial on the genius of microloans.

This recent spike in elite self-infatuation, in other words, bespeaks a distressing new impulse among the fabulously well-to-do. While past campaigns of top-down charity focused on inculcating habits of bourgeois self-control among the lesser-born, today’s philanthro-capitalist seigneurs are seeking to replicate the conditions of their own success amid the singularly unpromising social world of the propertyless, unskilled, less educated denizens of the Global South. It’s less a matter of philanthro-capitalism than one of philanthro-imperialism. Where once the gospel of industrial success held sway among the donor class, we are witnessing the gospel of the just-in-time app, the crowdsourced startup, and the crisply leveraged microloan. This means, among other things, that the objects of mogul charity are regarded less and less as moral agents in their own right and more and more as obliging bit players in a passion play exclusively devoted to dramatizing the all-powerful, disruptive genius of our info-elite. They aren’t “giving back” so much as peering into the lower depths of the global social order and demanding, in the ever-righteous voice of privilege, “Who’s the fairest of them all?”

Noblesse Sans Oblige

There was plenty to deride in the Old World model of noblesse oblige; it dates back to the bad old days of feudal monarchy, when legacy-royal layabouts not only abjured productive labor entirely, but felt justified in the notion that they owned the souls of the peasants tethered to their sprawling estates. It’s no accident, therefore, that the idea of the rich being in receipt of any reciprocal obligation to the main body of the social order failed to make it onto the American scene. The sturdy mythology of the American self-made man didn’t really permit an arriviste material adventurer to look back to his roots at all, save to assure those within earshot that he’d definitively risen above them by the sheer force of an indomitable will-to-succeed.

But the relevant defining trait is the oblige part: the notion that the wealthy not only could elect to “give back” when it might suit their fancy, but that they had to positively let certain social goods alone—and assertively fund others—by virtue of their privileged station. Traditions such as the English commons stemmed from the idea that certain public institutions were inviolate, so far as the enfeoffing prerogatives of the landowning class went. The state church is another, altogether more problematic, legacy of this ancien régime; in addition to owning feudal souls outright, the higher orders of old had to evince some institutional concern for their ultimate destiny. There was exploitation and corruption galore woven into this social contract, of course, but for the more incendiary figures who dared to take its spiritual precepts seriously, there were also strong speculative grounds for envisioning another sort of world entirely, one in which the radical notion of spiritual equality took hold. As the Puritan Leveller John Lilburne—a noble by birth—put it in 1646, in the midst of the English Civil War:

All and every particular and individual man and woman, that ever breathed in the world . . . are by nature all equal and alike in their power, dignity, authority, and majesty, none of them having (by nature) any authority, dominion, or magisterial power, one over or above another.

Of course, the Levellers clearly were not on the winning side of British history, but this militant Puritan spirit migrated to the American colonies to supply the seedbed of our own communitarian ideal, expounded most famously in John Winthrop’s social-gospel oration “A Model of Christian Charity” aboard the Arbella in 1630. Throughout his sermon, Winthrop repeatedly exhorted his immigrant parishioners to practice extreme liberality in charity. “He that gives to the poor, lends to the Lord,” Winthrop declared in an appeal to philanthropic mutuality far less widely quoted than his fabled simile of the colonial settlement of New England as a city on a hill. “And he will repay him even in this life an hundredfold to him or his.” Citing a litany of biblical precedent, Winthrop went on to remind his mostly well-to-do Puritan flock that “the Scripture gives no caution to restrain any from being over liberal this way.” Indeed, he drove home the point much more forcefully as he highlighted the all-too-urgent imperative for these colonial adventurers to hand over the entirety of their substance for fellow settlers in material distress. “The care of the public must oversway all private respects,” Winthrop thundered—and then, sounding every bit the proto-socialist that his countryman Lilburne was: “It is a true rule that particular estates cannot subsist in the ruin of the public.”

The Accumulator As Paragon

The story of how Winthrop’s model of Christian charity degenerated into the neoliberal shibboleths of the Gates and Zuckerberg age is largely the saga of American monopoly capitalism, and far too epic to dally with here. But there is a key transitional figure in this shift: the enormously wealthy, self-made, and terminally self-serious steel-titan-cum-social reformer Andrew Carnegie. Born in rural Scotland in 1835 to an erratically employed artisan weaver, Carnegie grew up on the Chartist slogans that, amid the more secular social unrest of the industrial revolution, came to supplant the Levellers’ democratic visions of a world turned upside down. When he rose from an apprenticeship in a Pittsburgh telegraph office to true mogul status in the railroad, iron, and steel industries, Carnegie continued to cleave to the pleasing reverie that he was a worker’s kind of robber baron. Thanks to his own class background, he intoned, he had unique insight into the plight of the workmen seeking to hew their livings out of the harsh conditions of a new industrial capitalist social order. “Labor is all that the working man has to sell,” Carnegie pronounced just ahead of a series of wage cuts at his Pittsburgh works in 1883. “And he cannot be expected to take kindly to reductions of wages. . . . I think the wages paid at the seaboard of the United States are about as low as men can be expected to take.”

It was vital to Carnegie’s moral vanity to keep maintaining this self-image as the benevolent industrial noble, and he did so well past the point where his actually existing business interests dictated (as he saw it) the systematic beggaring of his workers. When the managers of Carnegie-owned firms would sell their workers short, lock them out, or bust their unions, Carnegie would typically blame the workers for not obtaining better contracts at rival iron, steel, and railroad concerns. While he might sympathize with their generally weak bargaining position, Carnegie well understood that he couldn’t have his competitors undercutting his own bottom line with cheaper labor costs—and with cheaper goods to market to Carnegie’s customers.

Carnegie’s patrician moral sentiments were genuine; throughout his career, he erected an elaborate philosophical defense of philanthropy as the only proper path for the disposition of riches, and famously spent his last years furiously trying to disperse as much of his fortune as possible to pay for charitable foundations, libraries, church organs, and the like. As he saw it, the mogul receives a sacred charge from the larger historical forces that conspire in the creation of his wealth: the rich man must act as a “trustee” for the needier members of the community.

Because the millionaire had proved his mettle as an accumulator of material rewards in the battle for business dominion, it followed that he had also been selected to be the most beneficent, and judicious, dispenser of charitable support for the lower orders as well. In Carnegie’s irenic vision of ever-advancing moral progress, all social forces were tending toward “an ideal state, in which the surplus wealth of the few will become, in the best sense, the property of the many, because administered for the common good,” as he preached in his famous 1889 essay “The Gospel of Wealth.” “And this wealth, passing through the hands of the few, can be made a much more potent force for the elevation of our race than if it had been distributed in small sums to the people themselves.” The accomplished mogul was, in Carnegie’s fanciful telling, nothing less than a dispassionate expert in the optimal disbursal of resources downward: “The man of wealth,” he wrote, became “the mere agent and trustee for his poorer brethren, bringing to their service his superior wisdom, experience, and ability to administer, doing for them better than they would or could do for themselves.”

Such blissfully un-self-aware flourishes of elite condescension—and the intolerable contradictions that called them into being—point at the tensions lurking just beneath Carnegie’s placid, controlling social muse. For as his own career as a market-cornering industrialist made painfully clear, precisely none of Carnegie’s fortune stemmed from serving out a benevolent trusteeship in the interests of the poor and working masses. Indeed, something far more perverse and unsightly impelled the business model for Carnegie’s commercial and charitable pursuits, as his biographer David Nasaw notes: Carnegie used the alibi of his own enlightened, philanthropic genius as the primary justification for denying collective bargaining rights to his workers.

Since he was clearly foreordained to serve the best interests of these workers better than they could, it was ultimately to everyone’s benefit to transform Carnegie’s business holdings into the most profitable enterprises on the planet—all the better to sluice more of the mogul’s ruthlessly extracted wealth back into the hands of a grateful hoi polloi, once it was rationalized and sanctified by the great man’s “superior wisdom, experience, and ability to administer.” In the sanctum of his New York study, where he spent the bulk of his days once his wealth disencumbered him of direct managerial duties at his Pittsburgh holdings, Carnegie found thrilling confirmation of his enlightened moral standing in the writings of social Darwinist Herbert Spencer. Yes, the wholesale suffering of workers, widows, and orphans might seem “harsh,” Spencer preached to his ardent business readership. But when viewed from the proper vantage—the end point toward which all of humanity’s evolutionary struggles were ineluctably trending—this remorseless process of deskilling, displacement, and death was actually a sacred mandate, not to be tampered with: “When regarded not separately, but in connection with the interests of universal humanity, these harsh fatalities are seen to be of the highest beneficence.”

And so, indeed, it came to pass, albeit a bit too vividly for Carnegie’s own moral preference. At the center of the Carnegie firms’ labor-bleeding business model was a landmark tragedy in American labor relations: the 1892 strike at Carnegie’s Homestead works. Carnegie’s lieutenant, Henry Clay Frick, locked out the facility’s workforce after the Amalgamated Association of Iron and Steel Workers pressed management to suspend threatened wage cuts and pare back punishing twelve-hour shifts for steel workers. Frick clumsily tried to ferry in Pinkerton forces on the Monongahela River to take control of the plant; Homestead workers, backed by their families and local business owners, fought to repel the Pinkerton thugs. Gunfire was exchanged on both sides, killing two Pinkertons and nine workers. Eventually, Frick got the state militia to disperse the crowds of workers and their supporters; with his field of action cleared, the plant’s manager proceeded to starve out the strikers, breaking the strike five months after it began. The Amalgamated Union collapsed into oblivion the following year. No union would ever again darken the door of a Carnegie-owned business, no matter what sort of lip service he continued to pay to the dignity of the workingman in public.

Homestead was a bitter rebuke to Carnegie’s self-image as the workers’ expert missionizing advocate—but tellingly, it didn’t do any lasting damage to the larger edifice of his charitable pretension. Partly, this was a function of Carnegie’s genuine generosity. More fundamentally, though, the steel mogul’s outsized moral self-regard endured in its prim, unmolested state thanks to the larger American public consensus on the proper Olympian status of men of wealth, especially when gauged against the demoralizing spectacle of industrial conflict.

Strings, Attached

The desperate intellectual acrobatics of the self-made Carnegie were never viewed as pathological, for the simple reason that they mirrored the logic by which American business interests at large pursued public favor. In this scheme of things, the lords of commerce were always to be the unquestioned possessors of a magisterial historical prerogative, and the base, petty interests of a self-organized labor movement were always the retrograde obstacle to true progress. What else could it mean, after all, for the owners of capital to always and forever be acting “in connection with the interests of universal humanity”? Following the broad contours of Carnegie’s founding efforts in this sphere, a long succession of American business leaders would proceed to claim for themselves the mantle of enlightened market despotism, from GM CEO Charlie “Engine” Wilson’s breezy midcentury conflation of his corporation’s grand good fortune with that of its host nation to the confident prognostications of today’s tech lords that we are about to efface global poverty in the swipe of a few well-designed apps.

So how does the philanthropic debauching of the public sphere unfold today, now that Carnegie’s bifurcated model of exploitation for charity’s sake has receded into the dimly remembered newsreel footage of the industrial age? Well, for one thing, it’s become a lot less genteel. Trusteeship isn’t the model any longer; it’s annexation.

Take one especially revealing case involving our own age’s pet mogul crusade of school reform. Just five years ago, Mark Zuckerberg made a splashy, Oprah-choreographed gift of $100 million to the chronically low-performing Newark public school district—an announcement also timed to coincide with the national release of the union-baiting school reform documentary Waiting for “Superman.” The idea was to enlist the Facebook wizard’s fellow philanthro-capitalists in a matching donor drive, so that the city’s schools, already staked to a $1 billion state-administered budget, would also pick up $200 million of private-sector foundation dosh, to be spent on charter schools and other totems of managerial faux-excellence. With this dramatic infusion of money from our lead innovation industries, it would be largely a formality to “turn Newark into a symbol of educational excellence for the whole nation,” as Zuckerberg told a cheerleading Oprah.

And sure enough, all the usual deep-pocketed benefactors turned out in force to meet the Zuckerberg challenge: Eli Broad, the Gates Foundation, the Walton Foundation, and even Zuckerberg’s chief operating officer, Sheryl “Lean In” Sandberg, all kicked into the kitty. At the public forums rolling out the initiative—organized for a cool $1.3 million by Tusk Strategies, a consultancy concern affiliated with erstwhile New York mayor Michael Bloomberg’s own school-privatizing fiefdom—Newark parents more concerned with securing basic protections for their kids in local schools, such as freedom from gang violence and drug trafficking, exhorted the newly parachuted reform class to focus on the mundane prerequisites of infrastructure support and student safety. But try as they might, they found their voices continually drowned out by a rising chorus of vacuous reform-speak. “It’s destiny that we become the first city in America that makes its whole district a system of excellence,” then-mayor Cory Booker burbled at one such gathering. “We want to go from islands of excellence to a hemisphere of hope.”

But for all these stirring reprises of the Spencerian catechism on “the interests of universal humanity,” the actual state of schooling in Newark was not measurably improving. The leaders of the reform effort (which was, of course, entitled “Startup:Education”) couldn’t answer the most basic questions about how the rapidly deployed battery of excellence-incubating Newark charter schools would coexist beside the shambolic wrecks of the city’s merely public schools, where a majority of Newark kids would still be enrolled—or even how parents of charter kids would get their kids to and from school, since these wise, reforming souls neglected to allot due funding for bus transportation. Not surprisingly, the new plan’s leaders were also cagey about explaining how all the individual school budgets, charter and public alike, were to be brought into line.

So in short order, the magic Zuckerberg seed money, together with the additional $100 million in matching grants, had all vanished. More than $20 million of that went to pay PR and consultancy outfits like Tusk Strategies, according to New Yorker writer Dale Russakoff, who notes that “the going rate for individual consultants in Newark was a thousand dollars a day.” Another $30 million went to pad teachers’ salaries with back pay to buy off workers’ good will—and far more important, to gain the necessary leverage to dismiss or reassign union-protected teachers who didn’t project as the privatizing Superman type. The most enduring legacy of Startup:Education appears to be a wholly unintended political one: disenchanted Newark citizens rallied behind the mayoral candidacy of Ras Baraka, former principal of Newark’s Central High School and son of the late radical poet Amiri Baraka, who was elected last year on a platform of returning Newark educational policy to the control of the community.

With all due allowances for the dramatically disparate character of the underlying social order, and the shift from an Industrial Age economy to a service-driven information one, it’s nonetheless striking to note just how little about the purblind conduct of overclass charity has changed since Carnegie’s time. Just as Carnegie’s own sentimental and imaginary identification with the workers in his employ supplied him with the indispensable rhetorical cover for beggaring said workers of their livelihoods and rights to self-determination in the workplace, so did the leaders of Startup:Education evince just enough peremptory interest in the actual living conditions of Newark school families to net optimal Oprah coverage. And once the Klieg lights dimmed, the real business plan kicked into gear: a sustained feeding frenzy for the neoliberal symbolic analysts professionally devoted to stage-managing the appearance of far-seeing school reform. These high-priced hirelings were of course less brutal and bloodthirsty than the Pinkertons Frick had unleashed on the Homestead workers, but their realpolitik charge was, at bottom, equally stark: to discredit teachers’ unions and community activists while delivering control of a vital social good into the hands of a remote investing and owning class. If the parents and kids grew restive in their appointed role as stage props for the pleasing display of patrician largess, why, they could just hire Uber drivers to dispatch themselves to the new model charter schools, or maybe scare off local gang members by assembling an artillery of firearms generated via their 3-D printers.

In truth, no magic-bullet privatization plan could begin to address the core conditions that sent the Newark schools spiraling into systemic decay: rampant white flight after the 1967 riots, which in turn drained the city of the property-tax revenues needed to sustain a quality educational system, combined with corruption within the city’s political establishment and (yes) among the leadership of its teachers’ unions. To make local education districts respond meaningfully to the needs of the communities they serve, reformers would have to begin at the very opposite end of the class divide from where Startup:Education set up shop—by giving power to the members of said communities, not their self-appointed neoliberal overseers. In other words, common schools should rightly be understood as a commons, not as playthings for bored digital barons or as little success engines, managed like startups in the pejorative sense, left to stall out indefinitely in beta-testing mode until all the money’s gone.

Andrew Carnegie, at least, had the depth of character to recognize when his vision of his world-conquering destiny had gone badly off the rails. In the last years of his life, his infatuation with the stolid charms of mere libraries and church organs seemed to fade, so he adopted a quixotic quest to recalibrate human character entirely. Starting with an ardent—and quite worthy—campaign to stem the worst excesses of American imperialism in the wake of the Spanish-American War, Carnegie then turned to the seemingly insoluble challenge of stamping out altogether the human propensity to make war. When this latter crusade ran afoul of the colossal carnage unleashed in the Great War, he became an uncharacteristically depressed, isolated, and retiring figure, barely reemerging in public life before his death in 1919.

In today’s America, however, no one learns from our mogul class’s leadership mistakes and moral disasters—we just proceed to copy them faster. So when New York’s neoliberal governor Andrew Cuomo tore a page from the Zuckerberg playbook and launched a system of lavish tax breaks for tech firms affiliated with colleges and universities—surely these educational outposts would be model incubators of just-in-time prosperity—nemesis once again beckoned. Indeed, when Cuomo’s economic savants unleashed tech money to do its own bidding in the notional public sphere, the end results proved to be no different than they had been in the Zuckerberg-funded mogul playground of Newark charter schools. Cuomo’s ballyhooed, billion-dollar, five-year plan for way-new digital job creation—called, you guessed it, “Startup New York”—yielded just seventy-six jobs in 2014, according to a report from the state’s Committee on Economic Development. This isn’t a multiplier effect so much as a subtraction one; it’s hard to see how Cuomo could have netted a less impressive return on investment if he had simply left a billion dollars lying out on the street.

Just as Newark vouchsafed us a vision of educational excellence without the messy parents, neighborhood social ills, and union-backed teachers who louse the works up, so has Cuomo choreographed a seamless model of tax breaks operating in a near-complete economic vacuum. Say what you will about the abuses of Old World wealth; a little noblesse oblige might go a long way in these absurdly predatory times.