The world wide cage


Technology promised to set us free. Instead it has trained us to withdraw from the world into distraction and dependency

By Nicholas Carr

Source: Aeon

It was a scene out of an Ambien nightmare: a jackal with the face of Mark Zuckerberg stood over a freshly killed zebra, gnawing at the animal’s innards. But I was not asleep. The vision arrived midday, triggered by the Facebook founder’s announcement – in spring 2011 – that ‘The only meat I’m eating is from animals I’ve killed myself.’ Zuckerberg had begun his new ‘personal challenge’, he told Fortune magazine, by boiling a lobster alive. Then he dispatched a chicken. Continuing up the food chain, he offed a pig and slit a goat’s throat. On a hunting expedition, he reportedly put a bullet in a bison. He was ‘learning a lot’, he said, ‘about sustainable living’.

I managed to delete the image of the jackal-man from my memory. What I couldn’t shake was a sense that in the young entrepreneur’s latest pastime lay a metaphor awaiting explication. If only I could bring it into focus, piece its parts together, I might gain what I had long sought: a deeper understanding of the strange times in which we live.

What did the predacious Zuckerberg represent? What meaning might the lobster’s reddened claw hold? And what of that bison, surely the most symbolically resonant of American fauna? I was on to something. At the least, I figured, I’d be able to squeeze a decent blog post out of the story.

The post never got written, but many others did. I’d taken up blogging early in 2005, just as it seemed everyone was talking about ‘the blogosphere’. I’d discovered, after a little digging on the domain registrar GoDaddy, that ‘roughtype.com’ was still available (an uncharacteristic oversight by pornographers), so I called my blog Rough Type. The name seemed to fit the provisional, serve-it-raw quality of online writing at the time.

Blogging has since been subsumed into journalism – it’s lost its personality – but back then it did feel like something new in the world, a literary frontier. The collectivist claptrap about ‘conversational media’ and ‘hive minds’ that came to surround the blogosphere missed the point. Blogs were crankily personal productions. They were diaries written in public, running commentaries on whatever the writer happened to be reading or watching or thinking about at the moment. As Andrew Sullivan, one of the form’s pioneers, put it: ‘You just say what the hell you want.’ The style suited the jitteriness of the web, that needy, oceanic churning. A blog was critical impressionism, or impressionistic criticism, and it had the immediacy of an argument in a bar. You hit the Publish button, and your post was out there on the world wide web, for everyone to see.

Or to ignore. Rough Type’s early readership was trifling, which, in retrospect, was a blessing. I started blogging without knowing what the hell I wanted to say. I was a mumbler in a loud bazaar. Then, in the summer of 2005, Web 2.0 arrived. The commercial internet, comatose since the dot-com crash of 2000, was up on its feet, wide-eyed and hungry. Sites such as MySpace, Flickr, LinkedIn and the recently launched Facebook were pulling money back into Silicon Valley. Nerds were getting rich again. But the fledgling social networks, together with the rapidly inflating blogosphere and the endlessly discussed Wikipedia, seemed to herald something bigger than another gold rush. They were, if you could trust the hype, the vanguard of a democratic revolution in media and communication – a revolution that would change society forever. A new age was dawning, with a sunrise worthy of the Hudson River School.

Rough Type had its subject.

The greatest of the United States’ homegrown religions – greater than Jehovah’s Witnesses, greater than the Church of Jesus Christ of Latter-Day Saints, greater even than Scientology – is the religion of technology. John Adolphus Etzler, a Pittsburgher, sounded the trumpet in his testament The Paradise Within the Reach of All Men (1833). By fulfilling its ‘mechanical purposes’, he wrote, the US would turn itself into a new Eden, a ‘state of superabundance’ where ‘there will be a continual feast, parties of pleasures, novelties, delights and instructive occupations’, not to mention ‘vegetables of infinite variety and appearance’.

Similar predictions proliferated throughout the 19th and 20th centuries, and in their visions of ‘technological majesty’, as the critic and historian Perry Miller wrote, we find the true American sublime. We might blow kisses to agrarians such as Jefferson and tree-huggers such as Thoreau, but we put our faith in Edison and Ford, Gates and Zuckerberg. It is the technologists who shall lead us.

Cyberspace, with its disembodied voices and ethereal avatars, seemed mystical from the start, its unearthly vastness a receptacle for the spiritual yearnings and tropes of the US. ‘What better way,’ wrote the philosopher Michael Heim in ‘The Erotic Ontology of Cyberspace’ (1991), ‘to emulate God’s knowledge than to generate a virtual world constituted by bits of information?’ In 1999, the year Google moved from a Menlo Park garage to a Palo Alto office, the Yale computer scientist David Gelernter wrote a manifesto predicting ‘the second coming of the computer’, replete with gauzy images of ‘cyberbodies drift[ing] in the computational cosmos’ and ‘beautifully laid-out collections of information, like immaculate giant gardens’.

The millenarian rhetoric swelled with the arrival of Web 2.0. ‘Behold,’ proclaimed Wired in an August 2005 cover story: we are entering a ‘new world’, powered not by God’s grace but by the web’s ‘electricity of participation’. It would be a paradise of our own making, ‘manufactured by users’. History’s databases would be erased, humankind rebooted. ‘You and I are alive at this moment.’

The revelation continues to this day, the technological paradise forever glittering on the horizon. Even money men have taken sidelines in starry-eyed futurism. In 2014, the venture capitalist Marc Andreessen sent out a rhapsodic series of tweets – he called it a ‘tweetstorm’ – announcing that computers and robots were about to liberate us all from ‘physical need constraints’. Echoing Etzler (and Karl Marx), he declared that ‘for the first time in history’ humankind would be able to express its full and true nature: ‘we will be whoever we want to be.’ And: ‘The main fields of human endeavour will be culture, arts, sciences, creativity, philosophy, experimentation, exploration, adventure.’ The only thing he left out was the vegetables.

Such prophecies might be dismissed as the prattle of overindulged rich guys, but for one thing: they’ve shaped public opinion. By spreading a utopian view of technology, a view that defines progress as essentially technological, they’ve encouraged people to switch off their critical faculties and give Silicon Valley entrepreneurs and financiers free rein in remaking culture to fit their commercial interests. If, after all, the technologists are creating a world of superabundance, a world without work or want, their interests must be indistinguishable from society’s. To stand in their way, or even to question their motives and tactics, would be self-defeating. It would serve only to delay the wonderful inevitable.

The Silicon Valley line has been given an academic imprimatur by theorists from universities and think tanks. Intellectuals spanning the political spectrum, from Randian right to Marxian left, have portrayed the computer network as a technology of emancipation. The virtual world, they argue, provides an escape from repressive social, corporate and governmental constraints; it frees people to exercise their volition and creativity unfettered, whether as entrepreneurs seeking riches in the marketplace or as volunteers engaged in ‘social production’ outside the marketplace. As the Harvard law professor Yochai Benkler wrote in his influential book The Wealth of Networks (2006):

This new freedom holds great practical promise: as a dimension of individual freedom; as a platform for better democratic participation; as a medium to foster a more critical and self-reflective culture; and, in an increasingly information-dependent global economy, as a mechanism to achieve improvements in human development everywhere.

Calling it a revolution, he said, is no exaggeration.

Benkler and his cohort had good intentions, but their assumptions were bad. They put too much stock in the early history of the web, when the system’s commercial and social structures were inchoate, its users a skewed sample of the population. They failed to appreciate how the network would funnel the energies of the people into a centrally administered, tightly monitored information system organised to enrich a small group of businesses and their owners.

The network would indeed generate a lot of wealth, but it would be wealth of the Adam Smith sort – and it would be concentrated in a few hands, not widely spread. The culture that emerged on the network, and that now extends deep into our lives and psyches, is characterised by frenetic production and consumption – smartphones have made media machines of us all – but little real empowerment and even less reflectiveness. It’s a culture of distraction and dependency. That’s not to deny the benefits of having easy access to an efficient, universal system of information exchange. It is to deny the mythology that shrouds the system. And it is to deny the assumption that the system, in order to provide its benefits, had to take its present form.

Late in his life, the economist John Kenneth Galbraith coined the term ‘innocent fraud’. He used it to describe a lie or a half-truth that, because it suits the needs or views of those in power, is presented as fact. After much repetition, the fiction becomes common wisdom. ‘It is innocent because most who employ it are without conscious guilt,’ Galbraith wrote in 1999. ‘It is fraud because it is quietly in the service of special interest.’ The idea of the computer network as an engine of liberation is an innocent fraud.

I love a good gizmo. When, as a teenager, I sat down at a computer for the first time – a bulging, monochromatic terminal connected to a two-ton mainframe processor – I was wonderstruck. As soon as affordable PCs came along, I surrounded myself with beige boxes, floppy disks and what used to be called ‘peripherals’. A computer, I found, was a tool of many uses but also a puzzle of many mysteries. The more time you spent figuring out how it worked, learning its language and logic, probing its limits, the more possibilities it opened. Like the best of tools, it invited and rewarded curiosity. And it was fun, head crashes and fatal errors notwithstanding.

In the early 1990s, I launched a browser for the first time and watched the gates of the web open. I was enthralled – so much territory, so few rules. But it didn’t take long for the carpetbaggers to arrive. The territory began to be subdivided, strip-malled and, as the monetary value of its data banks grew, strip-mined. My excitement remained, but it was tempered by wariness. I sensed that foreign agents were slipping into my computer through its connection to the web. What had been a tool under my own control was morphing into a medium under the control of others. The computer screen was becoming, as all mass media tend to become, an environment, a surrounding, an enclosure, at worst a cage. It seemed clear that those who controlled the omnipresent screen would, if given their way, control culture as well.

‘Computing is not about computers any more,’ wrote Nicholas Negroponte of the Massachusetts Institute of Technology in his bestseller Being Digital (1995). ‘It is about living.’ By the turn of the century, Silicon Valley was selling more than gadgets and software: it was selling an ideology. The creed was set in the tradition of US techno-utopianism, but with a digital twist. The Valley-ites were fierce materialists – what couldn’t be measured had no meaning – yet they loathed materiality. In their view, the problems of the world, from inefficiency and inequality to morbidity and mortality, emanated from the world’s physicality, from its embodiment in torpid, inflexible, decaying stuff. The panacea was virtuality – the reinvention and redemption of society in computer code. They would build us a new Eden not from atoms but from bits. All that is solid would melt into their network. We were expected to be grateful and, for the most part, we were.

Our craving for regeneration through virtuality is the latest expression of what Susan Sontag in On Photography (1977) described as ‘the American impatience with reality, the taste for activities whose instrumentality is a machine’. What we’ve always found hard to abide is that the world follows a script we didn’t write. We look to technology not only to manipulate nature but to possess it, to package it as a product that can be consumed by pressing a light switch or a gas pedal or a shutter button. We yearn to reprogram existence, and with the computer we have the best means yet. We would like to see this project as heroic, as a rebellion against the tyranny of an alien power. But it’s not that at all. It’s a project born of anxiety. Behind it lies a dread that the messy, atomic world will rebel against us. What Silicon Valley sells and we buy is not transcendence but withdrawal. The screen provides a refuge, a mediated world that is more predictable, more tractable, and above all safer than the recalcitrant world of things. We flock to the virtual because the real demands too much of us.

‘You and I are alive at this moment.’ That Wired story – under the headline ‘We Are the Web’ – nagged at me as the excitement over the rebirth of the internet intensified through the fall of 2005. The article was an irritant but also an inspiration. During the first weekend of October, I sat at my Power Mac G5 and hacked out a response. On Monday morning, I posted the result on Rough Type – a short essay under the portentous title ‘The Amorality of Web 2.0’. To my surprise (and, I admit, delight), bloggers swarmed around the piece like phagocytes. Within days, it had been viewed by thousands and had sprouted a tail of comments.

So began my argument with – what should I call it? There are so many choices: the digital age, the information age, the internet age, the computer age, the connected age, the Google age, the emoji age, the cloud age, the smartphone age, the data age, the Facebook age, the robot age, the posthuman age. The more names we pin on it, the more vaporous it seems. If nothing else, it is an age geared to the talents of the brand manager. I’ll just call it Now.

It was through my argument with Now, an argument that has now careered through more than a thousand blog posts, that I arrived at my own revelation, if only a modest, terrestrial one. What I want from technology is not a new world. What I want from technology are tools for exploring and enjoying the world that is – the world that comes to us thick with ‘things counter, original, spare, strange’, as Gerard Manley Hopkins once described it. We might all live in Silicon Valley now, but we can still act and think as exiles. We can still aspire to be what Seamus Heaney, in his poem ‘Exposure’, called inner émigrés.

A dead bison. A billionaire with a gun. I guess the symbolism was pretty obvious all along.

David Cronenberg’s Videodrome Was a Technology Prophecy


Editor’s note: Since today marks director David Cronenberg’s 73rd birthday, it’s a good time to appreciate one of his greatest and most notorious works. Though my favorite of his remains the distinctly PKD-like eXistenZ, a close runner-up is the cult classic Videodrome, which the following analysis reappraises in the context of contemporary social media-fixated culture.

By Nathan Jurgenson

Source: Omni Reboot

David Cronenberg’s vision of technology as the “new flesh” in Videodrome isn’t so shocking anymore.

Videodrome is the best movie ever made about Facebook.

What felt “vaguely futuristic” about it in 1983 is prescient today: technology and media are ever more intimate, personal, embodied, an interpenetration that David Cronenberg’s film graphically explores.
Videodrome offers a long-needed correction to how we collectively view and talk about technology. As the anti-Matrix, Videodrome understood that media is not some separate space, but something which burrows into mind and flesh. The present has a funny habit of catching up with David Cronenberg.
Still, Videodrome is deeply of its time and place. It’s set in Toronto, where Cronenberg was born and studied at the same time as University of Toronto superstar media theorist Marshall McLuhan, who coined the phrase “the medium is the message.” Beyond McLuhan’s reputation, Toronto was also known as a wired city; among other things, it was an early adopter of cable television.
Accordingly, Videodrome follows a Toronto cable-television president, Max Renn (James Woods). He becomes involved with a radio psychiatrist named Nicki Brand (Debbie Harry, of Blondie fame), who reminds us of popular criticisms of television culture: we want to be stimulated until we’re desensitized, becoming (at best) apolitical zombies and (at worst) amoral monsters. Television signal saturates this film. The satellite dishes, screens, playback devices, and general aesthetics of analogue video are on glorious, geeked-out display. Although Videodrome’s operating metaphor is television, the film can be understood as a fable about media in general. And what seemed possible with television in 1983 seems obvious today with social media.
Over the course of the film, Max comes to know a “media prophet” named Professor Brian O’Blivion—an obvious homage to Marshall McLuhan. O’Blivion builds a “Cathode Ray Mission,” named after the television set component which shoots electrons and creates images. The Cathode Ray Mission gives the destitute a chance to watch television in order to “patch them back into the world’s mixing board,” akin to McLuhan’s notion of media creating a “global village,” premised on the idea that media and technology, together, form the social fabric. O’Blivion goes on to monologue, “The television screen is the retina of the mind’s eye. Therefore, the television screen is part of the physical structure of the brain. Therefore, whatever appears on the television screen appears as raw experience for those who watch it. Therefore, television is reality; and reality is less than television.”
This is Videodrome’s philosophy. It’s the opposite of The Matrix’s reading of Baudrillard’s theories of simulation, and it goes completely against the common understanding of the Web as “virtual,” of the so-called “offline” as “real.” O’Blivion would agree when I claim that “it is wrong to say ‘IRL’ to mean offline: Facebook is real life.”
This logic—that the Web is some other place we visit, a “cyber” space, something “virtual” and hence unreal—is what I call “digital dualism” and I think it’s dead wrong. Instead, we need a far more synthetic understanding of technology and society, media and bodies, physicality and information as perpetually enmeshed and co-determining. If The Matrix is the film of digital dualism, Videodrome is its synthetic and augmented opponent.
As P.J. Rey illustrates, fictional Web-spatiality is the favorite digital dualist plot device. Yet more than fiction books and films, what has come to dominate much of our cultural mythology around the Web is the idea that we are trading “real” communication for something simply mechanical: that real friendship, sex, thinking, and whatever else lazy op-ed writers can imagine are being replaced by merely simulated experiences. The non-coincidental byproduct of inventing the notion of a “cyber” space is the simultaneous invention of “the real,” the “IRL,” the offline space that is more human, deep, and true. Where The Matrix’s green lines of code or Neal Stephenson’s 3D Metaverse may have been the sci-fi milieu of the 1990s, the idea of a natural “offline” world is today’s preferred fiction.
Alternatively, what makes Videodrome, and Cronenberg’s oeuvre in general, so useful for understanding social media is their fundamental assumption that there is nothing “natural” about the body. Cronenberg’s trademark flavor of body-horror is highly posthuman: boundaries are pushed and queered, first through medical technologies in Shivers, Rabid, The Brood, and Scanners, then through media technology in Videodrome and eXistenZ, then, most notoriously, in The Fly, where the human and animal merge. If The Matrix is René Descartes, Videodrome is Donna Haraway.
Cronenberg’s characters are consistent with Haraway’s theory of the cyborg: not the half-robot with the shifty laser eye, but you and me. In the film, the goal is never to remove the videodrome signal that is augmenting the body, but to reprogram it. To direct it. As Haraway famously wrote, “I’d rather be a cyborg than a goddess.” “Natural” was never a real option anyways.
Max Renn is especially good at finding the real in the so-called “virtual” because he is equally good at seeing virtuality in the “real.” From the beginning, he understands that much of everyday life is a massive media event devoid of meaning. The old flesh is tired, used up, and toxic. The world is filled with a suffering assuaged only by glowing television screens. As the film progresses, the real and unreal blur, making each seem hyperbolic: hallucinations become tangible, while the tangible drips with a surrealism that’s gritty, jumpy, dirty, erotic, and violent—closer to Spring Breakers than The Wizard of Oz. As such, Cronenberg’s universe is always a little sticky: an unease which begs the nightmares to come true, so that we at least know what’s real.
Videodrome’s depiction of techno-body synthesis is, to be sure, intense; Cronenberg has the unusual talent of making violent, disgusting, and erotic things seem even more so. The technology is veiny and lubed. It breathes and moans; after watching the film, I want to cut my phone open just to see if it will bleed. Fittingly, the film was originally titled “Network of Blood,” which is precisely how we should understand social media: as a technology not just of wires and circuits, but of bodies and politics. There’s nothing anti-human about technology: the smartphone that you rub and take to bed is a technology of flesh. Information penetrates the body in increasingly intimate ways.
This synthesis of the physical and the digital is mirrored in the film’s soundtrack, too. In his book on Videodrome’s production, Tim Lucas calls Howard Shore’s score “bio-electronic” because it was written, programmed into a synthesizer, and played back on a computer in a recording studio while live strings played along. Early in the film, the score is mostly those strings, but as time passes the electronic synthesizers creep up in the mix, forming the bio-electronic synthesis.
The most fitting example of techno-human union in Videodrome is the famous scene of Max inserting his head into a breathing, moaning, begging video screen; somewhere between erotic and hilarious, media and humanity coalesce. There isn’t a person and then an avatar, a real world and then an Internet. They’re merged. As theorists like Katherine Hayles have long taught, technology, society, and the self have always been intertwined. Videodrome knows this, and it shows us with that headfirst dive into the screen—to say nothing of media being inserted directly into a vaginal opening in Max’s stomach, or the gun growing into his hand.

Thirty years after its release, Videodrome remains the most powerful fictional representation of technology-self synthesis. This merger wasn’t invented with the Internet, or even television. Humans and technology have always been co-implicated. We often forget this when talking about the Web, selling ourselves instead a naive picture of defined “virtual” spaces which somehow lack the components of “real” reality. This is why The Matrix and “cyberspace” have long outworn their welcome as a frame for understanding the Internet. It should be of no surprise that body horror is as useful for understanding social media as cyberpunk.

Counterculture: The Rebel Commodity


By James Curcio

Source: Rebel News

Let’s talk about being a rebel.

Everyone seems to want to be one. But it’s not entirely clear what it means. Does it take camo pants? A Che T-shirt? A guitar? Is it just doing the opposite of whatever your parents did? “Be an individual, a rebel, innovate,” so many advertisements whisper. They’d have us believe that True Revolutionaries think different. They use Apple, or drink Coke. We signal our dissent to one another with the music we listen to and the cars we drive.

There’s something very peculiar going on here, something elusive and deeply contentious.

In the 1997 book Commodify Your Dissent, Thomas Frank laid out a thesis that may seem like common sense to those who have watched or lived in the commodified subcultures of the 90s, 00s, and beyond. A New York Times review comments:

… business culture and the counterculture today are essentially one and the same thing. Corporations cleverly employ the slogans and imagery of rebellion to market their products, thereby (a) seizing a language that ever connotes “new” and “different,” two key words in marketing, and (b) coaxing the young effortlessly into the capitalist order, where they will be so content with the stylishly packaged and annually updated goods signifying nonconformity they’ll never so much as consider real dissent — dissent against what Frank sees as the concentrated economic power of the “Culture Trust,” those telecommunications and entertainment giants who, he believes, “fabricate the materials with which the world thinks.” To have suffered the calculated pseudo-transgressions of Madonna or Calvin Klein, to have winced at the Nike commercial in which the Beatles’ “Revolution” serves as a jingle, is to sense Frank is on to something. (After reading Frank, in fact, you’ll have a hard time using words like “revolution” or “rebel” ever again, at least without quotation marks.)

The urge to rebel fuels the very system it ostensibly opposes. Whether it’s the arms trade or, far less ominously, manners of dress and behavior, there are dollars to be made fighting “The Man.” And maybe making money isn’t always an altogether bad thing. But it is certainly a complication, especially for those espousing neo-Marxist ideals.

As Guy Debord observed, “revolutionary theory is now the enemy of all revolutionary ideology and knows it.” Rebel movements are a counterculture, regardless of what they call themselves.

Rebellion is Cool

We’ll begin with a quintessential icon of the branded, shiny counterculture: The Matrix. We’ve probably all seen it. Even as an example it’s a cliché, and that’s part of the point. Here’s a quick sketch of the first movie, for those who haven’t: when it first ran, it was a slick take on the alienation most suburban American youth feel, packaged within the context of the epistemological skepticism Descartes wrestled with in the 17th century. Taken out of the cubicle and into the underworld, the protagonist “keeps it real” by eating mush, donning co-opted fetish fashion, and fighting an army of identical men in business suits in slow motion. The movie superimposes the oligarchic and imperialist powers-that-be atop Neo’s quest of adolescent self-mastery. A successful piece of marketing — you can be sure no one collecting profits or licensing deals let their misgivings about ‘the Man’ keep them from paying the rent.

This is not to point an accusatory finger, but rather to show the essential dependence of the counterculture upon the mainstream, because counter-cultures are not self-sustaining, and every culture produces a counter-culture in its shadow, just as every self produces an other. Any counterculture. Punk, mod, beatnik, romantic, hippy, psychedelic, straight edge, or occult. Even the early adopters of Internet culture started as a group of outsiders who shared a collective vision:

The computer enthusiasts who could only dream of an open, global network in 1990 would go on to staff the dot-coms of the next decade. The closed networks that once guarded forbidden knowledge quickly fell by the wayside, and curiosity about computers could no longer be imagined a crime.

Our cyberspace today has its share of problems, but it is no dystopia — and for that, we must acknowledge the key part played by the messy collision of table-top games, computer hacking, law enforcement overreach and cyberpunk science fiction in 1990.

The article quoted above explores the strange history of Steve Jackson Games, TSR, and the FBI. But it wasn’t the only one. Shadowrun, another popular cyberpunk RPG of the 1990s, presented one of the more seemingly improbable cyberpunk futures, in which you could play a freelancing mutant scrambling to survive in an ecosystem of headless corporations connected through cyberspace. Sound familiar? The Matrix just represented the final translation of these and similar fringe narratives into the mainstream.

Future visions have some effect on future reality, both in the identities we imagine for ourselves and in the technologies we choose to explore. They almost always have unexpected consequences. Now we carry the networked planet in our palms, granting near-instant communication with anyone, anywhere, anytime, and our intended audience isn’t always the only one listening.

We shouldn’t be surprised by this feedback loop. Without laying the material, mythic, and social groundwork for a new society, counterculture cannot be a bridge; it almost invariably leads back to the mainstream, though not necessarily without first making its mark and pushing some new envelope.

This even presents something of a false dichotomy — that old models of business can’t themselves be co-opted by countercultural myths. Yesterday’s counterculture is today’s mainstream. What better way to understand the so-called revolution of iPads or social media?

Our cultural symbols and signifiers are never static. Psychedelic and straight edge can share the same rack in a store if the store owner can co-brand the fashions, and people can brand themselves “green” through their purchasing power without ever leaving those boxes or worrying about the big picture. Adbusters’ Buy Nothing Day still capitalizes on the “rebel dollar.”

Rebellion is cool. “Cool” is what customers pay a premium for, along with the comfort of a world with easy definitions and pre-packaged cultural rebellions. This process itself isn’t new. The rebel or nonconformist is probably a constitutive feature of the American imagination: the original colonies were religious nonconformists, the country was founded by rebellion, and then came the frontier, the Civil War, the swinging ’20s, jazz, James Dean, John Wayne, Elvis; the list goes on. The nonconformist imagination is as paradoxically and problematically American as cowboys and Indians, apple pie and racism.

The territory between aesthetics, ideals, and social movements is blurry at best. But the best-known expression of this trend in recent history is the now somewhat idealized 1960s, a clear view of which has been obscured by a haze of pot smoke and partisan politics. Though this revolution certainly didn’t start in the 1960s, there we have one of the clearest instances of what good bedfellows mass advertising and manufacturing make when branded under the zeitgeist of the counterculture.

When people bought those hip clothes to make a statement, whose pockets were they lining? It’s a revolving door of product tie-ins, and it all feeds on the needs of the individual, embodied in a sub-culture. The moment that psychedelic culture gained a certain momentum, Madison Avenue chewed it up and spit it back out in 7up ads. That interpretation of what it meant to be a hippie, a revolutionary, became an influence on the next generation. The rise of Rolling Stone magazine could also be seen as an example of this — a counterculture upstart turned mainstream institution.

While advertising and counterculture get along just fine, authenticity and profit often make strange bedfellows. But they aren’t necessarily diametric opposites, either. As movements gain momentum, they present a market, and markets are essentially agnostic when it comes to ideals.

There are many examples of how troubled that relationship can be. The Grunge movement in the 90s, before it was discovered, was just a bunch of poor ass kids playing broken ass instruments in the Pacific Northwest. This was the very reason it struck disenfranchised youth — the relationship between those acts and the aging record industry in many ways seemed to reflect the relationship of adolescent Gen Xers with their Boomer parents. They retained the desire to “drop out,” as Timothy Leary had preached to the previous counterculture generation of Laguna Beach and Haight Ashbury, but without the mystical optimism of “tuning in.” Hunter S Thompson maybe presaged this transition in the quotation from Fear and Loathing In Las Vegas that’s now rendered famous to the kids of 90s thanks to Terry Gilliam’s film adaptation,

There are many examples of how troubled that relationship can be. The Grunge movement in the 90s, before it was discovered, was just a bunch of poor-ass kids playing broken-ass instruments in the Pacific Northwest. This was the very reason it struck a chord with disenfranchised youth — the relationship between those acts and the aging record industry in many ways seemed to reflect the relationship of adolescent Gen Xers with their Boomer parents. They retained the desire to “drop out,” as Timothy Leary had preached to the previous counterculture generation of Laguna Beach and Haight-Ashbury, but without the mystical optimism of “tuning in.” Hunter S. Thompson may have presaged this transition in the passage from Fear and Loathing in Las Vegas that Terry Gilliam’s film adaptation made famous to the kids of the 90s:

We are all wired into a survival trip now. No more of the speed that fueled the 60s. That was the fatal flaw in Tim Leary’s trip. He crashed around America selling “consciousness expansion” without ever giving a thought to the grim meat-hook realities that were lying in wait for all the people who took him seriously… All those pathetically eager acid freaks who thought they could buy Peace and Understanding for three bucks a hit. But their loss and failure is ours too. What Leary took down with him was the central illusion of a whole life-style that he helped create… a generation of permanent cripples, failed seekers, who never understood the essential old-mystic fallacy of the Acid Culture: the desperate assumption that somebody… or at least some force – is tending the light at the end of the tunnel.

I don’t think it’s a great stretch to imagine the suddenly famous bands of the Grunge era as part of this same legacy. Alice In Chains or Nirvana songs about dying drugged out and alone weren’t oracular prophecy; they were journal entries. And it became part of the allure, because it too was “authentic.” The greatest irony of all was that the tragic meltdowns and burnouts that followed on fame’s heels became part of the commodity. (Not that this vulture economy is new to tabloids.)

Our narratives about authentic moments of aesthetic expression or innovation often depict them like volcanic eruptions: they build up and acquire force in subterranean and occluded environments, before erupting in a momentary and spectacular public display of creativity. It is telling that this quote from On The Road has become so popular, very likely cited in the papers and journals of more rebellion-minded American teens than any other from that book, “… The only people for me are the mad ones, the ones who are mad to live, mad to talk, mad to be saved, desirous of everything at the same time, the ones who never yawn or say a commonplace thing, but burn, burn, burn like fabulous yellow roman candles exploding like spiders across the stars.”

Hendrix, Joplin, Morrison, Cobain… the 27 Club is big. And quite a few more could be added if it were “the 20-something club.” Are the public self-destructions of so many young, creative minds informed by this myth, or do they create it?

Maybe a bit of both. The Spectacle, in the sense Guy Debord uses it, disseminates its sensibilities, styles — a version of the truth. The particular moves ever toward the general, as facts gradually turn to legend and, eventually, myth. Mainstream appropriation is the process in which aesthetic movements affect broader society and culture. The ideals need a pulpit to reach the people, even if invariably it is fitted with guillotines for the early adopters once that message has been heard.

YOUR FATASS DIRTY DOLLAR

A message is a commodity, or it is obscure. Capitalism survives so well, in part, because it adapts to any message. If we instead think counterculture is an ideal that exists somehow apart from plebeian needs like making money, then counterculture will forever hobble itself. It doesn’t matter that these ideologies have little in common. It is the fashion or mystique that gets sold. Anti-corporate ideology sells as well as pro-. When all an ideology really boils down to is an easy-to-replicate aesthetic, how could it not?

Where do we draw the line between idealism and profit? The question is how individuals utilize or leverage the potential energy represented by that currency, and to what ends it is applied. Hard-nosed books on business by the old guard, such as Drucker’s Management: Tasks, Responsibilities, Practices, say exactly the same thing in a less epigrammatic, Yoda-like way: profit is not a motive, it is a means. This much, at least, doesn’t change with the changing of the (sub)cultural tides. Within our present economic paradigm, without profit, nothing happens. Game over.

Those who position themselves as extreme radicals within the counterculture framework just disenfranchise themselves through an act of inept transference, finding anything with a dollar sign on it questionable. On this view, anyone who’s made a red cent off their work is somehow morally bankrupt. This mentality generally ends one way: howling after the piece of meat on the end of someone else’s string, working by day for a major corporation, covering their self-loathing at night with tattoos and body modifications they can hide. That is, unless they lock themselves in a cave or try to start an agrarian commune. None of this posturing is in any way necessary, since business rhetoric itself has long since co-opted the countercultural message. Take, for instance, this passage from Commodify Your Dissent:

Dropping Naked Lunch and picking up Thriving on Chaos, the groundbreaking 1987 management text by Tom Peters, the most popular business writer of the past decade, one finds more philosophical similarities than one would expect from two manifestos of, respectively, dissident culture and business culture. If anything, Peters’ celebration of disorder is, by virtue of its hard statistics, bleaker and more nightmarish than Burroughs’. For this popular lecturer on such once-blithe topics as competitiveness and pop psychology there is nothing, absolutely nothing, that is certain. His world is one in which the corporate wisdom of the past is meaningless, established customs are ridiculous, and “rules” are some sort of curse, a remnant of the foolish fifties that exist to be defied, not obeyed. We live in what Peters calls “A World Turned Upside Down,” in which whirl is king and, in order to survive, businesses must eventually embrace Peters’ universal solution: “Revolution!”

“To meet the demands of the fast-changing competitive scene,” he counsels, “we must simply learn to love change as much as we have hated it in the past.” He advises businessmen to become Robespierres of routine, to demand of their underlings, “‘What have you changed lately?’ ‘How fast are you changing?’ and ‘Are you pursuing bold enough change goals?’” “Revolution,” of course, means for Peters the same thing it did to Burroughs and Ginsberg, Presley and the Stones in their heyday: breaking rules, pissing off the suits, shocking the bean-counters: “Actively and publicly hail defiance of the rules, many of which you doubtless labored mightily to construct in the first place.”

Growth on its own is never a clear indicator that the underlying ideals of a movement will remain preserved. If history has shown anything, it is that successful movements spread until the core message becomes an empty, parroted aesthetic, as with most musical scenes and their transition from content to fashion; or the core is otherwise so emphasized that the meaning within is lost through literalism, as we can see in the history of the world’s major religions. One version of early Christian Gnostic history — of “love thy neighbor,” “all is one,” and scurrilous rumors of agape orgies — was replaced by the Roman Orthodoxy and the authority provided through the ultimate union of State and Religion. The hippies traded in their sandals and beat-up VWs for SUVs and overpriced Birkenstocks. The relationship between ideology and act is far too complicated to enter into here, but the counter-history of Communism when viewed against the backdrop of Marxist ideals is perhaps equally insightful.

Enantiodromia, the tendency of things to turn into their opposites, is as much social observation as psychological. It oftentimes seems that succeeding too well can be the greatest curse to befall a movement. When the pendulum swings far in one direction, it often turns into its opposite without having the common decency to wait to swing back the other way.

As we’ve seen, this was part of the supposed downfall of counterculture in capitalism: “suits” decided they could deconstruct an organic process and manufacture it. They could own it from the ground up.

But this isn’t necessarily so. The branding of Cirque du Soleil points toward a third option — arts movements will be dissected in the jargon of marketing, and they must succeed on those grounds to be taken seriously or accomplish anything.

Burning Man isn’t suddenly opening its gates to the wealthy. Yacht Communism has been a part of that movement ever since it gained some mainstream appeal, likely before. Seen as an arts and cultural movement, it has been vastly successful. Seen as an example of how to create a true egalitarian society, it would be an utter failure. But that was never the point.

Two weeks at Burning Man might be fun, even transformative, but spend two years there and you’d find out what hell is like.