Rope-a-Dope

By Rodney Swearengin

Source: Adbusters

During the second round of the 1974 epic boxing match billed as the Rumble in the Jungle, Muhammad Ali leaned extraordinarily far back upon the ropes as George Foreman relentlessly bludgeoned Ali’s body and arms. It looked much like the devastating beating Ali took at the hands of Joe Frazier in 1971. Foreman’s notoriously powerful punches were sure to do Ali in as he languished on the ropes round after round. But in the eighth — with Foreman’s stamina sapped — Ali got off the ropes and went on the attack, winning the bout with a knockout. He called it the “rope-a-dope.”

I feel worked over — not knowing if I can keep up the pace of the caffeine-infused all-night drift through a worldwide cataloging of every failure of imagination — large and small — the war, disease, simple stupidity, the latest meme designed to bring a smile all the way to your eyes — brought not only into your living room, but also the kitchen, the bedroom. It seems we’re always peering deep into our glowing box, trying to sort out the trouble and hop to the next possible game-changing inspiration in the incessant production-line flow of recycled mediocrity. But the troubles are never through. The work is never done. That breakthrough — that genius sabot insight — never comes.

But the metaphor of production-line work — already passé when McLuhan made us aware of so many similarly irrelevant tropes — is based on psychological responses and concepts conditioned by the former technology of the factory: mechanization. There is something comforting in the nostalgic ease with which Lucille Ball or Charlie Chaplin revealed the absurdity of Fordist efficiency, the worker as a mere appendage of the machine. Although laughable even then, that was a time in which the worker still had a genuine role to play — more than merely a cheaper option than automation. That time is gone.

I feel overworked. But I’ve never worked at the mill. I’ve never done a 12-hour stint keeping pace with cogs and conveyor belts. I’m not being overworked. I’m being worked over — as we all are — not by a clockwork mechanized pace that drives us to exhaustion — but by an alluring rhythm — a rhythm that can lull us into acquiescence while keeping us off balance — all the better mobilized for each permutation of familiar themes. We are mesmerized by the rhythm of electrostatic transmissions coded through glitches of the cybernetic network and the fragments of old media. Cycling through neoclassic postmodern motifs destructured and reformulated into predictably surprising combinations — this rhythm — this aesthetic — makes us move — and more importantly, buy. Consumers at heart, we are sucked in by the rhythm, and it incorporates us more completely than any machine ever could. Somehow thinking that we are breaking free from the autonomic conditioning of a youthful wasteland, we wait in eager anticipation for the next issue of a magazine devoted to the pure form of advertising — though in its pages there is none to be found. It makes our consumer heart skip a beat. Like Victorians who wouldn’t dare indulge in such an unsavory act — but nonetheless cannot stop talking about it — we swoon, sway and jerk with the rhythm of the spliced (dis)tasteful image juxtaposed by words of a hopeful, anxious, elliptical cant — breakdown and breakthrough.

I get the breakdown. Where’s the breakthrough? We talk and all the while we’re being worked over. And this is no massage. This is a beat down. In the expanded edition of his vintage Politics and Vision, Sheldon Wolin argued that the particular rhythm of our contemporary aesthetic has been put to expert use by the new corporate form of governance he called “inverted totalitarianism.” Perhaps Wolin really put his finger on our fatal flaw when he suggested that the “cascades of ‘critical theory’ and their postures of revolt, and the appetite for theoretical novelty, function as support rather than opposition” to capitalism, because this sort of frenetic, syncopated decentering only “encourages its rhythms.” Like a prizefighter — agile, yet made of solid, consolidated muscle — the centralized corporate entity gets in step with our fancy footwork — bobs and weaves into every new channel of communication and community, coopts every sophistication of critique, adopts the most non-hierarchical, horizontal stance of organization and deployment — moving with the rhythm — adapting the rhythm to its own purpose — waiting for the opportunity to unload its notoriously devastating punch — coming in on the trash talker of dissent — Muhammad Ali stumbling back on the ropes, body blow after wicked body blow — pummeled — worked over completely.

I don’t want to go down on the ropes. Where’s the rope-a-dope? Where’s the rope-a-dope?!

 

Saturday Matinee: Mini Doc Double Feature


“Our False Reality” and “The Lie We Live”: two recent short videos with a thematic connection. The first was produced by Aaron Dykes and Melissa Melton of TruthstreamMedia.com and explores how those who seek power and control use technology to manipulate and engineer the masses right down to the perception of reality. The second film, produced by Spencer Cathcart, offers a brief but expansive overview of systems of control with a reminder of the positive potential of communications technology.

Grooming Students for a Lifetime of Surveillance


The same technologists who protest against the NSA’s metadata collection programs are the ones profiting the most from the widespread surveillance of students.

By Jessy Irwin

Source: Model View Culture

Since 2011, billions of dollars of venture capital investment have poured into public education through private, for-profit technologies that promise to revolutionize education. Designed for the “21st century” classroom, these tools claim to remedy the many, many societal ills facing public education with artificial intelligence, machine learning, data mining, and other technological advancements.

They are also being used to track and record every move students make in the classroom, grooming students for a lifetime of surveillance and turning education into one of the most data-intensive industries on the face of the earth. The NSA has nothing on the monitoring tools that education technologists have developed to “personalize” and “adapt” learning for students in public school districts across the United States.

(Mega)data Collection + Analysis

“Adaptive”, “personalized” learning platforms are one of the most heavily funded verticals in education technology. By breaking down learning into a series of tasks, and further distilling those tasks into a series of clicks that can be measured and analyzed, companies like Knewton (which has raised $105 million in venture capital) and the recently shuttered inBloom (which raised over $100 million from the Gates Foundation) gather immense amounts of information about students into a lengthy profile containing personal information, socioeconomic status and other data that is mined for patterns and insights to improve performance. For students, these clickstreams and data trails begin when they are 5 years old, barely able to read, much less type, the usernames and passwords required to access their online learning portals.

Data collection and number crunching aren’t the only approaches being explored to revolutionize education — technology billionaire and philanthropist Bill Gates funded a $1.1 million project to fit middle-school students with biometric sensors to monitor their response and engagement levels during lessons, and advocated a $5 billion program to install video cameras in every classroom to record teachers for evaluation.

The Family Educational Rights and Privacy Act, a law put in place in 1974 to protect student academic records, does nothing to protect student data once it is in the hands of education technology companies. Instead, FERPA threatens to take federal funding away from schools that are found to have breached student privacy, while it fails to mandate bare-minimum security standards for the storage and transmission of student data. In fact, a recent revision of FERPA increased the power that companies have to collect and mine student data. Though lawmakers and privacy advocates are regularly outraged at the immense volume of student data freely floating through the web, the repeated failure to create legislation that protects student data from being used for profit is astounding.

One thing is clear: those who have the power to protect student privacy will not do so as long as they can continue to subsidize the cost of public education with student data.

Internet Censorship in Schools

In most educational institutions, the vast majority of IT operations are focused on monitoring, filtering and blocking web traffic instead of building secure networks that safeguard student records and sensitive behavioral data. Nowhere is this more apparent than in the widespread adoption of web filtering software tools in K-12 schools. Usage of these technologies is required for compliance with programs like E-Rate, which grant federal money to schools to fund internet access for their students.

To be eligible for funding from the E-Rate program, schools are required to comply with federal regulations that ban access to websites displaying pornography, graphic material, or any other content that could be judged immoral, improper or lewd. More often than not, these subjective criteria are determined by the opinions and belief systems of school administrators under political pressure to deny students access to content on controversial issues like evolution, birth control and sex education. These decisions disproportionately affect young girls and LGBTQ students by denying them access to sites that provide important information about their rights, their developing bodies, their sexuality and their access to contraceptives. In the case of Securly, the first filtering tool designed for schools, the controls set by IT and administration for web access can extend far beyond the walls of the school and determine what content students can access while using school-issued machines from their home internet connections.

Despite the many positive contributions of the internet in the distribution and dissemination of knowledge across the planet, students are regularly denied access to valuable information that could positively impact their learning… all to safeguard a small percentage of federal budget money granted to their schools. The implications of this are particularly severe for low-income students who do not have access to the Internet at home; without the ability to freely access the web on their own terms, their digital literacy skills lag behind those of their affluent peers. Though teachers request better and broader internet access for students in their classrooms, administrator-imposed blocks and filters on school internet leave most students woefully unprepared to navigate the realities of the web. When students do find a way around the tools used to limit their access to the outside world (this happened with a group of students who were given iPads in the Los Angeles Unified School District last year), they’re labeled as “hackers” or miscreants, and disciplined for using Tor, a tool popular among students for anonymous web browsing and circumventing blacklists that ban websites from school networks.

Social Media Surveillance

Schools are adopting many other surveillance technologies with unprecedented reach into the private communications and lives of students and their families. In Lower Merion, PA, a suburb outside of Philadelphia, educators engaged remote administration tools on students’ laptops to regularly spy on their activities while at home. In a case that made its way into federal courts, one student was punished by administrators who mistook candy pictured through his laptop’s camera for drugs. While the full extent of the spying was never exposed, parents and students have expressed concern about educators having the ability to watch young girls undress in the privacy of their homes, unaware that they were being watched through their school-issued computers.

In 2013, the Glendale Unified School District in Glendale, CA took a move straight from the NSA surveillance handbook by seeking out a $40,000 contract with Geo Listening, a social media monitoring company that charges schools to eavesdrop on student social media chatter. While the company claims to only access posts that are public in the school districts it works with, and says it works closely with school districts to tailor their monitoring programs to prevent cyberbullying, suicide and active shooter incidents, it is very easy — too easy, in fact — to use such technologies to identify and target students who have been labeled deviant or delinquent within their communities, or who are otherwise outspoken and critical of their teachers and schools.

Schools are also demanding access to students’ social media communications in ways that severely harm their constitutionally protected rights to free speech. In Minnewaska, MN, a female student who complained about a hall monitor’s behavior in a Facebook post was questioned and given in-school suspension. Later, when a parent reported the student for “sexting” over Facebook with a classmate, she was removed from class again as a group of educators and a police officer armed with a taser demanded that the student hand over her password. They then read private communications that took place outside of school through her Facebook account. After the student was pulled from class multiple times, suspended from school, and barred from attending a school field trip (the same punishment was not doled out to the male student involved in the messaging), the ACLU stepped in to defend her right to privacy and free speech in communications outside of school property. Though the ruling in the case upheld students’ protection under the 1st and 4th Amendments, school districts around the country continue to demand access to students’ social media accounts and threaten to mark students’ academic records in ways that make it difficult to get into a desired university or to seek other avenues for continued education.

Physical Surveillance

In addition to the online monitoring taking place in schools, there are many surveillance mechanisms in place to enforce physical security in public schools. Since the shootings that took place at Virginia Tech in 2007, and again after those that took place in Sandy Hook, CT in 2012, technology companies have launched myriad tools designed to minimize the potential loss of life in the next active shooting incident at a school.

By preying on the absolute worst fears of administrators and parents across the country, technology companies are earning millions of dollars selling security “solutions” that do not accurately address the threats these tools claim to dispel. School districts that purchase these systems further perpetuate the farce of security theater and infringe on students’ rights to privacy and individual freedom.

A Lifetime of Surveillance

When we develop and use educational technologies that monitor a student’s every moment in school and online, we groom that student for a lifetime of surveillance from the NSA, from data brokers, from advertisers, marketers, and even CCTV cameras. By watching every move that students make while learning, we model to students that we do not trust them — that ultimately, their every move will be under scrutiny from others. When students recognize that they are being watched, they begin to act differently — and from that very moment they begin to cede one small bit of freedom at a time.

Though the education technology revolution continually promises a silver bullet that will be a great democratizing force for all of society’s ills, it categorically disregards the patriarchal power structures and biases that both legitimate and perpetuate discrimination against minorities and marginalized groups. Despite it being well within the scope of educational technology tools to track, identify and expose biases towards groups of students, technologists avoid implementing small changes that monitor educator performance and correct for unconscious biases that negatively affect student learning. Because the surveillance taking place in schools is typically based on qualitative criteria like morality, appropriateness and good behavior, these technologies extend current practices and prejudices that perpetuate injustices against marginalized groups.

There are few to no safeguards built into the online and offline monitoring systems to protect students from the abuse of these tools. Young female students who are active on social media can be unfairly targeted, slut-shamed and disciplined for suggestive language that takes place outside of school, while their male counterparts are not held equally accountable for participating in sexually charged online conversations. Youth of color, a group that is disproportionately stereotyped as angry, aggressive, and unpredictable by educators, can easily be monitored, disciplined, and entered into the juvenile justice system for any outburst that could vaguely be misinterpreted as a threat to a homogeneous Caucasian school culture. Any student grappling with issues of abuse, depression, disability, gender identity or sexuality could easily be discovered by online surveillance tools, stigmatized and outed to their teachers, parents and wider community.

Education technologists also continue to widen the digital divide between affluent and economically oppressed. Despite an industry-wide insistence that technology is not being developed to replace educators in the classroom, many poor school districts faced with massive budget cuts are implementing experimental blended learning programs reliant on “adaptive” and “personalized” software as a way to mitigate the effect of large class sizes on student learning. This means that students who attend costly private schools or live within rich school districts that can afford to employ more educators and maintain smaller class sizes receive much more personalized instruction from their teachers. Instead of receiving much-needed interaction and personalized learning directly from educators, poor students living in disadvantaged communities receive instruction from educational software that collects their data (which is likely to be sold), and have less individual instruction time from teachers than their affluent counterparts.

By developing technologies that collect, track, record, and analyze every move a student makes both online and off, technologists, investors and educators are ensuring that today’s students will have less privacy than any generation that came before them, threatening to make privacy and anonymity unattainable for future generations. Though the surveillance mechanisms at play in education technologies affect the privacy of millions of students who pass through the education system each year, this system is a profound, persistent threat to the privacy and individual liberty of LGBTQ students, low-income students, and students of color, who have already been so severely failed by the status quo.

Ironically, the same technologists and investors who protest against the NSA’s metadata collection programs are the ones profiting the most from the widespread surveillance of students across the country, by building educational tools with the same function.

DATAcide: The Total Annihilation of Life as We Know It


By Douglas Haddow

Source: Adbusters

“So tell me, why did you leave your last job?” he asks.

The first thing I remember about the internet was the noise. That screeching howl of static blips signifying that you were, at last, online. I first heard it in the summer of ’93. We were huddled around my friend’s brand new Macintosh, palms sweaty, one of us on lookout for his mom, the others transfixed as our Webcrawler search bore fruit. An image came chugging down, inch by inch. You could hear the modem wince as it loaded, and like a hammer banging out raw pixels from the darkness beyond the screen, a grainy, low-res jpeg came into view. It was a woman and a horse.

Since then, I’ve had a complicated relationship with the internet. We all have. The noise is gone now, and its reach has grown from a network of isolated weirdos into a silent and invisible membrane that connects everything we do and say.

“I needed a bigger challenge,” I say. This is a lie.

The brewpub we’re in has freshly painted white walls and a polished concrete floor, 20 ft ceilings and dangling lightbulbs. It could double as a minimalist porn set, or perhaps a rendition chamber. Concrete is easy to clean. The table we’re at is long and communal. Whenever someone’s smartphone vibrates we all feel it through the wood, and we’re feeling it every second minute — a look of misery slicing across my face when I realize it’s not mine.

“Tell me about your ideal process,” the guy sitting down the table from us says. My eyes strain sideways. He looks to be about thirty; we all do. Like a young Jeff Bezos, his skin is the color of fresh milk. He’s dressed like a Stasi agent trying to blend in at a disco. Textbook Zuckercore: a collared blue-green plaid shirt unbuttoned with a subdued grey-on-grey graphic tee, blue jeans and sneakers. Functional sneakers. Tech sneakers. This is a tech bar. Frequented by tech people who do tech things. The park down the street is now a tech park. That’s where the tech types gather to broadcast their whimsy and play inclusive non-sports like Quidditch, which, I’m told, is something actual people actually do. It’s a nerd paradise where the only problems that exist are the ones that you’re inspired to solve. And I want in on it, because I want to believe.

“I’m a big fan of social,” I blurt out as an aside. He replies with a calm and ministerial nod. Nobody says “social media” anymore, it’s just “social” now.

My atoms are sitting here drinking a beer, being interviewed for a position at a firm that specializes in online brand management systems. Which is a euphemism for a human centipede of marketers selling marketing to marketers for marketing. The firm is worth a billion dollars. You’ve never heard of it. It’s the type of place where they force you to play ping-pong if you come in looking depressed. Meet the new boss, same as the old boss, except this one is very concerned that you see him as a positive force in the universe.

I’m here, bringing the cold beer to my dry lips and bobbing my head in my best impersonation of someone who doesn’t feel ill when he hears the words “key metrics,” “familiarity,” “control groups” and “variant groups.” It’s the dawn of the new creative economy, and I can dig it. I’m here, but I’m also spread across the internet in a series of containers. I’m in Facebook, I’m in Instagram, I’m in Google, I’m in Twitter and a thousand other places I never knew existed. Depending on how my body is disposed of, it will either become dirt or atmosphere. But the digital atoms will live forever, or at least until civilization is incinerated by whatever means we choose to off ourselves.

“What about this position interests you?” he asks.

When the TechCrunchers preach the gospel of disruption, it’s from an industrial perspective that sees life on Earth as a series of business models to be upended. Disrupt or die is the motto, but they never mention the disruptees — the travel agents, the cab drivers, the bellhops. The journalists. The meat in the box before the box is crushed by the anvil of innovation.

“People have ideas about things but it’s a bunch of things. Sign up flow for example, high level things, but sometimes I think — let’s table this for now and put together some idea maps. I feel so empowered because we’re aligned,” someone else says. I look around but can’t trace the source.

It’s hard to focus on his questions when all the conversations occurring parallel to ours combine in a cacophony of sameness, as if we’re all Tedtalking a mantra of ancient buzzwords: Engagement. Intuitive. Connection. User base. Revolutionary. It’s like coke talk gone sour, not words that are meant to say things, but stale semiotics that signify you belong. This is the new language of business. This is where Wall Street goes to find itself.

“I traded in my suit for khakis and sunglasses,” one of them says. But he’s wearing neither. “That’s the best decision you’ve ever made bro,” his colleague replies.

These are the most boring people on the planet. And it’s their world now, we’re just supplying the data for it. The game is simple: dump venture capital into a concept, get the eyeballs, take the data and profit. But the implications of this crude scheme are profound. Beyond all the hype, something weird is happening.

I can’t eat without instagramming my food. I can’t shit without playing Candy Crush. I can’t even remember who half the people are on my Facebook feed, but I’ll still mindlessly scroll through their tedious status updates and wince at their tacky wedding photos. Out of these aimless swipes, clicks and likes, a new world is being born. A world where everything we do, no matter how inane, is tracked, recorded, sorted and analyzed. Yahoo CEO Marissa Mayer has said the whole process is “like watching the planet develop a nervous system.” And through this system, every human action has become a potential source of profit for our data lords, a signal for them to identify and exploit.

“We are about to enter a world that is half digital and half physical, and without properly noticing, we’ve become half bits and half atoms. These bits are now an integral part of our identity, and we don’t own them,” says Hannes Grassegger.

Grassegger is a German economics journalist who was raised in front of his mom’s Macintosh, and later, on a Commodore 64 he got for his sixth birthday. He recently wrote “Das Kapital bin ich” (I am Capital), a book that has been criticized by the European left for being too capitalist, and by the right for being the communist manifesto of the digital era. In it he tries to answer a deceptively simple question: if our data is the oil of the 21st century, then why aren’t we all sheikhs?

“We’ve all been sharing. But the smart ones have been collecting — and they’ve packed us into their clouds,” he says. “Privacy. Transparency. Surveillance. Security gap. I don’t want to hear about it. These are sloppy downplayings of a radical new condition: We don’t own ourselves any more. We are digital serfs.”

Like Grassegger, and like everybody else, I was lured into this radical new condition with the feel-good promises of connection, friendship and self-expression. Apps, sites and services that allowed us to share what we loved, and do what we wanted. For Grassegger, these platforms were merely fresh lots ready to be ploughed, and in turn they kept the harvest: our feelings, thoughts, experiences and emotions, encoded in letters and numbers. Now they’re putting it all to work, exploiting these assets with algorithms and sentiment analysis, and our virtual souls are toiling even while we sleep.

His solution to this dilemma is pragmatic, siding with the lesser evil of establishing a personalized free data market, which would allow us to exploit our information before others do it for us. He argues that “We must carry into the new space those rights and freedoms we eked out in the physical world centuries ago. The ownership over ourselves and the freedom to employ this property for our own benefit. Only this will help us leave behind our self-imposed digital immaturity.”

“KRRAAAAASHH!”

A waitress lets a pint glass slip from her hand and shatter on the floor, but no one bothers to look over; they’re too engaged. Then I notice something eerie about the vibe in this place. There’s no sneering, no sarcasm, and no self-deprecation. Everyone is just sort of floating along in an earnest tranquility. As if each anecdote about “that cool loft I found on Airbnb” contained some deep spiritual significance beyond my grasp.

My interrogator goes for a piss and I load up Facebook in the interim, hoping to find a shard of inspiration in my feed that will provide a topical talking point. Instead I find a listicle. A curiosity gap headline. An ad. A solicitation. Another ad. Another listicle. Oh dear, someone has lost their phone. And finally, an ad in the form of a listicle. Or is it a listicle in the form of an ad?

We were told to surf the web, but in the end, the web serf’d us. Yet there’s a worse fate than digital serfdom, as Snowden’s ongoing NSA revelations suggest. This isn’t simply about the commodification of all human kinesis, it’s the psychological colonialism that makes the commodification possible.

The nature of this bad trip was hinted at in June when we learned that Facebook manipulated the emotional states of nearly 700,000 of its users. Half of those chosen for the study were fed positivity, the others, despair. “The results show emotional contagion,” the Facebook scientists told us, meaning that they had discovered that alternating between positive and negative stimulus does indeed affect our behaviour. Or perhaps rediscovered. There’s a precedent for this. We’ve been here before.

Burrhus Frederic Skinner, known simply as B.F. to his BFFs, is best known as the psychologist with the painfully large forehead who tried to convince the world that free will was an illusion. But he wasn’t always so dire. He was once a young man with hopes and dreams who wrote poems and sonnets and wanted to become a stream-of-consciousness novelist like his idol, Marcel Proust. He failed miserably and it led him to conclude that he wasn’t capable of writing anything of interest because he had nothing to say. Frustrated and bitter, he resolved that literature was irrelevant and it should be destroyed, and that psychology was the true art form of the 20th century. So he went to Harvard and developed the concept of operant conditioning by putting a rat in a cage and manipulating its behaviour by alternating positive and negative stimulus. Now we’re the rats in the cage, only we don’t know where the cage ends and where it begins.

“What’s your five year vision for social?” he asks.

There’s a right way and a wrong way to answer this question. The wrong way is to be critical and cast scepticism on the internet’s role in our lives. For instance, you could draw a parallel between Facebook’s probing of emotional contagion and the Pentagon’s ongoing research into how to quash dissent and manage social unrest. Or you could mention how the Internet of Things will inevitably consolidate corporate power over our personal liberty unless we implement strict regulations on what part of ourselves can and cannot be quantified. But if you did that, you’d upset the prevailing good vibes and come off like a sickly paranoiac in desperate need of some likes.

The right way is to turn off, buy in and cash out. Reinforce the grand narrative and talk about how social is going to bring people together, not just online, but in the real world. How it will augment our interactions and make us more open. How in five years you’ll be able to meet your true love through an algorithm that correlates your iTunes activity to your medical history and how that algorithm will be worth a billion fucking dollars. And it’s through that magical cloud of squandered human potential that Skinner emerges once again and starts poking his finger into your brain.

After establishing himself as a household name, Skinner was finally able to live out his dream of writing a novel. That novel was Walden Two, a story about a utopian commune where people live a creative and harmonious life in accordance with the principles of radical behaviourism. In contrast to 1984 and Brave New World, it was meant to be a positive portrayal of a technologically-enabled utopian ideal. In it he writes, “The majority of people don’t want to plan. They want to be free of the responsibility of planning. What they ask for is merely some assurance that they will be decently provided for. The rest is a day-to-day enjoyment of life.”

In the late 60s, Walden Two directly inspired a series of attempts to create real-world versions of the fictional community it described. These were just a few of the thousands of communes being established across America at that time. Some thrived, but the majority fell apart within a couple of years. They failed for a number of reasons: latrines overflowed, the tofu supply ran out, the livestock starved to death and so forth. But what many of them had in common was a cascading systems failure of their foundational hypothesis — that social change could be achieved through self-transformation and the problems of power could be solved simply by ignoring them. There was always a Machiavellian in the transformational mist, though, and a refusal to acknowledge outright how power creates invisible structures that undermine the potential for cooperative action ultimately led to their implosion. It’s in this stale pub, with its complimentary WiFi and overpriced organic popcorn, that those invisible power structures continue to thrive.

“There has to be incentive. There has to be. You can’t force people to use it,” a woman in the corner mutters. She’s among a cluster of people who for some reason are all carrying the same cheap, ugly backpack. Her hand gestures become more aggressive as the conversation progresses and she looks to be caught in a moment midway between panic and ecstasy. Her expression would make the perfect emoji for the inertia of our time. It looks sort of like this: (&’Z)

“Our notions of digital utopianism are deeply rooted in a communal wing of American counter-culture from the 1960s. That group of people have had an enormous impact on how we do technology. Many of the leading figures in technology come from that wing, Steve Jobs would be one,” says Fred Turner, a communications professor at Stanford University who researches and writes about how counterculture and technology interact.

“Their ideas of what a person is and what a community should be has suffused our idealized understanding of what a virtual community can be and what a digital citizen should be. That group believed that what you had to do to save the world was to build communities of consciousness — places where you would step outside mainstream America and turn away from politics and democracy, turn away from the state, and turn instead to people like yourself and to sharing your feelings, your ideas and your information, as a way of making a new world.”

There’s a fault line that runs underneath the recycling bins of America’s abandoned hippy communes all the way to my cracked iPhone 5 screen. And if there is one man who epitomizes the breadth of this fault, it’s Stewart Brand.

In 1968, Brand published the Whole Earth Catalog, an internet before the internet that provided a directory of products for sustainable, alternative and creative lifestyles, and helped connect those who pursued them. When the Whole Earth Catalog went out of business in 1971, Brand threw a “demise party” wherein the audience got to choose who would receive the magazine’s remaining twenty grand. They chose to give it to Fred Moore, an activist moonlighting as a dishwasher, who would go on to found the Homebrew Computer Club — the birthplace of Apple and the PC. In the 80s Brand launched The Whole Earth ’Lectronic Link, one of the world’s first virtual communities. Following its success, he started the Global Business Network — a think tank to shape the future of the world. They’ve worked on “navigating social uncertainty” with corporations like Shell Oil & AT&T, among others. In 2000, GBN was bought by Monitor Group, a consultancy firm that made headlines in 2011 by earning millions of dollars from the Libyan Government to manage and enhance the global profile of Muammar Gaddafi.

Brand’s most enduring legacy will likely come from coining the phrase “information wants to be free,” which serves as the business model for the Actually Existing Internet and the Big Data dream.

Looking around the brewpub, listening to the chatter, and staring into the bright blue eyes of my would-be employer, you can almost hear the words of Google CEO Eric Schmidt echo against the minimalist decor: “We know where you are. We know where you’ve been. We can more or less guess what you’re thinking about.”

In San Francisco, my fellow disruptees have taken to the streets and kicked off a proper brick-and-bottle backlash against this sort of dictator-grade hubris that has come to define the Internet of Kings. Crude graffiti reading “DIE TECHIE SCUM” is scrawled on the sidewalk next to Googlebus blockades. TECH = DEATH signs are held up at protests. Tires are slashed, windows are smashed and #techhatecrimes is a hashtag being passed around Silicon Valley without a hint of irony.

Just down the street from where I’m sitting, a more passive form of protest has manifested in the form of a new café that promises an escape from the incessant blips and bleeps of the internet and its accoutrements. The tables there are also long and communal, but they’re wrapped inside an aluminum mesh designed to block wireless signals and WiFi.

We are not going to escape this crisis by putting ourselves in a cage. There is no opt-out anymore. You can draw the blinds, deadbolt your door, smash your smartphone, and only carry cash, but you’ll still get caught up in their all-seeing algorithmic gaze. They’ve datafied your car, your city and even your snail mail. This is not a conspiracy, it’s the status quo, and we’ve been too busy displacing our anxiety into their tidy little containers to realize what’s going on.

“Do you have any questions for me?” he finally asks, abruptly. My beer is empty, I’m thirsty for another, and the interview hasn’t gone well. I’ve failed to put on a brave face, and the only questions I have concern how much money I’m going to make. Will it be enough to pay my escalating rent now that the datarazzi have moved into the neighborhood? Or will I have to drive an Uber in my spare time to make ends meet?

The internet is a failed utopia. And we’re all trapped inside it. But I’m not willing to give up on it yet. It’s where I first discovered punk rock and anarchism. Where I learned about the I Ching and Albert Camus while downloading “Holiday in Cambodia” at 15 kbps. It’s where I first perved out on the photos of a girl I would eventually fall in love with. It’s home to me, you and everybody we know.

No, the appropriate question to ask is: “What is the purpose of my life?”

I’ve seen the best minds of my generation sucked dry by the economics of the infinite scroll. Amidst the innovation fatigue inherent to a world with more phones than people, we’ve experienced a spectacular failure of the imagination and turned the internet, likely the only thing between us and a very dark future, into little more than a glorified counting machine.

Am I data, or am I human? The truth is somewhere in between. Next time you click I AGREE on some purposefully confusing terms and conditions form, pause for a moment to interrogate the power that lies behind the code. The dream of the internet may have proven difficult to maintain, but the solution is not to dream less, but to dream harder.