Don’t Forget Why Marijuana Legalization Is Winning


By Maia Szalavitz

Source: Substance.com

When I first started writing about drugs in the mid-’80s—before I got into recovery in 1988—it was almost impossible to imagine an America where four states and DC have legalized recreational marijuana use, 58% of Florida midterm voters just cast their ballots in favor of legalizing medical use (the measure needed 60% to pass), and California passed a ballot initiative to lower drug and other nonviolent crime sentences. (Nineteen other states have legalized medical marijuana.)

The magnitude of the change is hard to understand without knowing a bit of recent history—and if we are going to continue to move toward rational drug policy, knowing where we’ve been and how it has changed is critical. I offer this perspective through the lens of my own experience covering the drug war for nearly 30 years.

My first national column was called, embarrassingly enough, “Piss Patrol.” I was assigned by High Times to write about corporate urine testing policies, starting around 1987, presumably as a service to stoned readers who were considering their employment options.

Over the next few years, the media would spill so much ink and airtime demonizing crack cocaine that by 1989, 64% of people polled by CBS News said that drugs were the country’s biggest problem—and Republicans and Democrats began tripping over one another to race to pass the harshest possible drug sentencing laws.

High Times itself was targeted by the DEA with frequent demands for its list of subscribers and raids on all of its biggest advertisers of growing supplies, nearly forcing the magazine to close.

Testifying before Congress, LAPD chief Daryl Gates said that casual drug users “ought to be taken out and shot,” and the DARE drug prevention program he founded saw nothing ominous in encouraging kids to turn their parents in to the police if they used drugs. Supreme Court Justice Thurgood Marshall warned in a prescient 1989 dissent in a urine testing case that “there is no drug exception to the Constitution,” although Congress and the rest of the legal establishment apparently begged to differ.

Even today, police can confiscate cash and property they suspect to be involved in drug crimes, without convicting the owners and with virtual impunity. The surveillance revelations about the NSA’s spying on American citizens include cases where that agency has shared information with the DEA that was gathered from phones and computers without a warrant. In fact, the DEA has an official policy of basically lying to defense attorneys—and sometimes even prosecutors and judges—about the source of this data.

Yet even before the rage to pass tough drug laws took off in the 1980s, law enforcement efforts like mandatory minimum sentences were known to be ineffective. The federal government had quietly overturned one set of mandatory drug sentences in the late ‘60s—since they had clearly failed to prevent the late ‘60s.

And New York City would never have been one of the capitals of crack if the 15-to-life “Rockefeller law” mandatory sentences for selling even powder cocaine, which had been in place here since the mid-‘70s, actually suppressed drug use.

As is clear from this brief summary, for most of my adult life, the idea of a rational drug policy seemed literally to be a pipe dream (a term, by the way, from opium dens). So how did we go, in just a few years, from seeing drug users as demon enemies in a war who must be locked up to having the drug czar drop the military language and even speak at last month’s National Harm Reduction Conference in Baltimore?

Many factors are clearly playing a role. Two of the most obvious are the sheer economic burden of having become the world’s most prolific jailer and the drop in violent crime that hasn’t been paralleled by a fall in addiction rates or a reduction in the availability of drugs like marijuana, heroin and cocaine. Some of the crime decrease may, of course, be linked to the 500% rise in the number of prisoners since 1980—but research shows that violent crime fell more in states that have lowered incarceration rates.

Other influences have also been important. One has been the increasing recognition—driven especially by Michelle Alexander’s 2011 bestseller The New Jim Crow—of the racist nature of the drug war. Once you know this history of the drug laws, it is very hard to justify supporting them.

Another factor is the rise of the Internet. Early adopters of the net tended to be hippies and libertarians: Steve Jobs famously said that his use of LSD was one of the most important experiences of his life, for example, and pro-legalization views dominated online before the mainstream media began to realize the web was the future of its business.

This gave legalizers a loud voice—one that had been previously drowned out by a media that had so bought into the drug war that networks and newsmagazines thought nothing of taking government payments to place stories with the “correct” anti-drug slant in lieu of running paid anti-drug ads.

The Internet has also allowed critics—including me—to directly attack inaccurate coverage as it appeared, exposing readers to truthful information about drugs and drug users that was previously hard to find. It is much harder to start a panic when debunkers immediately offer alternative perspectives.

Three other important forces should also be mentioned. First, the Drug Policy Alliance—helped by large donations from billionaire George Soros—spurred activism and funded ballot initiative measures that brought marijuana policy reform out of the fringes and into the mainstream.

Second, the harm reduction movement spurred by the AIDS epidemic quietly racked up successes. As it became clear that needle exchange hadn’t resulted in a massive increase in IV drug use—but had helped halt the spread of HIV—resistance to measures like naloxone to reverse overdose was pre-empted.

The fight over needle exchange saw conservative politicians, drug treatment providers and religious leaders actively oppose expansion, claiming without data that it would encourage drug use. In contrast, it is now actually hard to find anyone who will argue that drug users and their families should not have access to the OD antidote for fear that preventing users’ deaths “sends the wrong message.”

Third, recovery activists have played a role. While there are still reactionary forces like Patrick Kennedy, many people who have come out about their own recovery have made clear that the criminal justice approach has failed. By putting a real face on drug users—not a stereotyped image of a criminal—recovering people have begun to help fight against, rather than support, their own oppression.

Of course, historically, fights for drug law reform have often resulted in backlash—marijuana was almost legalized, for example, under President Jimmy Carter, but instead we got Ronald Reagan’s war on drugs. But the strength and variety of the forces working against that possibility—particularly the rapid access to accurate information—give me hope that we may finally be starting to get drug policy right.

Maia Szalavitz is one of the nation’s leading neuroscience and addiction journalists, and a columnist at Substance.com. She has contributed to Time, the New York Times, Scientific American Mind, the Washington Post and many other publications. She has also published five books, including Help at Any Cost: How the Troubled-Teen Industry Cons Parents and Hurts Kids (Riverhead, 2006), and is currently finishing her sixth, Unbroken Brain, which examines why seeing addiction as a developmental or learning disorder can help us better understand, prevent and treat it. Her last column for Substance.com was about why it is time to reclaim the concept of “recovery” from the abstinence-only establishment.

Saturday Matinee: JFK Documentary Archive


Show notes by ConspiracyScope

The Men Who Killed Kennedy is a nine-part video documentary series about the John F. Kennedy assassination by Nigel Turner. It began with two 50-minute segments, titled simply Part One and Part Two, originally aired on 25 October 1988 in the United Kingdom. The programmes were produced by Central Television for the ITV network and were immediately followed by a studio discussion of the issues, The Story Continues, chaired by broadcaster Peter Sissons. The American corporation Arts & Entertainment Company purchased the rights to the original two segments. In 1989, the series was nominated for a Flaherty Documentary Award. The series was re-edited with additional material into three 50-minute programmes in 1991, which were again shown by ITV. A sixth episode appeared in 1995. The series typically aired every November, with occasional repeats during the year. But in November 2003, when the History Channel added three additional segments (“The Final Chapter”), the resulting controversy was so immense that the entire series is no longer aired, though the History Channel still sells DVD copies of the first six documentaries.


This is the mind-blowing six-part, 10-hour video documentary series Evidence of Revision, whose purpose is to present the publicly unavailable and even suppressed historical audio, video and film recordings, largely unseen by the American and world public, relating to the assassinations of the Kennedy brothers, the little-known classified “black ops” actually used to intentionally create the massive war in Vietnam, the CIA “mind control” programs and their involvement in the RFK assassination and the Jonestown massacre, and other important truths of our post-modern time.

Playlist:
http://www.youtube.com/playlist?p=PL6…

http://conspiracyscope.blogspot.com/


Cynicism, Recession, and the Resurgence of Cyberpunk


By Marshall Sandoval

Source: PopMatters


Cyberpunk has seen a recent resurgence in video games. Seemingly every game developer working today has a William Gibson book tucked under their arm or follows @swiftonsecurity (a satirical Twitter account that imagines a Taylor Swift consumed with cyber security). Cyberpunk video games are pervasive, including cyberpunk game jam projects on itch.io, Twine games, indie titles, and major AAA releases. All of these projects embrace cyberpunk themes and aesthetics. Observers credit the current trend to a number of cyclical and cultural factors. After talking to the indie developers behind a number of exciting cyberpunk titles at the center of this resurgence, I believe that the creators of these games are overwhelmingly inspired by the headlines in today’s newspapers.

It seems like no coincidence that these games have all appeared in a short time period following the economic recession. On the most basic level of analysis, it seems that these games may be providing a sense of escape from recent economic events. Last Life developer Sam Farmer notes, “I’m gonna go back to my film school class on Sci-Fi and Fantasy and say that it’s escapism. Horror, in general, and escapism, in particular, is often more popular in times of economic downturn, when you want to be somewhere else.”

Garrett Cooper’s Black Ice is an action game which casts the player as a hacker taking down corporate servers. Promoting the game, he’s found that cyberpunk narratives may be popular for reflecting reality as much as for providing an escape. He says, “I’ve talked to people about my game. I say, ‘All the corporations are evil.’ So they’re like, ‘Oh. So you’re talking about real life?’ I’m like, ‘No. Not exactly.’ That’s what people feel. The fantasy of being the one guy that can take something technological and turn it against the corporation.”

Games writer Austin Walker is an academic and cyberpunk superfan who sees the same throughline in these games and the literary roots of the genre. Walker says, “A key to traditional cyberpunk again and again is that there is economic inequality. We are positioning ourselves somewhere on that scale of how we feel about this stuff. Cyberpunk stories do that too. Usually they position the hero at the bottom of that; they’re usually in or near poverty.” In a time of extreme real-world inequality, cyberpunk stories locate players in a fantasy of rising up to subvert the system and taking down greedy corporations.

David Pittman’s indie project Neon Struct deals with a fictional near-future surveillance state. The game was heavily influenced by the recent leaks about actual domestic surveillance in the present day in the United States. Pittman says, “Edward Snowden’s release of NSA documents in 2013 was an essential part of the inception of Neon Struct (formerly Die Augen der Welt, or ‘The Eyes of the World’). I have strong feelings about the abuse of surveillance by the U.S. government, and I’ve known for close to a year that I wanted to make a game about it.” He’s quick to add, “Despite my own interest and leaning in the real world debate over mass surveillance, I am developing a way to introduce the story, which does not require the player character to actually leak any classified information. I don’t want to assume that the player shares my biases.” Nonetheless, it’s clear that the forthcoming project was informed by recent events.

Other examples of indie games providing commentary on and gaining inspiration from world events abound. Brigador is an isometric cyberpunk shooter with an extremely stylish trailer, and developer Jack Monahan lists a surprising influence. Monahan says, “While I’m not sure if the author would agree with the genre classification [of cyberpunk], my brother and I both read and enjoyed (and were worried by) a book called Cities Under Siege: The New Military Urbanism by Stephen Graham. Like William Gibson said, the future caught up to all of his writing, more or less. We basically are living in a dystopic future.” Notably, Monahan made these statements before the recent military-style urban clashes in Ferguson, Missouri. The aforementioned Last Life is shaped by real world advances in medicine and philosophical debates about transhumanism. Matt Conn is seeking to expand LGBT representation in the games space with the cyberpunk RPG R.O.M. He says, “Because I did GaymerX and prior to that I did a startup that was very successful and then crashed. Seeing how all that happened, I feel like I have an interesting perspective of the tech scene and the LGBT rights scene.” These varied examples show the differing events influencing today’s cyberpunk boom.

As strongly as these games are influenced by the socio-political climate, it would be reductive to say this is the only thing bringing cyberpunk back into prominence. Again, Austin Walker says, “It’s tempting to just say, ‘Oh that’s happening again. We’re getting concerned again about things like privatization and inequality.’ I think that’s part of it. I don’t know if I’d be comfortable saying, ‘This is the one reason why.’” Many developers also noted the power of nostalgia as a reason for the influx of cyberpunk games. Alex Preston, a developer behind Hyper Light Drifter, says, “I think my generation is coming into its own, creatively, and we have a fondness for these themes and ideas. A lot of us grew up with books, films, and games that touched on these themes, and it bleeds through in our creative work. I think nostalgia is a powerful force.”

Likewise, Brendan Chung, creator of ‘90s-influenced hacker game Quadrilateral Cowboy has noticed the cyclical nature of cyberpunk themes. He says, “My guess is that the people who grew up fiddling with old PC tech are now at an age where they now have the skillset and financial means to make their own games. Now that we can make games, we’re making things that harken back to one of the things that got us interested in games in the first place.” Nostalgia for ‘80s and ‘90s cyberpunk is another likely force bringing these kinds of games back to the games market.

Additionally, I kept hearing indie developers suggest their own outlook about the state of the world today is extremely bleak. Conn says, “On a more philosophical note, this is a way of writing about the future we kind of want to see. Even if it’s dystopian or dark. I think that for a lot of us, it’s very scary going into the future.” A similarly grim outlook is shared by Monahan. He says, “I think the dystopic elements of cyberpunk point to a certain cynicism that things aren’t going to get any better. Human nature might be augmented and highly channeled by technology, but human nature stays the same. And that tech might actually amplify all the worst things about us too.” Monahan also sees this cynicism in the nostalgia that drives the cyberpunk resurgence. He adds, “So much great work from the ‘80s was in a similar vein. I think of Snake Plissken’s deadpan response to news that the president’s plane has gone down: ‘President of what?’. There’s a disillusionment from the classic era of cyberpunk that makes a revival now seem fairly natural, I think.” Natural or not, the revival is in full force, and it’s becoming a strong and subversive undercurrent in the indie games space.

Grooming Students for a Lifetime of Surveillance



By Jessy Irwin

Source: Model View Culture

Since 2011, billions of dollars of venture capital investment have poured into public education through private, for-profit technologies that promise to revolutionize education. Designed for the “21st century” classroom, these tools promise to remedy the many, many societal ills facing public education with artificial intelligence, machine learning, data mining, and other technological advancements.

They are also being used to track and record every move students make in the classroom, grooming students for a lifetime of surveillance and turning education into one of the most data-intensive industries on the face of the earth. The NSA has nothing on the monitoring tools that education technologists have developed to “personalize” and “adapt” learning for students in public school districts across the United States.

(Mega)data Collection + Analysis

“Adaptive”, “personalized” learning platforms are among the most heavily funded verticals in education technology. By breaking learning down into a series of tasks, and further distilling those tasks into a series of clicks that can be measured and analyzed, companies like Knewton (which has raised $105 million in venture capital) and the recently shuttered inBloom (which raised over $100 million from the Gates Foundation) gather immense amounts of information about students into a lengthy profile containing personal information, socioeconomic status and other data that is mined for patterns and insights to improve performance. For students, these clickstreams and data trails begin when they are five years old, barely able to read, much less type, the usernames and passwords required to access their online learning portals.

Data collection and number crunching aren’t the only technologies being explored to revolutionize education: technology billionaire and philanthropist Bill Gates funded a $1.1 million project to fit middle-school students with biometric sensors to monitor their responses and engagement levels during lessons, and advocated a $5 billion program to install video cameras in every classroom to record teachers for evaluation.

The Family Educational Rights and Privacy Act, a law put in place in 1974 to protect student academic records, does nothing to protect student data once it is in the hands of education technology companies. Instead, FERPA threatens to take federal funding away from schools that are found to have breached student privacy, while failing to mandate bare-minimum security standards for the storage and transmission of student data. In fact, a recent revision of FERPA increased the power that companies have to collect and mine student data. Though lawmakers and privacy advocates are regularly outraged at the immense volume of student data freely floating through the web, the repeated failure to create legislation that protects student data from being used for profit is astounding.

One thing is clear: those who have the power to protect student privacy will not do so as long as they can continue to subsidize the cost of public education with student data.

Internet Censorship in Schools

In most educational institutions, the vast majority of IT operations are focused on monitoring, filtering and blocking web traffic instead of building secure networks that safeguard student records and sensitive behavioral data. Nowhere is this more apparent than in the widespread adoption of web filtering software tools in K-12 schools. Usage of these technologies is required for compliance with programs like E-Rate, which grant federal money to schools to fund internet access for their students.

To be eligible for funding from the E-Rate program, schools are required to comply with federal regulations that ban access to websites displaying pornography, graphic material, or any other content that could be judged immoral, improper or lewd. More often than not, these subjective criteria are determined by the opinions and belief systems of school administrators under political pressure to deny students access to content on controversial issues like evolution, birth control and sex education. These decisions disproportionately affect young girls and LGBTQ students by denying them access to sites that provide important information about their rights, their developing bodies, their sexuality and their access to contraceptives. In the case of Securly, the first filtering tool designed for schools, the controls set by IT and administration for web access can extend far beyond the walls of the school and determine what content students can access while using school-issued machines from their home internet connections.

Despite the many positive contributions of the internet in the distribution and dissemination of knowledge across the planet, students are regularly denied access to valuable information that could positively impact their learning… all to safeguard a small percentage of federal budget money granted to their schools. The implications of this are particularly severe for low-income students who do not have access to the Internet at home; without the ability to freely access the web on their own terms, their digital literacy skills lag behind those of their affluent peers. Though teachers request better and broader internet access for students in their classrooms, administrator-imposed blocks and filters on school internet leave most students woefully unprepared to navigate the realities of the web. When students do find a way around the tools used to limit their access to the outside world (this happened with a group of students who were given iPads in the Los Angeles Unified School District last year), they’re labelled as “hackers” or miscreants, and disciplined for using Tor, a tool popular among students for anonymous web browsing and circumventing blacklists that ban websites from school networks.

Social Media Surveillance

Schools are adopting many other surveillance technologies with unprecedented reach into the private communications and lives of students and their families. In Lower Merion, PA, a suburb outside of Philadelphia, educators engaged remote administration tools on students’ laptops to regularly spy on their activities while at home. In a case that made its way into federal courts, one student was punished by administrators who mistook candy pictured through his laptop’s camera for drugs. While the full extent of the spying was never exposed, parents and students have expressed concern about educators having the ability to watch young girls undress in the privacy of their homes, unaware that they were being watched through their school-issued computers.

In 2013, the Glendale Unified School District in Glendale, CA took a move straight from the NSA surveillance handbook by seeking out a $40,000 contract with Geo Listening, a social media monitoring company that charges schools to eavesdrop on student social media chatter. While the company claims to only access posts that are public in the school districts they work with, and says it works closely with school districts to tailor their monitoring programs to prevent cyberbullying, suicide and active shooter incidents, it is very easy—too easy, in fact—to use such technologies to identify and target students who have been labeled deviant or delinquent within their communities, or who are otherwise outspoken and critical of their teachers and schools.

Schools are also demanding access to students’ social media communications in ways that severely harm their constitutionally protected rights to free speech. In Minnewaska, MN, a female student who complained about a hall monitor’s behavior in a Facebook post was questioned and given in-school suspension. Later, when a parent reported the student for “sexting” over Facebook with a classmate, she was removed from class again as a group of educators and a police officer armed with a taser demanded that the student hand over her password. They then read private communications that took place outside of school through her Facebook account. After the student had been pulled from class multiple times, suspended from school, and barred from attending a school field trip (the same punishment was not doled out to the male student involved in the messaging), the ACLU stepped in to defend her right to privacy and free speech in communications outside of school property. Though the ruling in the case upheld students’ protection under the 1st and 4th amendments, school districts around the country continue to demand access to students’ social media accounts and threaten to mark students’ academic records to make it difficult to get into a desired university or to seek other avenues for continued education.

Physical Surveillance

In addition to the online monitoring taking place in schools, many surveillance mechanisms are in place to enforce physical security in public schools. Since the shootings at Virginia Tech in 2007, and again after those at Sandy Hook, CT, in 2012, technology companies have launched myriad tools designed to minimize the potential loss of life in the next active-shooter incident at a school.

By preying on the absolute worst fears of administrators and parents across the country, technology companies are earning millions of dollars selling security “solutions” that do not accurately address the threat model these tools claim to dispel. School districts that purchase these systems further perpetuate the farce of security theater and infringe on students’ rights to privacy and individual freedom.

A Lifetime of Surveillance

When we develop and use educational technologies that monitor a student’s every moment in school and online, we groom that student for a lifetime of surveillance from the NSA, from data brokers, from advertisers and marketers, and even from CCTV cameras. By watching every move students make while learning, we signal to students that we do not trust them—that ultimately, their every move will be under scrutiny from others. When students recognize that they are being watched, they begin to act differently—and from that very moment they begin to cede one small bit of freedom at a time.

Though the education technology revolution continually promises a silver bullet that will be a great democratizing force for all of society’s ills, it categorically disregards the patriarchal power structures and biases that both legitimate and perpetuate discrimination against minorities and marginalized groups. Despite it being well within the scope of educational technology tools to track, identify and expose biases towards groups of students, technologists avoid implementing small changes that monitor educator performance and correct for unconscious biases that negatively affect student learning. Because the surveillance taking place in schools is typically based on qualitative criteria like morality, appropriateness and good behavior, these technologies extend current practices and prejudices that perpetuate injustices against marginalized groups.

There are few to no safeguards built into the online and offline monitoring systems to protect students from the abuse of these tools. Young female students who are active on social media can be unfairly targeted, slut-shamed and disciplined for suggestive language that takes place outside of school, while their male counterparts are not held equally accountable for participating in sexually charged online conversations. Youth of color, a group that is disproportionately stereotyped as angry, aggressive, and unpredictable by educators, can easily be monitored, disciplined, and entered into the juvenile justice system for any outburst that could vaguely be misinterpreted as a threat to a homogeneous caucasian school culture. Any student grappling with issues of abuse, depression, disability, gender identity or sexuality could easily be discovered by online surveillance tools, stigmatized and outed to their teachers, parents and wider community.

Education technologists also continue to widen the digital divide between the affluent and the economically oppressed. Despite an industry-wide insistence that technology is not being developed to replace educators in the classroom, many poor school districts faced with massive budget cuts are implementing experimental blended learning programs reliant on “adaptive” and “personalized” software as a way to mitigate the effect of large class sizes on student learning. This means that students who attend costly private schools or live within rich school districts that can afford to employ more educators and maintain smaller class sizes receive much more personalized instruction from their teachers. Instead of receiving much-needed interaction and personalized learning directly from educators, poor students living in disadvantaged communities receive instruction from educational software that collects their data (which is likely to be sold), and have less individual instruction time from teachers than their affluent counterparts.

By developing technologies that collect, track, record and analyze every move a student makes both online and off, technologists, investors and educators are ensuring that today’s students will have less privacy than any generation that came before them, threatening to make privacy and anonymity unattainable for future generations. Though the surveillance mechanisms at play in education technologies affect the privacy of millions of students who pass through the education system each year, this system is a profound, persistent threat to the privacy and individual liberty of LGBTQ students, low-income students, and students of color, who have already been so severely failed by the status quo.

Ironically, the same technologists and investors who protest against the NSA’s metadata collection programs are the ones profiting the most from the widespread surveillance of students across the country, by building educational tools with the same function.

The Birth of the Time-Motion Human


By Dale Lately

Source: The Baffler

In a darkened room, a woman lies watched by an infra-red camera as she sleeps. It monitors her breathing, her movements, the flicker of her eyelids. Some hours later it stings her with a painful electric shock. She wakes, tumbles out of bed and into the restroom, whereupon a chip installed in her toothbrush tracks her arm movements. She’s photographed, silently, every thirty seconds. As she sets off in the morning her location is logged and data is streamed on the steps she takes. Her pulse and calorie count are recorded and sent to unseen observers. She has a dog at her side. The dog’s data is logged as well.

Such a tableau would be the envy of any futuristic dictatorship. In fact, the devices outlined above are all available on the consumer market now, for voluntary use. The impetus towards tracking our lives with smartphones, apps and stats represents a massive growth area into which companies like Jawbone, MyFitnessPal, RunKeeper, Runtastic, MapMyRun, Foodzy, GymPact, and Fitocracy are flooding. Alongside the Nike+ FuelBand, there’s the popular Fitbit Flex, a wristband that counts the steps you take by day and the number of times you stir in your sleep. There are smart cups to track what you drink and wristbands programmed to give you electric shocks for not achieving your goals. There’s even a “Fitbit for your vagina” in the form of the KGoal Smart Kegel Trainer, a Kickstarter project designed to track kegels (exercises for women’s pelvic floor muscles that improve childbirth recovery and continence) and to help users achieve a better “clench strength” via Bluetooth.

With all this biofeedback now available on our phones, the act of walking, living and breathing can—at least to the “datasexuals” who embrace it—be an ongoing project with limitless potential for improvement. But might such potential also lead to a kind of “Taylorism within”? Applying scientific management to twentieth century business created a workforce optimized for maximum efficiency. Likewise, life-tracking is encouraging us to internalize this dream by optimizing ourselves. Rather than a tool for liberation, we’re using the tech, in other words, to tune our lives for maximum “productivity.”

Perhaps none of this should seem surprising for a consumer society that runs on anxiety. If, a century ago, bad breath had to be invented as a disease that mouthwash could cure, now the Quantified Self movement suggests we must live in permanent beta, aiming not just to maintain ourselves but to become “better than well.” And so David Allen’s Getting Things Done and websites like Lifehacker help to turn our lives into a series of sanctioned tasks and goals, where one must carry a “Surprise Journal” to find areas for self-improvement in one’s life, and sleep comes in the form of “power” naps. There’s the LumoBack, a gizmo that monitors the tricky process of sitting in a chair, while the Narrative wearable camera snaps your life twice a minute. Time management lessons are now available for kids, while the iPotty seems to give toddlers the message that they shouldn’t take their eyes off a screen even when satisfying the most basic of human needs.

Silicon Valley, naturally, is more than happy to export the mantra of ongoing product optimization to our bodies: life-hacking fanatics talk of “upgrades” and “body hacks,” with often obsessive results. In a Financial Times article that marked a mainstream recognition of the movement, Tim Ferriss, author of The 4-Hour Body, claimed that he could teach people how to lose weight without exercising, work on two hours’ sleep, and have a fifteen-minute orgasm, while bio-hacker Dave Asprey was adamant that he’s made himself twenty years younger and forty IQ points smarter through life-tracking and smart pills (“I’ve rewired my brain,” he said). All of this task management can become a considerable task in itself, leading to the piling up of Catch-22 ironies, like the fact that developers are now working on smartphone apps to solve the problem of people spending too much time on their smartphones.

Luckily, some are questioning the use of intimate monitoring devices in our lives. The information asymmetry provided by the emergent “Internet of Things” may create a class of uninsurable people, while “digital Taylorism”—the tracking and tagging of workers like cattle—has been roundly criticized as it has begun to emerge at companies like Amazon. What’s disquieting about the popularization of life-tracking is the voluntary desire to become “time-motion humans,” to subject ourselves to a self-imposed surveillance state. “Track everything. Track your entire day—wherever you go,” says the website for the LumoBack. “VESSYL AUTOMATICALLY KNOWS AND TRACKS EVERYTHING YOU DRINK,” the Vessyl “smart mug” warns us in stark capitals. And once we’ve volunteered for this intimate biological scrutiny, we’re keen to publicize the results—using tools like the Withings scale, which threatens to broadcast our weight gains to our Twitter followers as “encouragement.” Self-Improvement Macht Frei.

Since the invention of the forceps we’ve been introducing machinery into our bodies to improve our lives (the aforementioned KGoal is actually based on a biofeedback device from the 1940s by Dr. Arnold Kegel), and undoubtedly many of these trackers are helping to make people healthier. But life tracking also comes from a certain ideological background, one that denigrates macro-interventions in our lives (nationalized health care) in favor of individual micro-solutionism (becoming our own gym instructors and fitness trainers).

We’re living in an entrepreneurial model of humanity, a vision of human beings as start-ups, where unfitness or obesity is viewed as a “bug” to be fixed rather than as a product of an economy based on long hours and precarious work. Daily exercise has always been an individual responsibility, but sharing our biofeedback via social media encourages people to compete like businesses, vying for better health scores with the personal data that makes us special. (The Fitbit Flex boasts that it reflects “your stats, not any average Joe’s.”) Here we can all be Superman—“Join over 141,000 other people who want to discover their inner superhero,” urges website Superheroyou—while, back in the complex, unquantifiable real world, we often struggle to maintain control over the most basic facts of our finances and job prospects.

The Quantified Self literature is full of such fantasizing. It all treats the body as a fun challenge, a puzzle to be solved. We see this in the current trend towards adding game-like features to the process of life tracking, which leads to some quite startlingly intimate results (“Spreadsheets,” an app that promises to gamify your sex life, has the user get on the bed and talk dirty to a computer). Even antenatal workouts aren’t immune: the KGoal promises gamification in forthcoming product updates for those who fancy comparing their pelvic thrust scores to those of their peers.

The friendly rivalry that has always been a part of amateur fitness starts to look less inspiring, and more controlling, when it’s built into the architecture of smartphones and social media. It’s more like a crowd-sourced version of what philosopher Michel Foucault termed “Biopower,” the control over our bodies wielded by states and their institutions. But in this version, it’s not the institutions; we control ourselves, and each other.

As more and more aspects of our lives are seen as legitimate targets for intrusion by technology, the gaze inevitably falls on the newly born. Start-ups like Sproutling, Owlet, and Mimo are springing up to replace old-fashioned baby monitors with comprehensive, round-the-clock surveillance (temperature, pulse, breathing, position, room ambience) as well as all the attendant data crunching. These infants may be the first humans to grow up entirely in the lens of machines, with the process of rearing having been refashioned as a high-tech, high-maintenance project, requiring endless inputs from parent and child alike. They will be the first “time-motion babies”: fitter, happier, more productive, in the words of Radiohead’s OK Computer.

Will they really be happier, versed as they will be, since birth, in the techniques of maximizing their sleep, optimizing their nutrients, and tracking the number of steps they walk? It seems doubtful, but then, it’s impossible to really tell when we talk about happiness—even Silicon Valley hasn’t worked out how to put a number on that.


Dale Lately writes about culture and communications and has contributed to the Guardian, 3:AM Magazine, OpenDemocracy, Litro and Pop Matters. His regular musings can be found at @dalelately and www.dalelately.blogspot.com.

Franco Berardi on the Digital Colonization of Human Experience


By Franco Berardi

Source: Adbusters

The Spanish colonization of Mesoamerica was essentially a process of symbolic and cultural submission.

The “superiority” of the colonizers lay in the operational effectiveness of their technical production. The colonization destroyed the cultural environment in which indigenous communities had been living for centuries: alphabetic technology, the power of the written word, overwhelmed, jeopardized and finally superseded the indigenous cultures. The conquistadors re-coded the cultural universe of present-day Mexico and Central America.

Before the arrival of the Spanish invaders, Malinche (Malinalli in the Nahuatl language, Marina to the Spaniards), the daughter of a noble Aztec family, was given away as a slave to passing traders after her father died and her mother remarried. By the time Cortés arrived, she had learned the Mayan dialects spoken in the Yucatán while still understanding Nahuatl, the language of the Aztecs. As a youth she was given as tribute again, this time to the invaders.

She became the lover of Cortés and accompanied him as interpreter. She translated the words exchanged between Cortés and Moctezuma, king of the Aztec population of Tenochtitlan, and she translated the conqueror’s words when he met crowds of indigenous people. She translated for Nahuatl-speaking people the words of the Christian conquerors and of Christian priests. The Christian message melded with pre-colonization mythologies, and modern Mexican culture emerged. She and Cortés had a child, Martín, the first Mexican. She betrayed her own people by allying with the invaders. From a moral point of view, however, she owed nothing to a people who had sold her into slavery and treated her as a servant. She betrayed the conquerors, too, though they did not realize it.

Malinche is the ultimate symbol of the end of a world, and also the symbol of the formation of a new semiotic and symbolic space. Only when you are able to see the collapse as the end of a world can a new world be imagined. Only when you are free from hope (which is the worst enemy of intelligence) can you start seeing a new horizon of possibility. This is the lesson Malinche teaches us.

DEMOCRACY

On 31 October 2011, George Papandreou announced his government’s intention to hold a referendum on the terms of a Eurozone bailout deal. He wanted the Greek people to decide whether the diktat of the financial class that was strangling Greek society would be accepted or rejected. Overnight, the elected Prime Minister of Greece was obliged to resign. In the very place where it was invented and named twenty-five centuries ago, democracy was finally cancelled. It will never again come to life. Financial abstraction has swallowed the destiny of billions of people. European workers’ salaries have been halved in the last ten years and unemployment and precariousness are on the rise. Meanwhile, profits skyrocket.

WAR

The Eurasian continent is heading toward a proliferation of fragmentary conflict. At the same time, the infinite war launched by Cheney and Bush has paved the way to the establishment of the Caliphate. In Japan, the Prime Minister travels the world looking for allies against China. In India, a racist mass murderer (neoliberal of course) has been elected Prime Minister. In Europe, a Euro-Russian war is in the making at the Ukrainian border. In Ferguson, Missouri, another racialized killing reveals the American police state and the poverty industrial complex — two million homeless in the US and counting. In Gaza, Israel bombards the world’s largest open air prison and blames the victims, most of them children, for dying while the world looks on. In Northern Africa, Western powers prepare for the next season of Gaddafi blowback. In Liberia, Ebola fans the flames of civil and regional war, one bleeding eyeball at a time. In Mexico, a momentary silence shrouds the bloodiest drug war humanity has ever known, with cartels ranking among the wealthiest corporations.

While capitalism will continue to thrive thanks to massive slavery and eco-catastrophe, the next 20 years will be marked by the clash between financial abstraction and biofascism. A social, cognitive breakdown is estranging the masses from the body, so the decerebrated body is taking the form of aggression. Those who have been lost in the competition react under the banners of aggressive identification. We can even see fascism revived by the vengeful spirit of the dispossessed.

BIO-FINANCIAL POWER

Nation states are over, stripped by the global machine of finance, computation and all-pervading behavioral Big Data algorithms. Global corporations are replacing nation states as holders of power. We now embrace the first stages of the automation of mind, language and emotions … the architecture of bio-financial power. Power, in fact, is no longer political or military. It is based more and more on the penetration of techno-linguistic automatisms into the sphere of language. Soon, life will be based on the automation of cognitive activity. Who cares if the US military machine is running on empty because of Bush’s self-defeating strategy — it’s a remnant of geopolitical thinking now dead.

THE CIRCLE

Mediocre as it is, Dave Eggers’s novel The Circle is a metaphor for the relation between technology, communication, emotion and power. “The Circle” is the name of the most powerful corporation in the world, a sort of conglomerate of Google, Facebook, PayPal and YouTube. Three men lead the company: Stenton is a financial shark, Bailey is a utopian and Ty Gospodinov is the project’s hidden mastermind.

The main character of the book is Mae, a young woman hired by The Circle during “the Completion,” the final phase in the implementation of TruYou, a program intended to enforce the recording of every instant of life for pervasive, ceaseless sharing. Mae becomes the corporation’s spokesperson, the face that appears every day on the infinite channels of The Circle’s television network — the ambassador of the new credo.

The Circle is all about the utter capture of human attention: ceaseless communication, mandatory friendliness and the creation of a new neediness — the obsessive need to express and share. One may remark that Eggers is simply re-enacting Orwell more than 60 years after the publication of 1984. That’s true, but in the final pages of the novel, Eggers goes further than Orwell, when Ty exposes the transhuman potency of the totalitarian nightmare.

In the last scene of the novel, the inventor and founder of The Circle manages to covertly meet Mae, the newbie seducing the global audience. He has lost control of his own creature, the project he originally conceived, and is deprived of all power in its unstoppable self-deployment.

“I did not intend any of this to happen. And it’s moving so fast. I didn’t picture a world where Circle membership was mandatory, where all government and all life was channeled through one network … there used to be the option of opting out. But now that’s over. Completion is the end. We are closing the circle around everyone. It’s a totalitarian nightmare.”

The automaton cannot be stopped, as even the creator himself becomes overpowered by his own invention: the circle of continuous attention, the circle of perfect transparence of everybody to everybody, the circle of total power and of total impotence.

PLEASURE, AFFECTION AND EMPATHY

At the beginning of the 21st century we are in a position similar to that of Malinche: the conqueror is here, peaceful or aggressive, functionally superior, unattainable, incomprehensible. The bio-info automaton is taking shape from the connection between electronic machines, digital languages and minds formatted in such a way as to comply with the code. The automaton’s flow of enunciation emanates a connective world that the conjunctive codes cannot interpret, a world that is symbolically incompatible with the social civilization that was the outcome of five centuries of Humanism, Enlightenment and Socialism.

The automaton is the reification of the networked cognitive activity of millions of semio-workers around the globe. Only if they become compatible with the code, the program, can semio-workers enter the process of networking.

This implies the de-activation of old, subconsciously ingrained modes of communication and perception (compassion, empathy, solidarity, ambiguousness and irony), paving the way to the assimilation of the conscious organism into the digital automaton.

Will the general intellect be able to disentangle itself from the automaton? Can consciousness act on neural evolution? Will pleasure, affection, empathy find a way to re-emerge? Will we translate into human language the connective language of the automated meaning-making machine buzzing and buzzing in our heads?

These are questions that only Malinche can answer, opening herself to the incomprehensible other, betraying her people and reinventing language in order to express what cannot be said.

—Franco “Bifo” Berardi is an Italian Marxist theorist and activist in the autonomist tradition. He writes about the condition of media, mental breakdown and information technology within post-industrial capitalism. His next book, Heroes, dedicated to the suicidal wave provoked by financial nihilism, will be out in the first months of 2015.

Social Schizophrenia, Social Depression: What does TV tell us about America?


By Charles Hugh Smith

Source: Dangerous Minds

The difference between what we experience and what we’re told we experience creates a social schizophrenia that leads to self-destructive attitudes and behaviors.

What can popular television programs tell us about the zeitgeist (spirit of the age) of our culture and economy?

It’s an interesting question, as all mass media both responds to and shapes our interpretations and explanations of changing times. It’s also an important question, as mass media trends crystallize and express new ways of understanding our era.

Those who shape our interpretation of events also shape our responses.  This of course is the goal of propaganda: Shape the interpretation, and the response predictably follows.

As a corporate enterprise, mass media’s goal is to make money—the more the better—and that requires finding entertainment products that attract and engage large audiences.  The products that change popular culture are typically new enough to fulfill our innate attraction to novelty—but this isn’t enough. The product must express an interpretation of our time that was nascent but that had not yet found expression.

We can understand this complex process of crystallizing and giving expression to new contexts as one facet of the politics of experience.

The Politics of Experience

It is not coincidental that the phrase politics of experience was coined by a psychiatrist, R.D. Laing, for the phrase unpacks the way our internalized interpretation of experience can be shaped to create uniform beliefs about our society and economy that then lead to norms of behavior that support the political/economic status quo.

Here’s how Laing described the social ramifications in Chapter Four of his 1967 book, The Politics of Experience:

“All those people who seek to control the behavior of large numbers of other people work on the experiences of those other people. Once people can be induced to experience a situation in a similar way, they can be expected to behave in similar ways. Induce people all to want the same thing, hate the same things, feel the same threat, then their behavior is already captive – you have acquired your consumers or your cannon-fodder.”

For Laing, the politics of experience is not just about influencing social behavior – it has an individual, inner consequence as well:

“Our behavior is a function of our experience. We act according to the way we see things. If our experience is destroyed, our behavior will be destructive. If our experience is destroyed, we have lost our own selves.”

How the media shapes our interpretation affects not just our beliefs and responses, but our perceptions of self and our role in society. If the media’s interpretation no longer aligns with our experience, the conflict can generate self-destructive behaviors.

In other words, mass media interpretations can create a social schizophrenia that can lead to self-destructive attitudes and behaviors.

Social Analysis of TV

By its very nature as a mass shared experience, popular entertainment is fertile ground for social analysis.

Here’s a common example: what does a child learn about conflict resolution if he’s seen a thousand TV programs in which the “hero” is compelled to kill the “bad guy” in a showdown? What does that pattern suggest, not just about the structure of drama, but about the society that creates that drama?

Analyzing entertainment has been popular in America since the 1950s, if not earlier.  The film noir of the 1950s, for example, was widely deemed to express the angst of the Cold War era.  Others held that the rising prosperity of the 1950s enabled the populace to explore its darker demons—something the hardships and anxieties of the Depression did not encourage.

Many believe the Depression gave rise to screwball comedies and light-hearted entertainment featuring the casually wealthy precisely because these were escapist antidotes to the grinding realities of the era.

Even television shows that were denigrated as superficial in their own time (for example, Bewitched in the 1960s) can be seen as politically inert but subconsciously potent expressions of profound social changes: the “witch” in Bewitched is a powerful young female who is constantly implored by her conventional husband to conform to all the bland niceties of a suburban housewife, but she finds ways to rebel against these strictures.

Laing saw the potential conflict between what we experience and how we’re told to interpret that experience not just in social terms but in psychiatric terms: such splits open a gulf that can lead to a form of schizophrenia.

Diagnosing Our Disease with TV

What can we make of the popular TV series of the present era? What do they say, beneath the surface, about American society?

I contend that popular TV expresses three key aspects of U.S. society and economy that are at odds with the core idealized values espoused in civics classes and the media. The three idealized values are:

1.  America is a meritocracy—selections, admissions, etc., are based on the candidates’ merits

2.  Anyone can get ahead if they get an education and work hard

3.  America is the wealthiest nation on earth in terms of opportunity, fairness and capital

TV expresses three aspects that confound these idealized values:

1.  Life is a game in which the winner takes all

2.  The opportunity to “get ahead” via conventional means—getting educated, working hard, etc.—is a joke; only those who skirt conventions and the law get rich

3.  Life is a tortuous endurance course where those in charge demand the cruel and the impossible

Winner-Take-All Talent Shows

Let’s start with the genre that has been a dominant force in American TV since the 2000s: the winner-take-all talent show (reality and game shows).

The long-running Survivor series, for example, was a winner-take-all contest of physical prowess and political guile, while the many programs staging singing/dancing contests (American Idol, etc.) put entertainment skills to the competitive test.  A wide range of other entries stage competitions in cooking, entrepreneurship, losing weight, negotiating obstacle courses, and so on.

What interpretations of our experience do these highly competitive winner-take-all reality shows promote?

We could start with the fact that the stars (other than the hosts/judges) are apparently ordinary “every man/woman” Americans, i.e. people not unlike us.  Watching them, it is not too much of a stretch to imagine ourselves on stage, in the kitchen, etc., trying to impress the judges with our talents. It’s easy to identify with the contestants.

These shows enable us to vicariously experience the fantasy that we, too, could be on national TV and could win the accolades of the judges and fans and be the winner who takes it all.

This natural empathy with the temporarily famous, with whom we can vicariously share the thrill of victory and the agony of defeat, clearly taps a deep cultural desire to taste celebrity and the implied financial rewards of winning in an increasingly winner-take-all society/economy.

Could the financial/political marginalization of the average citizen and the widening gulf between the typical household and those at the top of the fame/wealth pyramid have something to do with this fascination with winner-take-all competitions on the public stage?

Since there have been game shows on TV for decades, it could be argued that this proliferation of winner-take-all contests is nothing new. But that argument fails to account for the difference between a game show in which the correct answer is a matter of fact and winner-take-all contests decided by the subjective votes of judges, other players and the audience.

I would argue that this recent explosion of competitions (“modern gladiator,” anyone?) is an expression of deeply held shared cultural values: we accept that ours is a highly competitive society, and that it is becoming even more so as the top “winners” skim the vast majority of the winnings (media visibility, wealth, adulation, social status, etc.), leaving a few morsels for the top 5% and nothing but crumbs for the bottom 95%.

But these TV programs also project the fantasy that our fight-to-the-figurative-death society is still a meritocracy—the best guy/gal wins, as judged by “experts” (or celebrities claiming expertise; the judges’ expertise is structured to be unassailable, just like all the other “experts” in our society).

But if we can’t win it all on merit, there is an alternative way to win: display superior political guile or greater popularity with the audience. (Interestingly, this echoes the coliseum audiences of the late Roman era, who also had some sway over who lived to fight another day on the choreographed battlefield below.)

Perhaps this helps explain our collective obsession with celebrity and the many measures of popularity available to everyone now—Instagram, Facebook likes, Twitter followers (for sale in lots of 10,000), Klout scores, and so on.

In other words, as success in the real world grows increasingly distant, vicariously competing and “winning it all” becomes very compelling. Rather than deal with the vast injustices of our system, we cling to the idealized norm that meritocracy matters, even as “winning” in real life is increasingly a game of cronyism, guile, gaming the system, misrepresenting the truth, etc.

And so we thrill to these play-acting displays of meritocracy in action, as it confirms our cultural value system: that despite the predations of Wall Street and Washington, merit still counts.

And when all else fails, we have a fallback source of identity and “winning”—our popularity. And if we don’t have enough of our own, then we can share vicariously in the popularity of TV show winners and celebrities who have reached the pinnacle.

This is the core message of an interesting and erudite half-hour talk on celebrity given by Game of Thrones actor Jack Gleeson:

“During a recent visit to the Oxford Union, Gleeson took the opportunity to dismantle the ‘religious hysteria’ of celebrity worship with an appropriately epic rant, breaking down the economic, psychological, and sociological catalysts for public reverence of celebrities and their negative impact on society as a whole.”

Gleeson draws upon a number of intellectual sources (Weber, Baudrillard, et al.) in his discussion of the contemporary culture of celebrity, and concludes that celebrity fills the void left when development of an authentic self is stunted.

The parallel with Laing’s “lost self” is striking.

In effect, when the opportunities for developing an authentic self have been reduced to popularity, public visibility and the status that flows from these forms of recognition, then the worship of celebrity and the aching desire for a moment in the spotlight become rational substitutes for a True Self.

As these wispy contingencies can never form an authentic selfhood, even those who do “win” the competition for celebrity are ultimately dissatisfied and disillusioned.

My conclusion: the popularity of competitive winner-take-all TV programs reflects the paucity of opportunities for selfhood and the substitution of celebrity worship for the difficult task of forging an independent identity in a society that marginalizes all but the top players.

If this isn’t a form of cultural schizophrenia, then what exactly is it? The claim that this is all just good clean fun is absurd, as is the claim that it’s perfectly healthy to sacrifice one’s identity in a competition for recognition that 99.9% of us are sure to lose.

It Is Important To Honor Those Who Made Others Die For Our Freedom

Source: Clickhole

As America continues to fight battles both at home and abroad, I am reminded of those who have served our country in previous armed conflicts. Courageous individuals who answered the call and fought to protect the American way of life. Regardless of whether or not we agree with our nation’s military actions, we must always remember to honor those who made others die for our freedom.

I’m talking about the brave men and women in political office who have dedicated their lives to sending American soldiers to die in distant and brutal wars. Legislators and Defense Department officials, CIA directors and the president. No matter how ferocious the enemy, no matter how daunting the battlefield, these patriots are consistently willing to do whatever it takes to defend our nation, even if it means putting individuals other than themselves in harm’s way.

For this, we must honor them.

…these patriots are consistently willing to do whatever it takes to defend our nation…

These are the true American heroes. Senators and state representatives throughout history who battled fearlessly to send thousands of troops to fight in ground invasions. Presidents who bypassed congressional approval to authorize airstrikes in hostile regions. Even when the battle seemed distant and avoidable, these brave politicians were willing to make the ultimate sacrifice for their country: other people’s lives.

And it’s not just legislators and White House officials who deserve our reverence. We must also honor the military strategists, weapons manufacturers, and thousands of lobbyists who have committed themselves to ensuring others perish on the field of battle. These brave individuals who fought valiantly to make other people’s mothers, fathers, daughters, and sons give their lives for our nation’s safety.

No matter our political beliefs, we must look back, from time to time, and remember that the reason we can feel safe as Americans is because of the battles the men and women of our governing bodies forced people other than themselves to die fighting in.

We cannot forget that freedom has a price, and we cannot forget the people who made sure others paid that price.