Roundup of Disturbing Roundup Statistics


Source: Washington’s Blog

Roundup is found in 75% of air and water samples. Some farmers even drench crops with Roundup right before harvest.

Roundup is linked to a number of diseases.

A study from the Journal of Organic Systems includes a dozen charts showing the correlation between Roundup (technically known as “glyphosate”) and disease, including charts for:

Thyroid cancer and GMOs
Renal disease deaths and GMOs
Urinary and bladder cancer and GMOs
Hypertension and GMOs

This Mom Could Go To Jail For Saving Her Son With Cannabis Oil. This Needs To Stop


By Joe Martino

Source: Collective Evolution

A mother of a 15-year-old could be facing jail time for using cannabis oil to help her son manage the effects of his brain injury. Her son was finally seeing relief from daily migraines, muscle spasms and uncontrollable outbursts.

“I broke the law, but I did it to save my son,” Angela Brown said. She had traveled to Colorado to obtain the cannabis oil and brought it back to Minnesota, where it is illegal. She administered the cannabis oil to her son safely, and the results were amazing.

Her Son’s Accident

Angela Brown’s son Trey was a healthy kid until a baseball accident in 2011. He was hit in the head by a line drive, causing bleeding in his brain and a build-up of pressure inside his skull. At first, doctors were not sure he would survive, and they placed him in a medically induced coma. When Trey finally awoke, he didn’t seem like the same kid, according to his mother.

“I cry like every day before I go to bed, like my brain is about to blow up, cause there is so much pressure,” Trey said.

He began dealing with chronic pain, depression and difficult-to-control outbursts. His mother searched everywhere for something to treat the effects of his injury. They went through 18 different medications, and none of them worked. Trey’s mother felt that the medications’ side effects even made her son suicidal.

Then they found cannabis oil, and everything started to turn around.

Cannabis Oil

Cannabis oil has been making headlines for a couple of years in ways we may not expect: cancer and Alzheimer’s treatments, reducing seizures, and replacing potentially harmful medications for many people. Cannabis, despite its negative stigma, can be seen as something of a natural miracle substance. It may hold the power to treat, and potentially cure, a number of serious diseases.

In Trey’s case, cannabis oil brought back quality of life in a situation where everything else had been tried and had failed.

“It stopped the pain and stopped the muscle spasms,” Trey said. “It was helping me go to school until it then got taken away and then school was really hard again.”

“It was a miracle in a bottle,” Angela Brown said.

But It’s Illegal In Many Places

The miracle in a bottle didn’t last for Angela and her son. When Trey’s teacher asked why he was suddenly doing better in school, Angela mentioned the oil.

“Well, I gave him an oil that we’d gotten from Colorado, it’s derived from a marijuana plant. And then you could feel the tension in the room.”

It took only a week for the sheriff’s department to confiscate the oil. Later, county officials charged Angela with child endangerment, which brought child protective services into the case. If convicted, she could face up to two years in jail and $6,000 in fines.

“It’s asinine, I didn’t hurt my son; I was trying to prevent him from being hurt.”

CBS News, which obtained the interview with the family, took the time to reach out to the county prosecutor, law enforcement and Trey’s school district. All declined any form of interview about the case.

The real kicker in this story is that in May, Minnesota became the 22nd state to approve specific forms of medicinal marijuana. But the law doesn’t take effect until 2015. So even though this substance is already recognized as something that will soon be legal, helping her son get better is still a crime.

Why Do We Deny Things That Work?

You might ask yourself how this kind of thing could even happen. Are we really that disconnected as a society? Sure, one could argue we don’t have the data to say whether cannabis oil has negative effects over time, but the crazy thing is that we do have data showing the medications we so often prescribe have nasty side effects, even after short periods of use. So why is one illegal and the other not? The easy answer is social conditioning and stigma.

The deeper answer goes into the realm of business, profit and control. It is often argued that many aspects of the medical system are set up to create lifelong patients rather than to properly treat and cure them.

Although cannabis is finally becoming legal in more places around the world, there is still resistance to acknowledging the plant’s potential to treat, and perhaps cure, many serious diseases.

With clinical trials in brain cancer patients finally underway, the next year of research into cannabis oil could be monumental for the health of our world.

Monsanto Sues Maui for Direct Democracy, Launches New PR Campaign


By Rebekah Wilce

Source: PR Watch

Residents of Maui County, Hawai’i voted on November 4 to ban the growing of genetically modified (GMO) crops on the islands of Maui, Lanai, and Molokai until scientific studies are conducted on their safety and benefits. Monsanto and Dow Chemical’s unit Mycogen Seeds have sued the county in federal court to stop the law passed by the people.

In Vermont, the Grocery Manufacturers Association (GMA, of which Monsanto and Dow were recently listed as members) has sued the state over its law requiring GMO labels. And Monsanto has a history of suing to prevent consumer labeling regarding its products. The company sued a number of dairies in the 1990s and 2000s for labeling milk free from recombinant bovine growth hormone (rBGH), which Monsanto developed and marketed as Posilac® (sold to Eli Lilly in 2008), the only commercially approved form. Vermont itself is no stranger to such suits. The International Dairy Foods Association sued Vermont for passing a law requiring labeling of milk containing rBGH (Monsanto wrote an amicus brief in support of the plaintiff, and GMA was a plaintiff-appellant) — and it won in federal court.

On the same day that Monsanto said it would challenge the decision of Maui’s citizens to regulate their own land and environment in court, the company also launched a new national advertising campaign as part of an effort to improve the image of the widely reviled company.

The glossy ads portray families of many cultures sitting down to eat gorgeous foods, invoking images more often seen in the pages of Saveur than in the hallways of one of the world’s largest chemical companies.

In addition to print ads in several national magazines and TV ads airing on national cable networks and several local stations in coastal cities, the campaign includes a slick new website launched in September, Discover.Monsanto.com.

The website invites questions from the public. The vast majority are skeptical, if not hostile. Others sound like they were written by Monsanto staff. Predictably, some of the hardest questions, like the one posed by Tim H., “In 2013, how much money has Monsanto spent on lobbyists in DC? What laws were these lobbyists attempting to create/amend and why?” are given short shrift.

Monsanto’s pretty TV ads target moms and millennials, according to the company’s corporate brand lead, Jessica Simmons. Monsanto has even hired a new “director of millennial engagement,” Vance Crowe, 32. He represented the company at a recent South by Southwest Eco conference in Austin, where revelations that Monsanto had paid for a panel of farmers to attend and present generated some excitement, as Tom Philpott reports in Mother Jones.

Crowe told NPR‘s “The Salt” blog, “[T]he challenge with something like SXSW Eco is that it doesn’t do anybody any good if people are so passionate that they’re yelling. The challenge is how can we enter the conversation so that people don’t feel like they have to yell to be heard?” Apparently, Crowe hopes to “enter the conversation” one party at a time. He enthusiastically describes how he and a gay colleague attended sessions on “sustainable fashion” and got invited to parties where they won fans and accolades.

Coincidentally, the front page of Discover.Monsanto.com contains, under “Here’s where we work,” a picture of corn crops being tended in Maui, with the text, “Hawaii’s unique climate allows for three to four growing seasons a year, reducing the time it takes us to develop new products. Our island roots go back more than 45 years.”

The marketing text may indicate the issue at the heart of Monsanto’s lawsuit against Maui. Those multiple growing seasons mean that “about 90 percent of all corn grown in the U.S. is genetically engineered and has been developed partially at Hawaii farms,” according to the Associated Press. Monsanto and the rest of the seed crop industry reap $146.3 million a year in sales from their activities in the state, according to a 2009 USDA report. Now Monsanto would have to substantially downsize its activity in Maui County in order to follow the new law, according to its lawsuit.

Monsanto’s new PR campaign seeks to make its brand approachable to the American consumer. Yet, with 92 percent of Americans demanding that GMO foods be labeled, according to a new Consumer Reports poll, Monsanto and its new millennial hires have their work cut out for them.

Consumer Reports recently put out a study on where GMOs are hiding in your food, including in packages labeled “natural.” You can access the report here.

Rebekah Wilce is a reporter and researcher who directs CMD’s Food Rights Network project.

Don’t Forget Why Marijuana Legalization Is Winning


By Maia Szalavitz

Source: Substance.com

When I first started writing about drugs in the mid-’80s—before I got into recovery in 1988—it was almost impossible to imagine an America where four states and DC have legalized recreational marijuana use, 58% of Florida midterm voters just cast their ballots in favor of legalizing medical use (the measure needed 60% to pass), and California passed a ballot initiative to lower drug and other nonviolent crime sentences. (Nineteen other states have legalized medical marijuana.)

The magnitude of the change is hard to understand without knowing a bit of recent history—and if we are going to continue to move toward rational drug policy, knowing where we’ve been and how it has changed is critical. I offer this perspective through the lens of my own experience covering the drug war for nearly 30 years.

My first national column was called, embarrassingly enough, “Piss Patrol.” I was assigned by High Times to write about corporate urine testing policies, starting around 1987, presumably as a service to stoned readers who were considering their employment options.

Over the next few years, the media would spill so much ink and airtime demonizing crack cocaine that by 1989, 64% of people polled by CBS News said that drugs were the country’s biggest problem—and Republicans and Democrats began tripping over one another to race to pass the harshest possible drug sentencing laws.

High Times itself was targeted by the DEA with frequent demands for its list of subscribers and raids on all of its biggest advertisers of growing supplies, nearly forcing the magazine to close.

Testifying before Congress, LAPD chief Daryl Gates said that casual drug users “ought to be taken out and shot,” and the DARE drug prevention program he founded saw nothing ominous in encouraging kids to turn their parents in to the police if they used drugs. Supreme Court Justice Thurgood Marshall warned in a prescient 1989 dissent in a urine testing case that “there is no drug exception to the Constitution,” although Congress and the rest of the legal establishment apparently begged to differ.

Even today, police can confiscate cash and property they suspect to be involved in drug crimes, without convicting the owners and with virtual impunity. The surveillance revelations about the NSA’s spying on American citizens include cases where that agency has shared information with the DEA that was gathered from phones and computers without a warrant. In fact, the DEA has an official policy of basically lying to defense attorneys—and sometimes even prosecutors and judges—about the source of this data.

Yet even before the rage to pass tough drug laws took off in the 1980s, law enforcement efforts like mandatory minimum sentences were known to be ineffective. The federal government had quietly overturned one set of mandatory drug sentences in the late ‘60s—since they had clearly failed to prevent the late ‘60s.

And New York City would never have been one of the capitals of crack if the 15-to-life “Rockefeller law” mandatory sentences for selling even powder cocaine, which had been in place here since the mid-‘70s, actually suppressed drug use.

As is clear from this brief summary, for most of my adult life, the idea of a rational drug policy seemed literally to be a pipe dream (a term, by the way, from opium dens). So how did we go, in just a few years, from seeing drug users as demon enemies in a war who must be locked up to having the drug czar drop the military language and even speak at last month’s National Harm Reduction Conference in Baltimore?

Many factors are clearly playing a role. Two of the most obvious are the sheer economic burden of having become the world’s most prolific jailer and the drop in violent crime that hasn’t been paralleled by a fall in addiction rates or a reduction in the availability of drugs like marijuana, heroin and cocaine. Some of the crime decrease may, of course, be linked to the 500% rise in the number of prisoners since 1980—but research shows that violent crime fell more in states that have lowered incarceration rates.

Other influences have also been important. One has been the increasing recognition—driven especially by Michelle Alexander’s 2010 bestseller The New Jim Crow—of the racist nature of the drug war. Once you know this history of the drug laws, it is very hard to justify supporting them.

Another factor is the rise of the Internet. Early adopters of the net tended to be hippies and libertarians: Steve Jobs famously said that his use of LSD was one of the most important experiences of his life, for example, and pro-legalization views dominated online before the mainstream media began to realize the web was the future of its business.

This gave legalizers a loud voice—one that had been previously drowned out by a media that had so bought into the drug war that networks and newsmagazines thought nothing of taking government payments to place stories with the “correct” anti-drug slant in lieu of running paid anti-drug ads.

The Internet has also allowed critics—including me—to directly attack inaccurate coverage as it appeared, exposing readers to truthful information about drugs and drug users that was previously hard to find. It is much harder to start a panic when debunkers immediately offer alternative perspectives.

Three other important forces should also be mentioned. First, the Drug Policy Alliance—helped by large donations from billionaire George Soros—spurred activism and funded ballot initiative measures that brought marijuana policy reform out of the fringes and into the mainstream.

Second, the harm reduction movement spurred by the AIDS epidemic quietly racked up successes. As it became clear that needle exchange hadn’t resulted in a massive increase in IV drug use—but had helped halt the spread of HIV—resistance to measures like naloxone to reverse overdose was pre-empted.

The fight over needle exchange saw conservative politicians, drug treatment providers and religious leaders actively oppose its expansion, claiming, without data, that it would encourage drug use. In contrast, it is now actually hard to find anyone who will argue that drug users and their families should not have access to the OD antidote for fear that preventing users’ deaths “sends the wrong message.”

Third, recovery activists have played a role. While there are still reactionary forces like Patrick Kennedy, many people who have come out about their own recovery have made clear that the criminal justice approach has failed. By putting a real face on drug users—not a stereotyped image of a criminal—recovering people have begun to help fight against, rather than support, their own oppression.

Of course, historically, fights for drug law reform have often resulted in backlash—marijuana was almost legalized, for example, under President Jimmy Carter, but instead we got Ronald Reagan’s war on drugs. But the strength and variety of the forces working against that possibility—particularly the rapid access to accurate information—give me hope that we may finally be starting to get drug policy right.

Maia Szalavitz is one of the nation’s leading neuroscience and addiction journalists, and a columnist at Substance.com. She has contributed to Time, the New York Times, Scientific American Mind, the Washington Post and many other publications. She has also published five books, including Help at Any Cost: How the Troubled-Teen Industry Cons Parents and Hurts Kids (Riverhead, 2006), and is currently finishing her sixth, Unbroken Brain, which examines why seeing addiction as a developmental or learning disorder can help us better understand, prevent and treat it. Her last column for Substance.com was about why it is time to reclaim the concept of “recovery” from the abstinence-only establishment.

The Birth of the Time-Motion Human


By Dale Lately

Source: The Baffler

In a darkened room, a woman lies watched by an infra-red camera as she sleeps. It monitors her breathing, her movements, the flicker of her eyelids. Some hours later it stings her with a painful electric shock. She wakes, tumbles out of bed and into the restroom, whereupon a chip installed in her toothbrush tracks her arm movements. She’s photographed, silently, every thirty seconds. As she sets off in the morning her location is logged and data is streamed on the steps she takes. Her pulse and calorie count are recorded and sent to unseen observers. She has a dog at her side. The dog’s data is logged as well.

Such a tableau would be the envy of any futuristic dictatorship. In fact, the devices outlined above are all available on the consumer market now, for voluntary use. The impetus towards tracking our lives with smartphones, apps and stats represents a massive growth area into which companies like Jawbone, MyFitnessPal, RunKeeper, Runtastic, MapMyRun, Foodzy, GymPact, and Fitocracy are flooding. Alongside the Nike+ Fuelband, there’s the popular Fitbit Flex, a wristband that counts the steps you take by day and the number of times you stir in your sleep. There are smart cups to track what you drink and wristbands programmed to give you electric shocks for not achieving your goals. There’s even a “Fitbit for your vagina” in the form of the KGoal Smart Kegel Trainer—a Kickstarter project designed to track Kegel exercises, which work women’s pelvic floor muscles to improve childbirth and continence, and to help users achieve a better “clench strength” via Bluetooth.

With all this biofeedback now available on our phones, the act of walking, living and breathing can—at least to the “datasexuals” who embrace it—be an ongoing project with limitless potential for improvement. But might such potential also lead to a kind of “Taylorism within”? Applying scientific management to twentieth-century business created a workforce optimized for maximum efficiency. Likewise, life-tracking encourages us to internalize this dream by optimizing ourselves. In other words, rather than using the tech as a tool for liberation, we’re using it to tune our lives for maximum “productivity.”

Perhaps none of this should seem surprising for a consumer society that runs on anxiety. If, a century ago, bad breath had to be invented as a disease that mouthwash could cure, now the Quantified Self movement suggests we must live in permanent beta, aiming not just to maintain ourselves but to become “better than well.” And so David Allen’s Getting Things Done and websites like Lifehacker help to turn our lives into a series of sanctioned tasks and goals, where one must carry a “Surprise Journal” to find areas for self-improvement in one’s life, and sleep comes in the form of “power” naps. There’s the LumoBack, a gizmo that monitors the tricky process of sitting in a chair, while the Narrative wearable camera snaps your life twice a minute. Time management lessons are now available for kids, while the iPotty seems to give toddlers the message that they shouldn’t take their eyes off a screen even when satisfying the most basic of human needs.

Silicon Valley, naturally, is more than happy to export the mantra of ongoing product optimization to our bodies: life-hacking fanatics talk of “upgrades” and “body hacks,” with often obsessive results. In a Financial Times article that marked a mainstream recognition of the movement, Tim Ferriss, author of The 4-Hour Body, claimed that he could teach people how to lose weight without exercising, work on two hours’ sleep, and have a fifteen-minute orgasm, while bio-hacker Dave Asprey was adamant that he had made himself twenty years younger and forty IQ points smarter through life-tracking and smart pills (“I’ve rewired my brain,” he said). All of this task management can become a considerable task in itself, leading to a pile-up of Catch-22 ironies—like the fact that developers are now working on smartphone apps to solve the problem of people spending too much time on their smartphones.

Luckily, some are questioning the use of intimate monitoring devices in our lives. The information asymmetry provided by the emergent “Internet of Things” may create a class of uninsurable people, while “digital Taylorism”—the tracking and tagging of workers like cattle—has been roundly criticized as it has begun to emerge at companies like Amazon. What’s disquieting about the popularization of life-tracking is the voluntary desire to become “time-motion humans,” to subject ourselves to a self-imposed surveillance state. “Track everything. Track your entire day—wherever you go,” says the website for the LumoBack. “VESSYL AUTOMATICALLY KNOWS AND TRACKS EVERYTHING YOU DRINK,” the Vessyl “smart mug” warns us in stark capitals. And once we’ve volunteered for this intimate biological scrutiny, we’re keen to publicize the results—using tools like the Withings scale, which threatens to broadcast our weight gains to our Twitter followers as “encouragement.” Self-Improvement Macht Frei.

Since the invention of the forceps we’ve been introducing machinery into our bodies to improve our lives (the aforementioned KGoal is actually based on a biofeedback device from the 1940s by Dr. Arnold Kegel), and undoubtedly many of these trackers are helping to make people healthier. But life tracking also comes from a certain ideological background, one that denigrates macro-interventions in our lives (nationalized health care) in favor of individual micro-solutionism (becoming our own gym instructors and fitness trainers).

We’re living in an entrepreneurial model of humanity, a vision of human beings as start-ups, where unfitness or obesity are viewed as “bugs” to be fixed rather than as products of an economy based on long hours and precarious work. Daily exercise has always been an individual responsibility, but sharing our biofeedback via social media encourages people to compete like businesses, vying for better health scores with the personal data that makes us special. (Flex boasts that it reflects “your stats, not any average Joe’s.”) Here we can all be Superman—“Join over 141,000 other people who want to discover their inner superhero,” urges website Superheroyou—while, back in the complex, unquantifiable real world, we often struggle to maintain control over the most basic facts of our finances and job prospects.

The Quantified Self literature is full of such fantasizing. It all treats the body as a fun challenge, a puzzle to be solved. We see this in the current trend towards adding game-like features to the process of life tracking, which leads to some quite startlingly intimate results (“Spreadsheets,” an app that promises to gamify your sex life, has the user get on the bed and talk dirty to a computer). Even antenatal workouts aren’t immune: the KGoal promises gamification in forthcoming product updates for those who fancy comparing their pelvic thrust scores to those of their peers.

The friendly rivalry that has always been a part of amateur fitness starts to look less inspiring, and more controlling, when it’s built into the architecture of smartphones and social media. It’s more like a crowd-sourced version of what philosopher Michel Foucault termed “Biopower,” the control over our bodies wielded by states and their institutions. But in this version, it’s not the institutions; we control ourselves, and each other.

As more and more aspects of our lives are seen as legitimate targets for intrusion by technology, the gaze inevitably falls on the newly born. Start-ups like Sproutling, Owlet, and Mimo are springing up to replace old-fashioned baby monitors with comprehensive, round-the-clock surveillance (temperature, pulse, breathing, position, room ambience) as well as all the attendant data crunching. These infants may be the first humans to grow up entirely in the lens of machines, with the process of rearing having been refashioned as a high-tech, high-maintenance project, requiring endless inputs from parent and child alike. They will be the first “time-motion babies”: fitter, happier, more productive, in the words of Radiohead’s OK Computer.

Will they really be happier, versed as they will be, since birth, in the techniques of maximizing their sleep, optimizing their nutrients, and tracking the number of steps they walk? It seems doubtful, but then, it’s impossible to really tell when we talk about happiness—even Silicon Valley hasn’t worked out how to put a number on that.

 

Dale Lately writes about culture and communications and has contributed to the Guardian, 3:AM Magazine, OpenDemocracy, Litro and Pop Matters. His regular musings can be found at @dalelately and www.dalelately.blogspot.com.

Social Schizophrenia, Social Depression: What does TV tell us about America?


By Charles Hugh Smith

Source: Dangerous Minds

The difference between what we experience and what we’re told we experience creates a social schizophrenia that leads to self-destructive attitudes and behaviors.

What can popular television programs tell us about the zeitgeist (spirit of the age) of our culture and economy?

It’s an interesting question, as all mass media both responds to and shapes our interpretations and explanations of changing times. It’s also an important question, as mass media trends crystallize and express new ways of understanding our era.

Those who shape our interpretation of events also shape our responses.  This of course is the goal of propaganda: Shape the interpretation, and the response predictably follows.

As a corporate enterprise, mass media’s goal is to make money—the more the better—and that requires finding entertainment products that attract and engage large audiences.  The products that change popular culture are typically new enough to fulfill our innate attraction to novelty—but this isn’t enough. The product must express an interpretation of our time that was nascent but that had not yet found expression.

We can understand this complex process of crystallizing and giving expression to new contexts as one facet of the politics of experience.

The Politics of Experience

It is not coincidental that the phrase politics of experience was coined by a psychiatrist, R.D. Laing, for the phrase unpacks the way our internalized interpretation of experience can be shaped to create uniform beliefs about our society and economy that then lead to norms of behavior that support the political/economic status quo.

Here’s how Laing described the social ramifications in Chapter Four of his 1967 book, The Politics of Experience:

“All those people who seek to control the behavior of large numbers of other people work on the experiences of those other people. Once people can be induced to experience a situation in a similar way, they can be expected to behave in similar ways. Induce people all to want the same thing, hate the same things, feel the same threat, then their behavior is already captive – you have acquired your consumers or your cannon-fodder.”

For Laing, the politics of experience is not just about influencing social behavior – it has an individual, inner consequence as well:

“Our behavior is a function of our experience. We act according to the way we see things. If our experience is destroyed, our behavior will be destructive. If our experience is destroyed, we have lost our own selves.”

How the media shapes our interpretation affects not just our beliefs and responses, but our perceptions of self and our role in society. If the media’s interpretation no longer aligns with our experience, the conflict can generate self-destructive behaviors.

In other words, mass media interpretations can create a social schizophrenia that can lead to self-destructive attitudes and behaviors.

Social Analysis of TV

By its very nature as a mass shared experience, popular entertainment is fertile ground for social analysis.

Here’s a common example: what does a child learn about conflict resolution if he’s seen a thousand TV programs in which the “hero” is compelled to kill the “bad guy” in a showdown? What does that pattern suggest, not just about the structure of drama, but about the society that creates that drama?

Analyzing entertainment has been popular in America since the 1950s, if not earlier.  The film noir of the 1950s, for example, was widely deemed to express the angst of the Cold War era.  Others held that the rising prosperity of the 1950s enabled the populace to explore its darker demons—something the hardships and anxieties of the Depression did not encourage.

Many believe the Depression gave rise to screwball comedies and light-hearted entertainment featuring the casually wealthy precisely because these were escapist antidotes to the grinding realities of the era.

Even television shows that were denigrated as superficial in their own time (for example, Bewitched in the 1960s) can be seen as politically inert but subconsciously potent expressions of profound social changes: the “witch” in Bewitched is a powerful young female who is constantly implored by her conventional husband to conform to all the bland niceties of a suburban housewife, but she finds ways to rebel against these strictures.

Laing saw the potential conflict between what we experience and how we’re told to interpret that experience not just in social terms but in psychiatric terms: such splits open a gulf that can lead to a form of schizophrenia.

Diagnosing Our Disease with TV

What can we make of the popular TV series of the present era? What do they say, beneath the surface, about American society?

I contend that popular TV expresses three key aspects of U.S. society and economy that are at odds with the core idealized values espoused in civics classes and the media. The three idealized values are:

1.  America is a meritocracy—selections, admissions, etc., are based on the candidates’ merits

2.  Anyone can get ahead if they get an education and work hard

3.  America is the wealthiest nation on earth in terms of opportunity, fairness and capital

TV expresses three aspects that confound these idealized values:

1.  Life is a game in which the winner takes all

2.  The opportunity to “get ahead” via conventional means—getting educated, working hard, etc.—is a joke; only those who skirt conventions and the law get rich

3.  Life is a tortuous endurance course where those in charge demand the cruel and the impossible

Winner-Take-All Talent Shows

Let’s start with the genre that has been a dominant force in American TV since the 2000s: the winner-take-all talent show (reality and game shows).

The long-running Survivor series, for example, was a winner-take-all contest of physical prowess and political guile, while the many programs staging singing/dancing contests (American Idol, etc.) put entertainment skills to the competitive test.  A wide range of other entries stage competitions in cooking, entrepreneurship, losing weight, negotiating obstacle courses, and so on.

What interpretations of our experience do these highly competitive winner-take-all reality shows promote?

We could start with the fact that the stars (other than the hosts/judges) are apparently ordinary “every man/woman” Americans, i.e. people not unlike us.  Watching them, it is not too much of a stretch to imagine ourselves on stage, in the kitchen, etc., trying to impress the judges with our talents. It’s easy to identify with the contestants.

These shows enable us to vicariously experience the fantasy that we, too, could be on national TV and could win the accolades of the judges and fans and be the winner who takes it all.

This natural empathy with the temporarily famous, with whom we can vicariously share the thrill of victory and the agony of defeat, clearly taps a deep cultural desire to taste celebrity and the implied financial rewards of winning in an increasingly winner-take-all society/economy.

Could the financial/political marginalization of the average citizen and the widening gulf between the typical household and those at the top of the fame/wealth pyramid have something to do with this fascination for winner-take-all competitions on the public stage?

Since there have been game shows on TV for decades, it could be argued that this proliferation of winner-take-all contests is nothing new. But that argument misses the difference between a game show in which winning turns on knowing the correct answer and contests decided by the subjective votes of judges, other players and the audience.

I would argue that this recent explosion of competitions (“modern gladiator,” anyone?) is an expression of deeply held shared cultural values: we accept that ours is a highly competitive society, and that it is becoming even more so as the top “winners” skim the vast majority of the winnings (media visibility, wealth, adulation, social status, etc.), leaving a few morsels for the top 5% and nothing but crumbs for the bottom 95%.

But these TV programs also project the fantasy that our fight-to-the-figurative-death society is still a meritocracy—the best guy/gal wins, as judged by “experts” (or celebrities claiming expertise; the judges’ expertise is structured to be unassailable, just like all the other “experts” in our society).

But if we can’t win it all on merit, there is an alternative way to win: display superior political guile or greater popularity with the audience. (Interestingly, this echoes the coliseum audiences of the late Roman era, who also had some sway over who lived to fight another day on the choreographed battlefield below.)

Perhaps this helps explain our collective obsession with celebrity and the many measures of popularity available to everyone now—Instagram, Facebook likes, Twitter followers (for sale in lots of 10,000), Klout scores, and so on.

In other words, as success in the real world grows increasingly distant, vicariously competing and “winning it all” becomes very compelling. Rather than deal with the vast injustices of our system, we cling to the idealized norm that meritocracy matters, even as “winning” in real life is increasingly a game of cronyism, guile, gaming the system, misrepresenting the truth, etc.

And so we thrill to these play-acting displays of meritocracy in action, because they confirm our cultural value system: that despite the predations of Wall Street and Washington, merit still counts.

And when all else fails, we have a fallback source of identity and “winning”—our popularity. And if we don’t have enough of our own, then we can share vicariously in the popularity of TV show winners and celebrities who have reached the pinnacle.

This is the core message of an interesting and erudite half-hour talk on celebrity given by Game of Thrones actor Jack Gleeson:

“During a recent visit to the Oxford Union, Gleeson took the opportunity to dismantle the ‘religious hysteria’ of celebrity worship with an appropriately epic rant, breaking down the economic, psychological, and sociological catalysts for public reverence of celebrities and their negative impact on society as a whole.”

Gleeson draws upon a number of intellectual sources (Weber, Baudrillard, et al.) in his discussion of the contemporary culture of celebrity, and concludes that celebrity fills the void left when development of an authentic self is stunted.

The parallel with Laing’s “lost self” is striking.

In effect, when the opportunities for developing an authentic self have been reduced to popularity, public visibility and the status that flows from these forms of recognition, then the worship of celebrity and the aching desire for a moment in the spotlight become rational substitutes for a True Self.

As these wispy contingencies can never form an authentic selfhood, even those who do “win” the competition for celebrity are ultimately dissatisfied and disillusioned.

My conclusion: the popularity of competitive winner-take-all TV programs reflects the paucity of opportunities for selfhood and the substitution of celebrity worship for the difficult task of forging an independent identity in a society that marginalizes all but the top players.

If this isn’t a form of cultural schizophrenia, then what exactly is it? The claim that this is all just good clean fun is absurd, as is the claim that it’s perfectly healthy to sacrifice one’s identity in a competition for recognition that 99.9% of us are sure to lose.

The Web Revolution That’s Changing How the World Gets High


By Mike Power

Source: Disinfo.com

It is mainly the young who are suffering the consequences of society’s inability to update our drug laws effectively for the modern age. Almost one third of young people are searching for ways of getting legally high, according to the latest survey commissioned by the Angelus Foundation, a campaign group founded in 2009 by Maryon Stewart, whose twenty-one-year-old daughter Hester, a gifted medical student and keen athlete, died after taking GBL in 2009. (Gamma-butyrolactone, a paint stripper and industrial cleaner, can be used as an intoxicant and is popular on the club scene. It is active at 1 ml, and causes euphoria and disinhibition, but overdoses, where users fall into a coma-like state, are commonplace since it is so potent. It was legal until late 2009.)

Two-thirds of the 1,011 sixteen-to-twenty-four-year-olds surveyed by the Angelus Foundation in October 2012 admitted they were not well-informed about the risks associated with the new drugs on the market.

Festivals since Woodstock have been linked with drug use, whatever message their PR machines might seed in the press, so events there can tell us much about current trends of use and the attendant problems. Dip your head under the canvas at a festival medical tent and you arrive at the intersection of the net, new drugs and young people. Monty Flinsch, who runs Shanti Camp, a non-profit aid organization providing drug crisis intervention at American festivals, says that in recent years instead of dealing with the psychological issues caused by LSD, psilocybin and MDMA, they have seen seizures, delirium, violence and deaths. ‘Even discounting the hyperbolic news coverage of face-eating zombies, the real situation is substantially worse with legal research chemicals than it ever was before. It is now easier for an American teenager to obtain a powerful psychedelic than it is to obtain alcohol. Today’s scene is much more complex with the influx of large numbers of research chemicals ranging from the more common bath salts (MDPV, methylone) to much more obscure chemicals such as 25C-NBOMe and methoxetamine,’ he said.

The reasons the drugs are taken are manifold, but he believes their legality is a major draw, along with cultural influences. ‘Kids feel they are exposing themselves to less risk by taking drugs that are not going to get them arrested, and drug use is highly subject to countercultural trends, and whatever the cool kids are taking quickly becomes popular. In many cases the legal consequences of drug use far outweigh the medical risks. Our drug laws in the US are forcing users to experiment with increasingly dangerous compounds in order to avoid having their lives ruined by a criminal conviction.’

Flinsch says he cannot see any likely improvements in the future. ‘New research chemicals are ubiquitous and the problems associated with them are growing. From the frontlines we see the situation getting worse rather than better. The new compounds are poorly understood and have little or no history of human use, and therefore the problems we see are harder to characterize and therefore treat. It is sad that what is currently legal is substantially more dangerous than what is illegal.’

The entire debate around drugs, which was already philosophically and practically complex, has been made yet more intractable by the emergence of these new drugs and distribution systems. Our insistence on overlaying anachronistic models of drug control onto this digital world might, in future years, be seen as a fatal flaw that we did not address when we had the chance.

The popularization of research chemicals presents legislators, policymakers and police with an almost existential dilemma. They are charged with protecting the health of populations and reducing crimes, and these new drugs pose health risks, but are legal. The Chinese factories that produce them operate with none of the quality control typical in most pharmaceutical manufacturing plants, but customer uptake is enthusiastic. Each new ban brings a newer, possibly more dangerous drug to the market, and it is impossible to predict what the next moves might be.

Legal responses seem not only not to work, but to exacerbate the issue. The American Analog Act did nothing to prevent the arrival in 2009–11 of the JWH chemicals, the cathinones found in bath salts, and the other synthetic cannabinoids that had hit the UK and Europe in 2008. And where the early vendors of synthetic cannabis substitutes had sold the drugs online, the US did it bigger and better, and even more publicly and commercially.

In the US, in October 2011 the DEA responded by adding several of the new drugs to the controlled-substances schedule, making them formally and specifically illegal. The Synthetic Drug Control Act of 2011 was finally signed into law in July 2012, banning dozens of research chemicals at a stroke. Soon after the bill was passed, Time magazine quoted a Tennessee medic, Dr Sullivan Smith, who said the state had been engulfed by the new drugs. ‘The problem is these drugs are changing and I’m sure they’re going to find some that are a little bit different chemically so they don’t fall under the law,’ he said. ‘Is it adequate to name five or ten or even twenty? The answer is no, they’re changing too fast.’

Within weeks of these laws being passed, there were dozens more new drugs available in the US. One category, known as the NBOME series of chemicals, is composed of unscheduled analogues of the banned Shulgin psychedelics 2C-I, 2C-B, 2C-D, and so on. Where Shulgin’s chemicals were generally active between 10 mg and 20 mg, these new compounds, created in legitimate medical settings for experimental purposes, are vastly more potent, active at around 200 µg. Each gram of these new, unresearched drugs contains around 5,000 doses, and they cost fractions of a penny per dose. The compounds existed before the most recent bans, but it was the new laws that inspired their wider use; use that will only grow as talk of their effects is amplified online. They have already claimed victims. At the Voodoo Fest in New Orleans in October 2012, twenty-one-year-old Clayton Otwell died after taking one drop of an NBOME drug. The New Orleans Times-Picayune newspaper spoke to festival goers who said many dealers were selling the drug 25I-NBOME as artificial LSD or mescaline at the event. ‘This weekend, it was everywhere,’ festivalgoer Jarod Brignac, who also was with Otwell at the festival, told the paper. ‘People had bottles and bottles of it; they were walking through the crowd, trying to make a dime off people at the festival.’
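The dose count quoted above follows from simple arithmetic (a back-of-the-envelope check, assuming the roughly 200 µg active dose cited in the same passage):

\[
1\ \text{g} = 1{,}000{,}000\ \mu\text{g}, \qquad \frac{1{,}000{,}000\ \mu\text{g}}{200\ \mu\text{g per dose}} \approx 5{,}000\ \text{doses per gram.}
\]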

There have been at least six other fatalities in the US from 25I-NBOME, Erowid reported in late 2012. There are dozens of other NBOME-drugs, and their use is growing. The Bluelight bulletin board has three threads on 25I-NBOME, running to over seventy-five pages with more than 100,000 views. Search Google for it and there are suppliers on the first page. A kilo of it can be bought for a few thousand dollars from China.

We must now allow drug users to make safer choices, and that means a gradual, tested, evaluated but concerted roll-back of all existing drug laws; particularly those concerning MDMA, marijuana, magic mushrooms and mescaline, for these are the drugs that most research chemicals seek to emulate. Only then will dangerous innovation end. Simultaneously, drug awareness classes should be compulsory at all schools with credible, evidenced and honest discussions of each drug’s effects, good and bad, including alcohol and tobacco. This will not end the debate, or addiction, or reduce drug use. But it will mean those who choose to take drugs in the future will be better informed and safer, and the costs to society lower. Governments must now seize control of the market in new and old drugs from amateurs, criminals and gangsters.

Perhaps the web’s final and most dramatic effect will be to strip drug culture of its mystique, its cachet of countercultural cool, to reveal that behind the magic and madness, there lie only molecules. At the end of it all, drugs are just carbon, hydrogen and a few other elements. They have their meaning projected onto them by users and the culture more widely. Remove the thrill of social transgression that acting illegally provides and reframe drug use in a clinical context, as a health issue, and that might change. We know in detail what the route we have taken for the last century results in: greater and more dangerous use. We now need a new approach and new data to analyse. It is not this book’s argument that any drug is entirely safe; they demonstrably are not. But to persist in the digital age with this failed and arbitrary strategy of prohibition in the face of all the evidence that it increases harm is irresponsibly dangerous.

However, although some politicians are able to admit grudgingly to youthful experimentation with drugs, it seems few are willing to experiment even moderately with new policy approaches now that they have the power to effect positive change – even at a time when the people who vote for them are demanding exactly that, and when it is more urgent than ever before.

Mike Power is a freelance investigative journalist living in London. He has worked for The Guardian, the Mail on Sunday, the BBC, and Reuters. In 2014 he received the Best Investigative Journalism Award, awarded by the Association of British Science Writers, for his piece “The drug revolution that no one can stop,” which appeared in the online journal Matter. Drugs Unlimited is his first book.

Privatized Ebola


By Margaret Kimberley

Source: Black Agenda Report

“The world of private dollars played a role in consigning thousands of people to death.”

Sierra Leone has waved the white flag in the face of Ebola Virus Disease (EVD). Its meager infrastructure has buckled under the onslaught of a disease which could have been curtailed. The announcement that infected patients will be treated at home because there is no longer the capacity to treat them in hospitals is a surrender which did not have to happen. Not only did Europe and the United States turn a blind eye to sick and dying Africans but they did so with the help of an unlikely perpetrator.

The World Health Organization is “the directing and coordinating authority for health within the United Nations system.” Its very name implies that it takes direction from and serves the needs of people all over the world, but the truth is quite different. The largest contributor to the WHO budget is not a government. It is the Bill and Melinda Gates Foundation, which provides more funding than either the United States or the United Kingdom. WHO actions and priorities are no longer the result of a consensus of the world’s people but of top-down decision making by wealthy philanthropists.

The Bill and Melinda Gates Foundation may appear to be a savior when it provides $300 million to the WHO budget, but those dollars come with strings attached. WHO director general Dr. Margaret Chan admitted as much when she said, “My budget [is] highly earmarked, so it is driven by what I call donor interests.” Instead of being on the front line when a communicable disease crisis appears, the WHO spends its time administering what Gates and his team have determined is best.

The Ebola horror continues as it has for the last ten months in Guinea, Liberia and Sierra Leone. The cruelty of the world’s lack of concern for Africa and all Africans in the diaspora was evident in the inaction of nations and organizations that are supposed to respond in times of emergency. While African governments and aid organizations sounded the alarm, the WHO did little, because its donor-driven process militates against action. The world of private dollars played a role in consigning thousands of people to death.

Critics of the Gates Foundation appeared long before this current Ebola outbreak. In 2008 the WHO’s malaria chief, Dr. Arata Kochi, complained about the conflicts of interest created by the foundation. In an internal memo leaked to the New York Times he complained that the world’s top malaria researchers were “locked up in a ‘cartel’ with their own research funding being linked to those of others within the group.” In other words, the standards of independent peer reviewed research were cast aside in order to please the funder.

Private philanthropy is inherently undemocratic. It is a top-down process in which the wealthy individual tells the recipient what they will and will not do. This is a problematic system for charities of all kinds and is disastrous where the health of the world’s people is concerned. Health care should be a human right, not a charity, and the world’s governments should determine how funds to protect that right are spent. One critic put it very pointedly: “…the Gates Foundation, Bill & Melinda Gates, do not believe in the public sector, they do not believe in a democratic, publically owned, publically accountable system.”

It is little wonder that the Ebola outbreak caught the WHO so flat-footed: it spent months making mealy-mouthed statements but never coordinated an effective response. The Gates Foundation, not governments, is the WHO’s boss, and if the foundation wasn’t demanding action, then the desperate people affected by Ebola weren’t going to get any.

Privatization of public resources is a worldwide scourge. Education, pensions, water, and transportation are being taken out of the hands of the public and given to rich people and corporations. The Ebola crisis is symptomatic of so many others which go unaddressed or improperly addressed because no one wants to bite the hands that do the feeding.

The Bill and Melinda Gates Foundation has pledged an additional $50 million to fight the current Ebola epidemic, but that too is problematic, as Director General Chan describes: “When there’s an event, we have money. Then after that, the money stops coming in, then all the staff you recruited to do the response, you have to terminate their contracts.” The WHO should not be lurching from crisis to crisis (SARS, MERS, H1N1 influenza) based on the whims of philanthropy. The principles of public health should be carried out by knowledgeable medical professionals who are not dependent upon rich people for their jobs.

The Gates are not alone in using their deep pockets to confound what should be publicly held responsibilities. Facebook founder Mark Zuckerberg announced that he was contributing $25 million to fight Ebola. His donation will go to the Centers for Disease Control Foundation. Most Americans are probably unaware that such a foundation even exists. Yet there it is, run by a mostly corporate board that will inevitably interfere with the public good. The WHO’s inability to coordinate the fight against Ebola tells us that public health is just that: public. If the CDC response to Ebola in the United States fails, it may be because it falls prey to the false siren song of giving private interests control of the people’s resources and responsibilities.

 

Margaret Kimberley’s Freedom Rider column appears weekly in BAR, and is widely reprinted elsewhere. She maintains a frequently updated blog at http://freedomrider.blogspot.com. Ms. Kimberley lives in New York City, and can be reached via e-mail at Margaret.Kimberley(at)BlackAgendaReport.com.