Though television viewership has been in decline for the past few years, the latest statistics, compiled in a recent piece by Jim Edwards for Business Insider, indicate the trend is accelerating. The article explains how a number of factors, including changing technologies, consumer habits, poor business decisions, and an economic slump, have contributed to television’s descent. Factors that seem to be skimmed over are a lack of quality content and changing tastes, though the article does mention that viewership of professional baseball and basketball has been dropping (could it be that more people are tired of watching overpaid “bread and circuses” participants?).
I’ve never paid for cable, not only because there are plenty of better alternatives, but because I dislike corporate news and commercials. From what I’ve seen on cable while traveling, the only news without blatant U.S. government/corporate bias was on independent news programs on public access, RT, Press TV and a few other foreign news outlets (and those don’t seem to be available in many areas). Though I realize I’m in the minority, I’d like to believe that at least a small subset of those cutting cable cords are doing so because of increased awareness of corporate media lies.
While dwindling viewership is distressing news for many corporate interests, it’s a promising development for independent news, alternative media and those in support of cognitive diversity. Even if many people abandoning cable are following cable programming online, there’s still a greater chance to be exposed to information from sources other than the U.S. government and corporations on the internet and social media (regardless of government/corporate efforts to track what people view and say online).
All the major TV providers lost a collective 113,000 subscribers in Q3 2013. That doesn’t sound like a huge deal — but it includes internet subscribers, too.
In all, about 5 million people ended their cable and broadband subs between the beginning of 2010 and the end of this year.
People are unplugging.
Time Warner Cable, for instance, lost 306,000 TV subscribers in Q3, and 24,000 broadband web subscribers, too.
And Tom Rutledge, CEO of Charter Communications, told Wall Street analysts he was “surprised” that 1.3 million of his 5.5 million customers don’t want TV — just broadband internet. “Our broadband-only growth has been greater than I thought it would be,” he said.
Cable TV ratings are sinking.
Cable TV ratings are in an historic slump. Note that the “growth” line, as charted by Citi analysts Jason B. Bazinet and Joshua P. Carlson, is persistently below zero.
Fewer people are watching TV.
Even ratings for some major TV events are in decline.
People just don’t watch the World Series like they used to. The recent decline in viewership is led by young people, according to Business Insider’s Sports Page:
It’s the same with basketball.
Maybe people prefer the NBA to the MLB? Turns out that today’s big stars don’t grab TV eyeballs the way they used to either.
For the first time ever, the number of cable TV subscribers at major providers is about to dip below 40 million.
Cable and broadband companies are increasingly unable to retain customers.
This chart (below) is the most important chart in this set: It shows the number of net subscriber additions across all types of customers — cable TV, broadband internet and landline phone.
The cable and broadband subscriber business is seasonal. The net number of people leaving or adding services changes with the seasons, because people like to move house in the fall.
It used to be that up to 500,000 new subscriptions would be added across all companies in any given quarter. But now, cable and internet companies are lucky if they get any new subscribers at all. Increasingly, the industry loses subscribers rather than gaining them, according to this data from One Touch Intelligence:
For the first time ever, less than half of subscribers at the major broadband companies now subscribe to cable TV.
What’s happening is that people are giving up on cable TV as a standalone product, and the market is shifting in favor of telcos like AT&T and Verizon that offer TV as a package with high-speed internet access, according to media equity analysts at ISI Group. (Direct Broadcast Satellite appears to be holding steady, in part because its customers often live in more rural areas and have fewer alternatives.)
Here is how individual TV providers are affected.
It’s not an across-the-board collapse. But this is what you would expect to see during a technological sea-change: The weaker players are crumbling. The stronger players are picking up some of the pieces … but how long can they also resist the tide?
Fewer households actually have TV.
These charts, from Citi Research, show that the total “Nielsen TV Universe” — the number of people who watch TV — is declining. Note that the number of U.S. households is still growing, but growth in the number of households with cable TV is declining.
Fewer households have TV because they are watching video on mobile devices instead.
Here’s the big picture: People are spending more of their time on mobile, and less of their time on TV:
Mobile video is booming.
Even though iPhone and Android phones still struggle to show video seamlessly, the amount of video seen on mobile devices is going through the roof. About 40% of all YouTube traffic comes from mobile.
Tablets are stealing prime time, the period we used to devote to TV.
In the media industry, iPads and other tablets are sometimes called “vampire” media — they come out at night.
Ad revenue increases are masking the macro decline of TV.
The collapse of TV is having a counter-intuitive effect on TV ad sales: prices are going up, even though the number of commercials is going down.
The reason? It’s still really, really difficult to gather a large, mass audience in any kind of media, mobile or otherwise. The Super Bowl — on TV — is the only media property that can reach more than 100 million people in a three-hour stretch. That scarcity of large audiences makes TV’s dwindling-but-still-big audience increasingly valuable.
The TV business may actually be addicted to the very thing that is killing it.
Even though cable TV has had its worst year ever, cable TV revenues are still rising because companies are charging the dwindling number of customers more in subscription fees. According to analysts Craig Moffett and Michael Nathanson, those higher prices are “part of the problem” that pushes out poor subscribers — losing the TV business even more eyeballs:
“Of course, the fact that pay-TV revenue is still rising smartly is part of the problem … We have always argued that cord-cutting is an economic phenomenon, not a technological one. … Pay-TV revenue growth reflects rapid pay-TV pricing growth and that is precisely the problem. Rapidly rising prices are squeezing lower-income consumers out of the ecosystem.”
The market does not care that the TV audience is declining.
Time Warner Cable CEO Glenn Britt said in his last-ever conference call that the cable business has been ‘in denial.’
People who are unplugging from both cable TV and broadband internet are likely going to free wifi.
So if fewer people are watching cable TV and fewer people are paying for Internet service, does that mean that we just don’t care about watching our favorite shows anymore?
Not necessarily.
Free wifi — at work, in coffee shops, and on campuses — is making it easier for consumers to get the shows, movies and videos they want without subscribing to any kind of cable or broadband service.
Fifty-seven cities in the U.S., including Los Angeles, offer free wifi. Facebook and Cisco have joined to offer free wifi access to customers in any business who check in to Facebook. Facebook’s original free wifi test included just 25 stores in the Bay Area. The company has now expanded it to 1,000.
For some people, there is just no need for a cable or pipe to deliver the internet or TV to their residence specifically, as long as they are within range of a free wifi hotspot.
Though the Drug War’s disproportionately harmful effects on the poor and people of color seem to have been one of its major functions from the start, it has also been a war against cognitive liberty for everyone. On the DoseNation podcast, chemist Casey William Hardison shares an inspiring personal account of how psychedelics transformed his life for the better, and how he successfully fought a system which imprisoned him for pursuing his passion:
If the state was truly concerned for the health and safety of drug users, they would do more to give accurate information to the public and make treatment of addictions accessible (including addictions to alcohol, cigarettes, and pharmaceutical drugs). Instead, the state seems particularly concerned about drugs which can potentially lead to an expansion of consciousness. But why is cognitive liberty such a threat? Terence McKenna shares his thoughts on the revolutionary potential of the psychedelic movement in this excerpt of a speech delivered at the Esalen Institute in 1989:
The provisional model (psychedelic/open-ended partnership) way of doing things is the only style that can perhaps seize the controls of this sinking submarine and get it back to the surface so that we can figure out what should be done. If we continue as we have, then we’re doomed. And the judgement of some higher power on that will be: “They didn’t even struggle. They went to the boxcars with their suitcases and they didn’t even struggle.” This is too nightmarish to contemplate. We’re talking about the fate of a whole planet.
Why are people so polite? Why are they so patient? Why are they so forgiving of gangsterism and betrayal? It’s very difficult to understand. I believe it’s because the dominator culture is increasingly more and more sophisticated in its perfection of subliminal mechanisms of control. And I don’t mean anything grandiose and paranoid. I just mean that through press releases and soundbites and the enforced idiocy of television, the drama of a dying world has been turned into a soap opera for most people. And they don’t understand that it’s their story and that they will eat it in the final act if somewhere between here and the final act they don’t stand up on their hind legs and howl.
So this whole effort to bring the psychedelic experience back into prominence is an effort to empower individuals and to get them to see that we are bled of our authenticity by vampirish institutions that will never of their own accord leave us alone. There must be a moment when the machinery and the working of the machinery become so odious that people are willing to strive forward and throw sand on the track and force a reevaluation of the situation. And it’s not done through organizing. It’s not done through vanguard parties or cadres of intellectual elites. It’s done through just walking away from all of that. Claiming your identity, claiming your vision, your being, your intuition, and then acting from that without regret. Cleanly, without regret.
While I think the value of organizing should not be underestimated, he speaks eloquently for cognitive empowerment and inner transformation as a path towards cultural and systemic change.
With the recent release of the Hunger Games sequel, it seems fitting to feature a cult classic that may have been the inspiration for that series: “Battle Royale”. Released in 2000, it was the final film of Kinji Fukasaku, a director previously best known for his “Battles Without Honor and Humanity” series of Yakuza films. Fukasaku died of cancer shortly after filming the first scene of the sequel, “Battle Royale 2: Requiem”, which was completed by his son Kenta in 2003. Battle Royale takes place in a dystopian society whose government regularly forces a class of high school students to participate in a deathmatch on a small island until just one survivor is left. Each student is given a bag containing food, water, a compass, a map of the island, and a randomly selected weapon. The students are also outfitted with surveillance collars that track their movements and detonate if they wander into “danger zones” or refuse to cooperate.
Though the film is at times physically and emotionally brutal, it works effectively as a parable for the way youth are cynically manipulated by society and the different approaches people take in dealing with tyranny. When Kinji Fukasaku first read the novel his film was based on, it resonated with him because of traumatic personal experiences. As he related in a director’s statement for Battle Royale:
I immediately identified with the 9th graders in the novel, Battle Royale. I was fifteen when World War II came to an end. By then, my class had been drafted and was working in a munitions factory.
In July 1945, we were caught up in artillery fire. Up until then, the attacks had been air raids and you had a chance of escaping from those. But with artillery, there was no way out. It was impossible to run or hide from the shells that rained down. We survived by diving for cover under our friends.
After the attacks, my class had to dispose of the corpses. It was the first time in my life I’d seen so many dead bodies. As I lifted severed arms and legs, I had a fundamental awakening … everything we’d been taught in school about how Japan was fighting the war to win world peace, was a pack of lies. Adults could not be trusted.
The emotions I experienced then–an irrational hatred for the unseen forces that drove us into those circumstances, a poisonous hostility towards adults, and a gentle sentimentality for my friends–were a starting point for everything since. This is why, when I hear reports about recent outbreaks of teenage violence and crimes, I cannot easily judge or dismiss them.
This is the point of departure for all my films. Lots of people die in my films. They die terrible deaths. But I make them this way because I don’t believe anyone would ever love or trust the films I make, any other way.
BATTLE ROYALE, my 60th film, returns irrevocably to my own adolescence. I had a great deal of fun working with the 42 teenagers making this film, even though it recalled my own teenage battleground.
In the half century since JFK’s assassination, researchers have yet to determine conclusively who did it, but what can be proven definitively is that the official story is a lie. Despite overwhelming evidence that Lee Harvey Oswald could not possibly have killed JFK in the way claimed by the Warren Commission Report, many in the corporate media still cling to the belief that he was the “lone gunman”. Most who have examined the assassination in depth suspect the involvement of a cabal of different factions, which JFK himself referenced in his famous address to the American Newspaper Publishers Association in 1961.
Some of the people involved include those most opposed to the truth getting out. Documented evidence reveals the CIA was so threatened by independent researchers looking into JFK’s murder that it coined a new ad hominem attack to discredit them: “conspiracy theorists”. Worse than assassinating people’s character is actual assassination, which may have been behind the suspiciously large number of witnesses holding important information for investigators who died in mysterious circumstances. Others involved in the “anti-conspiracy conspiracy” might include select heads of corporate news outlets who to this day continue to push the official story and discredit all information which contradicts it. Of course, many people within the groups involved in the cover-up had no direct role in the planning and execution of the assassination. Some might believe mass distrust of government would be too damaging were the truth to get out; others could have been doing favors for powerful allies, been paid off, or been pressured into cooperating.
As much as the establishment would like us to ridicule and dismiss those who believe anything other than the official story, more people than ever are open to alternative theories: so many have been proven true over time, there’s greater awareness that governments and corporations do lie, cheat, steal and kill, and the internet has empowered more people than ever to research on their own or at least cross-check information. Though it’s likely we will never see justice for JFK, because of the work of “conspiracy theorists” we are now more aware of the scope of government/corporate criminality and the connections between government, Wall Street, war profiteers, and the criminal underworld. Information uncovered in relation to the JFK assassination provides a glimpse into “deep politics” and an alternative historical narrative. Connections can be drawn from JFK to Watergate, Iran-Contra, and 9/11 (which is not too surprising, because they benefit the same military-industrial geopolitical factions). As new evidence continues to expand our understanding of what took place on 11/22/63, such knowledge could also help us understand government/corporate crimes of the present and future.
On the Smells Like Human Spirit podcast, host Guy Evans interviews author and activist Dr. Michael Parenti and whistleblower/ex-secret service agent Abraham Bolden on the events of 11/22/63:
I’ve never subscribed to the notion that intelligence is fixed and unchanging. Intelligence (and its opposite) can be taught and reinforced, though there may be varying ranges determined by factors such as diet, habits, genetics, personality, environment, time, resources and relationships. There are also different categories of intelligence that aren’t equally valued by society, but the type involved in critical thinking and creative problem solving is what our society seems to need most. On the latest episode of The Bulletproof Executive podcast, host Dave Asprey and researcher/science writer Andrea Kuszewski discuss methods to improve this type of intelligence, among other topics including the relationship between extreme altruism and sociopathy.
Listen to the full interview here:
Kuszewski previously expanded on 5 ways to maximize cognitive potential as a guest blogger for Scientific American. Even for those with no need or desire to boost intelligence, the same methods can be used for sustaining intelligence and preventing the cognitive decline associated with aging. Here are the five recommendations and her conclusion:
1. Seek Novelty
It is no coincidence that geniuses like Einstein were skilled in multiple areas, or polymaths, as we like to refer to them. Geniuses are constantly seeking out novel activities, learning a new domain. It’s their personality.
There is only one trait out of the “Big Five” from the Five Factor Model of personality (Acronym: OCEAN, or Openness, Conscientiousness, Extroversion, Agreeableness, and Neuroticism) that correlates with IQ, and it is the trait of Openness to new experience. People who rate high on Openness are constantly seeking new information, new activities to engage in, new things to learn—new experiences in general [2].
When you seek novelty, several things are going on. First of all, you are creating new synaptic connections with every new activity you engage in. These connections build on each other, increasing your neural activity, creating more connections to build on other connections—learning is taking place.
An area of interest in recent research [pdf] is neural plasticity as a factor in individual differences in intelligence. Plasticity refers to the number of connections made between neurons, how that affects subsequent connections, and how long-lasting those connections are. Basically, it means how much new information you are able to take in, and whether you are able to retain it, making lasting changes to your brain. Constantly exposing yourself to new things helps put your brain in a primed state for learning.
Novelty also triggers dopamine (I have mentioned this before in other posts), which not only kicks motivation into high gear, but it stimulates neurogenesis—the creation of new neurons—and prepares your brain for learning. All you need to do is feed the hunger.
Excellent learning condition = Novel activity → triggers dopamine → creates a higher motivational state → fuels engagement and primes neurons → neurogenesis can take place + increase in synaptic plasticity (increase in new neural connections, or learning).
As a follow-up of the Jaeggi study, researchers in Sweden [pdf] found that after 14 hours of training working memory over 5 weeks’ time, there was an increase of dopamine D1 binding potential in the prefrontal and parietal areas of the brain. This particular dopamine receptor, the D1 type, is associated with neural growth and development, among other things. This increase in plasticity, allowing greater binding of this receptor, is a very good thing for maximizing cognitive functioning.
Take home point: Be an “Einstein”. Always look to new activities to engage your mind—expand your cognitive horizons. Learn an instrument. Take an art class. Go to a museum. Read about a new area of science. Be a knowledge junkie.
2. Challenge Yourself
There are absolutely oodles of terrible things written and promoted on how to “train your brain” to “get smarter”. When I speak of “brain training games”, I’m referring to the memorization and fluency-type games, intended to increase your speed of processing, etc, such as Sudoku, that they tell you to do in your “idle time” (complete oxymoron, regarding increasing cognition). I’m going to shatter some of that stuff you’ve previously heard about brain training games. Here goes: They don’t work. Individual brain training games don’t make you smarter—they make you more proficient at the brain training games.
Now, they do serve a purpose, but it is short-lived. The key to getting something out of those types of cognitive activities sort of relates to the first principle of seeking novelty. Once you master one of those cognitive activities in the brain-training game, you need to move on to the next challenging activity. Figure out how to play Sudoku? Great! Now move along to the next type of challenging game. There is research that supports this logic.
A few years ago, scientist Richard Haier wanted to see if you could increase your cognitive ability by intensely training on novel mental activities for a period of several weeks. His team used the video game Tetris as the novel activity, and used people who had never played the game before as subjects (I know—can you believe they exist?!). What they found was that after training for several weeks on Tetris, the subjects experienced an increase in cortical thickness, as well as an increase in cortical activity, as evidenced by the increase in how much glucose was used in that area of the brain. Basically, the brain used more energy during those training times, and bulked up in thickness—which means more neural connections, or new learned expertise—after this intense training. And they became experts at Tetris. Cool, right?
Here’s the thing: After that initial explosion of cognitive growth, they noticed a decline in both cortical thickness, as well as the amount of glucose used during that task. However, they remained just as good at Tetris; their skill did not decrease. The brain scans showed less brain activity during the game-playing, instead of more, as in the previous days. Why the drop? Their brains got more efficient. Once their brain figured out how to play Tetris, and got really good at it, it got lazy. It didn’t need to work as hard in order to play the game well, so the cognitive energy and the glucose went somewhere else instead.
Efficiency is not your friend when it comes to cognitive growth. In order to keep your brain making new connections and keeping them active, you need to keep moving on to another challenging activity as soon as you reach the point of mastery in the one you are engaging in. You want to be in a constant state of slight discomfort, struggling to barely achieve whatever it is you are trying to do, as Einstein alluded to in his quote. This keeps your brain on its toes, so to speak. We’ll come back to this point later on.
3. Think Creatively
When I say thinking creatively will help you achieve neural growth, I am not talking about painting a picture, or doing something artsy, like we discussed in the first principle, Seeking Novelty. When I speak of creative thinking, I am talking about creative cognition itself, and what that means as far as the process going on in your brain.
Contrary to popular belief, creative thinking does not equal “thinking with the right side of your brain”. It involves recruitment from both halves of your brain, not just the right. Creative cognition involves divergent thinking (a wide range of topics/subjects), making remote associations between ideas, switching back and forth between conventional and unconventional thinking (cognitive flexibility), and generating original, novel ideas that are also appropriate to the activity you are doing. In order to do this well, you need both right and left hemispheres working in conjunction with each other.
Several years ago, Dr Robert Sternberg, former Dean at Tufts University, opened the PACE (Psychology of Abilities, Competencies, and Expertise) Center, in Boston. Sternberg has been on a quest to not only understand the fundamental concept of intelligence, but also to find ways in which any one person can maximize his or her intelligence through training, and especially, through teaching in schools.
Here Sternberg describes the goals of the PACE Center, which was started at Yale:
“The basic idea of the center is that abilities are not fixed but rather flexible, that they’re modifiable, and that anyone can transform their abilities into competencies, and their competencies into expertise,” Sternberg explains. “We’re especially interested in how we can help people essentially modify their abilities so that they can be better able to face the tasks and situations they’re going to confront in life.”
As part of a research study, The Rainbow Project [pdf], he created not only innovative methods of creative teaching in the classroom, but generated assessment procedures that tested the students in ways that got them to think about the problems in creative and practical ways, as well as analytical, instead of just memorizing facts.
Sternberg explains,
“In the Rainbow Project we created assessments of creative and practical as well as analytical abilities. A creative test might be: ‘Here’s a cartoon. Caption it.’ A practical problem might be a movie of a student going into a party, looking around, not knowing anyone, and obviously feeling uncomfortable. What should the student do?”
He wanted to find out if by teaching students to think creatively (and practically) about a problem, as well as for memory, he could get them to (i) Learn more about the topic, (ii) Have more fun learning, and (iii) Transfer that knowledge gained to other areas of academic performance. He wanted to see if by varying the teaching and assessment methods, he could prevent “teaching to the test” and get the students to actually learn more in general. He collected data on this, and boy, did he get great results.
In a nutshell? On average, the students in the test group (the ones taught using creative methods) received higher final grades in the college course than the control group (taught with traditional methods and assessments). But—just to make things fair—he also gave the test group the very same analytical-type exam that the regular students got (a multiple choice test), and they scored higher on that test as well. That means they were able to transfer the knowledge they gained using creative, multimodal teaching methods, and score higher on a completely different cognitive test of achievement on that same material. Sound familiar?
4. Do Things the Hard Way
I mentioned earlier that efficiency is not your friend if you are trying to increase your intelligence. Unfortunately, many things in life are centered on trying to make everything more efficient. This is so we can do more things, in a shorter amount of time, expending the least amount of physical and mental energy possible. However, this isn’t doing your brain any favors.
Take one object of modern convenience, GPS. GPS is an amazing invention. I am one of those people GPS was invented for. My sense of direction is terrible. I get lost all the time. So when GPS came along, I was thanking my lucky stars. But you know what? After using GPS for a short time, I found that my sense of direction was worse. If I failed to have it with me, I was even more lost than before. So when I moved to Boston—the city that horror movies and nightmares about getting lost are modeled after—I stopped using GPS.
I won’t lie—it was painful as hell. I had a new job which involved traveling all over the burbs of Boston, and I got lost every single day for at least 4 weeks. I got lost so much, I thought I was going to lose my job due to chronic lateness (I even got written up for it). But—in time, I started learning my way around, due to the sheer amount of practice I was getting at navigation using only my brain and a map. I began to actually get a sense of where things in Boston were, using logic and memory, not GPS. I can still remember how proud I was the day a friend was in town visiting, and I was able to effectively find his hotel downtown with only a name and a location description to go on—not even an address. It was like I had graduated from navigational awareness school.
Technology does a lot to make things in life easier, faster, more efficient, but sometimes our cognitive skills can suffer as a result of these shortcuts, and hurt us in the long run. Now, before everyone starts screaming and emailing my transhumanist friends to say that I’ve sinned by trashing tech—that’s not what I’m doing.
Look at it this way: Driving to work takes less physical energy, saves time, and it’s probably more convenient and pleasant than walking. Not a big deal. But if you drove everywhere you went, or spent your life on a Segway, even to go very short distances, you aren’t going to be expending any physical energy. Over time, your muscles will atrophy, your physical state will weaken, and you’ll probably gain weight. Your overall health will probably decline as a result.
Your brain needs exercise as well. If you stop using your problem-solving skills, your spatial skills, your logical skills, your cognitive skills—how do you expect your brain to stay in top shape—never mind improve? Think about modern conveniences that are helpful, but when relied on too much, can hurt your skill in that domain. Translation software: amazing, but my multilingual skills have declined since I started using it more. I’ve now forced myself to struggle through translations before I look up the correct format. Same goes for spell-check and autocorrect. In fact, I think autocorrect was one of the worst things ever invented for the advancement of cognition. You know the computer will catch your mistakes, so you plug along, not even thinking about how to spell any more. As a result of years of relying on autocorrect and spell-check, as a nation, are we worse spellers? (I would love someone to do a study on this.)
There are times when using technology is warranted and necessary. But there are times when it’s better to say no to shortcuts and use your brain, as long as you can afford the luxury of time and energy. Walking to work every so often or taking the stairs instead of the elevator a few times a week is recommended to stay in good physical shape. Don’t you want your brain to be fit as well? Lay off the GPS once in a while, and do your spatial and problem-solving skills a favor. Keep it handy, but try navigating naked first. Your brain will thank you.
5. Network
And that brings us to the last element to maximize your cognitive potential: Networking. What’s great about this last objective is that if you are doing the other four things, you are probably already doing this as well. If not, start. Immediately.
By networking with other people—either through social media such as Facebook or Twitter, or in face-to-face interactions—you are exposing yourself to the kinds of situations that are going to make objectives 1-4 much easier to achieve. By exposing yourself to new people, ideas, and environments, you are opening yourself up to new opportunities for cognitive growth. Being in the presence of other people who may be outside of your immediate field gives you opportunities to see problems from a new perspective, or offer insight in ways that you had never thought of before. Learning is all about exposing yourself to new things and taking in that information in ways that are meaningful and unique—networking with other people is a great way to make that happen. I’m not even going to get into the social benefits and emotional well-being that are derived from networking, but those are just added perks.
Steven Johnson, author of the book “Where Good Ideas Come From”, discusses the importance of groups and networks for the advancement of ideas. If you are looking for ways to seek out novel situations, ideas, environments, and perspectives, then networking is the answer. It would be pretty tough to implement this “Get Smarter” regimen without making networking a primary component. The greatest thing about networking: everyone involved benefits. Collective intelligence for the win!
…And I have a parting question for you to ponder as well: If we have all of this supporting data, showing that these teaching methods and ways of approaching learning can have such a profound positive effect on cognitive growth, why aren’t more therapy programs or school systems adopting some of these techniques? I’d love to see this as the standard in teaching, not the exception. Let’s try something novel and shake up the education system a little bit, shall we? We’d raise the collective IQ something fierce.
Intelligence isn’t just about how many levels of math courses you’ve taken, how fast you can solve an algorithm, or how many vocabulary words you know that are longer than six characters. It’s about being able to approach a new problem, recognize its important components, and solve it—then take that knowledge gained and put it towards solving the next, more complex problem. It’s about innovation and imagination, and about being able to put that to use to make the world a better place. This is the kind of intelligence that is valuable, and this is the type of intelligence we should be striving for and encouraging.
Are we naturally good or naturally evil? Cognitive scientist Paul Bloom argues in a new book that we’re both.
In “Just Babies: The Origins of Good and Evil” (Crown), the developmental psychologist and Yale professor takes on the nature of morality, surveying vast research spanning evolutionary biology to philosophy and drawing on everyone from Sigmund Freud to Louis C.K.
His conclusion? Babies have the capacity for empathy and compassion, possess a limited understanding of justice and have the ability to judge. Yet they navigate not along colour lines but as Us versus Them, usually landing squarely in the Us camp.
A conversation with Paul Bloom:
AP: What light do you shed on the “moral sense” of babies?
Bloom: We’re born with this extraordinary moral sense. A sense of right and wrong just comes naturally to humans and shows up in the youngest babies we can study. But this morality is limited. I think tragically limited. So we are morally attuned to those around us, to our kin, to our friends, to those we interact with, and we are utterly cold-blooded toward strangers. To some extent I think babies are natural-born bigots. They are strongly attuned to break the world into Us versus Them and have no moral feelings at all toward the Them, and this shows up all through development.
So in some way, although a lot of morality is inborn, I think the great success of humans … is expanding and transcending this inborn morality. You and I believe that, you know, not only is it wrong to kill somebody, it’s wrong to kill somebody from anywhere around the world. We might also agree that we’re obliged to help people in trouble, even if they look different from us or are from a different land.
We have notions of fairness and equity and justice that, again, extend more broadly, and although we might favour our own group in some ways, consciously or unconsciously, we’re probably not racist. We probably think that racism is wrong, and that a good moral system should treat all humans more or less the same, but none of that is present in the mind of a baby.
AP: Is it a revelation that we create the environments that can transform a partially moral baby into a very moral adult?
Bloom: I think in some sense it is not. I think any good parent knows that you raise a kid into a moral kid not by, you know, imparting moral lessons and making moral pronouncements, but by shaping the environment in ways that bring out our better selves. When you want to make people good people you don’t just say, ‘Oh, try real hard.’ You try to structure their environment so as to bring out their better aspects.
AP: Is that surprising?
Bloom: I think it’s surprising the extent to which it works and the extent to which the alternative fails. So, for instance, many people believe that giving people moral stories, expressing through literature moral values, has a profound effect on people’s lives. The actual evidence says it has no effect at all. It’s just zero. In fact, there are some studies showing that if you give kids stories about being generous and kind it paradoxically makes them a little bit meaner, roughly between the ages of 4 and 10. Preaching in general with kids often backfires.
AP: Where do serial killers come from?
Bloom: Serial killers are very unusual people. … We know that there are genetic differences in people’s empathy, in people’s compassion and how much they care about other people, in their ability to control violent rages, for example, and I’m sure a serial killer is somebody who has the genetic short end of the stick. Then you toss in certain environments. Your typical serial killer had a very unhappy childhood.
AP: What about being hard-wired at birth?
Bloom: Some people are more likely to be serial killers than other people due to accidents of genes. I am far more likely to be a murderer or a rapist or a serial killer than you are because I’m a man. There’s some evidence that people who turn out to be psychopaths, even murderous psychopaths, have the short end of the genetic stick but there’s all sorts of environmental factors. … Fifty years ago, slapping one’s wife or raping one’s wife would be viewed as comical, legitimate, certainly not a crime. Now it’s the sort of thing that only a monster would do, and so we have tremendous evidence for profound changes that have nothing to do with genes.
AP: You discuss “hodgepodge morality.” Is there such a thing in babies?
Bloom: I think we naturally have multiple moral systems, multiple responses. Some of our responses are created by disgust, some by empathy, some by a sense of justice, some by a sense of fairness, some by self-interest. We respond to kin, to our family members in different ways than we respond to strangers in all sorts of ways that don’t fall into any elegant philosophical theory. And I think this is true for babies, too.
Babies are moral beings but they aren’t moral philosophers. They don’t have some sort of coherent theory. Rather they have a series of gut reactions, a series of moral triggers that they respond to. What we find in our research is all sorts of moral capacities on the part of babies. What we don’t find is some kind of careful, contemplative theory.
AP: Is that a bad thing?
Bloom: It isn’t. It’s the way we are, one way or another, but if you set yourself the task of constructing a society where everybody lives and everybody follows the same rules and adheres to the same notions, then you do want, to some degree, a consistent and coherent theory.
So it may be a good theory of psychology to say that a white person naturally cares a lot more toward another white person than toward a black person, and that’s an instinctive response that could develop in certain societies, but from the standpoint of constructing a theory of what actually is good, how we should live our lives, we would say, ‘Well that’s too hodgepodge for us. It’s inconsistent. It’s actually a cruel way for the mind to work.’
AP: You write about conflicts in research on racial bias in young children.
Bloom: For kids there’s a lot of evidence that they’re very strongly biased on Us versus Them if you get them to do it on the basis of things like different colored T-shirts, for example, but race and skin colour isn’t an automatic way of dividing up the world. So you take a two- or three-year-old and typically a two- or three-year-old shows no signs of being racist in any way. When you get older, if kids are in an environment where blacks and whites interact and they’re totally mellow with each other and there’s not much conflict, they’ll see black and white but it won’t matter at all. If you’re in an environment where it matters then it will matter.
Children are extremely prone and very ready to divide the world into groups, but the groups that they focus on are determined through learning.
I recently rediscovered the following article (through Disinfo.com) which seems more applicable to the world today than when it was originally written more than ten years ago. It may help explain certain questions about our culture such as:
Why so many are obsessed with youth while topics of aging and death are often avoided.
Why the most popular narratives in mass media always have a happy resolution.
Why children are regularly taught to suppress dark emotions and are shielded from them in stories and in life.
Why some choose to ignore or deny current events.
Why so many people compulsively keep themselves distracted at every waking moment.
Why approximately 1 in 10 Americans are now on antidepressant drugs.
These may all be symptoms of a multitude of complex problems, but they’re also examples of the unhealthy ways we cope with them. In my view, being open to and understanding dark emotions is necessary to attain true emotional resiliency to endure life’s challenges. It’s important not just for survival, but because it enables us to act more effectively to rectify socio-economic problems, which are connected to both personal and global ones.
Grief, fear and despair are part of the human condition. Each of these emotions is useful, says psychotherapist Miriam Greenspan, if we know how to listen to them.
I was brought to the practice of mindfulness more than two decades ago by the death of my first child. Aaron died two months after he was born, never having left the hospital. Shortly after that, a friend introduced me to a teacher from whom I learned the basics of Vipassana meditation: how to breathe mindfully and meditate with “choiceless” awareness. I remember attending a dharma talk in a room full of fifty meditators. The teacher spoke about the Four Noble Truths. Life is inherently unsatisfactory, he said. The ego’s restless desires are no sooner fulfilled than they find new objects. Craving and aversion breed suffering. One of his examples was waiting in line for a movie and then not getting in.
I asked: “But what if you’re not suffering because of some trivial attachment? What if it’s about something significant, like death? What if you’re grieving because your baby was born with brain damage and died before he had a chance to live?” I wept openly, expecting that there, of all places, my tears would be accepted.
The teacher asked, “How long has your son been dead?” When I told him it had been two months, his response was swift: “Well then, that’s in the past now, isn’t it? It’s time to let go of the past and live in the present moment.”
I felt reprimanded for feeling sad about my son’s death. The teacher’s response baffled me. Live in the present? My present was suffused with a wrenching sorrow—a hole in my heart that bled daily. But the present moment, as he conceived of it, could be cleanly sliced away from and inured against this messy pain. Divested of grief, an emotionally sanitized “present moment” was served up as an antidote for my tears. However well meaning, the message was clear: Stop grieving. Get over it. Move on.
This is a familiar message. Its unintended emotional intolerance often greets those who grieve, especially if they do so openly. I call this kind of intolerance “emotion-phobia”: a pervasive fear and reflexive avoidance of difficult emotions in oneself and/or others. This is accompanied by a set of unquestioned normative beliefs about the “negativity” of painful feelings.
Emotion-phobia is endemic to our culture and perhaps to patriarchal culture in general. You’ll find it in sub-cultures as different as spiritual retreats, popular self-help books and psychiatric manuals. In fact, my teacher’s supposedly Buddhist response was very much in line with the prevailing psychiatric view of grief. According to the Diagnostic and Statistical Manual IV (the “bible” of psychiatry), the patient who is grieving a death is allotted two months for “symptoms” such as sadness, insomnia and loss of appetite before being diagnosable with a “Major Depressive Disorder.” Grief, perhaps the most inevitable of all human emotions, given the unalterable fact of mortality, is seen as an illness if it goes on too long. But how much is too long? My mother, a Holocaust survivor, grieved actively for the first decade of my life. Was this too long a grief for genocide? Time frames for our emotions are nothing if not arbitrary, but appearing in a diagnostic and statistical manual, they attain the ring of truth. The two-month limit is one of many examples of institutional psychiatry’s emotion-phobia.
Emotions like grief, fear and despair are as much a part of the human condition as love, awe and joy. They are our natural and inevitable responses to existence, so long as loss, vulnerability and violence come with the territory of being human. These are the dark emotions, but by dark, I don’t mean that they are bad, unwholesome or pathological. I mean that as a culture we have kept these emotions in the dark—shameful, secret and unseen.
Emotion-phobia dissociates us from the energies of these emotions and tells us they are untrustworthy, dangerous and destructive. Like other traits our culture distrusts and devalues—vulnerability, for instance, and dependence—emotionality is associated with weakness, women and children. We tend to regard these painful emotions as signs of psychological fragility, mental disorder or spiritual defect. We suppress, intellectualize, judge or deny them. We may use our spiritual beliefs or practices to bypass their reality.
Few of us learn how to experience the dark emotions fully—in the body, with awareness—so we end up experiencing their energies in displaced, neurotic or dangerous forms. We act out impulsively. We become addicted to a variety of substances and/or activities. We become depressed, anxious or emotionally numb, and aborted dark emotions are at the root of these characteristic psychological disorders of our time. But it’s not the emotions themselves that are the problem; it’s our inability to bear them mindfully.
Every dark emotion has a value and purpose. There are no negative emotions; there are only negative attitudes towards emotions we don’t like and can’t tolerate, and the negative consequences of denying them. The emotions we call “negative” are energies that get our attention, ask for expression, transmit information and impel action. Grief tells us that we are all interconnected in the web of life, and that what connects us also breaks our hearts. Fear alerts us to protect and sustain life. Despair asks us to grieve our losses, to examine and transform the meaning of our lives, to repair our broken souls. Each of these emotions is purposeful and useful—if we know how to listen to them.
But if grief is barely tolerated in our culture, even less are fear and despair. The fact is we are all afraid and act as if we’re not. We fear the sheer vulnerability of existence; we fear its unpredictability. When we are unable to feel our fear mindfully, we turn it into anger, psychosomatic ailments or a host of “anxiety disorders”—displacements of fears we can’t feel or name.
According to experts, some 50 million people in this country suffer from phobias at some point in their lives, and millions more are diagnosed with other anxiety disorders. One reason is that we’ve lost touch with the actual experience of primal, natural fear. When fear is numbed, we learn little about what it’s for—its inherent usefulness as an alarm system that we ignore at our peril. Benumbed fear is especially dangerous when it becomes an unconscious source of vengeance, violence and other destructive acts. We see this acted out on the world stage as much as in the individual psyche.
As for despair, how many among us have not experienced periods of feeling empty, desolate, hopeless, brooding over the darkness in our world? This is the landscape of despair. Judging from my thirty years of experience as a psychotherapist, I would say that despair is common, yet we don’t speak of despair anymore. We speak of clinical depression, serotonin-deficiency, biochemical disorder and the new selective serotonin-reuptake inhibitors. We treat the “illness” with a host of new medications. In my view, “depression” is the word we use in our highly medicalized culture for a condition of chronic despair—despair that is stuck in the body and toxified by our inability to bear it mindfully. When we think of all despair as a mental disorder or a biochemical illness, we miss the spiritual metamorphosis to which it calls us.
In retrospect, a more helpful answer from my meditation teacher (and one more in line with the Buddha’s teachings) might have been, If you are grieving, do so mindfully. Pay attention to your grief. Stop and listen to it. Befriend it and let it be. The dark emotions are profound but challenging spiritual teachers, like the Zen master who whacks you until you develop patience and spiritual discipline. When grief shattered my heart after Aaron’s death, that brought with it an expansion, the beginning of my experience of a Self larger than my broken ego. Grieving mindfully—without recourse to suppression, intellectualization or religious dogmatism—made me a happier person than I’d ever been.
What I learned by listening closely to grief was a transformational process I call “the alchemy of the dark emotions.” Many years after Aaron’s death, after a second radiantly healthy child and a third who was born with a mysterious neuromotor disorder, I began to write about these alchemies—from grief to gratitude, fear to joy, and despair to faith—that I had experienced in my own life and witnessed countless times in my work as a psychotherapist.
The alchemy of the dark emotions is a process that cannot be forced, but it can be encouraged by cultivating certain basic emotional skills. The three basic skills are attending to, befriending and surrendering to emotions that make us uncomfortable. Attending to our dark emotions is not just noticing a feeling and then distancing ourselves from it. It’s about being mindful of emotions as bodily sensations and experiencing them fully. Befriending emotion is how we extend our emotional attention spans. Once again, this is a body-friendly process—getting into the body, not away from it into our thoughts. At the least, it’s a process of becoming aware of how our thoughts both trigger emotions and take us away from them. Similarly, surrender is not about letting go but about letting be. When you are open to your heart’s pain and to your body’s experience of it, emotions flow in the direction of greater healing, balance and harmony.
Attending to, befriending and surrendering to grief, we are surprised to discover a profound gratitude for life. Attending to, befriending and surrendering to fear, we find the courage to open to our vulnerability and we are released into the joy of knowing that we can live with and use our fear wisely. Attending to, befriending and surrendering to despair, we discover that we can look into the heart of darkness in ourselves and our world, and emerge with a more resilient faith in life.
Because we are all pretty much novices at this process, we need to discipline ourselves to be mindful and tolerant of the dark emotions. This is a chaotic, non-linear process, but I have broken it down to seven basic steps: 1) intention, 2) affirmation, 3) sensation, 4) contextualization, 5) the way of non-action, 6) the way of action and 7) the way of surrender.
Intention is the means by which the mind, heart and spirit are engaged and focused. Transforming the dark emotions begins when we set our intention on using our grief, fear and despair for the purpose of healing. It is helpful to ask yourself: What is my best intention with regard to the grief, fear and despair in my life? What would I want to learn or gain from this suffering?
The second step in using the dark emotions for growth is affirming their wisdom. This means changing the way we think about how we feel, and developing and cultivating a positive attitude toward challenging feelings.
Emotional intelligence is a bodily intelligence, so you have to know how to listen to your body. The step I call “sensation” includes knowing how to sense and name emotions as we experience them in the body. We need to become more familiar and friendly with the actual physical sensations of emotional energy. Meditation, T’ai chi, yoga and other physical practices that cultivate mindfulness are particularly useful. How does your body feel when you are sad, fearful or despairing? What kinds of stories does your mind spin about these emotions? What happens when you simply observe these sensations and stories, without trying to understand, analyze or change anything?
In step four, contextualization, you acquaint yourself with the stories you usually tell yourself about your emotional suffering, and then place them in a broader social, cultural, global or cosmic context. In enlarging your personal story, you connect it to a larger story of grief, fear or despair in the world. This gets us out of the isolation and narcissism of our personal history, and opens us to transforming our suffering into compassion.
Step five, the way of non-action, is the skill that psychologists call “affect tolerance.” This step extends our ability to befriend the pain of the dark emotions in the body. When you can tolerate the pain of grief, fear and despair without acting prematurely to escape it, you are practicing the way of non-action. Again, it is helpful to meditate on your emotions with the intention of really listening to them. What does your grief, fear or despair ask of you? In meditation, listen to the answers that come from your heart, rather than from your analytic mind.
The dark emotions ask us to act in some way. While the way of non-action builds our tolerance for dark emotional energy, step six is about finding an action or set of actions that puts this energy to good use. In the way of action, we act not in order to distract ourselves from emotion but in order to use its energy with the intention of transformation. The dark emotions call us to find the right action, to act with awareness and to observe the transformations that ensue, however subtle. Action can be strong medicine in times of trouble. If you are afraid, help someone who lives in fear. For example, volunteer at a battered women’s shelter. If you’re sad and lonely, work for the homeless. If you’re struggling with despair, volunteer at a hospice. Get your hands dirty with the emotion that scares you. This is one of the best ways to find hope in despair, to find connection in a shared grief and to discover the joy of working to create a less broken world.

Finally, step seven, the way of surrender, is the art of conscious emotional flow. Emotional flow is something that happens automatically when you know how to attend to and befriend your emotions. When we are in flow with emotion, the energy becomes transformative, opening us to unexpected vistas.
When we look deeply into the dark emotions in our lives, we find both the universality of suffering and how much suffering is unnecessary, the result of social inequities, oppression, large scale violence and trauma. Our awareness both of the universality of suffering and of its socially created manifestations is critical to the healing journey. Knowing how our grief, fear and despair may be connected to larger emotional currents and social conditions de-pathologizes these emotions, allowing us to accept and tolerate them more fruitfully, and with more compassion for ourselves and others. We begin to see the dark emotions as messengers, information-bearers and teachers, rather than “negative” energies we must subdue, tame or deny. We tend to think of our “negative” emotions as signs that there’s something wrong with us. But the deepest significance of the feelings is simply our shared human vulnerability. When we know this deeply, we begin to heal in a way that connects rather than separates us from the world.
At Orion Magazine, authors William Cronon and Michael Pollan share a stimulating conversation about how language shapes our world. They cover questions such as “what is wild?”, “what is cultivated?”, and “what can these ideas teach us about our relationship to landscape?”. What I found most compelling was the last part of the conversation where they talk about the power and importance of storytelling:
Bill: Right. Ecology, storytelling, history—they all render connections visible. We make that which is invisible visible through story, and thereby reveal people’s relationships to other living things.
Michael: Stories establish canons of beauty, too. There is a role for art in changing cultural norms about what’s worth valuing. One hundred fifty years ago, certain people looked at a farm and saw what you might see if you look today at a nuclear power plant or some other degraded landscape. Part of the reason we tell stories is to create fresh value for certain landscapes, certain relationships.
Bill: And stories make possible acts of moral recognition that we might not otherwise experience. They help us see our own complicity in things we don’t ordinarily see as connected to ourselves.
Michael: Yes, exactly. That recognition can help remove the condescension in so much environmental writing by showing us that, look, these things we abhor are done in our name, and we are complicit in them, and we need to take account of them. It was Wendell Berry’s idea that the environmental crisis is a crisis of character. The big problem is the result of all the little problems in our everyday lives. That can be a guilt trip, but it doesn’t have to be. You can tell that story in ways that empower people.
Storytelling can also help us find hopeful solutions. For example, when I was writing The Omnivore’s Dilemma and I went to Joel Salatin’s farm in Virginia, I learned how his grazing worked—intensive rotational grazing—and he explained to me what happens under the surface, how every time the ruminants come through and shear that pasture and reduce that leaf mass, a roughly equivalent amount of root mass is broken down and turned into soil. I learned that he takes vast amounts of food off this pastureland, without subtracting anything. To the contrary, the sun is feeding the grass, and the grass is feeding the ruminants; the ruminants are feeding us, and they’re also feeding the soil.
I suddenly saw a whole other way of conceiving our relationship to nature, that there are systems that exist, and could exist, that are non-zero-sum. There is a free lunch in nature: it’s solar energy, which means it isn’t necessarily true that for us to feed ourselves we have to diminish the world.
When you tell an audience that story, it fills them with hope and a sense of possibility, and that’s a function of storytelling. But, of course, it isn’t always so neat. There are questions of scale, and if you eat meat, there are problems with cattle. But I’m always looking for stories that refresh this narrative about nature that we’re so stuck in.
Bill: Messy stories invite us into politics. They also invite us to laugh at ourselves. And those things together—the ability to laugh, to experience hope, to be inspired toward action at the personal and political levels—these strike me as the work of engaged storytelling in a world we’re trying to change for the better.
Michael: I do have a lot of faith in the power of stories to do things. My greatest thrill as a writer is when I see people changed by the work, when people tell me that they’ve changed their behavior in some way because of something they’ve read.
One of the things I’ve fought very hard to do with my editors is to talk about alternatives when I talk about problems. For example, if I’m writing an incredibly dark story about industrial meat production and following a cow through the feedlot and slaughterhouse, I really want three paragraphs on the alternative to this system, which is to say, grass-finished beef. Those three paragraphs have more impact than anything else in the piece. And I still hear from ranchers that it was on the day that an article on that topic came out that we began to see the stirrings of a new market for grass-finished beef. “We no longer send them to the auction barn right away,” they tell me. “We’re finishing on grass now.”
Bill: That’s a good story about storytelling.
Michael: You have to pass through the dark wilderness of the feedlot before you can get there, but I think that there’s an appetite for hope that journalists don’t often satisfy.
I’ve met people, in their twenties especially, who really hate the model of the investigative article that tells them how messed up things are and doesn’t point to some alternative. True, the alternative you’re proposing can seem tacked on, and it can be incommensurate with the scale of the evils—but I think people want hope, a course of action they can take. This is something many journalists are missing right now. I think if our writing doesn’t include that dimension in some way, we lose people.
Bill: It strikes me that you’re pointing to a great tradition in the environmental movement, which is the power of good storytelling, going back to Rachel Carson.
Michael: She was incredibly effective rhetorically. Silent Spring is a very sophisticated piece of work.
Bill: It’s stunningly done.
Michael: It’s stunningly done. And it speaks to the power of fictional ideas like wilderness. Carson understood that, even if you’re writing about science, narrative is important. The trick I learned from her is never to talk about “neurotoxins”; instead, you tell the story of the molecule in the cell. Because there’s a narrative everywhere, even at the level of molecules.
Bill: Maybe that’s a good note for us to end on, don’t you think? The poet Muriel Rukeyser once said that “the world is made of stories, not of atoms.” When we lose track of the narratives that human beings need to suffuse their lives and the world with meaning, we forget what makes the world worth saving. Telling stories is how we remember.