After the Crash

Dispatches From a Long Recovery (Est. 10/2024)

Media’s Grim Addiction to Perseverance Porn

By Adam Johnson

Source: FAIR

You’ve seen or heard or read the personal interest story a thousand times: An enterprising seven-year-old collects cans to save for college (ABC7, 2/8/17), a man with unmatched moxie walks 15 miles to his job (Today, 2/20/17), a low-wage worker buys shoes for a kid whose mother can’t afford them (Fox5, 12/14/16), an “inspiring teen” goes right back to work after being injured in a car accident (CBS News, 12/16/16). All heartwarming tales of perseverance in the face of impossible odds—and all ideological agitprop meant to obscure and decontextualize the harsh reality of dog-eat-dog capitalism.

Man walks eight miles in the snow to get to work every day (ABC 27, 3/14/17). Or was it a teen walking 10 miles in freezing weather to a job interview (New York Daily News, 2/26/13)? Or was it 10 miles to work every day (Times Herald Record, 3/17/17)? Or was it 12 (ABC News, 2/22/17) or 15 (Today, 2/20/17) or 18 (Evening Standard, 2/9/09) or 21 (Detroit Free Press, 1/20/15)? Who cares—their humanity is irrelevant. They’re clickbait, stand-in bootstrap archetypes meant to validate the bourgeois morality of click-happy media consumers.

These stories are typically shared for the purposes of poor-shaming, under the guise of inspirational life advice. “This man is proof we all just need to keep walking, no matter what life throws at us,” insisted Denver ABC7 anchor Anne Trujillo, after sharing one of those stories of a poor person forced to walk thousands of miles a year to survive.

A healthy press would take these anecdotes of “can do” spirit and ask bigger questions, like why are these people forced into such absurd hardship? Who benefits from skyrocketing college costs? Why does the public transit in this person’s city not have subsidies for the poor? Why aren’t employers forced to offer time off for catastrophic accidents? But time and again, the media mindlessly tells the bootstrap human interest story, never questioning the underlying system at work.

One particularly vulgar example was CBS News (12/16/16) referring to an “inspiring” African-American kid who had to work at his fast food job with an arm sling and a neck brace after a car accident. To compound the perseverance porn, he was, at least in part, doing so to help donate to a local homeless charity. Here we have a story highlighting how society has colossally failed its most vulnerable populations—the poor, ethnic minorities, children and the homeless—and the take-home point is, “Ah gee, look at that scrappy kid.”

Journalism is as much—if not more—about what isn’t reported as what is. Here a local reporter is faced with a cruel example of people falling through the cracks of the richest country on Earth, and their only contribution is to cherry-pick one guy who managed—just barely—to cling on to the edge.

Perseverance porn goes hand in hand with the rise of a GoFundMe economy that relies on personal narrative over collective policy, emotional appeals over baseline human rights. $930 million out of the $2 billion raised on GoFundMe since its inception in 2010 was for healthcare expenses, while an estimated 45,000 people die each year due to a lack of medical treatment. Meanwhile, anchors across cable news insist that single-payer healthcare is “unaffordable,” browbeating guests who support it, while populating their broadcasts with these one-off tales of people heroically scraping by.

It’s part of a broader media culture of anecdotes in lieu of the macro, moralizing “success” rather than questioning systemic problems. Perseverance porn may seem harmless, but in highlighting handpicked cases of people overcoming hardship without showing the thousands that didn’t—much less asking broader questions as to what created these conditions—the media traffics in decidedly right-wing tropes. After all, if they can do it, so can you—right?

Anxiety Dream

By Miya Tokumitsu

A 1636 Dutch print depicts a tender domestic scene: a father in his nightdress walks to and fro, soothing a wakeful baby while mom gets some well-deserved sleep. The accompanying verse is equally sweet, assuring us that God, like this kindly father, will comfort us when we become gripped with anxiety and cry out in the night.

But when we wake today, heart pounding at the recollection that we have a big presentation in six hours, many of us might find a last-minute cancellation more conducive to recovering sleep than the idea of a loving God who cradles and sings to us. Adding to our anxiety is the knowledge that the loss of every minute is setting us back. There seems hardly to be sleep enough to go around, much less to share with our loved ones. We know the stats: most Americans sleep a paltry 6.8 hours per night, less than the recommended eight hours. The litany of sleep deprivation consequences is also familiar: obesity, depression, anxiety, loss of libido, and heart disease, among others.

We also instinctively understand that we have a stake in each other’s sleep. In addition to immediate hazards, like overtired drivers taking the wheel or bleary-eyed colleagues gumming up our beautiful spreadsheets, we know that widespread depression and worn-out immune systems affect society broadly, and over the long term. And yet we often understand our sleep in terms of pure individual choice.

For that reason, wilful sleep deprivation remains a cultural ideal. This you-snooze-you-lose mindset was recently captured by internet-marketplace Fiverr’s advertisement poster, which, alarm-like, blared “SLEEP DEPRIVATION IS YOUR DRUG OF CHOICE . . . YOU MIGHT BE A DOER.” After all, what is the condition of sleep, if not an absence of motivation to chase the $5 gigs the company peddles? In this same vein, a 2012 Business Insider slideshow fawned over “19 Successful People Who Barely Sleep.” Marissa Mayer, Yahoo! CEO, got pride of place as slide number one. Slide number three was Donald Trump.

An equally individualistic pro-sleep discourse does exist, primarily in click-bait articles nestled within chum boxes, which limply scold us for watching Netflix in bed. Entering this soporific terrain, sleep-evangelist Arianna Huffington urges readers of her book, The Sleep Revolution, to sleep more, prescribing rituals to maximize its quality, including pre-bedtime soaks with Epsom salts, and counting one’s blessings.

As with our wakefulness, our slumber too is motivated and shaped by anxiety. Those who do protect their eight hours often do so because it helps them perform better at work. It’s no wonder that Huffington, a boss, approves of this motivation for sleep, writing, “It would actually be better for business if employees called in tired, got a little more sleep, and then came in a bit late, rather than call in sick a few days later or, worse, show up sick, dragging themselves through the day while infecting others.”

It may appear that as a society we have conflicting sleep ideals, but really, we’re not so much of two minds as we are fumbling around, trying to work out the role that sleep plays in a prosperous life. We want to get sleep right because we know that doing so is essential to thriving individually—indeed, Thrive is the name Huffington chose for her wellness company—but we fret over the quantity, preparatory rites, and timing of our sleep because sleep lies at the juncture between the private and the social, the biological and the cultural.

Sleep is intensely private: where, when, with (and without) whom, and how we dress and prepare for sleep are intimate and emotional decisions. But sleep is also social: we modify our behavior and expectations on the assumption that those beyond our immediate domiciles—neighbors, colleagues both local and time zones away—are slumbering at certain hours. And although sleep is private, we do want social reassurance that we are sleeping the right way and look down upon those who choose other arrangements. Just mosey over to the comment section of any website discussing infant sleep, and you’ll find accusations of “baby torture,” and remarks like, “You may think you are fine, but no. You did hurt your baby.” Just as eating habits often come with moral or ethical motivations that imply—or outright state—the absence of such morals and ethics in those who eat differently, sleep helps constitute our identity, something we generally like to have affirmed.

Enter the market. There are seemingly endless ways to buy yourself some sleep—books like Huffington’s, herbal teas, white noise machines, Ambien, melatonin, ear plugs tucked into earplug cases, therapy. And if you want to put sleep off—stimulants from espresso to cocaine, late night TV, alarms, gyms that open at 5 am.

Contrary to Huffington’s claim to revolutionary momentousness, it seems someone’s always been around to sell sleep optimization. Historian Sasha Handley writes in her book Sleep in Early Modern England that in the seventeenth and eighteenth centuries the panoply of goods Brits deemed ideal for proper sleep included breathable bed linens, thermometers to help maintain ideal room temperatures, bedclothes including nightcaps and nightcap liners, and even ventilators. “No other daily activity was so heavily governed by principles of good health,” Handley writes, “nor consumed as much time, money, and labour as did sleep.” Yesterday’s silver-gilt ventilator has today become a whole range of electronic devices to track your sleep and analyze which components of your psyche and environment need correction.

We may scream at each other over the “correct” way to sleep, but the truth is that where we come down on these questions—and, indeed, whether we even have a choice at all—is largely a matter of our financial resources and anxieties. As with parenting, there are multitudinous dictums competing over how to do sleep right, but few resources to actually achieve our cultural ideals. For well-to-do families, whether to co-sleep with babies may be a considered choice. No such luck for households that cannot afford a bassinet or crib. Coffee-fuelled all-nighters are technically a choice, but usually one coerced by negative economic consequences for missing a deadline. And what can Huffington say to readers who don’t have a bathtub or even a private bedroom from which to banish their phone?

Luddism and Economic Ideology

Source: the HipCrime Vocab

Smithsonian Magazine has a very good feature on the Luddites, well worth a read. There are many elements you just don’t read in most economic histories; for example, the 40-hour work week was not brought down from the mountaintop by Moses and inscribed in stone tablets, despite what you may have heard elsewhere:

At the turn of 1800, the textile industry in the United Kingdom was an economic juggernaut that employed the vast majority of workers in the North. Working from home, weavers produced stockings using frames, while cotton-spinners created yarn. “Croppers” would take large sheets of woven wool fabric and trim the rough surface off, making it smooth to the touch.

These workers had great control over when and how they worked—and plenty of leisure. “The year was chequered with holidays, wakes, and fairs; it was not one dull round of labor,” as the stocking-maker William Gardiner noted gaily at the time. Indeed, some “seldom worked more than three days a week.” Not only was the weekend a holiday, but they took Monday off too, celebrating it as a drunken “St. Monday.”

Croppers in particular were a force to be reckoned with. They were well-off—their pay was three times that of stocking-makers—and their work required them to pass heavy cropping tools across the wool, making them muscular, brawny men who were fiercely independent. In the textile world, the croppers were, as one observer noted at the time, “notoriously the least manageable of any persons employed.”

The introduction of machinery in cloth manufacture did not make these people’s lives better. In fact, it made them a lot worse:

“They [the merchant class] were obsessed with keeping their factories going, so they were introducing machines wherever they might help,” says Jenny Uglow, a historian and author of In These Times: Living in Britain Through Napoleon’s Wars, 1793-1815.

The workers were livid. Factory work was miserable, with brutal 14-hour days that left workers—as one doctor noted—“stunted, enfeebled, and depraved.” Stocking-weavers were particularly incensed at the move toward cut-ups. It produced stockings of such low quality that they were “pregnant with the seeds of its own destruction,” as one hosier put it: Pretty soon people wouldn’t buy any stockings if they were this shoddy. Poverty rose as wages plummeted.

Yes, you read that right: the introduction of “labor-saving” technology dramatically increased the amount these people worked. It also made their work much, much more unpleasant. It transferred control to a smaller circle of wealthy people and took it away from the workers themselves. It made the rich richer, increased poverty, and tore society apart.

But more technology is always good, right?

And since history is written by the victors, “Luddite” is a term now inextricably wound up with the knee-jerk rejection of new technology. But the Luddites weren’t opposed to new technology at all! What they were fighting against was the economic conditions that took away their autonomy and turned them into mendicants in their own country:

The workers tried bargaining. They weren’t opposed to machinery, they said, if the profits from increased productivity were shared. The croppers suggested taxing cloth to make a fund for those unemployed by machines. Others argued that industrialists should introduce machinery more gradually, to allow workers more time to adapt to new trades.

The plight of the unemployed workers even attracted the attention of Charlotte Brontë, who wrote them into her novel Shirley. “The throes of a sort of moral earthquake,” she noted, “were felt heaving under the hills of the northern counties.”

[…]

At heart, the fight was not really about technology. The Luddites were happy to use machinery—indeed, weavers had used smaller frames for decades. What galled them was the new logic of industrial capitalism, where the productivity gains from new technology enriched only the machines’ owners and weren’t shared with the workers.

In fact, the Luddites actually spared the machines that were used by employers who treated workers fairly. Funny how you never hear that in most popular descriptions of the Luddite revolt:

The Luddites were often careful to spare employers who they felt dealt fairly. During one attack, Luddites broke into a house and destroyed four frames—but left two intact after determining that their owner hadn’t lowered wages for his weavers. (Some masters began posting signs on their machines, hoping to avoid destruction: “This Frame Is Making Full Fashioned Work, at the Full Price.”)

Unlike today, labor actually fought back against these attempts to destroy their way of life:

As a form of economic protest, machine-breaking wasn’t new. There were probably 35 examples of it in the previous 100 years, as the author Kirkpatrick Sale found in his seminal history Rebels Against the Future. But the Luddites, well-organized and tactical, brought a ruthless efficiency to the technique: Barely a few days went by without another attack, and they were soon breaking at least 175 machines per month. Within months they had destroyed probably 800, worth £25,000—the equivalent of $1.97 million, today.

Rather than the “natural course” of free-market economics, once again it was government intervention, including brutal state violence, that made modern capitalism possible:

Parliament was now fully awakened, and began a ferocious crackdown. In March 1812, politicians passed a law that handed out the death penalty for anyone “destroying or injuring any Stocking or Lace Frames, or other Machines or Engines used in the Framework knitted Manufactory.” Meanwhile, London flooded the Luddite counties with 14,000 soldiers.

By winter of 1812, the government was winning. Informants and sleuthing finally tracked down the identities of a few dozen Luddites. Over a span of 15 months, 24 Luddites were hanged publicly, often after hasty trials, including a 16-year-old who cried out to his mother on the gallows, “thinking that she had the power to save him.” Another two dozen were sent to prison and 51 were sentenced to be shipped off to Australia.

But wait, isn’t capitalism all about “freedom and liberty?” Freedom and liberty for some, I guess.

The problem, then as now, was not technology itself, but the economic relations that it unfolded against. What I found most interesting is that even back then, the emerging pseudoscience of economics was used to justify the harsh treatment of the workers and the bottomless greed of capitalists, in particular the “sacred text” of modern Neoclassical economics, Adam Smith’s The Wealth of Nations:

For the Luddites, “there was the concept of a ‘fair profit,’” says Adrian Randall, the author of Before the Luddites. In the past, the master would take a fair profit, but now he adds, “the industrial capitalist is someone who is seeking more and more of their share of the profit that they’re making.” Workers thought wages should be protected with minimum-wage laws. Industrialists didn’t: They’d been reading up on laissez-faire economic theory in Adam Smith’s The Wealth of Nations, published a few decades earlier.

“The writings of Dr. Adam Smith have altered the opinion, of the polished part of society,” as the author of a minimum wage proposal at the time noted. Now, the wealthy believed that attempting to regulate wages “would be as absurd as an attempt to regulate the winds.”

It seems as though nothing’s really changed. Using economic “science” to justify social inequality and private ownership goes back to the very beginnings of the Market.

When Robots Take All of Our Jobs, Remember the Luddites (Smithsonian Magazine). Smithsonian wrote about this before; see also: What the Luddites Really Fought Against

As the above history shows, there is nothing “natural” or normal about extreme busyness and brutally long working hours. It is entirely an artificial creation:

A nice post at the HBR blog…describes how being busy is now celebrated as a symbol of high status. This is not natural. Marshall Sahlins has shown that in hunter-gatherer societies (which were the human condition for nine-tenths of our existence) people typically worked for only around 20 hours a week. In pre-industrial societies, work was task-oriented; people did as much as necessary and then stopped. Max Weber wrote:

“Man does not ‘by nature’ wish to earn more and more money, but simply to live as he is accustomed to live and to earn as much as is necessary for that purpose. Wherever modern capitalism has begun its work of increasing the productivity of human labour by increasing its intensity, it has encountered the immensely stubborn resistance of this leading trait of pre-capitalistic labour.” (The Protestant Ethic and the Spirit of Capitalism, p. 24)

The backward-bending supply curve of labour was normal.

E.P. Thompson has described how pre-industrial working hours were irregular, with Mondays usually taken as holidays. He, and writers such as Sidney Pollard and Stephen Marglin, have shown how the working day as we know it was imposed by ruthless discipline, reinforced by Christian moralists. (There’s a clue in the title of Weber’s book.) Marglin quotes Andrew Ure, author of The Philosophy of Manufactures in 1835:

“The main difficulty [faced by Richard Arkwright] did not, to my apprehension, lie so much in the invention of a proper mechanism for drawing out and twisting cotton into a continuous thread, as in…training human beings to renounce their desultory habits of work and to identify themselves with the unvarying regularity of the complex automaton. To devise and administer a successful code of factory discipline, suited to the necessities of factory diligence, was the Herculean enterprise, the noble achievement of Arkwright…It required, in fact, a man of a Napoleon nerve and ambition to subdue the refractory tempers of workpeople accustomed to irregular paroxysms of diligence.”

Today, though, such external discipline is no longer so necessary because many of us – more so in the UK and US than elsewhere – have internalized the capitalist imperative that we work long hours… which just vindicates a point made by Bertrand Russell back in 1932:

“The conception of duty, speaking historically, has been a means used by the holders of power to induce others to live for the interests of their masters rather than for their own.”

Against busyness (Stumbling and Mumbling)

Honestly, the five-day workweek is outmoded and ridiculous. It’s more of a babysitting operation for adults than anything else. It’s as silly as arguing that we need over two decades of formal education in order to do our jobs.

I was reminded of this over the holidays. In the U.S. we get virtually no time off from our jobs, unlike most other countries (East Asia might be an exception). But Christmas/New Year’s is a rare exception, and we have several four-day weeks in a row (without pay for some of us, of course). Those weeks are so much more pleasant, and I would even say productive, than the rest of the year. Every year at this time I think to myself, “Why isn’t every week a four-day workweek?” Some places do have such an arrangement, but they offset it with four long, ten-hour days. I don’t know about you, but towards the end of ten hours in a row of “work” I doubt anyone’s accomplishing much of anything. Is 32 hours a week really not enough to keep society functioning in the twenty-first century?

Not only that, but many people use what little vacation they do have to take the whole period at the end of the year off. This is typical in Europe, but rarer here. In any case, while going to work I noticed that there was hardly any traffic. The roads were empty. There were plenty of seats on the bus. The streets and sidewalks were empty. There was no waiting in the restaurants and cafes. There was plenty of room for everything. There was a laid-back feeling everywhere. It was so pleasant. I couldn’t help but think to myself, “Why isn’t every week like this?” If more people could stay home and work less, it very well could be. Instead we’re trapped on a treadmill. Working less would actually pay dividends in terms of reduced traffic, less crowding, less pollution, and better health outcomes due to less stress and more time to exercise.

There’s also a simple logic problem at work here. If we say the 40-hour week is inviolable and set-in-stone for the rest of time, and we do not wish to increase the problem of unemployment, then literally no labor-saving technology will ever save labor! We might as well dispense with the creation of any labor-saving technology, since by the above logic, it cannot save labor. You could equivocate and say that it frees us from doing “lower” level work and allows us to do “higher” level work, as when ditch diggers become factory workers, or something. That may have been a valid argument a hundred years ago, but in an age when most of us are low-paid service workers or useless paper-pushers, it’s pretty hard to make that case with any seriousness anymore.

***

I often refer to economics as a religion, with its practitioners as priests. So it’s interesting to see that framing in other contexts. This is from Chris Dillow’s blog, from which the above passage about work was taken:

“The social power, i.e. the multiplied productive force,” wrote Marx, appears to people “not as their own united power but as an alien force existing outside them, of the origin and end of which they are ignorant, which they thus cannot control.”

I was reminded of this by a fine passage in The Econocracy in which the authors show that “the economy” in the sense we now know it is a relatively recent invention and that economists claim to be experts capable of understanding this alien force:

“As increasing areas of political and social life are colonized by economic language and logic, the vast majority of citizens face the struggle of making informed democratic choices in a language they have never been taught.” (p. 19)

This leads to the sort of alienation which Marx described. This is summed up by respondents to a YouGov survey cited by Earle, Moran and Ward-Perkins, who said: “Economics is out of my hands so there is no point discussing it.”

In one important sense such an attitude is absurd. Every time you decide what to buy, or how much to save, or what job to do or how long to work, economics is in your hands and you are making an economic decision.

This suggests to me two different conceptions of what economics is. In one conception – that of Earle, Moran and Ward-Perkins – economists claim to be a priestly elite who understand “the economy”. As Alasdair MacIntyre said, such a claim functions as a demand for power and wealth:

“Civil servants and managers alike [he might have added economists – CD] justify themselves and their claims to authority, power and money by invoking their own competence as scientific managers.” (After Virtue, p. 86)

There is, though, a second conception of what economists should do. Rather than exploit alienation for their own advantage, we should help people mitigate it…

Economists in an alienated society (Stumbling and Mumbling)

This makes a point I often refer to – this depiction of “The Economy” as some sort of “natural” force that we have no control over, subject to its own inexorable logic. We saw above how the writings of Adam Smith provided the ideological justification for the wealthy merchants to screw over the workers. It cemented the perception that the economy was just a natural force with its own internal logic that could no more be regulated than could the wind or the tides. And over the course of several hundred years, we have intentionally designed our political institutions such that government cannot “interfere” in the “natural workings” of the economy. Doing so would only make all of us worse off, or so goes the argument.

There is a telling passage in this column by Noah Smith:

…Even now, when economic models have become far more complex than anything in [Milton] Friedman’s time, economists still go back to Friedman’s theory as a mental touchstone — a fundamental intuition that guides the way they make their models. My first macroeconomics professor believed in it deeply and instinctively, and would even bring it up in department seminars.

Unfortunately, intuition based on incorrect theories can lead us astray. Economists have known for a while that this theory doesn’t fit the facts. When people get a windfall, they tend to spend some of it immediately. So economists have tried to patch up Friedman’s theory, using a couple of plausible fixes….

Milton Friedman’s Cherished Theory Is Laid to Rest (Bloomberg)

Yes, you read that right: economists have known for a long time that a particular theory did not accord with the observed facts, but it was necessary for the complex mathematical models they use to supposedly describe reality. So rather than discard it, they tried to “patch it up,” because it told them what they wanted to hear. Note how Noah Smith’s economics professor “believed deeply” in the theory, much as people believe in the Good Book.

Nice “science” you got there.

That methodology ought to tell you everything you need to know about economic “science.” One wonders how many other economic doctrines are propped up by the same kind of thinking.

Friedman was, of course, the author of “Capitalism and Freedom,” which, as we saw above, is quite an ironic title. Friedman’s skill was coming up with ideas that the rich wanted to hear, and then coming up with the requisite economic “logic” to justify them, from deregulation, to privatization, to globalization, to the elimination of minimum wages and suppression of unions. His most famous idea was that the sole purpose of a firm is to make money for its shareholders, and that all other responsibilities were ‘unethical.’ The resulting “libertarian” economics was promoted tirelessly, including a series on PBS, by wealthy organizations and right-wing think-tanks with bottomless funding, as it still is today (along with its even more extreme cousin, “Austrian” economics). One thing the Luddites did not have to contend with was the power of the media to shape society, one reason why such revolts would be unthinkable today (along with the panopticon police states constructed by capitalist regimes beginning with Great Britain— “freedom” indeed!).

Noah Smith himself has written about what he calls “101ism”:

We all know basically what 101ism says. Markets are efficient. Firms are competitive. Partial-equilibrium supply and demand describes most things. Demand curves slope down and supply curves slope up. Only one curve shifts at a time. No curve is particularly inelastic or elastic; all are somewhere in the middle (straight lines with slopes of 1 and -1 on a blackboard). Etc.

Note that 101 classes don’t necessarily teach that these things are true! I would guess that most do not. Almost all 101 classes teach about elasticity, and give examples with perfectly elastic and perfectly inelastic supply and demand curves. Most teach about market failures and monopolies. Most at least mention general equilibrium.

But for some reason, people seem to come away from 101 classes thinking that the cases that are the easiest to draw on the board are – God only knows why – the benchmark cases.

101ism (Noahpinion)

But the best criticism I’ve read lately is from James Kwak, who has written an entire book on the subject: Economism: Bad Economics and the Rise of Inequality. He’s written several posts on the topic, but this post is a good introduction to the concept. Basically, he argues that modern economics allows policies that benefit the rich at the expense of the rest of society to masquerade as objective “scientific” truths thanks to the misapplication of economic ideology. As we saw above, that goes back to the very beginnings of “free market” economics in the nineteenth century:

In policy debates and public relations campaigns…what you are … likely to hear is that a minimum wage must increase unemployment—because that’s what the model says. This conviction that the world must behave the way it does on the blackboard is what I call economism. This style of thinking is influential because it is clear and logical, reducing complex issues to simple, pseudo-mathematical axioms. But it is not simply an innocent mistake made by inattentive undergraduates. Economism is Economics 101 transformed into an ideology—an ideology that is particularly persuasive because it poses as a neutral means of understanding the world.

In the case of low-skilled labor, it’s clear who benefits from a low minimum wage: the restaurant and hotel industries. In their PR campaigns, however, these corporations can hardly come out and say they like their labor as cheap as possible. Instead, armed with the logic of supply and demand, they argue that raising the minimum wage will only increase unemployment and poverty. Similarly, megabanks argue that regulating derivatives will starve the real economy of capital; multinational manufacturing companies argue that new trade agreements will benefit everyone; and the wealthy argue that lower taxes will increase savings and investment, unleashing economic growth.

In each case, economism allows a private interest to pretend that its preferred policies will really benefit society as a whole. The usual result is to increase inequality or to legitimize the widening gulf between rich and poor in contemporary society.

Economics 101, Economism, and Our New Gilded Age (The Baseline Scenario)

All of the above reinforces a couple of points I often like to make:

1.) Capitalism was a creation of government from day one. There is nothing “natural” or “free” about markets.

2.) It is sustained by a particular ideology which poses as a science but is anything but.

There is no fundamental reason we need to work 40 hours a week. There is no reason we have to go into debt just to get a job. There is no benefit to extreme wealth inequality; it’s not due to any sort of “merit.” And on and on. Economic “logic” is destroying society along with the natural world and preventing any adaptive response to these crises. But its power over the hearts and minds of society seems to be unassailable, at least until it all falls apart.

Saturday Matinee: Obsolete

Source: Truthstream Media

The Future Doesn’t Need Us… Or So We’ve Been Told. With the rise of technology and the real-time pressures of an online, global economy, humans will have to be very clever – and very careful – not to be left behind by the future. From the perspective of those in charge, human labor is losing its value, and people are becoming a liability. This documentary reveals the real motivation behind the secretive effort to reduce the population and bring resource use into strict, centralized control. Could it be that the biggest threat we face isn’t just automation and robots destroying jobs, but the larger sense that humans could become obsolete altogether? *Please watch and share!* Link to film: http://amzn.to/2f69Ocr

The United States of Work

Employers exercise vast control over our lives, even when we’re not on the job. How did our bosses gain power that the government itself doesn’t hold?

By Miya Tokumitsu

Source: New Republic

Work no longer works. “You need to acquire more skills,” we tell young job seekers whose résumés at 22 are already longer than their parents’ were at 32. “Work will give you meaning,” we encourage people to tell themselves, so that they put in 60 hours or more per week on the job, removing them from other sources of meaning, such as daydreaming or social life. “Work will give you satisfaction,” we insist, even though it requires abiding by employers’ rules, and the unwritten rules of the market, for most of our waking hours. At the very least, work is supposed to be a means to earning an income. But if it’s possible to work full time and still live in poverty, what’s the point?

Even before the global financial crisis of 2008, it had become clear that if waged work is supposed to provide a measure of well-being and social structure, it has failed on its own terms. Real household wages in the United States have remained stagnant since the 1970s, even as the costs of university degrees and other credentials rise. Young people find an employment landscape defined by unpaid internships, temporary work, and low pay. The glut of degree-holding young workers has pushed many of them into the semi- or unskilled labor force, making prospects even narrower for non–degree holders. Entry-level wages for high school graduates have in fact fallen. According to a study by the Federal Reserve Bank of New York, these lost earnings will depress this generation’s wages for their entire working lives. Meanwhile, those at the very top—many of whom derive their wealth not from work, but from returns on capital—vacuum up an ever-greater share of prosperity.

Against this bleak landscape, a growing body of scholarship aims to overturn our culture’s deepest assumptions about how work confers wealth, meaning, and care throughout society. In Private Government: How Employers Rule Our Lives (and Why We Don’t Talk About It), Elizabeth Anderson, a professor of philosophy at the University of Michigan, explores how the discipline of work has itself become a form of tyranny, documenting the expansive power that firms now wield over their employees in everything from how they dress to what they tweet. James Livingston, a historian at Rutgers, goes one step further in No More Work: Why Full Employment Is a Bad Idea. Instead of insisting on jobs for all or proposing that we hold employers to higher standards, Livingston argues, we should just scrap work altogether.

Livingston’s vision is the more radical of the two; his book is a wide-ranging polemic that frequently delivers the refrain “Fuck work.” But in original ways, both books make a powerful claim: that our lives today are ruled, above all, by work. We can try to convince ourselves that we are free, but as long as we must submit to the increasing authority of our employers and the labor market, we are not. We therefore fancy that we want to work, that work grounds our character, that markets encompass the possible. We are unable to imagine what a full life could be, much less to live one. Even more radically, both books highlight the dramatic and alarming changes that work has undergone over the past century—insisting that, in often unseen ways, the changing nature of work threatens the fundamental ideals of democracy: equality and freedom.

Anderson’s most provocative argument is that large companies, the institutions that employ most workers, amount to a de facto form of government, exerting massive and intrusive power in our daily lives. Unlike the state, these private governments are able to wield power with little oversight, because the executives and boards of directors that rule them are accountable to no one but themselves. Although they exercise their power to varying degrees and through both direct and “soft” means, employers can dictate how we dress and style our hair, when we eat, when (and if) we may use the toilet, with whom we may partner and under what arrangements. Employers may subject our bodies to drug tests; monitor our speech both on and off the job; require us to answer questionnaires about our exercise habits, off-hours alcohol consumption, and childbearing intentions; and rifle through our belongings. If the state held such sweeping powers, Anderson argues, we would probably not consider ourselves free men and women.

Employees, meanwhile, have few ways to fight back. Yes, they may leave the company, but doing so usually necessitates being unemployed or migrating to another company and working under similar rules. Workers may organize, but unions have been so decimated in recent years that their clout is greatly diminished. What’s more, employers are swift to fire anyone they suspect of speaking to their colleagues about organizing, and most workers lack the time and resources to mount a legal challenge to wrongful termination.

It wasn’t supposed to be this way. As corporations have worked methodically to amass sweeping powers over their employees, they have held aloft the beguiling principle of individual freedom, claiming that only unregulated markets can guarantee personal liberty. Instead, operating under relatively few regulations themselves, these companies have succeeded at imposing all manner of regulation on their employees. That is to say, they use the language of individual liberty to claim that corporations require freedom to treat workers as they like.

Anderson sets out to discredit such arguments by tracing them back to their historical origins. The notion that personal freedom is rooted in free markets, for instance, originated with the Levellers in seventeenth-century England, when working conditions differed substantially from today’s. The Levellers believed that a market society was essential to liberate individuals from the remnants of feudal hierarchies; their vision of utopia was a world in which men could meet and interact on terms of equality and dignity. Their ideas echoed through the writing and politics of later figures like John Locke, Adam Smith, Thomas Paine, and Abraham Lincoln, all of whom believed that open markets could provide the essential infrastructure for individuals to shape their own destiny.

An anti-statist streak runs through several of these thinkers, particularly the Levellers and Paine, who viewed markets as the bulwark against state oppression. Paine and Smith, however, would hardly qualify as hard-line contemporary libertarians. Smith believed that public education was essential to a fair market society, and Paine proposed a system of social insurance that included old-age pensions as well as survivor and disability benefits. Their hope was not for a world of win-or-die competition, but one in which open markets would allow individuals to make the fullest use of their talents, free from state monopolies and meddlesome bosses.

For Anderson, the latter point is essential; the notion of lifelong employment under a boss was anathema to these earlier visions of personal freedom. Writing in the 1770s, Smith assumes that independent actors in his market society will be self-employed, and uses butchers and bakers as his exemplars; his “pin factory,” meant to illustrate division of labor, employs only ten people. These thinkers could not envision a world in which most workers spend most of their lives performing wage labor under a single employer. In an address before the Wisconsin State Agricultural Society in 1859, Lincoln stated, “The prudent, penniless beginner in the world labors for wages awhile, saves a surplus with which to buy tools or land for himself, then labors on his own account another while, and at length hires another new beginner to help him.” In other words, even well into the nineteenth century, defenders of an unregulated market society viewed wage labor as a temporary stage on the way to becoming a proprietor.

Lincoln’s scenario does not reflect the way most people work today. Yet the “small business owner” endures as an American stock character, conjured by politicians to push through deregulatory measures that benefit large corporations. In reality, thanks to a lack of guaranteed, nationalized health care and threadbare welfare benefits, setting up a small business is simply too risky a venture for many Americans, who must rely on their employers for health insurance and income. These conditions render long-term employment more palatable than a precarious existence of freelance gigs, which further gives companies license to oppress their employees.

The modern relationship between employer and employee began with the rise of large-scale companies in the nineteenth century. Although employment contracts date back to the Middle Ages, preindustrial arrangements bore little resemblance to the documents we know today. Like modern employees, journeymen and apprentices often served their employers for years, but masters performed the same or similar work in proximity to their subordinates. As a result, Anderson points out, working conditions—the speed required of workers and the hazards to which they might be exposed—were kept in check by what the masters were willing to tolerate for themselves.

The Industrial Revolution brought radical changes, as companies grew ever larger and management structures more complex. “Employers no longer did the same kind of work as employees, if they worked at all,” Anderson observes. “Mental labor was separated from manual labor, which was radically deskilled.” Companies multiplied rapidly in size. Labor contracts now bonded workers to massive organizations in which discipline, briefs, and decrees flowed downward, but whose leaders were unreachable by ordinary workers. Today, fast food workers or bank tellers would be hard-pressed to petition their CEOs at McDonald’s or Wells Fargo in person.

Despite this, we often speak of employment contracts as agreements between equals, as if we are living in Adam Smith’s eighteenth-century dream world. In a still-influential paper from 1937 titled “The Nature of the Firm,” the economist and Nobel laureate Ronald Coase established himself as an early observer and theorist of corporate concerns. He described the employment contract not as a document that handed the employer unaccountable powers, but as one that circumscribed those powers. In signing a contract, the employee “agrees to obey the directions of an entrepreneur within certain limits,” he emphasized. But such characterizations, as Anderson notes, do not reflect reality; most workers agree to employment without any negotiation or even communication about their employer’s power or its limits. The exceptions to this rule are few and notable: top professional athletes, celebrity entertainers, superstar academics, and the (increasingly small) groups of workers who are able to bargain collectively.

Yet because employment contracts create the illusion that workers and companies have arrived at a mutually satisfying agreement, the increasingly onerous restrictions placed on modern employees are often presented as “best practices” and “industry standards,” framing all sorts of behaviors and outcomes as things that ought to be intrinsically desired by workers themselves. Who, after all, would not want to work on something in the “best” way? Beyond employment contracts, companies also rely on social pressure to foster obedience: If everyone in the office regularly stays until seven o’clock every night, who would risk departing at five, even if it’s technically allowed? Such social prods exist alongside more rigid behavioral codes that dictate everything from how visible an employee’s tattoo can be to when and how long workers can break for lunch.

Many workers, in fact, have little sense of the legal scope of their employer’s power. Most would be shocked to discover that they could be fired for being too attractive, declining to attend a political rally favored by their employer, or finding out that their daughter was raped by a friend of the boss—all real-life examples cited by Anderson. Indeed, it is only after dismissal for such reasons that many workers learn of the sweeping breadth of at-will employment, the contractual norm that allows American employers to fire workers without warning and without cause, except for reasons explicitly deemed illegal.

In reality, the employment landscape is even more dire than Anderson outlines. The rise of staffing or “temp” agencies, for example, undercuts the very idea of a direct relationship between worker and employer. In The Temp Economy: From Kelly Girls to Permatemps in Postwar America, sociologist Erin Hatton notes that millions of workers now labor under subcontracting arrangements, which give employers even greater latitude to abuse employees. For years, Walmart—America’s largest retailer—used a subcontracting firm to hire hundreds of cleaners, many from Eastern Europe, who worked for months on end without overtime pay or a single day off. After federal agents raided dozens of Walmarts and arrested the cleaners as illegal immigrants, company executives used the subcontracting agreement to shirk responsibility for their exploitation of the cleaners, claiming they had no knowledge of their immigration status or conditions.

By any reasonable standard, much “temp” work is not even temporary. Employees sometimes work for years in a single workplace, even through promotions, without ever being granted official status as an employee. Similarly, “gig economy” platforms like Uber designate their workers as contractors rather than employees, a distinction that exempts the company from paying them minimum wage and overtime. Many “permatemps” and contractors perform the same work as employees, yet lack even the paltry protections and benefits awarded to full-time workers.

A weak job market, paired with the increasing precarity of work, means that more and more workers are forced to make their living by stringing together freelance assignments or winning fixed-term contracts, subjecting those workers to even more rules and restrictions. On top of their actual jobs, contractors and temp workers must do the additional work of appearing affable and employable not just on the job, but during their ongoing efforts to secure their next gig. Constantly pitching, writing up applications, and personal branding on social media requires a level of self-censorship, lest a controversial tweet or compromising Facebook photo sink their job prospects. Forced to anticipate the wishes not of a specific employer, but of all potential future employers, many opt out of participating in social media or practicing politics in any visible capacity. Their public personas are shaped not by their own beliefs and desires, but by the demands of the labor market.


For Livingston, it’s not just employers but work itself that is the problem. We toil because we must, but also because our culture has trained us to see work as the greatest enactment of our dignity and personal character. Livingston challenges us to turn away from such outmoded ideas, rooted in Protestant ideals. Like Anderson, he sweeps through centuries of labor theory with impressive efficiency, from Marx and Hegel to Freud and Lincoln, whose 1859 speech he also quotes. Livingston centers on these thinkers because they all found the connection between work and virtue troubling. Hegel believed that work causes individuals to defer their desires, nurturing a “slave morality.” Marx proposed that “real freedom came after work.” And Freud understood the Protestant work ethic as “the symptom of repression, perhaps even regression.”

Nor is it practical, Livingston argues, to exalt work: There are simply not enough jobs to keep most adults employed at a living wage, given the rise of automation and increases in productivity. Besides, the relation between income and work is arbitrary. Cooking dinner for your family is unpaid work, while cooking dinner for strangers usually comes with a paycheck. There’s nothing inherently different in the labor involved—only in the compensation. Anderson argues that work impedes individual freedom; Livingston points out that it rarely pays enough. As technological advances continue to weaken the demand for human labor, wages will inevitably be driven down even further. Instead of idealizing work and making it the linchpin of social organization, Livingston suggests, why not just get rid of it?

Livingston belongs to a cadre of thinkers, including Kathi Weeks, Nick Srnicek, and Alex Williams, who believe that we should strive for a “postwork” society in one form or another. Strands of this idea go back at least as far as Keynes’s 1930 essay on “Economic Possibilities for our Grandchildren.” Not only would work be eliminated or vastly reduced by technology, Keynes predicted, but we would also be unburdened spiritually. Devotion to work was, he deemed, one of many “pseudo-moral principles” that “exalted some of the most distasteful of human qualities into the position of the highest virtues.”

Since people in this new world would no longer have to earn a salary, they would, Livingston envisions, receive some kind of universal basic income. UBI is a slippery concept, adaptable to both the socialist left and libertarian right, but it essentially entails distributing a living wage to every member of society. In most conceptualizations, the income is indeed basic—no cases of Dom Pérignon—and would cover the essentials like rent and groceries. Individuals would then be free to choose whether and how much they want to work to supplement the UBI. Leftist proponents tend to advocate pairing UBI with a strong welfare state to provide nationalized health care, tuition-free education, and other services. Some libertarians view UBI as a way to pare down the welfare state, arguing that it’s better simply to give people money to buy food and health care directly, rather than forcing them to engage with food stamp and Medicaid bureaucracies.

According to Livingston, we are finally on the verge of this postwork society because of automation. Robots are now advanced enough to take over complex jobs in areas like agriculture and mining, eliminating the need for humans to perform dangerous or tedious tasks. In practice, however, automation is a double-edged sword, with the capacity to oppress as well as unburden. Machines often accelerate the rate at which humans can work, taxing rather than liberating them. Conveyor belts eliminated the need for workers to pass unfinished products along to their colleagues—but as Charlie Chaplin and Lucille Ball so hilariously demonstrated, the belts also increased the pace at which those same workers needed to turn wrenches and wrap chocolates. In retail and customer service, a main function of automation has been not to eliminate work, but to eliminate waged work, transferring much of the labor onto consumers, who must now weigh and code their own vegetables at the supermarket, check out their own library books, and tag their own luggage at the airport.

At the same time, it may be harder to automate some jobs that require a human touch, such as floristry or hairstyling. The same goes for the delicate work of caring for the young, sick, elderly, or otherwise vulnerable. In today’s economy, the demand for such labor is rising rapidly: “Nine of the twelve fastest-growing fields,” The New York Times reported earlier this year, “are different ways of saying ‘nurse.’” These jobs also happen to be low-paying, emotionally and physically grueling, dirty, hazardous, and shouldered largely by women and immigrants. Regardless of whether employment is virtuous or not, our immediate goal should perhaps be to distribute the burdens of caregiving, since such work is essential to the functioning of society and benefits us all.


A truly work-free world is one that would entail a revolution from our present social organizations. We could no longer conceive of welfare as a last resort—as the “safety net” metaphor implies—but would be forced to treat it as an unremarkable and universal fact of life. This alone would require us to support a massive redistribution of wealth, and to reclaim our political institutions from the big-money interests that are allergic to such changes. Tall orders indeed—but as Srnicek and Williams remind us in their book, Inventing the Future: Postcapitalism and a World Without Work, neoliberals pulled off just such a revolution in the postwar years. Thanks to their efforts, free-market liberalism replaced Keynesianism as the political and economic common sense all around the world.

Another possible solution to the current miseries of unemployment and worker exploitation is the one Livingston rejects in his title: full employment. For anti-work partisans, full employment takes us in the wrong direction, and UBI corrects the course. But the two are not mutually exclusive. In fact, rather than creating new jobs, full employment could require us to reduce our work hours drastically and spread them throughout the workforce—a scheme that could radically de-center waged work in our lives. A dual strategy of pursuing full employment while also demanding universal benefits—including health care, childcare, and affordable housing—would maximize workers’ bargaining power to ensure that they, and not just owners of capital, actually get to enjoy the bounty of labor-saving technology.

Nevertheless, Livingston’s critiques of full employment are worth heeding. As with automation, it can all go wrong if we use the banner of full employment to create pointless roles—what David Graeber has termed “bullshit jobs,” in which workers sit in some soul-sucking basement office for eight hours a day—or harmful jobs, like building nuclear weapons. If we do not have a deliberate politics rooted in universal social justice, then full employment, a basic income, and automation will not liberate us from the degradations of work.

Both Livingston and Anderson reveal how much of our own power we’ve already ceded in making waged work the conduit for our ideals of liberty and morality. The scale and coordination of the institutions we’re up against in the fight for our emancipation is, as Anderson demonstrates, staggering. Employers hold the means to our well-being, and they have the law on their side. Individual efforts to achieve a better “work-life balance” for ourselves and our families miss the wider issue we face as waged employees. Livingston demonstrates the scale at which we should be thinking: Our demands should be revolutionary, our imaginations wide. Standing amid the wreckage of last year’s presidential election, what other choice do we have?

Miya Tokumitsu is a lecturer in art history at the University of Melbourne and a contributing editor at Jacobin. She is the author of Do What You Love: And Other Lies About Success and Happiness.

Against meaninglessness and precarity: the crisis of work

By David Frayne

Source: ROAR Magazine

If work is vital for income, social inclusion and a sense of identity, then one of the most troubling contradictions of our time is that the centrality of work in our societies persists even when work is in a state of crisis. The steady erosion of stable and satisfying employment makes it less and less clear whether modern jobs can offer the sense of moral agency, recognition and pride required to secure work as a source of meaning and identity. The standardization, precarity and dubious social utility that characterize many modern jobs are a major source of modern misery.

Mass unemployment is also now an enduring structural feature of capitalist societies. The elimination of huge quantities of human labor by the development of machine technologies is a process that has spanned centuries. Recently, however, perhaps due to high-profile developments like Apple’s Siri voice assistant or Amazon’s delivery drones, the discussion around automation has been reignited.

An often-cited study by Carl Frey and Michael Osborne anticipates an escalation of technological unemployment over the coming years. Occupations at high risk include the likes of models, cooks and construction workers, thanks to advances such as digital avatars, burger-flipping machines and the ability to manufacture prefabricated buildings in factories with robots. It is also anticipated that advances in artificial intelligence and machine learning will allow an increasing quantity of cognitive work tasks to become automated.

What all of this means is that we are steadily becoming a society of workers without work: a society of people who are materially, culturally and psychologically bound to paid employment, but for whom there are not enough stable and meaningful jobs to go around. Perversely, the most pressing problem for many people is no longer exploitation, but the absence of opportunities to be sufficiently and dependably exploited. The part this problem plays in today’s epidemic of anxiety and exhaustion should not be underestimated.

What makes the situation all the crueler is the pervasive sense that the precarious victims of the crisis are somehow personally responsible for their fate. In the UK, barely a week goes by without a smug reaffirmation of the work ethic in the media, or some story that constructs unemployment as a form of deviance. The UK television show Benefits Street comes to mind, but perhaps the most outrageous example in recent times was not from the world of trash TV but from Dr. Adam Perkins’ thesis, The Welfare Trait. Published last year, Perkins’ book tackles what he defines as the “employment-resistant personality”, explaining joblessness in terms of an inter-generationally transmitted psychological disorder. It is the most polished product of the ideology of work one can imagine: a study so dazzled by its own claims to scientific objectivity, so impervious to its own grounding in the work ethic, that it beggars belief.

It seems we find ourselves at a rift. On the one hand, work has been positioned as a central source of income, solidarity and social recognition; on the other, the promise of stable, meaningful and satisfying employment crumbles around us. The crucial question is: how should societies adjust to this deepening crisis of work?

 


This is an excerpt from David Frayne’s “Towards a Post-Work Society”, which will appear in ROAR Issue #2, The Future of Work, scheduled for release in June/July.

Fuck Work


Economists believe in full employment. Americans think that work builds character. But what if jobs aren’t working anymore?

By James Livingston

Source: aeon

Work means everything to us Americans. For centuries – since, say, 1650 – we’ve believed that it builds character (punctuality, initiative, honesty, self-discipline, and so forth). We’ve also believed that the market in labour, where we go to find work, has been relatively efficient in allocating opportunities and incomes. And we’ve believed that, even if it sucks, a job gives meaning, purpose and structure to our everyday lives – at any rate, we’re pretty sure that it gets us out of bed, pays the bills, makes us feel responsible, and keeps us away from daytime TV.

These beliefs are no longer plausible. In fact, they’ve become ridiculous, because there’s not enough work to go around, and what there is of it won’t pay the bills – unless of course you’ve landed a job as a drug dealer or a Wall Street banker, becoming a gangster either way.

These days, everybody from Left to Right – from the economist Dean Baker to the social scientist Arthur C Brooks, from Bernie Sanders to Donald Trump – addresses this breakdown of the labour market by advocating ‘full employment’, as if having a job is self-evidently a good thing, no matter how dangerous, demanding or demeaning it is. But ‘full employment’ is not the way to restore our faith in hard work, or in playing by the rules, or in whatever else sounds good. The official unemployment rate in the United States is already below 6 per cent, which is pretty close to what economists used to call ‘full employment’, but income inequality hasn’t changed a bit. Shitty jobs for everyone won’t solve any social problems we now face.

Don’t take my word for it – look at the numbers. Already a fourth of the adults actually employed in the US are paid wages too low to lift them above the official poverty line – and so a fifth of American children live in poverty. Almost half of employed adults in this country are eligible for food stamps (most of those who are eligible don’t apply). The market in labour has broken down, along with most others.

Those jobs that disappeared in the Great Recession just aren’t coming back, regardless of what the unemployment rate tells you – the net gain in jobs since 2000 still stands at zero – and if they do return from the dead, they’ll be zombies, those contingent, part-time or minimum-wage jobs where the bosses shuffle your shift from week to week: welcome to Wal-Mart, where food stamps are a benefit.

And don’t tell me that raising the minimum wage to $15 an hour solves the problem. No one can doubt the moral significance of the movement. But at this rate of pay, you pass the official poverty line only after working 29 hours a week. The current federal minimum wage is $7.25. Working a 40-hour week, you would have to make $10 an hour to reach the official poverty line. What, exactly, is the point of earning a paycheck that isn’t a living wage, except to prove that you have a work ethic?

But, wait, isn’t our present dilemma just a passing phase of the business cycle? What about the job market of the future? Haven’t the doomsayers, those damn Malthusians, always been proved wrong by rising productivity, new fields of enterprise, new economic opportunities? Well, yeah – until now, these times. The measurable trends of the past half-century, and the plausible projections for the next half-century, are just too empirically grounded to dismiss as dismal science or ideological hokum. They look like the data on climate change – you can deny them if you like, but you’ll sound like a moron when you do.

For example, the Oxford economists who study employment trends tell us that almost half of existing jobs, including those involving ‘non-routine cognitive tasks’ – you know, like thinking – are at risk of death by computerisation within 20 years. They’re elaborating on conclusions reached by two MIT economists in the book Race Against the Machine (2011). Meanwhile, the Silicon Valley types who give TED talks have started speaking of ‘surplus humans’ as a result of the same process – cybernated production. Rise of the Robots, a new book that cites these very sources, is social science, not science fiction.

So this Great Recession of ours – don’t kid yourself, it ain’t over – is a moral crisis as well as an economic catastrophe. You might even say it’s a spiritual impasse, because it makes us ask what social scaffolding other than work will permit the construction of character – or whether character itself is something we must aspire to. But that is why it’s also an intellectual opportunity: it forces us to imagine a world in which the job no longer builds our character, determines our incomes or dominates our daily lives.

In short, it lets us say: enough already. Fuck work.

Certainly this crisis makes us ask: what comes after work? What would you do without your job as the external discipline that organises your waking life – as the social imperative that gets you up and on your way to the factory, the office, the store, the warehouse, the restaurant, wherever you work and, no matter how much you hate it, keeps you coming back? What would you do if you didn’t have to work to receive an income?

And what would society and civilisation be like if we didn’t have to ‘earn’ a living – if leisure was not our choice but our lot? Would we hang out at the local Starbucks, laptops open? Or volunteer to teach children in less-developed places, such as Mississippi? Or smoke weed and watch reality TV all day?

I’m not proposing a fancy thought experiment here. By now these are practical questions, because there aren’t enough jobs. So it’s time we asked even more practical questions. How do you make a living without a job – can you receive income without working for it? Is it possible, to begin with – and then, the hard part, is it ethical? If you were raised to believe that work is the index of your value to society – as most of us were – would it feel like cheating to get something for nothing?

We already have some provisional answers because we’re all on the dole, more or less. The fastest-growing component of household income since 1959 has been ‘transfer payments’ from government. By the turn of the 21st century, 20 per cent of all household income came from this source – from what is otherwise known as welfare or ‘entitlements’. Without this income supplement, half of the adults with full-time jobs would live below the poverty line, and most working Americans would be eligible for food stamps.

But are these transfer payments and ‘entitlements’ affordable, in either economic or moral terms? By continuing and enlarging them, do we subsidise sloth, or do we enrich a debate on the rudiments of the good life?

Transfer payments or ‘entitlements’, not to mention Wall Street bonuses (talk about getting something for nothing), have taught us how to detach the receipt of income from the production of goods, but now, in plain view of the end of work, the lesson needs rethinking. No matter how you calculate the federal budget, we can afford to be our brother’s keeper. The real question is not whether but how we choose to be.

I know what you’re thinking – we can’t afford this! But yeah, we can, very easily. We raise the arbitrary lid on the Social Security contribution, which now stands at $127,200, and we raise taxes on corporate income, reversing the Reagan Revolution. These two steps solve a fake fiscal problem and create an economic surplus where we now can measure a moral deficit.

Of course, you will say – along with every economist from Dean Baker to Greg Mankiw, Left to Right – that raising taxes on corporate income is a disincentive to investment and thus job creation. Or that it will drive corporations overseas, where taxes are lower.

But in fact raising taxes on corporate income can’t have these effects.

Let’s work backward. Corporations have been ‘multinational’ for quite some time. In the 1970s and ’80s, before Ronald Reagan’s signature tax cuts took effect, approximately 60 per cent of manufactured imported goods were produced offshore, overseas, by US companies. That percentage has risen since then, but not by much.

Chinese workers aren’t the problem – the homeless, aimless idiocy of corporate accounting is. That is why the Citizens United decision of 2010, which applied freedom-of-speech protections to campaign spending, is hilarious. Money isn’t speech, any more than noise is. The Supreme Court has conjured a living being, a new person, from the remains of the common law, creating a real world more frightening than its cinematic equivalent: say, Frankenstein, Blade Runner or, more recently, Transformers.

But the bottom line is this. Most jobs aren’t created by private, corporate investment, so raising taxes on corporate income won’t affect employment. You heard me right. Since the 1920s, economic growth has happened even though net private investment has atrophied. What does that mean? It means that profits are pointless except as a way of announcing to your stockholders (and hostile takeover specialists) that your company is a going concern, a thriving business. You don’t need profits to ‘reinvest’, to finance the expansion of your company’s workforce or output, as the recent history of Apple and most other corporations has amply demonstrated.

So investment decisions by CEOs have only a marginal effect on employment. Taxing the profits of corporations to finance a welfare state that permits us to love our neighbours and to be our brothers’ keeper is not an economic problem. It’s something else – it’s an intellectual issue, a moral conundrum.

When we place our faith in hard work, we’re wishing for the creation of character; but we’re also hoping, or expecting, that the labour market will allocate incomes fairly and rationally. And there’s the rub: the two do go together. Character can be created on the job only when we can see that there’s an intelligible, justifiable relation between past effort, learned skills and present reward. When I see that your income is completely out of proportion to your production of real value, of durable goods the rest of us can use and appreciate (and by ‘durable’ I don’t mean just material things), I begin to doubt that character is a consequence of hard work.

When I see, for example, that you’re making millions by laundering drug-cartel money (HSBC), or pushing bad paper on mutual fund managers (AIG, Bear Stearns, Morgan Stanley, Citibank), or preying on low-income borrowers (Bank of America), or buying votes in Congress (all of the above) – just business as usual on Wall Street – while I’m barely making ends meet from the earnings of my full-time job, I realise that my participation in the labour market is irrational. I know that building my character through work is stupid because crime pays. I might as well become a gangster like you.

That’s why an economic crisis such as the Great Recession is also a moral problem, a spiritual impasse – and an intellectual opportunity. We’ve placed so many bets on the social, cultural and ethical import of work that when the labour market fails, as it so spectacularly has, we’re at a loss to explain what happened, or to orient ourselves to a different set of meanings for work and for markets.

And by ‘we’ I mean pretty much all of us, Left to Right, because everybody wants to put Americans back to work, one way or another – ‘full employment’ is the goal of Right-wing politicians no less than Left-wing economists. The differences between them are over means, not ends, and those ends include intangibles such as the acquisition of character.

Which is to say that everybody has doubled down on the benefits of work just as it reaches a vanishing point. Securing ‘full employment’ has become a bipartisan goal at the very moment it has become both impossible and unnecessary. Sort of like securing slavery in the 1850s or segregation in the 1950s.

Why?

Because work means everything to us inhabitants of modern market societies – regardless of whether it still produces solid character and allocates incomes rationally, and quite apart from the need to make a living. It’s been the medium of most of our thinking about the good life since Plato correlated craftsmanship and the possibility of ideas as such. It’s been our way of defying death, by making and repairing the durable things, the significant things we know will last beyond our allotted time on earth because they teach us, as we make or repair them, that the world beyond us – the world before and after us – has its own reality principles.

Think about the scope of this idea. Work has been a way of demonstrating differences between males and females, for example by merging the meanings of fatherhood and ‘breadwinner’, and then, more recently, prying them apart. Since the 17th century, masculinity and femininity have been defined – not necessarily achieved – by their places in a moral economy, as working men who got paid wages for their production of value on the job, or as working women who got paid nothing for their production and maintenance of families. Of course, these definitions are now changing, as the meaning of ‘family’ changes, along with profound and parallel changes in the labour market – the entry of women is just one of those – and in attitudes toward sexuality.

When work disappears, the genders produced by the labour market are blurred. When socially necessary labour declines, what we once called women’s work – education, healthcare, service – becomes our basic industry, not a ‘tertiary’ dimension of the measurable economy. The labour of love, caring for one another and learning how to be our brother’s keeper – socially beneficial labour – becomes not merely possible but eminently necessary, and not just within families, where affection is routinely available. No, I mean out there, in the wide, wide world.

Work has also been the American way of producing ‘racial capitalism’, as the historians now call it, by means of slave labour, convict labour, sharecropping, then segregated labour markets – in other words, a ‘free enterprise system’ built on the ruins of black bodies, an economic edifice animated, saturated and determined by racism. There never was a free market in labour in these united states. Like every other market, it was always hedged by lawful, systematic discrimination against black folk. You might even say that this hedged market produced the still-deployed stereotypes of African-American laziness, by excluding black workers from remunerative employment, confining them to the ghettos of the eight-hour day.

And yet, and yet. Though work has often entailed subjugation, obedience and hierarchy (see above), it’s also where many of us, probably most of us, have consistently expressed our deepest human desire, to be free of externally imposed authority or obligation, to be self-sufficient. We have defined ourselves for centuries by what we do, by what we produce.

But by now we must know that this definition of ourselves entails the principle of productivity – from each according to his abilities, to each according to his creation of real value through work – and commits us to the inane idea that we’re worth only as much as the labour market can register, as a price. By now we must also know that this principle plots a certain course to endless growth and its faithful attendant, environmental degradation.

Until now, the principle of productivity has functioned as the reality principle that made the American Dream seem plausible. ‘Work hard, play by the rules, get ahead’, or, ‘You get what you pay for, you make your own way, you rightly receive what you’ve honestly earned’ – such homilies and exhortations used to make sense of the world. At any rate they didn’t sound delusional. By now they do.

Adherence to the principle of productivity therefore threatens public health as well as the planet (actually, these are the same thing). By committing us to what is impossible, it makes for madness. The Nobel Prize-winning economist Angus Deaton said something like this when he explained anomalous mortality rates among white people in the Bible Belt by claiming that they’ve ‘lost the narrative of their lives’ – by suggesting that they’ve lost faith in the American Dream. For them, the work ethic is a death sentence because they can’t live by it.

So the impending end of work raises the most fundamental questions about what it means to be human. To begin with, what purposes could we choose if the job – economic necessity – didn’t consume most of our waking hours and creative energies? What evident yet unknown possibilities would then appear? How would human nature itself change as the ancient, aristocratic privilege of leisure becomes the birthright of human beings as such?

Sigmund Freud insisted that love and work were the essential ingredients of healthy human being. Of course he was right. But can love survive the end of work as the willing partner of the good life? Can we let people get something for nothing and still treat them as our brothers and sisters – as members of a beloved community? Can you imagine the moment when you’ve just met an attractive stranger at a party, or you’re online looking for someone, anyone, but you don’t ask: ‘So, what do you do?’

We won’t have any answers until we acknowledge that work now means everything to us – and that hereafter it can’t.

Review: The Utopia of Rules, by David Graeber


By Jonathan Woolley

Source: Gods & Radicals

Reading a book about bureaucracy may not sound like an exciting way to spend a weekend off with my family. And yet, having just started David Graeber’s latest – The Utopia of Rules – I found that, whenever I wasn’t making tea for my elderly grandmother, I was curled up in a comfy chair with this little pink book, mocked up to look like one of the forms it excoriates, excited by each new page. Although many of the ideas Graeber presents here aren’t new, the clarity and force with which they are drawn together and set out is a rare pleasure – a contrast with turgid official paperwork that was almost certainly intentional.

Graeber – a social anthropologist, anarchist, and prominent leftist thinker, based at the London School of Economics (LSE) – develops his argument, in part, by thinking ethnographically with his own personal experiences of officialdom, beginning with a heartbreaking account of his struggle to deal with his elderly mother’s Medicaid application. From there, he introduces the book as a series of short essays on different facets of what he calls “total bureaucratisation” – defined as “the gradual fusion of public and private power into a single entity, rife with rules and regulations whose ultimate purpose is to extract wealth in the form of profits”. Bureaucracy is not a simple matter of red tape created by the state tying up private enterprise, as right-wing pundits would have us believe: Graeber points out that bureaucratic forms have become intrinsic to both the private and public spheres.

While the Left has been largely unable to produce a critique of bureaucracy, the Right has one – but the Right’s efforts to “roll back” the state have had the opposite effect, producing more paperwork than ever. This leads Graeber to propose what he calls “the Iron Law of Liberalism”, which states that “any market reform, any government initiative intended to reduce red tape and promote market forces will have the ultimate effect of increasing the total number of regulations, the total amount of paperwork, and the total number of bureaucrats the government employs.”

In stressing the coeval nature of the free market and an expansive state, Graeber directs his analysis away from shallow criticism of big government, towards the common institutional basis of all inequality, found at the heart of neoliberal governance. Given the extent to which the general public in the English-speaking world continue to view the expansive state and the “free” market as antithetical to one another and synonymous with the Left and the Right of politics respectively, this is an important point to make.

With the foundations laid, Graeber’s lucid prose carries the reader briskly through a sequence of stand-alone essays, each of which engages with a particular aspect of total bureaucratisation today. Each of these, Graeber claims, will need to be addressed by any critique of bureaucracy the Left might develop. Dead Zones of the Imagination uses feminist theory of imaginative labour to argue that bureaucracy – in addition to being stupid – exists to create stupidity. Its impersonal procedures, backed up by the threat of violence, ensure that those in positions of authority – especially the police – are able to avoid the imaginative labour of empathising with others, while forcing those others to perform imaginative labour towards the authorities, simply in order to avoid physical harm. Police insist upon being able to “define the situation” – and those who contest this, rather than violent criminals, are the ones routinely met with physical violence. This serves to emphasise a very basic point: don’t underestimate the importance of physical violence, even if it takes place behind a veil of paper.

In Of Flying Cars and the Declining Rate of Profit, Graeber turns his attention to the trajectory of technological development in the modern world. Why is it, he asks, that in the 1950s we were able to explore space, and expected to be surrounded by robotic servants and flying cars by now, yet this awesome potential has not been realised? The answer, he suggests, is that technology does not cause social change by itself; the direction of innovation is set by financial interests – so that instead of pursuing the automation and space travel that could disrupt existing economic relations on Earth, major funders have prioritised less disruptive lines of research, such as information technology. The greatest achievement of the late 20th century – the Internet – is revealed as decidedly chimeric: at once a tool for enhanced communication and a means of surveillance and manipulation on an industrial scale. The promise of technology has been broken in favour of labour discipline and social control; R&D budgets have been slashed in favour of boosting executive pay and shareholder dividends. Instead of being allowed to pursue their research interests, academics are increasingly forced to spend more and more of their time doing paperwork. Rather than a driver of social change, technology is itself subject to the demands of capital.

The Utopia of Rules, or Why We Really Love Bureaucracy After All concludes the triptych by exploring the ways in which bureaucracy can, in fact, be deeply enchanting – when it works well – providing human beings with a sense of predictability and certainty that can be deeply seductive. While the second essay uses science fiction to reflect upon the curious falling short of innovation, this essay turns to magic and fantasy fiction in an attempt to understand how the appeal of bureaucratic rationality is generated. Graeber argues that the elaborate angelic hierarchies and formulaic modes of ritual address developed in the Renaissance, which still enliven Western Ceremonial Magic, actually reflect a political imaginary – a vision of the chaotic, violent world of the Middle Ages reordered according to a spiritualised version of the old, lost Roman bureaucracy. Nowadays, however, this vision is inverted: fantasy fiction constructs a pseudo-medieval world where bureaucracy is almost entirely absent, where creativity is directly channelled into reality via magic, and where leadership is acquired on the basis of personal virtue and conquest rather than through impersonal qualification or graduate recruitment. However, while giving us an opportunity to vicariously enjoy a world without bureaucracy, medievalist fantasies – with their perennial sense of threat and danger – nonetheless reinforce our sense that it’s probably preferable to live with the devil we know. Just as the gruesome spectacle of gladiatorial combat both beguiled the populace of Rome and repelled it from the idea of democracy, the blood-soaked cities of Westeros instil in us a fear of a world without bureaucratic order.

Perhaps the most fascinating contention made by Graeber – albeit only in passing – is that bureaucratic rationality rests upon a resolutely spiritual set of commitments. The idea that numbers and their rational appraisal can help one to understand and manipulate reality reaches back to the Pythagoreanism of ancient Greece. The Pythagoreans, in turn, directly inspired Plato, the father of Western formalism, and through him the medieval angelic hierarchies mentioned above. This commitment to the power of logic and pure number conferred upon bureaucracy a utopian air; bureaucrats envision a world of perfect harmony, governed by well-designed, efficient institutions, and develop frameworks that attempt to make that world a reality. The fact that the complexity of the world-as-lived rarely fits these lofty ideals ensures that bureaucracy requires constant enforcement – with the force in question being the threat of violence meted out by private security, the police or the military.

But it is in the Appendix – Batman and the Problem of Constituent Power – that we find some of Graeber’s most timely observations for the present moment. In a playful analysis of the cultural and political significance of superheroes, Graeber points out that – building upon his analysis of medievalist fantasy in the previous chapter – comics teach the same kind of lesson. In pitting basically passive superheroes who seek to preserve the status quo against endlessly creative and scheming villains who wish to unseat it, comics allow the reader to vicariously enjoy the thrill of unfettered creative potential, only to enforce the idea that such potential necessarily leads to violence, and that violence is in turn the only way that it can be controlled.

In the Marvel and DC Universes, the only alternative to bureaucracy is the violent creativity of villains – in short, fascism. This, in turn, allows Graeber to highlight a broad distinction between the left and the right: “Ultimately, the division between left- and right-wing sensibilities turns on one’s attitude towards the imagination. For the Left, imagination, creativity, by extension production, the power to bring new things and new social arrangements into being, is always to be celebrated. It is the source of all real value in the world. For the Right, it is dangerous; ultimately, evil. The urge to create is also a destructive urge. This kind of sensibility was rife in the popular Freudianism of the day [1950s]: where the Id was the motor of the psyche, but also amoral; if really unleashed, it would lead to an orgy of destruction. This is also what separates conservatives from fascists. Both agree that the imagination unleashed can only lead to violence and destruction. Conservatives wish to defend us against that possibility. Fascists wish to unleash it anyway. They aspire to be, as Hitler imagined himself, great artists painting with the minds, blood, and sinews of humanity.”

Following the magisterial philosophical treatise Debt: The First 5,000 Years (2011), The Utopia of Rules is a more modest project. Graeber does not attempt to propose a leftist critique of total bureaucratisation within its pages, though he argues such a critique is long overdue. Nor does he advance a singular argument – his goal is simply to prompt a conversation. With the rise of the populist right, this conversation is more important than ever. The mainstream Left, Graeber points out, has for too long positioned itself on the side of state control, leaving critiques of bureaucracy to the Right. As the pro-market efforts of neoliberalism have done nothing but concentrate capital in the hands of the rentier classes, frustration is now boiling over. And yet, in unveiling the mystical roots of stultifying modern paperwork, Graeber reveals a way forward for us – if total bureaucratisation is a spell laid over the world, that spell may be broken. We need not live out the fevered dreams of Renaissance mystics; we can awaken. Nor shall the dark blood-and-bone portraits of fascists necessarily hold sway over the human imagination, for the Left is just as creative as the Right; indeed, unlike them, we can create without fear of creativity. The Right may aspire to break this world, but it is the birthright of the Left to make a better one.
