After the Crash

Dispatches From a Long Recovery (Est. 10/2024)

Saturday Matinee: Obsolete

Source: Truthstream Media

The Future Doesn’t Need Us… Or So We’ve Been Told. With the rise of technology and the real-time pressures of an online, global economy, humans will have to be very clever – and very careful – not to be left behind by the future. From the perspective of those in charge, human labor is losing its value, and people are becoming a liability. This documentary reveals the real motivation behind the secretive effort to reduce the population and bring resource use into strict, centralized control. Could it be that the biggest threat we face isn’t just automation and robots destroying jobs, but the larger sense that humans could become obsolete altogether? *Please watch and share!* Link to film: http://amzn.to/2f69Ocr

The United States of Work

Employers exercise vast control over our lives, even when we’re not on the job. How did our bosses gain power that the government itself doesn’t hold?

By Miya Tokumitsu

Source: New Republic

Work no longer works. “You need to acquire more skills,” we tell young job seekers whose résumés at 22 are already longer than their parents’ were at 32. “Work will give you meaning,” we encourage people to tell themselves, so that they put in 60 hours or more per week on the job, removing them from other sources of meaning, such as daydreaming or social life. “Work will give you satisfaction,” we insist, even though it requires abiding by employers’ rules, and the unwritten rules of the market, for most of our waking hours. At the very least, work is supposed to be a means to earning an income. But if it’s possible to work full time and still live in poverty, what’s the point?

Even before the global financial crisis of 2008, it had become clear that if waged work is supposed to provide a measure of well-being and social structure, it has failed on its own terms. Real household wages in the United States have remained stagnant since the 1970s, even as the costs of university degrees and other credentials rise. Young people find an employment landscape defined by unpaid internships, temporary work, and low pay. The glut of degree-holding young workers has pushed many of them into the semi- or unskilled labor force, making prospects even narrower for non–degree holders. Entry-level wages for high school graduates have in fact fallen. According to a study by the Federal Reserve Bank of New York, these lost earnings will depress this generation’s wages for their entire working lives. Meanwhile, those at the very top—many of whom derive their wealth not from work, but from returns on capital—vacuum up an ever-greater share of prosperity.

Against this bleak landscape, a growing body of scholarship aims to overturn our culture’s deepest assumptions about how work confers wealth, meaning, and care throughout society. In Private Government: How Employers Rule Our Lives (and Why We Don’t Talk About It), Elizabeth Anderson, a professor of philosophy at the University of Michigan, explores how the discipline of work has itself become a form of tyranny, documenting the expansive power that firms now wield over their employees in everything from how they dress to what they tweet. James Livingston, a historian at Rutgers, goes one step further in No More Work: Why Full Employment Is a Bad Idea. Instead of insisting on jobs for all or proposing that we hold employers to higher standards, Livingston argues, we should just scrap work altogether.

Livingston’s vision is the more radical of the two; his book is a wide-ranging polemic that frequently delivers the refrain “Fuck work.” But in original ways, both books make a powerful claim: that our lives today are ruled, above all, by work. We can try to convince ourselves that we are free, but as long as we must submit to the increasing authority of our employers and the labor market, we are not. We therefore fancy that we want to work, that work grounds our character, that markets encompass the possible. We are unable to imagine what a full life could be, much less to live one. Even more radically, both books highlight the dramatic and alarming changes that work has undergone over the past century—insisting that, in often unseen ways, the changing nature of work threatens the fundamental ideals of democracy: equality and freedom.

Anderson’s most provocative argument is that large companies, the institutions that employ most workers, amount to a de facto form of government, exerting massive and intrusive power in our daily lives. Unlike the state, these private governments are able to wield power with little oversight, because the executives and boards of directors that rule them are accountable to no one but themselves. Although they exercise their power to varying degrees and through both direct and “soft” means, employers can dictate how we dress and style our hair, when we eat, when (and if) we may use the toilet, with whom we may partner and under what arrangements. Employers may subject our bodies to drug tests; monitor our speech both on and off the job; require us to answer questionnaires about our exercise habits, off-hours alcohol consumption, and childbearing intentions; and rifle through our belongings. If the state held such sweeping powers, Anderson argues, we would probably not consider ourselves free men and women.

Employees, meanwhile, have few ways to fight back. Yes, they may leave the company, but doing so usually necessitates being unemployed or migrating to another company and working under similar rules. Workers may organize, but unions have been so decimated in recent years that their clout is greatly diminished. What’s more, employers are swift to fire anyone they suspect of speaking to their colleagues about organizing, and most workers lack the time and resources to mount a legal challenge to wrongful termination.

It wasn’t supposed to be this way. As corporations have worked methodically to amass sweeping powers over their employees, they have held aloft the beguiling principle of individual freedom, claiming that only unregulated markets can guarantee personal liberty. Instead, operating under relatively few regulations themselves, these companies have succeeded at imposing all manner of regulation on their employees. That is to say, they use the language of individual liberty to claim that corporations require freedom to treat workers as they like.

Anderson sets out to discredit such arguments by tracing them back to their historical origins. The notion that personal freedom is rooted in free markets, for instance, originated with the Levellers in seventeenth-century England, when working conditions differed substantially from today’s. The Levellers believed that a market society was essential to liberate individuals from the remnants of feudal hierarchies; their vision of utopia was a world in which men could meet and interact on terms of equality and dignity. Their ideas echoed through the writing and politics of later figures like John Locke, Adam Smith, Thomas Paine, and Abraham Lincoln, all of whom believed that open markets could provide the essential infrastructure for individuals to shape their own destiny.

An anti-statist streak runs through several of these thinkers, particularly the Levellers and Paine, who viewed markets as the bulwark against state oppression. Paine and Smith, however, would hardly qualify as hard-line contemporary libertarians. Smith believed that public education was essential to a fair market society, and Paine proposed a system of social insurance that included old-age pensions as well as survivor and disability benefits. Their hope was not for a world of win-or-die competition, but one in which open markets would allow individuals to make the fullest use of their talents, free from state monopolies and meddlesome bosses.

For Anderson, the latter point is essential; the notion of lifelong employment under a boss was anathema to these earlier visions of personal freedom. Writing in the 1770s, Smith assumes that independent actors in his market society will be self-employed, and uses butchers and bakers as his exemplars; his “pin factory,” meant to illustrate division of labor, employs only ten people. These thinkers could not envision a world in which most workers spend most of their lives performing wage labor under a single employer. In an address before the Wisconsin State Agricultural Society in 1859, Lincoln stated, “The prudent, penniless beginner in the world labors for wages awhile, saves a surplus with which to buy tools or land for himself, then labors on his own account another while, and at length hires another new beginner to help him.” In other words, even well into the nineteenth century, defenders of an unregulated market society viewed wage labor as a temporary stage on the way to becoming a proprietor.

Lincoln’s scenario does not reflect the way most people work today. Yet the “small business owner” endures as an American stock character, conjured by politicians to push through deregulatory measures that benefit large corporations. In reality, thanks to a lack of guaranteed, nationalized health care and threadbare welfare benefits, setting up a small business is simply too risky a venture for many Americans, who must rely on their employers for health insurance and income. These conditions render long-term employment more palatable than a precarious existence of freelance gigs, which further gives companies license to oppress their employees.

The modern relationship between employer and employee began with the rise of large-scale companies in the nineteenth century. Although employment contracts date back to the Middle Ages, preindustrial arrangements bore little resemblance to the documents we know today. Like modern employees, journeymen and apprentices often served their employers for years, but masters performed the same or similar work in proximity to their subordinates. As a result, Anderson points out, working conditions—the speed required of workers and the hazards to which they might be exposed—were kept in check by what the masters were willing to tolerate for themselves.

The Industrial Revolution brought radical changes, as companies grew ever larger and management structures more complex. “Employers no longer did the same kind of work as employees, if they worked at all,” Anderson observes. “Mental labor was separated from manual labor, which was radically deskilled.” Companies grew rapidly in size. Labor contracts now bound workers to massive organizations in which discipline, briefs, and decrees flowed downward, but whose leaders were unreachable by ordinary workers. Today, fast food workers or bank tellers would be hard-pressed to petition the CEOs of McDonald’s or Wells Fargo in person.

Despite this, we often speak of employment contracts as agreements between equals, as if we are living in Adam Smith’s eighteenth-century dream world. In a still-influential paper from 1937 titled “The Nature of the Firm,” the economist and Nobel laureate Ronald Coase established himself as an early observer and theorist of corporate concerns. He described the employment contract not as a document that handed the employer unaccountable powers, but as one that circumscribed those powers. In signing a contract, the employee “agrees to obey the directions of an entrepreneur within certain limits,” he emphasized. But such characterizations, as Anderson notes, do not reflect reality; most workers agree to employment without any negotiation or even communication about their employer’s power or its limits. The exceptions to this rule are few and notable: top professional athletes, celebrity entertainers, superstar academics, and the (increasingly small) groups of workers who are able to bargain collectively.

Yet because employment contracts create the illusion that workers and companies have arrived at a mutually satisfying agreement, the increasingly onerous restrictions placed on modern employees are often presented as “best practices” and “industry standards,” framing all sorts of behaviors and outcomes as things that ought to be intrinsically desired by workers themselves. Who, after all, would not want to work on something in the “best” way? Beyond employment contracts, companies also rely on social pressure to foster obedience: If everyone in the office stays until seven o’clock every night, who would risk departing at five, even if it’s technically allowed? Such social prods exist alongside more rigid behavioral codes that dictate everything from how visible an employee’s tattoo can be to when and how long workers can break for lunch.

Many workers, in fact, have little sense of the legal scope of their employer’s power. Most would be shocked to discover that they could be fired for being too attractive, declining to attend a political rally favored by their employer, or finding out that their daughter was raped by a friend of the boss—all real-life examples cited by Anderson. Indeed, it is only after dismissal for such reasons that many workers learn of the sweeping breadth of at-will employment, the contractual norm that allows American employers to fire workers without warning and without cause, except for reasons explicitly deemed illegal.

In reality, the employment landscape is even more dire than Anderson outlines. The rise of staffing or “temp” agencies, for example, undercuts the very idea of a direct relationship between worker and employer. In The Temp Economy: From Kelly Girls to Permatemps in Postwar America, sociologist Erin Hatton notes that millions of workers now labor under subcontracting arrangements, which give employers even greater latitude to abuse employees. For years, Walmart—America’s largest retailer—used a subcontracting firm to hire hundreds of cleaners, many from Eastern Europe, who worked for months on end without overtime pay or a single day off. After federal agents raided dozens of Walmarts and arrested the cleaners as illegal immigrants, company executives used the subcontracting agreement to shirk responsibility for their exploitation of the cleaners, claiming they had no knowledge of their immigration status or conditions.

By any reasonable standard, much “temp” work is not even temporary. Employees sometimes work for years in a single workplace, even through promotions, without ever being granted official status as an employee. Similarly, “gig economy” platforms like Uber designate their workers as contractors rather than employees, a distinction that exempts the company from paying them minimum wage and overtime. Many “permatemps” and contractors perform the same work as employees, yet lack even the paltry protections and benefits awarded to full-time workers.

A weak job market, paired with the increasing precarity of work, means that more and more workers are forced to make their living by stringing together freelance assignments or winning fixed-term contracts, subjecting those workers to even more rules and restrictions. On top of their actual jobs, contractors and temp workers must do the additional work of appearing affable and employable not just on the job, but during their ongoing efforts to secure their next gig. The constant pitching, application-writing, and personal branding on social media require a level of self-censorship, lest a controversial tweet or compromising Facebook photo sink their job prospects. Forced to anticipate the wishes not of a specific employer, but of all potential future employers, many opt out of participating in social media or practicing politics in any visible capacity. Their public personas are shaped not by their own beliefs and desires, but by the demands of the labor market.


For Livingston, it’s not just employers but work itself that is the problem. We toil because we must, but also because our culture has trained us to see work as the greatest enactment of our dignity and personal character. Livingston challenges us to turn away from such outmoded ideas, rooted in Protestant ideals. Like Anderson, he sweeps through centuries of labor theory with impressive efficiency, from Marx and Hegel to Freud and Lincoln, whose 1859 speech he also quotes. Livingston centers on these thinkers because they all found the connection between work and virtue troubling. Hegel believed that work causes individuals to defer their desires, nurturing a “slave morality.” Marx proposed that “real freedom came after work.” And Freud understood the Protestant work ethic as “the symptom of repression, perhaps even regression.”

Nor is it practical, Livingston argues, to exalt work: There are simply not enough jobs to keep most adults employed at a living wage, given the rise of automation and increases in productivity. Besides, the relation between income and work is arbitrary. Cooking dinner for your family is unpaid work, while cooking dinner for strangers usually comes with a paycheck. There’s nothing inherently different in the labor involved—only in the compensation. Anderson argues that work impedes individual freedom; Livingston points out that it rarely pays enough. As technological advances continue to weaken the demand for human labor, wages will inevitably be driven down even further. Instead of idealizing work and making it the linchpin of social organization, Livingston suggests, why not just get rid of it?

Livingston belongs to a cadre of thinkers, including Kathi Weeks, Nick Srnicek, and Alex Williams, who believe that we should strive for a “postwork” society in one form or another. Strands of this idea go back at least as far as Keynes’s 1930 essay on “Economic Possibilities for our Grandchildren.” Not only would work be eliminated or vastly reduced by technology, Keynes predicted, but we would also be unburdened spiritually. Devotion to work was, he deemed, one of many “pseudo-moral principles” that “exalted some of the most distasteful of human qualities into the position of the highest virtues.”

Since people in this new world would no longer have to earn a salary, they would, Livingston envisions, receive some kind of universal basic income. UBI is a slippery concept, adaptable to both the socialist left and libertarian right, but it essentially entails distributing a living wage to every member of society. In most conceptualizations, the income is indeed basic—no cases of Dom Pérignon—and would cover the essentials like rent and groceries. Individuals would then be free to choose whether and how much they want to work to supplement the UBI. Leftist proponents tend to advocate pairing UBI with a strong welfare state to provide nationalized health care, tuition-free education, and other services. Some libertarians view UBI as a way to pare down the welfare state, arguing that it’s better simply to give people money to buy food and health care directly, rather than forcing them to engage with food stamp and Medicaid bureaucracies.

According to Livingston, we are finally on the verge of this postwork society because of automation. Robots are now advanced enough to take over complex jobs in areas like agriculture and mining, eliminating the need for humans to perform dangerous or tedious tasks. In practice, however, automation is a double-edged sword, with the capacity to oppress as well as unburden. Machines often accelerate the rate at which humans can work, taxing rather than liberating them. Conveyor belts eliminated the need for workers to pass unfinished products along to their colleagues—but as Charlie Chaplin and Lucille Ball so hilariously demonstrated, the belts also increased the pace at which those same workers needed to turn wrenches and wrap chocolates. In retail and customer service, a main function of automation has been not to eliminate work, but to eliminate waged work, transferring much of the labor onto consumers, who must now weigh and code their own vegetables at the supermarket, check out their own library books, and tag their own luggage at the airport.

At the same time, it may be harder to automate some jobs that require a human touch, such as floristry or hairstyling. The same goes for the delicate work of caring for the young, sick, elderly, or otherwise vulnerable. In today’s economy, the demand for such labor is rising rapidly: “Nine of the twelve fastest-growing fields,” The New York Times reported earlier this year, “are different ways of saying ‘nurse.’” These jobs also happen to be low-paying, emotionally and physically grueling, dirty, hazardous, and shouldered largely by women and immigrants. Regardless of whether employment is virtuous or not, our immediate goal should perhaps be to distribute the burdens of caregiving, since such work is essential to the functioning of society and benefits us all.


A truly work-free world would entail a revolution in our present social organization. We could no longer conceive of welfare as a last resort—as the “safety net” metaphor implies—but would be forced to treat it as an unremarkable and universal fact of life. This alone would require us to support a massive redistribution of wealth, and to reclaim our political institutions from the big-money interests that are allergic to such changes. Tall orders indeed—but as Srnicek and Williams remind us in their book, Inventing the Future: Postcapitalism and a World Without Work, neoliberals pulled off just such a revolution in the postwar years. Thanks to their efforts, free-market liberalism replaced Keynesianism as the political and economic common sense all around the world.

Another possible solution to the current miseries of unemployment and worker exploitation is the one Livingston rejects in his title: full employment. For anti-work partisans, full employment takes us in the wrong direction, and UBI corrects the course. But the two are not mutually exclusive. In fact, rather than creating new jobs, full employment could require us to reduce our work hours drastically and spread them throughout the workforce—a scheme that could radically de-center waged work in our lives. A dual strategy of pursuing full employment while also demanding universal benefits—including health care, childcare, and affordable housing—would maximize workers’ bargaining power to ensure that they, and not just owners of capital, actually get to enjoy the bounty of labor-saving technology.

Nevertheless, Livingston’s critiques of full employment are worth heeding. As with automation, it can all go wrong if we use the banner of full employment to create pointless roles—what David Graeber has termed “bullshit jobs,” in which workers sit in some soul-sucking basement office for eight hours a day—or harmful jobs, like building nuclear weapons. If we do not have a deliberate politics rooted in universal social justice, then full employment, a basic income, and automation will not liberate us from the degradations of work.

Both Livingston and Anderson reveal how much of our own power we’ve already ceded in making waged work the conduit for our ideals of liberty and morality. The scale and coordination of the institutions we’re up against in the fight for our emancipation is, as Anderson demonstrates, staggering. Employers hold the means to our well-being, and they have the law on their side. Individual efforts to achieve a better “work-life balance” for ourselves and our families miss the wider issue we face as waged employees. Livingston demonstrates the scale at which we should be thinking: Our demands should be revolutionary, our imaginations wide. Standing amid the wreckage of last year’s presidential election, what other choice do we have?

 

Miya Tokumitsu is a lecturer in art history at the University of Melbourne and a contributing editor at Jacobin. She is the author of Do What You Love: And Other Lies About Success and Happiness.

Reimagining Money

What if markets were designed to build trust instead of wealth?

By Douglas Rushkoff

Source: The Atlantic

Bitcoin was conceived as a modern solution to an ages-old problem: How can two parties agree on and verify an exchange of value? In this sense, Bitcoin is an effective technology, in that it trains the massive processing power of distributed personal computers on the same situation that paper currency was built to resolve. But in important ways, Bitcoin transposes some of the shortcomings of traditional currency onto the digital realm. It ignores a whole host of questions about the potential to reimagine what money can be designed to emphasize: What sorts of money will encourage admirable human behavior? What sorts of money systems will encourage trust, reenergize local commerce, favor peer-to-peer value exchange, and transcend the growth requirement? In short, how can money be less an extractor of value and more a utility for its exchange?

Around the world, people have proposed experimental, tentative answers to these questions. What follows are three ways that people have toyed with rearranging the priorities of transactions—all of which would encourage a radical reimagination of what money is and can do.

The simplest approach to limiting the delocalizing, extractive power of central currency is for communities to adopt their own local currencies, pegged or tied in some way to a central currency. One of the first and most successful contemporary efforts is the Massachusetts BerkShare, which was developed to help keep money from flowing out of the Berkshire region.

One hundred BerkShares cost $95 and are available at local banks throughout the region. Participating local merchants then accept them as if they were dollars—offering their customers what amounts to a 5-percent discount for using the local money. Although it amounts to selling goods at a perpetual discount, merchants can in turn spend their local currency at other local businesses and receive the same discounted rate. Nonlocals and tourists purchase goods with dollars at full price, and those who bother to purchase items with BerkShares presumably leave town with a bit of unspent local money in their pockets.
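The arithmetic of the exchange is simple enough to sketch in a few lines of code. The 95-cents-per-BerkShare rate comes from the description above; the transaction amounts and function names below are invented for illustration.

```python
# Illustrative arithmetic for the BerkShare exchange described above.
# The exchange rate is from the article; the amounts are made up.

EXCHANGE_RATE = 0.95  # dollars paid per BerkShare (100 BerkShares cost $95)

def buy_berkshares(dollars: float) -> float:
    """BerkShares received when converting dollars at a participating bank."""
    return dollars / EXCHANGE_RATE

def merchant_discount(price_in_berkshares: float) -> float:
    """Effective dollar discount a merchant gives by accepting BerkShares at par."""
    return price_in_berkshares * (1 - EXCHANGE_RATE)

print(buy_berkshares(95.00))                # 100.0 BerkShares for $95
print(round(merchant_discount(100.00), 2))  # a B$100 sale amounts to a $5.00 (5%) discount
```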

The 5-percent local discount may seem like a huge disadvantage to take on—but only if businesses think of themselves as competing individuals. In the long term, the discount is more than compensated for by the fact that BerkShares can circulate only locally. They remain in the region and come back to the same stores again and again. Even if nonlocal stores, such as Walmart, agree to accept the local currency, they can’t deliver it up to shareholders or trap it in static savings. The best Walmart can do is use it to pay its local workers or purchase supplies and services from local merchants.

* * *

Unlike local discount currencies, cooperative community currencies don’t need to be pegged to the dollar at all. They are not purchased into existence but are worked into circulation. They are best thought of less like money than like exchanges.

The simplest form of cooperative currency is a favor bank, such as those founded in Greece and other parts of southern Europe during the Euro crisis. Incapable of finding work or sourcing Euros, people in many places lost the ability to transact. Even though a majority of what they needed could be produced locally, they had no cash with which to trade. So they built simple, secure trading websites—mini-eBays—where people offered their goods and services to others in return for the goods and services they needed. The sites did not record value amounts so much as keep general track of who was providing what to the community and coordinate fair exchanges. This casual, transparent solution works particularly well in a community where people already know one another and freeloaders can be pressured to contribute.

Larger communities have been using “time dollars,” a currency system that keeps track of how many hours people contribute to one another. Again, a simple exchange is set up on a website, where people list what they need and what they can contribute. The bigger and more anonymous a community, the more security and verification are required. Luckily, dozens of startups and nonprofit organizations have been developing apps and website kits via which local or even nonlocal communities can establish and run their own currencies.

Time exchanges tend to work best when everybody values their time the same way or is providing the same service. Time dollars are extremely egalitarian, valuing each person’s time the same as anyone else’s. An “hour” is worth one hour of work, whether it is performed by a plumber or a psychotherapist.

The Japanese recession gave rise to one of the most successful time exchanges yet, called Fureai Kippu, or “Caring Relationship Tickets.” People no longer had enough cash to pay for their parents’ or grandparents’ health-care services—but because they had moved far away from home to find jobs, they couldn’t take care of their relatives themselves either. The Fureai Kippu exchange gave people the ability to bank hours of eldercare by taking care of old people in their communities, which they could then spend to get care for their own relatives far away. So one person might provide an hour of bathing services for an elder in her neighborhood in return for someone preparing meals for her grandfather who lives in another city. As the Caring Relationship Tickets became accepted things of value, people began using them for a variety of services.

Although a person can do a lot of work in order to bank enough hours to obtain a wide range of services, most time exchanges put a limit on how many hours members can accumulate. They also put a limit on how many hours a person can owe. This way a freeloader can be removed from the system, and the entire community can absorb the cost of the unearned hours pretty easily.
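As a rough sketch of how such a ledger might work in code — the cap values and member names below are hypothetical, not drawn from any actual exchange:

```python
# A minimal time-bank ledger: every member's hour is worth the same,
# and balances are capped in both directions, so hoarders can't drain
# the system and freeloaders can't run up large debts.

MAX_CREDIT_HOURS = 20.0   # most hours a member may accumulate (assumed cap)
MAX_DEBT_HOURS = -10.0    # most hours a member may owe (assumed cap)

class TimeBank:
    def __init__(self) -> None:
        self.balances: dict[str, float] = {}

    def record_service(self, provider: str, recipient: str, hours: float) -> bool:
        """Credit the provider and debit the recipient, hour for hour.
        Reject the exchange if it would push either member past a cap."""
        p = self.balances.get(provider, 0.0) + hours
        r = self.balances.get(recipient, 0.0) - hours
        if p > MAX_CREDIT_HOURS or r < MAX_DEBT_HOURS:
            return False
        self.balances[provider] = p
        self.balances[recipient] = r
        return True

bank = TimeBank()
bank.record_service("Aiko", "Hana", 1.0)  # one hour of eldercare banked by Aiko
print(bank.balances)                      # {'Aiko': 1.0, 'Hana': -1.0}
```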

* * *

How might traditional banks participate effectively in the financial rehabilitation of the communities they serve? Here’s just one possibility:

Sam’s Pizzeria is thriving as a local business, and Sam needs $200,000 to expand the dining room and build a second restroom. Normally, the bank would evaluate his business and credit and then either reject his loan request or give him the money at around 8 percent interest. The risk is that he won’t get enough new business to fill the new space, won’t be able to pay back the loan, and will go out of business. Indeed, part of the cost of the loan is that speculative risk.

In another approach, the banker could make Sam a different offer. The bank could agree to put up $100,000 toward the expansion project at 8 percent if Sam is able to raise the other $100,000 from his community in the form of market money: Sam is to sell digital coupons for $120 worth of pizza at the expanded restaurant at a cost of $100 per coupon. The bank can supply the software and administer the escrow. If Sam can’t raise the money, then it proves the community wasn’t ready, and the bank can return everyone’s money.

If he does raise the money, then the bank has gained the security of a terrific community buy-in. Sam gets his money more cheaply than if he had borrowed the whole sum from the bank, because he can pay back the interest in retail-priced pizza. The community lenders earn a fast 20 percent on their money—far more than they could earn in a bank or mutual fund. And it’s an investment that pays all sorts of other dividends: a more thriving downtown, more customers for other local businesses, better real-estate values, a higher tax base, better public schools, and so on. These are benefits one can’t see when buying stocks or abstract derivatives. Meanwhile, all the local “investors” now have a stake in the restaurant’s staying open at least long enough for them to cash in all their coupons. That’s good motivation to publicize it, take friends out to eat there, and contribute to its success.
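A quick back-of-the-envelope comparison of the two financing arrangements, using the figures above and assuming for simplicity a one-year term, simple interest, and 1,000 coupons sold:

```python
# Compare the all-bank loan with the split bank/community arrangement.
# Dollar figures are from the article; the one-year simple-interest
# horizon and the number of coupons sold are simplifying assumptions.

LOAN_RATE = 0.08

def bank_only_interest(principal: float) -> float:
    """Cash interest owed if the whole sum is borrowed from the bank."""
    return principal * LOAN_RATE

def split_financing(bank_share: float, coupon_face: float,
                    coupon_price: float, coupons_sold: int) -> tuple[float, float]:
    """Cash interest on the bank's half, plus the 'interest' owed to
    community lenders in retail-priced pizza."""
    cash_interest = bank_share * LOAN_RATE
    pizza_interest = (coupon_face - coupon_price) * coupons_sold
    return cash_interest, pizza_interest

print(bank_only_interest(200_000))  # $16,000 in cash interest on the full loan

cash, pizza = split_financing(100_000, 120, 100, 1_000)
print(cash)   # $8,000 in cash interest to the bank
print(pizza)  # $20,000 of pizza at retail price, which costs Sam less than
              # $20,000 to make -- and pays lenders a 20 percent return
```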

For its part, the bank has diversified its range of services, bet on the possibility that community currencies will gain traction, and demonstrated a willingness to do something other than extract value from a community. The bank becomes a community partner, helping a local region invest in itself. The approach also provides the bank with a great hedge against continued deflation, hyperinflation, or growing consumer dissatisfaction with Wall Street and centrally issued money. If capital lending continues to contract as a business sector, the bank has already positioned itself to function as more of a service company—providing the authentication and financial expertise small businesses still need to thrive.

The bank transforms itself from an agent of debt to a catalyst for distribution and circulation. Like money in a digital age, it becomes less a thing of value in itself than a way of fostering the value creation and exchange of others.


This article has been adapted from Throwing Rocks at the Google Bus: How Growth Became the Enemy of Prosperity.

Disposable Americans: The Numbers are Growing

By Paul Buchheit

Source: Information Clearing House

As often noted in the passionate writings of Henry Giroux, poor Americans are becoming increasingly ‘disposable’ in our winner-take-all society. After 35 years of wealth distribution to the super-rich, inequality has forced much of the middle class towards the bottom, to near-poverty levels, and to a state of helplessness in which they find themselves being blamed for their own misfortunes.

The evidence keeps accumulating: income and wealth — and health — are declining for middle-class America. As wealth at the top grows, the super-rich feel they have little need for the rest of society.

Income Plummets for the Middle Class

According to Pew Research, in 1970 three of every ten income dollars went to upper-income households. Now five of every ten dollars goes to them.

The Social Security Administration reports that over half of Americans make less than $30,000 per year. That’s less than full-time work at $16.87 per hour—the average living wage as calculated by the Alliance for a Just Society—would pay.

Wealth Collapses for Half of Us

Numerous sources report that half or more of American families have virtually no savings, and would have to borrow money or sell possessions to cover an emergency expense. Between half and two-thirds of Americans have less than $1,000 in savings.

For every $100 owned by a middle-class household in 2001, that household now has just $72.

Not surprisingly, race plays a role in the diminishing of middle America. According to Pew Research, the typical black family has only enough liquid savings to last five days, compared to 12 days for the typical Hispanic household, and 30 days for a white household.

Our Deteriorating Health

In a disgraceful display of high-level disregard for vital health issues, House Republicans are attempting to cut back on lunches for over 3 million kids.

The evidence for the health-related disposability of poor Americans comes from a new study that finds a nearly 15-year difference in life expectancy between 40-year-old men in the richest 1% and those in the poorest 1% (10 years for women). Much of the disparity has arisen in just the past 15 years.

It’s not hard to understand the dramatic decline in life expectancy, as numerous studies have documented the health problems resulting from the inequality-driven levels of stress and worry and anger that make Americans much less optimistic about the future. The growing disparities mean that our children will likely see fewer opportunities for their own futures.

It May Be Getting Worse

The sense derived from all this is that half of America is severely financially burdened, at risk of falling deeper into debt.

It may be more than half. The Wall Street Journal recently reported on a JP Morgan study’s conclusion that “the bottom 80% of households by income lack sufficient savings to cover the type of volatility observed in income and spending.” Fewer than one in three 25- to 34-year-olds live in their own homes, a 20 percent drop in just the past 15 years.

It may be even worse for renters. The number of families spending more than half their incomes on rent — the ‘severely’ cost-burdened renters — has increased by a stunning 50 percent in just ten years. Billionaire Steve Schwarzman, whose company Blackstone has been buying up tens of thousands of homes at rock-bottom prices and then renting them out while waiting out the housing market, finds the growing anger among voters “astonishing.”

What’s astonishing is the disregard that many of the super-rich have for struggling Americans.

 

Paul Buchheit is a college teacher, an active member of US Uncut Chicago, founder and developer of social justice and educational websites (UsAgainstGreed.org, PayUpNow.org, RappingHistory.org), and the editor and main author of “American Wars: Illusions and Realities” (Clarity Press). He can be reached at paul@UsAgainstGreed.org.

Bureaucratic Insanity is Yours to Enjoy

Reposted below is Dmitry Orlov’s review of the new publication from Club Orlov Press: “Bureaucratic Insanity: The American Bureaucrat’s Descent into Madness” by Sean J. Kerrigan.

By Dmitry Orlov

Source: ClubOrlov

In the contemporary United States a child can be charged with battery for throwing a piece of candy at a schoolfriend. Students can be placed in solitary confinement for cutting class. Adults aren’t much better off: in 2011 the Supreme Court decided that anyone the police arrest, even for an offense as minor as an unpaid traffic ticket, can be strip-searched. These acts of official violence are just the tip of the iceberg in our society.

The number of rules and laws to which Americans, mostly unbeknownst to them, are subject, is hilariously excessive. But what makes this comedy unbearable is that these rules and laws are often enforced with an overabundance of self-righteous venom. Increasingly, contemporary American bureaucrats—be they police, teachers or government officials—are obsessed with following strict rules and mercilessly punishing all those who fail to comply (unless they are very rich or politically connected).

In so doing, these bureaucrats have become so liberated from the constraints of common sense that the situation has gone far beyond parody and is now a full-blown farce. Consider this recent news story involving a Virginia sixth-grader, the son of two schoolteachers and a member of the school’s program for gifted students. The boy was targeted by school officials after they found a leaf, probably a maple leaf, in his backpack. Someone suspected it to be marijuana. The leaf in question was not marijuana (as confirmed by repeated lab tests). End of story, wouldn’t you think?

Not at all! The 11-year-old was expelled and charged with marijuana possession in juvenile court. These charges were eventually dropped. He was then forced to enroll in an alternative school away from his friends, where he is subjected to twice-daily searches for drugs and periodic evaluation for substance abuse problems—all of this for possession of a maple leaf.

“It doesn’t matter if your son or daughter brings a real pot leaf to school, or if he brings something that looks like a pot leaf—okra, tomato, maple, buckeye, etc. If your kid calls it marijuana as a joke, or if another kid thinks it might be marijuana, that’s grounds for expulsion,” the Washington Post cheerfully reassures us.

A reasonable school official would recognize the difference between a technical violation caused by an oversight and a conscious attempt to smuggle drugs into the school. But school officials were intent on ignoring their own better sense, instead favoring harsh punishments.

In his new book, Bureaucratic Insanity: The American Bureaucrat’s Descent into Madness, Sean Kerrigan documents dozens of eyebrow-raising examples in which America’s rule-enforcers perversely revel in handing out absurd and unfair punishments for minor infractions. They demand total and complete submission, driven by a perverse compulsion to “put us in our place” and to “teach us a lesson.” They mercilessly punish even the most inconsequential transgressions in order to maximize our terror and humiliation.

When Sean first began following this story several years ago, he became mesmerized by this bizarre carnival of unreason. “Where is all this pent-up rage coming from?” he wondered, “and why is it being directed toward the weakest and most vulnerable members of society?” And then news stories like those mentioned above grew more and more common. Eventually, he started compiling a list of the most egregious abuses, trying to detect patterns, searching for some explanation for why your average garden-variety bureaucrat has morphed into a monster and has started to take sadistic pleasure in the suffering of innocent people.

Some people might argue that this kind of behavior is the result of political correctness gone amok. Others point to the irrational fear of terrorism and mass shootings. Yet others might think that it has to do with the bureaucrats’ fear of losing their jobs—merely for failing to comply with the exact letter of some rule. While there may be some truth to each of these explanations, they are far from adequate. Many of these bureaucratic abuses have nothing to do with political constraints on free speech, or with guns or terrorism, and in most cases the bureaucrats have the power to minimize harm, but instead they choose to maximize it.

In looking closer at each individual instance, it became clear that most of the offending bureaucrats weren’t even attempting to use their judgment but were mindlessly following written rules. Even in the most nurturing and humanistic professions—teaching and medicine—practitioners have been robotized to such an extent that they now perform a very narrow range of actions. Thanks to all the progress in IT, their work is now quite detached from physical reality. Much of their work now consists of monotonously, mindlessly pounding at the computer keyboard. Consequently, a large portion of their waking lives has taken on an ethereal, pointless quality. Even teachers, who once had relatively free rein in forming the minds of the next generation, are now forced to behave like machines, teaching to standardized tests and working a grueling average of 53 hours a week.

The psychological effects of this pressure have been profound. Deprived of the opportunity to make their own decisions and to see the positive effects of their efforts, these workers experience their work as personally meaningless, alienating, depersonalizing and psychologically damaging. As a result of this damage, American bureaucrats, although they may look like mild-mannered professionals, have become prone to sudden bouts of aggressive, sadistic behavior. They are unable to act out their repressed rage in any socially acceptable way other than by doling out punishments, fines, rejections, expulsions and other forms of objective, systemic violence.

The road to hell is paved with good intentions, and rest assured that this is all being done for our own good. The purpose of all of these rules and laws, from the perspective of the American system of governance, is to maximize control over everything that can be controlled and to micromanage every possible detail of our lives—in order to make them better! From student testing all the way to global trade, those in leadership positions are trying to centralize as much authority as possible in order to maximize efficiency, profit, American power… while minimizing our dignity, well-being and happiness. Oops!

In his book Bureaucratic Insanity, Sean traces the development of this trend from the early years of the industrial revolution to the modern day, from its initial appearance in factory life and in the military, to it later metastasizing to the office, and now taking over America’s schools. He argues persuasively, based on a careful and thorough review of literature in history, philosophy, psychology, anthropology and social criticism, that the average American bureaucrat is literally, clinically insane. The average American bureaucrat has a warped perception of reality and an intense, repressed self-hatred. Their only way to vent their rage is by punishing others using bureaucratic methods. They demand absolute conformity because it is their only way to give their meaningless lives some semblance of meaning. They suppress all thoughts that might lead them to discover the true nature of their condition, because that would cause them to spiral down into outright schizophrenia.

The book concludes with an assessment of what we can do to insulate ourselves from this seemingly unstoppable trend, and of how we can reinvigorate our lives by giving them real meaning.

 

How 90% of American Households Lost an Average of $17,000 in Wealth to the Plutocrats in 2016

By Paul Buchheit

Source: Information Clearing House

America has always been great for the richest 1%, and it’s rapidly becoming greater. Confirmation comes from recent work by Thomas Piketty, Emmanuel Saez, and Gabriel Zucman; and from the 2015-2016 Credit Suisse Global Wealth Databooks (GWD). The data relevant to this report is summarized here.

The Richest 1% Extracted Wealth from Every Other Segment of Society 

These multi-millionaires effectively shifted nearly $4 trillion in wealth away from the rest of the nation to themselves in 2016. While there’s no need to offer condolences to the rest of the top 10%, who still have an average net worth of $1.3 million, nearly half of the wealth transfer ($1.94 trillion) came from the nation’s poorest 90% — the middle and lower classes, according to Piketty, Saez, and Zucman. That’s over $17,000 in housing and savings per lower-to-middle-class household lost to the super-rich.

Put another way, the average 1% household took an additional $3 million of our national wealth in one year while education and infrastructure went largely unfunded.

It Gets Worse: Each MIDDLE-CLASS Household Lost $35,000 to the 1% 

According to Piketty, Saez, and Zucman, the true middle class is “the group of adults with income between the median and the 90th percentile.” This group of 50 million households lost $1.76 trillion of their wealth in 2016, or over $35,000 each. That’s a $35,000 decline in housing and financial assets, with possibly increased debt, for every middle-class household.
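For readers who want to check the per-household arithmetic, here is a minimal calculation; the household counts are the ones used elsewhere in this piece (112 million households in total, 50 million in the middle group):

```python
# Back-of-the-envelope check of the per-household figures cited above.

bottom_90_transfer = 1.94e12   # dollars shifted away from the bottom 90% in 2016
middle_transfer = 1.76e12      # dollars shifted away from the middle class in 2016

bottom_90_households = 112_000_000  # total household count cited later in this piece
middle_households = 50_000_000      # middle-class households per Piketty/Saez/Zucman

print(bottom_90_transfer / bottom_90_households)  # ~17,321 -> "over $17,000"
print(middle_transfer / middle_households)        # 35,200  -> "over $35,000"
```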

Housing Wealth for the 90% Has Been Converted into Investment Wealth for the Plutocrats

In the 1980s, the housing wealth of the bottom 90% made up about 15 percent of total household wealth (Figure 8 here and Page 41 here).

In the 1980s, the corporate equities owned by the richest .01% made up about 1.2 percent of total household wealth (Figure 8 here).

Housing was 12 times greater than super-rich stock holdings back then. Now they’re nearly equal. The home values of 112,000,000 households have been reduced to just over 5 percent of total wealth, while the stocks and securities of the richest 12,000 households are approaching 5 percent of total wealth. Our homes have turned to dust, and the plutocrats have turned the dust into gold.

Even the Wages of the Poorest Americans Have Been Transferred to the Plutocrats 

It’s bad enough that the poorest 50% of Americans have no appreciable wealth; worse, their income has not increased in 40 years (see Table 1 here). More evidence comes from Pew Research.

As Piketty, Saez, and Zucman note, the richest 1% and the poorest 50% “have basically switched their income shares.” They explain, “We observe a complete collapse of the bottom 50% income share in the US between 1978 and 2015, from 20% to 12% of total income, while the top 1% income share rose from 11% to 20%.”

Making America Great for 1% of Us 

In his book, Glass House: The 1% Economy and the Shattering of the All-American Town, Brian Alexander describes today’s America through the lens of his hometown of Lancaster, Ohio, which had been a leading glasswares manufacturer. But the town started falling apart in the 1980s. A major glasswares company was bought up with borrowed money by private equity firms, which then cut jobs and wages, allowed manufacturing facilities to fall into disrepair, stopped contributing to pensions, moved company headquarters out of state, and demanded tax breaks to keep the glassware plant in Lancaster.

Capitalism as usual. Yet 59 percent of Lancaster’s county voted for Trump. Alexander explains that the people of Lancaster “remained captured by an ultra-conservative, anti-tax philosophy that prevented them from raising funds to repair the crumbling streets.”

Delusions persist about the power of the market and the dangers of governing ourselves. The business media has conditioned us to fear the words ‘social’ and ‘public,’ as if they connote evil or ineptitude or anti-Americanism. But the public good depends on cooperation. Society fosters individual accomplishment, not the other way around.

The obscene transfer of wealth and income to the plutocrats won’t end until we demand a return to the Commons, where we work as a society rather than allow predatory plutocratic individuals to control us. There are 112 million households in America that are giving thousands of their hard-earned dollars to the 1%, and we have finally begun to fight back, together, as a massive force of Americans who refuse to let the theft continue.

 

Paul Buchheit is a writer for progressive publications, and the founder and developer of social justice and educational websites, including: UsAgainstGreed.org, PayUpNow.org, and RappingHistory.org. This article was first published at Common Dreams

Against meaninglessness and precarity: the crisis of work

By David Frayne

Source: ROAR Magazine

If work is vital for income, social inclusion and a sense of identity, then one of the most troubling contradictions of our time is that the centrality of work in our societies persists even when work is in a state of crisis. The steady erosion of stable and satisfying employment makes it less and less clear whether modern jobs can offer the sense of moral agency, recognition and pride required to secure work as a source of meaning and identity. The standardization, precarity and dubious social utility that characterize many modern jobs are a major source of modern misery.

Mass unemployment is also now an enduring structural feature of capitalist societies. The elimination of huge quantities of human labor by the development of machine technologies is a process that has spanned centuries. However, perhaps due to high-profile developments like Apple’s Siri computer assistant or Amazon’s delivery drones, the discussion around automation has once again been ignited.

An often-cited study by Carl Frey and Michael Osborne anticipates an escalation of technological unemployment over the coming years. Occupations at high risk include the likes of models, cooks and construction workers, thanks to advances such as digital avatars, burger flipping machines and the ability to manufacture prefabricated buildings in factories with robots. It is also anticipated that advances in artificial intelligence and machine learning will allow an increasing quantity of cognitive work tasks to become automated.

What all of this means is that we are steadily becoming a society of workers without work: a society of people who are materially, culturally and psychologically bound to paid employment, but for whom there are not enough stable and meaningful jobs to go around. Perversely, the most pressing problem for many people is no longer exploitation, but the absence of opportunities to be sufficiently and dependably exploited. The impact of this problem in today’s epidemic of anxiety and exhaustion should not be underestimated.

What makes the situation all the crueler is the pervasive sense that the precarious victims of the crisis are somehow personally responsible for their fate. In the UK, barely a week goes by without a smug reaffirmation of the work ethic in the media, or some story that constructs unemployment as a form of deviance. The UK television show Benefits Street comes to mind, but perhaps the most outrageous example in recent times was not from the world of trash TV, but from Dr. Adam Perkins’ thesis, The Welfare Trait. Published last year, Perkins’ book tackled what he defined as the “employment-resistant personality”. Joblessness is explained in terms of an inter-generationally transmitted psychological disorder. Perkins’ study is the most polished product of the ideology of work one can imagine. His study is so dazzled by its own claims to scientific objectivity, so impervious to its own grounding in the work ethic, that it beggars belief.

It seems we find ourselves at a rift: on the one hand, work has been positioned as a central source of income, solidarity and social recognition; on the other, the promise of stable, meaningful and satisfying employment crumbles around us. The crucial question is: how should societies adjust to this deepening crisis of work?

 


This is an excerpt from David Frayne’s “Towards a Post-Work Society”, which will appear in ROAR Issue #2, The Future of Work, scheduled for release in June/July.

Fuck Work

Economists believe in full employment. Americans think that work builds character. But what if jobs aren’t working anymore?

By James Livingston

Source: aeon

Work means everything to us Americans. For centuries – since, say, 1650 – we’ve believed that it builds character (punctuality, initiative, honesty, self-discipline, and so forth). We’ve also believed that the market in labour, where we go to find work, has been relatively efficient in allocating opportunities and incomes. And we’ve believed that, even if it sucks, a job gives meaning, purpose and structure to our everyday lives – at any rate, we’re pretty sure that it gets us out of bed, pays the bills, makes us feel responsible, and keeps us away from daytime TV.

These beliefs are no longer plausible. In fact, they’ve become ridiculous, because there’s not enough work to go around, and what there is of it won’t pay the bills – unless of course you’ve landed a job as a drug dealer or a Wall Street banker, becoming a gangster either way.

These days, everybody from Left to Right – from the economist Dean Baker to the social scientist Arthur C Brooks, from Bernie Sanders to Donald Trump – addresses this breakdown of the labour market by advocating ‘full employment’, as if having a job is self-evidently a good thing, no matter how dangerous, demanding or demeaning it is. But ‘full employment’ is not the way to restore our faith in hard work, or in playing by the rules, or in whatever else sounds good. The official unemployment rate in the United States is already below 6 per cent, which is pretty close to what economists used to call ‘full employment’, but income inequality hasn’t changed a bit. Shitty jobs for everyone won’t solve any social problems we now face.

Don’t take my word for it, look at the numbers. Already a fourth of the adults actually employed in the US are paid wages lower than would lift them above the official poverty line – and so a fifth of American children live in poverty. Almost half of employed adults in this country are eligible for food stamps (most of those who are eligible don’t apply). The market in labour has broken down, along with most others.

Those jobs that disappeared in the Great Recession just aren’t coming back, regardless of what the unemployment rate tells you – the net gain in jobs since 2000 still stands at zero – and if they do return from the dead, they’ll be zombies, those contingent, part-time or minimum-wage jobs where the bosses shuffle your shift from week to week: welcome to Wal-Mart, where food stamps are a benefit.

And don’t tell me that raising the minimum wage to $15 an hour solves the problem. No one can doubt the moral significance of the movement. But at this rate of pay, you pass the official poverty line only after working 29 hours a week. The current federal minimum wage is $7.25. Working a 40-hour week, you would have to make $10 an hour to reach the official poverty line. What, exactly, is the point of earning a paycheck that isn’t a living wage, except to prove that you have a work ethic?
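The arithmetic behind these figures is easy to spell out. In the sketch below, the weekly poverty threshold is left as a parameter rather than asserted as the official figure, since the two examples in the paragraph above imply slightly different values (about $435 and $400 a week):

```python
# Worked arithmetic for the wage figures above. The poverty threshold
# varies by household size and year, so it is a parameter here, not a
# claim about the official number.

def weekly_pay(wage: float, hours: float) -> float:
    """Gross weekly earnings at a given hourly wage."""
    return wage * hours

def hours_to_clear(threshold_weekly: float, wage: float) -> float:
    """Hours per week needed to earn a given weekly amount."""
    return threshold_weekly / wage

print(weekly_pay(15.00, 29))  # $435/week at $15/hr and 29 hours
print(weekly_pay(10.00, 40))  # $400/week at $10/hr and a 40-hour week
print(weekly_pay(7.25, 40))   # $290/week at the current federal minimum

# Hours per week the federal minimum would require just to reach $400:
print(round(hours_to_clear(400, 7.25), 1))  # ~55.2 hours
```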

But, wait, isn’t our present dilemma just a passing phase of the business cycle? What about the job market of the future? Haven’t the doomsayers, those damn Malthusians, always been proved wrong by rising productivity, new fields of enterprise, new economic opportunities? Well, yeah – until now, these times. The measurable trends of the past half-century, and the plausible projections for the next half-century, are just too empirically grounded to dismiss as dismal science or ideological hokum. They look like the data on climate change – you can deny them if you like, but you’ll sound like a moron when you do.

For example, the Oxford economists who study employment trends tell us that almost half of existing jobs, including those involving ‘non-routine cognitive tasks’ – you know, like thinking – are at risk of death by computerisation within 20 years. They’re elaborating on conclusions reached by two MIT economists in the book Race Against the Machine (2011). Meanwhile, the Silicon Valley types who give TED talks have started speaking of ‘surplus humans’ as a result of the same process – cybernated production. Rise of the Robots, a new book that cites these very sources, is social science, not science fiction.

So this Great Recession of ours – don’t kid yourself, it ain’t over – is a moral crisis as well as an economic catastrophe. You might even say it’s a spiritual impasse, because it makes us ask what social scaffolding other than work will permit the construction of character – or whether character itself is something we must aspire to. But that is why it’s also an intellectual opportunity: it forces us to imagine a world in which the job no longer builds our character, determines our incomes or dominates our daily lives.

In short, it lets us say: enough already. Fuck work.

Certainly this crisis makes us ask: what comes after work? What would you do without your job as the external discipline that organises your waking life – as the social imperative that gets you up and on your way to the factory, the office, the store, the warehouse, the restaurant, wherever you work and, no matter how much you hate it, keeps you coming back? What would you do if you didn’t have to work to receive an income?

And what would society and civilisation be like if we didn’t have to ‘earn’ a living – if leisure was not our choice but our lot? Would we hang out at the local Starbucks, laptops open? Or volunteer to teach children in less-developed places, such as Mississippi? Or smoke weed and watch reality TV all day?

I’m not proposing a fancy thought experiment here. By now these are practical questions because there aren’t enough jobs. So it’s time we asked even more practical questions. How do you make a living without a job – can you receive income without working for it? Is it possible, to begin with, and then, the hard part, is it ethical? If you were raised to believe that work is the index of your value to society – as most of us were – would it feel like cheating to get something for nothing?

We already have some provisional answers because we’re all on the dole, more or less. The fastest growing component of household income since 1959 has been ‘transfer payments’ from government. By the turn of the 21st century, 20 per cent of all household income came from this source – from what is otherwise known as welfare or ‘entitlements’. Without this income supplement, half of the adults with full-time jobs would live below the poverty line, and most working Americans would be eligible for food stamps.

But are these transfer payments and ‘entitlements’ affordable, in either economic or moral terms? By continuing and enlarging them, do we subsidise sloth, or do we enrich a debate on the rudiments of the good life?

Transfer payments or ‘entitlements’, not to mention Wall Street bonuses (talk about getting something for nothing) have taught us how to detach the receipt of income from the production of goods, but now, in plain view of the end of work, the lesson needs rethinking. No matter how you calculate the federal budget, we can afford to be our brother’s keeper. The real question is not whether but how we choose to be.

I know what you’re thinking – we can’t afford this! But yeah, we can, very easily. We raise the arbitrary lid on the Social Security contribution, which now stands at $127,200, and we raise taxes on corporate income, reversing the Reagan Revolution. These two steps solve a fake fiscal problem and create an economic surplus where we now can measure a moral deficit.
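To make the ‘lid’ concrete, here is a small sketch of the payroll-tax arithmetic (the 6.2 per cent employee-side Social Security rate is current law; the $500,000 example salary is hypothetical):

```python
# Employee-side Social Security (OASDI) tax, with and without the earnings cap.
OASDI_RATE = 0.062    # 6.2 per cent employee share under current law
WAGE_CAP = 127_200    # the taxable maximum cited in the essay

def oasdi_tax(salary: float, cap: float = WAGE_CAP) -> float:
    """Tax owed on wages, with earnings above `cap` left untaxed."""
    return OASDI_RATE * min(salary, cap)

salary = 500_000  # hypothetical high earner
print(oasdi_tax(salary))              # 7886.4  -- capped: the same bill as at $127,200
print(oasdi_tax(salary, cap=salary))  # 31000.0 -- with the lid lifted
```

Everything a wage earner makes above the cap is currently untouched by this tax, which is why the essay calls the lid arbitrary.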

Of course, you will say – along with every economist from Dean Baker to Greg Mankiw, Left to Right – that raising taxes on corporate income is a disincentive to investment and thus job creation. Or that it will drive corporations overseas, where taxes are lower.

But in fact raising taxes on corporate income can’t have these effects.

Let’s work backward. Corporations have been ‘multinational’ for quite some time. In the 1970s and ’80s, before Ronald Reagan’s signature tax cuts took effect, approximately 60 per cent of manufactured goods imported into the US were produced offshore by US companies. That percentage has risen since then, but not by much.

Chinese workers aren’t the problem – the homeless, aimless idiocy of corporate accounting is. That is why the Citizens United decision of 2010, which extended free-speech protections to corporate campaign spending, is hilarious. Money isn’t speech, any more than noise is. The Supreme Court has conjured a living being, a new person, from the remains of the common law, creating a real world more frightening than its cinematic equivalent: say, Frankenstein, Blade Runner or, more recently, Transformers.

But the bottom line is this. Most jobs aren’t created by private, corporate investment, so raising taxes on corporate income won’t affect employment. You heard me right. Since the 1920s, economic growth has happened even though net private investment has atrophied. What does that mean? It means that profits are pointless except as a way of announcing to your stockholders (and hostile takeover specialists) that your company is a going concern, a thriving business. You don’t need profits to ‘reinvest’, to finance the expansion of your company’s workforce or output, as the recent history of Apple and most other corporations has amply demonstrated.

So investment decisions by CEOs have only a marginal effect on employment. Taxing the profits of corporations to finance a welfare state that permits us to love our neighbours and to be our brothers’ keeper is not an economic problem. It’s something else – it’s an intellectual issue, a moral conundrum.

When we place our faith in hard work, we’re wishing for the creation of character; but we’re also hoping, or expecting, that the labour market will allocate incomes fairly and rationally. And there’s the rub: the two go together. Character can be created on the job only when we can see that there’s an intelligible, justifiable relation between past effort, learned skills and present reward. When I see that your income is completely out of proportion to your production of real value, of durable goods the rest of us can use and appreciate (and by ‘durable’ I don’t mean just material things), I begin to doubt that character is a consequence of hard work.

When I see, for example, that you’re making millions by laundering drug-cartel money (HSBC), or pushing bad paper on mutual fund managers (AIG, Bear Stearns, Morgan Stanley, Citibank), or preying on low-income borrowers (Bank of America), or buying votes in Congress (all of the above) – just business as usual on Wall Street – while I’m barely making ends meet from the earnings of my full-time job, I realise that my participation in the labour market is irrational. I know that building my character through work is stupid because crime pays. I might as well become a gangster like you.

That’s why an economic crisis such as the Great Recession is also a moral problem, a spiritual impasse – and an intellectual opportunity. We’ve placed so many bets on the social, cultural and ethical import of work that when the labour market fails, as it so spectacularly has, we’re at a loss to explain what happened, or to orient ourselves to a different set of meanings for work and for markets.

And by ‘we’ I mean pretty much all of us, Left to Right, because everybody wants to put Americans back to work, one way or another – ‘full employment’ is the goal of Right-wing politicians no less than Left-wing economists. The differences between them are over means, not ends, and those ends include intangibles such as the acquisition of character.

Which is to say that everybody has doubled down on the benefits of work just as it reaches a vanishing point. Securing ‘full employment’ has become a bipartisan goal at the very moment it has become both impossible and unnecessary. Sort of like securing slavery in the 1850s or segregation in the 1950s.

Why?

Because work means everything to us inhabitants of modern market societies – regardless of whether it still produces solid character and allocates incomes rationally, and quite apart from the need to make a living. It’s been the medium of most of our thinking about the good life since Plato correlated craftsmanship and the possibility of ideas as such. It’s been our way of defying death, by making and repairing the durable things, the significant things we know will last beyond our allotted time on earth because they teach us, as we make or repair them, that the world beyond us – the world before and after us – has its own reality principles.

Think about the scope of this idea. Work has been a way of demonstrating differences between males and females, for example by merging the meanings of fatherhood and ‘breadwinner’, and then, more recently, prying them apart. Since the 17th century, masculinity and femininity have been defined – not necessarily achieved – by their places in a moral economy, as working men who got paid wages for their production of value on the job, or as working women who got paid nothing for their production and maintenance of families. Of course, these definitions are now changing, as the meaning of ‘family’ changes, along with profound and parallel changes in the labour market – the entry of women is just one of those – and in attitudes toward sexuality.

When work disappears, the genders produced by the labour market are blurred. When socially necessary labour declines, what we once called women’s work – education, healthcare, service – becomes our basic industry, not a ‘tertiary’ dimension of the measurable economy. The labour of love, caring for one another and learning how to be our brother’s keeper – socially beneficial labour – becomes not merely possible but eminently necessary, and not just within families, where affection is routinely available. No, I mean out there, in the wide, wide world.

Work has also been the American way of producing ‘racial capitalism’, as the historians now call it, by means of slave labour, convict labour, sharecropping, then segregated labour markets – in other words, a ‘free enterprise system’ built on the ruins of black bodies, an economic edifice animated, saturated and determined by racism. There never was a free market in labour in these united states. Like every other market, it was always hedged by lawful, systematic discrimination against black folk. You might even say that this hedged market produced the still-deployed stereotypes of African-American laziness, by excluding black workers from remunerative employment, confining them to the ghettos of the eight-hour day.

And yet, and yet. Though work has often entailed subjugation, obedience and hierarchy (see above), it’s also where many of us, probably most of us, have consistently expressed our deepest human desire, to be free of externally imposed authority or obligation, to be self-sufficient. We have defined ourselves for centuries by what we do, by what we produce.

But by now we must know that this definition of ourselves entails the principle of productivity – from each according to his abilities, to each according to his creation of real value through work – and commits us to the inane idea that we’re worth only as much as the labour market can register, as a price. By now we must also know that this principle plots a certain course to endless growth and its faithful attendant, environmental degradation.

Until now, the principle of productivity has functioned as the reality principle that made the American Dream seem plausible. ‘Work hard, play by the rules, get ahead’, or, ‘You get what you pay for, you make your own way, you rightly receive what you’ve honestly earned’ – such homilies and exhortations used to make sense of the world. At any rate they didn’t sound delusional. By now they do.

Adherence to the principle of productivity therefore threatens public health as well as the planet (actually, these are the same thing). By committing us to what is impossible, it makes for madness. The Nobel Prize-winning economist Angus Deaton said something like this when he explained anomalous mortality rates among white people in the Bible Belt by claiming that they’ve ‘lost the narrative of their lives’ – by suggesting that they’ve lost faith in the American Dream. For them, the work ethic is a death sentence because they can’t live by it.

So the impending end of work raises the most fundamental questions about what it means to be human. To begin with, what purposes could we choose if the job – economic necessity – didn’t consume most of our waking hours and creative energies? What evident yet unknown possibilities would then appear? How would human nature itself change as the ancient, aristocratic privilege of leisure becomes the birthright of human beings as such?

Sigmund Freud insisted that love and work were the essential ingredients of healthy human being. Of course he was right. But can love survive the end of work as the willing partner of the good life? Can we let people get something for nothing and still treat them as our brothers and sisters – as members of a beloved community? Can you imagine the moment when you’ve just met an attractive stranger at a party, or you’re online looking for someone, anyone, but you don’t ask: ‘So, what do you do?’

We won’t have any answers until we acknowledge that work now means everything to us – and that hereafter it can’t.
