After the Crash

Dispatches From a Long Recovery (Est. 10/2024)

Saturday Matinee: Obsolete

Source: Truthstream Media

The Future Doesn’t Need Us… Or So We’ve Been Told. With the rise of technology and the real-time pressures of an online, global economy, humans will have to be very clever – and very careful – not to be left behind by the future. From the perspective of those in charge, human labor is losing its value, and people are becoming a liability. This documentary reveals the real motivation behind the secretive effort to reduce the population and bring resource use into strict, centralized control. Could it be that the biggest threat we face isn’t just automation and robots destroying jobs, but the larger sense that humans could become obsolete altogether? *Please watch and share!* Link to film: http://amzn.to/2f69Ocr

The United States of Work

Employers exercise vast control over our lives, even when we’re not on the job. How did our bosses gain power that the government itself doesn’t hold?

By Miya Tokumitsu

Source: New Republic

Work no longer works. “You need to acquire more skills,” we tell young job seekers whose résumés at 22 are already longer than their parents’ were at 32. “Work will give you meaning,” we encourage people to tell themselves, so that they put in 60 hours or more per week on the job, removing them from other sources of meaning, such as daydreaming or social life. “Work will give you satisfaction,” we insist, even though it requires abiding by employers’ rules, and the unwritten rules of the market, for most of our waking hours. At the very least, work is supposed to be a means to earning an income. But if it’s possible to work full time and still live in poverty, what’s the point?

Even before the global financial crisis of 2008, it had become clear that if waged work is supposed to provide a measure of well-being and social structure, it has failed on its own terms. Real household wages in the United States have remained stagnant since the 1970s, even as the costs of university degrees and other credentials rise. Young people find an employment landscape defined by unpaid internships, temporary work, and low pay. The glut of degree-holding young workers has pushed many of them into the semi- or unskilled labor force, making prospects even narrower for non–degree holders. Entry-level wages for high school graduates have in fact fallen. According to a study by the Federal Reserve Bank of New York, these lost earnings will depress this generation’s wages for their entire working lives. Meanwhile, those at the very top—many of whom derive their wealth not from work, but from returns on capital—vacuum up an ever-greater share of prosperity.

Against this bleak landscape, a growing body of scholarship aims to overturn our culture’s deepest assumptions about how work confers wealth, meaning, and care throughout society. In Private Government: How Employers Rule Our Lives (and Why We Don’t Talk About It), Elizabeth Anderson, a professor of philosophy at the University of Michigan, explores how the discipline of work has itself become a form of tyranny, documenting the expansive power that firms now wield over their employees in everything from how they dress to what they tweet. James Livingston, a historian at Rutgers, goes one step further in No More Work: Why Full Employment Is a Bad Idea. Instead of insisting on jobs for all or proposing that we hold employers to higher standards, Livingston argues, we should just scrap work altogether.

Livingston’s vision is the more radical of the two; his book is a wide-ranging polemic that frequently delivers the refrain “Fuck work.” But in original ways, both books make a powerful claim: that our lives today are ruled, above all, by work. We can try to convince ourselves that we are free, but as long as we must submit to the increasing authority of our employers and the labor market, we are not. We therefore fancy that we want to work, that work grounds our character, that markets encompass the possible. We are unable to imagine what a full life could be, much less to live one. Even more radically, both books highlight the dramatic and alarming changes that work has undergone over the past century—insisting that, in often unseen ways, the changing nature of work threatens the fundamental ideals of democracy: equality and freedom.

Anderson’s most provocative argument is that large companies, the institutions that employ most workers, amount to a de facto form of government, exerting massive and intrusive power in our daily lives. Unlike the state, these private governments are able to wield power with little oversight, because the executives and boards of directors that rule them are accountable to no one but themselves. Although they exercise their power to varying degrees and through both direct and “soft” means, employers can dictate how we dress and style our hair, when we eat, when (and if) we may use the toilet, with whom we may partner and under what arrangements. Employers may subject our bodies to drug tests; monitor our speech both on and off the job; require us to answer questionnaires about our exercise habits, off-hours alcohol consumption, and childbearing intentions; and rifle through our belongings. If the state held such sweeping powers, Anderson argues, we would probably not consider ourselves free men and women.

Employees, meanwhile, have few ways to fight back. Yes, they may leave the company, but doing so usually necessitates being unemployed or migrating to another company and working under similar rules. Workers may organize, but unions have been so decimated in recent years that their clout is greatly diminished. What’s more, employers are swift to fire anyone they suspect of speaking to their colleagues about organizing, and most workers lack the time and resources to mount a legal challenge to wrongful termination.

It wasn’t supposed to be this way. As corporations have worked methodically to amass sweeping powers over their employees, they have held aloft the beguiling principle of individual freedom, claiming that only unregulated markets can guarantee personal liberty. Instead, operating under relatively few regulations themselves, these companies have succeeded at imposing all manner of regulation on their employees. That is to say, they use the language of individual liberty to claim that corporations require freedom to treat workers as they like.

Anderson sets out to discredit such arguments by tracing them back to their historical origins. The notion that personal freedom is rooted in free markets, for instance, originated with the Levellers in seventeenth-century England, when working conditions differed substantially from today’s. The Levellers believed that a market society was essential to liberate individuals from the remnants of feudal hierarchies; their vision of utopia was a world in which men could meet and interact on terms of equality and dignity. Their ideas echoed through the writing and politics of later figures like John Locke, Adam Smith, Thomas Paine, and Abraham Lincoln, all of whom believed that open markets could provide the essential infrastructure for individuals to shape their own destiny.

An anti-statist streak runs through several of these thinkers, particularly the Levellers and Paine, who viewed markets as the bulwark against state oppression. Paine and Smith, however, would hardly qualify as hard-line contemporary libertarians. Smith believed that public education was essential to a fair market society, and Paine proposed a system of social insurance that included old-age pensions as well as survivor and disability benefits. Their hope was not for a world of win-or-die competition, but one in which open markets would allow individuals to make the fullest use of their talents, free from state monopolies and meddlesome bosses.

For Anderson, the latter point is essential; the notion of lifelong employment under a boss was anathema to these earlier visions of personal freedom. Writing in the 1770s, Smith assumes that independent actors in his market society will be self-employed, and uses butchers and bakers as his exemplars; his “pin factory,” meant to illustrate division of labor, employs only ten people. These thinkers could not envision a world in which most workers spend most of their lives performing wage labor under a single employer. In an address before the Wisconsin State Agricultural Society in 1859, Lincoln stated, “The prudent, penniless beginner in the world labors for wages awhile, saves a surplus with which to buy tools or land for himself, then labors on his own account another while, and at length hires another new beginner to help him.” In other words, even well into the nineteenth century, defenders of an unregulated market society viewed wage labor as a temporary stage on the way to becoming a proprietor.

Lincoln’s scenario does not reflect the way most people work today. Yet the “small business owner” endures as an American stock character, conjured by politicians to push through deregulatory measures that benefit large corporations. In reality, thanks to a lack of guaranteed, nationalized health care and threadbare welfare benefits, setting up a small business is simply too risky a venture for many Americans, who must rely on their employers for health insurance and income. These conditions render long-term employment more palatable than a precarious existence of freelance gigs, which further gives companies license to oppress their employees.

The modern relationship between employer and employee began with the rise of large-scale companies in the nineteenth century. Although employment contracts date back to the Middle Ages, preindustrial arrangements bore little resemblance to the documents we know today. Like modern employees, journeymen and apprentices often served their employers for years, but masters performed the same or similar work in proximity to their subordinates. As a result, Anderson points out, working conditions—the speed required of workers and the hazards to which they might be exposed—were kept in check by what the masters were willing to tolerate for themselves.

The Industrial Revolution brought radical changes, as companies grew ever larger and management structures more complex. “Employers no longer did the same kind of work as employees, if they worked at all,” Anderson observes. “Mental labor was separated from manual labor, which was radically deskilled.” Companies multiplied rapidly in size. Labor contracts now bonded workers to massive organizations in which discipline, briefs, and decrees flowed downward, but whose leaders were unreachable by ordinary workers. Today, fast food workers or bank tellers would be hard-pressed to petition their CEOs at McDonald’s or Wells Fargo in person.

Despite this, we often speak of employment contracts as agreements between equals, as if we are living in Adam Smith’s eighteenth-century dream world. In a still-influential paper from 1937 titled “The Nature of the Firm,” the economist and Nobel laureate Ronald Coase established himself as an early observer and theorist of corporate concerns. He described the employment contract not as a document that handed the employer unaccountable powers, but as one that circumscribed those powers. In signing a contract, the employee “agrees to obey the directions of an entrepreneur within certain limits,” he emphasized. But such characterizations, as Anderson notes, do not reflect reality; most workers agree to employment without any negotiation or even communication about their employer’s power or its limits. The exceptions to this rule are few and notable: top professional athletes, celebrity entertainers, superstar academics, and the (increasingly small) groups of workers who are able to bargain collectively.

Yet because employment contracts create the illusion that workers and companies have arrived at a mutually satisfying agreement, the increasingly onerous restrictions placed on modern employees are often presented as “best practices” and “industry standards,” framing all sorts of behaviors and outcomes as things that ought to be intrinsically desired by workers themselves. Who, after all, would not want to work on something in the “best” way? Beyond employment contracts, companies also rely on social pressure to foster obedience: If everyone in the office regularly stays until seven o’clock every night, who would risk departing at five, even if it’s technically allowed? Such social prods exist alongside more rigid behavioral codes that dictate everything from how visible an employee’s tattoo can be to when and how long workers can break for lunch.

Many workers, in fact, have little sense of the legal scope of their employer’s power. Most would be shocked to discover that they could be fired for being too attractive, declining to attend a political rally favored by their employer, or finding out that their daughter was raped by a friend of the boss—all real-life examples cited by Anderson. Indeed, it is only after dismissal for such reasons that many workers learn of the sweeping breadth of at-will employment, the contractual norm that allows American employers to fire workers without warning and without cause, except for reasons explicitly deemed illegal.

In reality, the employment landscape is even more dire than Anderson outlines. The rise of staffing or “temp” agencies, for example, undercuts the very idea of a direct relationship between worker and employer. In The Temp Economy: From Kelly Girls to Permatemps in Postwar America, sociologist Erin Hatton notes that millions of workers now labor under subcontracting arrangements, which give employers even greater latitude to abuse employees. For years, Walmart—America’s largest retailer—used a subcontracting firm to hire hundreds of cleaners, many from Eastern Europe, who worked for months on end without overtime pay or a single day off. After federal agents raided dozens of Walmarts and arrested the cleaners as illegal immigrants, company executives used the subcontracting agreement to shirk responsibility for their exploitation of the cleaners, claiming they had no knowledge of their immigration status or conditions.

By any reasonable standard, much “temp” work is not even temporary. Employees sometimes work for years in a single workplace, even through promotions, without ever being granted official status as an employee. Similarly, “gig economy” platforms like Uber designate their workers as contractors rather than employees, a distinction that exempts the company from paying them minimum wage and overtime. Many “permatemps” and contractors perform the same work as employees, yet lack even the paltry protections and benefits awarded to full-time workers.

A weak job market, paired with the increasing precarity of work, means that more and more workers are forced to make their living by stringing together freelance assignments or winning fixed-term contracts, subjecting those workers to even more rules and restrictions. On top of their actual jobs, contractors and temp workers must do the additional work of appearing affable and employable not just on the job, but during their ongoing efforts to secure their next gig. Constantly pitching, writing up applications, and maintaining a personal brand on social media require a level of self-censorship, lest a controversial tweet or compromising Facebook photo sink their job prospects. Forced to anticipate the wishes not of a specific employer, but of all potential future employers, many opt out of participating in social media or practicing politics in any visible capacity. Their public personas are shaped not by their own beliefs and desires, but by the demands of the labor market.


For Livingston, it’s not just employers but work itself that is the problem. We toil because we must, but also because our culture has trained us to see work as the greatest enactment of our dignity and personal character. Livingston challenges us to turn away from such outmoded ideas, rooted in Protestant ideals. Like Anderson, he sweeps through centuries of labor theory with impressive efficiency, from Marx and Hegel to Freud and Lincoln, whose 1859 speech he also quotes. Livingston centers on these thinkers because they all found the connection between work and virtue troubling. Hegel believed that work causes individuals to defer their desires, nurturing a “slave morality.” Marx proposed that “real freedom came after work.” And Freud understood the Protestant work ethic as “the symptom of repression, perhaps even regression.”

Nor is it practical, Livingston argues, to exalt work: There are simply not enough jobs to keep most adults employed at a living wage, given the rise of automation and increases in productivity. Besides, the relation between income and work is arbitrary. Cooking dinner for your family is unpaid work, while cooking dinner for strangers usually comes with a paycheck. There’s nothing inherently different in the labor involved—only in the compensation. Anderson argues that work impedes individual freedom; Livingston points out that it rarely pays enough. As technological advances continue to weaken the demand for human labor, wages will inevitably be driven down even further. Instead of idealizing work and making it the linchpin of social organization, Livingston suggests, why not just get rid of it?

Livingston belongs to a cadre of thinkers, including Kathi Weeks, Nick Srnicek, and Alex Williams, who believe that we should strive for a “postwork” society in one form or another. Strands of this idea go back at least as far as Keynes’s 1930 essay on “Economic Possibilities for our Grandchildren.” Not only would work be eliminated or vastly reduced by technology, Keynes predicted, but we would also be unburdened spiritually. Devotion to work was, he deemed, one of many “pseudo-moral principles” that “exalted some of the most distasteful of human qualities into the position of the highest virtues.”

Since people in this new world would no longer have to earn a salary, they would, Livingston envisions, receive some kind of universal basic income. UBI is a slippery concept, adaptable to both the socialist left and libertarian right, but it essentially entails distributing a living wage to every member of society. In most conceptualizations, the income is indeed basic—no cases of Dom Pérignon—and would cover the essentials like rent and groceries. Individuals would then be free to choose whether and how much they want to work to supplement the UBI. Leftist proponents tend to advocate pairing UBI with a strong welfare state to provide nationalized health care, tuition-free education, and other services. Some libertarians view UBI as a way to pare down the welfare state, arguing that it’s better simply to give people money to buy food and health care directly, rather than forcing them to engage with food stamp and Medicaid bureaucracies.

According to Livingston, we are finally on the verge of this postwork society because of automation. Robots are now advanced enough to take over complex jobs in areas like agriculture and mining, eliminating the need for humans to perform dangerous or tedious tasks. In practice, however, automation is a double-edged sword, with the capacity to oppress as well as unburden. Machines often accelerate the rate at which humans can work, taxing rather than liberating them. Conveyor belts eliminated the need for workers to pass unfinished products along to their colleagues—but as Charlie Chaplin and Lucille Ball so hilariously demonstrated, the belts also increased the pace at which those same workers needed to turn wrenches and wrap chocolates. In retail and customer service, a main function of automation has been not to eliminate work, but to eliminate waged work, transferring much of the labor onto consumers, who must now weigh and code their own vegetables at the supermarket, check out their own library books, and tag their own luggage at the airport.

At the same time, it may be harder to automate some jobs that require a human touch, such as floristry or hairstyling. The same goes for the delicate work of caring for the young, sick, elderly, or otherwise vulnerable. In today’s economy, the demand for such labor is rising rapidly: “Nine of the twelve fastest-growing fields,” The New York Times reported earlier this year, “are different ways of saying ‘nurse.’” These jobs also happen to be low-paying, emotionally and physically grueling, dirty, hazardous, and shouldered largely by women and immigrants. Regardless of whether employment is virtuous or not, our immediate goal should perhaps be to distribute the burdens of caregiving, since such work is essential to the functioning of society and benefits us all.


A truly work-free world would entail a revolution in our present forms of social organization. We could no longer conceive of welfare as a last resort—as the “safety net” metaphor implies—but would be forced to treat it as an unremarkable and universal fact of life. This alone would require us to support a massive redistribution of wealth, and to reclaim our political institutions from the big-money interests that are allergic to such changes. Tall orders indeed—but as Srnicek and Williams remind us in their book, Inventing the Future: Postcapitalism and a World Without Work, neoliberals pulled off just such a revolution in the postwar years. Thanks to their efforts, free-market liberalism replaced Keynesianism as the political and economic common sense all around the world.

Another possible solution to the current miseries of unemployment and worker exploitation is the one Livingston rejects in his title: full employment. For anti-work partisans, full employment takes us in the wrong direction, and UBI corrects the course. But the two are not mutually exclusive. In fact, rather than creating new jobs, full employment could require us to reduce our work hours drastically and spread them throughout the workforce—a scheme that could radically de-center waged work in our lives. A dual strategy of pursuing full employment while also demanding universal benefits—including health care, childcare, and affordable housing—would maximize workers’ bargaining power to ensure that they, and not just owners of capital, actually get to enjoy the bounty of labor-saving technology.

Nevertheless, Livingston’s critiques of full employment are worth heeding. As with automation, it can all go wrong if we use the banner of full employment to create pointless roles—what David Graeber has termed “bullshit jobs,” in which workers sit in some soul-sucking basement office for eight hours a day—or harmful jobs, like building nuclear weapons. If we do not have a deliberate politics rooted in universal social justice, then full employment, a basic income, and automation will not liberate us from the degradations of work.

Both Livingston and Anderson reveal how much of our own power we’ve already ceded in making waged work the conduit for our ideals of liberty and morality. The scale and coordination of the institutions we’re up against in the fight for our emancipation is, as Anderson demonstrates, staggering. Employers hold the means to our well-being, and they have the law on their side. Individual efforts to achieve a better “work-life balance” for ourselves and our families miss the wider issue we face as waged employees. Livingston demonstrates the scale at which we should be thinking: Our demands should be revolutionary, our imaginations wide. Standing amid the wreckage of last year’s presidential election, what other choice do we have?

Miya Tokumitsu is a lecturer in art history at the University of Melbourne and a contributing editor at Jacobin. She is the author of Do What You Love: And Other Lies About Success and Happiness.

Bureaucratic Insanity is Yours to Enjoy

Reposted below is Dmitry Orlov’s review of the new publication from Club Orlov Press: “Bureaucratic Insanity: The American Bureaucrat’s Descent into Madness” by Sean J. Kerrigan.

By Dmitry Orlov

Source: ClubOrlov

In the contemporary United States, a child can be charged with battery for throwing a piece of candy at a schoolfriend. Students can be placed in solitary confinement for cutting class. Adults aren’t much better off: in 2011 the Supreme Court decided that anyone the police arrest, even for an offense as minor as an unpaid traffic ticket, can be strip-searched. These acts of official violence are just the tip of the iceberg in our society.

The number of rules and laws to which Americans are subject, mostly unbeknownst to them, is hilariously excessive. But what makes this comedy unbearable is that these rules and laws are often enforced with an overabundance of self-righteous venom. Increasingly, contemporary American bureaucrats—be they police, teachers or government officials—are obsessed with following strict rules and mercilessly punishing all those who fail to comply (unless they are very rich or politically connected).

In so doing, these bureaucrats have become so liberated from the constraints of common sense that the situation has gone far beyond parody and is now a full-blown farce. Consider this recent news story involving a Virginia sixth-grader, the son of two schoolteachers and a member of the school’s program for gifted students. The boy was targeted by school officials after they found a leaf, probably a maple leaf, in his backpack. Someone suspected it to be marijuana. The leaf in question was not marijuana (as confirmed by repeated lab tests). End of story, wouldn’t you think?

Not at all! The 11-year-old was expelled and charged with marijuana possession in juvenile court. These charges were eventually dropped. He was then forced to enroll in an alternative school away from his friends, where he is subjected to twice-daily searches for drugs and periodic evaluation for substance abuse problems—all of this for possession of a maple leaf.

“It doesn’t matter if your son or daughter brings a real pot leaf to school, or if he brings something that looks like a pot leaf—okra, tomato, maple, buckeye, etc. If your kid calls it marijuana as a joke, or if another kid thinks it might be marijuana, that’s grounds for expulsion,” the Washington Post cheerfully reassures us.

A reasonable school official would recognize the difference between a technical violation caused by an oversight and a conscious attempt to smuggle drugs into the school. But school officials were intent on ignoring their own better sense, instead favoring harsh punishments.

In his new book, Bureaucratic Insanity: The American Bureaucrat’s Descent into Madness, Sean Kerrigan documents dozens of eyebrow-raising examples in which America’s rule-enforcers perversely revel in handing out absurd and unfair punishments for minor infractions. They demand total and complete submission, driven by a perverse compulsion to “put us in our place” and to “teach us a lesson.” They mercilessly punish even the most inconsequential transgressions in order to maximize our terror and humiliation.

When Sean first began following this story several years ago, he became mesmerized by this bizarre carnival of unreason. “Where is all this pent-up rage coming from?” he wondered, “and why is it being directed toward the weakest and most vulnerable members of society?” And then news stories like those mentioned above grew more and more common. Eventually, he started compiling a list of the most egregious abuses, trying to detect patterns, searching for some explanation for why your average garden-variety bureaucrat has morphed into a monster and has started to take sadistic pleasure in the suffering of innocent people.

Some people might argue that this kind of behavior is the result of political correctness run amok. Others point to the irrational fear of terrorism and mass shootings. Yet others might think that it has to do with the bureaucrats’ fear of losing their jobs—merely for failing to comply with the exact letter of some rule. While there may be some truth to each of these explanations, they are far from adequate. Many of these bureaucratic abuses have nothing to do with political constraints on free speech, or with guns or terrorism, and in most cases the bureaucrats have the power to minimize harm, but instead they choose to maximize it.

In looking closer at each individual instance, Sean found that most of the offending bureaucrats weren’t even attempting to use their judgment but were mindlessly following written rules. Even the most nurturing and humanistic professions—teaching and medicine—have been robotized to such an extent that their practitioners now perform a very narrow range of actions. Thanks to all the progress in IT, their work is now quite detached from physical reality. Much of it now consists of monotonously, mindlessly pounding at the computer keyboard. Consequently, a large portion of their waking lives has taken on an ethereal, pointless quality. Even teachers, who once had a relatively free rein in forming the minds of the next generation, are now forced to behave like machines, teaching to standardized tests and working a grueling average of 53 hours a week.

The psychological effects of this pressure have been profound. Deprived of the opportunity to make their own decisions and to see the positive effects of their efforts, these workers find that their work has become personally meaningless, alienating, depersonalizing and psychologically damaging. As a result of this damage, American bureaucrats, although they may look like mild-mannered professionals, have become prone to sudden bouts of aggressive, sadistic behavior. They are unable to act out their repressed rage in any socially acceptable way other than by doling out punishments, fines, rejections, expulsions and other forms of objective, systemic violence.

The road to hell is paved with good intentions, and rest assured that this is all being done for our own good. The purpose of all of these rules and laws, from the perspective of the American system of governance, is to maximize control over everything that can be controlled and to micromanage every possible detail of our lives—in order to make them better! From student testing all the way to global trade, those in leadership positions are trying to centralize as much authority as possible in order to maximize efficiency, profit, American power… while minimizing our dignity, well-being and happiness. Oops!

In his book Bureaucratic Insanity, Sean traces the development of this trend from the early years of the industrial revolution to the modern day, from its initial appearance in factory life and in the military, to it later metastasizing to the office, and now taking over America’s schools. He argues persuasively, based on a careful and thorough review of literature in history, philosophy, psychology, anthropology and social criticism, that the average American bureaucrat is literally, clinically insane. The average American bureaucrat has a warped perception of reality and an intense, repressed self-hatred. Their only way to vent their rage is by punishing others using bureaucratic methods. They demand absolute conformity because it is their only way to give their meaningless lives some semblance of meaning. They suppress all thoughts that might lead them to discover the true nature of their condition, because that would cause them to spiral down into outright schizophrenia.

The book concludes with an assessment of what we can do to insulate ourselves from this seemingly unstoppable trend, and of how we can reinvigorate our lives by giving them real meaning.

 

Fuck Work

Economists believe in full employment. Americans think that work builds character. But what if jobs aren’t working anymore?

By James Livingston

Source: aeon

Work means everything to us Americans. For centuries – since, say, 1650 – we’ve believed that it builds character (punctuality, initiative, honesty, self-discipline, and so forth). We’ve also believed that the market in labour, where we go to find work, has been relatively efficient in allocating opportunities and incomes. And we’ve believed that, even if it sucks, a job gives meaning, purpose and structure to our everyday lives – at any rate, we’re pretty sure that it gets us out of bed, pays the bills, makes us feel responsible, and keeps us away from daytime TV.

These beliefs are no longer plausible. In fact, they’ve become ridiculous, because there’s not enough work to go around, and what there is of it won’t pay the bills – unless of course you’ve landed a job as a drug dealer or a Wall Street banker, becoming a gangster either way.

These days, everybody from Left to Right – from the economist Dean Baker to the social scientist Arthur C Brooks, from Bernie Sanders to Donald Trump – addresses this breakdown of the labour market by advocating ‘full employment’, as if having a job is self-evidently a good thing, no matter how dangerous, demanding or demeaning it is. But ‘full employment’ is not the way to restore our faith in hard work, or in playing by the rules, or in whatever else sounds good. The official unemployment rate in the United States is already below 6 per cent, which is pretty close to what economists used to call ‘full employment’, but income inequality hasn’t changed a bit. Shitty jobs for everyone won’t solve any social problems we now face.

Don’t take my word for it, look at the numbers. Already a fourth of the adults actually employed in the US are paid wages lower than would lift them above the official poverty line – and so a fifth of American children live in poverty. Almost half of employed adults in this country are eligible for food stamps (most of those who are eligible don’t apply). The market in labour has broken down, along with most others.

Those jobs that disappeared in the Great Recession just aren’t coming back, regardless of what the unemployment rate tells you – the net gain in jobs since 2000 still stands at zero – and if they do return from the dead, they’ll be zombies, those contingent, part-time or minimum-wage jobs where the bosses shuffle your shift from week to week: welcome to Wal-Mart, where food stamps are a benefit.

And don’t tell me that raising the minimum wage to $15 an hour solves the problem. No one can doubt the moral significance of the movement. But at this rate of pay, you pass the official poverty line only after working 29 hours a week. The current federal minimum wage is $7.25. Working a 40-hour week, you would have to make $10 an hour to reach the official poverty line. What, exactly, is the point of earning a paycheck that isn’t a living wage, except to prove that you have a work ethic?
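
Since the argument leans on this arithmetic, here is a minimal sketch (in Python) that annualizes each of the essay's wage figures. The figures are Livingston's own; note that his two poverty-line claims imply slightly different annual thresholds, presumably for different household sizes.

```python
# Annualize the essay's wage figures (gross pay, ignoring taxes and transfers).
WEEKS_PER_YEAR = 52

def annual_income(hourly_wage, hours_per_week):
    """Gross annual pay for a given hourly wage and weekly schedule."""
    return hourly_wage * hours_per_week * WEEKS_PER_YEAR

print(annual_income(7.25, 40))   # 15080.0 -- full time at the federal minimum
print(annual_income(10.00, 40))  # 20800.0 -- the wage said to reach the poverty line
print(annual_income(15.00, 29))  # 22620.0 -- the threshold implied by the $15 claim
```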

But, wait, isn’t our present dilemma just a passing phase of the business cycle? What about the job market of the future? Haven’t the doomsayers, those damn Malthusians, always been proved wrong by rising productivity, new fields of enterprise, new economic opportunities? Well, yeah – until now, these times. The measurable trends of the past half-century, and the plausible projections for the next half-century, are just too empirically grounded to dismiss as dismal science or ideological hokum. They look like the data on climate change – you can deny them if you like, but you’ll sound like a moron when you do.

For example, the Oxford economists who study employment trends tell us that almost half of existing jobs, including those involving ‘non-routine cognitive tasks’ – you know, like thinking – are at risk of death by computerisation within 20 years. They’re elaborating on conclusions reached by two MIT economists in the book Race Against the Machine (2011). Meanwhile, the Silicon Valley types who give TED talks have started speaking of ‘surplus humans’ as a result of the same process – cybernated production. Rise of the Robots, a new book that cites these very sources, is social science, not science fiction.

So this Great Recession of ours – don’t kid yourself, it ain’t over – is a moral crisis as well as an economic catastrophe. You might even say it’s a spiritual impasse, because it makes us ask what social scaffolding other than work will permit the construction of character – or whether character itself is something we must aspire to. But that is why it’s also an intellectual opportunity: it forces us to imagine a world in which the job no longer builds our character, determines our incomes or dominates our daily lives.

In short, it lets us say: enough already. Fuck work.

Certainly this crisis makes us ask: what comes after work? What would you do without your job as the external discipline that organises your waking life – as the social imperative that gets you up and on your way to the factory, the office, the store, the warehouse, the restaurant, wherever you work and, no matter how much you hate it, keeps you coming back? What would you do if you didn’t have to work to receive an income?

And what would society and civilisation be like if we didn’t have to ‘earn’ a living – if leisure was not our choice but our lot? Would we hang out at the local Starbucks, laptops open? Or volunteer to teach children in less-developed places, such as Mississippi? Or smoke weed and watch reality TV all day?

I’m not proposing a fancy thought experiment here. By now these are practical questions because there aren’t enough jobs. So it’s time we asked even more practical questions. How do you make a living without a job – can you receive income without working for it? Is it possible, to begin with, and then, the hard part, is it ethical? If you were raised to believe that work is the index of your value to society – as most of us were – would it feel like cheating to get something for nothing?

We already have some provisional answers because we’re all on the dole, more or less. The fastest growing component of household income since 1959 has been ‘transfer payments’ from government. By the turn of the 21st century, 20 per cent of all household income came from this source – from what is otherwise known as welfare or ‘entitlements’. Without this income supplement, half of the adults with full-time jobs would live below the poverty line, and most working Americans would be eligible for food stamps.

But are these transfer payments and ‘entitlements’ affordable, in either economic or moral terms? By continuing and enlarging them, do we subsidise sloth, or do we enrich a debate on the rudiments of the good life?

Transfer payments or ‘entitlements’, not to mention Wall Street bonuses (talk about getting something for nothing) have taught us how to detach the receipt of income from the production of goods, but now, in plain view of the end of work, the lesson needs rethinking. No matter how you calculate the federal budget, we can afford to be our brother’s keeper. The real question is not whether but how we choose to be.

I know what you’re thinking – we can’t afford this! But yeah, we can, very easily. We raise the arbitrary lid on the Social Security contribution, which now stands at $127,200, and we raise taxes on corporate income, reversing the Reagan Revolution. These two steps solve a fake fiscal problem and create an economic surplus where we now can measure a moral deficit.
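
As a minimal sketch of what that “arbitrary lid” does, assume the 6.2 percent employee share of the Social Security (OASDI) payroll tax that applied alongside the $127,200 cap. Because nothing above the cap is taxed, the effective rate falls steadily as earnings rise past it; raising or removing the cap simply extends the flat rate up the income scale.

```python
OASDI_RATE = 0.062   # employee share of the Social Security payroll tax
WAGE_CAP = 127_200   # taxable-earnings ceiling cited in the essay

def oasdi_tax(earnings):
    """A flat rate applied only to earnings up to the cap."""
    return OASDI_RATE * min(earnings, WAGE_CAP)

for earnings in (50_000, 127_200, 500_000, 2_000_000):
    tax = oasdi_tax(earnings)
    print(f"${earnings:>9,}: tax ${tax:>8,.0f}, effective rate {tax / earnings:.2%}")
```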

Of course, you will say – along with every economist from Dean Baker to Greg Mankiw, Left to Right – that raising taxes on corporate income is a disincentive to investment and thus job creation. Or that it will drive corporations overseas, where taxes are lower.

But in fact raising taxes on corporate income can’t have these effects.

Let’s work backward. Corporations have been ‘multinational’ for quite some time. In the 1970s and ’80s, before Ronald Reagan’s signature tax cuts took effect, approximately 60 per cent of manufactured imported goods were produced offshore, overseas, by US companies. That percentage has risen since then, but not by much.

Chinese workers aren’t the problem – the homeless, aimless idiocy of corporate accounting is. That is why the Citizens United decision of 2010 applying freedom of speech regulations to campaign spending is hilarious. Money isn’t speech, any more than noise is. The Supreme Court has conjured a living being, a new person, from the remains of the common law, creating a real world more frightening than its cinematic equivalent: say, Frankenstein, Blade Runner or, more recently, Transformers.

But the bottom line is this. Most jobs aren’t created by private, corporate investment, so raising taxes on corporate income won’t affect employment. You heard me right. Since the 1920s, economic growth has happened even though net private investment has atrophied. What does that mean? It means that profits are pointless except as a way of announcing to your stockholders (and hostile takeover specialists) that your company is a going concern, a thriving business. You don’t need profits to ‘reinvest’, to finance the expansion of your company’s workforce or output, as the recent history of Apple and most other corporations has amply demonstrated.

So investment decisions by CEOs have only a marginal effect on employment. Taxing the profits of corporations to finance a welfare state that permits us to love our neighbours and to be our brothers’ keeper is not an economic problem. It’s something else – it’s an intellectual issue, a moral conundrum.

When we place our faith in hard work, we’re wishing for the creation of character; but we’re also hoping, or expecting, that the labour market will allocate incomes fairly and rationally. And there’s the rub: they do go together. Character can be created on the job only when we can see that there’s an intelligible, justifiable relation between past effort, learned skills and present reward. When I see that your income is completely out of proportion to your production of real value, of durable goods the rest of us can use and appreciate (and by ‘durable’ I don’t mean just material things), I begin to doubt that character is a consequence of hard work.

When I see, for example, that you’re making millions by laundering drug-cartel money (HSBC), or pushing bad paper on mutual fund managers (AIG, Bear Stearns, Morgan Stanley, Citibank), or preying on low-income borrowers (Bank of America), or buying votes in Congress (all of the above) – just business as usual on Wall Street – while I’m barely making ends meet from the earnings of my full-time job, I realise that my participation in the labour market is irrational. I know that building my character through work is stupid because crime pays. I might as well become a gangster like you.

That’s why an economic crisis such as the Great Recession is also a moral problem, a spiritual impasse – and an intellectual opportunity. We’ve placed so many bets on the social, cultural and ethical import of work that when the labour market fails, as it so spectacularly has, we’re at a loss to explain what happened, or to orient ourselves to a different set of meanings for work and for markets.

And by ‘we’ I mean pretty much all of us, Left to Right, because everybody wants to put Americans back to work, one way or another – ‘full employment’ is the goal of Right-wing politicians no less than Left-wing economists. The differences between them are over means, not ends, and those ends include intangibles such as the acquisition of character.

Which is to say that everybody has doubled down on the benefits of work just as it reaches a vanishing point. Securing ‘full employment’ has become a bipartisan goal at the very moment it has become both impossible and unnecessary. Sort of like securing slavery in the 1850s or segregation in the 1950s.

Why?

Because work means everything to us inhabitants of modern market societies – regardless of whether it still produces solid character and allocates incomes rationally, and quite apart from the need to make a living. It’s been the medium of most of our thinking about the good life since Plato correlated craftsmanship and the possibility of ideas as such. It’s been our way of defying death, by making and repairing the durable things, the significant things we know will last beyond our allotted time on earth because they teach us, as we make or repair them, that the world beyond us – the world before and after us – has its own reality principles.

Think about the scope of this idea. Work has been a way of demonstrating differences between males and females, for example by merging the meanings of fatherhood and ‘breadwinner’, and then, more recently, prying them apart. Since the 17th century, masculinity and femininity have been defined – not necessarily achieved – by their places in a moral economy, as working men who got paid wages for their production of value on the job, or as working women who got paid nothing for their production and maintenance of families. Of course, these definitions are now changing, as the meaning of ‘family’ changes, along with profound and parallel changes in the labour market – the entry of women is just one of those – and in attitudes toward sexuality.

When work disappears, the genders produced by the labour market are blurred. When socially necessary labour declines, what we once called women’s work – education, healthcare, service – becomes our basic industry, not a ‘tertiary’ dimension of the measurable economy. The labour of love, caring for one another and learning how to be our brother’s keeper – socially beneficial labour – becomes not merely possible but eminently necessary, and not just within families, where affection is routinely available. No, I mean out there, in the wide, wide world.

Work has also been the American way of producing ‘racial capitalism’, as the historians now call it, by means of slave labour, convict labour, sharecropping, then segregated labour markets – in other words, a ‘free enterprise system’ built on the ruins of black bodies, an economic edifice animated, saturated and determined by racism. There never was a free market in labour in these united states. Like every other market, it was always hedged by lawful, systematic discrimination against black folk. You might even say that this hedged market produced the still-deployed stereotypes of African-American laziness, by excluding black workers from remunerative employment, confining them to the ghettos of the eight-hour day.

And yet, and yet. Though work has often entailed subjugation, obedience and hierarchy (see above), it’s also where many of us, probably most of us, have consistently expressed our deepest human desire, to be free of externally imposed authority or obligation, to be self-sufficient. We have defined ourselves for centuries by what we do, by what we produce.

But by now we must know that this definition of ourselves entails the principle of productivity – from each according to his abilities, to each according to his creation of real value through work – and commits us to the inane idea that we’re worth only as much as the labour market can register, as a price. By now we must also know that this principle plots a certain course to endless growth and its faithful attendant, environmental degradation.

Until now, the principle of productivity has functioned as the reality principle that made the American Dream seem plausible. ‘Work hard, play by the rules, get ahead’, or, ‘You get what you pay for, you make your own way, you rightly receive what you’ve honestly earned’ – such homilies and exhortations used to make sense of the world. At any rate they didn’t sound delusional. By now they do.

Adherence to the principle of productivity therefore threatens public health as well as the planet (actually, these are the same thing). By committing us to what is impossible, it makes for madness. The Nobel Prize-winning economist Angus Deaton said something like this when he explained anomalous mortality rates among white people in the Bible Belt by claiming that they’ve ‘lost the narrative of their lives’ – by suggesting that they’ve lost faith in the American Dream. For them, the work ethic is a death sentence because they can’t live by it.

So the impending end of work raises the most fundamental questions about what it means to be human. To begin with, what purposes could we choose if the job – economic necessity – didn’t consume most of our waking hours and creative energies? What evident yet unknown possibilities would then appear? How would human nature itself change as the ancient, aristocratic privilege of leisure becomes the birthright of human beings as such?

Sigmund Freud insisted that love and work were the essential ingredients of healthy human being. Of course he was right. But can love survive the end of work as the willing partner of the good life? Can we let people get something for nothing and still treat them as our brothers and sisters – as members of a beloved community? Can you imagine the moment when you’ve just met an attractive stranger at a party, or you’re online looking for someone, anyone, but you don’t ask: ‘So, what do you do?’

We won’t have any answers until we acknowledge that work now means everything to us – and that hereafter it can’t.

Will Robots Take Your Job?

By Nick Srnicek and Alex Williams

Source: ROAR

In recent months, a range of studies has warned of an imminent job apocalypse. The most famous of these—a study from Oxford—suggests that up to 47 percent of US jobs are at high risk of automation over the next two decades. Its methodology—assessing likely developments in technology and matching them up to the tasks typically deployed in jobs—has since been replicated for a number of other countries. One study finds that 54 percent of EU jobs are likely automatable, while the chief economist of the Bank of England has argued that 45 percent of UK jobs are similarly under threat.

This is not simply a rich-country problem, either: low-income economies look set to be hit even harder by automation. As low-skill, low-wage and routine jobs have been outsourced from rich capitalist countries to poorer economies, these jobs are also highly susceptible to automation. Research by Citi suggests that for India 69 percent of jobs are at risk, for China 77 percent, and for Ethiopia a full 85 percent of current jobs. It would seem that we are on the verge of a mass job extinction.
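
All of these headline percentages come from variations on the same task-matching exercise. The toy sketch below illustrates only the underlying idea, scoring occupations by how much they depend on “bottleneck” tasks that machines still handle poorly; it is not the Oxford study's actual model (a probabilistic classifier trained on expert-labelled occupations), and the occupations and ratings here are invented for illustration.

```python
# Toy illustration of task-based automation scoring (not the Oxford model).
# Each occupation is rated 0-1 on "bottleneck" tasks machines handle poorly;
# occupations rated low across the board come out as high risk.
occupations = {
    # (perception/manipulation, creativity, social intelligence) -- invented
    "telemarketer":   (0.2, 0.1, 0.3),
    "truck driver":   (0.5, 0.1, 0.2),
    "home care aide": (0.7, 0.4, 0.9),
    "therapist":      (0.5, 0.7, 1.0),
}

def automation_risk(ratings):
    """Higher bottleneck ratings imply lower risk of computerisation."""
    return 1 - sum(ratings) / len(ratings)

for job, ratings in sorted(occupations.items(),
                           key=lambda item: automation_risk(item[1]),
                           reverse=True):
    print(f"{job:<14} risk ~ {automation_risk(ratings):.2f}")
```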

Nothing New?

For many economists, however, there is nothing to worry about. If we look at the history of technology and the labor market, past experience suggests that automation has not caused mass unemployment. Automation has always changed the labor market. Indeed, one of the primary characteristics of the capitalist mode of production has been to revolutionize the means of production—to really subsume the labor process and reorganize it in ways that more efficiently generate value. The mechanization of agriculture is an early example, as is the use of the cotton gin and spinning jenny. With Fordism, the assembly line turned complex manufacturing jobs into a series of simple and efficient tasks. And with the era of lean production, the computerized management of long commodity chains has turned the production process into a more and more heavily automated system.

In every case, we have not seen mass unemployment. Instead we have seen some jobs disappear while others have been created, enough not only to replace the lost jobs but also to provide the new jobs a growing population requires. The only times we see massive unemployment tend to be the result of cyclical factors, as in the Great Depression, rather than some secular trend towards higher unemployment resulting from automation. On the basis of these considerations, most economists believe that the future of work will likely be the same as the past: some jobs will disappear, but others will be created to replace them.

In typical economist fashion, however, these thoughts neglect the broader social context of earlier historical periods. Capitalism may not have seen a massive upsurge in unemployment, but this is not a necessary outcome. Rather, it was dependent upon unique circumstances of earlier moments—circumstances that are missing today. In the earliest periods of automation, there was a major effort by the labor movement to reduce the working week. It was a successful project that reduced the week from around 60 hours at the turn of the century down to 40 hours during the 1930s, and very nearly down to 30. In this context, it was no surprise that Keynes would famously extrapolate to a future where we all worked 15 hours a week. He was simply looking at the existing labor movement. With less work per person, the available work was spread around more evenly. The impact of technology at that time was therefore heavily muted by a 33 percent reduction in the amount of work per person.
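
The work-sharing arithmetic is worth making explicit. A minimal sketch, assuming a fixed pool of weekly labor hours (the total below is arbitrary): cutting the standard week from 60 to 40 hours is exactly the 33 percent per-person reduction cited above, and it spreads the same work across half again as many workers.

```python
TOTAL_WEEKLY_HOURS = 600_000  # arbitrary fixed pool of work, for illustration

def workers_needed(hours_per_worker):
    """Workers required to cover the fixed pool at a given workweek."""
    return TOTAL_WEEKLY_HOURS / hours_per_worker

for week in (60, 40, 30):
    print(f"{week}-hour week: {workers_needed(week):>6,.0f} workers, "
          f"{1 - week / 60:.0%} less work per person than at 60 hours")
```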

Today, by contrast, we have no such movement pushing for a reduced working week, and the effects of automation are likely to be much more serious. Similar issues hold for the postwar era. With most Western economies left in ruins, and massive American support for the revitalization of these economies, the postwar era saw incredibly high levels of economic growth. With the further addition of full employment policies, this period also saw incredibly high levels of job growth and a compact between trade unions and capital to maintain a sufficient number of good jobs. This led to healthy wage growth and, subsequently, healthy growth in aggregate demand to stimulate the economy and keep jobs coming. Moreover, this was a period when nearly 50 percent of the potential labor force was confined to the household.

Under these unique circumstances, it is no wonder that capitalism was able to create enough jobs even as automation continued to transform the labor process. Today, we have sluggish economic growth, no commitments to full employment (even as we have commitments to harsh welfare policies), stagnant wage growth, and a major influx of women into the labor force. The context for a wave of automation is drastically different from what it was before.

Likewise, the types of technology that are being developed and potentially introduced into the labor process are significantly different from earlier technologies. Whereas earlier waves of automation affected what economists call “routine work” (work that can be laid out in a series of explicit steps), today’s technology is beginning to affect non-routine work. The difference is between a factory job on an assembly line and driving a car in the chaotic atmosphere of the modern urban environment. Research from economists like David Autor and Maarten Goos shows that the decline of routine jobs in the past 40 years has played a significant role in increased job polarization and rising inequality. While these jobs are gone, and highly unlikely to come back, the next wave of automation will affect the remaining sphere of human labor. An entire range of low-wage jobs are now potentially automatable, involving both physical and mental labor.

Given that it is quite likely that new technologies will have a larger impact on the labor market than earlier waves of technological change, what is likely to happen? Will robots take your job? While one side of the debate warns of imminent apocalypse and the other yawns from the historical repetition, both tend to neglect the political economy of automation—particularly the role of labor. Put simply, if the labor movement is strong, we are likely to see more automation; if the labor movement is weak, we are likely to see less automation.

Workers Fight Back

In the first scenario, a strong labor movement is able to push for higher and higher wages (particularly relative to globally stagnant productivity growth). But the rising cost of labor means that machines become relatively cheap in comparison. We can already see this in China, where real wages have been surging for more than 10 years, making Chinese labor less and less cheap. The result is that China has become the world’s biggest investor in industrial robots, and numerous companies—most famously Foxconn—have stated their intentions to move towards increasingly automated factories.

This is the archetype of a highly automated world, but in order to be achievable under capitalism it requires that the power of labor be strong, given that the relative costs of labor and machines are key determinants for investment. What then happens under these circumstances? Do we get mass unemployment as robots take all the jobs? The simple answer is no. Rather than mass decimation of jobs, most workers who have their jobs automated end up moving into new sectors.

In the advanced capitalist economies this has been happening over the past 40 years, as workers move from routine jobs to non-routine jobs. As we saw earlier, the next wave of automation is different, and therefore its effects on the labor market are also different. Some job sectors are likely to take heavy hits under this scenario. Jobs in retail and transport, for instance, will likely be heavily affected. In the UK, there are currently 3 million retail workers, but estimates by the British Retail Consortium suggest this may decrease by a million over the next decade. In the US, there are 3.4 million cashiers alone—nearly all of whose work could be automated. The transport sector is similarly large, with 3.7 million truck drivers in the US, most of whose jobs could be incrementally automated as self-driving trucks become viable on public roads. Large numbers of workers in such sectors are likely to be pushed out of their jobs if mass automation takes place.

Where will they go? The story that Silicon Valley likes to tell us is that we will all become freelance programmers and software developers, and that we should all learn how to code to succeed in their future utopia. Unfortunately, they seem to have bought into their own hype and missed the facts. In the US, 1.8 percent of all jobs require knowledge of programming. This compares to the agricultural sector, which provides about 1.5 percent of all American jobs, and the manufacturing sector, which employs 8.1 percent of workers in this deindustrialized country. Perhaps programming will grow? The facts here are little better. The Bureau of Labor Statistics (BLS) projects that by 2024, jobs involving programming will account for a tiny 2.2 percent of the jobs available. If we look at the IT sector as a whole, according to Citi, it is expected to take up less than 3 percent of all jobs.

What about the people needed to take care of the robots? Will we see a massive surge in jobs here? Presently, robot technicians and engineers take up less than 0.1 percent of the job market—by 2024, this will dwindle even further. We will not see a major increase in jobs taking care of robots or in jobs involving coding, despite Silicon Valley’s best efforts to remake the world in its image.

This continues a long trend of new industries being very poor job creators. We all know about how few employees worked at Instagram and WhatsApp when they were sold for billions to Facebook. But the low levels of employment are a widespread sectoral problem. Research from Oxford has found that in the US, only 0.5 percent of the labor force moved into new industries (like streaming sites, web design and e-commerce) during the 2000s. The future of work does not look like a bunch of programmers or YouTubers.

In fact, the fastest-growing job sectors are not ones that require high levels of education at all. The belief that we will all become high-skilled and well-paid workers is ideological mystification at its purest. The fastest-growing sector, by far, is the healthcare industry. In the US, the BLS estimates that this sector will create 3.8 million new jobs between 2014 and 2024. This will increase its share of employment from 12 percent to 13.6 percent, making it the biggest employing sector in the country. The occupations of “healthcare support” and “healthcare practitioner” alone will contribute 2.3 million jobs—or 25 percent of all new jobs expected to be created.

There are two main reasons why this sector will be such a magnet for workers forced out of other sectors. In the first place, the demographics of high-income economies all point towards a significantly growing elderly population. Fewer births and longer lives (typically ending in chronic conditions rather than infectious diseases) will put more and more pressure on our societies to take care of the elderly, and force more and more people into care work. In the second place, this sector is not amenable to automation; it is one of the last bastions of human-centric skills like creativity, knowledge of social context, and flexibility. Demand for labor in this sector is therefore unlikely to decrease: productivity remains low, the skills remain human-centric, and demographics keep it growing.

In the end, under the scenario of a strong labor movement, we are likely to see wages rise, which will cause automation to rapidly proceed in certain sectors, while workers are forced to struggle for jobs in a low-paying healthcare sector. The result is the continued elimination of middle-wage jobs and the increased polarization of the labor market as more and more are pushed into the low-wage sectors. On top of this, a highly educated generation that was promised secure and well-paying jobs will be forced to find lower-skilled jobs, putting downward pressure on wages—generating a “reserve army of the employed”, as Robert Brenner has put it.

Workers Fall Back

Yet what happens if the labor movement remains weak? Here we have an entirely different future of work awaiting us. In this case, we end up with stagnant wages, and workers remain relatively cheap compared to investment in new equipment. The consequences of this are low levels of business investment, and subsequently, low levels of productivity growth. Absent any economic reason to invest in automation, businesses fail to increase the productivity of the labor process. Perhaps unexpectedly, under this scenario we should expect high levels of employment as businesses seek to maximize the use of cheap labor rather than investing in new technology.

This is more than a hypothetical scenario; it rather accurately describes the situation in the UK today. Since the 2008 crisis, real wages have stagnated and even fallen. Real average weekly earnings have been rising since 2014, but even after eight years they have yet to return to their pre-crisis level. This has meant that businesses have had incentives to hire cheap workers rather than invest in machines—and the low levels of investment in the UK bear this out. Since the crisis, the UK has seen long periods of decline in business investment—most recently a 0.4 percent decline between Q1 2015 and Q1 2016. The result of low investment has been virtually zero growth in productivity: from 2008 to 2015, growth in output per worker averaged 0.1 percent per year. Almost all of the UK’s recent growth has come from throwing more bodies into the economic machine, rather than from improving the efficiency of the economy. Even relative to slow productivity growth across the world, the UK is particularly struggling.

With cheap wages, low investment and low productivity, we see that companies have instead been hiring workers. Indeed, the UK employment rate has reached its highest level on record—74.2 percent as of May 2016. Likewise, unemployment is low at 5.1 percent, especially when compared to the UK’s neighbors in Europe, who average nearly double that level. So, somewhat surprisingly, an environment with a weak labor movement leads here to high levels of employment.

What is the quality of these jobs, however? We have already seen that wages have been stagnant, and that two-thirds of net job creation since 2008 has been in self-employment. There has also been a major increase in zero-hour contracts (employment arrangements that do not guarantee workers any hours). Estimates are that up to 5 percent of the labor force is in such arrangements, with over 1.7 million zero-hour contracts in use. Full-time employment is down as well: as a percentage of all jobs, it has fallen from its pre-crisis level of 65 percent to 63 percent and refused to budge even as the economy grows (slowly). The percentage of involuntary part-time workers—those who would prefer a full-time job but cannot find one—more than doubled after the crisis, and has barely begun to recover since.

Likewise with temporary employees: involuntary temporary workers as a percentage of all temporary workers rose from below 25 percent to over 40 percent during the crisis, only partly recovering to around 35 percent today. There is a vast number of workers who would prefer to work in more permanent and full-time jobs, but who can no longer find them. The UK is increasingly becoming a low-wage and precarious labor market—or, in the Tories’ view, a competitive and flexible labor market. This, we would argue, is the future that obtains with a weak labor movement: low levels of automation, perhaps, but at the expense of wages (and aggregate demand), permanent jobs and full-time work. We may not get a fully automated future, but the alternative looks just as problematic.

These are therefore the two poles of possibility for the future of work. On the one hand, a highly automated world where workers are pushed out of much low-wage non-routine work and into lower-wage care work. On the other hand, a world where humans beat robots but only through lower wages and more precarious work. In either case, we need to build up the social systems that will enable people to survive and flourish in the midst of these significant changes. We need to explore ideas like a Universal Basic Income, we need to foster investment in automation that could eliminate the worst jobs in society, and we need to recover that initial desire of the labor movement for a shorter working week.

We must reclaim the right to be lazy—which is neither a demand to be lazy nor a belief in the natural laziness of humanity, but rather the right to refuse domination by a boss, by a manager, or by a capitalist. Will robots take our jobs? We can only hope so.

Note: All uncited figures either come directly from, or are based on authors’ calculations of, data from the Bureau of Labor Statistics, O*NET and the Office for National Statistics.

A Universal Basic Income Is The Bipartisan Solution To Poverty We’ve Been Waiting For

What if the government simply paid everyone enough so that no one was poor? It’s an insane idea that’s gaining an unlikely alliance of supporters.

By Ben Schiller

Source: FastCoexist.com

There’s a simple way to end poverty: the government just gives everyone enough money, so nobody is poor. No ifs, buts, conditions, or tests. Everyone gets the minimum they need to survive, even if they already have plenty.

This, in essence, is “universal minimum income” or “guaranteed basic income”—where, instead of multiple income assistance programs, we have just one: a single payment to all citizens, regardless of background, gender, or race. It’s a policy idea that sounds crazy at first, but actually begins to make sense when you consider some recent trends.

The first is that work isn’t what it used to be. Many people now struggle through a 50-hour week and still don’t have enough to live on. There are many reasons for this—including the heartlessness of employers and the weakness of unions—but it’s a fact. Work no longer pays. The wages of most American workers have stagnated or declined since the 1970s. About 25% of workers (including 40% of those in restaurants and food service) now need public assistance to top up what they earn.

The second: it’s likely to get worse. Robots already do many menial tasks. In the future, they’ll do more sophisticated jobs as well. A study last year from Carl Frey and Michael Osborne at Oxford University found that 47% of jobs are at risk of computerization over the next two decades. That includes positions in transport and logistics, office and administration, sales and construction, and even law, financial services and medicine. Of course, it’s possible that people who lose their jobs will find others. But it’s also feasible we’re approaching an era when there will simply be less to do.

The third is that traditional welfare is both not what it used to be and not very efficient. The value of welfare for families with children is now well below what it was in the 1990s, for example. The move towards means-testing, workfare—signed into law by Bill Clinton in 1996—and other forms of conditionality has killed the universal benefit. And not just in the U.S. It’s now rare anywhere in the world that people get a check without having to do something in return. Whatever the rights and wrongs of this, it makes the income assistance system more complicated and expensive to manage. Up to 10% of the income assistance budget now goes to administering its distribution.

For these reasons and others, the idea of a basic income for everyone is becoming increasingly popular. There has been a flurry of reports and papers about it recently, and, unusually, the idea has advocates across the political spectrum.

The libertarian right likes basic income because it hates bureaucracy and thinks people should be responsible for themselves. Rather than handing out food stamps and health care (which are in-kind services), it thinks people should get cash, because cash is fungible and you can do what you like with it.

The left likes basic income because it thinks society is unequal and basic income is redistributive. It levels the playing field for people who haven’t had good opportunities in life by establishing a floor under the poorest. The “precariat” goes from being perpetually insecure to knowing it has something to live on. That, in turn, should raise well-being and produce more productive citizens.

The technology elite, like Netscape’s Marc Andreessen, also likes the idea. “As a VC, I like the fact that a lot of the political establishment is ignoring or dismissing this idea,” Albert Wenger, of Union Square Ventures, told a TED audience recently, “because what we see in startups is that the most powerful innovative ideas are ones truly dismissed by the incumbents.” A minimum income would allow us to “embrace automation rather than be afraid of it” and let more of us participate in the era of “digital abundance,” he says.

The exact details of basic income still need to be worked out, but it might work something like this: instead of welfare payments, subsidies for health care, and tax credits for the working poor, we would take that money and use it to cover a single payment giving everyone the chance to live reasonably. Switzerland recently held a referendum on just such a scheme—the proposed amount was $2,800 per month—though it was unsuccessful.

But would it actually work? The evidence from actual experiments is limited, though it’s more positive than not. A pilot in the 1970s in Manitoba, Canada, showed that a “Mincome” not only ended poverty but also reduced hospital visits and raised high-school completion rates. There seemed to be a community-affirming effect, which showed itself in people making use of free public services more responsibly.

Meanwhile, there were eight “negative income tax” trials in the U.S. in the ’70s, where people received payments and the government clawed much of the money back through taxes on their other income. The results of those trials were more mixed. They reduced poverty, but people also worked slightly less than normal. To some, this is the major drawback of basic income: it could make people lazier than they would otherwise be. That would certainly be a problem, though it’s questionable whether, in the future, there will be as much employment anyway. The age of robots and artificial intelligence seems likely to hollow out many jobs, perhaps changing how we view notions of laziness and productivity altogether.
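To make the claw-back mechanism concrete, here is a minimal sketch of how a negative income tax phases out. The $12,000 guarantee and 50 percent phase-out rate below are hypothetical round numbers chosen for illustration, not the parameters of the actual trials:

```python
# Hypothetical negative income tax (NIT): everyone is guaranteed a floor,
# and the payment shrinks as earned income rises. Parameters are invented
# for illustration only.

GUARANTEE = 12_000   # annual payment to someone with zero other income
PHASE_OUT = 0.5      # each earned dollar reduces the payment by 50 cents

def nit_payment(earned: float) -> float:
    """Payment received under the NIT for a given earned income."""
    return max(0.0, GUARANTEE - PHASE_OUT * earned)

def net_income(earned: float) -> float:
    """Earned income plus the NIT payment."""
    return earned + nit_payment(earned)

for earned in (0, 8_000, 16_000, 24_000, 32_000):
    print(f"earned {earned:>6,} -> payment {nit_payment(earned):>8,.0f}, "
          f"net {net_income(earned):>8,.0f}")
```

Under these assumed parameters, the payment disappears at a break-even income of $24,000, and the phase-out acts as an implicit 50 percent marginal tax on earnings below that point—one mechanism that could explain the slight reduction in work the trials observed.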

Experiments outside the U.S. have been more encouraging. One in Namibia cut poverty from 76% to 37%, increased non-subsidized incomes, raised education and health standards, and cut crime levels. Another, involving 6,000 people in India, paid people $7 a month—about a third of subsistence level. It, too, proved successful.

“The important thing is to create a floor on which people can start building some security. If the economic situation allows, you can gradually increase the income to where it meets subsistence,” says Guy Standing, a professor of development studies at the School of Oriental and African Studies, in London, who was involved with the pilot. “Even that modest amount had incredible effects on people’s savings, economic status, health, in children going to school, in the acquisition of items like school shoes, so people felt in control of their lives. The amount of work people were doing increased as well.”

Given the gridlock in Congress, it’s unlikely we’ll see basic income here for a while. Though the idea has supporters in both left- and right-leaning think tanks, it’s doubtful actual politicians could agree to redesign much of the federal government if they can’t agree on much else. But the idea could take off in poorer countries that have more of a blank slate and suffer from less polarization. Perhaps we’ll re-import the concept one day, once the developing world has perfected it?

Why Are We Still Working?

By Mike Dowson

Source: NewMatilda.com

This may be an opportune moment to consider the question. Especially if you’re not actually working.

You may have retired. Perhaps you’ve just left university, considering your options. Perhaps you’re taking a welcome break.

Maybe you have no choice but to take a break. Did you retire early because your job was axed? Has the casual work you depend on dried up? Have you been unable to find a job, despite your qualifications?

Perhaps, as you read this, you’re at work, filling in time, forgoing a holiday. Or at the beach, while the kids play in the surf, watching for emails on your phone.

Of course, it’s obvious why we work. Money. You don’t get something for nothing. And everything is so expensive these days.

If anything, most of us need to work more. Both spouses, extra hours, second jobs. Would anyone, except an idiot, seriously suggest we should all be working less?

Well, actually, yes.

As long ago as 1930, the economist John Maynard Keynes predicted that, by now, people in technologically advanced societies wouldn’t need to work much at all. When Keynes said this, advances in technology were yielding extraordinary increases in productivity. The implications seemed obvious. If it took less time to produce what we needed, surely we’d work less.
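Keynes’s reasoning was simple compounding. As a back-of-the-envelope sketch (the 2 percent growth rate, 100-year horizon and 48-hour baseline week below are illustrative assumptions, not Keynes’s own figures), a few lines of arithmetic show why the prediction seemed safe:

```python
# Back-of-the-envelope compounding behind the Keynes prediction.
# All figures are illustrative assumptions.

growth = 0.02                        # assumed annual productivity growth
years = 100                          # assumed horizon
multiplier = (1 + growth) ** years   # ~7.2x output per hour worked

hours_1930 = 48                      # assumed full-time week in 1930
hours_needed = hours_1930 / multiplier

print(f"Output per hour rises {multiplier:.1f}x over {years} years;")
print(f"a 1930 standard of living would need ~{hours_needed:.1f} hours/week.")
```

At these assumed rates, output per hour rises roughly sevenfold, so maintaining a 1930 standard of living would take well under a day’s work per week; Keynes’s famous fifteen-hour week left room for living standards to rise substantially as well.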

It turns out that for much of the 20th century, average working hours in developed countries steadily fell. Then, around the 1970s, the trend plateaued. In some countries it reversed, and working hours began to climb again. This occurred at the same time women were entering the workforce in great numbers, so total workforce participation also increased.

In Australia, by the new millennium, many full time employees were working more than their grandparents had.

What happened? Did technology fail to deliver the gains Keynes expected?

On the contrary. Technological advancement outstripped even the giddy imaginations of futurists from a century ago. We can grow food, dig up minerals, make fridges and bridges, move things and ourselves around the planet and share knowledge and information much faster with a fraction of the workforce it once took.

But if staggering productivity gains haven’t manifested as lower working hours, where did they go?

Some prominent economists, including some Nobel laureates, have grappled with this question.

Gary Becker observed that our appetite for material goods has expanded along with our ability to produce them. Instead of working fewer hours, we opted for bigger houses with more gadgets, which we replace more often.

This process has been fuelled by a deluge of marketing, which persuades us to consume things we previously didn’t recognise a need for.

Does that explain it? Anthropologist David Graeber doesn’t think so. If it continually takes fewer human hours to produce these things, shouldn’t we be able to afford them without working more? What are all these working hours producing?

Graeber argues that, although productive jobs have, in fact, been steadily automated away just as predicted, we have also seen a vast proliferation of new jobs that only seem to exist to keep people working.

Consider this. Productivity growth has stalled in Australia. How can this be? Technology hasn’t stopped advancing. The time we should be winning back through productivity gains must be getting reabsorbed.

Productivity returns are highest in capital-intensive industries like mining and manufacturing. As those jobs disappear, either replaced by technology, or lost altogether, the workforce moves into labour-intensive industries like hospitality and professional services. This dilutes the gains in the other industries.
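The dilution is just a weighted average. In this hypothetical two-sector sketch (all figures invented for illustration), aggregate productivity falls when employment shifts toward the lower-productivity sector, even though neither sector becomes any less productive:

```python
# Hypothetical two-sector illustration of productivity dilution.
# Output-per-worker figures and employment shares are invented.

def aggregate_productivity(shares, productivities):
    """Employment-weighted average output per worker."""
    return sum(s * p for s, p in zip(shares, productivities))

productivities = (100, 40)   # capital-intensive vs labour-intensive sector

before = aggregate_productivity((0.5, 0.5), productivities)  # -> 70.0
after = aggregate_productivity((0.3, 0.7), productivities)   # -> 58.0

print(f"Aggregate productivity falls from {before:.0f} to {after:.0f} "
      "as employment shifts, with no change in either sector.")
```

On these invented numbers, moving a fifth of the workforce between sectors drags the aggregate down by more than 17 percent—the arithmetic behind the stalling described above.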

At the same time, unemployment has been trending up since 2008. Young people, especially, are out of work. The number of underemployed people, who would work more if they could, is also high. More jobs are casual.

There’s a downward trend in job prospects for new graduates. Some of them settle for part-time work or a free internship. Many find work unrelated to their primary qualification. That work is now more likely to be in a job without benefits, or in multiple such jobs.

There’s another factor. Our lives are now longer relative to our working lives. We tend to start full-time work later, after years of study, and more of life is spent in retirement. Many jobless older people are struggling with the cost of living. Many would work more if they could.

Instead of everyone working less, what seems to be happening is that experienced workers, in professions which are still in demand, are working more, while the young, the old, and those with skills which no longer attract investment have difficulty finding work.

MIT academics Andrew McAfee and Erik Brynjolfsson refer to this as the great decoupling. For many years, real GDP per capita and median income rose in tandem. Since the 1970s, wages as a percentage of GDP have fallen dramatically, while corporate profits as a percentage of GDP are now at their highest level, despite recurring economic shocks.

To put it simply, labour isn’t as important to growth as it used to be.

There is nothing in the economic outlook or current government policy settings which suggests this trend is going to change.

Automation, artificial intelligence and robotics are encroaching on more human occupations. The Committee for Economic Development of Australia (CEDA) has estimated that as many as 40 per cent of the jobs that are left are vulnerable to replacement by technology over the next decade.

No matter how many politicians chant the jobs mantra for the media, more productive jobs are going to disappear.

The terrible irony in this situation is that there is so much that needs to be done.

Among the underemployed graduates I personally know of, there is a psychologist, a soil chemist and a biodiversity specialist. Have we run out of things to do in the areas of mental health, agriculture and the environment?

Mental illness is widespread. Our food bowl is under threat from climate change. We have a mass extinction on our hands.

What we don’t have, apparently, is sufficient money to invest in making full use of the talent that is available to face these challenges.

Why? What failure of collective enterprise could result in this absurd incongruity?

Capital, like technology, is largely blind to human need. Capital goes where the profit is. If there was profit in healing minds and saving species, some of it would go there. While there is more profit in alcohol, gambling and deforestation, more of it will go there.

People don’t register their desire for a healthy society by shopping for it. Capital doesn’t get that signal through the market. The argument that consumers somehow direct the course of civilisation by choosing dolphin-friendly tuna and “eco” cleaning products is stupid and facile. The factors that most affect our destiny are not options in the supermarket.

If a healthy society is something we want, we have to act collectively. Since few people are active major shareholders, for the time being that task tends to fall to governments.

Whether enacted via direct spending or by creating incentives for private investment, government initiatives are funded from collective surplus—in other words, tax revenue or borrowing against future increases in earnings. Despite political spin to the contrary, our tax take is low as a proportion of GDP compared to the OECD average.

The great decoupling has coincided with rising inequality. Those with money to invest get rich. Those with only labour to sell miss out. Capital doesn’t like to pay for labour, and it doesn’t like to pay tax either.

But why, if our labour isn’t needed for profit, are we still working?

Faced with a looming crisis in social services, but committed ideologically to low taxation, successive Australian governments used tax concessions to turn superannuation and real estate – where most Australians keep their wealth – into a mini-capitalist alternative to social security.

Of course, this only works while people have jobs that provide super and sufficient income to buy housing. And it doesn’t help the real economy, the place where we apply technological innovation to produce things of real value, especially things we can export.

Nevertheless, one group of people enriched themselves through property investment, pushing up the value of real estate around the country in the process. Another group of people became affluent with nothing more than a job that paid super and a home in a good location.

With commodity revenue pouring in from overseas, it was easy to believe we had discovered some kind of magic prosperity formula. But the surplus generated from commodities mostly wasn’t invested back into productive activity. Instead it was turned into tax cuts and other benefits. These had broad electoral appeal but favoured the wealthy, and encouraged further speculation.

The real estate boom didn’t make the country richer. Nor did it make housing more accessible. It simply transferred wealth from one group of people to another. In the process, it put a basic need out of reach of many, including young people, and diverted investment from the productive economy. It also lured a huge number of Australians into precarious debt.

Contrary to popular opinion, encouraged by unscrupulous politics, we have relatively low government debt, but we now have the largest per capita private debt in the world.

So why are we still working? Because we’re in debt.

Middle-aged people are the ones working long hours. They’re also the ones buying houses. And they’re the ones with the most credit card debt as well.

The generation before them had affordable housing, job security and a real social safety net. Today’s middle-aged are not so fortunate, and for the generation after them, a steady job with enough for a deposit has become a kind of Holy Grail, while social security is survival at best.

The current trend points to a time when a young graduate might start adult life with a HECS debt, go into credit card debt on a part-time job and a free internship, and eventually get into massive debt to own a flat her grandparents could have bought with ease.

She might even find a job in financial services, if they haven’t all been automated. It’s the sector that helps wealthy people turn their money into more money. It’s also where ordinary people go to borrow money for a house.

Debt is profitable. Even during the great decoupling, as productive jobs disappear, and real wages fall, it’s proven possible to harness the aspirations of ordinary people for profit, without any of the effort or intelligence required for developing new productive capacity, by simply enticing a greater proportion of personal income into servicing debt.

The mining boom is over. Not that it was ever as important as the miners like to claim. Manufacturing continues its long decline. The banks have been warned they are overexposed.

Whatever combination of policy levers is applied, we need to create the conditions that direct investment into producing things that we and the world need, while caring for our environment and our population. We don’t need to direct it into unearned private wealth at the expense of our neighbours, our country and future generations.

Our current class of politicians has so far failed to even acknowledge our present circumstances, let alone articulate a credible vision for change. Many of them became rich from property investment. Our Prime Minister is a former banker.

Naturally, the people who’ve done well for themselves are reluctant to sacrifice their advantage. Nevertheless, we have to change the narrative around “wealth creation” from one which is essentially about personal enrichment from gaming the system, to one which is about mutual benefit through innovation and productivity.

Change has come, whether we like it or not. If we respond intelligently, taking advantage of the potential we have developed through our education system, we may very well end up working less, but not in a divided society, with many of us struggling to survive.

Forget Techno-Optimism: We Can’t Innovate Our Way Out of Inequality

By Chris Lehmann

Source: In These Times

Toward the end of his 250-page hymn to digital-age innovation, The Industries of the Future, Alec Ross pauses to offer a rare cautionary note. Silicon Valley may have incubated all the wonders and conveniences one can imagine—and oh, so many more! But for the international business elites looking to remake their emerging market economies in the Valley’s gleaming, khaki-clad image, there’s some bad news: It can no longer be done. A “decades-long head start” has granted too great a competitive advantage to the charmed peninsula along the Northern California coast.

Not to worry, though! On-the-make tech globalists can still make a go of it, provided they’re prepared to embrace “specific cultural and labor market characteristics that can contradict both a society’s norms and the more controlling impulses of government leaders.”

Stripped of the vague and glowing techno-babble, this is a prescription for good old-fashioned neoliberal market discipline. Everywhere Ross looks across the radically transformed world of digital commerce, the benign logic of market triumphalism wins the day. When Terry Gou—the Taiwanese CEO of Foxconn, the vast Chinese electronics sweatshop that doubles as an incubator for worker suicides—plans to eliminate the headache of supervising an unstable human workforce by replacing it with “the first fully automated plant” in manufacturing history, why, he’s simply “responding to pure market forces”: i.e., an increase in Chinese wages that cuts into Foxconn’s ridiculously broad profit margins. And you and I might see the so-called sharing economy as a means to casualize service workers into nonunion, benefit-free gigs that transfer economic value on a massive scale to a rentier class of Silicon Valley app marketers. But bouncy New Economy cheerleaders like Ross see “a way of making a market out of anything, and a microentrepreneur out of anyone.”

When confronted with the spiraling of income inequality in the digital age, Ross, like countless other prophets of better living through software, sagely counsels that “rapid progress often comes with greater instability.” Sure, the “wealthy generally benefit over the short term,” but remember, kids: “Innovations have the potential to become cheaper over time and spread throughout the greater population.”

Ross first stormed into political prominence as an architect of Barack Obama’s “technology and innovation plan” during his 2008 presidential campaign, and he then spent four years captaining his own charmed, closed circle of tech triumphalism as the State Department’s “senior advisor for innovation” under Secretary of State Hillary Clinton. This renders The Industries of the Future something more than another breathless, Tom Friedman-style tour of the wonderments being hatched in startups, trade confabs and gadget factories. Ross’ book is also a tech-policy playbook for the likely Democratic presidential nominee, who has spared no effort in soliciting the policy input—and landing the campaign donations—of the Silicon Valley mogul set. As such, it should give any Hillary-curious supporter of economic justice considerable pause.

To be sure, Ross raises some vague concerns about how, for example, the runaway growth of the sharing economy drains workers of job security, healthcare benefits, pensions and the like. He avers that “as the sharing economy grows … the safety net needs to grow with it,” but, much like his politically savvy boss, he offers nothing in the way of policy specifics besides the inarguable yet unactionable truism that if the sharing economy “generates enormous amounts of wealth for the platform owners, then the platform owners can and should help pay for added costs to society.”

The larger point for Ross, in any event, is that the innovative megafirms of tomorrow will come to spontaneously serve the public good. Not to mention that many IPO investors “are pension funds,” Ross coos, which “manage the retirement funds for people in the working class like teachers, police officers, and other civil servants.” Never mind, of course, that the neoliberal logic of the Uber model means that we’re creating a workforce that’s unlikely ever to come within shouting distance of a pension benefit again.

This kind of terminal Silicon Valley myopia also accounts for the vast economic and political blind spots that continually undermine Ross’ relentlessly chipper TED patter. To take just one instructive instance: in a book that devotes considerable real estate to the innovations of “fintech” (the streamlining of global digital currency exchanges and investment transactions), nowhere does the author acknowledge the pivotal role that tech-savvy Wall Street analysts—the “quants,” as they’re known in Street argot—played in stoking the early-aughts housing bubble that led to the near-meltdown of the global economy.

That’s because it’s an axiomatic faith for this brand of techno-prophecy that innovation can never actually make anything worse—in just the same fashion that the quants were insisting, right up until the end, that there could never be a downturn in the national housing market. If this is the kind of wisdom Hillary Clinton relied on to promote her global innovation agenda at the State Department, one shudders to think of how it might run riot through the White House come next January.
