After the Crash

Dispatches From a Long Recovery (Est. 10/2024)

The United States of Work

Employers exercise vast control over our lives, even when we’re not on the job. How did our bosses gain power that the government itself doesn’t hold?

By Miya Tokumitsu

Source: New Republic

Work no longer works. “You need to acquire more skills,” we tell young job seekers whose résumés at 22 are already longer than their parents’ were at 32. “Work will give you meaning,” we encourage people to tell themselves, so that they put in 60 hours or more per week on the job, removing them from other sources of meaning, such as daydreaming or social life. “Work will give you satisfaction,” we insist, even though it requires abiding by employers’ rules, and the unwritten rules of the market, for most of our waking hours. At the very least, work is supposed to be a means to earning an income. But if it’s possible to work full time and still live in poverty, what’s the point?

Even before the global financial crisis of 2008, it had become clear that if waged work is supposed to provide a measure of well-being and social structure, it has failed on its own terms. Real household wages in the United States have remained stagnant since the 1970s, even as the costs of university degrees and other credentials rise. Young people find an employment landscape defined by unpaid internships, temporary work, and low pay. The glut of degree-holding young workers has pushed many of them into the semi- or unskilled labor force, making prospects even narrower for non–degree holders. Entry-level wages for high school graduates have in fact fallen. According to a study by the Federal Reserve Bank of New York, these lost earnings will depress this generation’s wages for their entire working lives. Meanwhile, those at the very top—many of whom derive their wealth not from work, but from returns on capital—vacuum up an ever-greater share of prosperity.

Against this bleak landscape, a growing body of scholarship aims to overturn our culture’s deepest assumptions about how work confers wealth, meaning, and care throughout society. In Private Government: How Employers Rule Our Lives (and Why We Don’t Talk About It), Elizabeth Anderson, a professor of philosophy at the University of Michigan, explores how the discipline of work has itself become a form of tyranny, documenting the expansive power that firms now wield over their employees in everything from how they dress to what they tweet. James Livingston, a historian at Rutgers, goes one step further in No More Work: Why Full Employment Is a Bad Idea. Instead of insisting on jobs for all or proposing that we hold employers to higher standards, Livingston argues, we should just scrap work altogether.

Livingston’s vision is the more radical of the two; his book is a wide-ranging polemic that frequently delivers the refrain “Fuck work.” But in original ways, both books make a powerful claim: that our lives today are ruled, above all, by work. We can try to convince ourselves that we are free, but as long as we must submit to the increasing authority of our employers and the labor market, we are not. We therefore fancy that we want to work, that work grounds our character, that markets encompass the possible. We are unable to imagine what a full life could be, much less to live one. Even more radically, both books highlight the dramatic and alarming changes that work has undergone over the past century—insisting that, in often unseen ways, the changing nature of work threatens the fundamental ideals of democracy: equality and freedom.

Anderson’s most provocative argument is that large companies, the institutions that employ most workers, amount to a de facto form of government, exerting massive and intrusive power in our daily lives. Unlike the state, these private governments are able to wield power with little oversight, because the executives and boards of directors that rule them are accountable to no one but themselves. Although they exercise their power to varying degrees and through both direct and “soft” means, employers can dictate how we dress and style our hair, when we eat, when (and if) we may use the toilet, with whom we may partner and under what arrangements. Employers may subject our bodies to drug tests; monitor our speech both on and off the job; require us to answer questionnaires about our exercise habits, off-hours alcohol consumption, and childbearing intentions; and rifle through our belongings. If the state held such sweeping powers, Anderson argues, we would probably not consider ourselves free men and women.

Employees, meanwhile, have few ways to fight back. Yes, they may leave the company, but doing so usually necessitates being unemployed or migrating to another company and working under similar rules. Workers may organize, but unions have been so decimated in recent years that their clout is greatly diminished. What’s more, employers are swift to fire anyone they suspect of speaking to their colleagues about organizing, and most workers lack the time and resources to mount a legal challenge to wrongful termination.

It wasn’t supposed to be this way. As corporations have worked methodically to amass sweeping powers over their employees, they have held aloft the beguiling principle of individual freedom, claiming that only unregulated markets can guarantee personal liberty. Instead, operating under relatively few regulations themselves, these companies have succeeded at imposing all manner of regulation on their employees. That is to say, they use the language of individual liberty to claim that corporations require freedom to treat workers as they like.

Anderson sets out to discredit such arguments by tracing them back to their historical origins. The notion that personal freedom is rooted in free markets, for instance, originated with the Levellers in seventeenth-century England, when working conditions differed substantially from today’s. The Levellers believed that a market society was essential to liberate individuals from the remnants of feudal hierarchies; their vision of utopia was a world in which men could meet and interact on terms of equality and dignity. Their ideas echoed through the writing and politics of later figures like John Locke, Adam Smith, Thomas Paine, and Abraham Lincoln, all of whom believed that open markets could provide the essential infrastructure for individuals to shape their own destiny.

An anti-statist streak runs through several of these thinkers, particularly the Levellers and Paine, who viewed markets as the bulwark against state oppression. Paine and Smith, however, would hardly qualify as hard-line contemporary libertarians. Smith believed that public education was essential to a fair market society, and Paine proposed a system of social insurance that included old-age pensions as well as survivor and disability benefits. Their hope was not for a world of win-or-die competition, but one in which open markets would allow individuals to make the fullest use of their talents, free from state monopolies and meddlesome bosses.

For Anderson, the latter point is essential; the notion of lifelong employment under a boss was anathema to these earlier visions of personal freedom. Writing in the 1770s, Smith assumes that independent actors in his market society will be self-employed, and uses butchers and bakers as his exemplars; his “pin factory,” meant to illustrate division of labor, employs only ten people. These thinkers could not envision a world in which most workers spend most of their lives performing wage labor under a single employer. In an address before the Wisconsin State Agricultural Society in 1859, Lincoln stated, “The prudent, penniless beginner in the world labors for wages awhile, saves a surplus with which to buy tools or land for himself, then labors on his own account another while, and at length hires another new beginner to help him.” In other words, even well into the nineteenth century, defenders of an unregulated market society viewed wage labor as a temporary stage on the way to becoming a proprietor.

Lincoln’s scenario does not reflect the way most people work today. Yet the “small business owner” endures as an American stock character, conjured by politicians to push through deregulatory measures that benefit large corporations. In reality, thanks to a lack of guaranteed, nationalized health care and threadbare welfare benefits, setting up a small business is simply too risky a venture for many Americans, who must rely on their employers for health insurance and income. These conditions render long-term employment more palatable than a precarious existence of freelance gigs, which further gives companies license to oppress their employees.

The modern relationship between employer and employee began with the rise of large-scale companies in the nineteenth century. Although employment contracts date back to the Middle Ages, preindustrial arrangements bore little resemblance to the documents we know today. Like modern employees, journeymen and apprentices often served their employers for years, but masters performed the same or similar work in proximity to their subordinates. As a result, Anderson points out, working conditions—the speed required of workers and the hazards to which they might be exposed—were kept in check by what the masters were willing to tolerate for themselves.

The Industrial Revolution brought radical changes, as companies grew ever larger and management structures more complex. “Employers no longer did the same kind of work as employees, if they worked at all,” Anderson observes. “Mental labor was separated from manual labor, which was radically deskilled.” Companies multiplied rapidly in size. Labor contracts now bound workers to massive organizations in which discipline, briefs, and decrees flowed downward, but whose leaders were unreachable by ordinary workers. Today, fast food workers or bank tellers would be hard-pressed to petition their CEOs at McDonald’s or Wells Fargo in person.

Despite this, we often speak of employment contracts as agreements between equals, as if we are living in Adam Smith’s eighteenth-century dream world. In a still-influential paper from 1937 titled “The Nature of the Firm,” the economist and Nobel laureate Ronald Coase established himself as an early observer and theorist of corporate concerns. He described the employment contract not as a document that handed the employer unaccountable powers, but as one that circumscribed those powers. In signing a contract, the employee “agrees to obey the directions of an entrepreneur within certain limits,” he emphasized. But such characterizations, as Anderson notes, do not reflect reality; most workers agree to employment without any negotiation or even communication about their employer’s power or its limits. The exceptions to this rule are few and notable: top professional athletes, celebrity entertainers, superstar academics, and the (increasingly small) groups of workers who are able to bargain collectively.

Yet because employment contracts create the illusion that workers and companies have arrived at a mutually satisfying agreement, the increasingly onerous restrictions placed on modern employees are often presented as “best practices” and “industry standards,” framing all sorts of behaviors and outcomes as things that ought to be intrinsically desired by workers themselves. Who, after all, would not want to work on something in the “best” way? Beyond employment contracts, companies also rely on social pressure to foster obedience: If everyone in the office stays until seven o’clock every night, who would risk departing at five, even if it’s technically allowed? Such social prods exist alongside more rigid behavioral codes that dictate everything from how visible an employee’s tattoo can be to when and how long workers can break for lunch.

Many workers, in fact, have little sense of the legal scope of their employer’s power. Most would be shocked to discover that they could be fired for being too attractive, declining to attend a political rally favored by their employer, or finding out that their daughter was raped by a friend of the boss—all real-life examples cited by Anderson. Indeed, it is only after dismissal for such reasons that many workers learn of the sweeping breadth of at-will employment, the contractual norm that allows American employers to fire workers without warning and without cause, except for reasons explicitly deemed illegal.

In reality, the employment landscape is even more dire than Anderson outlines. The rise of staffing or “temp” agencies, for example, undercuts the very idea of a direct relationship between worker and employer. In The Temp Economy: From Kelly Girls to Permatemps in Postwar America, sociologist Erin Hatton notes that millions of workers now labor under subcontracting arrangements, which give employers even greater latitude to abuse employees. For years, Walmart—America’s largest retailer—used a subcontracting firm to hire hundreds of cleaners, many from Eastern Europe, who worked for months on end without overtime pay or a single day off. After federal agents raided dozens of Walmarts and arrested the cleaners as illegal immigrants, company executives used the subcontracting agreement to shirk responsibility for their exploitation of the cleaners, claiming they had no knowledge of their immigration status or conditions.

By any reasonable standard, much “temp” work is not even temporary. Employees sometimes work for years in a single workplace, even through promotions, without ever being granted official status as an employee. Similarly, “gig economy” platforms like Uber designate their workers as contractors rather than employees, a distinction that exempts the company from paying them minimum wage and overtime. Many “permatemps” and contractors perform the same work as employees, yet lack even the paltry protections and benefits awarded to full-time workers.

A weak job market, paired with the increasing precarity of work, means that more and more workers are forced to make their living by stringing together freelance assignments or winning fixed-term contracts, subjecting those workers to even more rules and restrictions. On top of their actual jobs, contractors and temp workers must do the additional work of appearing affable and employable not just on the job, but during their ongoing efforts to secure their next gig. Constant pitching, application writing, and personal branding on social media require a level of self-censorship, lest a controversial tweet or compromising Facebook photo sink their job prospects. Forced to anticipate the wishes not of a specific employer, but of all potential future employers, many opt out of participating in social media or practicing politics in any visible capacity. Their public personas are shaped not by their own beliefs and desires, but by the demands of the labor market.


For Livingston, it’s not just employers but work itself that is the problem. We toil because we must, but also because our culture has trained us to see work as the greatest enactment of our dignity and personal character. Livingston challenges us to turn away from such outmoded ideas, rooted in Protestant ideals. Like Anderson, he sweeps through centuries of labor theory with impressive efficiency, from Marx and Hegel to Freud and Lincoln, whose 1859 speech he also quotes. Livingston centers on these thinkers because they all found the connection between work and virtue troubling. Hegel believed that work causes individuals to defer their desires, nurturing a “slave morality.” Marx proposed that “real freedom came after work.” And Freud understood the Protestant work ethic as “the symptom of repression, perhaps even regression.”

Nor is it practical, Livingston argues, to exalt work: There are simply not enough jobs to keep most adults employed at a living wage, given the rise of automation and increases in productivity. Besides, the relation between income and work is arbitrary. Cooking dinner for your family is unpaid work, while cooking dinner for strangers usually comes with a paycheck. There’s nothing inherently different in the labor involved—only in the compensation. Anderson argues that work impedes individual freedom; Livingston points out that it rarely pays enough. As technological advances continue to weaken the demand for human labor, wages will inevitably be driven down even further. Instead of idealizing work and making it the linchpin of social organization, Livingston suggests, why not just get rid of it?

Livingston belongs to a cadre of thinkers, including Kathi Weeks, Nick Srnicek, and Alex Williams, who believe that we should strive for a “postwork” society in one form or another. Strands of this idea go back at least as far as Keynes’s 1930 essay on “Economic Possibilities for our Grandchildren.” Not only would work be eliminated or vastly reduced by technology, Keynes predicted, but we would also be unburdened spiritually. Devotion to work was, he deemed, one of many “pseudo-moral principles” that “exalted some of the most distasteful of human qualities into the position of the highest virtues.”

Since people in this new world would no longer have to earn a salary, they would, Livingston envisions, receive some kind of universal basic income. UBI is a slippery concept, adaptable to both the socialist left and libertarian right, but it essentially entails distributing a living wage to every member of society. In most conceptualizations, the income is indeed basic—no cases of Dom Pérignon—and would cover the essentials like rent and groceries. Individuals would then be free to choose whether and how much they want to work to supplement the UBI. Leftist proponents tend to advocate pairing UBI with a strong welfare state to provide nationalized health care, tuition-free education, and other services. Some libertarians view UBI as a way to pare down the welfare state, arguing that it’s better simply to give people money to buy food and health care directly, rather than forcing them to engage with food stamp and Medicaid bureaucracies.

According to Livingston, we are finally on the verge of this postwork society because of automation. Robots are now advanced enough to take over complex jobs in areas like agriculture and mining, eliminating the need for humans to perform dangerous or tedious tasks. In practice, however, automation is a double-edged sword, with the capacity to oppress as well as unburden. Machines often accelerate the rate at which humans can work, taxing rather than liberating them. Conveyor belts eliminated the need for workers to pass unfinished products along to their colleagues—but as Charlie Chaplin and Lucille Ball so hilariously demonstrated, the belts also increased the pace at which those same workers needed to turn wrenches and wrap chocolates. In retail and customer service, a main function of automation has been not to eliminate work, but to eliminate waged work, transferring much of the labor onto consumers, who must now weigh and code their own vegetables at the supermarket, check out their own library books, and tag their own luggage at the airport.

At the same time, it may be harder to automate some jobs that require a human touch, such as floristry or hairstyling. The same goes for the delicate work of caring for the young, sick, elderly, or otherwise vulnerable. In today’s economy, the demand for such labor is rising rapidly: “Nine of the twelve fastest-growing fields,” The New York Times reported earlier this year, “are different ways of saying ‘nurse.’” These jobs also happen to be low-paying, emotionally and physically grueling, dirty, hazardous, and shouldered largely by women and immigrants. Regardless of whether employment is virtuous or not, our immediate goal should perhaps be to distribute the burdens of caregiving, since such work is essential to the functioning of society and benefits us all.


A truly work-free world would entail a revolution in our present social organization. We could no longer conceive of welfare as a last resort—as the “safety net” metaphor implies—but would be forced to treat it as an unremarkable and universal fact of life. This alone would require us to support a massive redistribution of wealth, and to reclaim our political institutions from the big-money interests that are allergic to such changes. Tall orders indeed—but as Srnicek and Williams remind us in their book, Inventing the Future: Postcapitalism and a World Without Work, neoliberals pulled off just such a revolution in the postwar years. Thanks to their efforts, free-market liberalism replaced Keynesianism as the political and economic common sense all around the world.

Another possible solution to the current miseries of unemployment and worker exploitation is the one Livingston rejects in his title: full employment. For anti-work partisans, full employment takes us in the wrong direction, and UBI corrects the course. But the two are not mutually exclusive. In fact, rather than creating new jobs, full employment could require us to reduce our work hours drastically and spread them throughout the workforce—a scheme that could radically de-center waged work in our lives. A dual strategy of pursuing full employment while also demanding universal benefits—including health care, childcare, and affordable housing—would maximize workers’ bargaining power to ensure that they, and not just owners of capital, actually get to enjoy the bounty of labor-saving technology.

Nevertheless, Livingston’s critiques of full employment are worth heeding. As with automation, it can all go wrong if we use the banner of full employment to create pointless roles—what David Graeber has termed “bullshit jobs,” in which workers sit in some soul-sucking basement office for eight hours a day—or harmful jobs, like building nuclear weapons. If we do not have a deliberate politics rooted in universal social justice, then full employment, a basic income, and automation will not liberate us from the degradations of work.

Both Livingston and Anderson reveal how much of our own power we’ve already ceded in making waged work the conduit for our ideals of liberty and morality. The scale and coordination of the institutions we’re up against in the fight for our emancipation is, as Anderson demonstrates, staggering. Employers hold the means to our well-being, and they have the law on their side. Individual efforts to achieve a better “work-life balance” for ourselves and our families miss the wider issue we face as waged employees. Livingston demonstrates the scale at which we should be thinking: Our demands should be revolutionary, our imaginations wide. Standing amid the wreckage of last year’s presidential election, what other choice do we have?

 

Miya Tokumitsu is a lecturer in art history at the University of Melbourne and a contributing editor at Jacobin. She is the author of Do What You Love: And Other Lies About Success and Happiness.

Fuck Work


Economists believe in full employment. Americans think that work builds character. But what if jobs aren’t working anymore?

By James Livingston

Source: aeon

Work means everything to us Americans. For centuries – since, say, 1650 – we’ve believed that it builds character (punctuality, initiative, honesty, self-discipline, and so forth). We’ve also believed that the market in labour, where we go to find work, has been relatively efficient in allocating opportunities and incomes. And we’ve believed that, even if it sucks, a job gives meaning, purpose and structure to our everyday lives – at any rate, we’re pretty sure that it gets us out of bed, pays the bills, makes us feel responsible, and keeps us away from daytime TV.

These beliefs are no longer plausible. In fact, they’ve become ridiculous, because there’s not enough work to go around, and what there is of it won’t pay the bills – unless of course you’ve landed a job as a drug dealer or a Wall Street banker, becoming a gangster either way.

These days, everybody from Left to Right – from the economist Dean Baker to the social scientist Arthur C Brooks, from Bernie Sanders to Donald Trump – addresses this breakdown of the labour market by advocating ‘full employment’, as if having a job is self-evidently a good thing, no matter how dangerous, demanding or demeaning it is. But ‘full employment’ is not the way to restore our faith in hard work, or in playing by the rules, or in whatever else sounds good. The official unemployment rate in the United States is already below 6 per cent, which is pretty close to what economists used to call ‘full employment’, but income inequality hasn’t changed a bit. Shitty jobs for everyone won’t solve any social problems we now face.

Don’t take my word for it, look at the numbers. Already a fourth of the adults actually employed in the US are paid wages lower than would lift them above the official poverty line – and so a fifth of American children live in poverty. Almost half of employed adults in this country are eligible for food stamps (most of those who are eligible don’t apply). The market in labour has broken down, along with most others.

Those jobs that disappeared in the Great Recession just aren’t coming back, regardless of what the unemployment rate tells you – the net gain in jobs since 2000 still stands at zero – and if they do return from the dead, they’ll be zombies, those contingent, part-time or minimum-wage jobs where the bosses shuffle your shift from week to week: welcome to Wal-Mart, where food stamps are a benefit.

And don’t tell me that raising the minimum wage to $15 an hour solves the problem. No one can doubt the moral significance of the movement. But at this rate of pay, you pass the official poverty line only after working 29 hours a week. The current federal minimum wage is $7.25. Working a 40-hour week, you would have to make $10 an hour to reach the official poverty line. What, exactly, is the point of earning a paycheck that isn’t a living wage, except to prove that you have a work ethic?
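To make that paycheck arithmetic concrete, here is a minimal sketch in Python. The poverty threshold used is an assumed round figure (roughly the official guideline for a small household in the mid-2010s, chosen for illustration); Livingston’s own two examples imply slightly different thresholds, so treat the outputs as approximate rather than official.

    # Sketch of the paycheck arithmetic in the paragraph above.
    # POVERTY_LINE is an assumed annual threshold, not an official figure.
    POVERTY_LINE = 21_000     # assumed annual threshold, in dollars
    WEEKS_PER_YEAR = 52       # assumes year-round work, no unpaid weeks

    def annual_income(hourly_wage: float, hours_per_week: float) -> float:
        """Gross annual pay for a given wage and weekly schedule."""
        return hourly_wage * hours_per_week * WEEKS_PER_YEAR

    def hours_needed(hourly_wage: float, threshold: float = POVERTY_LINE) -> float:
        """Weekly hours required to gross the threshold at this wage."""
        return threshold / (hourly_wage * WEEKS_PER_YEAR)

    print(f"$15.00/hr: {hours_needed(15.00):.1f} hrs/week to clear the line")
    print(f"$10.00/hr, 40 hrs/week grosses ${annual_income(10.00, 40):,.0f}")
    print(f"$7.25/hr, 40 hrs/week grosses ${annual_income(7.25, 40):,.0f}")

Whatever exact threshold one picks, the shape of the result is the same: at the federal minimum of $7.25, even a year-round 40-hour week grosses only about $15,000, which is Livingston’s point.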

But, wait, isn’t our present dilemma just a passing phase of the business cycle? What about the job market of the future? Haven’t the doomsayers, those damn Malthusians, always been proved wrong by rising productivity, new fields of enterprise, new economic opportunities? Well, yeah – until now, these times. The measurable trends of the past half-century, and the plausible projections for the next half-century, are just too empirically grounded to dismiss as dismal science or ideological hokum. They look like the data on climate change – you can deny them if you like, but you’ll sound like a moron when you do.

For example, the Oxford economists who study employment trends tell us that almost half of existing jobs, including those involving ‘non-routine cognitive tasks’ – you know, like thinking – are at risk of death by computerisation within 20 years. They’re elaborating on conclusions reached by two MIT economists in the book Race Against the Machine (2011). Meanwhile, the Silicon Valley types who give TED talks have started speaking of ‘surplus humans’ as a result of the same process – cybernated production. Rise of the Robots, a new book that cites these very sources, is social science, not science fiction.

So this Great Recession of ours – don’t kid yourself, it ain’t over – is a moral crisis as well as an economic catastrophe. You might even say it’s a spiritual impasse, because it makes us ask what social scaffolding other than work will permit the construction of character – or whether character itself is something we must aspire to. But that is why it’s also an intellectual opportunity: it forces us to imagine a world in which the job no longer builds our character, determines our incomes or dominates our daily lives.

In short, it lets us say: enough already. Fuck work.

Certainly this crisis makes us ask: what comes after work? What would you do without your job as the external discipline that organises your waking life – as the social imperative that gets you up and on your way to the factory, the office, the store, the warehouse, the restaurant, wherever you work and, no matter how much you hate it, keeps you coming back? What would you do if you didn’t have to work to receive an income?

And what would society and civilisation be like if we didn’t have to ‘earn’ a living – if leisure was not our choice but our lot? Would we hang out at the local Starbucks, laptops open? Or volunteer to teach children in less-developed places, such as Mississippi? Or smoke weed and watch reality TV all day?

I’m not proposing a fancy thought experiment here. By now these are practical questions because there aren’t enough jobs. So it’s time we asked even more practical questions. How do you make a living without a job – can you receive income without working for it? Is it possible, to begin with? And then, the hard part: is it ethical? If you were raised to believe that work is the index of your value to society – as most of us were – would it feel like cheating to get something for nothing?

We already have some provisional answers because we’re all on the dole, more or less. The fastest growing component of household income since 1959 has been ‘transfer payments’ from government. By the turn of the 21st century, 20 per cent of all household income came from this source – from what is otherwise known as welfare or ‘entitlements’. Without this income supplement, half of the adults with full-time jobs would live below the poverty line, and most working Americans would be eligible for food stamps.

But are these transfer payments and ‘entitlements’ affordable, in either economic or moral terms? By continuing and enlarging them, do we subsidise sloth, or do we enrich a debate on the rudiments of the good life?

Transfer payments or ‘entitlements’, not to mention Wall Street bonuses (talk about getting something for nothing) have taught us how to detach the receipt of income from the production of goods, but now, in plain view of the end of work, the lesson needs rethinking. No matter how you calculate the federal budget, we can afford to be our brother’s keeper. The real question is not whether but how we choose to be.

I know what you’re thinking – we can’t afford this! But yeah, we can, very easily. We raise the arbitrary lid on the Social Security contribution, which now stands at $127,200, and we raise taxes on corporate income, reversing the Reagan Revolution. These two steps solve a fake fiscal problem and create an economic surplus where we now can measure a moral deficit.
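For a sense of what lifting that lid would mean in practice, here is a minimal sketch, assuming the standard 6.2 per cent employee share of the Social Security (OASDI) payroll tax; the matching employer share is ignored for simplicity, and the sample incomes are hypothetical.

    # Sketch of the payroll-tax "lid" described above.
    OASDI_RATE = 0.062    # standard employee share of the Social Security tax
    WAGE_CAP = 127_200    # taxable maximum cited in the essay

    def ss_tax(wages: float, capped: bool = True) -> float:
        """Social Security tax owed on annual wages, with or without the cap."""
        taxable = min(wages, WAGE_CAP) if capped else wages
        return taxable * OASDI_RATE

    for wages in (50_000, 127_200, 500_000, 5_000_000):
        print(f"${wages:>9,}: ${ss_tax(wages):>9,.0f} with cap, "
              f"${ss_tax(wages, capped=False):>10,.0f} without")

Above the cap, the tax owed flatlines at roughly $7,900 a year no matter how large the income grows, which is why Livingston calls the lid arbitrary: removing it would make the levy proportional all the way up.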

Of course, you will say – along with every economist from Dean Baker to Greg Mankiw, Left to Right – that raising taxes on corporate income is a disincentive to investment and thus job creation. Or that it will drive corporations overseas, where taxes are lower.

But in fact raising taxes on corporate income can’t have these effects.

Let’s work backward. Corporations have been ‘multinational’ for quite some time. In the 1970s and ’80s, before Ronald Reagan’s signature tax cuts took effect, approximately 60 per cent of manufactured imported goods were produced offshore, overseas, by US companies. That percentage has risen since then, but not by much.

Chinese workers aren’t the problem – the homeless, aimless idiocy of corporate accounting is. That is why the Citizens United decision of 2010 applying freedom of speech regulations to campaign spending is hilarious. Money isn’t speech, any more than noise is. The Supreme Court has conjured a living being, a new person, from the remains of the common law, creating a real world more frightening than its cinematic equivalent: say, Frankenstein, Blade Runner or, more recently, Transformers.

But the bottom line is this. Most jobs aren’t created by private, corporate investment, so raising taxes on corporate income won’t affect employment. You heard me right. Since the 1920s, economic growth has happened even though net private investment has atrophied. What does that mean? It means that profits are pointless except as a way of announcing to your stockholders (and hostile takeover specialists) that your company is a going concern, a thriving business. You don’t need profits to ‘reinvest’, to finance the expansion of your company’s workforce or output, as the recent history of Apple and most other corporations has amply demonstrated.

So investment decisions by CEOs have only a marginal effect on employment. Taxing the profits of corporations to finance a welfare state that permits us to love our neighbours and to be our brothers’ keeper is not an economic problem. It’s something else – it’s an intellectual issue, a moral conundrum.

When we place our faith in hard work, we’re wishing for the creation of character; but we’re also hoping, or expecting, that the labour market will allocate incomes fairly and rationally. And there’s the rub: they do go together. Character can be created on the job only when we can see that there’s an intelligible, justifiable relation between past effort, learned skills and present reward. When I see that your income is completely out of proportion to your production of real value, of durable goods the rest of us can use and appreciate (and by ‘durable’ I don’t mean just material things), I begin to doubt that character is a consequence of hard work.

When I see, for example, that you’re making millions by laundering drug-cartel money (HSBC), or pushing bad paper on mutual fund managers (AIG, Bear Stearns, Morgan Stanley, Citibank), or preying on low-income borrowers (Bank of America), or buying votes in Congress (all of the above) – just business as usual on Wall Street – while I’m barely making ends meet from the earnings of my full-time job, I realise that my participation in the labour market is irrational. I know that building my character through work is stupid because crime pays. I might as well become a gangster like you.

That’s why an economic crisis such as the Great Recession is also a moral problem, a spiritual impasse – and an intellectual opportunity. We’ve placed so many bets on the social, cultural and ethical import of work that when the labour market fails, as it so spectacularly has, we’re at a loss to explain what happened, or to orient ourselves to a different set of meanings for work and for markets.

And by ‘we’ I mean pretty much all of us, Left to Right, because everybody wants to put Americans back to work, one way or another – ‘full employment’ is the goal of Right-wing politicians no less than Left-wing economists. The differences between them are over means, not ends, and those ends include intangibles such as the acquisition of character.

Which is to say that everybody has doubled down on the benefits of work just as it reaches a vanishing point. Securing ‘full employment’ has become a bipartisan goal at the very moment it has become both impossible and unnecessary. Sort of like securing slavery in the 1850s or segregation in the 1950s.

Why?

Because work means everything to us inhabitants of modern market societies – regardless of whether it still produces solid character and allocates incomes rationally, and quite apart from the need to make a living. It’s been the medium of most of our thinking about the good life since Plato correlated craftsmanship and the possibility of ideas as such. It’s been our way of defying death, by making and repairing the durable things, the significant things we know will last beyond our allotted time on earth because they teach us, as we make or repair them, that the world beyond us – the world before and after us – has its own reality principles.

Think about the scope of this idea. Work has been a way of demonstrating differences between males and females, for example by merging the meanings of fatherhood and ‘breadwinner’, and then, more recently, prying them apart. Since the 17th century, masculinity and femininity have been defined – not necessarily achieved – by their places in a moral economy, as working men who got paid wages for their production of value on the job, or as working women who got paid nothing for their production and maintenance of families. Of course, these definitions are now changing, as the meaning of ‘family’ changes, along with profound and parallel changes in the labour market – the entry of women is just one of those – and in attitudes toward sexuality.

When work disappears, the genders produced by the labour market are blurred. When socially necessary labour declines, what we once called women’s work – education, healthcare, service – becomes our basic industry, not a ‘tertiary’ dimension of the measurable economy. The labour of love, caring for one another and learning how to be our brother’s keeper – socially beneficial labour – becomes not merely possible but eminently necessary, and not just within families, where affection is routinely available. No, I mean out there, in the wide, wide world.

Work has also been the American way of producing ‘racial capitalism’, as the historians now call it, by means of slave labour, convict labour, sharecropping, then segregated labour markets – in other words, a ‘free enterprise system’ built on the ruins of black bodies, an economic edifice animated, saturated and determined by racism. There never was a free market in labour in these united states. Like every other market, it was always hedged by lawful, systematic discrimination against black folk. You might even say that this hedged market produced the still-deployed stereotypes of African-American laziness, by excluding black workers from remunerative employment, confining them to the ghettos of the eight-hour day.

And yet, and yet. Though work has often entailed subjugation, obedience and hierarchy (see above), it’s also where many of us, probably most of us, have consistently expressed our deepest human desire, to be free of externally imposed authority or obligation, to be self-sufficient. We have defined ourselves for centuries by what we do, by what we produce.

But by now we must know that this definition of ourselves entails the principle of productivity – from each according to his abilities, to each according to his creation of real value through work – and commits us to the inane idea that we’re worth only as much as the labour market can register, as a price. By now we must also know that this principle plots a certain course to endless growth and its faithful attendant, environmental degradation.

Until now, the principle of productivity has functioned as the reality principle that made the American Dream seem plausible. ‘Work hard, play by the rules, get ahead’, or, ‘You get what you pay for, you make your own way, you rightly receive what you’ve honestly earned’ – such homilies and exhortations used to make sense of the world. At any rate they didn’t sound delusional. By now they do.

Adherence to the principle of productivity therefore threatens public health as well as the planet (actually, these are the same thing). By committing us to what is impossible, it makes for madness. The Nobel Prize-winning economist Angus Deaton said something like this when he explained anomalous mortality rates among white people in the Bible Belt by claiming that they’ve ‘lost the narrative of their lives’ – by suggesting that they’ve lost faith in the American Dream. For them, the work ethic is a death sentence because they can’t live by it.

So the impending end of work raises the most fundamental questions about what it means to be human. To begin with, what purposes could we choose if the job – economic necessity – didn’t consume most of our waking hours and creative energies? What evident yet unknown possibilities would then appear? How would human nature itself change as the ancient, aristocratic privilege of leisure becomes the birthright of human beings as such?

Sigmund Freud insisted that love and work were the essential ingredients of healthy human being. Of course he was right. But can love survive the end of work as the willing partner of the good life? Can we let people get something for nothing and still treat them as our brothers and sisters – as members of a beloved community? Can you imagine the moment when you’ve just met an attractive stranger at a party, or you’re online looking for someone, anyone, but you don’t ask: ‘So, what do you do?’

We won’t have any answers until we acknowledge that work now means everything to us – and that hereafter it can’t.

The Age of Authoritarianism: Government of the Politicians, by the Military, for the Corporations


By John W. Whitehead

Source: The Rutherford Institute

“I was astonished, bewildered. This was America, a country where, whatever its faults, people could speak, write, assemble, demonstrate without fear. It was in the Constitution, the Bill of Rights. We were a democracy… But I knew it wasn’t a dream; there was a painful lump on the side of my head… The state and its police were not neutral referees in a society of contending interests. They were on the side of the rich and powerful. Free speech? Try it and the police will be there with their horses, their clubs, their guns, to stop you. From that moment on, I was no longer a liberal, a believer in the self-correcting character of American democracy. I was a radical, believing that something fundamental was wrong in this country—not just the existence of poverty amidst great wealth, not just the horrible treatment of black people, but something rotten at the root. The situation required not just a new president or new laws, but an uprooting of the old order, the introduction of a new kind of society—cooperative, peaceful, egalitarian.” ― Historian Howard Zinn

America is at a crossroads.

History may show that from this point forward, we will have left behind any semblance of constitutional government and entered into a militaristic state where all citizens are suspects and security trumps freedom.

Certainly, this is a time when government officials operate off their own inscrutable, self-serving playbook with little in the way of checks and balances, while American citizens are subjected to all manner of indignities and violations with little hope of defending themselves.

As I make clear in my book Battlefield America: The War on the American People, we have moved beyond the era of representative government and entered a new age—the age of authoritarianism. Even with its constantly shifting terrain, this topsy-turvy travesty of law and government has become America’s new normal.

Don’t believe me?

Let me take you on a brief guided tour, but prepare yourself. The landscape is particularly disheartening to anyone who remembers what America used to be.

The Executive Branch: Whether it’s the Obama administration’s war on whistleblowers, the systematic surveillance of journalists and regular citizens, the continued operation of Guantanamo Bay, or the occupation of Afghanistan, Barack Obama has surpassed his predecessors in terms of his abuse of the Constitution and the rule of law. President Obama, like many of his predecessors, has routinely disregarded the Constitution when it has suited his purposes, operating largely above the law and behind a veil of secrecy, executive orders and specious legal justifications. Rest assured that no matter who wins this next presidential election, very little will change. The policies of the American police state will continue.

The Legislative Branch:  It is not overstating matters to say that Congress may well be the most self-serving, semi-corrupt institution in America. Abuses of office run the gamut from elected representatives neglecting their constituencies to engaging in self-serving practices, including the misuse of eminent domain, earmarking hundreds of millions of dollars in federal contracting in return for personal gain and campaign contributions, having inappropriate ties to lobbyist groups and incorrectly or incompletely disclosing financial information. Pork barrel spending, hastily passed legislation, partisan bickering, a skewed work ethic, graft and moral turpitude have all contributed to the public’s increasing dissatisfaction with congressional leadership. No wonder 86 percent of Americans disapprove of the job Congress is doing.

The Judicial Branch: The Supreme Court was intended to be an institution established to intervene and protect the people against the government and its agents when they overstep their bounds. Yet through their deference to police power, preference for security over freedom, and evisceration of our most basic rights for the sake of order and expediency, the justices of the United States Supreme Court have become the guardians of the American police state in which we now live. As a result, sound judgment and justice have largely taken a back seat to legalism, statism and elitism, while preserving the rights of the people has been deprioritized and made to play second fiddle to both governmental and corporate interests.

Shadow Government: America’s next president will inherit more than a bitterly divided nation teetering on the brink of financial catastrophe when he or she assumes office. He or she will also inherit a shadow government, one that is fully operational and staffed by unelected officials who are, in essence, running the country. Referred to as the Deep State, this shadow government is comprised of unelected government bureaucrats, corporations, contractors, paper-pushers, and button-pushers who are actually calling the shots behind the scenes right now.

Law Enforcement: By and large the term “law enforcement” encompasses all agents within a militarized police state, including the military, local police, and the various agencies such as the Secret Service, FBI, CIA, NSA, etc. Having been given the green light to probe, poke, pinch, taser, search, seize, strip and generally manhandle anyone they see fit in almost any circumstance, all with the general blessing of the courts, America’s law enforcement officials, no longer mere servants of the people entrusted with keeping the peace but now extensions of the military, are part of an elite ruling class dependent on keeping the masses corralled, under control, and treated like suspects and enemies rather than citizens. In the latest move to insulate police from charges of misconduct, Virginia lawmakers are considering legislation to keep police officers’ names secret, ostensibly creating secret police forces.

A Suspect Surveillance Society: Every dystopian sci-fi film we’ve ever seen is suddenly converging into this present moment in a dangerous trifecta between science, technology and a government that wants to be all-seeing, all-knowing and all-powerful. By tapping into your phone lines and cell phone communications, the government knows what you say. By uploading all of your emails, opening your mail, and reading your Facebook posts and text messages, the government knows what you write. By monitoring your movements with the use of license plate readers, surveillance cameras and other tracking devices, the government knows where you go. By churning through all of the detritus of your life—what you read, where you go, what you say—the government can predict what you will do. By mapping the synapses in your brain, scientists—and in turn, the government—will soon know what you remember. And by accessing your DNA, the government will soon know everything else about you that they don’t already know: your family chart, your ancestry, what you look like, your health history, your inclination to follow orders or chart your own course, etc. Consequently, in the face of DNA evidence that places us at the scene of a crime, behavior sensing technology that interprets our body temperature and facial tics as suspicious, and government surveillance devices that cross-check our biometrics, license plates and DNA against a growing database of unsolved crimes and potential criminals, we are no longer “innocent until proven guilty.”

Military Empire: America’s endless global wars and burgeoning military empire—funded by taxpayer dollars—have depleted our resources, over-extended our military and increased our similarities to the Roman Empire and its eventual demise. The U.S. now operates approximately 800 military bases in foreign countries around the globe at an annual cost of at least $156 billion. The consequences of financing a global military presence are dire. In fact, David Walker, former comptroller general of the U.S., believes there are “striking similarities” between America’s current situation and the factors that contributed to the fall of Rome, including “declining moral values and political civility at home, an over-confident and over-extended military in foreign lands and fiscal irresponsibility by the central government.”

I haven’t even touched on the corporate state, the military industrial complex, SWAT team raids, invasive surveillance technology, zero tolerance policies in the schools, overcriminalization, or privatized prisons, to name just a few, but what I have touched on should be enough to show that the landscape of our freedoms has already changed dramatically from what it once was and will no doubt continue to deteriorate unless Americans can find a way to wrest back control of their government and reclaim their freedoms.

That brings me to the final and most important factor in bringing about America’s shift into authoritarianism: “we the people.” We are the government. Thus, if the government has become a tyrannical agency, it is because we have allowed it to happen, either through our inaction or our blind trust.

Essentially, there are four camps of thought among the citizenry when it comes to holding the government accountable. Which camp you fall into says a lot about your view of government—or, at least, your view of whichever administration happens to be in power at the time.

In the first camp are those who trust the government to do the right thing, despite the government’s repeated failures in this department. In the second camp are those who not only don’t trust the government but think the government is out to get them. In the third camp are those who see government neither as an angel nor a devil, but merely as an entity that needs to be controlled, or as Thomas Jefferson phrased it, bound “down from mischief with the chains of the Constitution.”

Then there’s the fourth camp, comprised of individuals who pay little to no attention to the workings of government, so much so that they barely vote, let alone know who’s in office. Easily entertained, easily distracted, easily led, these are the ones who make the government’s job far easier than it should be.

It is easy to be diverted, distracted and amused by the antics of the presidential candidates, the pomp and circumstance of awards shows, athletic events, and entertainment news, and the feel-good evangelism that passes for religion today. What is far more difficult to face up to is the reality of life in America, where unemployment, poverty, inequality, injustice and violence by government agents are increasingly norms.

The powers-that-be want us to remain divided, alienated from each other based on our politics, our bank accounts, our religion, our race and our value systems. Yet as George Orwell observed, “The real division is not between conservatives and revolutionaries but between authoritarians and libertarians.”

The only distinction that matters anymore is where you stand in the American police state. In other words, you’re either part of the problem or part of the solution.

There is something extraordinary happening in the world


By Gustavo Tanaka

Source: Medium

A few months ago, I freed myself from society. I released myself from the attachments and fears that locked me into the system. And since then, I have started seeing the world from a different perspective: the perspective that everything is changing, and that most of us have not even realized it.

Why is the world changing? In this post, I’ll list the reasons that lead me to believe this.

1 — No one can stand the employment model anymore.

Everyone is reaching their own limit. People who work in big corporations can’t handle their jobs anymore. The lack of purpose starts knocking on each person’s door like a desperate scream from the heart.

People want to escape. They want to leave everything behind. Look at how many people are trying to become entrepreneurs, going on sabbaticals, getting depressed in their jobs, burning out.

2 — The entrepreneurship model is also changing

A few years ago, with the explosion of startups, thousands of entrepreneurs ran to their garages to create their billion-dollar ideas. The glory was getting funded by an investor. An investor’s money in hand felt just like winning the World Cup.

But what happens after you get funded?

You become an employee again. You take on people who are not aligned with your dream, who don’t give a damn about the purpose, and everything turns into money. Financial return becomes the main driver.

Many people are suffering from this. Brilliant startups fail because the model of chasing money never ends.

We need a new model of entrepreneurship.

And there are already many good people doing this.

3 — The rise of collaboration

Many people have already realized that it makes no sense to go it alone. Many are waking up to how crazy the mentality of “making it on your own” really is.

Stop, take a step back, and think. Isn’t it absurd that we, seven billion people living on the same planet, are so separated from one another? What sense does it make for you and the thousands (or millions) of people living in the same city to turn your backs on each other? Every time I think of that, I get kind of depressed.

But fortunately, things are changing. All the movements of the sharing and collaborative economy point in this direction: the rise of collaboration, sharing, helping, lending a hand, coming together.

It is beautiful. It brings tears to my eyes.

4 — We are finally starting to understand what the internet is

The internet is an incredibly spectacular thing, and only now, after so many years, are we understanding its power. With the internet, the world opens up: barriers fall, separation ends, union begins, collaboration explodes, help emerges.

Some nations have made revolutions with the internet, as in the Arab Spring. In Brazil, we are just starting to make better use of this magnificent tool.

The internet is taking down mass control. No longer do television and a handful of newspapers decide which news we get to read. You can go after whatever you want and relate to whomever you want. You can explore whatever you want, whenever you want.

With the internet, the small start to get a voice. The anonymous become known. The world gets united. And the system may fall.

5 — The fall of exaggerated consumption

For many years, we were manipulated and stimulated to consume like maniacs: to buy everything launched on the market, to have the newest car, the latest iPhone, the best brands, lots of clothes, lots of shoes, lots of everything.

But many people have already understood that this makes no sense at all. Movements such as lowsumerism, slow living, and slow food are starting to show us that we have organized ourselves in the most absurd way possible.

Fewer and fewer people are driving cars or buying heaps of stuff; more and more are trading clothes, donating, buying secondhand, and sharing goods, cars, apartments, and offices.

We need nothing of what they told us we needed.

And this consciousness can break any corporation that depends on exaggerated consumption.

6 — Healthy and organic eating

We were so crazy that we accepted eating any kind of garbage. As long as it tasted good, that was OK.

We were so disconnected that when they started adding poison to our food, we didn’t say anything.

But then some people started to wake up, lending strength to movements for healthy eating and organic consumption.

And this is going to be huge.

But what does this have to do with the economy and work? Everything!

The production of food is the basis of our society, and the food industry is one of the most important in the world. If consciousness changes, our eating habits change, consumption changes, and the big corporations must follow.

The small farmer is starting to regain strength, as are the people who grow their own food.

And that changes the whole economy.

7- The awakening of spirituality

How many of your friends practice yoga? What about meditation?

How many were doing these things 10 years ago?

For many years, spirituality was a thing for esoteric people, those weird mystics.

But fortunately, this too is changing. We have reached the limit of our rationality and can see that with the rational mind alone we cannot understand everything that happens here. There is something more going on, and I know you want to understand it.

You want to understand how things work here: how life operates, what happens after death, what this energy people talk about so much really is, what quantum physics is, how thoughts can become things and create our reality, what coincidences and synchronicities are, why meditation works, how healing with the hands is possible, and what to make of those alternative therapies that medicine does not endorse but that work.

Companies are offering meditation to their employees. Schools are teaching meditation to kids.

8- Unschooling movements

Who created this teaching model? Who chose the classes you have to take? Who chose the lessons we learn in history classes? Why didn’t they teach us the truth about other ancient civilizations?

Why should kids obey rules? Why should they watch everything in silence? Why should they wear uniforms?

Why take a test to prove that you learned?

We created a model that forms followers of the system, that prepares people to be ordinary human beings.

But fortunately, there are many people working to change this, through movements like unschooling, hackschooling, and homeschooling.

Maybe you have never thought about any of this and are shocked by the points I’m listing here.

But all these things are happening.

Silently, people are awakening and realizing how crazy it is to live in this society.

Look at all these movements and try to think everything is normal.

I don’t think it is.

There is something extraordinary happening.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Gustavo Tanaka is a Brazilian author and entrepreneur, trying with his friends to create a new model, a new system, and maybe to help build a new economy.

Skynet Ascendant


By Cory Doctorow

Source: Locus Online

As I’ve written here before, science fiction is terrible at predicting the future, but it’s great at predicting the present. SF writers imagine all the futures they can, and these futures are processed by a huge, dynamic system consisting of editors, booksellers, and readers. The futures that attain popular and commercial success tell us what fears and aspirations for technology and society are bubbling in our collective imaginations.

When you read an era’s popular SF, you don’t learn much about the future, but you sure learn a lot about the past. Fright and hope are the inner and outer boundaries of our imagination, and the stories that appeal to either are the parameters of an era’s political reality.

Pay close attention to the impossibilities. When we find ourselves fascinated by faster than light travel, consciousness uploading, or the silly business from The Matrix of AIs using human beings as batteries, there’s something there that’s chiming with our lived experience of technology and social change.

Postwar SF featured mass-scale, state-level projects, a kind of science fictional New Deal. Americans and their imperial rivals built cities in space, hung skyhooks in orbit, even made Dyson Spheres that treated all the Solar System’s matter as the raw material for a new, human-optimized megaplanet/space-station that would harvest every photon put out by our sun and put it to work for the human race.

Meanwhile, the people buying these books were living in an era of rapid economic growth, and even more importantly, the fruits of that economic growth were distributed to the middle class as well as to society’s richest. This was thanks to nearly unprecedented policies that protected tenants at the expense of landlords, workers at the expense of employers, and buyers at the expense of sellers. How those policies came to be enacted is a question of great interest today, even as most of them have been sunsetted by successive governments across the developed world.

Thomas Piketty’s data-driven economics bestseller Capital in the Twenty-First Century argues that the vast capital destruction of the two World Wars (and the chaos of the interwar years) weakened the grip of the wealthy on the governments of the world’s developed states. The arguments in favor of workplace safety laws, taxes on capital gains, and other policies that undermined the wealthy and benefited the middle class were not new. What was new was the political possibility of these ideas.

As developed nations’ middle classes grew, so did their material wealth, political influence, and expectations that governments would build ambitious projects like interstate highways and massive civil engineering works. These were politically popular – because lawmakers could use them to secure pork for their voters – and also lucrative for government contractors, making ‘‘Big Government’’ a rare point of agreement between the rich and middle-income earners.

(A note on poor people: Piketty’s data suggests that the share of the national wealth controlled by the bottom 50% has not changed much for several centuries – eras of prosperity are mostly about redistributing from the top 10-20% to the next 30-40%.)

Piketty hypothesizes that the returns on investment are usually greater than the rate of growth in an economy. The best way to get rich is to start with a bunch of money that you turn over to professional managers to invest for you – all things being equal, this will make you richer than you could get by inventing something everyone uses and loves. For example, Piketty contrasts Bill Gates’s fortunes as the founder of Microsoft, once the most profitable company in the world, with Gates’s fortunes as an investor after his retirement from the business. Gates-the-founder made a lot less by creating one of the most successful and profitable products in history than he did when he gave up making stuff and started owning stuff for a living.
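
To see the mechanics of that claim, here is a minimal sketch in Python of a fortune compounding at the rate of return on capital while the economy compounds at the growth rate; the rates are round numbers assumed for illustration, not Piketty’s actual estimates:

```python
# Toy r > g illustration; both rates are assumed, not Piketty's data.
r = 0.05   # annual return on invested capital (assumed)
g = 0.015  # annual growth rate of the economy (assumed)

capital = 100.0  # a fortune, in arbitrary units
economy = 100.0  # total economic output, same units

for year in range(30):
    capital *= 1 + r   # wealth compounds at r
    economy *= 1 + g   # incomes compound at g

# With these numbers, capital grows about 4.3x while the economy grows
# about 1.6x, so the fortune's relative weight nearly triples.
print(f"capital: {capital:.0f}, economy: {economy:.0f}, "
      f"relative weight: {capital / economy:.2f}x")
```

The gap widens every year the money is left to run, which is Piketty’s point: patient ownership beats productive invention over any long stretch.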

By the early 1980s, the share of wealth controlled by the top decile tipped over to the point where they could make their political will felt again – again, Piketty supports this with data showing that nations elect seriously investor-friendly/worker-unfriendly governments when investors gain control over a critical percentage of the national wealth. Leaders like Reagan, Thatcher, Pinochet, and Mulroney enacted legislative reforms that reversed the post-war trend, dismantling the rules that had given skilled workers an edge over their employers – and the investors the employers served.

The greed-is-good era was also the cyberpunk era of literary globalized corporate dystopias. Even though Neuromancer and Mirrorshades predated the anti-WTO protests by a decade and a half, they painted similar pictures. Educated, skilled people – people who comprised the mass of SF buyers – became a semi-disposable underclass in a world where the hyperrich had literally ascended to the heavens, living in orbital luxury hotels and harvesting wealth from the bulk of humanity like whales straining krill.

Seen in this light, the vicious literary feuds between the cyberpunks and the old guard of space-colonizing stellar engineer writers were a struggle over our political imagination. If we crank the state’s dials all the way over to the right, favoring the industrialist ‘‘job creators’’ to the exclusion of others, will we find our way to the stars by way of trickle-down, or will the overclass graft their way into a decadent New Old Rome, where reality TV and hedge fund raids consume the attention and work we once devoted to exploring our solar system?

Today, wealth disparity consumes the popular imagination and political debates. The front-running science fictional impossibility of the unequal age is rampant artificial intelligence. There were a lot of SF movies produced in the mid-eighties, but few retain the currency of the Terminator and its humanity-annihilating AI, Skynet. Everyone seems to thrum when that chord is plucked – even the NSA named one of its illegal mass surveillance programs SKYNET.

It’s been nearly 15 years since the Matrix movies debuted, but the Red Pill/Blue Pill business still gets a lot of play, and young adults who were small children when Neo fought the AIs know exactly what we mean when we talk about the Matrix.

Stephen Hawking, Elon Musk, and other luminaries have issued panicked warnings about the coming age of humanity-hating computerized overlords. We dote on the party tricks of modern AIs, sending half-admiring/half-dreading laurels to the Watson team when it manages to win at Jeopardy or random-walk its way into a new recipe.

The fear of AIs is way out of proportion to their performance. The Big Data-trawling systems that are supposed to find terrorists or figure out what ads to show you have been a consistent flop. Facebook’s new growth model is sending a lot of Web traffic to businesses whose Facebook followers are increasing, waiting for them to shift their major commercial strategies over to Facebook marketing, then turning off the traffic and demanding recurring payments to send it back – a far cry from using all the facts of your life to figure out that you’re about to buy a car before even you know it.

Google’s self-driving cars can only operate on roads that humans have mapped by hand, manually marking every piece of street-furniture. The NSA can’t point to a single terrorist plot that mass-surveillance has disrupted. Ad personalization sucks so hard you can hear it from orbit.

We don’t need artificial intelligences that think like us, after all. We have a lot of human cognition lying around, going spare – so much that we have to create listicles and other cognitive busy-work to absorb it. An AI that thinks like a human is a redundant vanity project – a thinking version of the ornithopter, a useless mechanical novelty that flies like a bird.

We need machines that don’t fly like birds. We need AI that thinks unlike humans. For example, we need AIs that can be vigilant for bomb-parts on airport X-rays. Humans literally can’t do this. If you spend all day looking for bomb-parts but finding water bottles, your brain will rewire your neurons to look for water bottles. You can’t get good at something you never do.

What does the fear of futuristic AI tell us about the parameters of our present-day fears and hopes?

I think it’s corporations.

We haven’t made Skynet, but we have made these autonomous, transhuman, transnational technologies whose bodies are distributed throughout our physical and economic reality. The Internet of Things version of the razorblade business model (sell cheap handles, use them to lock people into buying expensive blades) means that the products we buy treat us as adversaries, checking to see if we’re breaking the business logic of their makers and self-destructing if they sense tampering.

Corporations run on a form of code – financial regulation and accounting practices – and the modern version of this code literally prohibits corporations from treating human beings with empathy. The principle of fiduciary duty to investors means that where there is a chance to make an investor richer while making a worker or customer miserable, management is obliged to side with the investor, so long as the misery doesn’t backfire so much that it harms the investor’s quarterly return.

We humans are the inconvenient gut-flora of the corporation. They aren’t hostile to us. They aren’t sympathetic to us. Just as every human carries a hundred times more non-human cells in her gut than she has in the rest of her body, every corporation is made up of many separate living creatures that it relies upon for its survival, but which are fundamentally interchangeable and disposable for its purposes. Just as you view stray gut-flora that attacks you as a pathogen and fight it off with antibiotics, corporations attack their human adversaries with an impersonal viciousness that is all the more terrifying for its lack of any emotional heat.

The age of automation gave us stories like Chaplin’s Modern Times, and the age of multinational hedge-fund capitalism made The Matrix into an enduring parable. We’ve gone from being cogs to being a reproductive agar within which new corporations can breed. As Mitt Romney reminded us, ‘‘Corporations are people.’’

Wall Street Panic


By Mike Whitney

Source: Counterpunch

“Not only is the equity market at the second most overvalued point in U.S. history, it is also more leveraged against probable long-term corporate cash flows than at any previous point in history.”

— John P. Hussman, Ph.D., “Debt-Financed Buybacks Have Quietly Placed Investors On Margin”, Hussman Funds

“This year feels like the last days of Pompeii: everyone is wondering when the volcano will erupt.”

— Senior banker commenting to the Financial Times

Last Friday’s stock market bloodbath was the worst one-day crash since 2008. The Dow Jones dropped 531 points, the S&P 500 fell 64, and the tech-heavy Nasdaq slid 171. The Dow lost more than 1,000 points on the week, dipping back into the red for the year. At the same time, commodities continued to get hammered, with oil prices briefly dropping below the critical $40-per-barrel mark. More tellingly, the market’s so-called “fear gauge” (the VIX) skyrocketed to a 2015 high, indicating more volatility to come. The VIX had remained at unusually low levels for a number of years as investors grew more complacent, figuring the Fed would intervene whenever stocks fell too far. But last week’s massacre cast doubt on the central bank’s intentions. Will the Fed ride to the rescue again or not? To the vast majority of institutional investors, who now base their buying decisions on Fed policy rather than market fundamentals, that is the crucial question.

Ostensibly, last week’s selloff was triggered by China’s unexpected decision to devalue its currency, the yuan. The announcement confirmed that the world’s second biggest economy is rapidly cooling off, increasing the likelihood of a global slowdown. Over the last decade, China has accounted “for a third of the expansion in the global economy,… almost double the contribution of the US and more than triple the impacts of Europe and Japan.” Fears of a slowdown intensified greatly on Friday when a survey showed that manufacturing in China shrank at the fastest pace since the recession in 2009. That’s all it took to put the global markets into a nosedive. According to the World Socialist Web Site:

“The deceleration of growth in China, reflected in figures on production, exports and imports, business investment and producer prices, is fueling a near-collapse in so-called “emerging market” economies that depend on the Chinese market for exports of raw materials. The past week saw a further plunge in stock prices and currency rates in Russia, Turkey, Brazil, South Africa and other countries. These economies are being hit by a massive outflow of capital, placing in doubt their ability to meet debt obligations.”

(“Panic sell-off on world financial markets”, World Socialist Web Site)

While a correction was not entirely unexpected following a 6-year bull market, the sudden drop in equities does have analysts rethinking the effectiveness of the Fed’s monetary policies, which have had little impact on personal consumption, retail spending, wages, productivity, household income, or economic growth, all of which remain weaker than after any recession in the postwar era. For all intents and purposes, the plan to inflate asset prices by dropping rates to zero and injecting trillions in liquidity into the financial system has been an abject failure. GDP continues to hover at an abysmal 1.5% while signs of a strong, self-sustaining recovery are nowhere to be seen. At the same time, government and corporate debt continue to balloon at a near-record pace, draining capital away from productive investments that could lay the groundwork for higher employment and stronger growth.

What’s so odd about last week’s market action is that the bad news on China put shares into a tailspin instead of sending them into the stratosphere, which has been the pattern for the last four years. In fact, the reason volatility has stayed so low and investors have grown so complacent is that every announcement of bad economic data has been followed by cheery promises from the Fed to keep the easy-money sluicegates open until the storm passes. That hasn’t been the case this time; in fact, Fed chair Janet Yellen hasn’t even scrapped the idea of jacking up rates sometime in September, which is almost unthinkable given last week’s market ructions.

Why? What’s changed? Surely, Yellen isn’t going to sit back and let six years of stock market gains be wiped out in a few sessions, is she? Or is there something we’re missing here that is beyond the Fed’s powers to change? Is that it?

My own feeling is that China is not the real issue. Yes, it is the catalyst for the selloff, but the real problem is in the credit markets, where the spreads on high-yield bonds continue to widen relative to US Treasuries.

What does that mean?

It means the price of capital is going up, and when the price of capital goes up, it costs more for businesses to borrow. And when it costs more for businesses to borrow, they reduce their borrowing, which decreases the demand for credit. And when the demand for credit decreases in a credit-based system, then there’s a corresponding slowdown in business investment, which impacts stock prices and growth. And that is particularly significant now, since the bulk of corporate investment is being diverted into stock buybacks. Check out this excerpt from a post at Wall Street on Parade:

“According to data from Bloomberg, corporations have issued a stunning $9.3 trillion in bonds since the beginning of 2009. The major beneficiary of this debt binge has been the stock market rather than investment in modernizing the plant, equipment or new hires to make the company more competitive for the future. Bond proceeds frequently ended up buying back shares or boosting dividends, thus elevating the stock market on the back of heavier debt levels on corporate balance sheets.

Now, with commodity prices resuming their plunge and currency wars spreading, concerns of financial contagion are back in the markets and spreads on corporate bonds versus safer, more liquid instruments like U.S. Treasury notes are widening in a fashion similar to the warning signs heading into the 2008 crash. The $2.2 trillion junk bond market (high-yield) as well as the investment grade market have seen spreads widen as outflows from Exchange Traded Funds (ETFs) and bond funds pick up steam.” (“Keep Your Eye on Junk Bonds: They’re Starting to Behave Like ’08”, Wall Street on Parade)
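
To put rough numbers on the squeeze described above, here is a small Python sketch; every figure in it is hypothetical, chosen only to show how a widening spread raises the cost of rolling over corporate debt:

```python
# Hypothetical figures only; none are drawn from the article.
treasury_yield = 0.022  # assumed yield on Treasury notes
spread_before = 0.045   # assumed junk-bond spread before repricing
spread_after = 0.065    # assumed spread after credit markets reprice

debt = 1_000_000_000    # $1 billion of bonds to roll over

interest_before = debt * (treasury_yield + spread_before)
interest_after = debt * (treasury_yield + spread_after)

print(f"annual interest before: ${interest_before:,.0f}")  # $67,000,000
print(f"annual interest after:  ${interest_after:,.0f}")   # $87,000,000
print(f"extra yearly cost:      ${interest_after - interest_before:,.0f}")
```

On these assumed numbers, that is an extra $20 million a year in interest, money that can no longer fund buybacks, which is exactly the squeeze on repurchases described next.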

As you can see, the nation’s corporations don’t borrow at zero rates from the Fed. They borrow at market rates in the bond market, and those rates are gradually inching up. And while that hasn’t slowed the stock buyback craze so far, the clock is quickly running out. We are fast approaching the point where debt servicing, shrinking revenues, too much leverage, and higher rates will no longer make stock repurchases a sensible option, at which point stocks are going to fall off a cliff. Here’s more from Andrew Ross Sorkin at the New York Times:

“Since 2004, companies have spent nearly $7 trillion purchasing their own stock — often at inflated prices, according to data from Mustafa Erdem Sakinc of the Academic-Industry Research Network. That amounts to about 54 percent of all profits from Standard & Poor’s 500-stock index companies between 2003 and 2012, according to William Lazonick, a professor of economics at the University of Massachusetts Lowell.”

You can see the game that’s being played here. Mom-and-pop investors are getting fleeced again. They’ve been lending trillions of dollars to corporate CEOs (via bond purchases) who’ve taken the money, split it up among themselves and their wealthy shareholder buddies (through buybacks and dividends, neither of which adds a thing to a company’s productive capacity), and made out like bandits. This, in essence, is how stock buybacks work. Ordinary working people stick their life savings into bonds (because they were told “stocks are risky, but bonds are safe”) that offer a slightly better return than ultra-safe, low-yield government debt (US Treasuries) and, in doing so, provide lavish rewards for scheming executives, who use the money to shower themselves and their cutthroat shareholders with windfall profits that will never be repaid. When analysts talk about “liquidity issues” in the bond market, what they really mean is that they’ve already divvied up the money among themselves and you’ll be lucky if you ever see a dime of it back. Sound familiar?

Of course it does. The same thing happened before the Crash of ’08. Now we are reaching the end of the credit cycle, which could produce the same result. According to one analyst:

“There’s been worrying deterioration in the overall global demand picture with the continuation of the EM (emerging markets) FX (currency markets) onslaught, deterioration in credit metrics with rising leverage in the US, as well as outflows in credit funds in conjunction with significant widening in credit spreads… The goldilocks period of the ‘low rates volatility, stable carry trade’ environment of the last couple of years is likely coming to an end.”

(“Credit: Magical Thinking”, Macronomics)

In other words, the good times are behind us while hard times are just ahead. And while the end of the credit cycle doesn’t always signal a stock market crash, the massive buildup of leverage in unproductive financial assets like buybacks suggests that equities are in line for a serious whooping. Here’s more from Bloomberg:

“Credit traders have an uncanny knack for sounding alarm bells well before stocks realize there’s a problem. This time may be no different. Investors yanked $1.1 billion from U.S. investment-grade bond funds last week, the biggest withdrawal since 2013, according to data compiled by Wells Fargo & Co…..

“Credit is the warning signal that everyone’s been looking for,” said Jim Bianco, founder of Bianco Research LLC in Chicago. “That is something that’s been a very good leading indicator for the past 15 years.”

Bond buyers are less interested in piling into notes that yield a historically low 3.4 percent at a time when companies are increasingly using the proceeds for acquisitions, share buybacks and dividend payments. Also, the Federal Reserve is moving to raise interest rates for the first time since 2006, possibly as soon as next month, ending an era of unprecedented easy-money policies that have suppressed borrowing costs….

“Unlike the credit market, the equity market well into 2008 was very complacent about the subprime crisis that led to a full blown financial crisis,” the analysts wrote…..

So if you’re very excited about buying stocks right now, just beware of the credit traders out there who are sending some pretty big warning signs.”  (“U.S. Credit Traders Send Warning Signal to Rest of World Markets”, Bloomberg)

It’s worth noting that the above article was written on August 14, a week before the stock market blew up. But credit was “flashing red” long before stock traders ever took notice.

But that’s beside the point. Whether the troubles started with China or the credit markets probably doesn’t matter. What matters is that the system is about to be put to the test once again: because the appropriate safeguards haven’t been put in place, because bubbles are unwinding, and because the policymakers who were supposed to monitor and regulate the system decided they were more interested in shifting wealth to their voracious colleagues on Wall Street than in building a strong foundation for a healthy economy. That’s why a simple correction could turn into something much worse.

NOTE: As of posting time, Sunday night, the Nikkei index is down 710, Shanghai down 296, and the HSI down 1,031. US equity futures are all deep in the red.

MIKE WHITNEY lives in Washington state. He is a contributor to Hopeless: Barack Obama and the Politics of Illusion (AK Press). Hopeless is also available in a Kindle edition. He can be reached at fergiewhitney@msn.com.

Rushkoff on the Economy


I’ve been reading Douglas Rushkoff’s “Present Shock: When Everything Happens Now” and have, by coincidence, just reached a chapter of the book covering currencies and the economy, just as Washington D.C. attempts to avoid another default. I found similar writing from Rushkoff on the same topic in two articles published by Arthur Magazine. As can be seen from these excerpts, they’re helpful for understanding our current situation:

Local currencies favored local transactions, and worked against the interests of large corporations working from far away. In order to secure their own position as well as that of their chartered monopolies, monarchs began to make local currencies illegal, and force locals to instead use “coin of the realm.” These centralized currencies worked the opposite way. They were not earned into existence, they were lent into existence by a central bank. This meant any money issued to a person or business had to be paid back to the central bank, with interest.

What does that do to an economy? It bankrupts it. Think of it this way: A business borrows 1000 dollars from the bank to get started. In ten years, say, it is supposed to pay back 2000 to the bank. Where does the other 1000 come from? Some other business that has borrowed 1000 from the bank. For one business to pay back what it owes, another must go bankrupt. That, or borrow yet another 1000, and so on.
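
Rushkoff’s arithmetic is easy to check directly. In this minimal Python sketch, the 1000-dollar loan and the 2000-dollar repayment are his own figures; the number of businesses is an assumption added here for illustration:

```python
# Rushkoff's example: each business borrows 1000 and must repay 2000.
# The principal is lent into existence; the interest never is.
principal = 1000
repayment = 2000   # principal plus interest after ten years

businesses = 10    # assumed number of borrowers in the economy

money_in_circulation = businesses * principal  # all money that exists
total_owed = businesses * repayment            # what the bank is due

shortfall = total_owed - money_in_circulation
print(f"money lent into existence: {money_in_circulation}")  # 10000
print(f"total owed to the bank:    {total_owed}")            # 20000
print(f"shortfall:                 {shortfall}")             # 10000
```

Half of what is owed simply does not exist, so in aggregate it can only be covered by new borrowing or by bankruptcies, exactly as the passage says.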

An economy based on an interest-bearing centralized currency must grow to survive, and this means extracting more, producing more and consuming more. Interest-bearing currency favors the redistribution of wealth from the periphery (the people) to the center (the corporations and their owners). Just sitting on money—capital—is the most assured way of increasing wealth. By the very mechanics of the system, the rich get richer on an absolute and relative basis.

The biggest wealth generator of all was banking itself. By lending money at interest to people and businesses who had no other way to conduct transactions or make investments, banks put themselves at the center of the extraction equation. The longer the economy survived, the more money would have to be borrowed, and the more interest earned by the bank.

[…] Commerce is good. Commerce is not the problem. Monopolies are.

Except in a few rare cases, corporate charters and centralized currency were never intended to promote commerce. They were intended to prevent locals and non-chartered entities from creating and exchanging value. They are not extensions of the free market, but efforts at extracting value from the free market. Corporate monopoly charters were extended to a king’s favorite companies in return for shares. Then, no one else was allowed to do business in that industry. Centralized currency forced businesses to run their revenue through the king’s coffers. Likewise, in its current form, centralized currency is more akin to a Ponzi scheme of interest rates, each borrower paying up to the banker above him.

Both of these innovations—corporate charters and centralized currency—tend towards resource exploitation rather than innovation. They are extractive in nature, not productive. And, more importantly, these particular innovations cause wealth to end up being generated through speculation rather than creation. They cause scarcity, not abundance. Over time, it becomes easier to make money by having money than by doing anything. And this was the pure, stated intent of centralized currency and banking in the early Renaissance: to keep the wealthy wealthy, in the face of a rising merchant class.

This isn’t some extremist perspective. It’s just historical fact, though largely forgotten and seemingly refuted by our collective false memory of the Renaissance’s greatness. If you’re interested in finding out more about this, or seeing the evidence on which my research is based, take a look at the best historians writing about the era: Fernand Braudel (The Wheels of Commerce: Civilization and Capitalism, 15th–18th Century, Volume 2, Univ. of California Press, 1992), Carlo M. Cipolla (Before the Industrial Revolution: European Society and Economy, 1000–1700, WW Norton, 1994), or Bernard A. Lietaer, whose book On Human Wealth used to be available for free download from his site, but doesn’t seem to be anymore. In these books, you can find out about the sustainable local economic systems of the Late Middle Ages, learn that the Black Plague actually began after mandated centralized currency had impoverished Europe, and find support for my contention that cathedrals were built with local money before the Renaissance, not Vatican money during the Renaissance.

I highly recommend checking out both articles here (as well as his most recent book “Present Shock”):

http://arthurmag.com/2009/03/16/let-it-die-rushkoff-on-the-economy/

http://arthurmag.com/2009/03/23/hack-money-hack-banking-rushkoff-on-the-economy/

More voices of sanity (Nicole Voss and Laurence Boomert) calling for an overhaul of the monetary system can be heard on the C-Realm podcast.
