After the Crash

Dispatches From a Long Recovery (Est. 10/2024)

A Monster Eating the Nation

By James Howard Kunstler

Source: Kunstler.com

Is there any question now that the Deep State is preparing to expel President Donald Trump from the body politic like a necrotic organ? The Golden Golem of Greatness has floundered pretty badly on the job, it’s true, but his mighty adversaries in the highly politicized federal agencies want him to fail spectacularly, and fast. They have a lot of help from the NY Times / WashPo / CNN axis of hysteria, as well as from such slippery swamp creatures as Lindsey Graham.

There are more problematic layers in this matter than in a Moldavian wedding cake. America has been functionally ungovernable for quite a while, well before Trump arrived on the scene. His predecessor managed to misdirect the nation’s attention from the cumulative dysfunction with sheer charm and supernatural placidity — NoDrama Obama. But there were a few important things he could have accomplished as chief exec, such as directing his attorney general to prosecute Wall Street crime (or fire the attorney general and replace him with someone willing to do the job). He could have broken up the giant TBTF banks. He could have aggressively sponsored legislation to overcome the Citizens United SCOTUS decision (unlimited corporate money in politics) by redefining corporate “citizenship.” Stuff like that. But he let it slide, and the nation slid with him down a greasy chute of political collapse.

Which we find embodied in Trump, a sort of tragicomic figure who manages to compound all of his weaknesses of character with a childish impulsiveness that scares folks. It is debatable whether he has simply been rendered incompetent by the afflictions heaped on by his adversaries, or if he is just plain incompetent in, say, the 25th Amendment way. I think we’ll find out soon enough, because impeachment is a very long and arduous path out of this dark place.

The most curious feature of the current crisis, of course, is the idiotic Russia story that has been the fulcrum for levering Trump out of the White House. This was especially funny the past week with the episode involving Russian Foreign Minister Lavrov and Ambassador Kislyak conferring with Trump in the White House about aviation security around the Middle East. The media and the Lindsey Graham wing of the Deep State acted as if Trump had entertained Focalor and Vepar, the Dukes of Hell, in the Oval Office.

Why do you suppose nations employ foreign ministers and ambassadors, if not to conduct conversations at the highest level with other national leaders? And might these conversations include matters of great sensitivity, that is, classified information? If you doubt that then you have no understanding of geopolitics or history.

The General Mike Flynn story is a particular crack-up. Did he accept a twenty-thousand-dollar speaking fee from the Russian news outlet RT in his interlude as a private citizen? How does that compare to the millions sucked in by the Clinton Foundation in pay-to-play deals when Madame was secretary of state? Or her six-figure speeches to Goldman Sachs and their ilk? Are private citizens forbidden to accept speaking fees or consulting fees from countries that we are not at war with? I’d like to know how many other alumni of the Bill Clinton, Bush-II, and Obama admins have hired themselves out on this basis. Scores and scores, I would bet.

Trump’s adversaries might not get any traction on the Russia story, but they may enrage the rogue elephant Trump enough in the process that he will appear sufficiently incompetent to run him over with the 25th Amendment, and I think that is the plan for now. Of course, there are some jokers in the deck. A really striking one is the story of DNC staffer Seth Rich, murdered last July. He was shot in the back on the street outside his apartment one night by persons as yet unknown, and twelve days later over 40,000 DNC emails landed at Wikileaks. His laptop is reportedly in the possession of the DC cops — if it hasn’t been dumped in the Potomac. I’m generally allergic to conspiracy theories, but this looks like an especially ugly story, which might ultimately be clarified if, or when, Julian Assange of Wikileaks ever divulges the source of that data dump. Anyway, the new Special Counsel at the DOJ, former FBI Director Robert Mueller, may have to venture down that dark trail.

One way or another, though, the Deep State is determined to drive Trump from office. In the final rounds of this struggle, Trump might conceivably undertake a sudden swamp-draining operation: the firing of a great many politicized Intelligence Community officers, especially the ones legally culpable for leaking classified information to media — another area that Mr. Mueller could also shine a light on. The colossal security apparatus of this country — especially the fairly new giant NSA — has become a monster eating America. Somebody needs to cut it down to size. Perhaps that’s the Deep State’s main motive in moving heaven and earth to dump Trump.

When they do, of course, they are liable to foment an insurrection every bit as ugly as the dust-up that followed the shelling of Fort Sumter. Trump, whatever you think of him — and I’ve never been a fan, to put it mildly — was elected for a reason: the ongoing economic collapse of the nation, and the suffering of a public without incomes or purposeful employment. That part of the common weal is likely to whirl completely down the drain later this year in something like a currency crisis or a depressionary market meltdown engineered by yet another Deep State player, the Federal Reserve. That and the ejection of Trump could coincide with disastrous results.

Racket of Rackets

By Charles Hugh Smith

Source: Of Two Minds

If you thought banking in our time was a miserable racket — which it is, of course, and by “racket” I mean a criminal enterprise — then so-called health care has it beat by a country mile, with an added layer of sadism and cruelty built into its operations. Lots of people willingly sign onto mortgages and car loans they wouldn’t qualify for in an ethically sound society, but the interest rates and payments are generally spelled out on paper. They know what they’re signing on for, even if the contract is reckless and stupid on the parts of both borrower and lender. Pension funds and insurance companies foolishly bought the bundled mortgage bonds concocted out of this crap during the housing bubble. They did it out of greed and desperation, but a little due diligence would have clued them into the fraud being served up by the likes of Goldman Sachs.

Medicine is utterly opaque cost-wise, and that is the heart of the issue. Nobody in the system will say what anything costs and nobody wants to because it would break the spell that they work in an honest, legit business. There is no rational scheme for the cost of any service from one “provider” to the next or even one patient to the next. Anyway, the costs are obscenely inflated and concealed in so many deliberately deceptive coding schemes that even actuaries and professors of economics are confounded by their bills. The services are provided when the customer is under the utmost duress, often life-threatening, and the outcome even in a successful recovery from illness is financial ruin that leaves a lot of people better off dead.

It is a hostage racket, in plain English, a disgrace to the profession that has adopted it, and an insult to the nation. All the idiotic negotiations in Congress around the role of insurance companies are a grand dodge to avoid acknowledging the essential racketeering of the “providers” — doctors and hospitals. We are never going to reform it in its current incarnation. For all his personality deformities, President Trump is right in saying that ObamaCare is going to implode. It is only a carbuncle on the gangrenous body of the US medical establishment. The whole system will go down with it.

The New York Times departed from its usual obsessions with Russian turpitude and transgender life last week to publish a valuable briefing on this aspect of the health care racket: Those Indecipherable Medical Bills? They’re One Reason Health Care Costs So Much by Elisabeth Rosenthal. Much of this covers ground exposed in the now famous March 4, 2013 Time Magazine cover story (it took up the whole issue): Bitter Pill: Why Medical Bills Are Killing Us, by Steven Brill. The American public and its government have been adequately informed about the gross and lawless chiseling rampant in every quarter of medicine. The system is one of engineered criminality. It is inflicting ruin on millions. It is really a wonder that the public has not stormed the hospitals with pitchforks and flaming brands to string up that gang in the parking lots high above their Beemers and Lexuses.

There are only two plausible arcs to this story. One is that the nation might face the facts and resort to the Single Payer system found in virtually every other nation that affects to be civilized. There is no other way to eliminate the deliberate racketeering. The other outcome would be the inevitable collapse of the system and its eventual re-set to a much less complex, cash-on-the-barrelhead, local clinic-based model with far fewer heroic high-tech interventions available for the broad public, but much more affordable basic care. Both outcomes would require jettisoning the immense overburden of administrative dross that clutters up the current model, with its absurd tug-of-war between the price-gouging hospital “Chargemaster” clerks and the sadistic insurance company monitors bent on denying treatment to their sick and hapless “customers” (hostages). Be warned: these represent tens of thousands of supposedly “good” jobs. Of course, they are “good” because they pay middle-class wages, of which there are fewer and fewer elsewhere in the economy. But they are well-paid because of the grotesquely profitable racket they serve. They’ve turned an entire generation of office workers into servants of criminal enterprise. Imagine the damage this does to the soul of our culture.

My suggestion for real reform of the medical racket looks to historical precedent:

In 1932 (before the election of FDR, by the way), the US Senate formed a commission to look into the causes of the 1929 Wall Street Crash and recommend corrections in banking regulation to obviate future episodes like it. It is known to history as the Pecora Commission, after its chief counsel Ferdinand Pecora, an assistant Manhattan DA, who performed gallantly in his role. The commission ran for two years. Its hearings led to prison terms for many bankers and ultimately to the Glass-Steagall Act of 1933, which kept banking relatively honest and stable until its nefarious repeal in 1999 under President Bill Clinton — which led rapidly to a new age of Wall Street malfeasance, still underway.

The US Senate needs to set up an equivalent of the Pecora Commission to thoroughly expose the cost racketeering in medicine, enable the prosecution of the people driving it, and propose a Single Payer remedy for flushing it away. The Department of Justice can certainly apply the RICO anti-racketeering statutes against the big health care conglomerates and their executives personally. I don’t know why it has not done so already — except for the obvious conclusion that our elected officials have been fully complicit in the medical rackets, which is surely the case with the new Secretary of Health and Human Services, Tom Price, a former surgeon and congressman who trafficked in medical stocks during his years representing his suburban Atlanta district. A new commission could bypass this unprincipled clown altogether.

It is getting to the point where we have to ask ourselves if we are even capable of being a serious people anymore. Medicine is now a catastrophe every bit as pernicious as the illnesses it is supposed to treat, and a grave threat to a nation that we’re supposed to care about. What party, extant or waiting to be born, will get behind this cleanup operation?

The United States of Work

Employers exercise vast control over our lives, even when we’re not on the job. How did our bosses gain power that the government itself doesn’t hold?

By Miya Tokumitsu

Source: New Republic

Work no longer works. “You need to acquire more skills,” we tell young job seekers whose résumés at 22 are already longer than their parents’ were at 32. “Work will give you meaning,” we encourage people to tell themselves, so that they put in 60 hours or more per week on the job, removing them from other sources of meaning, such as daydreaming or social life. “Work will give you satisfaction,” we insist, even though it requires abiding by employers’ rules, and the unwritten rules of the market, for most of our waking hours. At the very least, work is supposed to be a means to earning an income. But if it’s possible to work full time and still live in poverty, what’s the point?

Even before the global financial crisis of 2008, it had become clear that if waged work is supposed to provide a measure of well-being and social structure, it has failed on its own terms. Real household wages in the United States have remained stagnant since the 1970s, even as the costs of university degrees and other credentials rise. Young people find an employment landscape defined by unpaid internships, temporary work, and low pay. The glut of degree-holding young workers has pushed many of them into the semi- or unskilled labor force, making prospects even narrower for non–degree holders. Entry-level wages for high school graduates have in fact fallen. According to a study by the Federal Reserve Bank of New York, these lost earnings will depress this generation’s wages for their entire working lives. Meanwhile, those at the very top—many of whom derive their wealth not from work, but from returns on capital—vacuum up an ever-greater share of prosperity.

Against this bleak landscape, a growing body of scholarship aims to overturn our culture’s deepest assumptions about how work confers wealth, meaning, and care throughout society. In Private Government: How Employers Rule Our Lives (and Why We Don’t Talk About It), Elizabeth Anderson, a professor of philosophy at the University of Michigan, explores how the discipline of work has itself become a form of tyranny, documenting the expansive power that firms now wield over their employees in everything from how they dress to what they tweet. James Livingston, a historian at Rutgers, goes one step further in No More Work: Why Full Employment Is a Bad Idea. Instead of insisting on jobs for all or proposing that we hold employers to higher standards, Livingston argues, we should just scrap work altogether.

Livingston’s vision is the more radical of the two; his book is a wide-ranging polemic that frequently delivers the refrain “Fuck work.” But in original ways, both books make a powerful claim: that our lives today are ruled, above all, by work. We can try to convince ourselves that we are free, but as long as we must submit to the increasing authority of our employers and the labor market, we are not. We therefore fancy that we want to work, that work grounds our character, that markets encompass the possible. We are unable to imagine what a full life could be, much less to live one. Even more radically, both books highlight the dramatic and alarming changes that work has undergone over the past century—insisting that, in often unseen ways, the changing nature of work threatens the fundamental ideals of democracy: equality and freedom.

Anderson’s most provocative argument is that large companies, the institutions that employ most workers, amount to a de facto form of government, exerting massive and intrusive power in our daily lives. Unlike the state, these private governments are able to wield power with little oversight, because the executives and boards of directors that rule them are accountable to no one but themselves. Although they exercise their power to varying degrees and through both direct and “soft” means, employers can dictate how we dress and style our hair, when we eat, when (and if) we may use the toilet, with whom we may partner and under what arrangements. Employers may subject our bodies to drug tests; monitor our speech both on and off the job; require us to answer questionnaires about our exercise habits, off-hours alcohol consumption, and childbearing intentions; and rifle through our belongings. If the state held such sweeping powers, Anderson argues, we would probably not consider ourselves free men and women.

Employees, meanwhile, have few ways to fight back. Yes, they may leave the company, but doing so usually necessitates being unemployed or migrating to another company and working under similar rules. Workers may organize, but unions have been so decimated in recent years that their clout is greatly diminished. What’s more, employers are swift to fire anyone they suspect of speaking to their colleagues about organizing, and most workers lack the time and resources to mount a legal challenge to wrongful termination.

It wasn’t supposed to be this way. As corporations have worked methodically to amass sweeping powers over their employees, they have held aloft the beguiling principle of individual freedom, claiming that only unregulated markets can guarantee personal liberty. Instead, operating under relatively few regulations themselves, these companies have succeeded at imposing all manner of regulation on their employees. That is to say, they use the language of individual liberty to claim that corporations require freedom to treat workers as they like.

Anderson sets out to discredit such arguments by tracing them back to their historical origins. The notion that personal freedom is rooted in free markets, for instance, originated with the Levellers in seventeenth-century England, when working conditions differed substantially from today’s. The Levellers believed that a market society was essential to liberate individuals from the remnants of feudal hierarchies; their vision of utopia was a world in which men could meet and interact on terms of equality and dignity. Their ideas echoed through the writing and politics of later figures like John Locke, Adam Smith, Thomas Paine, and Abraham Lincoln, all of whom believed that open markets could provide the essential infrastructure for individuals to shape their own destiny.

An anti-statist streak runs through several of these thinkers, particularly the Levellers and Paine, who viewed markets as the bulwark against state oppression. Paine and Smith, however, would hardly qualify as hard-line contemporary libertarians. Smith believed that public education was essential to a fair market society, and Paine proposed a system of social insurance that included old-age pensions as well as survivor and disability benefits. Their hope was not for a world of win-or-die competition, but one in which open markets would allow individuals to make the fullest use of their talents, free from state monopolies and meddlesome bosses.

For Anderson, the latter point is essential; the notion of lifelong employment under a boss was anathema to these earlier visions of personal freedom. Writing in the 1770s, Smith assumes that independent actors in his market society will be self-employed, and uses butchers and bakers as his exemplars; his “pin factory,” meant to illustrate division of labor, employs only ten people. These thinkers could not envision a world in which most workers spend most of their lives performing wage labor under a single employer. In an address before the Wisconsin State Agricultural Society in 1859, Lincoln stated, “The prudent, penniless beginner in the world labors for wages awhile, saves a surplus with which to buy tools or land for himself, then labors on his own account another while, and at length hires another new beginner to help him.” In other words, even well into the nineteenth century, defenders of an unregulated market society viewed wage labor as a temporary stage on the way to becoming a proprietor.

Lincoln’s scenario does not reflect the way most people work today. Yet the “small business owner” endures as an American stock character, conjured by politicians to push through deregulatory measures that benefit large corporations. In reality, thanks to a lack of guaranteed, nationalized health care and threadbare welfare benefits, setting up a small business is simply too risky a venture for many Americans, who must rely on their employers for health insurance and income. These conditions render long-term employment more palatable than a precarious existence of freelance gigs, which further gives companies license to oppress their employees.

The modern relationship between employer and employee began with the rise of large-scale companies in the nineteenth century. Although employment contracts date back to the Middle Ages, preindustrial arrangements bore little resemblance to the documents we know today. Like modern employees, journeymen and apprentices often served their employers for years, but masters performed the same or similar work in proximity to their subordinates. As a result, Anderson points out, working conditions—the speed required of workers and the hazards to which they might be exposed—were kept in check by what the masters were willing to tolerate for themselves.

The Industrial Revolution brought radical changes, as companies grew ever larger and management structures more complex. “Employers no longer did the same kind of work as employees, if they worked at all,” Anderson observes. “Mental labor was separated from manual labor, which was radically deskilled.” Companies multiplied rapidly in size. Labor contracts now bonded workers to massive organizations in which discipline, briefs, and decrees flowed downward, but whose leaders were unreachable by ordinary workers. Today, fast food workers or bank tellers would be hard-pressed to petition their CEOs at McDonald’s or Wells Fargo in person.

Despite this, we often speak of employment contracts as agreements between equals, as if we are living in Adam Smith’s eighteenth-century dream world. In a still-influential paper from 1937 titled “The Nature of the Firm,” the economist and Nobel laureate Ronald Coase established himself as an early observer and theorist of corporate concerns. He described the employment contract not as a document that handed the employer unaccountable powers, but as one that circumscribed those powers. In signing a contract, the employee “agrees to obey the directions of an entrepreneur within certain limits,” he emphasized. But such characterizations, as Anderson notes, do not reflect reality; most workers agree to employment without any negotiation or even communication about their employer’s power or its limits. The exceptions to this rule are few and notable: top professional athletes, celebrity entertainers, superstar academics, and the (increasingly small) groups of workers who are able to bargain collectively.

Yet because employment contracts create the illusion that workers and companies have arrived at a mutually satisfying agreement, the increasingly onerous restrictions placed on modern employees are often presented as “best practices” and “industry standards,” framing all sorts of behaviors and outcomes as things that ought to be intrinsically desired by workers themselves. Who, after all, would not want to work on something in the “best” way? Beyond employment contracts, companies also rely on social pressure to foster obedience: If everyone in the office stays until seven o’clock every night, who would risk departing at five, even if it’s technically allowed? Such social prods exist alongside more rigid behavioral codes that dictate everything from how visible an employee’s tattoo can be to when and how long workers can break for lunch.

Many workers, in fact, have little sense of the legal scope of their employer’s power. Most would be shocked to discover that they could be fired for being too attractive, declining to attend a political rally favored by their employer, or finding out that their daughter was raped by a friend of the boss—all real-life examples cited by Anderson. Indeed, it is only after dismissal for such reasons that many workers learn of the sweeping breadth of at-will employment, the contractual norm that allows American employers to fire workers without warning and without cause, except for reasons explicitly deemed illegal.

In reality, the employment landscape is even more dire than Anderson outlines. The rise of staffing or “temp” agencies, for example, undercuts the very idea of a direct relationship between worker and employer. In The Temp Economy: From Kelly Girls to Permatemps in Postwar America, sociologist Erin Hatton notes that millions of workers now labor under subcontracting arrangements, which give employers even greater latitude to abuse employees. For years, Walmart—America’s largest retailer—used a subcontracting firm to hire hundreds of cleaners, many from Eastern Europe, who worked for months on end without overtime pay or a single day off. After federal agents raided dozens of Walmarts and arrested the cleaners as illegal immigrants, company executives used the subcontracting agreement to shirk responsibility for their exploitation of the cleaners, claiming they had no knowledge of their immigration status or conditions.

By any reasonable standard, much “temp” work is not even temporary. Employees sometimes work for years in a single workplace, even through promotions, without ever being granted official status as an employee. Similarly, “gig economy” platforms like Uber designate their workers as contractors rather than employees, a distinction that exempts the company from paying them minimum wage and overtime. Many “permatemps” and contractors perform the same work as employees, yet lack even the paltry protections and benefits awarded to full-time workers.

A weak job market, paired with the increasing precarity of work, means that more and more workers are forced to make their living by stringing together freelance assignments or winning fixed-term contracts, subjecting those workers to even more rules and restrictions. On top of their actual jobs, contractors and temp workers must do the additional work of appearing affable and employable not just on the job, but during their ongoing efforts to secure their next gig. The constant pitching, application-writing, and personal branding on social media require a level of self-censorship, lest a controversial tweet or compromising Facebook photo sink their job prospects. Forced to anticipate the wishes not of a specific employer, but of all potential future employers, many opt out of participating in social media or practicing politics in any visible capacity. Their public personas are shaped not by their own beliefs and desires, but by the demands of the labor market.


For Livingston, it’s not just employers but work itself that is the problem. We toil because we must, but also because our culture has trained us to see work as the greatest enactment of our dignity and personal character. Livingston challenges us to turn away from such outmoded ideas, rooted in Protestant ideals. Like Anderson, he sweeps through centuries of labor theory with impressive efficiency, from Marx and Hegel to Freud and Lincoln, whose 1859 speech he also quotes. Livingston centers on these thinkers because they all found the connection between work and virtue troubling. Hegel believed that work causes individuals to defer their desires, nurturing a “slave morality.” Marx proposed that “real freedom came after work.” And Freud understood the Protestant work ethic as “the symptom of repression, perhaps even regression.”

Nor is it practical, Livingston argues, to exalt work: There are simply not enough jobs to keep most adults employed at a living wage, given the rise of automation and increases in productivity. Besides, the relation between income and work is arbitrary. Cooking dinner for your family is unpaid work, while cooking dinner for strangers usually comes with a paycheck. There’s nothing inherently different in the labor involved—only in the compensation. Anderson argues that work impedes individual freedom; Livingston points out that it rarely pays enough. As technological advances continue to weaken the demand for human labor, wages will inevitably be driven down even further. Instead of idealizing work and making it the linchpin of social organization, Livingston suggests, why not just get rid of it?

Livingston belongs to a cadre of thinkers, including Kathi Weeks, Nick Srnicek, and Alex Williams, who believe that we should strive for a “postwork” society in one form or another. Strands of this idea go back at least as far as Keynes’s 1930 essay on “Economic Possibilities for our Grandchildren.” Not only would work be eliminated or vastly reduced by technology, Keynes predicted, but we would also be unburdened spiritually. Devotion to work was, he deemed, one of many “pseudo-moral principles” that “exalted some of the most distasteful of human qualities into the position of the highest virtues.”

Since people in this new world would no longer have to earn a salary, they would, Livingston envisions, receive some kind of universal basic income. UBI is a slippery concept, adaptable to both the socialist left and libertarian right, but it essentially entails distributing a living wage to every member of society. In most conceptualizations, the income is indeed basic—no cases of Dom Pérignon—and would cover the essentials like rent and groceries. Individuals would then be free to choose whether and how much they want to work to supplement the UBI. Leftist proponents tend to advocate pairing UBI with a strong welfare state to provide nationalized health care, tuition-free education, and other services. Some libertarians view UBI as a way to pare down the welfare state, arguing that it’s better simply to give people money to buy food and health care directly, rather than forcing them to engage with food stamp and Medicaid bureaucracies.

According to Livingston, we are finally on the verge of this postwork society because of automation. Robots are now advanced enough to take over complex jobs in areas like agriculture and mining, eliminating the need for humans to perform dangerous or tedious tasks. In practice, however, automation is a double-edged sword, with the capacity to oppress as well as unburden. Machines often accelerate the rate at which humans can work, taxing rather than liberating them. Conveyor belts eliminated the need for workers to pass unfinished products along to their colleagues—but as Charlie Chaplin and Lucille Ball so hilariously demonstrated, the belts also increased the pace at which those same workers needed to turn wrenches and wrap chocolates. In retail and customer service, a main function of automation has been not to eliminate work, but to eliminate waged work, transferring much of the labor onto consumers, who must now weigh and code their own vegetables at the supermarket, check out their own library books, and tag their own luggage at the airport.

At the same time, it may be harder to automate some jobs that require a human touch, such as floristry or hairstyling. The same goes for the delicate work of caring for the young, sick, elderly, or otherwise vulnerable. In today’s economy, the demand for such labor is rising rapidly: “Nine of the twelve fastest-growing fields,” The New York Times reported earlier this year, “are different ways of saying ‘nurse.’” These jobs also happen to be low-paying, emotionally and physically grueling, dirty, hazardous, and shouldered largely by women and immigrants. Regardless of whether employment is virtuous or not, our immediate goal should perhaps be to distribute the burdens of caregiving, since such work is essential to the functioning of society and benefits us all.


A truly work-free world would entail a revolution in our present social organization. We could no longer conceive of welfare as a last resort—as the “safety net” metaphor implies—but would be forced to treat it as an unremarkable and universal fact of life. This alone would require us to support a massive redistribution of wealth, and to reclaim our political institutions from the big-money interests that are allergic to such changes. Tall orders indeed—but as Srnicek and Williams remind us in their book, Inventing the Future: Postcapitalism and a World Without Work, neoliberals pulled off just such a revolution in the postwar years. Thanks to their efforts, free-market liberalism replaced Keynesianism as the political and economic common sense all around the world.

Another possible solution to the current miseries of unemployment and worker exploitation is the one Livingston rejects in his title: full employment. For anti-work partisans, full employment takes us in the wrong direction, and UBI corrects the course. But the two are not mutually exclusive. In fact, rather than creating new jobs, full employment could require us to reduce our work hours drastically and spread them throughout the workforce—a scheme that could radically de-center waged work in our lives. A dual strategy of pursuing full employment while also demanding universal benefits—including health care, childcare, and affordable housing—would maximize workers’ bargaining power to ensure that they, and not just owners of capital, actually get to enjoy the bounty of labor-saving technology.

Nevertheless, Livingston’s critiques of full employment are worth heeding. As with automation, it can all go wrong if we use the banner of full employment to create pointless roles—what David Graeber has termed “bullshit jobs,” in which workers sit in some soul-sucking basement office for eight hours a day—or harmful jobs, like building nuclear weapons. If we do not have a deliberate politics rooted in universal social justice, then full employment, a basic income, and automation will not liberate us from the degradations of work.

Both Livingston and Anderson reveal how much of our own power we’ve already ceded in making waged work the conduit for our ideals of liberty and morality. The scale and coordination of the institutions we’re up against in the fight for our emancipation is, as Anderson demonstrates, staggering. Employers hold the means to our well-being, and they have the law on their side. Individual efforts to achieve a better “work-life balance” for ourselves and our families miss the wider issue we face as waged employees. Livingston demonstrates the scale at which we should be thinking: Our demands should be revolutionary, our imaginations wide. Standing amid the wreckage of last year’s presidential election, what other choice do we have?

 

Miya Tokumitsu is a lecturer in art history at the University of Melbourne and a contributing editor at Jacobin. She is the author of Do What You Love: And Other Lies About Success and Happiness.

Reimagining Money

What if markets were designed to build trust instead of wealth?

By Douglas Rushkoff

(The Atlantic)

Bitcoin was conceived as a modern solution to an ages-old problem: How can two parties agree on and verify an exchange of value? In this sense, Bitcoin is an effective technology, in that it trains the massive processing power of distributed personal computers on the same situation that paper currency was built to resolve. But in important ways, Bitcoin transposes some of the shortcomings of traditional currency onto the digital realm. It ignores a whole host of questions about the potential to reimagine what money can be designed to emphasize: What sorts of money will encourage admirable human behavior? What sorts of money systems will encourage trust, reenergize local commerce, favor peer-to-peer value exchange, and transcend the growth requirement? In short, how can money be less an extractor of value and more a utility for its exchange?

Around the world, people have proposed experimental, tentative answers to these questions. What follows are three ways that people have toyed with rearranging the priorities of transactions—all of which would encourage a radical reimagination of what money is and can do.

The simplest approach to limiting the delocalizing, extractive power of central currency is for communities to adopt their own local currencies, pegged or tied in some way to a central currency. One of the first and most successful contemporary efforts is the Massachusetts BerkShare, which was developed to help keep money from flowing out of the Berkshire region.

One hundred BerkShares cost $95 and are available at local banks throughout the region. Participating local merchants then accept them as if they were dollars—offering their customers what amounts to a 5-percent discount for using the local money. Although it amounts to selling goods at a perpetual discount, merchants can in turn spend their local currency at other local businesses and receive the same discounted rate. Nonlocals and tourists purchase goods with dollars at full price, and those who bother to purchase items with BerkShares presumably leave town with a bit of unspent local money in their pockets.

The 5-percent local discount may seem like a huge disadvantage to take on—but only if businesses think of themselves as competing individuals. In the long term, the discount is more than compensated for by the fact that BerkShares can circulate only locally. They remain in the region and come back to the same stores again and again. Even if nonlocal stores, such as Walmart, agree to accept the local currency, they can’t deliver it up to their shareholders or trap it in static savings. The best Walmart can do is use it to pay their local workers or purchase supplies and services from local merchants.
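The arithmetic behind the BerkShare discount is simple enough to sketch. The following is a back-of-the-envelope illustration, not BerkShares’ actual software; the function names are my own, and the only assumed number is the 95-cent exchange rate described above:

```python
def dollars_to_berkshares(dollars, rate=0.95):
    """Buying at the bank: $95 purchases 100 BerkShares of face value $100."""
    return dollars / rate

def merchant_redemption(berkshares, rate=0.95):
    """A merchant who converts BerkShares back to dollars takes the ~5% haircut.
    Spending them at another local business instead avoids this loss entirely."""
    return berkshares * rate

# A shopper's $95 becomes 100 BerkShares of spending power...
print(dollars_to_berkshares(95))   # 100 BerkShares
# ...and a merchant who cashes out 100 BerkShares receives $95.
print(merchant_redemption(100))    # $95
```

The haircut only bites when a merchant exits the local loop, which is precisely the incentive the design relies on.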

* * *

Unlike local discount currencies, cooperative community currencies don’t need to be pegged to the dollar at all. They are not purchased into existence but are worked into circulation. They are best thought of less as money than as systems of exchange.

The simplest form of cooperative currency is a favor bank, such as those founded in Greece and other parts of southern Europe during the Euro crisis. Incapable of finding work or sourcing Euros, people in many places lost the ability to transact. Even though a majority of what they needed could be produced locally, they had no cash with which to trade. So they built simple, secure trading websites—mini-eBays—where people offered their goods and services to others in return for the goods and services they needed. The sites did not record value amounts so much as keep general track of who was providing what to the community and coordinate fair exchanges. This casual, transparent solution works particularly well in a community where people already know one another and freeloaders can be pressured to contribute.

Larger communities have been using “time dollars,” a currency system that keeps track of how many hours people contribute to one another. Again, a simple exchange is set up on a website, where people list what they need and what they can contribute. The bigger and more anonymous a community, the more security and verification is required. Luckily, dozens of startups and nonprofit organizations have been developing apps and website kits via which local or even nonlocal communities can establish and run their own currencies.

Time exchanges tend to work best when everybody values their time the same way or is providing the same service. Time dollars are extremely egalitarian, valuing each person’s time the same as anyone else’s. An “hour” is worth one hour of work, whether it is performed by a plumber or a psychotherapist.

The Japanese recession gave rise to one of the most successful time exchanges yet, called Fureai Kippu, or “Caring Relationship Tickets.” People no longer had enough cash to pay for their parents’ or grandparents’ health-care services—but because they had moved far away from home to find jobs, they couldn’t take care of their relatives themselves either. The Fureai Kippu exchange gave people the ability to bank hours of eldercare by taking care of old people in their communities, which they could then spend to get care for their own relatives far away. So one person might provide an hour of bathing services for an elder in her neighborhood in return for someone preparing meals for her grandfather who lives in another city. As the Caring Relationship Tickets became accepted as things of value, people began using them for a variety of services.

Although a person could, in principle, bank many hours and then draw down a long run of services, most time exchanges put a limit on how many hours members can accumulate. They also cap how many hours a person can owe. This way a freeloader can be removed from the system, and the entire community can absorb the cost of the unearned hours fairly easily.
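A time-dollar ledger with both caps can be sketched in a few lines. This is a minimal illustration, not any real time bank’s implementation; the class name and the specific limits are hypothetical:

```python
class TimeBank:
    """Minimal time-dollar ledger: one hour is worth one hour, for everyone."""

    def __init__(self, max_balance=20, max_debt=10):
        self.max_balance = max_balance  # cap on hours a member can accumulate
        self.max_debt = max_debt        # cap on hours a member can owe
        self.hours = {}                 # member -> net hours (negative = owed)

    def exchange(self, provider, recipient, hours):
        """Credit the provider and debit the recipient, enforcing both caps
        before any balance is changed."""
        p = self.hours.get(provider, 0) + hours
        r = self.hours.get(recipient, 0) - hours
        if p > self.max_balance:
            raise ValueError(f"{provider} would exceed the accumulation cap")
        if r < -self.max_debt:
            raise ValueError(f"{recipient} would exceed the debt cap")
        self.hours[provider], self.hours[recipient] = p, r
```

Because the caps are checked before any balance changes, a member who hits the debt limit simply cannot draw further services until they contribute hours back—the ledger itself does the freeloader policing.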

* * *

How might traditional banks participate effectively in the financial rehabilitation of the communities they serve? Here’s just one possibility:

Sam’s Pizzeria is thriving as a local business, and Sam needs $200,000 to expand the dining room and build a second restroom. Normally, the bank would evaluate his business and credit and then either reject his loan request or give him the money at around 8 percent interest. The risk is that he won’t get enough new business to fill the new space, won’t be able to pay back the loan, and will go out of business. Indeed, part of the cost of the loan is that speculative risk.

In another approach, the banker could make Sam a different offer. The bank could agree to put up $100,000 toward the expansion project at 8 percent if Sam is able to raise the other $100,000 from his community in the form of market money: Sam is to sell digital coupons for $120 worth of pizza at the expanded restaurant at a cost of $100 per coupon. The bank can supply the software and administrate the escrow. If Sam can’t raise the money, then it proves the community wasn’t ready, and the bank can return everyone’s money.

If he does raise the money, then the bank has gained the security of a terrific community buy-in. Sam got his money more cheaply than if he borrowed the whole sum from the bank, because he can pay back the interest in retail-priced pizza. The community lenders have earned a fast 20 percent on their money—far more than they could earn in a bank or mutual fund. And it’s an investment that pays all sorts of other dividends: a more thriving downtown, more customers for other local businesses, better real-estate values, a higher tax base, better public schools, and so on. These are benefits one can’t see when buying stocks or abstract derivatives. Meanwhile, all the local “investors” now have a stake in the restaurant’s staying open at least long enough for them to cash in all their coupons. That’s good motivation to publicize it, take friends out to eat there, and contribute to its success.
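The numbers in this example can be checked directly. In the sketch below, the 20-percent coupon return comes straight from the text; the 35-percent food-cost share is my own hypothetical assumption, used only to show why pizza-denominated repayment can be cheaper for Sam than bank interest:

```python
def coupon_return(price=100, face=120):
    """A community lender pays $100 now for a coupon worth $120 of pizza:
    a 20% return, realized at redemption."""
    return (face - price) / price

def pizza_repayment_cost(face_total=120_000, food_cost_share=0.35):
    """What honoring $120,000 of retail-priced coupons actually costs Sam
    to produce, under a hypothetical 35% food-cost margin."""
    return face_total * food_cost_share
```

Under that assumed margin, redeeming all the coupons costs Sam roughly $42,000 against the $100,000 raised—the sense in which repaying in retail-priced pizza can beat paying 8 percent interest on a second $100,000 loan, depending on his margins and the loan term.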

For its part, the bank has diversified its range of services, bet on the possibility that community currencies will gain traction, and demonstrated a willingness to do something other than extract value from a community. The bank becomes a community partner, helping a local region invest in itself. The approach also provides the bank with a great hedge against continued deflation, hyperinflation, or growing consumer dissatisfaction with Wall Street and centrally issued money. If capital lending continues to contract as a business sector, the bank has already positioned itself to function as more of a service company—providing the authentication and financial expertise small businesses still need to thrive.

The bank transforms itself from an agent of debt to a catalyst for distribution and circulation. Like money in a digital age, it becomes less a thing of value in itself than a way of fostering the value creation and exchange of others.


This article has been adapted from Throwing Rocks at the Google Bus: How Growth Became the Enemy of Prosperity.

The Continuing Decline of McDonald’s


By James Corbett

Source: The Corbett Report

Long-time Corbett Reporteers might recall my 2015 video, “Celebrate! McDonald’s is Dying!” where I detailed the many, many woes the fast “food” giant was dealing with at the time, including:

Since then, McCancer’s has been undergoing a sweeping “restructuring” that has seen many layers of lipstick slapped on their factory-farmed pig. This restructuring includes not only cosmetic changes (“All-day breakfasts and new value menus for everyone!”) but behind-the-scenes efforts to trim $500 million from the company’s operating expenses, including buyouts and layoffs at company headquarters and the re-franchising of 4,000 corporate “restaurants.”

The global giant’s influential PR machine has used sleight-of-hand and other tricks to make this restructuring look like a smash success. They used their cheerleaders at the Wall Street Journal to hype “stronger-than-expected” profit and sales figures and their boosters at US News & World Report to hype some highly-selective earnings comparisons suggesting that this “turnaround” is, to use the WSJ’s phrase, “sustainable.”

But one doesn’t have to scratch too hard to reveal the rusty reality beneath this PR paint job.

McPinkslime’s might have “beat expectations” for sales and profits, but beating diminished expectations is hardly a sign of booming business. Just look at the nuts and bolts of the Q3 2016 earnings report: Year-on-year revenue is down 2.9% and net income is down 2.6%. And keep in mind, those numbers are in comparison to the already-terrible 2015 figures.

And that “re-franchising” operation? It cost $130 million in pre-tax charges.

But don’t worry, everyone, they “beat expectations!” Pay no attention to the hemorrhaging corporation behind the curtain!

And now the latest sign of McDonteat’s global retreat (via Corbett Report member “BuddhaForce”): “McDonald’s gives up control of its China business in $2 billion deal.”

The story is fascinating enough in its own right, what with McDonteats throwing in the corporate towel on the largest and fastest-growing consumer market in the world. But the devil is, as always, in the details. Who is purchasing the majority stake in the company’s mainland operations? None other than The Carlyle Group and CITIC Group.

The Carlyle Group’s name will likely ring a bell as one of the largest swamp pits “private equity firms” in the world, and one with its fingers in many a pie, including, of course, 9/11.

CITIC Group, meanwhile, will be familiar to The Corbett Report faithful as a key player in “China and the New World Order,” a Chinese state-owned investment company that helped serve as the Rockefeller-Kissinger nexus between the Deng Xiaoping-era “capitalist roaders” and their western finance oligarch recolonizers.

That these two cesspools are converging on the giant turd of American fast food is fitting enough. The McDonaldization of China is proceeding apace, and the usual crew are there to profit from it.

But as to what this story says about the continuing decline of the once-mighty golden arches, there are two main takeaways to the story, one depressing and one positive.

On the depressing front, there is a simple reason for the across-the-board slowdown in fast food sales in recent years (despite the predictable attempts to overcomplicate the problem in clickbait-y listicle format). For once, the Wall Street Journal gets it right: It’s the economy, stupid. What greater rebuke to the easily-disprovable economic “recovery” nonsense of the Obama years could be possible than pointing out the simple fact that people are too worried about their economic future to splurge on a $5 value meal?

But on a positive note, we can take McFatfood’s woes as a sign that, try as they might with their considerable propaganda resources, the corporate chieftains can’t put their egg McMuffin back together again. People are fed up with fast food. And although some, concerned with cost, are turning to eating at home as the cheaper option, others are more concerned with what’s in their food, where it’s sourced from, how it’s being prepared and who is being paid for it. Who wants instant, nutritionless, food-like substitute rolled up in plastic and slapped down on a tray by surly, overworked servers (or, increasingly, robots) anyway?

For those interested in how they can take part in the real food revolution that will render the McFastfood economy obsolete, may I humbly offer this podcast on guerrilla gardening? Bon appétit!

Our Hopelessly Dysfunctional Democracy

By Charles Hugh Smith

Source: Of Two Minds

When the system is rigged, “democracy” is just another public-relations screen to mask the unsavory reality of Oligarchy.

Democracy in America has become a hollow shell. The conventional markers of democracy–elections and elected representatives–exist, but they are mere facades; the mechanisms of setting the course of the nation are corrupt, and the power lies outside the public’s reach.

History has shown that democratic elections don’t guarantee an uncorrupt, functional government. Rather, democracy has become the public-relations stamp of approval for corrupt governance that runs roughshod over individual liberty while centralizing the power to enforce consent, silence critics and maintain the status quo.

Consider Smith’s Neofeudalism Principle #1: If the citizenry cannot replace a dysfunctional government and/or limit the power of the financial Aristocracy at the ballot box, the nation is a democracy in name only.

In other words, if the citizenry changes the elected representation but the financial Aristocracy and the Deep State remain in charge, then the democracy is nothing but a PR facade for an oppressive oligarchy.

If the erosion of civil liberties and rising inequality characterize the state of the nation, democracy is both dysfunctional and illiberal. A state that strips away the civil liberties of its citizens via civil forfeiture, a war-on-drugs Gulag and unlimited surveillance may be a democracy in name, but it is at heart an oppressive oligarchy.

If the super-wealthy continue to become ever wealthier while the bottom 95% of the citizenry struggle in various stages of debt-serfdom, the state may be a democracy in name, but it is at heart an oppressive oligarchy.

Author and commentator Fareed Zakaria recently addressed the illiberal aspects of America’s faded democracy in an article, “America’s democracy has become illiberal.”

Zakaria’s prettified critique avoided the real worm at the heart of our democracy: the state exists to enforce cartels. Some might be private, some might be state-run, and others might be hybrids, such as our failed Sickcare system and our military-industrial complex.

The ultimate role of democracy isn’t to “give the people a voice;” the only meaningful role of democracy is to protect the liberties of individuals from state encroachment, break up cartels and monopolies and limit the corruption of private/public money.

America’s democracy has failed on all counts. Civil liberties in a nation of ubiquitous central-state surveillance, a quasi-political Gulag (that nickel bag will earn you a tenner in America’s drug-war Gulag) and civil forfeiture (we suspect you’re up to no good, so we have the right to steal your car and cash) are eroding fast.

In America, the central government’s primary job is enforcing and funding cartels. As many of us have pointed out for years, a mere $10 million in lobbying, revolving-door graft (getting paid $250,000 for a speech or for a couple of board meetings) and bribes (cough-cough, I mean campaign contributions) can secure $100 million in profits–either by erecting regulatory/legal barriers or by direct federal funding of the cartel’s racket (healthcare, defense, “National Security,” etc.).

I explain why this is so in my books Resistance, Revolution, Liberation; Inequality and the Collapse of Privilege; and Why Our Status Quo Failed and Is Beyond Reform.

The fact that the corruption is veiled does not mean it isn’t corruption. In the sort of nations Americans mock as fake democracies, the wealthy protect their wealth and incomes with bags of cash delivered at night to politicians.

Nothing so crass or obvious here, of course. Here, the government of Algeria gives $25 million to the Clinton Foundation for “favors,” the Russian government gives hundreds of thousands to John Podesta’s firm for “advice” (heh), the Koch Brothers fund an array of front-organizations that work on behalf of their agenda, K Street lobbying firms rake in tens of millions of dollars every year, and the first thing tech companies do when they realize some interest group might crimp their profits by lobbying the central state’s politicos is set up their own lavish lobbying and “contribution” schemes.

In theory, democracy enables advocacy by a variety of groups in order to reach a consensual solution to problems shared by everyone. In practice, the advocacy is limited to a select group of insiders, donors and the various fronts of the wealthy: foundations, think-tanks, lobbyists, etc.

Does anyone think America’s democracy is still capable of solving the truly major long-term problems threatening the nation? Based on what evidence? What we see is a corrupt machine of governance that kicks every can down the road rather than suffer the blowback of honestly facing problems that will require deep sacrifices and changes in the status quo.

We see a dysfunctional machine of governance that changes the name of legislation and proposes policy tweaks, while leaving the rapacious cartels untouched. (See the current sickcare “debate” for examples.)

We see an Imperial Project setting the state’s agenda to suit its own desires, and a corporate media that is quivering with rage now that the public no longer believes its tainted swill of “news” and “reporting.”

The divide between the haves and the have-nots is not limited to money–it’s also widening between the few with political power and the teeming serfs with effectively zero political power. When the system is rigged, “democracy” is just another public-relations screen to mask the unsavory reality of oligarchy.

 

The Dance of Death

By Chris Hedges

Source: OpEdNews.com

The ruling corporate elites no longer seek to build. They seek to destroy. They are agents of death. They crave the unimpeded power to cannibalize the country and pollute and degrade the ecosystem to feed an insatiable lust for wealth, power and hedonism. Wars and military “virtues” are celebrated. Intelligence, empathy and the common good are banished. Culture is degraded to patriotic kitsch. Education is designed only to instill technical proficiency to serve the poisonous engine of corporate capitalism.

Historical amnesia shuts us off from the past, the present and the future. Those branded as unproductive or redundant are discarded and left to struggle in poverty or locked away in cages. State repression is indiscriminate and brutal. And, presiding over the tawdry Grand Guignol is a deranged ringmaster tweeting absurdities from the White House.

The graveyard of world empires — Sumerian, Egyptian, Greek, Roman, Mayan, Khmer, Ottoman and Austro-Hungarian — followed the same trajectory of moral and physical collapse. Those who rule at the end of empire are psychopaths, imbeciles, narcissists and deviants, the equivalents of the depraved Roman emperors Caligula, Nero, Tiberius and Commodus. The ecosystem that sustains the empire is degraded and exhausted. Economic growth, concentrated in the hands of corrupt elites, is dependent on a crippling debt peonage imposed on the population. The bloated ruling class of oligarchs, priests, courtiers, mandarins, eunuchs, professional warriors, financial speculators and corporate managers sucks the marrow out of society as its members retreat into privileged enclaves.

The elites’ myopic response to the looming collapse of the natural world and the civilization is to make subservient populations work harder for less, squander capital in grandiose projects such as pyramids, palaces, border walls and fracking, and wage war. President Trump’s decision to increase military spending by $54 billion and take the needed funds out of the flesh of domestic programs typifies the behavior of terminally ill civilizations. When the Roman Empire fell, it was trying to sustain an army of half a million soldiers that had become a parasitic drain on state resources.

The complex bureaucratic mechanisms that are created by all civilizations ultimately doom them. The difference now, as Joseph Tainter points out in “The Collapse of Complex Societies,” is that “collapse, if and when it comes again, will this time be global. No longer can any individual nation collapse. World civilization will disintegrate as a whole.”

Civilizations in decline, despite the palpable signs of decay around them, remain fixated on restoring their “greatness.” Their illusions condemn them. They cannot see that the forces that gave rise to modern civilization, namely technology, industrial violence and fossil fuels, are the same forces that are extinguishing it. Their leaders are trained only to serve the system, slavishly worshiping the old gods long after these gods begin to demand millions of sacrificial victims.

“Hope drives us to invent new fixes for old messes, which in turn create even more dangerous messes,” Ronald Wright writes in “A Short History of Progress.” “Hope elects the politician with the biggest empty promise; and as any stockbroker or lottery seller knows, most of us will take a slim hope over prudent and predictable frugality. Hope, like greed, fuels the engine of capitalism.”

The Trump appointees — Steve Bannon, Jeff Sessions, Rex Tillerson, Steve Mnuchin, Betsy DeVos, Wilbur Ross, Rick Perry, Alex Acosta and others — do not advocate innovation or reform. They are Pavlovian dogs that salivate before piles of money. They are hard-wired to steal from the poor and loot federal budgets. Their single-minded obsession with personal enrichment drives them to dismantle any institution or abolish any law or regulation that gets in the way of their greed. Capitalism, Karl Marx wrote, is “a machine for demolishing limits.” There is no internal sense of proportion or scale. Once all external impediments are lifted, global capitalism ruthlessly commodifies human beings and the natural world to extract profit until exhaustion or collapse. And when the last moments of a civilization arrive, the degenerate edifices of power appear to crumble overnight.

Sigmund Freud wrote that societies, along with individuals, are driven by two primary instincts. One is the instinct for life, Eros, the quest to love, nurture, protect and preserve. The second is the death instinct. The death instinct, called Thanatos by post-Freudians, is driven by fear, hatred and violence. It seeks the dissolution of all living things, including our own beings. One of these two forces, Freud wrote, is always ascendant. Societies in decline enthusiastically embrace the death instinct, as Freud observed in “Civilization and Its Discontents,” written on the eve of the rise of European fascism and World War II.

“It is in sadism, where the death instinct twists the erotic aim in its own sense and yet at the same time fully satisfies the erotic urge, that we succeed in obtaining the clearest insight into its nature and its relation to Eros,” Freud wrote. “But even where it emerges without any sexual purpose, in the blindest fury of destructiveness, we cannot fail to recognize that the satisfaction of the instinct is accompanied by an extraordinary high degree of narcissistic enjoyment, owing to its presenting the ego with a fulfillment of the latter’s old wishes for omnipotence.”

The lust for death, as Freud understood, is not, at first, morbid. It is exciting and seductive. I saw this in the wars I covered. A god-like power and adrenaline-driven fury, even euphoria, sweep over armed units and ethnic or religious groups given the license to destroy anything and anyone around them. Ernst Juenger captured this “monstrous desire for annihilation” in his World War I memoir, “Storm of Steel.”

A population alienated and beset by despair and hopelessness finds empowerment and pleasure in an orgy of annihilation that soon morphs into self-annihilation. It has no interest in nurturing a world that has betrayed it and thwarted its dreams. It seeks to eradicate this world and replace it with a mythical landscape. It turns against institutions, as well as ethnic and religious groups, that are scapegoated for its misery. It plunders diminishing natural resources with abandon. It is seduced by the fantastic promises of demagogues and the magical solutions characteristic of the Christian right or what anthropologists call “crisis cults.”

Norman Cohn, in “The Pursuit of the Millennium: Revolutionary Messianism in Medieval and Reformation Europe and Its Bearing on Modern Totalitarian Movements,” draws a link between that turbulent period and our own. Millennial movements are a peculiar, collective psychological response to profound societal despair. They recur throughout human history. We are not immune.

“These movements have varied in tone from the most violent aggressiveness to the mildest pacifism and in aim from the most ethereal spirituality to the most earth-bound materialism; there is no counting the possible ways of imagining the Millennium and the route to it,” Cohn wrote. “But similarities can present themselves as well as differences; and the more carefully one compares the outbreaks of militant social chiliasm during the later Middle Ages with modern totalitarian movements the more remarkable the similarities appear. The old symbols and the old slogans have indeed disappeared, to be replaced by new ones; but the structure of the basic phantasies seems to have changed scarcely at all.”

These movements, Cohn wrote, offered “a coherent social myth which was capable of taking entire possession of those who believed in it. It explained their suffering, it promised them recompense, it held their anxieties at bay, it gave them an illusion of security — even while it drove them, held together by a common enthusiasm, on a quest which was always vain and often suicidal.

“So it came about that multitudes of people acted out with fierce energy a shared phantasy which though delusional yet brought them such intense emotional relief that they could live only through it and were perfectly willing to die for it. It is a phenomenon which was to recur many times between the eleventh century and the sixteenth century, now in one area, now in another, and which, despite the obvious differences in cultural context and in scale, is not irrelevant to the growth of totalitarian movements, with their messianic leaders, their millennial mirages and their demon-scapegoats, in the present century.”

The severance of a society from reality, as ours has been severed from collective recognition of the severity of climate change and the fatal consequences of empire and deindustrialization, leaves it without the intellectual and institutional mechanisms to confront its impending mortality.

It exists in a state of self-induced hypnosis and self-delusion. It seeks momentary euphoria and meaning in tawdry entertainment and acts of violence and destruction, including against people who are demonized and blamed for society’s demise. It hastens its self-immolation while holding up the supposed inevitability of a glorious national resurgence. Idiots and charlatans, the handmaidens of death, lure us into the abyss.

 

Disposable Americans: The Numbers are Growing


By Paul Buchheit

Source: Information Clearing House

As often noted in the passionate writings of Henry Giroux, poor Americans are becoming increasingly ‘disposable’ in our winner-take-all society. After 35 years of wealth redistribution to the super-rich, inequality has pushed much of the middle class toward the bottom, to near-poverty levels, and to a state of helplessness in which they find themselves blamed for their own misfortunes.

The evidence keeps accumulating: income and wealth — and health — are declining for middle-class America. As wealth at the top grows, the super-rich feel they have little need for the rest of society.

Income Plummets for the Middle Class

According to Pew Research, in 1970 three of every ten income dollars went to upper-income households. Now five of every ten dollars goes to them.

The Social Security Administration reports that over half of Americans make less than $30,000 per year. That’s less than a full-time job would pay at $16.87 per hour, the appropriate average living wage as calculated by the Alliance for a Just Society.
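As a back-of-envelope check of that comparison (assuming a standard 40-hour week and 52 paid weeks per year, which the article does not specify), the cited living wage works out to an annual figure comfortably above $30,000:

```python
# Rough annualization of the Alliance for a Just Society living-wage figure.
# The 40-hour / 52-week schedule is an assumption for illustration.
HOURS_PER_WEEK = 40
WEEKS_PER_YEAR = 52
living_wage_hourly = 16.87

annual_equivalent = living_wage_hourly * HOURS_PER_WEEK * WEEKS_PER_YEAR
print(f"${annual_equivalent:,.2f}")  # roughly $35,090 per year, above $30,000
```

So even a full-time worker at the living-wage rate would earn about $5,000 more than what over half of Americans actually make.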

Wealth Collapses for Half of Us

Numerous sources report that half or more of American families have virtually no savings, and would have to borrow money or sell possessions to cover an emergency expense. Between half and two-thirds of Americans have less than $1,000 in savings.

For every $100 owned by a middle-class household in 2001, that household now has just $72.

Not surprisingly, race plays a role in the decline of middle America. According to Pew Research, the typical black family has only enough liquid savings to last five days, compared to 12 days for the typical Hispanic household and 30 days for the typical white household.

Our Deteriorating Health

In a disgraceful display of high-level disregard for vital health issues, House Republicans are attempting to cut back on lunches for over 3 million kids.

The evidence for the health-related disposability of poor Americans comes from a new study that finds a gap of nearly 15 years in life expectancy between 40-year-old men in the richest 1% and those in the poorest 1% (a 10-year gap for women). Much of the disparity has arisen in just the past 15 years.

It’s not hard to understand this dramatic disparity in life expectancy: numerous studies have documented the health problems that result from inequality-driven stress, worry, and anger, all of which leave Americans far less optimistic about the future. The growing disparities mean that our children will likely see fewer opportunities in their own futures.

It May Be Getting Worse

The sense derived from all this is that half of America is severely financially burdened, at risk of falling deeper into debt.

It may be more than half. The Wall Street Journal recently reported on a JP Morgan study’s conclusion that “the bottom 80% of households by income lack sufficient savings to cover the type of volatility observed in income and spending.” Fewer than one in three 25- to 34-year-olds live in their own homes, a 20 percent drop in just the past 15 years.

It may be even worse for renters. The number of families spending more than half their incomes on rent — the ‘severely’ cost-burdened renters — has increased by a stunning 50 percent in just ten years. Billionaire Steve Schwarzman, whose company Blackstone has been buying up tens of thousands of homes at rock-bottom prices and renting them out while waiting out the housing market, finds the growing anger among voters “astonishing.”

What’s astonishing is the disregard that many of the super-rich have for struggling Americans.

 

Paul Buchheit is a college teacher, an active member of US Uncut Chicago, founder and developer of social justice and educational websites (UsAgainstGreed.org, PayUpNow.org, RappingHistory.org), and the editor and main author of “American Wars: Illusions and Realities” (Clarity Press). He can be reached at paul@UsAgainstGreed.org.
