After the Crash

Dispatches From a Long Recovery (Est. 10/2024)

What Was Covid Really About? Triggering A Multi-Trillion Dollar Global Debt Crisis. “Ramping up an Imperialist Strategy”?

Covid, Capitalism, Friedrich Engels and Boris Johnson

By Colin Todhunter

Source: Global Research

“And thus it renders more and more evident the great central fact that the cause of the miserable condition of the working class is to be sought, not in these minor grievances, but in the capitalistic system itself.” Friedrich Engels, The Condition of the Working Class in England (1845) (preface to the English Edition, p.36)  

The IMF and World Bank have for decades pushed a policy agenda based on cuts to public services, increases in taxes paid by the poorest and moves to undermine labour rights and protections.

IMF ‘structural adjustment’ policies have resulted in 52% of Africans lacking access to healthcare and 83% having no safety nets to fall back on if they lose their job or become sick. Even the IMF has shown that neoliberal policies fuel poverty and inequality.

In 2021, an Oxfam review of IMF COVID-19 loans showed that 33 African countries were encouraged to pursue austerity policies. The world’s poorest countries are due to pay $43 billion in debt repayments in 2022, which could otherwise cover the costs of their food imports.

Oxfam and Development Finance International (DFI) have also revealed that 43 out of 55 African Union member states face public expenditure cuts totalling $183 billion over the next five years.

According to Prof Michel Chossudovsky of the Centre for Research on Globalization, the closure of the world economy (the lockdown imposed on more than 190 countries on March 11, 2020) has triggered an unprecedented process of global indebtedness. Governments are now under the control of global creditors in the post-COVID era.

What we are seeing is a de facto privatisation of the state as governments capitulate to the needs of Western financial institutions.

Moreover, these debts are largely dollar-denominated, helping to strengthen the US dollar and US leverage over countries.

It raises the question: what was COVID really about?

Millions have been asking that question since lockdowns and restrictions began in early 2020. If it was indeed about public health, why close down the bulk of health services and the global economy knowing full well what the massive health, economic and debt implications would be?

Why mount a military-style propaganda campaign to censor world-renowned scientists and terrorise entire populations and use the full force and brutality of the police to ensure compliance?

These actions were wholly disproportionate to any risk posed to public health, especially when considering the way ‘COVID death’ definitions and data were often massaged and how PCR tests were misused to scare populations into submission.

Prof Fabio Vighi of Cardiff University implies we should have been suspicious from the start when the usually “unscrupulous ruling elites” froze the global economy in the face of a pathogen that targets almost exclusively the unproductive (the over 80s).

COVID was a crisis of capitalism masquerading as a public health emergency.

Capitalism  

Capitalism needs to keep expanding into or creating new markets to ensure the accumulation of capital to offset the tendency for the general rate of profit to fall. The capitalist needs to accumulate capital (wealth) to be able to reinvest it and make further profits. By placing downward pressure on workers’ wages, the capitalist extracts sufficient surplus value to be able to do this.

But when the capitalist is unable to sufficiently reinvest (due to declining demand for commodities, a lack of investment opportunities and markets, etc), wealth (capital) over accumulates, devalues and the system goes into crisis. To avoid crisis, capitalism requires constant growth, markets and sufficient demand.

According to writer Ted Reese, the capitalist rate of profit has trended downwards from an estimated 43% in the 1870s to 17% in the 2000s. Although wages and corporate taxes have been slashed, the exploitability of labour has become increasingly insufficient to meet the demands of capital accumulation.

By late 2019, many companies could not generate sufficient profit. Falling turnover, limited cashflows and highly leveraged balance sheets were prevalent.

Economic growth was weakening in the run-up to the massive stock market crash in February 2020, which saw trillions more pumped into the system in the guise of ‘COVID relief’.

To stave off crisis up until that point, various tactics had been employed.

Credit markets were expanded and personal debt increased to maintain consumer demand as workers’ wages were squeezed. Financial deregulation occurred and speculative capital was allowed to exploit new areas and investment opportunities. At the same time, stock buybacks, the student debt economy, quantitative easing, massive bailouts and subsidies, and an expansion of militarism helped to maintain economic growth.

There was also a ramping up of an imperialist strategy that has seen indigenous systems of production abroad being displaced by global corporations and states pressurised to withdraw from areas of economic activity, leaving transnational players to occupy the space left open.

While these strategies produced speculative bubbles, led to an overvaluation of assets and increased both personal and government debt, they helped to continue to secure viable profits and returns on investment.

But come 2019, former governor of the Bank of England Mervyn King warned that the world was sleepwalking towards a fresh economic and financial crisis that would have devastating consequences. He argued that the global economy was stuck in a low growth trap and recovery from the crisis of 2008 was weaker than that after the Great Depression.

King concluded that it was time for the Federal Reserve and other central banks to begin talks behind closed doors with politicians.

That is precisely what happened as key players, including BlackRock, the world’s most powerful investment fund, got together to work out a strategy going forward. This took place in the lead-up to COVID.

Aside from deepening the dependency of poorer countries on Western capital, Fabio Vighi says lockdowns and the global suspension of economic transactions allowed the US Fed to flood the ailing financial markets (under the guise of COVID) with freshly printed money while shutting down the real economy to avoid hyperinflation. Lockdowns suspended business transactions, which drained the demand for credit and stopped the contagion.

COVID provided cover for a multi-trillion-dollar bailout for the capitalist economy that was in meltdown prior to COVID. Despite a decade or more of ‘quantitative easing’, this new bailout came in the form of trillions of dollars pumped into financial markets by the US Fed (in the months prior to March 2020) and subsequent ‘COVID relief’.

The IMF, World Bank and global leaders knew full well what impact closing down the world economy through COVID-related lockdowns would have on the world’s poor.

Yet they sanctioned it and there is now the prospect that in excess of a quarter of a billion more people worldwide will fall into extreme levels of poverty in 2022 alone.

In April 2020, the Wall Street Journal stated the IMF and World Bank faced a deluge of aid requests from scores of poorer countries seeking bailouts and loans from financial institutions with $1.2 trillion to lend.

In addition to helping to reboot the financial system, closing down the global economy deliberately deepened poorer countries’ dependency on Western global conglomerates and financial interests.

Lockdowns also helped accelerate the restructuring of capitalism that involves smaller enterprises being driven to bankruptcy or bought up by monopolies and global chains, thereby ensuring continued viable profits for Big Tech, the digital payments giants and global online corporations like Meta and Amazon and the eradication of millions of jobs.

Although the effects of the conflict in Ukraine cannot be dismissed, with the global economy now open again, inflation is rising and causing a ‘cost of living’ crisis. With a debt-ridden economy, there is limited scope for rising interest rates to control inflation.

But this crisis is not inevitable: current inflation is not only induced by the liquidity injected into the financial system but also being fuelled by speculation in food commodity markets and corporate greed as energy and food corporations continue to rake in vast profits at the expense of ordinary people.

Resistance  

However, resistance is fertile.

Aside from the many anti-restriction/pro-freedom rallies during COVID, we are now seeing a more strident trade unionism coming to the fore – in Britain at least – led by media-savvy leaders like Mick Lynch, general secretary of the National Union of Rail, Maritime and Transport Workers (RMT), who know how to appeal to the public and tap into widely held resentment over the soaring cost of living.

Teachers, health workers and others could follow the RMT into taking strike action.

Lynch says that millions of people in Britain face lower living standards and the stripping out of occupational pensions. He adds:

“COVID has been a smokescreen for the rich and powerful in this country to drive down wages as far as they can.”

Just like a decade of imposed ‘austerity’ was used to achieve similar results in the lead-up to COVID.

The trade union movement should now be taking a leading role in resisting the attack on living standards and further attempts to run-down state-provided welfare and privatise what remains.

The strategy to wholly dismantle and privatise health and welfare services seems increasingly likely given the need to rein in (COVID-related) public debt and the trend towards AI, workplace automation and worklessness.

This is a real concern because, following the logic of capitalism, work is a condition for the existence of the labouring classes. So, if a mass labour force is no longer deemed necessary, there is no need for mass education, welfare and healthcare provision and systems that have traditionally served to reproduce and maintain labour that capitalist economic activity has required.

In 2019, Philip Alston, the UN rapporteur on extreme poverty, accused British government ministers of the “systematic immiseration of a significant part of the British population” in the decade following the 2008 financial crash.

Alston stated:

“As Thomas Hobbes observed long ago, such an approach condemns the least well off to lives that are ‘solitary, poor, nasty, brutish, and short’. As the British social contract slowly evaporates, Hobbes’ prediction risks becoming the new reality.”

Post-COVID, Alston’s words carry even more weight.

As this article draws to a close, news is breaking that Boris Johnson has resigned as prime minister. A remarkable PM if only for his criminality, lack of moral foundation and double standards – also applicable to many of his cronies in government.

With this in mind, let’s finish where we began.

“I have never seen a class so deeply demoralised, so incurably debased by selfishness, so corroded within, so incapable of progress, as the English bourgeoisie…

For it nothing exists in this world, except for the sake of money, itself not excluded. It knows no bliss save that of rapid gain, no pain save that of losing gold.

In the presence of this avarice and lust of gain, it is not possible for a single human sentiment or opinion to remain untainted.” Friedrich Engels, The Condition of the Working Class in England (1845), p.275

The Future is Hawaii

By Peter Van Buren

Source: We Meant Well

I have seen the future. It looks a lot like Hawaii. What I saw there (absent the beautiful beaches, confused tourists, and incredible nature) was a glimpse of the future for much of America.

COVID paved the way for internal travel restrictions — Americans moving around inside their own country — never before thought possible, or even constitutional. Hawaii, an American state, had to decide if they accepted American me, much as a foreign country controls its borders and decides which outsiders may enter.

Hawaii required a very specific COVID test, from a “trusted partner” company they contract with, at the cost of $119 (no insurance accepted). To drive home the Orwellian aspects of this all, after receiving the test kit I had to spit into the test tube during a Zoom call, some large head onscreen peeping into my bedroom watching to ensure it was indeed my spit. And now of course, after clicking Accept several times, my DNA information is in Hawaiian government hands along with whoever else’s name was buried in pages of Terms of Service. I was rewarded with the Scooby snack of a QR code on my phone.

Hawaii used to offer the option of skipping the test and doing quarantine on-island. However, they now pre-screen at major airports and so no QR code, no boarding. And for those who don’t think good, today it’s a COVID test, tomorrow other criteria may be applied. Aloha!

I will add that all the extra health screening at the airport made me a little nostalgic when I finally got to the bomb and weapons detection set up by the TSA. Just like the good old days when we worried about Muslim terrorists instead of each other turning our planes into flying death tubes, I was checked to make sure I was not carrying more than 3 ounces of shampoo. It felt… quaint to remove my shoes alongside everyone else, millions of pairs a day, all because some knucklehead failed to explode his shoe bomb and was subdued by other passengers 12 freaking years ago. For old times’ sake I prepared mentally to subdue my fellow cabin mates. The nostalgia was driven home as the TSA screener made everyone except Muslim women remove their mask for a moment to verify the face matched the ID picture, ensuring every non-Muslim woman passenger got to exhale a couple of COVID-era breaths into the crowd. Viva!

The future in Hawaii strikes you as soon as you clear the airport into that beautiful Pacific air. It smells good in patches, but in fact there are growing masses of homeless people everywhere; the unsheltered homeless population is up 12 percent on Oahu. Coming from NYC I am certainly not surprised by the zombie armies, but these people live outside. You can’t escape them by surrendering control of the subway system, or by creating shelters in someone else’s neighborhood. The homeless here live in tents, some in gleefully third world shacks made of found materials, others in government-paid shanties creatively called “tiny houses.”

Some make solo campsites on the sidewalk, some create mini-Burning Man encampments in public parks. I’d like to say the latter resemble the migratory camps in The Grapes of Wrath, but the Joad family could still afford an old jalopy and these people cannot. The Joads were also headed to find work; these people have burrowed in, with laundry hanging out, dogs running among the trash, rats and bugs happily exploring the host-parasite relationship. These folks stake out areas once full of tourists on Waikiki, and public spaces once enjoyed more by locals. Drugs are a major problem, and whether a homeless person will hassle you depends on which drug he favors: the kind that makes him aggressive or the kind that makes him sleep standing up at the bus stop.

The future is built around the homeless, literally. My business was in the Kakaako area, once a warehouse district between Waikiki and downtown Honolulu, now home to a dozen or more 40-story condos. They are all built like fortresses against the homeless. Each tower sits on a pedestal with parking inside, such that the street view of most places is a four-story wall. There is an entrance (with security) but in fact the “first floor” for us is already four floors above ground. Once you’re up there, the top of the pedestal usually features a pool, a garden, BBQ, kiddie play area, dog walking space, all safely out of reach from whatever ugly is going on down below.

If you look out the windows from the upper, most expensive floors, you can see the ocean and sand but not the now-tiny homeless people. They become invisible if you’re rich enough. Don’t be offended or shocked — what did you think runaway economic inequality was gonna end up doing to us? Macroeconomics isn’t a morality play. But for most of the wealthy the issue isn’t confronting the reality of inequality, it is navigating the society it has created. Never mind stuff like those bars on park benches that make it impossible to lie down. The architects in Kakaako have stepped it up.

These heavily defended apartments can run to many millions of dollars, with most owners coming either from the mainland U.S. or Asia. They will live a nice life. Most of them work elsewhere, or own businesses elsewhere, which is good, because the future in Hawaii does not look good for the 99 percent below. It’s inevitable in a society that is constantly adding to its homeless population while simultaneously lacking any comprehensive way to provide medical treatment, all the while smoothing over the bumps on the street with plentiful supplies of alcohol and opioids.

Hawaii’s economy may be the future. Very little is made here. As making steel and cars left the Midwest in the late 20th century, so did Hawaii’s old economy based on agriculture. It was cheaper to grow food elsewhere and import it to the mainland. The bulk of pineapple consumed in the United States now comes from Mexican, Central and South American growers, just as steel now comes from China, and the few pineapple fields in Hawaii are for tourists. Hawaii now depends on two industries: tourism and defense spending. And both are controlled by government.

Tourism accounts directly for 24 percent of the state’s economy, more if one factors in secondary spending. The industry currently does not exist in viable form, with arrivals down some 75 percent. Unemployment Hawaii-wide is 24 percent, much more if you add in those who long ago gave up looking or are underemployed frying burgers. Much of this is driven by COVID. Will it ever recede? No one knows. When might things get better? No one knows. The decisions which control lives are made largely in secret, by the governor or “scientists,” and are not subject to public debate or a state legislative vote. One imagines a Dickensian kid in a hula skirt asking “Please sir, may we have jobs?”

Everyone knows Pearl Harbor, not only once a major tourist destination but also a part of direct Pentagon spending which pumps $7.2 billion into Hawaii’s economy, about 7.7 percent of the state’s GDP. Hawaii is second among the states for defense spending as a share of state GDP, and that’s just the overt stuff. Rumor has it the NSA has multiple facilities strewn around western Oahu with thousands of employees. All those government personnel, uniformed or covert, do a lot of personal spending in the local economy, much as they do in the shanty towns which ring American bases abroad. Everyone relies on local utilities like water, power, and sewers, and those bases need engineers, plumbers, electricians and others. Many are local residents either directly employed by DoD or working through contracts with private companies. The point is that, even more than tourism, this large sector of the economy is controlled by the government. At least they’re still working.

Another important sector of the Hawaiian economy is also government controlled: those who live entirely on public benefits. Benefits in Hawaii are the highest in the nation, an average of $49,175 and untaxed. For the last nine years Hawaii spent more on public welfare benefits, about 20 percent of the state budget, than it did on education. More than one out of ten people in Hawaii get food stamps (SNAP), though the number is higher if you include free lunches at school and for the elderly. Fewer working people means fewer tax-paying people, so this is unsustainable into the future.

Who owns the future? The government in Hawaii owns the land. The Federal government owns about 20 percent of everything, and the state of Hawaii owns some 50 percent of the rest. Do Not Enter – U.S. Government Property signs are everywhere if you take a drive out of town. There are also plenty of private roads and gated communities to separate the rich from the poor, but the prize goes to Oracle owner Larry Ellison who owns almost the entire island of Lanai, serving as a gatekeeper inside another gatekeeper’s turf. For the rest of the people, homeownership rates in Hawaii are some of the lowest in the nation.

The good news (for some…) is that in the future whites will be a minority race in all of America. They already are in Hawaii. Asians, not including Native Hawaiians, make up 37 percent of the population, with whites tagging in at 25 percent. Local government, some 55 percent of the jobs, is dominated by people of Japanese heritage. Japanese heritage people also have the highest percentage of homeownership, 70 percent. Almost all have a high school diploma, and about a third have a four-year college degree.

The well-loved mainland concept of “people of color” fades quickly in Hawaii, where Japanese people of color are a majority over everyone else. And unlike in some minds, people in Hawaii are very aware that the concept of “Asian” is racist as hell, and know the differences among Japanese, Korean, Chinese, and Vietnamese. Things are such that Ed Case, a Caucasian Democratic congressman from Hawaii, said he was an “Asian trapped in a white body”; he meant it as a good thing, it was understood that way in Hawaii, and the sentiment was echoed by Case’s Japanese-American wife.

White supremacy has clearly been defeated here, though I am not sure BLM would be happy with how that actually worked out without them. On a personal note, I will say as a white-identifying minority I was well-treated by the police and others. I was not forced to wear one of those goofy shirts or add an apostrophe to words while in Hawai’i against my cultural mores, so there may be hope yet in the future I saw.

The Erosion of the Middle Class — Why Americans Are Working Harder and Earning Less

By John Liberty

Source: The Mind Unleashed

“I don’t have to tell you things are bad. Everybody knows things are bad. It’s a depression. Everybody’s out of work or scared of losing their job. The dollar buys a nickel’s worth, banks are going bust, shopkeepers keep a gun under the counter. Punks are running wild in the street and there’s nobody anywhere who seems to know what to do, and there’s no end to it.” — Howard Beale

Howard Beale, the main character in the 1976 film Network, became a part of cinematic history when he uttered the line “I’m mad as hell and I’m not gonna take it anymore.” That one line expressed a growing rage among America’s shrinking middle class at a time when Americans were reeling from years of war, political scandals and economic downturn.

In the four decades that have followed, little has improved for the average American. We’re still ‘mad as hell’ and the middle class is being eroded right in front of our eyes. When adjusted for inflation, many Americans are working longer hours and earning less than they did in 1976. So, how have we gone from a vibrant middle class to the working poor in a matter of decades?

Median Incomes Are Stagnant

Despite increases in the national income over the past fifty years, middle class families have experienced little income growth over the past few decades. According to U.S. Census data, middle class incomes grew by only 28 percent from 1979 to 2014. Meanwhile, a report from the Congressional Budget Office (CBO) shows that the top 20 percent of earners have seen their incomes rise by 95 percent over that same period of time.

Contributing to the stagnation of wages is a notable decrease in the workforce participation rate. According to the Brookings Institution, “One reason for these declines in employment and labor force participation is that work is less rewarding. Wages for those at the bottom and middle of the skill and wage distribution have declined or stagnated.” Historical data from the Bureau of Labor Statistics backs up these findings, showing a steady decrease in workforce participation over the last two decades.

The Erosion of the Minimum Wage & America’s Purchasing Power

Anyone who has read a comment thread on the internet about minimum wage laws knows the debate is currently one of the most highly contentious political topics in America. In the halls of Congress, the debate has turned into a nearly decade-long impasse. As a result, workers at the low end of the wage scale have watched the purchasing power of their wages decrease from $7.25 in 2009 to $6.19 in 2018 due to inflation. In 2018, you need to perform 47 hours of minimum wage work to achieve the same amount of purchasing power as 40 hours of work in 2009.
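
The 47-hour figure follows directly from the two wage values above; as a back-of-the-envelope check (my arithmetic, not an additional data source):

$$ 40 \times \frac{\$7.25}{\$6.19} \approx 46.9 \approx 47 \text{ hours} $$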

The inflation-adjusted minimum wage value has been in steady decline since 1968, when the $1.60 minimum wage was equal to $11.39 (in 2018 dollars). Since then, lawmakers have reduced minimum wage increases relative to the rate of inflation. As Christopher Ingraham reports:

“Recent research shows that the reason politicians — Democrats and Republicans alike — are dragging their feet on popular policies such as the minimum wage is that they pay a lot more attention to the needs and desires of deep-pocketed business groups than they do to regular voters. Those groups tend to oppose minimum wage increases for the simple reason that they eat into their profit margins.”

To be clear, the erosion of the purchasing power of everyday Americans is hardly a new phenomenon. According to data from the U.S. Bureau of Labor Statistics, the purchasing power of the U.S. Dollar has plummeted by over 95 percent since 1913, the year the Federal Reserve was created. The Bureau’s Consumer Price Index indicates that prices in 2018 are 2,436.33% higher than prices in 1913 and that the dollar has experienced an average inflation rate of 3.13% per year during this period.
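
As a rough consistency check on those figures (my arithmetic, not the Bureau's): prices being 2,436% higher means the price level multiplied by roughly 25.4 between 1913 and 2018, and compounding over those 105 years gives

$$ (1 + r)^{105} \approx 25.4 \quad \Rightarrow \quad r \approx 25.4^{1/105} - 1 \approx 3.13\% $$

per year, matching the stated average inflation rate. Equivalently, a 2018 dollar buys about 1/25.4, or roughly 4 percent, of what a 1913 dollar bought, hence the loss of over 95 percent of its purchasing power.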

The Rich Get Richer

While the outlook may be grim for low-wage workers, this is fantastic news for large corporations. Data from the U.S. Bureau of Economic Analysis shows that corporate profits are approaching all-time highs. But it’s not just workers who are feeling the effect of growing income inequality. The contrast is also being felt on Main Street. An analysis of the S&P 500 and the Russell 1000 and 2000 indexes by Bloomberg revealed a growing gap between America’s largest employers and smaller businesses.

A report from the Institute for Policy Studies entitled Billionaire Bonanza: The Forbes 400 and the Rest of Us echoed these findings when it revealed that America’s 20 wealthiest people — a group that could fit comfortably in a single Gulfstream G650 luxury jet — now own more wealth than the bottom half of the American population combined.

Although the Trump administration continues to tout stock market and labor force increases as signs of economic prosperity, numbers show that the wealthiest 10 percent of Americans own 84 percent of all stock. A study conducted by the Economic Policy Institute found that wage growth remains too weak to consider the economy at full employment and that stagnant wage growth has contributed to the growing level of income inequality in America. The study noted that while wages have recovered from the 2008 recession, the gap between those at the top and those at the middle and bottom has continued to increase since 2000. As the study’s author, Elise Gould, writes:

“We’re looking at nominal wage growth that is still slower than you would expect in a full employment economy, slower than you would expect if you thought there were any sort of inflation pressures from wage growth.”

The Decimation of the American Dream

Comedian George Carlin once said, “The reason they call it the American Dream is because you have to be asleep to believe it.” For millions of middle class Americans, Carlin’s statement has proven eerily accurate. Stagnant wages and decreased purchasing power have put the prospects for middle class children in a tailspin as upward mobility trends have reportedly fallen by over 40 percent since 1950.

A poll conducted by the Pew Research Center corroborates this claim. According to Pew, only 37 percent of Americans believe that today’s children will grow up to be better off financially than their parents. In other words, more Americans now expect today’s children to end up worse off financially than their parents than expect them to be better off.

The sentiments expressed by millions of middle class Americans appear to be wholly justified: middle class families are becoming more fragile and more dependent on two incomes. A report from the Council of Economic Advisers found that the majority of the income gains made by the middle class from 1979 to 2013 were a result of increased participation in the workplace by women. The report also noted the fragility of two-income families amidst a decline in marriage and a drastic rise in single parent homes in recent years.

As a result of the slow growth in wages, over half of Americans now receive more in Government transfer payments (Medicare, Medicaid, food stamps, Social Security) than they pay in federal taxes. An analysis of all 50 states also found that in 42 states the cost of living is higher than the median income.

The rising cost of healthcare is also putting the pinch on the wallets of many Americans. As Jeffrey Pfeffer noted in his book Dying for a Paycheck, healthcare spending—per capita—has increased 29-fold over the past 40 years, outpacing the growth of the American economy.
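
In annualized terms (a rough calculation of my own, assuming the 29-fold figure refers to nominal per-capita spending):

$$ 29^{1/40} - 1 \approx 8.8\% \text{ per year} $$

a compounding rate consistent with Pfeffer's claim that healthcare spending outpaced the growth of the economy as a whole.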

While many Americans continue to look to the government to fix problems like wage stagnation, income inequality and rising healthcare costs, the sad truth is that we live in a time when 1 in 3 households has trouble paying energy bills and 40 percent of Americans face poverty in retirement at the exact same time the Federal Government has admitted that they lost $21 trillion. Not only did they lose $21 trillion (yes that’s TRILLION with a T), but the Department of Defense indicated in a press conference that they “never expected to pass” the audit to locate the missing taxpayer money.

John Emerich Edward Dalberg Acton famously proclaimed in 1887:

“Power tends to corrupt, and absolute power corrupts absolutely. Great men are almost always bad men.”

Perhaps it’s time for the millions of Americans who are quietly ‘mad as hell’ to start expressing their rage at the corrupt institutions of power that are decimating their livelihoods rather than expecting those very same institutions to fix the problems they created.

 

The United States of Work

Employers exercise vast control over our lives, even when we’re not on the job. How did our bosses gain power that the government itself doesn’t hold?

By Miya Tokumitsu

Source: New Republic

Work no longer works. “You need to acquire more skills,” we tell young job seekers whose résumés at 22 are already longer than their parents’ were at 32. “Work will give you meaning,” we encourage people to tell themselves, so that they put in 60 hours or more per week on the job, removing them from other sources of meaning, such as daydreaming or social life. “Work will give you satisfaction,” we insist, even though it requires abiding by employers’ rules, and the unwritten rules of the market, for most of our waking hours. At the very least, work is supposed to be a means to earning an income. But if it’s possible to work full time and still live in poverty, what’s the point?

Even before the global financial crisis of 2008, it had become clear that if waged work is supposed to provide a measure of well-being and social structure, it has failed on its own terms. Real household wages in the United States have remained stagnant since the 1970s, even as the costs of university degrees and other credentials rise. Young people find an employment landscape defined by unpaid internships, temporary work, and low pay. The glut of degree-holding young workers has pushed many of them into the semi- or unskilled labor force, making prospects even narrower for non–degree holders. Entry-level wages for high school graduates have in fact fallen. According to a study by the Federal Reserve Bank of New York, these lost earnings will depress this generation’s wages for their entire working lives. Meanwhile, those at the very top—many of whom derive their wealth not from work, but from returns on capital—vacuum up an ever-greater share of prosperity.

Against this bleak landscape, a growing body of scholarship aims to overturn our culture’s deepest assumptions about how work confers wealth, meaning, and care throughout society. In Private Government: How Employers Rule Our Lives (and Why We Don’t Talk About It), Elizabeth Anderson, a professor of philosophy at the University of Michigan, explores how the discipline of work has itself become a form of tyranny, documenting the expansive power that firms now wield over their employees in everything from how they dress to what they tweet. James Livingston, a historian at Rutgers, goes one step further in No More Work: Why Full Employment Is a Bad Idea. Instead of insisting on jobs for all or proposing that we hold employers to higher standards, Livingston argues, we should just scrap work altogether.

Livingston’s vision is the more radical of the two; his book is a wide-ranging polemic that frequently delivers the refrain “Fuck work.” But in original ways, both books make a powerful claim: that our lives today are ruled, above all, by work. We can try to convince ourselves that we are free, but as long as we must submit to the increasing authority of our employers and the labor market, we are not. We therefore fancy that we want to work, that work grounds our character, that markets encompass the possible. We are unable to imagine what a full life could be, much less to live one. Even more radically, both books highlight the dramatic and alarming changes that work has undergone over the past century—insisting that, in often unseen ways, the changing nature of work threatens the fundamental ideals of democracy: equality and freedom.

Anderson’s most provocative argument is that large companies, the institutions that employ most workers, amount to a de facto form of government, exerting massive and intrusive power in our daily lives. Unlike the state, these private governments are able to wield power with little oversight, because the executives and boards of directors that rule them are accountable to no one but themselves. Although they exercise their power to varying degrees and through both direct and “soft” means, employers can dictate how we dress and style our hair, when we eat, when (and if) we may use the toilet, with whom we may partner and under what arrangements. Employers may subject our bodies to drug tests; monitor our speech both on and off the job; require us to answer questionnaires about our exercise habits, off-hours alcohol consumption, and childbearing intentions; and rifle through our belongings. If the state held such sweeping powers, Anderson argues, we would probably not consider ourselves free men and women.

Employees, meanwhile, have few ways to fight back. Yes, they may leave the company, but doing so usually necessitates being unemployed or migrating to another company and working under similar rules. Workers may organize, but unions have been so decimated in recent years that their clout is greatly diminished. What’s more, employers are swift to fire anyone they suspect of speaking to their colleagues about organizing, and most workers lack the time and resources to mount a legal challenge to wrongful termination.

It wasn’t supposed to be this way. As corporations have worked methodically to amass sweeping powers over their employees, they have held aloft the beguiling principle of individual freedom, claiming that only unregulated markets can guarantee personal liberty. Instead, operating under relatively few regulations themselves, these companies have succeeded at imposing all manner of regulation on their employees. That is to say, they use the language of individual liberty to claim that corporations require freedom to treat workers as they like.

Anderson sets out to discredit such arguments by tracing them back to their historical origins. The notion that personal freedom is rooted in free markets, for instance, originated with the Levellers in seventeenth-century England, when working conditions differed substantially from today’s. The Levellers believed that a market society was essential to liberate individuals from the remnants of feudal hierarchies; their vision of utopia was a world in which men could meet and interact on terms of equality and dignity. Their ideas echoed through the writing and politics of later figures like John Locke, Adam Smith, Thomas Paine, and Abraham Lincoln, all of whom believed that open markets could provide the essential infrastructure for individuals to shape their own destiny.

An anti-statist streak runs through several of these thinkers, particularly the Levellers and Paine, who viewed markets as the bulwark against state oppression. Paine and Smith, however, would hardly qualify as hard-line contemporary libertarians. Smith believed that public education was essential to a fair market society, and Paine proposed a system of social insurance that included old-age pensions as well as survivor and disability benefits. Their hope was not for a world of win-or-die competition, but one in which open markets would allow individuals to make the fullest use of their talents, free from state monopolies and meddlesome bosses.

For Anderson, the latter point is essential; the notion of lifelong employment under a boss was anathema to these earlier visions of personal freedom. Writing in the 1770s, Smith assumes that independent actors in his market society will be self-employed, and uses butchers and bakers as his exemplars; his “pin factory,” meant to illustrate division of labor, employs only ten people. These thinkers could not envision a world in which most workers spend most of their lives performing wage labor under a single employer. In an address before the Wisconsin State Agricultural Society in 1859, Lincoln stated, “The prudent, penniless beginner in the world labors for wages awhile, saves a surplus with which to buy tools or land for himself, then labors on his own account another while, and at length hires another new beginner to help him.” In other words, even well into the nineteenth century, defenders of an unregulated market society viewed wage labor as a temporary stage on the way to becoming a proprietor.

Lincoln’s scenario does not reflect the way most people work today. Yet the “small business owner” endures as an American stock character, conjured by politicians to push through deregulatory measures that benefit large corporations. In reality, thanks to a lack of guaranteed, nationalized health care and threadbare welfare benefits, setting up a small business is simply too risky a venture for many Americans, who must rely on their employers for health insurance and income. These conditions render long-term employment more palatable than a precarious existence of freelance gigs, which further gives companies license to oppress their employees.

The modern relationship between employer and employee began with the rise of large-scale companies in the nineteenth century. Although employment contracts date back to the Middle Ages, preindustrial arrangements bore little resemblance to the documents we know today. Like modern employees, journeymen and apprentices often served their employers for years, but masters performed the same or similar work in proximity to their subordinates. As a result, Anderson points out, working conditions—the speed required of workers and the hazards to which they might be exposed—were kept in check by what the masters were willing to tolerate for themselves.

The Industrial Revolution brought radical changes, as companies grew ever larger and management structures more complex. “Employers no longer did the same kind of work as employees, if they worked at all,” Anderson observes. “Mental labor was separated from manual labor, which was radically deskilled.” Companies multiplied rapidly in size. Labor contracts now bonded workers to massive organizations in which discipline, briefs, and decrees flowed downward, but whose leaders were unreachable by ordinary workers. Today, fast food workers or bank tellers would be hard-pressed to petition their CEOs at McDonald’s or Wells Fargo in person.

Despite this, we often speak of employment contracts as agreements between equals, as if we are living in Adam Smith’s eighteenth-century dream world. In a still-influential paper from 1937 titled “The Nature of the Firm,” the economist and Nobel laureate Ronald Coase established himself as an early observer and theorist of corporate concerns. He described the employment contract not as a document that handed the employer unaccountable powers, but as one that circumscribed those powers. In signing a contract, the employee “agrees to obey the directions of an entrepreneur within certain limits,” he emphasized. But such characterizations, as Anderson notes, do not reflect reality; most workers agree to employment without any negotiation or even communication about their employer’s power or its limits. The exceptions to this rule are few and notable: top professional athletes, celebrity entertainers, superstar academics, and the (increasingly small) groups of workers who are able to bargain collectively.

Yet because employment contracts create the illusion that workers and companies have arrived at a mutually satisfying agreement, the increasingly onerous restrictions placed on modern employees are often presented as “best practices” and “industry standards,” framing all sorts of behaviors and outcomes as things that ought to be intrinsically desired by workers themselves. Who, after all, would not want to work on something in the “best” way? Beyond employment contracts, companies also rely on social pressure to foster obedience: If everyone in the office regularly stays until seven o’clock every night, who would risk departing at five, even if it’s technically allowed? Such social prods exist alongside more rigid behavioral codes that dictate everything from how visible an employee’s tattoo can be to when and how long workers can break for lunch.

Many workers, in fact, have little sense of the legal scope of their employer’s power. Most would be shocked to discover that they could be fired for being too attractive, declining to attend a political rally favored by their employer, or finding out that their daughter was raped by a friend of the boss—all real-life examples cited by Anderson. Indeed, it is only after dismissal for such reasons that many workers learn of the sweeping breadth of at-will employment, the contractual norm that allows American employers to fire workers without warning and without cause, except for reasons explicitly deemed illegal.

In reality, the employment landscape is even more dire than Anderson outlines. The rise of staffing or “temp” agencies, for example, undercuts the very idea of a direct relationship between worker and employer. In The Temp Economy: From Kelly Girls to Permatemps in Postwar America, sociologist Erin Hatton notes that millions of workers now labor under subcontracting arrangements, which give employers even greater latitude to abuse employees. For years, Walmart—America’s largest retailer—used a subcontracting firm to hire hundreds of cleaners, many from Eastern Europe, who worked for months on end without overtime pay or a single day off. After federal agents raided dozens of Walmarts and arrested the cleaners as illegal immigrants, company executives used the subcontracting agreement to shirk responsibility for their exploitation of the cleaners, claiming they had no knowledge of their immigration status or conditions.

By any reasonable standard, much “temp” work is not even temporary. Employees sometimes work for years in a single workplace, even through promotions, without ever being granted official status as an employee. Similarly, “gig economy” platforms like Uber designate their workers as contractors rather than employees, a distinction that exempts the company from paying them minimum wage and overtime. Many “permatemps” and contractors perform the same work as employees, yet lack even the paltry protections and benefits awarded to full-time workers.

A weak job market, paired with the increasing precarity of work, means that more and more workers are forced to make their living by stringing together freelance assignments or winning fixed-term contracts, subjecting those workers to even more rules and restrictions. On top of their actual jobs, contractors and temp workers must do the additional work of appearing affable and employable not just on the job, but during their ongoing efforts to secure their next gig. Constantly pitching, writing up applications, and personal branding on social media requires a level of self-censorship, lest a controversial tweet or compromising Facebook photo sink their job prospects. Forced to anticipate the wishes not of a specific employer, but of all potential future employers, many opt out of participating in social media or practicing politics in any visible capacity. Their public personas are shaped not by their own beliefs and desires, but by the demands of the labor market.


For Livingston, it’s not just employers but work itself that is the problem. We toil because we must, but also because our culture has trained us to see work as the greatest enactment of our dignity and personal character. Livingston challenges us to turn away from such outmoded ideas, rooted in Protestant ideals. Like Anderson, he sweeps through centuries of labor theory with impressive efficiency, from Marx and Hegel to Freud and Lincoln, whose 1859 speech he also quotes. Livingston centers on these thinkers because they all found the connection between work and virtue troubling. Hegel believed that work causes individuals to defer their desires, nurturing a “slave morality.” Marx proposed that “real freedom came after work.” And Freud understood the Protestant work ethic as “the symptom of repression, perhaps even regression.”

Nor is it practical, Livingston argues, to exalt work: There are simply not enough jobs to keep most adults employed at a living wage, given the rise of automation and increases in productivity. Besides, the relation between income and work is arbitrary. Cooking dinner for your family is unpaid work, while cooking dinner for strangers usually comes with a paycheck. There’s nothing inherently different in the labor involved—only in the compensation. Anderson argues that work impedes individual freedom; Livingston points out that it rarely pays enough. As technological advances continue to weaken the demand for human labor, wages will inevitably be driven down even further. Instead of idealizing work and making it the linchpin of social organization, Livingston suggests, why not just get rid of it?

Livingston belongs to a cadre of thinkers, including Kathi Weeks, Nick Srnicek, and Alex Williams, who believe that we should strive for a “postwork” society in one form or another. Strands of this idea go back at least as far as Keynes’s 1930 essay on “Economic Possibilities for our Grandchildren.” Not only would work be eliminated or vastly reduced by technology, Keynes predicted, but we would also be unburdened spiritually. Devotion to work was, he deemed, one of many “pseudo-moral principles” that “exalted some of the most distasteful of human qualities into the position of the highest virtues.”

Since people in this new world would no longer have to earn a salary, they would, Livingston envisions, receive some kind of universal basic income. UBI is a slippery concept, adaptable to both the socialist left and libertarian right, but it essentially entails distributing a living wage to every member of society. In most conceptualizations, the income is indeed basic—no cases of Dom Pérignon—and would cover the essentials like rent and groceries. Individuals would then be free to choose whether and how much they want to work to supplement the UBI. Leftist proponents tend to advocate pairing UBI with a strong welfare state to provide nationalized health care, tuition-free education, and other services. Some libertarians view UBI as a way to pare down the welfare state, arguing that it’s better simply to give people money to buy food and health care directly, rather than forcing them to engage with food stamp and Medicaid bureaucracies.

According to Livingston, we are finally on the verge of this postwork society because of automation. Robots are now advanced enough to take over complex jobs in areas like agriculture and mining, eliminating the need for humans to perform dangerous or tedious tasks. In practice, however, automation is a double-edged sword, with the capacity to oppress as well as unburden. Machines often accelerate the rate at which humans can work, taxing rather than liberating them. Conveyor belts eliminated the need for workers to pass unfinished products along to their colleagues—but as Charlie Chaplin and Lucille Ball so hilariously demonstrated, the belts also increased the pace at which those same workers needed to turn wrenches and wrap chocolates. In retail and customer service, a main function of automation has been not to eliminate work, but to eliminate waged work, transferring much of the labor onto consumers, who must now weigh and code their own vegetables at the supermarket, check out their own library books, and tag their own luggage at the airport.

At the same time, it may be harder to automate some jobs that require a human touch, such as floristry or hairstyling. The same goes for the delicate work of caring for the young, sick, elderly, or otherwise vulnerable. In today’s economy, the demand for such labor is rising rapidly: “Nine of the twelve fastest-growing fields,” The New York Times reported earlier this year, “are different ways of saying ‘nurse.’” These jobs also happen to be low-paying, emotionally and physically grueling, dirty, hazardous, and shouldered largely by women and immigrants. Regardless of whether employment is virtuous or not, our immediate goal should perhaps be to distribute the burdens of caregiving, since such work is essential to the functioning of society and benefits us all.


A truly work-free world is one that would entail a revolution from our present social organizations. We could no longer conceive of welfare as a last resort—as the “safety net” metaphor implies—but would be forced to treat it as an unremarkable and universal fact of life. This alone would require us to support a massive redistribution of wealth, and to reclaim our political institutions from the big-money interests that are allergic to such changes. Tall orders indeed—but as Srnicek and Williams remind us in their book, Inventing the Future: Postcapitalism and a World Without Work, neoliberals pulled off just such a revolution in the postwar years. Thanks to their efforts, free-market liberalism replaced Keynesianism as the political and economic common sense all around the world.

Another possible solution to the current miseries of unemployment and worker exploitation is the one Livingston rejects in his title: full employment. For anti-work partisans, full employment takes us in the wrong direction, and UBI corrects the course. But the two are not mutually exclusive. In fact, rather than creating new jobs, full employment could require us to reduce our work hours drastically and spread them throughout the workforce—a scheme that could radically de-center waged work in our lives. A dual strategy of pursuing full employment while also demanding universal benefits—including health care, childcare, and affordable housing—would maximize workers’ bargaining power to ensure that they, and not just owners of capital, actually get to enjoy the bounty of labor-saving technology.

Nevertheless, Livingston’s critiques of full employment are worth heeding. As with automation, it can all go wrong if we use the banner of full employment to create pointless roles—what David Graeber has termed “bullshit jobs,” in which workers sit in some soul-sucking basement office for eight hours a day—or harmful jobs, like building nuclear weapons. If we do not have a deliberate politics rooted in universal social justice, then full employment, a basic income, and automation will not liberate us from the degradations of work.

Both Livingston and Anderson reveal how much of our own power we’ve already ceded in making waged work the conduit for our ideals of liberty and morality. The scale and coordination of the institutions we’re up against in the fight for our emancipation is, as Anderson demonstrates, staggering. Employers hold the means to our well-being, and they have the law on their side. Individual efforts to achieve a better “work-life balance” for ourselves and our families miss the wider issue we face as waged employees. Livingston demonstrates the scale at which we should be thinking: Our demands should be revolutionary, our imaginations wide. Standing amid the wreckage of last year’s presidential election, what other choice do we have?

 

Miya Tokumitsu is a lecturer of art history at the University of Melbourne and a contributing editor at Jacobin. She is the author of Do What You Love: And Other Lies About Success and Happiness.

A Universal Basic Income Is The Bipartisan Solution To Poverty We’ve Been Waiting For

[Banner: basic income illustration by Molly Crabapple]

What if the government simply paid everyone enough so that no one was poor? It’s an insane idea that’s gaining an unlikely alliance of supporters.

By Ben Schiller

Source: FastCoexist.com

There’s a simple way to end poverty: the government just gives everyone enough money, so nobody is poor. No ifs, buts, conditions, or tests. Everyone gets the minimum they need to survive, even if they already have plenty.

This, in essence, is “universal minimum income” or “guaranteed basic income”—where, instead of multiple income assistance programs, we have just one: a single payment to all citizens, regardless of background, gender, or race. It’s a policy idea that sounds crazy at first, but actually begins to make sense when you consider some recent trends.

The first is that work isn’t what it used to be. Many people now struggle through a 50-hour week and still don’t have enough to live on. There are many reasons for this—including the heartlessness of employers and the weakness of unions—but it’s a fact. Work no longer pays. The wages of most American workers have stagnated or declined since the 1970s. About 25% of workers (including 40% of those in restaurants and food service) now need public assistance to top up what they earn.

The second: it’s likely to get worse. Robots already do many menial tasks. In the future, they’ll do more sophisticated jobs as well. A study last year from Carl Frey and Michael Osborne at Oxford University found that 47% of jobs are at risk of computerization over the next two decades. That includes positions in transport and logistics, office and administration, sales and construction, and even law, financial services and medicine. Of course, it’s possible that people who lose their jobs will find others. But it’s also feasible we’re approaching an era when there will simply be less to do.

The third is that traditional welfare is both not what it used to be and not very efficient. The value of welfare for families with children is now well below what it was in the 1990s, for example. The move towards means-testing, workfare—which was signed into law by Bill Clinton in 1996—and other forms of conditionality have killed the universal benefit. And not just in the U.S. It’s now rare anywhere in the world that people get a check without having to do something in return. Whatever the rights and wrongs of this, that makes the income assistance system more complicated and expensive to manage. Up to 10% of the income assistance budget now goes to administering its distribution.

For these reasons and others, the idea of a basic income for everyone is becoming increasingly popular. There has been a flurry of reports and papers about it recently, and, unusually, the idea has advocates across the political spectrum.

The libertarian right likes basic income because it hates bureaucracy and thinks people should be responsible for themselves. Rather than giving out food stamps and health care (which are in-kind services), it thinks people should get cash, because cash is fungible and you can do what you like with it.

The left likes basic income because it thinks society is unequal and basic income is redistributive. It evens up the playing field for people who haven’t had good opportunities in life by establishing a floor under the poorest. The “precariat” goes from being perpetually insecure to knowing it has something to live on. That, in turn, should raise well-being and produce more productive citizens.

The technology elite, like Netscape’s Marc Andreessen, also likes the idea. “As a VC, I like the fact that a lot of the political establishment is ignoring or dismissing this idea,” Albert Wenger, of Union Square Ventures, told a TED audience recently, “because what we see in startups is that the most powerful innovative ideas are ones truly dismissed by the incumbents.” A minimum income would allow us to “embrace automation rather than be afraid of it” and let more of us participate in the era of “digital abundance,” he says.

The exact details of basic income still need to be worked out, but it might work something like this: instead of welfare payments, subsidies for health care, and tax credits for the working poor, we would take that money and use it to cover a single payment that would give everyone the chance to live reasonably. Switzerland recently held a referendum on a basic income (it was unsuccessful); the proposed amount was $2,800 per month.
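To see the consolidation arithmetic in the plainest terms, here is a minimal back-of-the-envelope sketch in Python. Every figure in it (the number of eligible adults, the annual spend on the programs being folded in, the payment level) is hypothetical, chosen only for illustration; this is not a costing of the Swiss proposal or any other real plan.

```python
# Back-of-the-envelope UBI arithmetic. All figures are hypothetical,
# chosen for illustration -- not actual budget or census data.
adults = 250_000_000        # assumed number of eligible adults
replaced_programs = 2.5e12  # assumed annual spend on programs folded in
monthly_payment = 1_000     # assumed universal payment per adult

annual_cost = adults * monthly_payment * 12
shortfall = annual_cost - replaced_programs

print(f"annual cost of the payment:   ${annual_cost / 1e12:.2f} trillion")
print(f"covered by replaced programs: ${replaced_programs / 1e12:.2f} trillion")
print(f"to be found elsewhere:        ${shortfall / 1e12:.2f} trillion")
```

The point of the exercise is simply that a universal payment is a single, transparent line item: one multiplication replaces the overlapping budgets, eligibility rules and administrative layers of many separate programs.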

But would it actually work? The evidence from actual experiments is limited, though it’s more positive than not. A pilot in the 1970s in Manitoba, Canada, showed that a “Mincome” not only ended poverty but also reduced hospital visits and raised high-school completion rates. There seemed to be a community-affirming effect, which showed itself in people making use of free public services more responsibly.

Meanwhile, there were eight "negative income tax" trials in the U.S. in the '70s, where people received payments and the government clawed back most of the money in taxes as their other income rose. The results of those trials were more mixed. They reduced poverty, but people also worked slightly less than normal. To some, this is the major drawback of basic income: it could make people lazier than they would otherwise be. That would certainly be a problem, though it's questionable whether, in the future, there will be as much employment anyway. The age of robots and artificial intelligence seems likely to hollow out many jobs, perhaps changing how we view notions of laziness and productivity altogether.
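For readers who want the mechanics, a negative income tax reduces to a single formula: each household is guaranteed a base payment, and every extra dollar of other income claws part of it back until the payment phases out. The sketch below is illustrative only; the guarantee level and clawback rate are hypothetical, not the parameters used in the actual 1970s trials.

```python
def nit_payment(other_income: float,
                guarantee: float = 12_000.0,
                clawback_rate: float = 0.5) -> float:
    """Annual negative-income-tax payment for one household.

    The household is guaranteed `guarantee` dollars a year; each dollar
    of other income reduces the payment by `clawback_rate` dollars,
    until it phases out entirely at guarantee / clawback_rate of earnings.
    (Both parameters are hypothetical, chosen for illustration.)
    """
    return max(0.0, guarantee - clawback_rate * other_income)

# With these numbers the payment phases out at $24,000 of other income.
for income in (0, 8_000, 16_000, 24_000, 40_000):
    print(f"other income ${income:>6,} -> payment ${nit_payment(income):>9,.2f}")
```

Seen this way, the "worked slightly less" finding is unsurprising: with a 50% clawback, every extra dollar earned costs the household fifty cents of payment, a steep effective marginal tax rate on low earners.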

Experiments outside the U.S. have been more encouraging. One in Namibia cut poverty from 76% to 37%, increased non-subsidized incomes, raised education and health standards, and cut crime levels. Another, involving 6,000 people in India, paid people $7 a month, about a third of subsistence levels. It, too, proved successful.

“The important thing is to create a floor on which people can start building some security. If the economic situation allows, you can gradually increase the income to where it meets subsistence,” says Guy Standing, a professor of development studies at the School of Oriental and African Studies, in London, who was involved with the pilot. “Even that modest amount had incredible effects on people’s savings, economic status, health, in children going to school, in the acquisition of items like school shoes, so people felt in control of their lives. The amount of work people were doing increased as well.”

Given the gridlock in Congress, it's unlikely we'll see basic income here for a while. Though the idea has supporters in both left- and right-leaning think tanks, it's doubtful politicians could agree to redesign much of the federal government when they can't agree on much else. But the idea could take off in poorer countries that have more of a blank slate and are less politically polarized. Perhaps we'll re-import the concept one day, once the developing world has perfected it.

Walmart Admits: ‘Our Profits’ Depend on ‘Their Poverty’


By Lauren McCauley

Originally posted at CommonDreams.org

Although a notorious recipient of "corporate welfare," Walmart has now admitted that its massive profits also depend on the funding of food stamps and other public assistance programs.

In its annual report, filed with the Securities and Exchange Commission last week, the retail giant lists factors that could potentially harm future profitability. Listed among items such as "economic conditions" and "consumer confidence," the company writes that changes in taxpayer-funded public assistance programs are also a major threat to its bottom line.

The company writes:

Our business operations are subject to numerous risks, factors and uncertainties, domestically and internationally, which are outside our control … These factors include … changes in the amount of payments made under the Supplement[al] Nutrition Assistance Plan and other public assistance plans, changes in the eligibility requirements of public assistance plans …

Walmart, the nation’s largest private employer, is notorious for paying poverty wages and coaching employees to take advantage of social programs. In many states, Walmart employees are the largest group of Medicaid recipients.

However, this report is the first public acknowledgement of the chain’s reliance on the funding of these programs to sustain a profit.

According to Stacy Mitchell, senior researcher with the Institute for Local Self-Reliance, the irony of their admission is that Walmart “is the company that has done, perhaps, more than any other corporation to push people into poverty.”

Citing a Penn State study, Mitchell told Common Dreams that research has proven that “when Walmart opens a store, poverty rates are negatively impacted” and that the more stores that have opened in a particular county, the worse it is. “This is a company that everywhere it goes it creates poverty.”

Beyond its own workers' low wages, Mitchell explains, Walmart's enormous size and market power have "held down wages for the whole sector."

Because Walmart specifically targets a low-income demographic, Mitchell adds, the "insidious genius" of its business model is that "they have so squeezed American workers […] many feel that their only choice is to shop at Walmart."

The International Business Times reports:

Prior to the earnings report, Walmart Chief Financial Officer Charles Holley said the company didn't anticipate how much the end of such programs as the unemployment benefits extension would affect it. Specifically, reductions to the Supplemental Nutrition Assistance Program that went into effect on Nov. 1, the first day of the company's fourth quarter, pose a potential concern. The cuts led to a reduction of between $1 and $36 in SNAP benefits per household, or up to $460 a year. Congress is debating reinstating the extension to the program and making the benefits retroactive to Nov. 1, something Walmart would clearly consider beneficial to its growth.

Previously, Walmart has joined forces with Big Food labels such as Coca-Cola and Kellogg's to lobby the United States Department of Agriculture and Congress against any measures that would restrict SNAP use to healthy food choices. According to an earlier study by Michele Simon at Eat Drink Politics, in just one year, nine Walmart Supercenters in Massachusetts received more than $33 million in SNAP revenues.
