Who’s to Blame for Losing Afghanistan?

By Peter Van Buren

Source: We Meant Well

Who should we blame for losing Afghanistan? Why blame anyone?

Did anyone expect the U.S. war in Afghanistan to end cleanly? If so, you bought the lies all along, and the cold water is hitting hard now. While the actual ending is particularly harsh and clearly spliced together from old clips of Saigon 1975, those are simply details.

Why blame Biden? He played his part as a Senator and VP keeping the war going, but his role today is just being the last guy in a long line of people to blame, a pawn in the game. That Biden is willing to be the "president who lost Afghanistan" is all the proof you need he does not intend to run again for anything. Kind of an ironic version of a young John Kerry's question about Vietnam: "how do you ask the last man to die for a mistake?" Turns out, it's easy: call Joe.

Blame Trump for the deal? One of the saddest things about the brutal ending of the U.S.-Afghan war is we would have gotten the same deal — just leave it to the Taliban and go home — at basically any point during the last 20 years. That makes every death and every dollar a waste. Afghanistan is simply reverting, quickly, to more or less status quo 9/10/01 and everything between then and now, including lost opportunities, will have been wasted.

Blame the NeoCons? No one in Washington who supported this war was ever called out, with the possible exception of Donald Rumsfeld who, if there is a hell, now cleans truck stop toilets there. Dick Cheney walks free. The generals and diplomats who ran the war have nice think tank or university jobs, if they are not still in government making equally bad decisions. No one has been legally, financially, or professionally disadvantaged by the blood on their hands. Some of the era’s senior leaders — Blinken, Rice, Power, Nuland — are now working in better jobs for Biden. I’d like to hope they have trouble sleeping at night, but I doubt it.

George Bush is a cuddly grandpa today, not the man who drove the United States into building a global prison archipelago to torture people. Barack Obama, who kept much of that system in place and added the drone killing of American citizens to his resume, remains a Democratic rock god. Neither man nor any of his significant underlings has expressed any regret or remorse.

For example, I just listened to Ryan Crocker, our former ambassador to Iraq and Afghanistan, on CNN. Making myself listen to him was about as fun as sticking my tongue in a wood chipper. Same for former general David Petraeus and the usual gang of idiots. None of them, the ones who made the decisions, accept any blame. Instead, they seem settled on blaming Trump because, well, everything bad is Trump's fault even if he came into all this in the middle of the movie.

In the end the only people punished were the whistleblowers.

No one in the who-is-to-blame community seems willing to take the story back to its beginning, at least the beginning for America's latest round in the Graveyard of Empires (talk about missing an early clue). This is what makes Blame Trump and Blame Biden so absurd. America's modern involvement in this war began in 1979 when Jimmy Carter, overreacting to the Soviet invasion of Afghanistan to prop up what was already a pro-Soviet puppet government, began arming and organizing Islamic warriors we now collectively know as "The Taliban."

People who only want to see trees they can chop down, and who purposely miss the vastness of the forest, will at this point try to sideline things by claiming there never was a single entity called "The Taliban," that the young Saudis who flocked to jihad to kill Russians technically weren't funded by the U.S. (the money ran indirectly through Pakistan), or that the turning point was the 1991 Gulf War, etc. Quibbles and distractions.

If Carter's baby steps toward paying Islamic warriors to fight the Red Army were playing with matches, Ronald Reagan poured gas, then jet fuel, on the fire. Under the Reagan administration the U.S. funded the warriors (called mujahideen, if not freedom fighters, back then), armed them, invited their ilk to the White House, helped lead them, worked with the Saudis to send in even more money, and fanned the flames of jihad to ensure a steady stream of new recruits.

When we “won” it was hailed as the beginning of the real end of the Evil Empire. The U.S. defeated the mighty Red Army by sending over some covert operators to fight alongside stooge Islam warriors for whom a washing machine was high technology. Pundits saw it as a new low-cost model for executing American imperial will.

We paid little attention to events as we broke up the band and cut off the warriors post-Soviet withdrawal (soon enough some bozo at the State Department declared “the end of history.” He teaches at Stanford now) until the blowback from this all nipped us in the largely unsuccessful World Trade Center bombing of 1993, followed by the very successful World Trade Center bombing on September 11, 2001. Seems like there was still some history left to go.

How did U.S. intelligence know who the 9/11 culprits were so quickly? Several of them had been on our payroll, or received financing via proxies in Pakistan and Saudi Arabia, or were inspired by what had happened in Afghanistan: the defeat of the infidels (again; check Alexander the Great, Genghis Khan, the Mughal Empire, various Persian Empires, the Sikhs, the British, et al.).

If post-9/11 the U.S. had limited itself to a vengeful hissy fit in Afghanistan, ending with Bush’s 2003 declaration of “Mission Accomplished,” things would have been different. If the U.S. had used the assassination of Osama bin Laden, living “undiscovered” in the shadow of Pakistan’s military academy, as an excuse of sorts to call it a day in Afghanistan, things would have been different.

Instead Afghanistan became a petri dish to try out the worst NeoCon wet dream, nation-building across the Middle East. Our best and brightest would not just bomb Afghanistan into the stone age, they would then phoenix-it from the rubble as a functioning democracy. There was something for everyone: a military task to displace post-Cold War budget cuts, a pork-laden reconstruction program for contractors and diplomats, even a plan to empower Afghan women to placate the left.

Though many claim Bush pulling resources away from Afghanistan for Iraq doomed the big plans, it was never just a matter of not enough resources. Afghanistan was never a country in any modern sense to begin with, just an association of tribal entities who hated each other almost as much as they hated the west. The underpinnings of the society were a virulent strain of Islam, about as far away from any western political and social ideas as possible. Absent a few turbaned Uncle Toms, nobody in Afghanistan was asking to be freed by the United States anyway.

Pakistan, America’s “ally” in all this, was a principal funder and friend of the Taliban, always more focused on the perceived threat from India, seeing a failed state in Afghanistan as a buffer zone. Afghanistan was a narco-state with its only real export heroin. Not only did this mean the U.S. wanted to build a modern economy on a base of crime, the U.S. in different periods actually encouraged/ignored the drug trade into American cities in favor of the cash flow.

The Afghan puppet government and military the U.S. formed were uniformly corrupt, and the endless inflow of American money encouraged them to get more corrupt all the time. They had no support from the people and couldn't care less. The Afghans in general and the Afghan military in particular did not fail to hold up their end of the fighting; they never signed up for the fight in the first place. No Afghan wanted to be the last man to die in service to American foreign policy.

There was no way to win. The “turning point” was starting the war at all. Afghanistan had to fail. There was no other path for it, other than being propped up at ever-higher costs. That was American policy for two decades: prop up things and hope something might change. It was like sending more money to a Nigerian cyber-scammer hoping to recoup your original loss.

Everything significant our government, the military, and the MSM told us about Afghanistan was a lie. They filled and refilled the bag with bullhockey and Americans bought it every time expecting candy canes. Keep that in mind when you decide who to listen to next time, because of course there will be a next time. Who has not by now realized that? We just passively watched 20 years of Vietnam all over again, including the sad ending. So really, who’s to blame?

The Weird Politics Of Aspartame: Conspiracy Theory Or Startling Truth?


By Paanii Powell Cleaver

Source: Inquisitr

Earlier this month, news wires and Twitter feeds were abuzz with info about the potential danger of certain artificial sweeteners.

In reality, recent reports about the potential perils of low-calorie sweeteners are not exactly breaking news. Almost 20 years ago, on December 29, 1996, Mike Wallace conducted an eye-opening segment about aspartame, also known as NutraSweet, on 60 Minutes.

The segment aired in response to a flurry of reports noting a dramatic increase in brain tumors and other serious health issues following the approval of aspartame for use in dry foods in 1981. Fifteen years after its somewhat dubious approval (the most controversial FDA decision to date), more than 7,000 consumer reports of adverse reactions to aspartame had been delivered to the FDA. As reported by 60 Minutes, the litany of consumer complaints included severe headaches, dizziness, respiratory issues, and seizures. The FDA countered with a statement that aspartame was the most tested product in FDA history.

Dr. Virginia V. Weldon, a pediatrician from Missouri and Vice President of Public Policy for the Monsanto Company from 1989 through 1998, told 60 Minutes that aspartame is “one of the safest food ingredients ever approved by the Food and Drug Administration.”

Dr. John W. Olney, a neuroscientist at Washington University School of Medicine, vehemently disagreed with Dr. Weldon’s assessment of the controversial super sweetener. Notable for his discovery of the brain-harming effects of an amino acid called glutamate, Dr. Olney was influential in legislating the ban of MSG in baby food. At the time of the 60 Minutes segment, he had been studying the effects of aspartame and other compounds on brain health for more than two decades.

Olney told 60 Minutes’ Mike Wallace that since the approval of aspartame, there had been “a striking increase in the incidence of malignant brain tumors.” The doctor did not directly blame aspartame for the increase. He did, however, state that there was enough questionable evidence to merit reevaluating the chemical compound. He said that the FDA should reassess aspartame and that “this time around, they should do it right.”

Dr. Erik Millstone, Professor of Science Policy at the University of Sussex, told 60 Minutes that Searle’s testing procedures in the early 1970s were so flawed that there was no way to know for certain if aspartame was safe for human consumption. Millstone claimed that the company’s failure to dissect a test animal that died during an aspartame experiment was merely one example of “deficiencies” in Searle’s conduct. He also noted that when test mice presented with tumors, the tumors were “cut out and discarded and not reported.” In addition, Dr. Millstone told Mike Wallace that G.D. Searle and contractors hired by the drug company administered antibiotics to some test animals yet neglected to reveal this information in official reports.

In 1974, after the G.D. Searle company had already manufactured a significant quantity of aspartame, then-commissioner of the FDA, Alexander Schmidt, came very close to approving the chemical for human food purposes. Relying solely on evidence provided by Searle, the FDA allowed a mere 30 days for the public to respond before putting the FDA seal of approval on the now-controversial food additive. Dr. John Olney wasted no time in joining forces with James Turner, a public interest attorney who also worked with consumer advocate Ralph Nader. Just before the allotted time for public response ran out, Dr. Olney and his attorney petitioned the FDA with data that indicated the dangerous similarities between aspartame and glutamate.

In his 1970 best seller, The Chemical Feast, Turner detailed numerous ways the FDA shirked its obligation to protect the American people. At the time of its publication, Time magazine described the tome as “the most devastating critique of a U.S. government agency ever issued.”

In response to Dr. Olney’s allegations that aspartame was potentially as brain-damaging as glutamate, the FDA called for a task force to investigate the matter. By late 1975, the FDA found that Searle’s own research into the safety of aspartame was so flawed that they stayed the approval process, citing “a pattern of conduct which compromises the scientific integrity of the studies.”

Former U.S. Senator Howard Metzenbaum told 60 Minutes that when Searle presented information to the FDA in 1974, the drug company “willfully misrepresented” and omitted facts that may have halted approval of what would soon become its best-selling product. Metzenbaum went on to say that the FDA was so disturbed by its findings, it forwarded a file to the U.S. Attorney’s Chicago office in 1975 in the interest of calling a grand jury to determine whether criminal indictments against Searle were warranted.

When did the grand jury convene? Never. According to 60 Minutes, U.S. Attorney Samuel Skinner requested a grand jury investigation in 1977 but recused himself from the case when he was offered a job at the Sidley & Austin law firm, which also happened to be the Chicago law firm that represented the G.D. Searle company. The investigation was stalled until the statute of limitations ran its course, and no grand jury ever heard the case against Searle’s questionable research standards. Skinner, by the way, did eventually accept the job with Searle’s Chicago law office.

In 1977, a new FDA task force convened in Skokie, Illinois, with the sole purpose of investigating the research methods employed by G.D. Searle in its effort to gain FDA approval of aspartame. The task force examined raw data from 15 studies that Searle used to back up its uniformly positive claims about aspartame. According to journalist Andrew Cockburn, the task force noted numerous “falsifications and omissions” in Searle’s research reports.

In 1980, at the tail end of the Carter administration, the FDA convened two panels to investigate claims that aspartame caused brain tumors. Led by scientific and medical experts, each panel concluded that more tests were needed to prove the safety of the sweetener. Both panels concluded that aspartame should not be approved at that time.

So, how does the fellow in the cover picture figure into this equation?

If you do not recognize the face in the feature photo, here is a memory refresher: The man in the pic — and up to his ears in the aspartame controversy — is Donald Rumsfeld. Perhaps best known as Secretary of Defense during the George W. Bush presidency, Donald Rumsfeld also happened to be CEO of the G.D. Searle drug company when Ronald Reagan was sworn in as President of the United States in January 1981.

Regardless of its safety or potential peril, the fact remains that without the clout and political influence of Donald Rumsfeld, aspartame might never have been approved for human use at all.

Donald Rumsfeld, who at one time aspired to be Reagan's running mate, was a member of the new president's transition team. Part of the team's duties involved the selection of a new FDA Commissioner. Rumsfeld et al. chose a pharmacist with no experience in food additive science to lead the agency.

On January 21, 1981, Ronald Reagan’s first full day as president, the G.D. Searle company, headed by Donald Rumsfeld, reapplied for FDA approval of aspartame. That same day, in one of his first official acts, President Reagan issued an executive order that rescinded much of the FDA commissioner’s power.

In April 1981, newly appointed FDA commissioner Arthur Hull Hayes, Jr., put together a five-person panel tasked with reevaluating the agency's 1975 decision not to approve aspartame as a food additive. At first, the panel voted 3-2 to uphold non-approval of the chemical sweetener. Hayes then invited another member onto the official FDA panel, and the vote was retaken. The panel deadlocked, and Hayes contributed his own vote to break the tie. Two months later, the product that the FDA had refused to approve for seven long years was suddenly approved for human consumption.

Four years later, in 1985, the Monsanto corporation bought G.D. Searle and established a separate division, The NutraSweet Company, to manage the sales and public relations of one of its best-selling and most profitable products. It may be worth noting that when Monsanto purchased Searle and the patent on aspartame, Donald Rumsfeld reportedly received a fat $12 million bonus.

Before reading this, how much did you know about the origins of aspartame? If you’re like most Americans, the answer is “not much.” And, if you’re like many Americans, your interest in a story such as this one will wane as soon as the next hot topic comes along. Perhaps this is the reason that there has been little if any public outcry regarding aspartame or the weird way that it received FDA approval.

Updates:

In 1987, UPI investigative journalist Gregory Gordon reported that Dr. Richard Wurtman, a neuroscientist at the Massachusetts Institute of Technology and a die-hard supporter of aspartame during its 1981 rush to approval, had reversed his position on the sweetener. Gordon quoted the once-ardent supporter as saying his views had evolved along with scientific studies and his increased skepticism of industry research standards.

In 1997, the U.K. government obliged makers of sweetened food to prominently include the words "with sweeteners" on product labels. Ten years later, U.K. supermarket chain Marks & Spencer announced the end of artificial sweeteners and coloring in its chilled goods and bakery departments, according to the Daily Mail.

In his well-received 2007 book, Rumsfeld: His Rise, Fall and Catastrophic Legacy, author Andrew Cockburn described the results of the 1977 FDA task force that found "falsifications and omissions" in Searle's research data. The New York Times called Cockburn's biographical tome "quite persuasive."

In 2009, Woolworths, a South African retailer, announced that it would no longer brand products containing aspartame.

On February 28, 2010, Dr. Arthur Hull Hayes, Jr., the FDA Commissioner who hurried aspartame to market and later squelched public fear of Tylenol during the 1982 poisoning scare, died in Connecticut. According to his New York Times obituary, Hayes was employed as president of E. M. Pharmaceuticals after his term at the FDA. He succumbed to leukemia at age 76.

A 2011 report in the Huffington Post noted that 10,000 American consumers notified the FDA about the ill effects of aspartame between 1981 and 1995. According to the article, the use of aspartame elicited more complaints than any other product in history, comprising 75 percent of complaints received by the U.S. Food and Drug Administration.

In 2013, the EFSA (the European Union's counterpart to the FDA) reiterated its claim that aspartame is harmless. Professor Erik Millstone responded with his own reevaluation of aspartame, in which he noted that every study the EFSA used to approve aspartame was funded by the same industry that manufactures and profits from the controversial sweetener.

Dr. John W. Olney passed away at the age of 83 on April 14, 2015. In addition to his campaigns against aspartame and glutamate, the doctor devoted half a century of his life to finding a cure for multiple sclerosis, the crippling neurological disease that claimed his own sister when she was 16. According to the St. Louis Post-Dispatch, the causes of death of the pioneering brain researcher included complications of ALS, a neurological disorder more commonly known as Lou Gehrig's Disease.

Those interested in learning more about the approval of aspartame are invited to read the FDA Commissioner’s Final Report, published by the Department of Health and Human Services on July 24, 1981. A detailed version of the aspartame timeline is available at Rense.com.

On the Drug War, and Other “Mistakes”


By Kevin Carson

Source: Center for a Stateless Society

In a new article at Harper’s (“Legalize It All,” April 2016), Dan Baum recalls a 1994 confession by former Nixon domestic policy adviser John Ehrlichman about Nixon’s motives in first launching the War on Drugs. Baum, interviewing Ehrlichman for a book on drug prohibition, asked a “series of earnest, wonky questions that he impatiently waved away”:

“The Nixon campaign in 1968, and the Nixon White House after that, had two enemies: the antiwar Left, and black people…. We knew we couldn’t make it illegal to be either against the war or black. But by getting the public to associate the hippies with marijuana and blacks with heroin, and then criminalizing both heavily, we could disrupt those communities. We could arrest their leaders, raid their homes, break up their meetings, and vilify them night after night on the evening news. Did we know we were lying about the drugs? Of course we did.”

Judged by those objectives, Nixon’s War on Drugs and its subsequent dramatic escalation under Reagan have been resounding successes.

Many liberals, unfortunately, are prone to describing the War on Drugs as a “failure” — much as the Vietnam or Iraq War was “a mistake” — implicitly accepting the general goals of the American state as good and well-meaning, and merely unfortunate in their execution. The liberals who frame the wars in this way, as Noam Chomsky has argued, share the hawks’ view that “America owns the world” and has the right to define as a “threat” any country that defies its authority or attempts to undermine the global corporate order. And liberals and progressives are nauseatingly prone to referring to criminal foreign wars of aggression and domestic police wars on civil society as something “we” did.

But if you genuinely think the actions of the American state have anything to do with “we” or “us,” either you belong to the economic classes served by the state, or you probably still ask the dentist to save your extracted molars to put under your pillow.

Long before I saw Ehrlichman’s admission, I noted that the expanded War on Drugs against crack and meth under Reagan and Clinton had had a disruptive effect on two of the demographic groups (inner city black people and rural poor whites) that, as it happens, are least socialized to cheerfully accept direction from authority figures behind desks.

Going back to the passage of the Virginia Slave Code after the defeat of Bacon’s Rebellion, running through the use of racial divisions to split and defeat the southern tenant farmers’ unions, and right up to the present, the possibility of a strategic political alliance between poor black and white people has been one of the major fears of the propertied classes who control the American state.

So whether it be Nixon’s or Reagan’s War on Drugs, or the Clintons’ support for a Crime Bill (to “bring to heel” so-called black “super-predators”) that completed America’s growth into the largest carceral state in the world, the fact that a third of the urban black male population is in some phase of the “criminal justice” system and deprived of the franchise has had an enormous effect on radical political possibilities in this country. It has gone a long way towards nullifying the effects of the Voting Rights Act, in much the same way that Black Codes nullified the effects of Emancipation. Jeb Bush’s purge of 70,000 alleged “felons” — mostly not felons, but mostly black — from the Florida voting rolls was the main factor in handing the presidency to his brother.

I’m not, by the way, the kind of conspiracist who thinks every government policy fits into some larger, malign strategy that serves as the “real” motivation for all officials. I don’t doubt a great deal of legislation and executive action is intended as a good faith response to the stated concerns of policy-makers. Of course even such “well-meaning” policies are subject to the law of unintended consequences, mission creep, refusal to reassess in response to feedback on their effectiveness, and abusive or self-dealing execution by the bureaucracies tasked with enforcement.

But even when policies are sincerely “well-meaning,” they still tend to serve vested interests through a sort of structural “invisible hand” effect. The “well-meaning” policies that get passed are those that structurally benefit the economic ruling class, and those that get repealed are those that no longer do so.

The state does not represent “us,” and the destructive and genocidal effects of its policies are not “mistakes.”

A Brief History of Nicaragua


To have even a basic understanding of Nicaraguan culture, it’s important to first know a little about the land’s history. In the pre-Columbian era, the region now called Nicaragua was inhabited by several tribes culturally related to the Aztec and Maya civilizations. Not long after Christopher Columbus first reached Nicaragua in 1502, Gil González Dávila attempted to conquer the region in the 1520s. On April 17, 1523, Dávila first met with Cacique Diriangen, leader of the Dirian peoples. Dávila gave the tribe a three-day deadline to become Christians, but rather than comply, Diriangen led an attack, making him the first known resistance fighter of Nicaragua.

A statue of Diriangen at the entrance to the town of Diria.


During over 300 years of colonization, countless indigenous people died of diseases, rival conquistadors waged war on each other, Caribbean pirates raided cities along Lake Nicaragua, British forces fought the Spanish in Nicaragua during a sub-conflict of the Seven Years’ War, and in 1610 Momotombo volcano erupted, destroying the old capital city of León.

In 1838, Nicaragua became an independent republic. Within a few decades, during a power struggle between León and Granada, the filibuster William Walker was hired by the government of León to fight on their side, but he exploited the region’s instability and briefly established himself as President of Nicaragua before being forced out of the country a few years later. Three decades of conservative rule followed, during which the U.S. began formulating plans to build a canal across Nicaragua (which may soon become a reality with funding from Chinese corporations). However, when the U.S. shifted its plans to Panama, President Jose Santos Zelaya attempted to negotiate with European partners. Because of the potential threat Zelaya posed to U.S. hegemony and his ambitions to unite the Central American nations, the U.S. government compelled him to resign with the threat of military force and the funding of conservative opposition groups, replacing him with a series of puppet regimes.

To forestall further insurrection, U.S. Marines occupied Nicaragua from 1912 to 1933. From 1927 (the start of Somoza’s rise to power through the National Guard), national hero Augusto César Sandino led a guerrilla war against the conservative government and the U.S. Marines. Shortly after a peace agreement was reached with the newly elected Sacasa administration, the Marines left Nicaragua, and the head of the National Guard, Anastasio Somoza García, ordered Sandino’s assassination. Sandino was killed by National Guard troops on February 21, 1934. His body was hidden and never found. In 1937 Somoza ousted the Sacasa government in a rigged election.

A statue of Sandino at the Augusto C. Sandino Library, a museum located in the house where he grew up in the town of Niquinohomo (Valley of the Warriors).


The Somoza regime was Nicaragua’s longest-lasting hereditary military dictatorship, having ruled for 43 years. The father of the dynasty, Anastasio Somoza García, was famously called “our son of a bitch” by FDR and was assassinated by the 27-year-old poet Rigoberto López Pérez in León in 1956. In response to the increasingly corrupt and reactionary policies of the Somoza government, Carlos Fonseca, Silvio Mayorga, and Tomás Borge led the formation of the Sandinista National Liberation Front (Frente Sandinista de Liberación Nacional, or FSLN, named after and inspired by Augusto Sandino) in 1961. In 1972 a major earthquake hit Managua, killing 6,000 people, injuring tens of thousands, and leaving hundreds of thousands homeless. President Anastasio Somoza Debayle mishandled the situation by failing to distribute essential aid and supplies. When it was later revealed that the government was siphoning relief money for personal gain, the popularity and membership of the FSLN greatly increased. Hundreds of Chilean refugees also joined their ranks after the CIA-backed 1973 coup that overthrew Chilean president Salvador Allende and installed the dictator Augusto Pinochet.

A display at the Carlos Fonseca Museum in Matagalpa.


When Pedro Joaquin Chamorro, editor of the national newspaper and a critic of Somoza, was assassinated by the government on January 10, 1978, a mass insurrection was triggered. By the end of summer, armed youths had taken over Matagalpa while factions of the FSLN and civilian recruits had the National Guard under siege in Managua, Masaya, León, Chinandega, and Estelí. On July 19, 1979, FSLN forces entered the capital and officially assumed power. Just two days before, Anastasio Somoza Debayle had resigned and fled to Miami. He was killed a year later, while in exile in Paraguay, by a rocket attack carried out by members of the Argentinian Revolutionary Workers’ Party.

Though the Sandinista government inherited a country in ruins and over a billion dollars in debt, they had an ambitious platform which included:

  • nationalization of property owned by the Somozas and their supporters
  • improved rural and urban working conditions
  • free unionization for all workers, both urban and rural
  • price fixing for commodities of basic necessity
  • improved public services, housing conditions, education
  • abolition of torture, political assassination and the death penalty
  • protection of democratic liberties
  • equality for women
  • non-aligned foreign policy

The Sandinistas had early successes with their education and literacy programs but were soon hindered by emerging conflicts with counter-revolutionary Contra forces heavily financed, armed, and trained by the CIA. Investigations into the Iran-Contra scandal revealed some of the funding was acquired through arms sales to Iran and drug shipments to U.S. inner cities (read Gary Webb’s Dark Alliance for more about this). Despite strong support for the opposition by the U.S., the FSLN’s Daniel Ortega won the 1984 elections. Less than a year later the Reagan administration implemented a complete embargo on U.S. trade with Nicaragua that would last five years. By the late 80s, the continuing Contra campaign was notorious for human rights violations, corruption, and terrorism. In August 1987, Costa Rican president Oscar Arias Sanchez created a peace accord which led to a ceasefire signed by Contra and Sandinista representatives a year later. Disillusioned by conflict and economic strife (made worse by Reagan’s embargo), Nicaraguan voters elected conservative administrations throughout the 1990s and early 2000s, but seeing little improvement and much corruption, they reelected FSLN member Daniel Ortega in 2006 and 2011. So far there have not been the radical reforms that corporate investors feared and that more radical liberals hoped for, but Ortega has maintained a skepticism towards capitalism while simultaneously maintaining relations with the U.S. and rivals such as Iran, Libya, and Venezuela.

As for how the average Nicaraguan feels about their current situation, opinions seem to vary but I plan to share some of the impressions I got in a future post.