UMass Amherst Scientist Says New Study Challenges How Regulators Determine Risk

Dec. 21, 2006

AMHERST, Mass. – A new study of a large U.S. National Cancer Institute database provides the strongest evidence yet that a key portion of the traditional dose-response model used in drug testing and risk assessment for toxins is wrong when it comes to measuring the effects of very low doses, says Edward J. Calabrese, a scientist at the University of Massachusetts Amherst. The findings, based on a review of more than 56,000 tests in 13 strains of yeast using 2,200 drugs, are published in the journal Toxicological Sciences and offer strong backing for the theory of hormesis, Calabrese and his colleagues contend.

Calabrese says the size of the new study and the preponderance of evidence supporting hormesis, a dose-response phenomenon in which low doses have the opposite effect of high doses, amount to a breakthrough that should help scientists assess and predict risks from new drugs, toxicants and possibly carcinogens. “This is a fundamental biological principle that has been missed,” Calabrese says.

Calabrese says that the field of toxicology got the dose response wrong in the 1930s and this mistake has infiltrated all regulations for low-dose exposures for toxic chemicals and drugs. These low-dose effects can be beneficial or harmful, something that the regulations miss because they are currently based on high-dose testing schemes that differ greatly from the conditions of human exposures.

In this latest study, which uses data from a large and highly standardized National Cancer Institute tumor-drug screening database, Calabrese says the evidence of hormesis is overwhelming. In the study, high doses of anticancer drugs frequently inhibit yeast growth, but at low doses they enhance growth, exactly what the hormesis model predicts.

Whether one accepts the hormesis theory is not the critical public policy issue, according to Calabrese. He says that the major issue is that the risk assessment models used by the federal Environmental Protection Agency and the Food and Drug Administration fail to accurately predict responses in the low-dose zone, that is, where people live most of their daily lives.

Calabrese also says challenging the existing dose-response model has profound public policy and health implications. “I believe the hormesis model is the fundamental dose-response and government testing and risk assessment procedures should reflect that,” Calabrese says. For example, in environmental regulations, it has been assumed that most carcinogens pose real or theoretical risks at low levels, and therefore must be nearly completely removed from the environment to assure public safety. Some would contend that if hormesis is the correct model for very low levels, cleanup standards may have to be significantly changed. Others, however, see the evidence as insufficient for such radical change and worry about other factors that can influence the effects of chemicals at low doses. The new study promises to add fuel to the debate, Calabrese says.

Calabrese also suggests that the findings may have important implications for the pharmaceutical industry and medical practices. He says that hormesis is likely to identify new life-saving drugs that were missed through traditional testing and to markedly improve the accuracy of patient dosing, which will not only improve health outcomes but also reduce adverse side effects.
Go Here for the complete article from the Toxicological Sciences journal.

Gazette Fair Use Statement

Drinking Water Week Begins


Posted May 8th, 2012

American Water Works Association Announces the Start of Drinking Water Week, to “Celebrate the Essential.”

The American Water Works Association (AWWA) kicked off Drinking Water Week 2012 with a call to “Celebrate the Essential” throughout North America, according to a press release.

This is the 35th Annual Drinking Water Week celebration, and it is obviously hard to come up with original slogans, so “Celebrate the Essential” will have to do.

Throughout the week, AWWA and its partners will celebrate water by recognizing the essential role drinking water plays in our daily lives, with special attention to water infrastructure, the economy and careers in the water profession, stated the release.

“There is nothing more essential to a community’s health and vitality than reliable access to safe drinking water,” said AWWA Executive Director David LaFrance, stating the obvious.  “Drinking Water Week provides an excellent moment to focus on the importance of caring for our water supplies and systems.”

To read the entire press release, click here.


What Kind of Water Makes the Best Tasting Coffee?

By Hardly Waite, Pure Water Gazette Senior Editor

What kind of water makes the best tasting coffee? Distilled? Softened? Reverse Osmosis? Filtered? Spring water? Rain water? We did some research and decided to reprint a clip from an interesting piece on the subject from TheCoffeeBrewers website at http://www.thecoffeebrewers.com/whisbewaforb.html.

The article below states one opinion. There are others.

The short version of this article is that minerals are necessary to bring out the flavor in coffee but not in espresso. Therefore, un-softened water (what they mean is non-distilled or non-RO water, since softening is really a different issue) is better for coffee, and distilled water (or RO water) would be better for espresso.

This, as I said, is one opinion, and a simple web search will get you many opinions, some of which go much deeper into the matter than you probably want to go, specifying the dissolved solids count (one source insists that 150 to 200 ppm is ideal), pH (neutral often preferred, but hard to maintain), alkalinity, and even the Langelier Index.

One sensible suggestion would be that removing the chlorine or chloramines used to disinfect the water certainly won’t hurt the taste, so carbon filtration would be an obvious plus for all coffee water. Carbon does not affect the mineral content of the water.

Here’s what TheCoffeeBrewers has to say:

What is the Best Water for Brewing Coffee or Espresso?

Did you ever notice how salt will “bring out” the flavor in food (which is why professionally prepared restaurant food tends to be salty)? On the other hand, have you noticed how salt (and other minerals, particularly calcium) will build up on shower walls and plumbing fixtures?

When you prepare coffee or espresso, you need to be aware of the mineral content in the water that you are using. Since the preparation of (American) coffee and espresso are predicated on very different extraction techniques, the “best” water is different for coffee than it is for espresso.

To review (or in case you weren’t aware), the flavor in coffee is mostly contained in the oils within the beans. Brewing coffee or espresso is a matter of extracting these flavors from the beans (the coffee grounds) so that they permeate the water.

The preparation of plain coffee is a steeping process, almost exactly like tea. The coffee grounds (coarse grounds work better for plain coffee) are mixed with near-boiling water. The heat and minerals in the water work together to extract the flavor from the coffee. After a short steeping period, the grounds are strained out of the mixture (via a filter), leaving the beverage known as “coffee.”

To get a flavorful coffee, there must be mineral content in the water. If the water is distilled, or if it has been softened too much (softening is the process of removing minerals), the extraction will be weak, and the beverage will be relatively flavorless – as food can be if no salt is used.

On the other hand, espresso extraction is a very different process that does not require minerals, and in which near-boiling temperatures are actually detrimental. For espresso, a more finely ground coffee is first compressed into a “puck” through which water will not pass easily. Ideally, immediately prior to extraction, the puck is pre-wet (both to begin dissolution, and to make the density within the puck uniform, so that the extraction will also be uniform).

Then, hot water (195-200 degrees Fahrenheit) is rapidly pushed through the puck under pressure. Ideally, the pressure should be in the 10-15 bar range (1 bar is about 14.5 pounds per square inch), and the extraction time should be 20-25 seconds, maximum. (A longer extraction will result in a bitter and burnt flavor.)
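
As a quick sanity check on those numbers, here is a minimal sketch in code (the function names and structure are ours, purely illustrative; the conversion factor used is 1 bar ≈ 14.5 psi):

```python
# Quick arithmetic on the espresso figures quoted above
# (illustrative only; names and helper functions are ours).

PSI_PER_BAR = 14.5  # 1 bar is about 14.5 pounds per square inch


def bar_to_psi(bar: float) -> float:
    """Convert a pressure in bar to pounds per square inch."""
    return bar * PSI_PER_BAR


def in_espresso_range(pressure_bar: float, seconds: float) -> bool:
    """Check a shot against the 10-15 bar, 20-25 second window."""
    return 10 <= pressure_bar <= 15 and 20 <= seconds <= 25


print(bar_to_psi(15))             # 217.5 -- the top of the quoted range
print(in_espresso_range(12, 23))  # True
print(in_espresso_range(12, 40))  # False: over-extracted, bitter
```

So a machine running at the high end of the range is pushing water through the puck at well over 200 psi, which is why mineral buildup and pressure drift matter so much further down in the article.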

In this kind of extraction, since the water is forced through the puck very rapidly (each water molecule moves through the puck in a fraction of a second), the water is not in contact with the coffee long enough for the minerals (in the water) to play much of a role in the extraction.

Also, for those of you who have taken some Chemistry, you may remember the ideal gas law: PV = nRT. While we are dealing with fluids in this case, note that Pressure (P) and Temperature (T) are on opposite sides of the equation. Since we do espresso extraction under (relatively) high pressure, we do not need a boiling temperature.

In fact, water that is too hot will over-extract the espresso, resulting in a bitter flavor. Moka pots (stovetop brewers) tend to make bitter brews because the water is at steam temperature and the pressure is too low, so the extraction runs too long.

Therefore, minerals in water will not enhance the flavor of espresso. On the other hand, minerals will build up on the inner surfaces (the boiler, the internal tubes, and the portafilter) of the espresso machine. This buildup will alter the pressure within the machine, and it will corrode the internals of the machine.

As the pressure gradually changes, the uniformity of the extraction (and hence the flavor) of the espresso will drift over time. The added pressure will also cause the internal pumps and gaskets to wear out quickly. By far, the one thing that is most detrimental to an espresso machine is mineral buildup.

This is why it is so important to do periodic cleaning of an espresso machine (as per the manufacturer’s recommendations) with a de-scaling agent. In addition, it is best if you use distilled water, or at least a water softener. For commercial machines (which will see heavy use), an in-line water softener is essential.

While drip coffee-makers will also get mineral buildup, and should be de-scaled occasionally, this is just so that the water will flow (at all) through the machine. Since no pressure is involved in the brewing of drip coffee, mineral buildup will not damage a drip coffee-maker the way that it will destroy an espresso machine. If you have an expensive espresso machine, it is imperative to keep it clean.

For plain coffee, a minimum mineral content of 150-200 parts per million is essential to a good extraction. Water softer than this will result in weak and flavorless coffee. For espresso, you should use distilled water. If the espresso machine is connected to the building plumbing, an in-line water softener (to remove the minerals) is essential.

TheCoffeeBrewers
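
The article's rules of thumb can be condensed into a few lines of code (a sketch only: the 150 ppm minimum for coffee is the article's figure, while the 50 ppm cutoff for espresso is our arbitrary stand-in for "distilled or softened"):

```python
def water_advice(tds_ppm: float, beverage: str) -> str:
    """Apply the article's rule of thumb, given total dissolved
    solids (TDS) in parts per million."""
    if beverage == "espresso":
        # Minerals do not help espresso flavor and they scale the
        # machine; 50 ppm here is an arbitrary illustrative cutoff.
        return "ok" if tds_ppm < 50 else "soften or use distilled water"
    if beverage == "coffee":
        # The article calls 150-200 ppm essential for good extraction.
        return "ok" if tds_ppm >= 150 else "too soft: weak, flavorless brew"
    raise ValueError(f"unknown beverage: {beverage!r}")


print(water_advice(180, "coffee"))    # ok
print(water_advice(180, "espresso"))  # soften or use distilled water
print(water_advice(5, "espresso"))    # ok
```

Note that the same 180 ppm water gets opposite advice depending on the beverage, which is the article's whole point.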

And what about tea?

The “English Tea” website says:

The Best Water for Making Tea

A cup of tea is over 99% water, so it is hardly surprising that the quality of the water used is critical to the flavour of the tea. Fine teas are especially sensitive to the type of water used.

The best water for making a cup of tea is low in mineral content, free of contamination and additives and high in oxygen content. These factors can all influence the taste of tea, so a good test is to try the water before you use it to make your brew. If the water tastes good, then it’s fine to use. If the water is tainted in any way, it’s best not to use it.

After that, the site says, the way you boil the water is of extreme importance:

Re-using water in your kettle that has already been pre-boiled is not a good idea if you want to make a perfect cup of tea. Most experts agree that you should never re-boil previously boiled water, or boil the water for too long. As water boils, oxygen is driven out and the more it boils, the less oxygen stays in the water. Water that has already been boiled, like the water that usually sits in your kettle, contains much less oxygen than fresh water. Tea made with water that has depleted oxygen content loses its crisp, fresh taste.

The World’s First Road Death

Gazette Introduction:  There is always information about who was the last soldier to die in a war or the first baby born in a new year.  We thought it interesting to find out something about the very first person to die in an auto accident.  Now you know.

The Victim

On August 17, 1896, Bridget Driscoll became the first road fatality in the world.

She was a 44-year-old mother of two who had come to London with her teenage daughter and a friend to watch a dancing display.

The Crash

While the driver was reported to be doing 4 mph, witnesses described her as being hit by a car travelling at “tremendous speed.”

The crash occurred on a terrace in the grounds of Crystal Palace in London.

The Vehicle

The car was owned by the Anglo-French Motor Car Company, which was offering demonstration rides to the public.

The Driver

At the time of the crash, the car was being driven by Arthur Edsell, an employee of the company.

He had been driving for only 3 weeks (no driving tests or licenses existed at that time).

He had apparently tampered with the belt, causing the car to go at twice the intended speed.

He was also said to have been talking to the young lady passenger beside him.

The Inquest

After a six-hour inquest, the jury returned a verdict of “Accidental Death.”

At the inquest, the Coroner said “This must never happen again.”

No prosecution was proposed or brought against the driver or the company.

The Aftermath

* It has happened again and again: worldwide, over 1 million people are killed each year in road crashes and countless millions are injured.
* Five times as many people are killed on the roads in the UK as are murdered (yet traffic safety is not a core function of the police).
* More people died on the roads in the UK during the blackouts than in combat.
* While there has been a substantial reduction in those reported killed and seriously injured on the road in the UK, road crashes are still the leading cause of death and acquired disability in the UK for those between 5 and 40 years old.
* Over half of all road deaths in London are pedestrians.
* One in 80 EU residents is expected to die 40 years prematurely due to a road crash.
* Official casualty statistics underestimate the human casualty toll because they rely on police rather than hospital statistics; road casualties and crashes are not required to be reported to the police.

Ian Roberts, Professor of Epidemiology and Public Health, said about the epidemic of road death and injury: “…it is unusual to encounter a serious analysis of road danger in national news media. By 2020, road crashes will have moved from ninth to third place in the world disease ranking … if we overlook this carnage, it will be the propaganda coup of the new millennium.”
Editor’s Note: This article is reprinted from the British website http://www.roadpeace.org/.

Gazette’s Fair Use Policy

The Ambiguities of “Cut and Run”

By Thomas Michael Holmes

Mr. Holmes is a historian at the University of California San Diego and a writer for the History News Service.

Gazette Note:  We’ve included this article in our Heroes category in honor of the brave political leaders like President Eisenhower who have had the courage, and the good sense, to “cut and run.”  Long live Archilochus. –Hardly Waite.

Karl Rove’s recent “cut and run” accusations against the Bush administration’s Democratic opponents ought to be answered. What one person sees as “cut and run” might be seen by another person as a responsible decision; it’s all in the eye of the beholder. Let’s examine some relevant recent history.

Did Dwight D. Eisenhower “cut and run” in Korea in 1953? It was Ike who told the nation that if he were elected he would go to Korea and, by implication, end the war. It is generally conceded that Eisenhower did the responsible thing when he quickly completed the truce negotiations that ended the fighting in Korea.

Would Harry Truman have been accused of “cut and run” in September 1950, three months after the initial invasion of South Korea, had he accepted the status quo ante bellum following the rout of the overextended North Korean forces at the 38th parallel? Instead, Truman followed the advice of General Douglas MacArthur and elected to “liberate” North Korea. As the United Nations forces approached the border of the People’s Republic of China at the Yalu River, communist China entered the war and almost drove the UN forces off the southern tip of the Korean peninsula.

Had Truman been willing to “cut and run,” tens of thousands of American lives might have been saved and North Korea might not have been condemned to the isolation it has experienced ever since. In the end, the war lasted for another three years. America sent 1.8 million of our own into the fray: 54,200 were killed, 103,300 were wounded and 8,200 were listed as missing in action. We ended up at the 38th parallel, right where we were in September 1950 — and where we remain today.

Did Richard Nixon “cut and run” in Vietnam? Who can forget the television footage of the American embassy in Saigon being evacuated by helicopter in 1975 as we left those Vietnamese who had depended upon us to the tender mercies of the North Vietnamese communists? They might feel, with some justification, that America had “cut and run.”

Yet in retrospect, it appears that the responsible thing for Nixon to have done in 1969, when he first entered the White House, would have been to follow the example of President Eisenhower and pull the plug on the Vietnam War. It is worth remembering that almost half the 58,000 Americans killed in Vietnam died during Nixon’s presidency.

The real mistake during what we call the Vietnam War was Lyndon Johnson’s, when he escalated the war after the bogus Tonkin Gulf Resolution. An even greater mistake, made at the end of World War II, was to have allowed the French to reestablish their colonial rule throughout Indochina after the Allied forces had liberated it from Japanese occupation. It was the fall of French colonial rule in 1954 that triggered America’s disastrous involvement in Vietnam.

Did Ronald Reagan “cut and run” in 1983 after 241 American servicemen died in Beirut in the suicide bombing of the Marine barracks? Some would say that it wasn’t the fact that Reagan pulled the American troops out of Lebanon that was the mistake; the real mistake was the fact that those Americans were put into an untenable position in the first place.

Did President George Herbert Walker Bush “cut and run” after the coalition’s qualified victory in the First Gulf War in 1991? The Shiites of southern Iraq might say so. The elder Bush not only pulled out of Iraq, but on the way out he invited the Shiites to overthrow their repressive dictator, Saddam Hussein. Then, when they attempted to do so, American forces stood by and watched while Saddam’s army ripped the Shiites to shreds.

It’s ironic that the elder Bush, the current president’s father, would later explain that he didn’t intervene because he didn’t want the U.S. to become bogged down in an Iraqi civil war. He didn’t have to. American air power, deployed outside of Iraq, could have destroyed Saddam’s army, just as American planes, deployed outside of Iraq, recently killed the insurgent leader Abu Musab al-Zarqawi.

This reminds us of another part of Karl Rove’s recent statement, that if the United States had “cut and run” in Iraq, Zarqawi would still be there plotting against us. But it was Jordanian and Iraqi intelligence that tracked and located Zarqawi, allowing for the successful American strike, with aircraft based outside of Iraq.

One might also argue that the decision of the Bush administration to re-deploy American forces from Afghanistan to Iraq constituted a “cut and run” decision that has seriously jeopardized the chances for the success of that mission.

Charges of “cut and run” have been leveled over the years by politicians on both sides of the aisle. Upon closer examination, it turns out to be a blunt rhetorical instrument that tends to obscure, rather than illuminate, difficult decisions in complex situations.

Reprinted courtesy of the History News Service.

Surprise: The Very Dark Side of U.S. History

by Peter Dale Scott and Robert Parry

October 2010

Editor’s Note: Many Americans view their country and its soldiers as the “good guys” spreading “democracy” and “liberty” around the world. When the United States inflicts unnecessary death and destruction, it’s viewed as a mistake or an aberration.

In the following article Peter Dale Scott and Robert Parry examine the long history of these acts of brutality, a record that suggests they are neither a “mistake” nor an “aberration” but rather conscious counterinsurgency doctrine on the “dark side.”

There is a dark — seldom acknowledged — thread that runs through U.S. military doctrine, dating back to the early days of the Republic.

This military tradition has explicitly defended the selective use of terror, whether in suppressing Native American resistance on the frontiers in the 19th Century or in protecting U.S. interests abroad in the 20th Century or fighting the “war on terror” over the last decade.

The American people are largely oblivious to this hidden tradition because most of the literature advocating state-sponsored terror is carefully confined to national security circles and rarely spills out into the public debate, which is instead dominated by feel-good messages about well-intentioned U.S. interventions abroad.

Over the decades, congressional and journalistic investigations have exposed some of these abuses. But when that does happen, the cases are usually deemed anomalies or excesses by out-of-control soldiers.

But the historical record shows that terror tactics have long been a dark side of U.S. military doctrine. The theories survive today in textbooks on counterinsurgency warfare, “low-intensity” conflict and “counter-terrorism.”

Some historians trace the formal acceptance of those brutal tenets to the 1860s when the U.S. Army was facing challenge from a rebellious South and resistance from Native Americans in the West. Out of those crises emerged the modern military concept of “total war” — which considers attacks on civilians and their economic infrastructure an integral part of a victorious strategy.

In 1864, Gen. William Tecumseh Sherman cut a swath of destruction through civilian territory in Georgia and the Carolinas. His plan was to destroy the South’s will to fight and its ability to sustain a large army in the field. The devastation left plantations in flames and brought widespread Confederate complaints of rape and murder of civilians.

Meanwhile, in Colorado, Col. John M. Chivington and the Third Colorado Cavalry were employing their own terror tactics to pacify Cheyennes. A scout named John Smith later described the attack at Sand Creek, Colorado, on unsuspecting Indians at a peaceful encampment:

“They were scalped; their brains knocked out; the men used their knives, ripped open women, clubbed little children, knocked them in the head with their guns, beat their brains out, mutilated their bodies in every sense of the word.” [U.S. Cong., Senate, 39 Cong., 2nd Sess., “The Chivington Massacre,” Reports of the Committees.]

Though Smith’s objectivity was challenged at the time, today even defenders of the Sand Creek raid concede that most women and children there were killed and mutilated. [See Lt. Col. William R. Dunn, I Stand by Sand Creek.]

Yet, in the 1860s, many whites in Colorado saw the slaughter as the only realistic way to bring peace, just as Sherman viewed his “march to the sea” as necessary to force the South’s surrender.

The brutal tactics in the West also helped clear the way for the transcontinental railroad, built fortunes for favored businessmen and consolidated Republican political power for more than six decades, until the Great Depression of the 1930s. [See Consortiumnews.com’s “Indian Genocide and Republican Power.”]

Four years after the Civil War, Sherman became commanding general of the Army and incorporated the Indian pacification strategies — as well as his own tactics — into U.S. military doctrine. Gen. Philip H. Sheridan, who had led Indian wars in the Missouri territory, succeeded Sherman in 1883 and further entrenched those strategies as policy. [See Ward Churchill, A Little Matter of Genocide.]

By the end of the 19th Century, the Native American warriors had been vanquished, but the Army’s winning strategies lived on.

Imperial America

When the United States claimed the Philippines as a prize in the Spanish-American War, Filipino insurgents resisted. In 1900, the U.S. commander, Gen. J. Franklin Bell, consciously modeled his brutal counterinsurgency campaign after the Indian wars and Sherman’s “march to the sea.”

Bell believed that by punishing the wealthier Filipinos through destruction of their homes — much as Sherman had done in the South — they would be coerced into helping convince their countrymen to submit.

Learning from the Indian wars, he also isolated the guerrillas by forcing Filipinos into tightly controlled zones where schools were built and other social amenities were provided.

“The entire population outside of the major cities in Batangas was herded into concentration camps,” wrote historian Stuart Creighton Miller. “Bell’s main target was the wealthier and better-educated classes. … Adding insult to injury, Bell made these people carry the petrol used to burn their own country homes.” [See Miller’s “Benevolent Assimilation.”]

For those outside the protected areas, there was terror. A supportive news correspondent described one scene in which American soldiers killed “men, women, children … from lads of 10 and up, an idea prevailing that the Filipino, as such, was little better than a dog. …

“Our soldiers have pumped salt water into men to ‘make them talk,’ have taken prisoner people who held up their hands and peacefully surrendered, and an hour later, without an atom of evidence to show they were even insurrectos, stood them on a bridge and shot them down one by one, to drop into the water below and float down as an example to those who found their bullet-riddled corpses.”

Defending the tactics, the correspondent noted that “it is not civilized warfare, but we are not dealing with a civilized people. The only thing they know and fear is force, violence, and brutality.” [Philadelphia Ledger, Nov. 19, 1900]
In 1901, anti-imperialists in Congress exposed and denounced Bell’s brutal tactics. Nevertheless, Bell’s strategies won military acclaim as a refined method of pacification.

In a 1973 book, one pro-Bell military historian, John Morgan Gates, termed reports of U.S. atrocities “exaggerated” and hailed Bell’s “excellent understanding of the role of benevolence in pacification.”

Gates recalled that Bell’s campaign in Batangas was regarded by military strategists as “pacification in its most perfected form.” [See Gates’s Schoolbooks and Krags: The United States Army in the Philippines, 1898-1902.]

Spreading the Word

At the turn of the century, the methodology of pacification was a hot topic among the European colonial powers, too. From Namibia to Indochina, Europeans struggled to subdue local populations.

Often outright slaughter proved effective, as the Germans demonstrated with massacres of the Herero people in Namibia from 1904 to 1907. But military strategists often compared notes about more subtle techniques of targeted terror mixed with demonstrations of benevolence.

Counterinsurgency strategies were back in vogue after World War II as many subjugated people demanded independence from colonial rule and Washington worried about the expansion of communism. In the 1950s, the Huk rebellion against U.S. dominance made the Philippines again the laboratory, with Bell’s earlier lessons clearly remembered.

“The campaign against the Huk movement in the Philippines … greatly resembled the American campaign of almost 50 years earlier,” historian Gates observed. “The American approach to the problem of pacification had been a studied one.”

But the war against the Huks had some new wrinkles, particularly the modern concept of psychological warfare or psy-war.

Under the pioneering strategies of the CIA’s Maj. Gen. Edward G. Lansdale, psy-war was a new spin to the old game of breaking the will of a target population. The idea was to analyze the psychological weaknesses of a people and develop “themes” that could induce actions favorable to those carrying out the operation.

While psy-war included propaganda and disinformation, it also relied on terror tactics of a demonstrative nature. An Army psy-war pamphlet, drawing on Lansdale’s experience in the Philippines, advocated “exemplary criminal violence — the murder and mutilation of captives and the display of their bodies,” according to Michael McClintock’s Instruments of Statecraft.

In his memoirs, Lansdale boasted of one legendary psy-war trick used against the Huks who were considered superstitious and fearful of a vampire-like creature called an asuang.

“The psy-war squad set up an ambush along a trail used by the Huks,” Lansdale wrote. “When a Huk patrol came along the trail, the ambushers silently snatched the last man on the patrol, their move unseen in the dark night. They punctured his neck with two holes, vampire-fashion, held the body up by the heels, drained it of blood, and put the corpse back on the trail.

“When the Huks returned to look for the missing man and found their bloodless comrade, every member of the patrol believed the asuang had got him.” [See Lansdale’s In the Midst of Wars.]

The Huk rebellion also saw the refinement of free-fire zones, a technique used effectively by Bell’s forces a half-century earlier. In the 1950s, special squadrons were assigned to do the dirty work.

“The special tactic of these squadrons was to cordon off areas; anyone they caught inside the cordon was considered an enemy,” explained one pro-U.S. Filipino colonel. “Almost daily you could find bodies floating in the river, many of them victims of [Major Napoleon] Valeriano’s Nenita Unit.” [See Benedict J. Kerkvliet, The Huk Rebellion: A Study of Peasant Revolt in the Philippines.]

On to Vietnam

The successful suppression of the Huks led the war’s architects to share their lessons elsewhere in Asia and beyond. Valeriano went on to co-author an important American textbook on counterinsurgency and to serve as part of the American pacification effort in Vietnam with Lansdale.

Following the Philippine model, Vietnamese were crowded into “strategic hamlets”; “free-fire zones” were declared with homes and crops destroyed; and the Phoenix program eliminated thousands of suspected Viet Cong cadre.

The ruthless strategies were absorbed and accepted even by widely respected military figures, such as Gen. Colin Powell who served two tours in Vietnam and endorsed the routine practice of murdering Vietnamese males as a necessary part of the counterinsurgency effort.

“I recall a phrase we used in the field, MAM, for military-age male,” Powell wrote in his much-lauded memoir, My American Journey. “If a helo [a U.S. helicopter] spotted a peasant in black pajamas who looked remotely suspicious, a possible MAM, the pilot would circle and fire in front of him. If he moved, his movement was judged evidence of hostile intent, and the next burst was not in front, but at him.

“Brutal? Maybe so. But an able battalion commander with whom I had served at Gelnhausen [West Germany], Lt. Col. Walter Pritchard, was killed by enemy sniper fire while observing MAMs from a helicopter. And Pritchard was only one of many. The kill-or-be-killed nature of combat tends to dull fine perceptions of right and wrong.”

In 1965, the U.S. intelligence community formalized its hard-learned counterinsurgency lessons by commissioning a top-secret program called Project X. Based at the U.S. Army Intelligence Center and School at Fort Holabird, Maryland, the project drew from field experience and developed teaching plans to “provide intelligence training to friendly foreign countries,” according to a Pentagon history prepared in 1991 and released in 1997.

Called “a guide for the conduct of clandestine operations,” Project X “was first used by the U.S. Intelligence School on Okinawa to train Vietnamese and, presumably, other foreign nationals,” the history stated.

Linda Matthews of the Pentagon’s Counterintelligence Division recalled that in 1967-68, some of the Project X training material was prepared by officers connected to the Phoenix program. “She suggested the possibility that some offending material from the Phoenix program may have found its way into the Project X materials at that time,” the Pentagon report said.

In the 1970s, the U.S. Army Intelligence Center and School moved to Fort Huachuca in Arizona and began exporting Project X material to U.S. military assistance groups working with “friendly foreign countries.” By the mid-1970s, the Project X material was going to armies all over the world.

In its 1992 review, the Pentagon acknowledged that Project X was the source for some of the “objectionable” lessons at the School of the Americas where Latin American officers were trained in blackmail, kidnapping, murder and spying on non-violent political opponents.

But disclosure of the full story was blocked near the end of the first Bush administration when senior Pentagon officials working for then-Defense Secretary Dick Cheney ordered the destruction of most Project X records. [See Robert Parry’s Lost History.]

Living Dangerously

By the mid-1960s, some of the U.S. counterinsurgency lessons had reached Indonesia, too. The U.S. military training was surreptitious because Washington viewed the country’s neutralist leader Sukarno as politically suspect. The training was permitted only to give the United States influence within the Indonesian military which was considered more reliable.

The covert U.S. aid and training was mostly innocuous-sounding “civic action,” which is generally thought to mean building roads, staffing health clinics and performing other “hearts-and-minds” activities with civilians. But “civic action” also provided cover in Indonesia, as in the Philippines and Vietnam, for psy-war.

The secret U.S.-Indonesian military connections paid off for Washington when a political crisis erupted, threatening Sukarno’s government.

To counter Indonesia’s powerful Communist Party, known as the PKI, the army’s Red Berets organized the slaughter of tens of thousands of men, women and children. So many bodies were dumped into the rivers of East Java that they ran red with blood.

In a classic psy-war tactic, the bloated carcasses also served as a political warning to villages down river.

“To make sure they didn’t sink, the carcasses were deliberately tied to, or impaled on, bamboo stakes,” wrote eyewitness Pipit Rochijat. “And the departure of corpses from the Kediri region down the Brantas achieved its golden age when bodies were stacked on rafts over which the PKI banner proudly flew.” [See Rochijat’s “Am I PKI or Non-PKI?” Indonesia, Oct. 1985.]

Some historians have attributed the grotesque violence to a crazed army which engaged in “unplanned brutality” or “mass hysteria” leading ultimately to the slaughter of some half million Indonesians, many of Chinese descent.

But the recurring tactic of putting bodies on gruesome display fits as well with the military doctrines of psy-war, a word that one of the leading military killers used in un-translated form in one order demanding elimination of the PKI.

Sarwo Edhie, chief of the political para-commando battalion known as the Red Berets, warned that the communist opposition “should be given no opportunity to concentrate/consolidate. It should be pushed back systematically by all means, including psy-war.” [See The Revolt of the G30S/PKI and Its Suppression, translated by Robert Cribb in The Indonesian Killings.]

Sarwo Edhie had been identified as a CIA contact when he served at the Indonesian Embassy in Australia. [See Pacific, May-June 1968.]

US Media Sympathy

Elite U.S. reaction to the horrific slaughter was muted and has remained ambivalent ever since. The Johnson administration denied any responsibility for the massacres, but New York Times columnist James Reston spoke for many opinion leaders when he approvingly termed the bloody developments in Indonesia “a gleam of light in Asia.”

The American denials of involvement held until 1990 when U.S. diplomats admitted to a reporter that they had aided the Indonesian army by supplying lists of suspected communists.

“It really was a big help to the army,” embassy officer Robert Martens told Kathy Kadane of States News Service. “I probably have a lot of blood on my hands, but that’s not all bad. There’s a time when you have to strike hard at a decisive moment.” Martens had headed the U.S. team that compiled the death lists.

Kadane’s story provoked a telling response from Washington Post senior editorial writer Stephen S. Rosenfeld. He accepted the fact that American officials had assisted “this fearsome slaughter,” but then justified the killings.

Rosenfeld argued that the massacre “was and still is widely regarded as the grim but earned fate of a conspiratorial revolutionary party that represented the same communist juggernaut that was on the march in Vietnam.”

In a column entitled, “Indonesia 1965: The Year of Living Cynically?” Rosenfeld reasoned that “either the army would get the communists or the communists would get the army, it was thought: Indonesia was a domino, and the PKI’s demise kept it [Indonesia] standing in the free world. …

“Though the means were grievously tainted, we — the fastidious among us as well as the hard-headed and cynical — can be said to have enjoyed the fruits in the geopolitical stability of that important part of Asia, in the revolution that never happened.” [Washington Post, July 13, 1990]

The fruit tasted far more bitter to the peoples of the Indonesian archipelago, however. In 1975, the army of Indonesia’s new dictator, Gen. Suharto, invaded the former Portuguese colony of East Timor. When the East Timorese resisted, the Indonesian army returned to its gruesome bag of tricks, engaging in virtual genocide against the population.

A Catholic missionary provided an eyewitness account of one search-and-destroy mission in East Timor in 1981.

“We saw with our own eyes the massacre of the people who were surrendering: all dead, even women and children, even the littlest ones. … Not even pregnant women were spared: they were cut open. … They did what they had done to small children the previous year, grabbing them by the legs and smashing their heads against rocks. …

“The comments of Indonesian officers reveal the moral character of this army: ‘We did the same thing [in 1965] in Java, in Borneo, in the Celebes, in Irian Jaya, and it worked.’” [See A. Barbedo de Magalhaes, East Timor: Land of Hope.]

The references to the success of the 1965 slaughter were not unusual. In Timor: A People Betrayed, author James Dunn noted that “on the Indonesian side, there have been many reports that many soldiers viewed their operation as a further phase in the ongoing campaign to suppress communism that had followed the events of September 1965.”

Classic psy-war and pacification strategies were followed to the hilt in East Timor. The Indonesians put on display corpses and the heads of their victims. Timorese also were herded into government-controlled camps before permanent relocation in “resettlement villages” far from their original homes.

“The problem is that people are forced to live in the settlements and are not allowed to travel outside,” said Msgr. Costa Lopes, apostolic administrator of Dili. “This is the main reason why people cannot grow enough food.” [See John G. Taylor, Indonesia’s Forgotten War: The Hidden History of East Timor.]

Public Revulsion

Through television in the 1960s and ’70s, the Vietnam War finally brought the horrors of counterinsurgency home to millions of Americans. They watched as U.S. troops torched villages and forced distraught old women to leave ancestral homes.

Camera crews caught on film brutal interrogation of Viet Cong suspects, the execution of one young VC officer, and the bombing of children with napalm.

In effect, the Vietnam War was the first time Americans got to witness the pacification strategies that had evolved secretly as national security policy since the 19th Century. As a result, millions of Americans protested the war’s conduct and Congress belatedly compelled an end to U.S. participation in 1974.

But the psy-war doctrinal debates were not resolved by the Vietnam War. Counterinsurgency advocates regrouped in the 1980s behind President Ronald Reagan, who mounted a spirited defense of the Vietnamese intervention and reaffirmed U.S. resolve to employ similar tactics against leftist forces especially in Central America. [See Consortiumnews.com’s “Guatemala: A Test Tube for Repression.”]

Reagan also added an important new component to the mix. Recognizing how graphic images and honest reporting from the war zone had undercut public support for the counterinsurgency in Vietnam, Reagan authorized an aggressive domestic “public diplomacy” operation which practiced what was called “perception management” — in effect, intimidating journalists to ensure that only sanitized information would reach the American people.

Reporters who disclosed atrocities by U.S.-trained forces, such as the El Mozote massacre by El Salvador’s Atlacatl battalion in 1981, came under harsh criticism and saw their careers damaged.

Some Reagan operatives were not shy about their defense of political terror as a necessity of the Cold War. Neil Livingstone, a counter-terrorism consultant to the National Security Council, called death squads “an extremely effective tool, however odious, in combatting terrorism and revolutionary challenges.” [See McClintock’s Instruments of Statecraft.]

When Democrats in Congress objected to excesses of Reagan’s interventions in Central America, the administration responded with more public relations and political pressure, questioning the patriotism of the critics. For instance, Reagan’s United Nations Ambassador Jeane Kirkpatrick accused anyone who took note of U.S.-backed war crimes of “blaming America first.”

Many Democrats in Congress and journalists in the Washington press corps buckled under the attacks, giving the Reagan administration much freer rein to carry out brutal “death squad” strategies in El Salvador, Honduras, Guatemala and Nicaragua.

What is clear from these experiences in Indonesia, Vietnam, Central America and elsewhere is that the United States, for generations, has sustained two parallel but opposed states of mind about military atrocities and human rights: one of U.S. benevolence, generally held by the public, and the other of ends-justify-the-means brutality embraced by counterinsurgency specialists.

Normally the specialists carry out their actions in remote locations with little notice in the national press. But sometimes the two competing visions – of a just America and a ruthless one – clash in the open, as they did in Vietnam.

Or the dark side of U.S. security policy is thrown into the light by unauthorized leaks, such as the photos of abused detainees at Abu Ghraib prison in Iraq or by revelations about waterboarding and other torture authorized by George W. Bush’s White House as part of the “war on terror.”

Only then does the public get a glimpse of the grim reality, the bloody and brutal tactics that have been deemed “necessary” for more than two centuries in the defense of the purported “national interests.”

 

Peter Dale Scott is an author and poet whose books have focused on “deep politics,” the intersection of economics, criminality and national security. (For more, go to http://www.peterdalescott.net/) Robert Parry is a veteran Washington investigative journalist. (For his books, go to http://www.neckdeepbook.com)

Fair Use.

Earth could hold more water

Five times as much water as in all the world’s oceans may lurk deep below its surface.

8 March 2002

by Philip Ball

 

Geologists have divined water where you might least expect it: 1,000 kilometres below the Earth’s surface. Here, rocks heated to over 1,000 °C and squeezed under high pressures may harbour around five times as much water as in all the world’s oceans. This could give clues to how the Earth formed and how it behaves today.

Between 650 and 2,900 km below the Earth’s surface, hot, compressed minerals surround the planet’s iron-rich core. Called the lower mantle, this material may hold up to 0.2 per cent of its own weight in water, estimate Motohiko Murakami, of the Tokyo Institute of Technology in Japan, and colleagues [1].
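A rough sanity check supports the “five oceans” figure. This sketch uses commonly cited round-number masses that are not given in the article (a lower-mantle mass of roughly 3×10^24 kg and a total ocean mass of roughly 1.4×10^21 kg), so treat it as an order-of-magnitude estimate only:

```python
# Back-of-envelope check: how many "oceans" of water could the lower
# mantle hold at 0.2 per cent water by weight?
# Assumed round-number masses (NOT from the article):
LOWER_MANTLE_MASS_KG = 3.0e24   # commonly cited rough estimate
OCEAN_MASS_KG = 1.4e21          # approximate total mass of Earth's oceans

water_capacity_kg = 0.002 * LOWER_MANTLE_MASS_KG  # 0.2 % by weight
oceans_equivalent = water_capacity_kg / OCEAN_MASS_KG

print(round(oceans_equivalent, 1))  # → 4.3, i.e. "around five" oceans
```

With these assumed masses the capacity comes out at roughly four oceans, consistent with the article’s “around five times” once the uncertainty in both inputs is allowed for.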

Theories of planetary formation take into account how much easily vaporized material, such as water and carbon dioxide, was originally present. The findings hint that Earth’s starter mix may have been sloppier than anticipated.

Water would lower the melting point of rocks in the lower mantle and decrease their viscosity. Over millions of years, the mantle churns like a pan of hot soup. This moves the tectonic plates and mixes the mantle’s chemical components. A less viscous mantle would churn faster.

The take-up of water by minerals in the lower mantle might also affect the ease with which tectonic plates sink deep into the Earth. As the plates descend, heat up and become squeezed, the water that they release might soften the surrounding mantle and ease their passage.

There is already thought to be several oceans’ worth of water slightly higher in the mantle, at a depth of around 400-650 km. This region is called the transition zone, as it is between the upper and the lower mantle.

The lower mantle’s minerals can retain about a tenth as much water as the rocks above, Murakami’s team finds. But because the volume of the lower mantle is much greater than that of the transition zone, it could hold a comparable amount of water.

“The findings will boost the debate about how much water is locked away in the mantle,” says geologist Bernard Wood of the University of Bristol, UK. Until now, he says, “most people would have argued that there isn’t much water in the mantle”. A similar study two years ago concluded that there isn’t much water down there at all [2].

Taking on the mantle

Murakami’s team mimicked the lower mantle in the laboratory. They studied the three kinds of mineral thought to make up most of the region: two perovskites, one rich in magnesium, the other in calcium, and magnesiowustite, a mixture of magnesium and iron oxides.

To recreate its furious conditions, the researchers used a multi-anvil cell. This heats materials while squeezing them between hard teeth. Having baked the minerals at around 1,600 °C and 250,000 atmospheres, the team measured how much hydrogen the rocks contained using secondary-ion mass spectrometry. This technique blasts the material with a beam of ions and detects the ions sprayed out from the surface.

Any hydrogen in the rocks presumably comes from trapped water, an idea that other measurements support. The researchers found more hydrogen than previous experiments had led them to expect.

References

1. Murakami, M., Hirose, K., Yurimoto, H., Nakashima, S. & Takafuji, N. Water in Earth’s lower mantle. Science 295, 1885-1887 (2002).
2. Bolfan-Casanova, N., Keppler, H. & Rubie, D. C. Water partitioning between nominally anhydrous minerals in the MgO-SiO2-H2O system up to 24 GPa: implications for the distribution of water in the Earth’s mantle. Earth and Planetary Science Letters 182, 209 (2000).


 

Dr. Laura on Abominations

 

Dr. Laura Schlessinger is a radio personality who dispenses advice to people who call in to her radio show. Recently, she said that, as an observant Orthodox Jew, she considers homosexuality an abomination according to Leviticus 18:22 that cannot be condoned under any circumstance.

The following is an open letter to Dr. Laura penned by a Florida resident, which was posted on the Internet.

Dear Dr. Laura:

Thank you for doing so much to educate people regarding God’s Law. I have learned a great deal from your show, and try to share that knowledge with as many people as I can. When someone tries to defend the homosexual lifestyle, for example, I simply remind them that Leviticus 18:22 clearly states it to be an abomination. End of debate. I do need some advice from you, however, regarding some of the other specific laws and how to follow them:

When I burn a bull on the altar as a sacrifice, I know it creates a pleasing odor for the Lord – Lev. 1:9. The problem is my neighbors. They claim the odor is not pleasing to them. Should I smite them?

I would like to sell my daughter into slavery, as sanctioned in Exodus 21:7. In this day and age, what do you think would be a fair price for her?

I know that I am allowed no contact with a woman while she is in her period of menstrual uncleanliness – Lev. 15:19-24. The problem is, how do I tell? I have tried asking, but most women take offense.

Lev. 25:44 states that I may indeed possess slaves, both male and female, provided they are purchased from neighboring nations. A friend of mine claims that this applies to Mexicans, but not Canadians. Can you clarify? Why can’t I own Canadians?

I have a neighbor who insists on working on the Sabbath. Exodus 35:2 clearly states he should be put to death. Am I morally obligated to kill him myself?

A friend of mine feels that even though eating shellfish is an Abomination, in Lev. 11:10, it is a lesser abomination than homosexuality. I don’t agree. Can you settle this?

Lev. 21:20 states that I may not approach the altar of God if I have a defect in my sight. I have to admit that I wear reading glasses. Does my vision have to be 20/20, or is there some wiggle room here?

Most of my male friends get their hair trimmed, including the hair around their temples, even though this is expressly forbidden by Lev. 19:27. How should they die?

I know from Lev. 11:6-8 that touching the skin of a dead pig makes me unclean, but may I still play football if I wear gloves?

My uncle has a farm. He violates Lev. 19:19 by planting two different crops in the same field, as does his wife by wearing garments made of two different kinds of thread (cotton/polyester blend). He also tends to curse and blaspheme a lot. Is it really necessary that we go to all the trouble of getting the whole town together to stone them? – Lev. 24:10-16. Couldn’t we just burn them to death at a private family affair like we do with people who sleep with their in-laws? (Lev. 20:14)

I know you have studied these things extensively, so I am confident you can help. Thank you again for reminding us that God’s word is eternal and unchanging.

Your devoted fan,

marty

Dr. Laura does not recommend our undersink water filters because they are not mentioned in Leviticus.

Will Dr. Bronner’s Magic Soap Continue to Defy Selling Out to Corporate Culture?

By Richard Seireeni,   Chelsea Green Publishing

July 2009

You can use it in a river. You can use it in the shower. You can lather up outside, and it doesn’t hurt a flower! Yes, you got it. It’s Dr. Bronner’s magical soap.

Started by Emanuel Bronner, a third-generation soap maker, rabbi, and wacky spiritual guru, Dr. Bronner’s soap has been hot since the ’60s and is still going strong. Mr. Bronner rejected the use of industrial chemicals way ahead of his time, and now, more than forty years later, his grandsons run the business. So with a storied company whose founder is called “the godfather of today’s green brands,” how will his grandsons keep the vision alive?

The following is an excerpt from The Gort Cloud: The Invisible Force Powering Today’s Most Visible Green Brands by Richard Seireeni.

 

Emanuel Bronner was on a lifelong spiritual mission (his Hebrew name means “search for truth”). He espoused the view that a prophet arrives on earth every seventy-six years, inspired by Halley’s comet, to bring man back to God. These prophets, to name a few, are thought to have included Moses, Jesus, Muhammad, Hillel, Lao-tzu, and Gautama, the Buddha.

The doctor’s obsessive passion was sometimes mistaken for mental illness, due in part to his tendency to rant about his opinions. “He was often yelling,” says Michael Bronner.

In 1947, while giving a talk on the importance of free speech at the University of Chicago, Bronner was detained by authorities, who eventually contacted his sister, then living in Rhode Island. She agreed to commit her brother to the Illinois State Asylum in Elgin. There he underwent shock treatments, says Michael, for what they saw as his “crazy beliefs that we’re all children of one divine source, and we will destroy ourselves if we don’t realize this.”

Bronner ultimately escaped the asylum after stealing twenty dollars out of his sister’s purse when she was visiting. He headed west, thereafter referring to the mental institution as the time he spent in a “concentration camp.” “I think he did have some slight schizophrenic tendencies that were exacerbated by the asylum’s persecutory environment,” says David.

Michael adds, “He ended up setting up shop in Pershing Square in Los Angeles, which was a hotbed of political activity at the time. He was a very passionate speaker. People would come and listen to him.”

Product storytelling with a spiritual message

 

As the company’s Web site states, “Bronner’s essential vision and philosophy were born out of the fate of his family and the Holocaust, and are emphatic that we are all children of the same divine source: People must realize that we are ‘All-One!’ and that the prophets and spiritual giants of the world’s various faith traditions all realized and said this.”

“Constructive capitalism is where you share the profit with the workers and the earth from which you made it,” the site continues in its summary of Bronner’s teachings. “We are all brothers and sisters, and we should take care of each other and spaceship earth!”

Following his speeches in Pershing Square, Bronner would hand out a bottle of peppermint soap made with his family’s secret formula.

“People would come for the soap because it was so darn good, and then leave and not always listen to him,” Michael says.

It wasn’t long before Dr. Bronner was putting his “Moral ABC” message on the bottle labels. “Whereas no 6 year old can get by without learning the ABC’s, no 12 year old can get by without learning the moral ABC’s,” he was fond of saying. He didn’t waste any space, squeezing in as much text as possible, eventually adding well over two thousand words per bottle. To this day, approximately thirty thousand words of the doctor’s teachings are spread across the range of the company’s products.

A hit with hippies

When the late 1960s hit and a new counterculture erupted, Dr. Bronner’s eco-friendly soaps and his peace-loving message found their audience.

The product “became successful for all the reasons that it wasn’t successful before,” says Michael Bronner. “The quality was always good, but you had this packaging that included my grandfather’s spiritual message that was completely anti-corporate.”

The soap “was never advertised, yet everybody seemed to know about it . . . like it arrived on the scene by magic, appearing in backpack after backpack,” Michael continues. In addition, “it was a soap that could be used for anything . . . It was biodegradable, good for the earth . . . you could jump into a nearby lake and use it,” which is what I used it for back then. We always had a bottle of Dr. Bronner’s in our packs when we went hiking in the Pacific Northwest.

Dr. Bronner’s 18-in-1 Pure-Castile Soap, as it was called back then, became a sought-after product for those in the know, spreading to hippie communes across the United States. “If you were a part of that world, you knew Dr. Bronner’s soaps,” David Bronner explains. “It was like a club. The fact that it wasn’t advertised was a big advantage.”

Whether consciously or unconsciously, Dr. Emanuel Bronner knew his nonconformist, antiestablishment target audience well enough to understand that using conventional channels to reach them would not work. That is still largely the company’s understanding now.

Keeping a loyal customer base happy

As the members of the counterculture have grown up and aged, many have stayed loyal to the Dr. Bronner’s product. David and Michael Bronner, who were not alive in the 1960s, do their best to keep this market segment satisfied.

“Making our soaps is similar to making wine — you can have the same ingredients, but it’ll turn out a little bit differently depending on where those ingredients come from, where they’re grown, such as the peppermint coming from a different field. Especially with a natural product, there can be variation,” Michael Bronner explains. “People will call us up and ask about it because they want to know what’s going on. They’ll say, ‘What did you do with my soap?’ So while you’re always supposed to improve a product, no one lets you change it.”

Because they’re not of the ’60s generation, the brothers are also fighting the perception that “we’re trying to milk the product and the profits out of our grandfather’s legacy,” Michael explains. “If we raise our prices, no one understands that our materials cost twice as much as they did before — they just think that we’ve gone for a cheaper grade, that we’re selling out in some way. I get pretty strongly worded e-mails calling us out, saying, ‘You’ve lost a customer forever!’ or ‘You sold out!’ exclamation point, exclamation point, exclamation point.”

“All you can really do,” he continues, “is write these people back and say something like, ‘Our peppermint oil did change a little bit when we went organic. It now comes from India so it has a little bit more of an edge.’ And sometimes they’ll e-mail back and they’ll say, ‘Wow, keep up the good doctor’s work.'”

Keeping the legacy alive

David and Michael Bronner attempt to keep their grandfather’s spiritual message alive while at the same time relegating it to the background. They work to keep the brand associated with truth and goodness and respect for the planet but attempt to stay away from promoting a religious-sounding message.

“I very much respect my grandfather for his beliefs and for the cosmic vision he had . . . his urging people to break free of whatever barriers confine them . . . to reach out to others [who] may not share our same cultural or religious perspective on things . . . to be mindful of the environment,” explains Michael Bronner. “But that is not part of how we brand ourselves these days. We’re a secular company. We don’t get into religious discussion.” The company does send out a booklet on Emanuel Bronner’s philosophy, The Moral ABC’s, to customers who ask for it.

The Bronner brothers believe they are keeping their grandfather’s social mission alive, albeit in a different way than he did. “What the whole thing meant to him was very much what he put on the label,” explains Michael. “He wanted those words to find their way into everybody’s mind on Planet Earth so that they could interpret them and come together.”

“The ideas behind that label are very sound,” Michael continues. “And those ideas are ones of environmental sustainability and of social accountability and responsibility. By going organic, we’ve achieved the environmental aspect of my grandfather’s mission. And by going Fair Trade, we’re on our way to fulfilling his social mission.”


Doctors Are The Third Leading Cause of Death in the US, Causing 250,000 Deaths Every Year

Reprinted from Dr. Joseph Mercola’s email newsletter.  (See signup invitation below.)

Gazette Introductory Note:  Since this 2000 report was issued, progress has been made.  Doctors have kept pace and are now causing far more than the annual 250,000 deaths reported by JAMA.

This article in the Journal of the American Medical Association (JAMA) is the best article I have ever seen written in the published literature documenting the tragedy of the traditional medical paradigm.

If you want to keep updated on issues like this click here to sign up for my free newsletter.

This information is a follow-up to the Institute of Medicine report which hit the papers in December of last year, but that data was hard to reference as it was not in a peer-reviewed journal. Now it is published in JAMA, which is the most widely circulated medical periodical in the world.

The author is Dr. Barbara Starfield of the Johns Hopkins School of Hygiene and Public Health, and she describes how the US health care system may contribute to poor health.

ALL THESE ARE DEATHS PER YEAR:

  • 12,000 — unnecessary surgery [8]
  • 7,000 — medication errors in hospitals [9]
  • 20,000 — other errors in hospitals [10]
  • 80,000 — infections in hospitals [10]
  • 106,000 — non-error, negative effects of drugs [2]

These total to 225,000 deaths per year from iatrogenic causes!!
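The line items above in fact sum to 225,000, the figure the article itself uses further down; a quick check:

```python
# Starfield's annual iatrogenic-death line items, summed.
deaths = {
    "unnecessary surgery": 12_000,
    "medication errors in hospitals": 7_000,
    "other errors in hospitals": 20_000,
    "infections in hospitals": 80_000,
    "non-error, negative effects of drugs": 106_000,
}

total = sum(deaths.values())
print(total)  # → 225000
```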

What does the word iatrogenic mean? The term means induced in a patient by a physician’s activity, manner, or therapy; it is used especially of a complication of treatment.

Dr. Starfield offers several warnings in interpreting these numbers:

  • First, most of the data are derived from studies in hospitalized patients.
  • Second, these estimates are for deaths only and do not include negative effects that are associated with disability or discomfort.
  • Third, the estimates of death due to error are lower than those in the IOM report [1].

If the higher estimates are used, the deaths due to iatrogenic causes would range from 230,000 to 284,000. In any case, 225,000 deaths per year constitutes the third leading cause of death in the United States, after deaths from heart disease and cancer. Even if these figures are overestimated, there is a wide margin between these numbers of deaths and the next leading cause of death (cerebrovascular disease).

Another analysis [11] concluded that between 4% and 18% of consecutive patients experience negative effects in outpatient settings, with:

  • 116 million extra physician visits
  • 77 million extra prescriptions
  • 17 million emergency department visits
  • 8 million hospitalizations
  • 3 million long-term admissions
  • 199,000 additional deaths
  • $77 billion in extra costs

The high cost of the health care system is considered to be a deficit, but seems to be tolerated under the assumption that better health results from more expensive care.

However, evidence from a few studies indicates that as many as 20% to 30% of patients receive inappropriate care.

An estimated 44,000 to 98,000 among them die each year as a result of medical errors [2].

This might be tolerated if it resulted in better health, but does it? Of 13 countries in a recent comparison [3,4], the United States ranks an average of 12th (second from the bottom) for 16 available health indicators. More specifically, the ranking of the US on several indicators was:

  • 13th (last) for low-birth-weight percentages
  • 13th for neonatal mortality and infant mortality overall [14]
  • 11th for postneonatal mortality
  • 13th for years of potential life lost (excluding external causes)
  • 11th for life expectancy at 1 year for females, 12th for males
  • 10th for life expectancy at 15 years for females, 12th for males
  • 10th for life expectancy at 40 years for females, 9th for males
  • 7th for life expectancy at 65 years for females, 7th for males
  • 3rd for life expectancy at 80 years for females, 3rd for males
  • 10th for age-adjusted mortality

The poor performance of the US was recently confirmed by a World Health Organization study, which used different data and ranked the United States as 15th among 25 industrialized countries.

There is a perception that the American public “behaves badly” by smoking, drinking, and perpetrating violence. However, the data do not support this assertion.

  • The proportion of females who smoke ranges from 14% in Japan to 41% in Denmark; in the United States, it is 24% (fifth best). For males, the range is from 26% in Sweden to 61% in Japan; it is 28% in the United States (third best).
  • The US ranks fifth best for alcoholic beverage consumption.
  • The US has relatively low consumption of animal fats (fifth lowest in men aged 55-64 years in 20 industrialized countries) and the third lowest mean cholesterol concentrations among men aged 50 to 70 years among 13 industrialized countries.

The estimates of deaths due to error cited above are lower than those in a recent Institute of Medicine report.

Lack of technology is certainly not a contributing factor to the US’s low ranking.

  • Among 29 countries, the United States is second only to Japan in the availability of magnetic resonance imaging units and computed tomography scanners per million population.17
  • Japan, however, ranks highest on health, whereas the US ranks among the lowest.
  • It is possible that the high use of technology in Japan is limited to diagnostic technology not matched by high rates of treatment, whereas in the US, high use of diagnostic technology may be linked to more treatment.
  • Supporting this possibility are data showing that the number of employees per bed (full-time equivalents) in the United States is highest among the countries ranked, whereas it is very low in Japan, far lower than can be accounted for by the common practice of having family members rather than hospital staff provide the amenities of hospital care.

Journal of the American Medical Association. 2000 Jul 26;284(4):483-5.