The Week That Was
Dec. 11, 2004

This is a disturbing trend toward a patchwork scheme that could have severe impacts on interstate commerce and may even be unconstitutional.
We note in passing that these initiatives come almost exclusively from "blue" states (those that voted for Kerry in the November election). We also note with some amusement that New York's attorney general Eliot Spitzer plans to run for governor of the state; he was the chief instigator of the junk lawsuit against five electric utilities for creating a public nuisance by emitting CO2.

2. Automakers Take California's Climate Emissions Rule To Court

FRESNO, California, December 8, 2004 (ENS) - A coalition of the nation's largest carmakers filed suit in federal court in Fresno, California, Tuesday challenging California's new standards for vehicle emissions of greenhouse gases linked to global warming. The Alliance of Automobile Manufacturers argues that Californians would pay "an average of $3,000 more" for a new automobile and "would never recoup those extra, up-front dollars through savings at the gas pump."

Automakers Sue to Block Emissions Law in California

DETROIT, Dec. 7 - Toyota, General Motors and seven other automakers filed suit on Tuesday to block California's new greenhouse gas regulation, which was approved by the state in its final form in September.
The suit sets up a battle between automakers and Gov. Arnold Schwarzenegger, a Republican. Although Mr. Schwarzenegger is a fan of the Hummer, an S.U.V. with prodigious greenhouse gas emissions, he has promised nonetheless to defend the regulation, which was signed by his Democratic predecessor, Gray Davis.

The regulation - the first of its kind in North America - would require automakers to cut by roughly 30 percent the greenhouse gas emissions from cars and trucks sold in the state by the 2016 model year. The industry is suing in federal court in Fresno, Calif., contending that California's regulation is pre-empted by Washington's authority to regulate fuel economy. Greenhouse gas emissions from cars and trucks are a function of fuel economy.

3. U.S. PIRG urges supporters to begin a call-in campaign of falsehoods against Arctic oil exploration

By Tom Randall
December 3, 2004

Issue: The U.S. Public Interest Research Group (PIRG) has sent a letter to its supporters asking them to call their senators to oppose oil exploration in the Arctic National Wildlife Refuge (ANWR). It urges supporters to tell their senators, "I'm calling to urge you to oppose oil and gas drilling in the Arctic National Wildlife Refuge. I don't want to see one of America's last wild places ruined for a miniscule amount of oil."

Comment 1: The Clinton administration report, Environmental Benefits of Advanced Oil and Gas Exploration and Production Technology, thoroughly demonstrated that oil and gas can be removed from even the most sensitive environments with complete environmental safety, and PIRG knows this. The study said, "On land and offshore, oil and gas producers have developed innovative ways to restore sites to original -- and sometimes better-than-original -- condition for diverse uses ranging from housing to agriculture to wildlife habitats." (Readers can get the complete report at the link shown below.)

Comment 2: On March 16, 2004 the EIA released a study showing that opening just 0.01% of ANWR (2,000 of its 19,500,000 acres of federal land) to energy development would reduce oil imports by 876,000 barrels per day by 2025 and increase domestic production by 20%. There is nothing trivial about that.
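The acreage and production figures above can be sanity-checked with simple arithmetic (the 2025 production baseline below is not stated in the text; it is the value implied by the 20% figure):

```python
# Sanity check of the quoted ANWR figures.
developed_acres = 2_000
anwr_federal_acres = 19_500_000

share = developed_acres / anwr_federal_acres
print(f"Developed share of ANWR: {share:.4%}")  # roughly 0.01%

# The text says 876,000 bbl/day would raise domestic production by 20%;
# working backward gives the production baseline that figure implies.
added_bpd = 876_000
implied_baseline_bpd = added_bpd / 0.20
print(f"Implied 2025 domestic production: {implied_baseline_bpd:,.0f} bbl/day")
```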

Comment 3: It is abundant, affordable energy that makes the lives of Americans healthier and safer. It provides us with fresh fruit and vegetables year-round. It makes our workplaces cleaner, safer and more productive, and our homes healthier places to live. It lights our homes, making them more comfortable and our streets safer. Energy brings the world to us, and us to the world, via the Internet, television, radio and the print media. It enables us to travel cross-country safely in a matter of hours rather than in arduous, dangerous months. Abundant, affordable energy helps power our hospitals and develop and produce our drugs. It helps to safely deliver our babies, extend our lives and increase our productivity. It creates better lives for all. It is in the public interest, which makes us wonder what public interest PIRG really cares about.

Link: See the 1999 Clinton administration report, Environmental Benefits of Advanced Oil and Gas Exploration and Production Technology on the House Committee on Resources website at
You may get a hard copy of the document from the Department of Energy by requesting document number DOE-FE-0385.


4. Homeland Security issues more reasonable radiation guidelines

An overview of the new draft "protective action guidelines" recommended by the Department of Homeland Security:

First-Responder Exposure: Over the course of the initial event, the new guidelines say it is safe for firemen, police and EMTs to receive a total exposure of five rem. That is the equivalent of 5,000 dental X-rays, or 20 times the radiation people are normally exposed to in a year from natural sources.
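A quick check of the guideline's equivalences; the per-procedure and annual-background doses below are the values implied by the comparison itself (roughly 1 millirem per dental X-ray and 250 millirem per year of natural background), not figures stated in the text:

```python
# Doses implied by the guideline's two comparisons.
responder_limit_rem = 5.0

dental_xray_rem = 0.001       # ~1 mrem per dental X-ray (assumed)
annual_background_rem = 0.25  # ~250 mrem/yr natural background (assumed)

print(responder_limit_rem / dental_xray_rem)        # number of dental X-rays
print(responder_limit_rem / annual_background_rem)  # years of background dose
```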

Groups Criticize Homeland Security Plans To Relax Radiation Cleanup Standards For A "Dirty Bomb" or Terrorist Nuclear Explosive

Committee To Bridge The Gap
Nuclear Information & Resource Service
Contacts: Daniel Hirsch, CBG (831) 332-3099
Diane D'Arrigo, NIRS (202) 328-0002 x16

Doses Equivalent to Tens of Thousands of Chest X-rays Could be Allowed,
Officially Estimated to Cause Cancer in Up to a Quarter of Those Exposed

WASHINGTON, DC - Dec. 2, 2004 - More than 50 public policy organizations today called on the Department of Homeland Security (DHS) to halt plans to dramatically weaken requirements for cleaning up radioactive contamination from a terrorist radiological or nuclear explosive. The groups disclosed that DHS is about to release new guidance that could permit ongoing contamination at levels equivalent to a person receiving tens of thousands of chest X-rays over thirty years. Official government risk figures estimate that as many as a quarter of the people exposed to such doses would develop cancer.

In a letter to outgoing DHS Secretary Tom Ridge, the groups said, "An attack by a terrorist group using a 'dirty bomb' or improvised nuclear device would be a terrible tragedy. . . . But should such a radiological weapon go off in the U.S., our government should not compound the situation by employment of standards for cleaning up the radioactive contamination that are inadequately protective of the public."

"Far from protecting us from the potentially catastrophic health effects of a terrorist dirty bomb, by permitting such high radiation levels to remain without cleanup, Homeland Security would actually be increasing the casualty count," said Diane D'Arrigo, Radioactive Waste Project Director at Nuclear Information and Resource Service. "Approval of this guidance would also set a dangerous precedent to weaken the already inadequate cleanup standards for nuclear-contaminated sites across this country."

"Benchmark" cleanup standards contemplated in the DHS guidance are up to 2500 times less protective than the risk levels considered by EPA as barely acceptable for cleanup of Superfund toxic and radioactive sites.

[Comment: Or are EPA standards too strict?]
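For what it is worth, the groups' two headline numbers are mutually consistent: if EPA's upper-bound "barely acceptable" lifetime cancer risk of about 1 in 10,000 is taken as the benchmark (an assumption here; the letter does not state the baseline), relaxing it 2,500-fold yields the "quarter of those exposed" figure:

```python
# Consistency check between the 2,500x claim and the 25% cancer figure.
epa_upper_risk = 1e-4        # assumed EPA upper-bound acceptable lifetime risk
relaxation_factor = 2_500

implied_cancer_risk = epa_upper_risk * relaxation_factor
print(f"Implied lifetime cancer risk: {implied_cancer_risk:.0%}")
```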

"We recognize that response actions in the immediate aftermath of a terrorist incident may require extraordinary measures and doses," said Daniel Hirsch, President of the Committee to Bridge the Gap and initiator of the group letter, "However, it is unacceptable to set final cleanup goals so lax that long-term cancer risks are hundreds of times higher than currently accepted for remediation of the nation's most contaminated sites."

In a parallel letter to Environmental Protection Agency, the groups urged Administrator Michael Leavitt to resist any effort to establish cleanup standards that permit public risks significantly outside EPA's longstanding legally allowable risk range.

Signers include Committee to Bridge the Gap, Nuclear Information and Resource Service, Union of Concerned Scientists, Sierra Club, Physicians for Social Responsibility, Public Citizen, and Greenpeace.

SEPP Comment: By now our readers should be aware of the unscientific use of the LNT (linear no-threshold) hypothesis. But just as a reminder, here are exposure data without observed health effects (from Dr. Donald W. Miller):
Natural exposures: Ramsar, Iran, 79 rem/yr; Guarapari, Brazil, 17.5 rem/yr
Also: 8,000 people near a thermonuclear explosion in the USSR, up to 12 rem
4,000 people living in Taiwan apartment buildings contaminated with cobalt-60, 3.5 rem

5. Greenpeace Co-Founder Says Organization Has Lost Its Way

Patrick Moore, co-founder of Greenpeace, says that he left the mainstream green movement in 1986 because it abandoned science and logic in favor of an anti-corporate, anti-globalization agenda.

Moore, who holds a Ph.D. in ecology, says that instead of using science to solve problems such as whaling, nuclear testing and toxic waste, Greenpeace became more concerned with maintaining problems to further a leftist political agenda. For example, Greenpeace:

O Effectively demanded that nuclear waste never be buried; but this meant more individuals would be exposed to risk as waste is shuffled from one location to another.

O Opposed aquaculture and insisted society catch only a sustainable level of fish from the wild; but this would drive up the price of fish to the point where only the wealthy could afford it.

O Insisted that all farming be organic; but this would make millions around the world go unfed without the availability of cost-saving agricultural technologies.

Moore suggests Greenpeace's final remnants of a science-based agenda were destroyed after the fall of the Berlin Wall in 1989 due to the influx of peace activists and Marxist ideologues into the green movement.


Source: Roger Bate, "Moore Wisdom Needed," Economic Affairs, Vol. 24, Institute of Economic Affairs, June 2004.

6. Hydrogen Production Method Could Bolster Fuel Supplies
NY Times

WASHINGTON, Nov. 27 - Researchers at a government nuclear laboratory and a ceramics company in Salt Lake City say they have found a way to produce pure hydrogen with far less energy than other methods, raising the possibility of using nuclear power to indirectly wean the transportation system from its dependence on oil. The development would move the country closer to the Energy Department's goal of a "hydrogen economy," in which hydrogen would be created through a variety of means, and would be consumed by devices called fuel cells, to make electricity to run cars and for other purposes.

Experts cite three big roadblocks to a hydrogen economy: manufacturing hydrogen cleanly and at low cost, finding a way to ship it and store it on the vehicles that use it, and reducing the astronomical price of fuel cells. "This is a breakthrough in the first part," said J. Stephen Herring, a consulting engineer at the Idaho National Engineering and Environmental Laboratory, which plans to announce the development on Monday with Cerametec Inc. of Salt Lake City. The developers also said the hydrogen could be used by oil companies to stretch oil supplies even without solving the fuel cell and transportation problems.

Mr. Herring said the experimental work showed the "highest-known production rate of hydrogen by high-temperature electrolysis." But the plan requires the building of a new kind of nuclear reactor, at a time when the United States is not even building conventional reactors. And the cost estimates are uncertain. The heart of the plan is an improvement on the most convenient way to make hydrogen, which is to run electric current through water, splitting the H2O molecule into hydrogen and oxygen. This process, called electrolysis, now has a drawback: if the electricity comes from coal, which is the biggest source of power in this country, then the energy value of the ingredients - the amount of energy given off when the fuel is burned - is three and a half to four times larger than the energy value of the product. Also, carbon dioxide and nitrogen oxide emissions increase when the additional coal is burned. Hydrogen can also be made by mixing steam with natural gas and breaking apart both molecules, but the price of natural gas is rising rapidly.

The new method involves running electricity through water that has a very high temperature. As the water molecule breaks up, a ceramic sieve separates the oxygen from the hydrogen. The resulting hydrogen has about half the energy value of the energy put into the process, the developers say. Such losses may be acceptable, or even desirable, because hydrogen from a nuclear reactor can be substituted for oil, which is imported and expensive, and because the basic fuel, uranium, is plentiful. The idea is to build a reactor that would heat the cooling medium in the nuclear core, in this case helium gas, to about 1,000 degrees Celsius, or more than 1,800 degrees Fahrenheit. The existing generation of reactors, used exclusively for electric generation, uses water for cooling and heats it to only about 300 degrees Celsius. The hot gas would be used in two ways. It would spin a turbine to make electricity, which could be run through the water being separated. And it would heat that water, to 800 degrees Celsius. But if electricity demand on the power grid ran extremely high, the hydrogen production could easily be shut down for a few hours and all of the energy converted to electricity, designers say.

The goal is to create a reactor that could produce about 300 megawatts of electricity for the grid, enough to run about 300,000 window air-conditioners, or produce about 2.5 kilos of hydrogen per second. When burned, a kilo of hydrogen has about the same energy value as a gallon of unleaded regular gasoline. But fuel cells, which work without burning, get about twice as much work out of each unit of fuel. So if used in automotive fuel cells, the reactor might replace more than 400,000 gallons of gasoline per day.
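The article's own numbers reproduce the "more than 400,000 gallons" estimate; a short check (the kilogram-to-gallon energy equivalence and the twofold fuel-cell advantage are taken from the text as stated):

```python
# Reproduce the gasoline-displacement estimate from the article's figures.
h2_rate_kg_per_s = 2.5
seconds_per_day = 86_400

h2_kg_per_day = h2_rate_kg_per_s * seconds_per_day  # 216,000 kg/day

# One kilogram of burned hydrogen matches about one gallon of gasoline,
# and fuel cells extract roughly twice the work per unit of fuel.
gallons_displaced = h2_kg_per_day * 2
print(f"{gallons_displaced:,.0f} gallon-equivalents of gasoline per day")
```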

The part of the plan that the laboratory and the ceramics company have tested is high-temperature electrolysis. There is only limited experience building high-temperature gas-cooled reactors, though, and no one in this country has ordered any kind of big reactor, even those of more conventional design, in 30 years, except for those whose construction was canceled before completion.

Another problem is that the United States has no infrastructure for shipping large volumes of hydrogen. Currently, most hydrogen is produced at the point where it is used, mostly in oil refineries. Hydrogen is used to draw the sulfur out of crude oil, and to break up hydrocarbon molecules that are too big for use in liquid fuel, and change the carbon-hydrogen ratio to one more favorable for vehicle fuel.

Mr. Herring suggested another use, however: recovering usable fuel from the Athabasca Tar Sands in Alberta, Canada. The reserves there may hold the largest oil deposits in the world, but extracting them and converting them into a gasoline substitute requires copious amounts of steam and hydrogen, both products of the reactor.

7. Mercury Reduction Rules Are Fishy

The Environmental Protection Agency's proposed new rules for mercury reductions would cost about $1.4 billion per year, and would have a negligible impact on public health, says environmental consultant Joel Schwartz.

Two proposals are being considered which would target coal-fired utility boilers for mercury reductions. However, both are costly and the monetary benefits of such reductions are unknown, which even the EPA admits, says Schwartz.

O The first proposal, based mainly on a cap and trade system, would cost about $1.36 billion per year to the industry, with an estimated cost to society of about $1.6 billion.

O The second proposal, a two-phase reduction, would cost $2.9 billion in 2010, $3.7 billion in 2015, and $4.9 billion in 2020, with equal social and control costs.

However, the benefits are questionable, particularly in light of previous studies examining the effects of mercury exposure on mothers and children in the Faroe Islands and the Seychelles, says Schwartz:

O Based on the Faroe Islands study, a total elimination of mercury emissions in the United States would improve children's health only minimally: children in the 10th percentile on neurological and cognitive test scores would move, at best, to between the 10.3 and 10.6 percentiles.

O A study of children in the Seychelles indicated no harm from mercury exposure, even though their exposure was greater than that of the most highly exposed Americans.

O Furthermore, the new rules assume a one-to-one correspondence between mercury emissions and mercury levels in freshwater fish; but fish mercury levels would more likely decline by less than half of the reduction in mercury deposition.

Source: Joel Schwartz, "A Regulatory Analysis of EPA's Proposed Rule to Reduce Mercury Emissions from Utility Boilers," AEI-Brookings Joint Center for Regulatory Studies, September 2004.

8. Freak Weather Events Not Related To Global Warming
By Madhav Khandekar

"Freak weather", in particular the European heat wave of 2003, is said to be becoming more common due to global warming, with humans held accountable.

How do we know this? Because climate scientists, or rather computer scientists, have modeled the 2003 European heat wave and have concluded that the "dice is loaded now in favor of such freak heat waves occurring more often".

It is unfortunate that these computer scientists have ignored the climatology of heat waves occurring in different parts of the world for the last 100 years, since instrument records of weather observations began. None of those heat waves were blamed on human activity then. Why blame humans now?

Climatologists and meteorologists know very well that in the pre-monsoon months in India and elsewhere in monsoonal climates, heat waves of a few days or longer occur often (once every three to four years); it is a simple case of an upper-air ridge forming, with descending air causing warming and no moisture influx. In the last six years, two such heat waves occurred in central and southeastern peninsular India, in May/June 1998 and May 2002. In 1998, temperatures of 50C were recorded in New Delhi and elsewhere in northwest India every afternoon, three weeks in a row! In May 2002, temperatures of around 48C were recorded in the southeastern states of Orissa and Andhra Pradesh. During both of these heat waves, several hundred people died of heat exhaustion and dehydration. A simple ceiling fan in poorly built houses and the availability of plenty of good drinking water could have prevented most of these fatalities.

That is precisely what happened in Europe in 2003, when a persistent upper-level ridge of high pressure formed over the continent, partly related to the eastern Atlantic teleconnection pattern (see the Bulletin of the American Meteorological Society, June 2004, for additional details). The European heat wave of 2003 was exceptional, but not necessarily unprecedented, and it certainly had nothing to do with human activity.

How do these computer scientists explain the deadliest heat wave, which occurred in central and southern Canada in July 1936 (July 5-12), when there were very few people living in Canada? This Canadian heat wave killed more than 1,100 people in a span of just 10 days, most of the deaths occurring from heat exhaustion and dehydration due to the unavailability of air-conditioned houses. The large number of fatalities in Europe's 2003 heat wave could have been avoided if most of those senior citizens had been moved to open areas and/or air-conditioned houses in time. The number of deaths does not make a heat wave any worse than its temperature structure suggests. In July 1995, a severe heat wave in Chicago, Illinois, killed about 800 people, most of them senior citizens living in houses with no air-conditioning, too afraid to open their windows for fear of vandalism. A recently published report blamed the lack of suitable precautions by city officials for the large number of deaths.

Would those computer scientists like to explain why there was a month-long cold spell in the winter of 2003 in Vietnam and Bangladesh? That freak weather caused hundreds of deaths from exposure to temperatures of 10C among people living in houses with no heating. Was that freak weather due to global warming? There are several examples of freak "cold weather" incidents in the last six years. The city of Halifax in eastern Canada received close to 100 cm of snowfall in a 24-hour period in February 2004; the city lost almost 5 million man-hours as a result of huge piles of snow that could not be removed from the streets. In New York City and vicinity, 50 cm of snow fell on December 5, 2003, an all-time record snowfall for so early a date. Are we to blame these freak weather incidents on global warming? What nonsense!

Madhav Khandekar
Consulting Meteorologist, Unionville, Canada.
(Retd) Research Scientist, Environment Canada

M.L. Khandekar, "Are Computer Model Projections Reliable Enough for Climate Policy?" Energy & Environment (UK), Vol. 15, 2004, pp. 521-525.

9. Much Ado About Fu: The Satellite Saga Continues
By Roy Spencer, Univ of Alabama, Huntsville (12/3/04)

The results of two research studies announced this week address the infamous discrepancy between satellite and surface thermometer trends over the last 25 years.

The original satellite dataset produced by the University of Alabama in Huntsville (UAH) now has a warming trend of 0.08 degC/decade since 1979, while the surface thermometer trend is two to three times this value. Climate models, in contrast, claim that any surface warming as a result of greenhouse warming should be amplified with height, not reduced. This has led to varying levels of concern in the climate community that the theory contained in the climate models might be in error.

As background, a study published earlier this year by Fu et al. (1) attempted to estimate the amount of tropospheric warming by a simple linear combination of the stratospheric and tropospheric channels of the Microwave Sounding Units (MSUs) flying on NOAA polar-orbiting weather satellites. (The troposphere exists from the surface up to a height of around 8-12 miles, the stratosphere overlays it.) Since the tropospheric channel has about 15% influence from the stratosphere -- which has cooled strongly since 1979 -- the tropospheric temperature can only be estimated through removal of the stratospheric component. Fu et al. used radiosonde (weather balloon) data to arrive at an optimum combination of the two channels that, when applied to the satellite-observed temperature trends, resulted in a tropospheric warming trend that was larger than that estimated by UAH with a different technique.
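In outline, the Fu et al. correction is a weighted combination of the two MSU channels: since channel 2 (troposphere) carries about a 15% stratospheric contribution, a term proportional to channel 4 (stratosphere) is subtracted back out. The sketch below is illustrative only; the weights and channel trends are assumed values of roughly the right magnitude, not the published numbers:

```python
# Illustrative sketch of a two-channel stratospheric correction.
# Assumed trends, deg C per decade (not the published values):
t2_trend = 0.03    # MSU channel 2: troposphere plus ~15% stratosphere
t4_trend = -0.45   # MSU channel 4: lower stratosphere (strong cooling)

# Assumed weights of the form used by Fu et al.: amplify channel 2,
# then subtract the stratospheric cooling it contains via channel 4.
a2, a4 = 1.15, -0.15

tropo_trend = a2 * t2_trend + a4 * t4_trend
print(f"Estimated tropospheric trend: {tropo_trend:+.2f} deg C/decade")
```

Because the channel-4 trend is strongly negative and its weight is negative, the correction raises the tropospheric estimate above the raw channel-2 trend, which is why the Fu et al. trend exceeds the UAH value.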

In the first article announced this week, Fu & Johanson (2) estimate the stratospheric contribution to the satellite instrument's tropospheric channel through a slightly different method than in their original article. They used previously published radiosonde estimates of temperature trends through the lower and middle stratosphere to estimate the error in their method, as well as the amount of stratospheric cooling contained in the tropospheric channel. While we would prefer to leave detailed comments for a journal article, a couple of general points can be made. For the period they examined (1979-2001), our (UAH) lower-tropospheric temperature trend is +0.06 deg. C/decade, while their estimate of the (whole) tropospheric trend is +0.09 deg C/decade. You might notice that the difference between these two trends is small, considering the probable error bounds on these estimates and the fact that the two techniques measure somewhat different layers. Also, their method depends on belief in the radiosonde-measured trends in the lower stratosphere, even though we know there are larger errors at those altitudes than in the troposphere -- and most published radiosonde trends for the troposphere show little or no global warming (!).

As is often the case, the press release that described their new study made claims that were, in my view, exaggerated. Nevertheless, given the importance of the global warming issue, this line of research is probably worthwhile as it provides an alternative way of interpreting the satellite data.

The other study (3), published by Simon Tett and Peter Thorne at the UK's Hadley Centre, takes issue with the original Fu et al. method. Tett and Thorne claim that when the technique is applied to a variety of radiosonde, reanalysis, and global model simulation datasets in the tropics, it leads to results that are more variable than those the UAH technique produces. It also mentions the dependence of the method on the assumed characteristics of the radiosonde data.

What all this means in terms of observed and predicted global temperature trends remains to be seen. As part of the requirements of the Bush administration's Climate Change Science Plan, a variety of scientists are now sifting through the satellite, surface thermometer, and radiosonde data, and will report in the coming year on their findings.


1. Fu, Q., C.M. Johanson, S.G. Warren, and D.J. Seidel, 2004: Contribution of stratospheric cooling to satellite-inferred tropospheric temperature trends. Nature, Vol. 429, pp. 55-58.

2. Fu, Q., and C.M. Johanson, 2004: Stratospheric influences on MSU-derived tropospheric temperature trends: A direct error analysis. Journal of Climate, to be published December 15, 2004.

3. Tett, S., and P. Thorne, 2004: Tropospheric temperature series from satellites. December 2, 2004, at Nature online (subscription required).


SEPP Comment: If one assumes that all of the 0.08 C/decade trend is anthropogenic, then the maximum temperature rise by 2100 is likely to be only about 0.8 C.
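The comment's figure is a straight linear extrapolation of the satellite trend over the roughly ten decades remaining until 2100:

```python
# Linear extrapolation of the UAH satellite trend to 2100.
trend_deg_per_decade = 0.08
decades_remaining = (2100 - 2004) / 10  # from the newsletter's date

projected_rise = trend_deg_per_decade * decades_remaining
print(f"Projected rise by 2100: about {projected_rise:.1f} C")
```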


10. We need your financial support: Our once-a-year appeal to our subscribers and readers.

Please be as generous as you can. Mail your check to

1600 S Eads St, Suite 712-S, Arlington, VA 22202, USA
or use PayPal. (We prefer checks.)

If you would like to receive our books or pamphlets (for donations over $100), please so indicate.

The Science & Environmental Policy Project (SEPP) is an international association of mainly physical scientists and engineers concerned with the responsible use of scientific information in the development of environmental policies. We publish scientific reports, hold briefings, give talks and seminars, and issue a weekly e-mail bulletin to some 2,000 addressees.

Our web address is

Our Priority Issues are:

Climate change
Control of greenhouse gases
Energy policy
Future energy sources
Nuclear radiation effects
Nuclear terrorism threats

1. We do not solicit support from either industry or government but receive donations and grants from private individuals and foundations.

2. Our tax status is that of a 501(c)(3) organization. Donations are fully tax-deductible.

3. Our officers and board members do not receive salaries or fees. We have no salaried employees but use volunteers and student help.

4. We don't rent or sell readership or donor lists. We don't accept pop-up or other advertising.



