The Week That Was May 13, 2006
 
 
 
 
 
 A closer look at hurricanes: Ocean temperatures are not the whole 
        story (Item #9). 
 Contemplating a run for the White House in 2008, former Vice President 
        Al Gore is positioning himself as an alternative to both Sen. Hillary 
        Clinton (D-NY) and the sleeping pill Ambien, aides to Mr. Gore confirmed 
        today.  www.Borowitzreport.com 
 President Bush has made the welcome point that the U.S. needs "to 
        move beyond a petroleum-based economy," and has lent his support 
        to the need to develop energy from biomass, which refers to all bulk plant 
        material. This is popular with the public and also enjoys significant 
        support in Congress. Unfortunately, congressional subsidies for biomass 
        are driven by farm-state politics rather than by a technology-development 
        effort that might offer a practical liquid fuel alternative to oil. Meanwhile, 
        major oil and chemical companies are evaluating biomass and investors 
        are chasing biomass investment opportunities. But how much of this is 
practicable?

Biomass can be divided into two classes: food-crop and cellulosic. Natural
        enzymes can easily break down food-crop biomass such as corn to simple 
        sugars, and ferment these sugars to ethanol. Cellulosic biomass -- which 
        includes agricultural residues from food crops, wood and crops such as 
        switch grass -- cannot easily be "digested" by natural enzymes. 
         Today, we use corn to produce ethanol in an automobile fuel known as 
        "gasohol" -- 10% ethanol and 90% gasoline. Generous federal 
        and state subsidies, largely in the form of exemption from gasoline taxes 
        for gasohol, explain the growth of its use; in 2005, over four billion 
        gallons of ethanol were used in gasohol out of a total gasoline pool of 
120 billion gallons. Politicians from corn states and other proponents
        of renewable energy support this federal subsidy, but most energy experts 
        believe using corn to make ethanol is not effective in the long run because 
the net amount of oil saved by gasohol use is minimal.

In the U.S., cultivation of corn is highly energy-intensive and a significant
        amount of oil and natural gas is used in growing, fertilizing and harvesting 
        it. Moreover, there is a substantial energy requirement -- much of it 
        supplied by diesel or natural gas -- for the fermentation and distillation 
        process that converts corn to ethanol. These petroleum inputs must be 
        subtracted when calculating the net amount of oil that is displaced by 
        the use of ethanol in gasohol. While there is some quarreling among experts, 
        it is clear that it takes two-thirds of a gallon of oil to make a gallon 
        equivalent of ethanol from corn. Thus one gallon of ethanol used in gasohol 
displaces perhaps one-third of a gallon of oil or less.

A federal tax credit of 10 cents per gallon on gasohol therefore costs the taxpayer a hefty $120 per barrel of oil displaced. Surely it is worthwhile to look for cheaper ways to eliminate oil.
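For readers who want to check the arithmetic, the back-of-the-envelope sketch below (ours, not the author's; it simply restates the figures quoted above, together with the standard 42-gallon barrel) reproduces both the net-displacement estimate and the roughly $120-per-barrel subsidy cost.

    # Back-of-the-envelope check of the corn-ethanol figures quoted above (our sketch).
    GAL_PER_BARREL = 42                      # gallons in a barrel of oil (standard)
    oil_in_per_gal_ethanol = 2.0 / 3.0       # oil and gas consumed to grow corn and distill one gallon of ethanol
    oil_displaced_per_gal_ethanol = 1.0 - oil_in_per_gal_ethanol   # roughly one-third of a gallon

    ethanol_share_of_gasohol = 0.10          # gasohol is 10% ethanol
    credit_per_gal_gasohol = 0.10            # federal tax credit, dollars per gallon of gasohol
    credit_per_gal_ethanol = credit_per_gal_gasohol / ethanol_share_of_gasohol   # $1.00 per gallon of ethanol

    cost_per_barrel_displaced = credit_per_gal_ethanol / oil_displaced_per_gal_ethanol * GAL_PER_BARREL
    print(f"Oil displaced per gallon of ethanol: {oil_displaced_per_gal_ethanol:.2f} gal")
    print(f"Subsidy per barrel of oil displaced: ${cost_per_barrel_displaced:.0f}")   # ~$126, the 'hefty $120' above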
The biotech approach, by contrast, seeks to produce new enzymes that will break down the difficult-to-digest cellulosic feedstock into simple sugars that can be fermented into ethanol or other liquid biofuel products.
        This approach merits genuine enthusiasm, especially as one can imagine 
        engineering an organism to produce enzymes that (a) break down the cellulosic 
        material, as well as (b) more efficiently ferment the sugars into ethanol. 
        Realizing this exciting prospect will not be easy. Many hurdles must be 
        overcome: Biotech experts need to assemble the gene "cassette" 
        and the organisms, and talented engineers need to demonstrate a cost-effective 
        process. Most importantly, an integrated bioengineering effort is required 
        to develop a process that: reduces the harsh pretreatment required to 
        dissolve the solid cellulosic feedstock; increases the concentration of 
        ethanol that is tolerated by the enzymes; and achieves an efficient process 
to separate the ethanol from the product liquor.

Success will require a sustained research effort; it is too early to
        estimate the production costs of this method, because process conditions 
        are unknown. However, the expected fossil energy inputs for cellulosic 
biomass will be much lower than those of gasohol, because the energy cost
        for cultivation is less, and because the portion of the cellulosic material 
        not converted to ethanol can be burned to provide process heat -- thus 
        substantially lowering the implied cost of federal tax subsidies per barrel 
of oil displaced.

I will be astonished, but delighted, if the cost of ethanol or other
        biomass-derived chemicals proves to be less than $40 per barrel of its 
        oil equivalent, and if large-scale production can be accomplished in six 
years.

As for the land required to support significant biofuel production from
        a dedicated energy crop, switch grass offers a basis for estimation. It 
        grows rapidly, with an expected harvest one or two years after planting. 
        Ignoring crop rotation, an acre under cultivation will produce five to 
        10 tons of switch grass annually, which in turn provides 50 to 100 gallons 
of ethanol per ton of biomass. Thus the land required to displace one million barrels of oil per day (about 10% of U.S. oil imports projected for 2025) is 25 million acres (or 39,000 square miles). This is roughly
        3% of the crop, range and pasture land that the Department of Agriculture 
        classifies as available in the U.S. I conclude that we can produce ethanol 
        from cellulosic biomass sufficient to displace one to two million barrels 
        of oil per day in the next couple of decades, but not much more. This 
        is a significant contribution, but not a long-term solution to our oil 
        problem.  Rising real prices of oil and natural gas reflect in part the progressive 
Rising real prices of oil and natural gas reflect in part the progressive decline in low-cost reserves, and signal the wisdom of preparing now for
        a long transition from our petroleum-based economy. Almost certainly, 
        future economies will exploit all possible technology options for replacing 
        petroleum-based liquid fuels, especially technologies that do not produce 
        net carbon dioxide, the major greenhouse gas. Biomass should, properly, 
        be considered along with nuclear power and coal conversion with carbon 
        capture and sequestration as important options for future energy supply. 
         Mr. Deutch, director of energy research and undersecretary of Energy 
        in the Carter administration, and director of the CIA and deputy secretary 
        of Defense in the first Clinton administration, is a professor of chemistry 
        at MIT.  
 Arizona Congressman John Shadegg is the first politician of note to propose 
        a good idea in response to increased energy costs: the suspension of outrageously 
high tariffs on imported ethanol, says the Wall Street Journal.

The intent is to offset some of the gas price hikes that Congress has
        caused via the ethanol mandate it passed last year. That requirement -- 
        that drivers use 7.5 billion gallons of ethanol annually by 2012 -- is 
        currently helping to increase the cost of gas, since ethanol is in short 
supply.

Shadegg's bill would suspend the tariffs on imported ethanol until 2007.
        Not only would this result in a new flow of ethanol in a tight market, 
        it would give the gas industry time to prepare its infrastructure to handle 
new domestic ethanol requirements, explains the Journal.

One irony of the current gas panic is that big oil companies are being
        pilloried for their profits, but domestic ethanol producers get a pass. 
        Yet the ethanol makers receive more government subsidies and are responsible 
        for far more of the current gasoline price spike. Congress doesn't have 
        to bash ethanol makers; all it has to do is allow more foreign supply, 
which would do more to reduce gasoline prices, and more quickly, than any other single idea, says the Journal.

Source: Editorial, "A Good Gas Idea," Wall Street Journal,
        May 8, 2006 
 Few are the subjects on which you can exhibit in public an abject, sub-protozoan 
        stupidity without fear of damage to your reputation. Gasoline is surely 
        a miracle commodity. Yet it should be bracing for politicians, the American 
        people and the press that the only sensible opinion they're hearing on 
        $3 gasoline is coming from a reviled, overpaid energy executive, namely 
Exxon's Rex Tillerson.

Yes, he tells audience after audience, the world will depend on hydrocarbons
        as a primary energy source for decades to come. "It is true that 
        the age of 'easy oil' is over. What many fail to realize is that it has 
        been over for decades. Our industry constantly operates at the edge of 
        technical possibility, constantly developing and applying new technologies 
        to make those possibilities a reality," he told a group in Washington 
last week.

Doubters might consult a new book by energy economist Mark Jaccard, entitled
        "Sustainable Fossil Fuels," winner of Canada's Donner Prize. 
        He argues that hydrocarbons, in the form of oil, gas and coal, exist in 
such abundance that the challenge of technology is how to burn them more cleanly,
not how to survive without them.

The closest Mr. Tillerson comes to a prediction is that, with all these
        fossil hydrocarbons in stock, technology will allow the world to consume 
        a growing, rather than shrinking, volume of fuel at a price users are 
        willing to pay. This puts him at odds with the "peak oil" theorists, 
        but rests on the defensible proposition that the future will be much like 
        the past. What the future price will be, Mr. Tillerson doesn't vouchsafe. 
        Exxon is sticking to its corporate discipline of investing in oil projects 
        only if they'll pay an adequate return at an oil price much lower than 
        today's. And since the company handsomely leads its peers on return on 
capital, its analysis of industry economics must be pretty good.

To raise the most discordant question of all: Why is $3 gasoline a "crisis" anyway? The fluctuations of gasoline prices are in line with normal experience,
        which is that commodity prices are volatile. The price chart for gasoline 
        over any number of years doesn't look much different from the price chart 
for corn, aluminum, orange juice, etc.

An even better question right now is: How can the forces that require
        Mr. Tillerson not to speak idiocy on the subject of gas prices be harnessed 
        to our political culture, where the opposite incentive is sadly evident? 
         Look to a non-phony crisis, that of the welfare state. With their usual 
        dourness, the Social Security and Medicare trustees came out with their 
        annual report last week, and no demographic and fiscal miracles have transpired 
        since last year. These programs would have to consume three-quarters of 
        all projected federal taxes by 2040, up from 40% today, to keep their 
promises to beneficiaries.

Mr. Tillerson has the stock market looking over his shoulder at every
        moment, forcing him to adopt the intensive realism that usually prevails 
        when people have their own money on the line. When Americans finally must 
        look daily to the stock market rather than the government as guardian 
        of their retirement, their appetite for fantasies and demagoguery on bread-and-butter 
        issues like gas prices will decline too. The single biggest advance for 
        self-government since the invention of literacy will be liberating voters 
        from the infantilizing illusion that somebody else can provide for their 
        old age.  
 Automotive equipment maker Delphi will opt for this trial-tested carbon 
        dioxide air-conditioning system, if more experimental hydrofluorocarbon 
refrigerants don't pan out. (Photo courtesy of Delphi.)

In the 1990s, air conditioning suppliers switched from the chlorofluorocarbon
        Freon to an equally troublesome hydrofluorocarbon called R-134a; while 
        easy on the ozone, R-134a is a greenhouse gas that's 1,300 times more 
potent than CO2.

The impact has been most acute in automotive applications, where refrigerants
        often leak out. Indeed, by 2010, such leakage will contribute more than 
        4 percent of the total climate change impact from motor vehicles. Add 
        in the extra fuel consumption to run the AC, and AC's share rises to 7 
percent.
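The weighting behind such figures is straightforward: a refrigerant leak counts as its mass multiplied by its global-warming potential. The sketch below is purely illustrative; the 100-gram annual leak is a hypothetical number, not one taken from the article.

    # GWP weighting in one line: a refrigerant leak counts as mass x global-warming potential.
    GWP_R134A = 1300          # R-134a is ~1,300 times as potent as CO2 (figure quoted above)
    leak_kg_per_year = 0.1    # hypothetical: 100 g of R-134a leaked from one car's AC in a year

    co2_equivalent_kg = leak_kg_per_year * GWP_R134A
    print(f"{leak_kg_per_year * 1000:.0f} g of R-134a ~ {co2_equivalent_kg:.0f} kg of CO2-equivalent")
    # About 130 kg of CO2-equivalent, comparable to burning roughly 55-60 liters of gasoline --
    # which is why leakage alone can amount to a few percent of a car's total climate impact.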
Little surprise, then, that the European Union decided this January 31 to begin phasing out the use of R-134a in new-model cars beginning in
        2011, and that regulators in California are preparing to follow suit. 
        Until this spring, the most likely replacement looked to be novel high-pressure 
        systems employing, ironically, CO2 as the refrigerant. Behr GmbH -- Europe's 
leading AC supplier for cars -- announced last month that it would begin selling CO2-based systems ahead of the EU's 2011 deadline.

But Behr's competitors, such as Troy, MI-based Delphi and Germany's Robert
        Bosch GmbH, have been backing away from CO2 since February, when DuPont 
        and Honeywell unveiled new hydrofluorocarbon refrigerants that may be 
        clean enough to squeak by the regulators. According to the chemical companies, 
the new kinds of hydrofluorocarbons are no more than 150 times as potent as CO2 as greenhouse gases -- the limit set by the EU for auto refrigerants
        after 2011. What's more, these refrigerants can be dropped into existing 
        AC equipment. "The prospect of having a new drop-in refrigerant that 
        would satisfy the 2011 legislation is incredible -- it's enormous," 
        says Stefan Glober, director of engineering for Delphi's thermal and interior 
division.

Many questions remain for both options, however. The new hydrofluorocarbon-based
        refrigerants offered by DuPont and Honeywell must complete a host of long-term 
        tests, including for the stability of the compounds under heavy use and 
        for toxicity. That could take at least three years. And it's unknown how 
        much the new refrigerants will cost to manufacture. This means that AC 
        manufacturers must also continue to develop their new CO2 systems. "These 
        alternatives have appeared relatively late. That's the dilemma we're in 
        right now," says Glober.  The CO2 systems have their own hurdles. One is detecting leaks: cheap, 
        effective CO2 sensors don't exist yet. Another is cost. And it's here 
        that Behr and its competitors part ways. Glober says the industry consensus 
        is that the first CO2 systems will sell for €150-200 more than conventional 
        AC systems, doubling their costs. Behr, in contrast, says it will be able 
        to keep down the added cost to less than €100 in the first-generation 
        system and half that by 2015 -- sums that the firm predicts will be justified 
        by higher performance. 
 Carbon prices crashed in late April after it emerged that five EU states 
        had emitted less carbon in 2005 than they had been allocated under the 
        EU emissions trading scheme. Still in its infancy, the only certainty 
        for Europe's fledgling carbon market is that short-term price volatility 
will continue, as Datamonitor's Paul Stewart explains...

EU carbon credits collapsed from record highs of over €30 to around €11
        per metric tonne in just one week after the Czech Republic, Estonia, France, 
        the Netherlands and Sweden all reported lower than anticipated emissions 
in 2005.

The EU's emissions trading scheme (ETS) is the key mechanism with which
        Brussels intends to get Europe on track to meet its Kyoto target. The 
        extreme price volatility witnessed in April 2006 has, however, prompted 
        a skeptical review of the robustness of the original phase-I emission 
        quotas, raising concerns that countries over-allocated in their national 
allocation plans for the period 2005 to 2007.

In reality, the market has reacted violently to a limited amount of data, with only 28% of the total volume included in the scheme having been verified
        against actual 2005 emission levels. The trigger for massive carbon price 
        losses was the fact that countries such as the Czech Republic, France 
        and the Netherlands had emitted far less than traders had anticipated. 
        Conversely, news that Spain had emitted 10 million tonnes more than it 
        had been allocated also failed to stem heavy losses purely because the 
        market had expected a larger deficit from one of Europe's fastest-growing 
energy consumers.
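For readers unfamiliar with the bookkeeping, a country's (or an installation's) position under the scheme is simply its allocation minus its verified emissions. The sketch below illustrates the arithmetic with made-up figures; only Spain's roughly 10-million-tonne overshoot echoes a number mentioned in the text.

    # EU-ETS bookkeeping: position = allocation - verified emissions (Mt CO2).
    # Positive means surplus allowances to sell; negative means a deficit to cover by buying.
    # All figures are illustrative, except that Spain is shown ~10 Mt over its allocation, as in the text.
    allocations_vs_verified = {
        "Czech Republic": (97, 83),
        "France": (156, 131),
        "Netherlands": (95, 80),
        "Spain": (172, 182),
    }

    for country, (allocated, verified) in allocations_vs_verified.items():
        position = allocated - verified
        status = "surplus (long)" if position > 0 else "deficit (short)"
        print(f"{country:15s} {position:+5d} Mt  {status}")

    net = sum(a - v for a, v in allocations_vs_verified.values())
    print(f"Aggregate position: {net:+d} Mt (a net surplus is what depresses the allowance price)")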
The European Commission (EC) is rightly concerned that large price swings can undermine the effectiveness of the scheme in financing carbon abatement
investment -- reportedly emailing all member states to request that they withhold
        their verified emissions data until a complete 25-country assessment is 
        released on May 15, 2006. On this date the market will also see the 2005 
emissions of Germany, Italy and the UK, which combined account for around 44% of the EU's ETS quota.

In the short term, Europe's carbon market remains an immature and volatile
        trading environment with traders inconsistently shifting their focus from 
        fuel market fundamentals to regulatory drivers. Longer-term, however, 
        the EU-ETS must be in deficit if Europe is to meet its collective UN-mandated 
        Kyoto target. Ultimately, a potential excess of allowances in phase I 
        will only heighten pressure on the EC to be more stringent with its phase 
        II allocations. 
 True, it needs close reading; and true, it comes from an obscure and 
        mostly powerless institution. But it's possible to detect subtle shifts 
in the EU's position on the Kyoto Protocol.

In an 'Opinion' of 28 April 2006, on the effects of international agreements
        to reduce greenhouse gas emissions on the industrial change processes 
        in Europe, the European Economic and Social Committee timidly opens the 
        door for an overhaul of Europe's climate policy, especially its CO2 emission 
trading system.

The opening sentence is still funny: "Climate change is a unique
        problem that humanity has never before encountered in modern times." 
        I always figured that climate change is of all times. It is the norm, 
        not the exception. And mankind has coped with it pretty successfully so 
far.

"Further policies to reduce greenhouse gas emissions must take into
        account all the economic parameters. If not, those states which have ratified 
        the Kyoto protocol run the risk of having some of their manufacturing 
        move to developed economies which are still hesitating to sign the protocol 
        or to developing countries which are not yet subject to any quota obligations 
        under it. This could result in economic losses and weakened competitiveness, 
without producing the desired global reduction in emissions."

So true. One can only wonder why nobody thought of it before.

And then another pinch of realism:

Surprise, surprise! Is this the beginning of the recognition that there
        is a gap between the greenhouse gas reduction rhetoric of EU member countries 
and actual results?

It also seems that the EU has finally woken up to the outcome of the
        G-8 Gleneagles Summit and the Montreal Climate Conference. There it became 
        clear that the major economic powers in the world were not willing to 
        follow the EU's climate policy of cap-and-trade. Nevertheless, the 'Opinion' 
        still makes an obligatory reference to "future negotiations": 
         "These negotiations must lead in the future to an acceptable way 
        of continuing the process of reducing greenhouse gas emissions after 2012 
        - one that involves all the economically developed countries and the prime 
        producers of emissions in the developing countries as a whole and especially 
those where development is rapid."

But subsequently reality sets in:

Again, so true! Again, why did nobody think of it before?

And finally another surprise. How often have we heard "the science
        is settled" and "all scientists agree"? Apparently the 
        EESC is not so sure any more, because it concludes:  More likely the steep fall in price of carbon credits, from a peak of €31 to 11 and maybe soon to zero, means no more money in the Kyoto Scam. Much more gain in the huge subsidies for Wind farms and construction of a Grid for Europe. The big Energy Companies[BEC] cannot jump on this bandwagon quick enough. And Climate Alarmists think BECs are financing climate sceptics. Why do they get everything so wrong? Meanwhile, Eurobusiness is finding it hard to compete in a global market. Since cheap energy facilitates growth, punitive fuel taxes in Europe may be the last straw. But global competitors need not worry. This will take decades to percolate 
        down the layers of Eco-Freak bureaucracy dominant in EU, National, Regional 
        and Local government -- never mind the caring, moralizing Media. 8. Electronic smog? The evidence - which is being taken seriously by national and international bodies and authorities - suggests that almost everyone is being exposed to a new form of pollution with countless sources in daily use in every home. Two official Department of Health reports on the smog are to be presented to ministers next month, and the Health Protection Agency (HPA) has recently held the first meeting of an expert group charged with developing advice to the public on the threat. The UN's World Health Organisation (WHO) calls the electronic smog "one of the most common and fastest growing environmental influences" and stresses that it "takes seriously" concerns about the health effects. It adds that "everyone in the world" is exposed to it and that "levels will continue to increase as technology advances". Wiring creates electrical fields, one component of the smog, even when nothing is turned on. And all electrical equipment - from TVs to toasters - give off another one, magnetic fields. The fields rapidly decrease with distance but appliances such as hair dryers and electric shavers, used close to the head, can give high exposures. Electric blankets and clock radios near to beds produce even higher doses because people are exposed to them for many hours while sleeping. Radio frequency fields - yet another component - are emitted by microwave ovens, TV and radio transmitters, mobile phone masts and phones themselves, also used close to the head. The WHO says that the smog could interfere with the tiny natural electrical currents that help to drive the human body. Nerves relay signals by transmitting electric impulses, for example, while the use of electrocardiograms testify to the electrical activity of the heart. Campaigners have long been worried about exposure to fields from lines carried by electric pylons but, until recently, their concerns were dismissed, even ridiculed, by the authorities. But last year a study by the official National Radiological Protection Board concluded that children living close to the lines are more likely to get leukaemia, and ministers are considering whether to stop any more homes being built near them. The discovery is causing a large-scale reappraisal of the hazards of the smog. The International Agency for Research on Cancer - part of the WHO and the leading international organisation on the disease - classes the smog as a "possible human carcinogen". And Professor David Carpenter, dean of the School of Public Health at the State University of New York, told The Independent on Sunday last week that it was likely to cause up to 30 per cent of all childhood cancers. A report by the California Health Department concludes that it is also likely to cause adult leukaemia, brain cancers and possibly breast cancer and could be responsible for a 10th of all miscarriages. Professor Denis Henshaw, professor of human radiation effects at Bristol University, says that "a huge and substantive body of evidence indicates a range of adverse health effects". He estimates that the smog causes some 9,000 cases of depression. Perhaps strangest of all, there is increasing evidence that the smog causes some people to become allergic to electricity, leading to nausea, pain, dizziness, depression and difficulties in sleeping and concentrating when they use electrical appliances or go near mobile phone masts. Some are so badly affected that they have to change their lifestyles. 
While it is not yet certain how the condition is caused, both the WHO and the HPA accept that it exists, and the UN body estimates that up to three in every 100 people are affected by it.
 CHARLOTTESVILLE, Va., May 9 (AScribe Newswire) -- New research calls 
        into question the linkage between major Atlantic hurricanes and global 
        warming. That is one of the conclusions from a University of Virginia 
        study to appear in the May 10, 2006 issue of the journal Geophysical Research 
Letters.

In recent years, a large number of severe Atlantic hurricanes have fueled
        a debate as to whether global warming is responsible. Because high sea-surface 
        temperatures fuel tropical cyclones, this linkage seems logical. In fact, 
        within the past year, several hurricane researchers have correlated basin-wide 
        warming trends with increasing hurricane severity and have implicated 
a greenhouse-warming cause.

But unlike these prior studies, the U.Va. climatologists specifically
        examined water temperatures along the path of each storm, providing a 
        more precise picture of the tropical environment involved in each hurricane's 
        development. They found that increasing water temperatures can account 
        for only about half of the increase in strong hurricanes over the past 
        25 years; therefore the remaining storminess increase must be related 
to other factors.

"It is too simplistic to only implicate sea surface temperatures
        in the dramatic increase in the number of major hurricanes," said 
        lead author Patrick Michaels, U.Va. professor of environmental sciences 
and director of the Virginia Climatology Office.

For a storm to reach the status of a major hurricane, a very specific
        set of atmospheric conditions must be met within the region of the storm's 
development, and sufficiently high sea-surface temperature is only one of these factors. The authors found that the ultimate strength of a hurricane
        is not directly linked to the underlying water temperatures. Instead, 
        they found that a temperature threshold, 89 degrees Fahrenheit, must be 
        crossed before a weak tropical cyclone has the potential to become a monster 
        hurricane. Once the threshold is crossed, water temperature is no longer 
        an important factor. "At that point, other factors take over, such 
        as the vertical wind profile, and atmospheric temperature and moisture 
        gradients," Michaels said.  While there has been extensive recent discussion about whether or not 
        human-induced global warming is currently playing a role in the increased 
        frequency and intensity of Atlantic hurricanes, Michaels downplays this 
impact, at least for the current climate.

"Some aspects of the tropical environment have evolved much differently
        than they were expected to under the assumption that only increasing greenhouse 
        gases were involved. This leads me to believe that natural oscillations 
        have also been responsible for what we have seen," Michaels said. 
         But what if sea-surface temperatures continue to rise into the future, 
        if the world continues to warm from an enhancing greenhouse effect? "In 
        the future we may expect to see more major hurricanes," Michaels 
        said, "but we don't expect the ones that do form to be any stronger 
than the ones that we have seen in the past."

From the paper's abstract: "Whereas there is a significant relationship between overall sea-surface temperature (SST) and tropical cyclone intensity, the relationship is much less clear in the upper range of SST normally associated with these storms. There, we find a step-like, rather than a continuous, influence of SST on cyclone strength, suggesting that there exists a SST threshold that must be exceeded before tropical cyclones develop into major hurricanes. Further, we show that the SST influence varies markedly over time, thereby indicating that other aspects of the tropical environment are also critically important for tropical cyclone intensification. These findings highlight the complex nature of hurricane development and weaken the notion of a simple cause-and-effect relationship between rising SST and stronger Atlantic hurricanes."

Reference: Michaels, P. J., P. C. Knappenberger, and R. E. Davis, 2006. Sea-surface temperatures and tropical cyclones in the Atlantic basin. Geophysical Research Letters, 33, doi:10.1029/2006GL025757.
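The "step-like, rather than a continuous, influence" can be illustrated with a toy comparison. The sketch below uses synthetic data and is not the authors' method or dataset: it fits both a straight line and a simple two-level step model to SST-intensity pairs and compares how well each describes the data.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic SST (deg F) and storm-intensity data, built with a step at about 89 F
    # (illustrative only -- not the authors' data).
    sst = rng.uniform(82.0, 92.0, size=300)
    intensity = np.where(sst >= 89.0, 120.0, 80.0) + rng.normal(0.0, 10.0, size=300)

    # Model 1: continuous (linear) dependence of intensity on SST.
    slope, intercept = np.polyfit(sst, intensity, 1)
    rss_linear = np.sum((intensity - (slope * sst + intercept)) ** 2)

    # Model 2: step model -- one mean below a candidate threshold, another above.
    # Scan candidate thresholds and keep the best-fitting one.
    best_rss, best_thresh = np.inf, None
    for t in np.arange(84.0, 91.5, 0.1):
        below, above = intensity[sst < t], intensity[sst >= t]
        if len(below) < 10 or len(above) < 10:
            continue
        rss = np.sum((below - below.mean()) ** 2) + np.sum((above - above.mean()) ** 2)
        if rss < best_rss:
            best_rss, best_thresh = rss, t

    print(f"Linear fit, residual sum of squares: {rss_linear:,.0f}")
    print(f"Step fit,   residual sum of squares: {best_rss:,.0f} (threshold ~{best_thresh:.1f} F)")
    # On data like these the step model fits far better -- the kind of signature a threshold,
    # rather than a smoothly increasing SST effect, would leave.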
 
 