
The Current Wisdom: The Short-Term Climate Trend Is Not Your Friend

The Current Wisdom is a series of monthly posts in which Senior Fellow Patrick J. Michaels reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

The Current Wisdom only comments on science appearing in the refereed, peer-reviewed literature, or that has been peer-screened prior to presentation at a scientific congress.

***********

It seems like everyone, from exalted climate scientists to late-night amateur tweeters, can get a bit over-excited about short-term fluctuations, reading into them deep cosmic and political meaning, when they are likely the statistical hiccups of our mathematically surly atmosphere.

There have been some major errors in forecasts of recent trends. Perhaps the most famous were made by NASA’s James Hansen in 1988, who overestimated warming between then and now by a whopping 40% or so.

But it is easy to get snookered by short-term fluctuations.  As shown in Figure 1, it is quite obvious that there has been virtually no net change in temperature since 1997, allowing for the fact that measurement errors in global average surface temperature are easily a tenth of a degree or more. (The magnitude of those errors will be considered in a future Current Wisdom.)

Figure 1. Annual global average surface temperature anomaly (°C), 1997-2010 (data source: Hadley Center).

Some who are concerned that environmental regulation lacks sound scientific support have seized upon this 13-year stretch as “proof” that there is no such thing as global warming driven by carbon dioxide.  More on that at the end of this Wisdom.

Similarly, periods of seemingly rapid warming can prompt scientists to see changes where there aren’t any.

Consider a landmark paper published in 2000 in Geophysical Research Letters by Tom Karl, a prominent researcher who heads our National Climatic Data Center (NCDC) and who just finished a stint as President of the American Meteorological Society.  He couldn’t resist the climatic blip that occurred prior to the current stagnation of warming, namely the very warm episode of the late 1990s.

Cooler heads at the time noted that it was an artifact of the great El Nino of 1997-98, a periodic warming of the tropical Pacific that has been coming and going for millions of years. 

Nonetheless, the paper was published and accompanied by a flashy press release titled “Global warming may be accelerating.”  

What Karl did was to examine the 16 consecutive months of record-high temperatures (beginning in May 1997) and to calculate the chance that this could happen given the fairly pokey warming rate then occurring, approximately 0.17°C (0.31°F) per decade.  He concluded there was less than a five percent probability, unless the warming rate had suddenly increased.

From the press release:

Karl and colleagues conclude that there is only a small chance that the string of record high temperatures in 1997-98 was simply an unusual event, rather than a change point, the start of a new and faster ongoing trend.

He also gave a number:  “…the probability of observing the record temperatures is more likely with high average rates of warming, around 3°C [5.4°F]/century,” which works out to 0.3°C per decade.

Our Figure 2 shows what was probabilistically forecast beginning in May 1997, and what actually happened.  Between then and now, according to this paper, global temperatures should have warmed around 0.4°C (0.7°F).  The observed warming rate for the last 13.5 years, a span that includes the dramatic warmth of 1997-98, was a paltry 0.06°C (0.11°F) per decade.

Figure 2. Prior to mid-1997, the observed warming trend (dashed line) was 0.17°/decade.  Karl said there was a greater than 95% probability that 1997-8 would mark a “change point”, where warming would accelerate to around 0.30°/decade.  Since then, the rate has been 0.06°/decade, or 20% of what was forecast.
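The gap between the forecast and the outcome is simple back-of-the-envelope arithmetic. A minimal sketch (the rates and the 13.5-year span are the figures quoted above; the function name is mine):

```python
def warming(rate_per_decade, years):
    """Total warming (degrees C) implied by a constant trend over `years`."""
    return rate_per_decade * (years / 10.0)

span = 13.5  # years from mid-1997 through late 2010

projected = warming(0.30, span)  # Karl's post-"change point" rate: ~0.4 C
observed = warming(0.06, span)   # the rate actually observed: ~0.08 C
```

The 0.4°C figure quoted above is just 0.30°C per decade multiplied by 1.35 decades.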

Karl did provide some statistical wiggle room.  He wrote that “unusual events can occur,” leaving a chance (given as less than 5%) that 1997-98 was just a statistical hiccup rather than an acceleration in warming, which it ultimately proved to be.

The press release couldn’t resist the “it’s worse than we thought” mindset that pervades climate science:

Since completing the research, the data for 1999 has been compiled.  The researchers found that 1999 was the fifth warmest year on record, although as a La Nina year it would normally be cooler [than what? ed.].

“La Nina” is the cool phase of El Nino, which drops temperatures about as much as El Nino raises them. What the press release and the GRL paper completely neglected to mention is that the great warm year of 1998 was a result of the “natural” El Nino superimposed upon the overall slight warming trend.

In other words, there was every reason to believe at the time that the anomalous temperatures were indeed a statistical blip resulting from a very high-amplitude version of a natural oscillation in the earth’s climate that recurs every few years.

Now, back to the last 13 years. The puny recent changes may also just be our atmosphere’s make-up call for the sudden warming of the late 1990s, or another hiccup.

Climate models in which carbon dioxide increases at roughly the observed rate characteristically produce constant rates of warming.  There’s a good reason for this.  Temperature responds logarithmically (i.e., less and less) to changes in this gas as its concentration increases, but the concentration tends to increase exponentially (i.e., more and more).  The combination of an increasingly damped response to an ever-increasing input tends to resemble a straight line, or a constant rate of warming.
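That near-cancellation can be illustrated with a toy calculation. This is not a climate model: the sensitivity `s`, growth rate `k`, and baseline concentration `c0` are made-up illustrative values. The point is that a logarithmic response to an exponentially growing concentration is exactly a straight line:

```python
import math

def temperature(t, s=3.0, k=0.005, c0=280.0):
    """Warming after t years, assuming T = s*ln(C/C0) and C(t) = C0*e^(k*t).

    Substituting gives T(t) = s*ln(e^(k*t)) = s*k*t: linear in t,
    i.e. a constant rate of warming.
    """
    c = c0 * math.exp(k * t)      # concentration grows exponentially
    return s * math.log(c / c0)   # response is logarithmically damped

# The warming per decade comes out the same early and late:
early = temperature(10) - temperature(0)
late = temperature(110) - temperature(100)
```

Whatever illustrative numbers one plugs in, the early-period and late-period rates are identical, which is why such model runs look like straight lines.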

Indeed, Karl noted in his paper (and I have noted in virtually every public lecture I give) that “projections of temperature change in the next [i.e., the 21st] century, using [the United Nations’] business as usual scenarios…have relatively constant rates of global temperature increase”.  It’s just that their constant rates tend to be higher than the one being observed.  The average rate of warming predicted for this century by the UN is about 2.5°C per century, while the observed rate has been, as predicted, constant, but with a lower value of 1.7°C per century.  As Figure 3 shows, this rate has been remarkably constant for over three decades.

 

Figure 3. Annual global average surface temperature anomaly (°C), 1976-2010 (data source: Hadley Center).  It’s hard to imagine a more constant trend, despite the 1998 peak and the subsequent torpid warming.

The bottom line is that short-term trends are not your friends when talking about long-term climate change.

References

Hansen, J.E., et al., 1988. Global climate changes as forecast by Goddard Institute for Space Studies three-dimensional model. Journal of Geophysical Research, 93, 9341-9364.

Karl, T. R., R. W. Knight, and B. Baker, 2000. The record breaking global temperatures of 1997 and 1998: Evidence for an increase in the rate of global warming? Geophysical Research Letters, 27, 719-722.

Michaels, P. J., and P. C. Knappenberger, 2009. Scientific Shortcomings in the EPA’s Endangerment Finding from Greenhouse Gases, Cato Journal, 29, 497-521, http://www.cato.org/pubs/journal/cj29n3/cj29n3-8.pdf.


The Shocking Truth: The Scientific American Poll on Climate Change

November’s Scientific American features a profile of Georgia Tech atmospheric scientist Judith Curry, who has committed the mortal sin of reaching out to other scientists who hypothesize that global warming isn’t the disaster it’s been cracked up to be.  I have personal experience with this, as she invited me to give a research seminar in Tech’s prestigious School of Earth and Atmospheric Sciences in 2008.  My lecture summarizing the reasons for doubting the apocalyptic synthesis of climate change was well-received by an overflow crowd.

Written by Michael Lemonick, who hails from the shrill blog Climate Central, the article isn’t devoid of the usual swipes, calling her a “heretic,” which is hardly true.  She’s simply another hardworking scientist who lets the data take her wherever it must, even if that leads her to question some of our more alarmist colleagues.

But, as a make-up call for calling attention to Curry, Scientific American has run a poll of its readers on climate change.  Remember that SciAm has been shilling for the climate apocalypse for years, publishing a particularly vicious series of attacks on Denmark’s Bjorn Lomborg and his book The Skeptical Environmentalist.  The magazine also featured NASA’s James Hansen and his outlandish claims on sea-level rise. Hansen has stated, under oath in a deposition, that a twenty-foot rise is quite possible within the next 89 years; oddly, he has failed to note that in 1988 he predicted that the West Side Highway in Manhattan would go permanently under water within twenty years.

SciAm probably expected a lot of people would agree with the key statement in their poll that the United Nations’ Intergovernmental Panel on Climate Change (IPCC) is “an effective group of government representatives and other experts.”

Hardly. As of this morning, only 16% of the 6655 respondents agreed.  84%—that is not a typo—described the IPCC as “a corrupt organization, prone to groupthink, with a political agenda.” 

The poll also asks “What should we do about climate change?” 69% say “nothing, we are powerless to stop it.” When asked about policy options, an astonishingly low 7% support cap-and-trade, which passed the U.S. House of Representatives in June, 2009, and cost approximately two dozen congressmen their seats.

The real killer is the question “What is causing climate change?” For this one, multiple answers are allowed: 26% said greenhouse gases from human activity, 32% solar variation, and 78% “natural processes.” (In reality, all three are causes of climate change.)

And finally, “How much would you be willing to pay to forestall the risk of catastrophic climate change?”  80% of the respondents said “nothing.”

Remember that this comes from what is hardly a random sample.  Scientific American is a reliably statist publication and therefore appeals to a readership that is skewed to the left of the political center.  This poll demonstrates that virtually everyone now acknowledges that the UN has corrupted climate science, that climate change is impossible to stop, and that futile attempts like cap-and-trade do nothing but waste money and burn political capital, things that Cato’s scholars have been saying for years.

Radioactive Corporate Welfare

A good default proposition regarding the government’s role in the economy would state that the government should not loan money to an enterprise if the enterprise in question cannot find one single market actor anywhere in the universe to loan said enterprise a single red cent.  It might suggest – I don’t know – that the investment is rather … dubious.

Alas, like all good propositions regarding the government’s role in the economy, this one is being left by the roadside by the Obama administration.  Unfortunately, the only complaint being made by a not insubstantial segment of the political Right – frequently, the political crowd that is busy decrying “Bailout Nation” – is that the loan guarantees are not fat enough.

I write, of course, about the $8.3 billion federal loan guarantee announced by President Obama this week for Southern Company to build two new nuclear power plants.  The money will be used to guarantee the loans being made by the federal government (via the Federal Financing Bank) to partially cover the cost of Southern’s projected $14 billion nuclear construction project at their Vogtle plant near Waynesboro, Georgia.  The loan guarantees were authorized by Congress in the 2005 Energy Policy Act and, we are told, are the first installment on a total package of $54 billion that the President would like to hand out to facilitate the construction of 7-10 new nuclear power plants (Congress, however, has only authorized $18.5 billion to this point).

The claim being made by some – that the loan guarantees are necessary to jump-start investor interest in new nuclear power plant construction – is not quite correct.  Even these lavish loan guarantees aren’t enough to do that.  In a letter to the U.S. Department of Energy dated July 2, 2007, six of Wall Street’s then-largest investment banks – Citigroup, Credit Suisse, Goldman Sachs, Lehman Brothers, Merrill Lynch, and Morgan Stanley – informed the administration that, contrary to the government’s expectations, anything short of a 100 percent unconditional guarantee would be insufficient to induce private lending.

Why is it risky to build nuclear power plants?  Because new nuclear projects tie up more capital for longer periods of time than their main competitor, natural gas-fired generation.  Nuclear power makes economic sense only if natural gas prices are very high; only then would the high initial costs of nuclear power be offset, over time, by its lower fuel costs.  Moreover, as noted by Moody’s in an analysis published in July of last year, there is uncertainty associated with construction costs, regulatory oversight, technological developments that might reduce the cost of rival facilities, and the ability of utilities to recover costs and make a profit over the lifetime of the plant – a risk tied up in the economic prospects of the region being served by the plant.  And those risks have been increasing, not decreasing, as time has gone on.

In short, even during the go-go days prior to the September 2008 crash – a time when Wall Street was allegedly throwing around money left and right to all sorts of dubious borrowers – the banks that stand accused of recklessly endangering their shareholders on other fronts were telling utility companies that they would not loan them anything for new nuclear power plant construction unless the feds unconditionally guaranteed every last penny of those loans.  That’s how risky market actors think it is to build nuclear power plants.

And it’s not as if the federal government disagrees completely.  The Congressional Budget Office pegs the chance of default (program-wide) at 50 percent or better, and the Government Accountability Office likewise thinks that default risks are quite high.  Energy Secretary Steven Chu says that he thinks the chance of default is much lower.  We can only speculate about who’s right, given that no one has tried to build a nuclear power plant in the United States for over 30 years.

Regardless of what the risk actually is, the loan guarantees do not reduce that risk.  They simply transfer the risk from the bank to the taxpayer.  In this particular case, however, the loan guarantee transfers risk from one arm of the state to another, so it doesn’t really count.  But if such loan guarantees ever were to induce actual private lending for plant construction, that’s how it would work.

Plenty of arguments have been offered to justify these loan guarantees.  Most of them are flimsy on their face.

For instance, we’re often told that we “need” new power plants.  But with electricity demand declining over the past couple of years, it is unclear when that need might arise.

Regardless, when the market “needs” more electricity, that need will be manifested in price signals that will carry with them profit opportunities.  Profit-hungry investors will be willing and able to meet that need without the help of government.  Of course, if market conditions don’t radically change, those needs will be met with gas-fired power plants, but hey, if that bothers you, take it up with someone else.

Others argue that we need the jobs that will be produced by new nuclear power plants.  Well, building big new reactors will certainly employ a lot of (largely unionized) construction workers.  But that’s one reason why building a nuclear power plant is not very economic compared to building gas-fired generators.  If creating jobs is the idea, whether the project makes any economic sense or not, then let’s just ban food imports and farm equipment and put everyone to work with hand plows and scythes.

Two somewhat more serious arguments have been offered to justify these loan guarantees.  Neither of them stands up to much scrutiny either.

The first argument – the argument most often heard from the nuclear power industry and some segments of the political Left – is that we need nuclear power to reduce greenhouse gas emissions.  Of course, the best (that is, most efficient) way to reduce greenhouse gas emissions is to internalize the cost of greenhouse gas emissions in the retail price of electricity and then allow market actors to adjust their production and consumption decisions accordingly.  That price internalization exercise, however (whether directly through a carbon tax or indirectly through a cap-and-trade program), does not appear to be in the cards in the foreseeable future.  Hence, the loan guarantees are advanced as a “second-best” solution, one that will get us the technology and economic efficiency that would be delivered by a properly crafted carbon tax or cap-and-trade program without the retail price increases associated with either.

One of several problems with this argument is that it would take one hell of a carbon tax – or one hell of an onerous cap-and-trade program – to get anyone interested in building nuclear power plants.  If natural gas prices remained roughly where they are at present (that is, if they were to remain at historical norms) then it would take a $90 per ton carbon tax or a cap-and-trade program that delivered carbon emission credits at $90 per ton on the open market to induce investment in nuclear power plants.  Few economists who study climate policy believe that a carbon tax of that size is defensible given what we know about climate change.

And that’s if construction costs are as low as advertised.  Were they to double (as they did from 2003 to 2009) – either because of endogenous increases in the cost of capital, labor, or construction-related resources or because of cost overruns – then it would take at least a $150 per ton carbon tax (or a cap-and-trade program that delivered $150 carbon credits to the market) to induce investment.

You might ask yourself what the historical relationship is between final (inflation-adjusted) nuclear power plant construction costs in the United States and construction costs as projected at the onset of the project.  Happily, the CBO has done your homework for you.  They found that final construction costs averaged 207 percent of projected costs.  Hence, a doubling of construction costs is probably more likely than not once a project is underway … if past is prologue.

The upshot is that there are many more efficient ways to respond to greenhouse gas emission constraints than to go on a nuclear power bender.  This observation is heresy on the Right, but almost every credible analysis of the matter backs up that observation.

The second argument one hears to justify federal loan guarantees is that they are necessary to counter-balance the excessive regulatory costs associated with new plant construction.  Now, put aside the fact that the Nuclear Energy Institute – the trade association of the nuclear power industry – has often expressed near-total satisfaction with the current federal regulatory regime.  If the regulatory regime is truly “bad” and, accordingly, is imposing steep and unnecessary costs on the industry, then the correct remedy is to improve said regulatory regime, not to subsidize the industry.

The counter-complaint that positive regulatory reforms are impossible is difficult to swallow.  After all, if there is sufficient political will to bestow tens of billions of dollars worth of tax money on this industry, then surely there is enough political will to reform the bad and unnecessarily costly regulations allegedly bedeviling the object of those very same legislative affections.

I will confess to being skeptical about the argument that high construction costs are largely the fault of regulators.  Building a light water reactor is a technologically challenging and costly undertaking whether regulators are on the scene or not.  Moreover, it is not obvious to me that the regulations in place are a priori objectionable from a libertarian perspective.

One rarely, if ever, hears particulars in this bill of complaint about nuclear regulation.  But if a persuasive bill of complaints is ever presented, then the appropriate response is to reform the regulations and then leave the decision to build or not to build to markets.

In the course of announcing these loan guarantees, President Obama said this week that “The fact is, changing the ways we produce and use energy requires us to think anew. It requires us to act anew.”  Well, there’s nothing new about throwing subsidies at nuclear power.  Economist Douglas Koplow calculates that federal nuclear subsidies totaled $178 billion from 1947 to 1999.  The promise of a nuclear economy with rates too cheap to meter has been made for over half a century.  What would be new would be a policy of “just saying no” to industries with their hands out in Washington.

[Cross-posted at MasterResource]

Are Industrialized Countries Responsible for Reducing the Well Being of Developing Countries?

A basic contention of developing countries (DCs) and various UN bureaucracies and multilateral groups during the course of international negotiations on climate change is that industrialized countries (ICs) bear a historical responsibility for global warming.  This contention underlies much of the justification for insisting not only that industrialized countries reduce their greenhouse gas emissions even as developing countries are given a bye on emission reductions, but that they also subsidize clean energy development and adaptation in developing countries. [It is also part of the rationale for arguing that industrialized countries should pay reparations for presumed damages from climate change.]

Based on the above contention, the Kyoto Protocol imposes no direct costs on developing countries and holds out the prospect of large amounts of transfer payments from industrialized to developing countries via the Clean Development Mechanism or an Adaptation Fund. Not surprisingly, virtually every developing country has ratified the Protocol and is adamant that these features be retained in any son-of-Kyoto.

For their part, UN and other multilateral agencies favor this approach because, lacking any taxing authority or other ready mechanism for raising revenues, they see revenue in helping manage, facilitate, or distribute the enormous amounts of money that, in theory, should flow from ICs to fund mitigation and adaptation in the DCs.

Continue reading here.

Cherry Picking Climate Catastrophes: Response to Conor Clarke, Part II

Conor Clarke at The Atlantic blog raised several issues with my study, “What to Do About Climate Change,” which Cato published last year.

One of Conor Clarke’s comments was that my analysis did not extend beyond the 21st century. He found this problematic because, as Conor put it, climate change would extend beyond 2100, and even if GDP is higher in 2100 with unfettered global warming than without, it’s not obvious that this GDP would continue to be higher “in the year 2200 or 2300 or 3758”. I addressed this portion of his argument in Part I of my response. Here I will address the second part of this argument, that “the possibility of ‘catastrophic’ climate change events — those with low probability but extremely high cost — becomes real after 2100.”

The examples of potentially catastrophic events that could be caused by anthropogenic greenhouse gas induced global warming (AGW) that have been offered to date (e.g., melting of the Greenland or West Antarctic Ice Sheets, or the shutdown of the thermohaline circulation) contain a few drops of plausibility submerged in oceans of speculation. There are no scientifically justified estimates of the probability of their occurrence by any given date. Nor are there scientifically justified estimates of the magnitude of damages such events might cause, not just in biophysical terms but also in socioeconomic terms. Therefore, to call these events “low probability” — as Mr. Clarke does — is a misnomer. They are more appropriately termed as plausible but highly speculative events.

Consider, for example, the potential collapse of the Greenland Ice Sheet (GIS). According to the IPCC’s WG I Summary for Policy Makers (p. 17), “If a negative surface mass balance were sustained for millennia, that would lead to virtually complete elimination of the Greenland Ice Sheet and a resulting contribution to sea level rise of about 7 m” (emphasis added). Presumably the same applies to the West Antarctic Ice Sheet.

But what is the probability that a negative surface mass balance can, in fact, be sustained for millennia, particularly after considering the amount of fossil fuels that can be economically extracted and the likelihood that other energy sources will not displace fossil fuels in the interim? [Remember we are told that peak oil is nigh, that renewables are almost competitive with fossil fuels, and that wind, solar and biofuels will soon pay for themselves.]

Second, for an event to be classified as a catastrophe, it should occur relatively quickly, precluding efforts by man or nature to adapt or otherwise deal with it.  But if it occurs over millennia, as the IPCC says, or even centuries, that gives humanity ample time to adjust, albeit at a socioeconomic cost.  But it need not be prohibitively dangerous to life, limb or property if: (1) the total amount of sea level rise (SLR) and, perhaps more importantly, the rate of SLR can be predicted with some confidence, as seems likely in the next few decades considering the resources being expended on such research; (2) the rate of SLR is slow relative to how fast populations can strengthen coastal defenses and/or relocate; and (3) there are no insurmountable barriers to migration.

This would be true even had the so-called “tipping point” already been passed and ultimate disintegration of the ice sheet were inevitable, so long as it takes millennia for the disintegration to be realized. In other words, the issue isn’t just whether the tipping point is reached; rather, it is how long the tipping actually takes. Consider, for example, a hand grenade tossed into a crowded room. Whether this results in tragedy, and the magnitude of that tragedy, depends upon how much time it takes for the grenade to go off, the reaction time of the occupants, and their ability to respond.

Lowe et al. (2006, pp. 32-33), based on a “pessimistic, but plausible, scenario in which atmospheric carbon dioxide concentrations were stabilised at four times pre-industrial levels,” estimated that a collapse of the Greenland Ice Sheet would over the next 1,000 years raise sea level by 2.3 meters (with a peak rate of 0.5 cm/yr). If one were to arbitrarily double that to account for potential melting of the West Antarctic Ice Sheet, that means a SLR of ~5 meters in 1,000 years with a peak rate (assuming the peaks coincide) of 1 meter per century.

Such a rise would not be unprecedented. Sea level has risen 120 meters in the past 18,000 years, an average of 0.67 meters per century, and as much as 4 meters per century during the meltwater pulse 1A episode 14,600 years ago (Weaver et al. 2003; subscription required). Neither humanity nor, from the perspective of millennial time scales (per the above quote from the IPCC), the rest of nature seems the worse for it. Coral reefs, for example, evolved and their compositions changed over millennia as new reefs grew while older ones were submerged in deeper water (e.g., Cabioch et al. 2008). So while there have been ecological changes, it is unknown whether the changes were for better or worse. For a melting of the GIS (or WAIS) to qualify as a catastrophe, one has to show, rather than assume, that the ecological consequences would, in fact, be for the worse.
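As a sanity check on these magnitudes, here is the rate arithmetic spelled out (all inputs are the figures quoted in the text; the helper function is mine):

```python
def meters_per_century(total_meters, years):
    """Average sea level rise rate, in meters per century."""
    return total_meters / years * 100.0

postglacial = meters_per_century(120.0, 18_000)  # ~0.67 m/century average
greenland = meters_per_century(2.3, 1_000)       # Lowe et al.: 0.23 m/century average
peak = (0.5 / 100.0) * 100.0                     # peak 0.5 cm/yr = 0.5 m/century
doubled_total = 2 * 2.3                          # crude WAIS doubling: 4.6, i.e. ~5 m
doubled_peak = 2 * peak                          # ~1 m/century, as in the text
```

Even the doubled peak rate sits well below the 4 meters per century reached during meltwater pulse 1A.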

Human beings can certainly cope with sea level rise of such magnitudes if they have centuries or millennia to do so. In fact, if necessary they could probably get out of the way in a matter of decades, if not years.

Can a relocation of such a magnitude be accomplished?

Consider that the global population increased from 2.5 billion in 1950 to 6.8 billion this year. Among other things, this meant creating the infrastructure for an extra 4.3 billion people in the intervening 59 years (as well as improving the infrastructure for the 2.5 billion counted in the baseline, many of whom barely had any infrastructure whatsoever in 1950). These improvements occurred at a time when everyone was significantly poorer. (Global per capita income is more than 3.5 times greater today than it was in 1950.) Therefore, while relocation will be costly, in theory, tomorrow’s much wealthier world ought to be able to relocate billions of people to higher ground over the next few centuries, if need be. In fact, once a decision is made to relocate, the cost differential of relocating, say, 10 meters higher rather than a meter higher is probably marginal. It should also be noted that over millennia the world’s infrastructure will have to be renewed or replaced dozens of times – and the world will be better for it. [For example, the ancient city of Troy, once on the coast but now a few kilometers inland, was built and rebuilt at least 9 times in 3 millennia.]

Also, so long as we are concerned about potential geological catastrophes whose probability of occurrence and impacts have yet to be scientifically estimated, we should also consider equally low or higher probability events that might negate their impacts. Specifically, it is quite possible — in fact probable — that somewhere between now and 2100 or 2200, technologies will become available that will deal with climate change much more economically than currently available technologies for reducing GHG emissions. Such technologies may include ocean fertilization, carbon sequestration, geo-engineering options (e.g., deploying mirrors in space) or more efficient solar or photovoltaic technologies. Similarly, there is a finite, non-zero probability that new and improved adaptation technologies will become available that will substantially reduce the net adverse impacts of climate change.

The historical record shows that this has occurred over the past century for virtually every climate-sensitive sector that has been studied. For example, from 1900 to 1970, U.S. death rates due to various climate-sensitive water-related diseases — dysentery, typhoid, paratyphoid, other gastrointestinal diseases, and malaria — declined by 99.6 to 100.0 percent. Similarly, poor agricultural productivity exacerbated by drought contributed to famines in India and China off and on through the 19th and 20th centuries, killing millions of people, but such famines haven’t recurred since the 1970s, despite any climate change and despite populations several-fold higher today. And by the early 2000s, deaths and death rates due to extreme weather events had dropped worldwide by over 95% from their earlier 20th-century peaks (Goklany 2006).

With respect to another global warming bogeyman — the shutdown of the thermohaline circulation (AKA the meridional overturning circulation), the basis for the deep freeze depicted in the movie, The Day After Tomorrow — the IPCC WG I SPM notes (p. 16), “Based on current model simulations, it is very likely that the meridional overturning circulation (MOC) of the Atlantic Ocean will slow down during the 21st century. The multi-model average reduction by 2100 is 25% (range from zero to about 50%) for SRES emission scenario A1B. Temperatures in the Atlantic region are projected to increase despite such changes due to the much larger warming associated with projected increases in greenhouse gases. It is very unlikely that the MOC will undergo a large abrupt transition during the 21st century. Longer-term changes in the MOC cannot be assessed with confidence.”

Not much has changed since then. A shutdown of the MOC doesn’t look any more likely now than it did then. See here, here, and here (pp. 316-317).

If one wants to develop rational policies to address speculative catastrophic events that could conceivably occur over the next few centuries or millennia, one should start by considering the universe of potential catastrophes and then develop criteria for which should be addressed and which should not. Rational policy must rest on systematic analysis, not on cherry-picking one’s favorite catastrophes.

Just as one may speculate on global warming induced catastrophes, one may just as plausibly speculate on catastrophes that may result absent global warming. Consider, for example, the possibility that absent global warming, the Little Ice Age might return. The consequences of another ice age, Little or not, could range from the severely negative to the positive (if it buffered the negative consequences of warming). That such a recurrence is not unlikely is evident from the fact that the earth entered a Little Ice Age and emerged from it only a century and a half ago, and that history may repeat itself over centuries or millennia.

Yet another catastrophe could be caused by greenhouse gas controls themselves: CO2 not only contributes to warming, it is also the key building block of life as we know it. All vegetation is created by photosynthesis of atmospheric CO2. In fact, according to the IPCC WG I report (2007, p. 106), net primary productivity of the global biosphere has increased in recent decades, partly due to greater warming, higher CO2 concentrations, and nitrogen deposition. Thus there is a finite probability that reducing CO2 emissions would reduce the net primary productivity of the terrestrial biosphere, with potentially severe negative consequences for the amount and diversity of wildlife it could support, as well as for agricultural and forest productivity, with adverse knock-on effects on hunger and health.

There is also a finite probability that the costs of GHG reductions could reduce economic growth worldwide. Even if only industrialized countries sign up for emission reductions, the negative consequences could show up in developing countries, because they derive a substantial share of their income from aid, trade, tourism, and remittances from the rest of the world. See, for example, Tol (2005), which examines this possibility, although the extent to which that study fully considered these factors (i.e., aid, trade, tourism, and remittances) is unclear.

Finally, one of the problems with the argument that society should address low-probability, high-impact events (assuming a probability could be estimated rather than assumed or guessed) is that it necessarily means there is a high probability that resources expended on addressing such catastrophic events will have been squandered. This wouldn’t be a problem but for the opportunity costs of such expenditures.

According to the 2007 IPCC Science Assessment’s Summary for Policy Makers (p. 10), “Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations.” In plain language, this means that the IPCC believes there is at least a 90% likelihood that anthropogenic greenhouse gas emissions (AGHG) are responsible for 50-100% of the global warming since 1950. In other words, there is an up to 10% chance that anthropogenic GHGs are not responsible for most of that warming.

This means there is an up to 10% chance that resources expended in limiting climate change would have been squandered. Since any effort to significantly reduce climate change will cost trillions of dollars (see Nordhaus 2008, p. 82), that would be an unqualified disaster, particularly since those very resources could be devoted to reducing urgent problems humanity faces here and now (e.g., hunger, malaria, and the lack of safe water and sanitation) — problems we know exist for sure, unlike the bogeymen we can’t be certain about.
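To make the arithmetic behind this opportunity-cost argument concrete, here is a minimal sketch using purely illustrative numbers: the 10% figure is the upper bound implied by reading the IPCC’s “very likely” as at least 90% confidence, and the $2 trillion mitigation cost is an assumed order of magnitude standing in for the “trillions of dollars” cited above, not a figure from Nordhaus (2008).

```python
# Illustrative expected-value arithmetic (hypothetical numbers).
# If "very likely" means >= 90% confidence that anthropogenic GHGs caused
# most post-1950 warming, there is an up-to-10% chance they did not.
mitigation_cost_trillions = 2.0   # assumed cost of significant mitigation, $ trillions
p_cause_is_not_anthropogenic = 0.10  # upper bound implied by "very likely" (>= 90%)

# Expected resources spent addressing a cause that may not dominate.
expected_squandered = p_cause_is_not_anthropogenic * mitigation_cost_trillions
print(f"Expected squandered spending: ${expected_squandered:.1f} trillion")
```

Under these assumed inputs, the expected squandered spending is $0.2 trillion, which is the quantity the argument weighs against spending on problems known to exist today.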

Spending money on speculative, even if plausible, catastrophes instead of problems we know exist for sure is like a starving man giving up a fat, juicy bird in hand while hoping to catch several other birds sometime in the next few centuries, even though those birds don’t exist today and may never exist.

French Folly

Following the dubious example set recently by U.S. legislators, French politicians have informally proposed slapping punitive tariffs on goods from countries that refuse to curb greenhouse gas emissions. The German State Secretary for the Environment has, quite rightly, cried foul:

“There are two problems – the WTO (World Trade Organization), and the signal would be that this is a new form of eco-imperialism,” Machnig said.

“We are closing our markets for their products, and I don’t think this is a very helpful signal for the international negotiations.”

I have a paper forthcoming on the carbon tariff issue, but in the meantime here’s a recent op-ed (written jointly with Pat Michaels) on climate change policy missteps.

Cap ‘n Trade: The Ultimate Pork-Fest

Some naive people might have been convinced that the U.S. House voted to wreck the American economy by endorsing cap and trade because it was the only way to save the world.  But even many environmentalists had given up on the bill approved last Friday.  It is truly a monstrosity:  it would cost consumers plenty, while doing little to reduce global temperatures.

But the legislation had something far more important for legislators and special interests alike.  It was a pork-fest that wouldn’t quit.

Reports the New York Times:

As the most ambitious energy and climate-change legislation ever introduced in Congress made its way to a floor vote last Friday, it grew fat with compromises, carve-outs, concessions and out-and-out gifts intended to win the votes of wavering lawmakers and the support of powerful industries.

The deal making continued right up until the final minutes, with the bill’s co-author Representative Henry A. Waxman, Democrat of California, doling out billions of dollars in promises on the House floor to secure the final votes needed for passage.

The bill was freighted with hundreds of pages of special-interest favors, even as environmentalists lamented that its greenhouse-gas reduction targets had been whittled down.

Some of the prizes were relatively small, like the $50 million hurricane research center for a freshman lawmaker from Florida.

Others were huge and threatened to undermine the environmental goals of the bill, like a series of compromises reached with rural and farm-state members that would funnel billions of dollars in payments to agriculture and forestry interests.

Automakers, steel companies, natural gas drillers, refiners, universities and real estate agents all got in on the fast-moving action.

The biggest concessions went to utilities, which wanted assurances that they could continue to operate and build coal-burning power plants without shouldering new costs. The utilities received not only tens of billions of dollars worth of free pollution permits, but also billions for work on technology to capture carbon-dioxide emissions from coal combustion to help meet future pollution targets.

That deal, negotiated by Representative Rick Boucher, a conservative Democrat from Virginia’s coal country, won the support of the Edison Electric Institute, the utility industry lobby, and lawmakers from regions dependent on coal for electricity.

Liberal Democrats got a piece, too. Representative Bobby Rush, Democrat of Illinois, withheld his support for the bill until a last-minute accord was struck to provide nearly $1 billion for energy-related jobs and job training for low-income workers and new subsidies for making public housing more energy-efficient.

Representative Joe Barton, a Texas Republican staunchly opposed to the bill, marveled at the deal-cutting on Friday.

“It is unprecedented,” Mr. Barton said, “but at least it’s transparent.”

This shouldn’t surprise anyone who follows Washington.  Still, the degree of special interest dealing was extraordinary.  Anyone want to imagine what a health care “reform” bill is likely to look like when legislators finish with it?