
The Current Wisdom: Overplaying the Human Contribution to Recent Weather Extremes

The Current Wisdom is a series of monthly posts in which Senior Fellow Patrick J. Michaels reviews interesting items on global warming from the scientific literature that may not have received the media attention they deserved, or that have been misinterpreted in the popular press.

The Current Wisdom only comments on science appearing in the refereed, peer-reviewed literature, or that has been peer-screened prior to presentation at a scientific congress.

**********

The recent publication in Nature magazine of two articles linking rainfall extremes (and flooding) to global warming, added to the heat in Russia and the floods in Pakistan in the summer of 2010 and the back-to-back cold and snowy winters in the eastern U.S. and western Europe, has gotten a lot of public attention. This includes a recent hearing in the House of Representatives, despite its Republican majority. Tying weather extremes to global warming, or using them as “proof” that warming doesn’t exist (see: snowstorms), is a popular rhetorical flourish by politicos of all stripes.

The hearing struck many as quite odd, inasmuch as it is much clearer than apocalyptic global warming that the House is going to pass meaningless legislation commanding the EPA to cease and desist from regulating greenhouse gas emissions. “Meaningless” means that it surely will not become law. Even on the long-shot chance that it passes the Senate, the President will surely veto it, and there are nowhere near enough votes to override such an action.

Perhaps “wolf!” has been cried yet again. A string of soon-to-be-published papers in the scientific literature finds that, despite all the hue and cry about global warming and recent extreme weather events, natural climate variability is to blame.

Where to start?  How about last summer’s Russian heat wave?

The Russian heat wave (and, to some degree, the floods in Pakistan) have been linked to the same large-scale, stationary weather system, called an atmospheric “blocking” pattern. When the atmosphere is “blocked,” it stays in the same configuration for a period of several weeks (or more) and keeps delivering the same weather to the same area for what can seem like an eternity to the people in its way. Capitalizing on the misery in Russia and Pakistan, atmospheric blocking was added to the list of things supposed to be “consistent with” anthropogenically stimulated global warming, a list which already, of course, included heat waves and floods. And thus the Great Russian Heat Wave of 2010 became part of global warming lore.

But then a funny thing happened: scientists with a working knowledge of atmospheric dynamics started to review the situation and found scant evidence of a global warming connection.

The first chink in the armor came back in the fall of 2010, when scientists from the Physical Sciences Division (PSD) of the Earth System Research Laboratory (ESRL) of the National Oceanic and Atmospheric Administration (NOAA) presented the results of their preliminary investigation on the web, and concluded that “[d]espite this strong evidence for a warming planet, greenhouse gas forcing fails to explain the 2010 heat wave over western Russia. The natural process of atmospheric blocking, and the climate impacts induced by such blocking, are the principal cause for this heat wave.”

The PSD folks have now followed this up with a new peer-reviewed article in the journal Geophysical Research Letters that rejects the global warming explanation. The paper is titled “Was There a Basis for Anticipating the 2010 Russian Heat Wave?” Turns out that there wasn’t.

To prove this, the research team, led by PSD’s Randall Dole, first reviewed the observed temperature history of the region affected by the heat wave (western Russia, Belarus, the Ukraine, and the Baltic nations). To start, they looked at the recent antecedent conditions: “Despite record warm globally-averaged surface temperatures over the first six months of 2010, Moscow experienced an unusually cold winter and a relatively mild but variable spring, providing no hint of the record heat yet to come.” Nothing there.

Then they looked at the long-term temperature record: “The July surface temperatures for the region impacted by the 2010 Russian heat wave shows no significant warming trend over the prior 130-year period from 1880 to 2009…. A linear trend calculation yields a total temperature change over the 130 years of -0.1°C (with a range of 0 to -0.4°C over the four data sets [they examined]).” There’s not a hint of a build-up to a big heat wave.
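For the curious, here is a minimal sketch of the kind of linear trend calculation described above: fit a least-squares line to a July temperature series and report the total change over the record. The series below is synthetic stand-in data, not the actual western Russia observations from the four data sets Dole et al. examined; only the method is illustrated.

```python
import numpy as np

# Illustrative least-squares trend over a 130-year July temperature record
# (1880-2009). The data are synthetic -- trendless noise around 18 C --
# standing in for the observational data sets Dole et al. examined.
years = np.arange(1880, 2010)                        # 130 Julys
rng = np.random.default_rng(0)
july_temps = 18.0 + rng.normal(0.0, 1.2, years.size)

slope, intercept = np.polyfit(years, july_temps, 1)  # degrees C per year
total_change = slope * (years[-1] - years[0])        # change over the record
print(f"Linear-trend change, 1880-2009: {total_change:+.2f} C")
```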

And as to the behavior of temperature extremes: “There is also no clear indication of a trend toward increasing warm extremes. The prior 10 warmest Julys are distributed across the entire period and exhibit only modest clustering earlier in this decade, in the 1980s and in the 1930s…. This behavior differs substantially from globally averaged annual temperatures, for which eleven of the last twelve years ending in 2006 rank among the twelve warmest years in the instrumental record since 1850….”

With regard to any indication that “global” warming was pushing temperatures higher in Russia, and thus helped to fuel the extreme heat last summer, Dole et al. say this: “With no significant long-term trend in western Russia July surface temperatures detected over the period 1880-2009, mean regional temperature changes are thus very unlikely to have contributed substantially to the magnitude of the 2010 Russian heat wave.”

Next the PSD folks looked to see whether the larger-scale antecedent conditions, when fed into climate models, would produce the atmospheric circulation patterns (i.e., blocking) that gave rise to the heat wave. The tested “predictors” included patterns of sea surface temperature and Arctic ice coverage, which most people feel have been subject to some human influence. No relationship: “These findings suggest that the blocking and heat wave were not primarily a forced response to specific boundary conditions during 2010.”

In fact, the climate models exhibited no predilection for projecting increases in the frequency of atmospheric blocking patterns over the region as greenhouse gas concentrations increased. Just the opposite: “Results using very high-resolution climate models suggest that the number of Euro-Atlantic blocking events will decrease by the latter half of the 21st century.”

At this point, Dole and colleagues had about exhausted all lines of inquiry and summed things up:

 Our analysis points to a primarily natural cause for the Russian heat wave. This event appears to be mainly due to internal atmospheric dynamical processes that produced and maintained an intense and long-lived blocking event. Results from prior studies suggest that it is likely that the intensity of the heat wave was further increased by regional land surface feedbacks. The absence of long-term trends in regional mean temperatures and variability together with the model results indicate that it is very unlikely that warming attributable to increasing greenhouse gas concentrations contributed substantially to the magnitude of this heat wave.

Can’t be much clearer than that.

But that was last summer. What about the past two winters? Both were very cold in the eastern U.S., with record snow events and/or totals scattered about the country.

Cold, snow, and global warming? On Christmas Day 2010, the New York Times ran an op-ed by Judah Cohen, a long-range forecaster for the private forecasting firm Atmospheric and Environmental Research, outlining his theory as to how late-summer Arctic ice declines lead to more fall snow cover across Siberia, which in turn induces atmospheric circulation patterns that favor snowstorms along the East Coast of the U.S. Just last week, the Union of Concerned Scientists held a news conference where they handed out a press release headlined “Climate Change Makes Major Snowstorms Likely.” In that release, Mark Serreze, director of the National Snow and Ice Data Center, laid out his theory as to how the loss of Arctic sea ice is helping to provide more moisture to fuel winter snowstorms across the U.S., as well as altering atmospheric circulation patterns into a preferred state for big snowstorms. Weather Underground’s Jeff Masters chimed in with “Heavy snowstorms are not inconsistent with a warming planet.”

As is the wont for this Wisdom, let’s go back to the scientific literature.

Another soon-to-be-released paper, to appear in Geophysical Research Letters, describes the results of using the seasonal weather prediction model from the European Centre for Medium-Range Weather Forecasts (ECMWF) to help untangle the causes of the unusual atmospheric circulation patterns that gave rise to the harsh winter of 2009-2010 on both sides of the Atlantic. A team of ECMWF scientists led by Thomas Jung went back and ran experiments changing the initial conditions fed into the ECMWF model, and then assessed how well the model simulated the known weather patterns of the winter of 2009-2010. The different sets of initial conditions were selected so as to test all the pet theories behind the origins of the harsh winter. Jung et al. describe their investigations this way: “Here, the origin and predictability of the unusual winter of 2009/10 are explored through numerical experimentation with the ECMWF Monthly forecasting system. More specifically, the role of anomalies in sea surface temperature (SST) and sea ice, the tropical atmospheric circulation, the stratospheric polar vortex, solar insolation and near surface temperature (proxy for snow cover) are examined.”

Here is what they found after running their series of experiments.

Arctic sea ice and sea surface temperature anomalies.  These are often associated with global warming caused by people. Finding:  “These results suggest that neither SST nor sea ice anomalies explain the negative phase of the NAO during the 2009/10 winter.”

(NAO are the commonly used initials for the North Atlantic Oscillation – an atmospheric circulation pattern that can influence winter weather in the eastern U.S. and western Europe. A negative phase of the NAO is associated with cold and stormy weather, and during the winter of 2009-10, the NAO value was the lowest ever observed.)

A global warming-induced weakening stratospheric (upper-atmosphere) jetstream. “Like for the other experiments, these stratospheric relaxation experiments fail to reproduce the magnitude of the observed NAO anomaly.”

Siberian snow cover.  “The resulting [upper air patterns] show little resemblance with the observations…. The implied weak role of snow cover anomalies is consistent with other research….”

Solar variability.  “The experiments carried out in this study suggest that the impact of anomalously low incoming [ultraviolet] radiation on the tropospheric circulation in the North Atlantic region are very small… suggesting that the unusually low solar activity contributed little, if any, to the observed NAO anomaly during the 2009/10 winter.”

OK then, what did cause the unusual weather patterns during the 2009-10 winter?

The results of this study, therefore, increase the likelihood that both the development and persistence of negative NAO phase resulted from internal atmospheric dynamical processes.

Translation: Random variability.

To drive this finding home, here’s another soon-to-be-released paper (D’Arrigo et al., 2011) that uses tree-ring-based reconstructions of atmospheric circulation patterns and finds that a similar set of conditions (including a negative NAO value second only to that of the 2009-10 winter) was responsible for the historically harsh winter of 1783-84 in the eastern U.S. and western Europe, a winter widely noted by historians. It followed the stupendous eruption of the Icelandic volcano Laki the previous summer, and the frigid and snowy conditions have long been blamed on the volcano. In fact, Benjamin Franklin commented as much.

But in their new study, Rosanne D’Arrigo and colleagues conclude that the harshness of that winter was primarily the result of anomalous atmospheric circulation patterns that closely resembled those observed during the winter of 2009-10, and that the previous summer’s volcanic eruption played a far less prominent role:

Our results suggest that Franklin and others may have been mistaken in attributing winter conditions in 1783-4 mainly to Laki or another eruption, rather than unforced variability.

Similarly, conditions during the 2009-10 winter likely resulted from natural [atmospheric] variability, not tied to greenhouse gas forcing… Evidence thus suggests that these winters were linked to the rare but natural occurrence of negative NAO and El Niño events.

The point is that natural variability can and does produce extreme events on every time scale, from days (e.g., individual storms) and weeks (e.g., the Russian heat wave), through months (e.g., the winter of 2009-10), decades (e.g., the lack of global warming since 1998), and centuries (e.g., the Little Ice Age), to millennia (e.g., the cycle of major Ice Ages) and eons (e.g., snowball earth).

Folks would do well to keep this in mind next time global warming is being posited for the weather disaster du jour. Almost assuredly, it is all hype and little might.

Too bad these results weren’t given a “hearing” in the House!

References:

D’Arrigo, R., et al., 2011. The anomalous winter of 1783-1784: Was the Laki eruption or an analog of the 2009–2010 winter to blame? Geophysical Research Letters, in press.

Dole, R., et al., 2011. Was there a basis for anticipating the 2010 Russian heat wave? Geophysical Research Letters, in press.

Jung, T., et al., 2011. Origin and predictability of the extreme negative NAO winter of 2009/10. Geophysical Research Letters, in press.

Min, S-K., et al., 2011. Human contribution to more-intense precipitation extremes. Nature, 470, 378-381.

Pall, P., et al., 2011. Anthropogenic greenhouse gas contribution to flood risk in England and Wales in autumn 2000. Nature, 470, 382-386.

Copenhagen Agreement Is Just More Hot Air

Late Friday afternoon, the White House announced a “meaningful agreement” at the Copenhagen climate summit. Details are currently unavailable, but a White House official said that developed and developing countries have agreed to list their national actions and commitments to reduce greenhouse gas emissions, with a “target” of limiting any further global warming to two degrees (Celsius).

In other words, there are no specific emissions reduction targets and timetables. A country may choose no national reductions, or perhaps some national program, and that would be its “list.” And just what carbon dioxide level will keep warming under two degrees?

No one knows, at least until computer models stop forecasting warming that isn’t happening and/or drastically overstating the warming that is verifiable.

It sounds like the Copenhagen agreement is just more hot air. But not to worry, it will be hailed as a “breakthrough” by all the participants.

In reality, nothing much was accomplished and any significant agreement for emissions reductions has been punted to the next UN climate confab, beginning on November 8, 2010 in Mexico City, six days after our congressional election.

Next Move: Suing the Sun for Unseasonably Cool Weather

The New Orleans-based Fifth Circuit, the federal court of appeals where I once clerked, has allowed a class action lawsuit by Hurricane Katrina victims to proceed against a motley crew of energy, oil, and chemical companies.  Their claim: that the defendants’ greenhouse gas emissions raised air and water temperatures on the Gulf Coast, contributing to Katrina’s strength and causing property damage.  Mass tort litigation specialist Russell Jackson calls the plaintiffs’ claims “the litigator’s equivalent to the game ‘Six Degrees of Kevin Bacon.’”

In Comer v. Murphy Oil USA, the plaintiffs assert a variety of theories under Mississippi common law, but the main issue at this stage was whether the plaintiffs had standing, that is, whether they could demonstrate that their injuries were “fairly traceable” to the defendants’ actions. The court dismissed several claims but held that the plaintiffs indeed could allege public and private nuisance, trespass, and negligence. The court also held that these latter claims do not present a so-called “political question” that the court lacks the authority to resolve. You can read about the court’s ruling in more detail at the WSJ Law Blog and Jackson’s Consumer Class Actions and Mass Torts Blog.

This is actually the second federal appeals court to rule this way; last month, the Second Circuit (based in New York) held that states, municipalities and certain private organizations had standing to bring federal common law nuisance claims to impose caps on certain companies’ greenhouse gas emissions.  Here’s the opinion in that case, Connecticut v. American Electric Power Company, and you can read a pretty good summary and analysis here.

Both of these cases, which herald a flood of global warming-related litigation, so to speak, owe their continuing vitality to the Supreme Court’s misbegotten 2007 decision in Massachusetts v. EPA.  The 2006-2007 Cato Supreme Court Review covered that case in an insightful article by Andrew Morriss of the University of Illinois.  (To get your copy of the latest (2008-2009) Review, go here.)

I should note from my own experience at the Fifth Circuit that the panel here consisted of the two worst judges on the court – Clinton appointees Carl Stewart and James Dennis – and one of Reagan’s weakest federal appellate appointments, Eugene Davis.  Even Davis, however, wrote separately to note that while he agreed on the standing issue, he would have affirmed the district court’s dismissal of the suit on a different ground (that pesky proximate cause issue).

I predict that the full (16-judge) Fifth Circuit will review this case en banc – and, if not, that the Supreme Court will eventually take it up (if the district court on remand doesn’t again dispose of the case on causation grounds).

Are Industrialized Countries Responsible for Reducing the Well Being of Developing Countries?

A basic contention of developing countries (DCs) and various UN bureaucracies and multilateral groups during the course of international negotiations on climate change is that industrialized countries (ICs) bear a historical responsibility for global warming. This contention underlies much of the justification for insisting not only that industrialized countries reduce their greenhouse gas emissions even as developing countries are given a bye on emission reductions, but also that they subsidize clean energy development and adaptation in developing countries. [It is also part of the rationale for arguing that industrialized countries should pay reparations for presumed damages from climate change.]

Based on the above contention, the Kyoto Protocol imposes no direct costs on developing countries and holds out the prospect of large amounts of transfer payments from industrialized to developing countries via the Clean Development Mechanism or an Adaptation Fund. Not surprisingly, virtually every developing country has ratified the Protocol and is adamant that these features be retained in any son-of-Kyoto.

For their part, UN and other multilateral agencies favor this approach because, lacking any taxing authority or other ready mechanism for raising revenues, they see revenue in helping to manage, facilitate, or distribute the enormous amounts of money that, in theory, should be available from ICs to fund mitigation and adaptation in the DCs.

Continue reading here.

Cherry Picking Climate Catastrophes: Response to Conor Clarke, Part II

Conor Clarke at The Atlantic blog raised several issues with my study, “What to Do About Climate Change,” which Cato published last year.

One of Conor Clarke’s comments was that my analysis did not extend beyond the 21st century. He found this problematic because, as Conor put it, climate change would extend beyond 2100, and even if GDP is higher in 2100 with unfettered global warming than without, it’s not obvious that this GDP would continue to be higher “in the year 2200 or 2300 or 3758”. I addressed this portion of his argument in Part I of my response. Here I will address the second part of this argument, that “the possibility of ‘catastrophic’ climate change events — those with low probability but extremely high cost — becomes real after 2100.”

The examples of potentially catastrophic events that could be caused by anthropogenic greenhouse gas induced global warming (AGW) that have been offered to date (e.g., melting of the Greenland or West Antarctic Ice Sheets, or the shutdown of the thermohaline circulation) contain a few drops of plausibility submerged in oceans of speculation. There are no scientifically justified estimates of the probability of their occurrence by any given date. Nor are there scientifically justified estimates of the magnitude of damages such events might cause, not just in biophysical terms but also in socioeconomic terms. Therefore, to call these events “low probability” — as Mr. Clarke does — is a misnomer. They are more appropriately termed plausible but highly speculative events.

Consider, for example, the potential collapse of the Greenland Ice Sheet (GIS). According to the IPCC’s WG I Summary for Policy Makers (p. 17), “If a negative surface mass balance were sustained for millennia, that would lead to virtually complete elimination of the Greenland Ice Sheet and a resulting contribution to sea level rise of about 7 m” (emphasis added). Presumably the same applies to the West Antarctic Ice Sheet.

But what is the probability that a negative surface mass balance can, in fact, be sustained for millennia, particularly after considering the amount of fossil fuels that can be economically extracted and the likelihood that other energy sources will not displace fossil fuels in the interim? [Remember we are told that peak oil is nigh, that renewables are almost competitive with fossil fuels, and that wind, solar and biofuels will soon pay for themselves.]

Second, for an event to be classified as a catastrophe, it should occur relatively quickly, precluding efforts by man or nature to adapt or otherwise deal with it. But if it occurs over millennia, as the IPCC says, or even over centuries, humanity has ample time to adjust, albeit at a socioeconomic cost. Nor need it be prohibitively dangerous to life, limb, or property if: (1) the total amount of sea level rise (SLR) and, perhaps more importantly, the rate of SLR can be predicted with some confidence, as seems likely in the next few decades considering the resources being expended on such research; (2) the rate of SLR is slow relative to how fast populations can strengthen coastal defenses and/or relocate; and (3) there are no insurmountable barriers to migration.

This would be true even if the so-called “tipping point” had already been passed and the ultimate disintegration of the ice sheet were inevitable, so long as it takes millennia for the disintegration to be realized. In other words, the issue isn’t just whether the tipping point is reached, but how long it actually takes to tip over. Suppose, for example, that a hand grenade is tossed into a crowded room. Whether this results in tragedy — and the magnitude of that tragedy — depends upon how much time it takes for the grenade to go off, the reaction time of the occupants, and their ability to respond.

Lowe et al. (2006, pp. 32-33), based on a “pessimistic, but plausible, scenario in which atmospheric carbon dioxide concentrations were stabilised at four times pre-industrial levels,” estimated that a collapse of the Greenland Ice Sheet would, over the next 1,000 years, raise sea level by 2.3 meters (with a peak rate of 0.5 cm/yr). If one were to arbitrarily double that to account for potential melting of the West Antarctic Ice Sheet, that implies a SLR of ~5 meters in 1,000 years, with a peak rate (assuming the peaks coincide) of 1 meter per century.

Such a rise would not be unprecedented. Sea level has risen 120 meters in the past 18,000 years — an average of 0.67 meters/century — and by as much as 4 meters/century during the meltwater pulse 1A episode 14,600 years ago (Weaver et al. 2003; subscription required). Neither humanity nor, from the perspective of millennial time scales (per the above quote from the IPCC), the rest of nature seems the worse for it. Coral reefs, for example, evolved and their compositions changed over millennia as new reefs grew while older ones were submerged in deeper water (e.g., Cabioch et al. 2008). So while there have been ecological changes, it is unknown whether the changes were for better or worse. For a melting of the GIS (or WAIS) to qualify as a catastrophe, one has to show, rather than assume, that the ecological consequences would, in fact, be for the worse.
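As a quick back-of-the-envelope check of the rates cited in the last two paragraphs, here is a minimal sketch. The 2.3-meter and 0.5 cm/yr figures come from Lowe et al. (2006), the 120 meters over 18,000 years from the post-glacial record; the doubling for the West Antarctic Ice Sheet is the arbitrary assumption noted above, not a published result.

```python
# Sanity-check the sea level rise (SLR) rates quoted in the text.
gis_rise_m = 2.3          # Lowe et al.: meters of SLR over 1,000 years
gis_peak_cm_per_yr = 0.5  # Lowe et al.: peak rate of rise

# Arbitrary doubling to account for the West Antarctic Ice Sheet:
total_rise_m = 2 * gis_rise_m                    # ~4.6 m, i.e. "~5 meters"
peak_cm_per_century = 2 * gis_peak_cm_per_yr * 100
peak_m_per_century = peak_cm_per_century / 100   # 1 cm/yr = 1 m/century

# Post-glacial context: 120 m of rise over the past 18,000 years.
avg_m_per_century = 120.0 / 18000 * 100          # ~0.67 m/century

print(f"Doubled scenario: ~{total_rise_m:.1f} m in 1,000 years, "
      f"peaking at ~{peak_m_per_century:.0f} m/century")
print(f"Post-glacial average: {avg_m_per_century:.2f} m/century")
```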

Human beings can certainly cope with sea level rise of such magnitudes if they have centuries or millennia to do so. In fact, if necessary they could probably get out of the way in a matter of decades, if not years.

Can a relocation of such a magnitude be accomplished?

Consider that the global population increased from 2.5 billion in 1950 to 6.8 billion this year. Among other things, this meant creating the infrastructure for an extra 4.3 billion people in the intervening 59 years (as well as improving the infrastructure for the 2.5 billion counted in the baseline, many of whom barely had any infrastructure whatsoever in 1950). These improvements occurred at a time when everyone was significantly poorer. (Global per capita income is more than 3.5 times greater today than it was in 1950.) Therefore, while relocation will be costly, in theory tomorrow’s much wealthier world ought to be able to relocate billions of people to higher ground over the next few centuries, if need be. In fact, once a decision is made to relocate, the cost differential of relocating, say, 10 meters higher rather than a meter higher is probably marginal. It should also be noted that over millennia the world’s infrastructure will have to be renewed or replaced dozens of times – and the world will be better for it. [For example, the ancient city of Troy, once on the coast but now a few kilometers inland, was built and rebuilt at least 9 times in 3 millennia.]

So long as we are concerned about potential geological catastrophes whose probability of occurrence and impacts have yet to be scientifically estimated, we should also consider equally low- or higher-probability events that might negate their impacts. Specifically, it is quite possible — in fact probable — that somewhere between now and 2100 or 2200, technologies will become available that will deal with climate change much more economically than currently available technologies for reducing GHG emissions. Such technologies may include ocean fertilization, carbon sequestration, geo-engineering options (e.g., deploying mirrors in space), or more efficient solar or photovoltaic technologies. Similarly, there is a finite, non-zero probability that new and improved adaptation technologies will become available that will substantially reduce the net adverse impacts of climate change.

The historical record shows that this has occurred over the past century for virtually every climate-sensitive sector that has been studied. For example, from 1900 to 1970, U.S. death rates due to various climate-sensitive water-related diseases — dysentery, typhoid, paratyphoid, other gastrointestinal disease, and malaria — declined by 99.6 to 100.0 percent. Similarly, poor agricultural productivity exacerbated by drought contributed to famines in India and China off and on through the 19th and 20th centuries, killing millions of people, but such famines haven’t recurred since the 1970s, despite whatever climate change has occurred and the fact that populations are several-fold higher today. And by the early 2000s, deaths and death rates due to extreme weather events had dropped worldwide by over 95% from their earlier 20th-century peaks (Goklany 2006).

With respect to another global warming bogeyman — the shutdown of the thermohaline circulation (AKA the meridional overturning circulation), the basis for the deep freeze depicted in the movie, The Day After Tomorrow — the IPCC WG I SPM notes (p. 16), “Based on current model simulations, it is very likely that the meridional overturning circulation (MOC) of the Atlantic Ocean will slow down during the 21st century. The multi-model average reduction by 2100 is 25% (range from zero to about 50%) for SRES emission scenario A1B. Temperatures in the Atlantic region are projected to increase despite such changes due to the much larger warming associated with projected increases in greenhouse gases. It is very unlikely that the MOC will undergo a large abrupt transition during the 21st century. Longer-term changes in the MOC cannot be assessed with confidence.”

Not much has changed since then. A shutdown of the MOC doesn’t look any more likely now than it did then. See here, here, and here (pp. 316-317).

If one wants to develop rational policies to address speculative catastrophic events that could conceivably occur over the next few centuries or millennia, as a start one should consider the universe of potential catastrophes and then develop criteria as to which should be addressed and which not. Rational analysis must necessarily be based on systematic analysis, and not on cherry picking one’s favorite catastrophes.

Just as one may speculate on global warming induced catastrophes, one may just as plausibly speculate on catastrophes that may result absent global warming. Consider, for example, the possibility that absent global warming, the Little Ice Age might return. The consequences of another ice age, Little or not, could range from the severely negative to the positive (if it buffered the negative consequences of warming). That such a recurrence is not unlikely is evident from the fact that the earth entered, and only a century and a half ago retreated from, the Little Ice Age; history may indeed repeat itself over centuries or millennia.

Yet another catastrophe that greenhouse gas controls may cause stems from the fact that CO2 not only contributes to warming, it is also the key building block of life as we know it. All vegetation is created by the photosynthesis of atmospheric CO2. In fact, according to the IPCC WG I report (2007, p. 106), the net primary productivity of the global biosphere has increased in recent decades, partly due to greater warming, higher CO2 concentrations, and nitrogen deposition. Thus, there is a finite probability that reducing CO2 emissions would reduce the net primary productivity of the terrestrial biosphere, with potentially severe negative consequences for the amount and diversity of wildlife it could support, as well as for agricultural and forest productivity, with adverse knock-on effects on hunger and health.

There is also a finite probability that the costs of GHG reductions could reduce economic growth worldwide. Even if only industrialized countries sign up for emission reductions, the negative consequences could show up in developing countries, because they derive a substantial share of their income from aid, trade, tourism, and remittances from the rest of the world. See, for example, Tol (2005), which examines this possibility, although the extent to which that study fully considered these factors (i.e., aid, trade, tourism, and remittances) is unclear.

Finally, one of the problems with the argument that society should address low-probability, high-impact events (assuming a probability could be estimated rather than assumed or guessed) is that it necessarily means there is a high probability that resources expended on addressing such catastrophic events will have been squandered. That wouldn’t be a problem but for the opportunity costs involved.

According to the 2007 IPCC Science Assessment’s Summary for Policy Makers (p. 10), “Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations.” In plain language, this means that the IPCC believes there is at least a 90% likelihood that anthropogenic greenhouse gas emissions (AGHG) are responsible for 50-100% of the global warming since 1950. In other words, there is an up to 10% chance that anthropogenic GHGs are not responsible for most of that warming.

This means there is an up to 10% chance that resources expended in limiting climate change would have been squandered. Since any effort to significantly reduce climate change will cost trillions of dollars (see Nordhaus 2008, p. 82), that would be an unqualified disaster, particularly since those very resources could be devoted to reducing urgent problems humanity faces here and now (e.g., hunger, malaria, safer water and sanitation) — problems we know exist for sure unlike the bogeymen that we can’t be certain about.

Spending money on speculative, even if plausible, catastrophes instead of problems we know exist for sure is like a starving man giving up a fat, juicy bird in hand in the hope of catching several other birds sometime over the next few centuries, even though we know those birds don’t exist today and may never exist in the future.

French Folly

Following the dubious example set recently by U.S. legislators, French politicians have informally proposed slapping punitive tariffs on goods from countries that refuse to curb greenhouse gas emissions. The German State Secretary for the Environment has, quite rightly, called foul:

“There are two problems – the WTO (World Trade Organization), and the signal would be that this is a new form of eco-imperialism,” Machnig said.

“We are closing our markets for their products, and I don’t think this is a very helpful signal for the international negotiations.”

I have a paper forthcoming on the carbon tariff issue, but in the meantime here’s a recent op-ed (written jointly with Pat Michaels) on climate change policy mis-steps.

Global Taxes and More Foreign Aid

The U.K.-based Guardian reports that the United Nations and other international bureaucracies dealing with so-called climate change are scheming to impose global taxes. That’s not too surprising, but it is discouraging to read that the Obama Administration appears to be acquiescing to these attacks on U.S. fiscal sovereignty. The Administration also has indicated it wants to squander an additional $400 billion on foreign aid, adding injury to injury:

…rich countries will be asked to accept a compulsory levy on international flight tickets and shipping fuel to raise billions of dollars to help the world’s poorest countries adapt to combat climate change. The suggestions come at the start of the second week in the latest round of UN climate talks in Bonn, where 192 countries are starting to negotiate a global agreement to limit and then reduce greenhouse gas emissions. The issue of funding for adaptation is critical to success but the hardest to agree. …It has been proposed by the world’s 50 least developed countries. It could be matched by a compulsory surcharge on all international shipping fuel, said Connie Hedegaard, the Danish environment and energy minister who will host the final UN climate summit in December. …In Bonn last week, a separate Mexican proposal to raise billions of dollars was gaining ground. The idea, known as the “green fund” plan, would oblige all countries to pay amounts according to a formula reflecting the size of their economy, their greenhouse gas emissions and the country’s population. That could ensure that rich countries, which have the longest history of using fossil fuels, pay the most to the fund. Recently, the proposal won praise from 17 major-economy countries meeting in Paris as a possible mechanism to help finance a UN pact. The US special envoy for climate change, Todd Stern, called it “highly constructive”. …Last week, a US negotiator, Jonathan Pershing, said that the US had budgeted $400m to help poor countries adapt to climate change as an interim measure. But that amount was dismissed as inadequate by Bernarditas Muller of the Philippines, who is the co-ordinator of the G77 and China group of countries.