Tag: climate change

The Current Wisdom: Overplaying the Human Contribution to Recent Weather Extremes

The Current Wisdom is a series of monthly posts in which Senior Fellow Patrick J. Michaels reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

The Current Wisdom only comments on science appearing in the refereed, peer-reviewed literature, or that has been peer-screened prior to presentation at a scientific congress.

**********

The recent publication of two articles in Nature magazine linking rainfall extremes (and flooding) to global warming, added to the heat in Russia and the floods in Pakistan in the summer of 2010 and the back-to-back cold and snowy winters in the eastern U.S. and western Europe, has gotten a lot of public attention. This includes a recent hearing in the House of Representatives, despite its Republican majority. Tying weather extremes to global warming, or using them as “proof” that warming doesn’t exist (see: snowstorms), is a popular rhetorical flourish by politicos of all stripes.

The hearing struck many as quite odd, inasmuch as it is much clearer than apocalyptic global warming that the House is going to pass meaningless legislation commanding the EPA to cease and desist from regulating greenhouse gas emissions.  “Meaningless” means that it surely will not become law.  Even on the long-shot probability that it passes the Senate, the President will surely veto, and there are nowhere near enough votes to override such an action.

Perhaps “wolf!” has been cried yet again. A string of soon-to-be-published papers in the scientific literature finds that despite all the hue and cry about global warming and recent extreme weather events, natural climate variability is to blame.

Where to start?  How about last summer’s Russian heat wave?

The Russian heat wave (and to some degree the floods in Pakistan) have been linked to the same large-scale, stationary weather system, called an atmospheric “blocking” pattern. When the atmosphere is “blocked,” it stays in the same configuration for a period of several weeks (or more) and keeps delivering the same weather to the same area for what can seem like an eternity to people in the way. Capitalizing on the misery in Russia and Pakistan, atmospheric blocking was added to the list of things that were supposed to be “consistent with” anthropogenically stimulated global warming, a list which already, of course, included heat waves and floods. And thus the Great Russian Heat Wave of 2010 became part of global warming lore.

But then a funny thing happened – scientists with a working knowledge of atmospheric dynamics started to review the situation and found scant evidence of a global warming connection.

The first chink in the armor came back in the fall of 2010, when scientists from the Physical Sciences Division (PSD) of the Earth System Research Laboratory (ESRL) of the National Oceanic and Atmospheric Administration (NOAA) presented the results of their preliminary investigation on the web, and concluded that “[d]espite this strong evidence for a warming planet, greenhouse gas forcing fails to explain the 2010 heat wave over western Russia. The natural process of atmospheric blocking, and the climate impacts induced by such blocking, are the principal cause for this heat wave.”

The PSD folks have now followed this up with a new peer-reviewed article in the journal Geophysical Research Letters that rejects the global warming explanation. The paper is titled “Was There a Basis for Anticipating the 2010 Russian Heat Wave?” Turns out that there wasn’t.

To prove this, the research team, led by PSD’s Randall Dole, first reviewed the observed temperature history of the region affected by the heat wave (western Russia, Belarus, the Ukraine, and the Baltic nations). To start, they looked at the recent antecedent conditions: “Despite record warm globally-averaged surface temperatures over the first six months of 2010, Moscow experienced an unusually cold winter and a relatively mild but variable spring, providing no hint of the record heat yet to come.” Nothing there.

Then they looked at the long-term temperature record: “The July surface temperatures for the region impacted by the 2010 Russian heat wave shows no significant warming trend over the prior 130-year period from 1880 to 2009…. A linear trend calculation yields a total temperature change over the 130 years of -0.1°C (with a range of 0 to -0.4°C over the four data sets [they examined]).” There’s not a hint of a build-up to a big heat wave.

And as to the behavior of temperature extremes: “There is also no clear indication of a trend toward increasing warm extremes. The prior 10 warmest Julys are distributed across the entire period and exhibit only modest clustering earlier in this decade, in the 1980s and in the 1930s…. This behavior differs substantially from globally averaged annual temperatures, for which eleven of the last twelve years ending in 2006 rank among the twelve warmest years in the instrumental record since 1850….”

With regard to any indication that “global” warming was pushing temperatures higher in Russia and thus helped to fuel the extreme heat last summer, Dole et al. say this: “With no significant long-term trend in western Russia July surface temperatures detected over the period 1880-2009, mean regional temperature changes are thus very unlikely to have contributed substantially to the magnitude of the 2010 Russian heat wave.”

Next the PSD folks looked to see whether the larger-scale antecedent conditions, when fed into climate models, would produce the atmospheric circulation patterns (i.e., blocking) that gave rise to the heat wave. The tested “predictors” included patterns of sea surface temperature and arctic ice coverage, which most people feel have been subject to some human influence. No relationship: “These findings suggest that the blocking and heat wave were not primarily a forced response to specific boundary conditions during 2010.”

In fact, the climate models exhibited no predilection for projecting increases in the frequency of atmospheric blocking patterns over the region as greenhouse gas concentrations increased. Just the opposite: “Results using very high-resolution climate models suggest that the number of Euro-Atlantic blocking events will decrease by the latter half of the 21st century.”

At this point, Dole and colleagues had about exhausted all lines of inquiry and summed things up:

 Our analysis points to a primarily natural cause for the Russian heat wave. This event appears to be mainly due to internal atmospheric dynamical processes that produced and maintained an intense and long-lived blocking event. Results from prior studies suggest that it is likely that the intensity of the heat wave was further increased by regional land surface feedbacks. The absence of long-term trends in regional mean temperatures and variability together with the model results indicate that it is very unlikely that warming attributable to increasing greenhouse gas concentrations contributed substantially to the magnitude of this heat wave.

Can’t be much clearer than that.

But that was last summer. What about the past two winters? Both were very cold in the eastern U.S., with record snow events and/or totals scattered about the country.

Cold, snow, and global warming? On Christmas Day 2010, the New York Times ran an op-ed by Judah Cohen, a long-range forecaster for the private forecasting firm Atmospheric and Environmental Research, outlining his theory as to how late summer Arctic ice declines lead to more fall snow cover across Siberia which in turn induces atmospheric circulation patterns to favor snowstorms along the East Coast of the U.S. Just last week, the Union of Concerned Scientists held a news conference where they handed out a press release  headlined “Climate Change Makes Major Snowstorms Likely.” In that release, Mark Serreze, director of the National Snow and Ice Data Center, laid out his theory as to how the loss of Arctic sea ice is helping to provide more moisture to fuel winter snowstorms across the U.S. as well as altering atmospheric circulation patterns into a preferred state for big snowstorms. Weather Underground’s Jeff Masters chimed in with “Heavy snowstorms are not inconsistent with a warming planet.”

As is the wont for this Wisdom, let’s go back to the scientific literature.

Another soon-to-be-released paper to appear in Geophysical Research Letters describes the results of using the seasonal weather prediction model from the European Centre for Medium-Range Weather Forecasts (ECMWF) to help untangle the causes of the unusual atmospheric circulation patterns that gave rise to the harsh winter of 2009-2010 on both sides of the Atlantic. A team of ECMWF scientists led by Thomas Jung went back and did experiments changing the initial conditions fed into the ECMWF model and then assessed how well the model simulated the known weather patterns of the winter of 2009-2010. The different sets of initial conditions were selected so as to test all the pet theories behind the origins of the harsh winter. Jung et al. describe their investigations this way: “Here, the origin and predictability of the unusual winter of 2009/10 are explored through numerical experimentation with the ECMWF Monthly forecasting system. More specifically, the role of anomalies in sea surface temperature (SST) and sea ice, the tropical atmospheric circulation, the stratospheric polar vortex, solar insolation and near surface temperature (proxy for snow cover) are examined.”

Here is what they found after running their series of experiments.

Arctic sea ice and sea surface temperature anomalies.  These are often associated with global warming caused by people. Finding:  “These results suggest that neither SST nor sea ice anomalies explain the negative phase of the NAO during the 2009/10 winter.”

(NAO is the commonly used abbreviation for the North Atlantic Oscillation, an atmospheric circulation pattern that can act to influence winter weather in the eastern U.S. and western Europe. A negative phase of the NAO is associated with cold and stormy weather, and during the winter of 2009-10, the NAO value was the lowest ever observed.)

A global warming-induced weakening stratospheric (upper-atmosphere) jetstream. “Like for the other experiments, these stratospheric relaxation experiments fail to reproduce the magnitude of the observed NAO anomaly.”

Siberian snow cover.  “The resulting [upper air patterns] show little resemblance with the observations…. The implied weak role of snow cover anomalies is consistent with other research….”

Solar variability.  “The experiments carried out in this study suggest that the impact of anomalously low incoming [ultraviolet] radiation on the tropospheric circulation in the North Atlantic region are very small… suggesting that the unusually low solar activity contributed little, if any, to the observed NAO anomaly during the 2009/10 winter.”

OK then, what did cause the unusual weather patterns during the 2009-10 winter?

The results of this study, therefore, increase the likelihood that both the development and persistence of negative NAO phase resulted from internal atmospheric dynamical processes.

Translation: Random variability.

To drive this finding home, here’s another soon-to-be-released paper (D’Arrigo et al., 2011) that uses tree ring-based reconstructions of atmospheric circulation patterns and finds that a similar set of conditions (including a negative NAO value second only to the 2009-10 winter) was responsible for the historically harsh winter of 1783-84 in the eastern U.S. and western Europe, a winter widely noted by historians. It followed the stupendous eruption of the Icelandic volcano Laki the previous summer, and the frigid and snowy winter conditions have long been blamed on the volcano. In fact, Benjamin Franklin commented as much.

But in their new study, Roseanne D’Arrigo and colleagues conclude that the harshness of that winter primarily was the result of anomalous atmospheric circulation patterns that closely resembled those observed during the winter of 2009-10, and that the previous summer’s volcanic eruption played a far less prominent role:

Our results suggest that Franklin and others may have been mistaken in attributing winter conditions in 1783-4 mainly to Laki or another eruption, rather than unforced variability.

Similarly, conditions during the 2009-10 winter likely resulted from natural [atmospheric] variability, not tied to greenhouse gas forcing… Evidence thus suggests that these winters were linked to the rare but natural occurrence of negative NAO and El Niño events.

The point is that natural variability can and does produce extreme events on every time scale, from days (e.g., individual storms) and weeks (e.g., the Russian heat wave) to months (e.g., the winter of 2009-10), decades (e.g., the lack of global warming since 1998), centuries (e.g., the Little Ice Age), millennia (e.g., the cycle of major Ice Ages), and eons (e.g., snowball earth).

Folks would do well to keep this in mind next time global warming is being posited for the weather disaster du jour. Almost assuredly, it is all hype and little might.

Too bad these results weren’t given a “hearing” in the House!

References:

D’Arrigo, R., et al., 2011. The anomalous winter of 1783-1784: Was the Laki eruption or an analog of the 2009–2010 winter to blame? Geophysical Research Letters, in press.

Dole, R., et al., 2011. Was there a basis for anticipating the 2010 Russian heat wave? Geophysical Research Letters, in press.

Jung et al., 2011. Origin and predictability of the extreme negative NAO winter of 2009/10. Geophysical Research Letters, in press.

Min, S-K., et al., 2011. Human contribution to more-intense precipitation extremes. Nature, 470, 378-381.

Pall, P., et al., 2011. Anthropogenic greenhouse gas contribution to flood risk in England and Wales in autumn 2000. Nature, 470, 382-386.

Supreme Court Takes Up Butterfly Effect

As Congress debates cap-and-trade, new fuel standards, and subsidies for “green” companies, some still feel that political solutions to global warming are not moving fast enough. In the present case, American Electric Power Co. v. Connecticut, eight states and New York City sued several public utilities (including the federal Tennessee Valley Authority), alleging that their carbon dioxide emissions contribute to global warming.

This is the third major lawsuit to push global warming into the courts (another being Comer v. Murphy Oil USA, in which Cato also filed a brief). All of these suits try to use the common law doctrine of nuisance—which, for example, lets you sue your neighbor if his contaminated water flows onto your land and kills your lawn—to attack carbon emitters. None of them had gotten very far until the Second Circuit vacated a lower-court ruling and allowed the claims here to proceed.

But the judiciary was not meant to be the sole method for resolving grievances with the government—even if everything looks like a nail to lawyers who only have a hammer. After all, there are two other co-equal branches, the legislative and executive, which are constitutionally committed to unique roles in our system of separation of powers. The doctrine of “standing” exists in part to ensure that the judiciary is not used to solve issues that properly belong to those other branches. Toward this end, the Constitution allows courts to hear only actual “cases or controversies” that can feasibly be resolved by a court.

Cato thus filed a brief supporting the defendant utilities’ successful request for Supreme Court review, and has now filed another brief supporting their position before the Court. Cato’s latest brief first argues that no judicial solution is possible here because the chain of causation between the defendants’ carbon emissions and the alleged harm caused by global warming is so attenuated that it resembles the famed “butterfly effect.” Just as butterflies should not be sued for causing tsunamis, a handful of utility companies in the Northeastern United States should not be sued for the complex (and disputed) harms of global warming.

Second, we contend that, even if the plaintiffs can demonstrate causation, it is unconstitutional for courts to make nuanced policy decisions that should be left to the legislature—and this is true regardless of the science of global warming. Just as it’s improper for a legislature to pass a statute punishing a particular person (bill of attainder), it’s beyond courts’ constitutional authority—under the “political question doctrine”—to determine wide-ranging policies in which numerous considerations must be weighed in anything but an adversarial litigation process.

If a court were to adjudicate the claims here and issue an order dictating emissions standards, two things would happen: 1) the elected branches would be encouraged to abdicate to the courts their responsibilities for addressing complex and controversial policy issues, and 2) an already difficult situation would become nearly intractable as regulatory agencies and legislative actors butt heads with court orders issued across the country in quickly multiplying global warming cases. These inevitable outcomes are precisely why the standing and political question doctrines exist.

Dissatisfaction with the decisions and pace of government does not give someone the right to sue over anything. Or, as Chief Justice Marshall once said, “If the judicial power extended to every question under the laws of the United States … [t]he division of power [among the branches of government] could exist no longer, and the other departments would be swallowed up by the judiciary.”

The Supreme Court will hear arguments in American Electric Power Co. v. Connecticut on April 19.

Special thanks to Trevor Burrus, who contributed to this post.

The Current Wisdom: The Short-Term Climate Trend Is Not Your Friend

The Current Wisdom is a series of monthly posts in which Senior Fellow Patrick J. Michaels reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

The Current Wisdom only comments on science appearing in the refereed, peer-reviewed literature, or that has been peer-screened prior to presentation at a scientific congress.

***********

It seems like everyone, from exalted climate scientists to late-night amateur tweeters, can get a bit over-excited about short-term fluctuations, reading into them deep cosmic and political meaning, when they are likely the statistical hiccups of our mathematically surly atmosphere.

There have been some major errors in forecasts of recent trends. Perhaps the most famous were made by NASA’s James Hansen, who in 1988 overestimated warming between then and now by a whopping 40% or so.

But it is easy to get snookered by short-term fluctuations. As shown in Figure 1, it is quite obvious that there has been virtually no net change in temperature since 1997, allowing for the fact that measurement errors in global average surface temperature are easily a tenth of a degree or more. (The magnitude of those errors will be considered in a future Current Wisdom.)

Figure 1. Annual global average surface temperature anomaly (°C), 1997-2010 (data source: Hadley Center).

Some who are concerned about environmental regulation without good science have seized upon this 13-year stretch as “proof” that there is no such thing as global warming driven by carbon dioxide.  More on that at the end of this Wisdom.

Similarly, periods of seemingly rapid warming can prompt scientists to see changes where there aren’t any.

Consider a landmark paper published in 2000 in Geophysical Research Letters by Tom Karl, a prominent researcher who heads our National Climatic Data Center (NCDC) and who just finished a stint as President of the American Meteorological Society. He couldn’t resist the climatic blip that occurred prior to the current stagnation of warming, namely the very warm episode of the late 1990s.

Cooler heads at the time noted that it was an artifact of the great El Niño of 1997-98, a periodic warming of the tropical Pacific that has been coming and going for millions of years.

Nonetheless, the paper was published and accompanied by a flashy press release titled “Global warming may be accelerating.”  

What Karl did was to examine the 16 consecutive months of record-high temperatures (beginning in May 1997) and to calculate the chance that this could happen, given the fairly pokey warming rate that was occurring—approximately 0.17°C (0.31°F) per decade. He concluded there was less than a five percent probability, unless the warming rate had suddenly increased.
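A minimal Monte Carlo sketch of this kind of test (my own toy illustration, not Karl’s actual method): simulate monthly anomalies as a linear trend plus persistent noise, then count how often 16 consecutive months each set a record for their calendar month. Only the two trend rates come from the text; the noise model and every other parameter are assumptions, so the absolute probabilities are meaningless and only the comparison between the two rates matters.

```python
import numpy as np

rng = np.random.default_rng(42)

def ar1_noise(n, phi=0.9, sd=0.1):
    """Persistent AR(1) monthly noise, a crude stand-in for El Nino-like
    excursions; phi and sd are assumed values, not from Karl's paper."""
    e = rng.normal(0.0, sd, n)
    x = np.empty(n)
    x[0] = e[0]
    for i in range(1, n):
        x[i] = phi * x[i - 1] + e[i]
    return x

def record_run_probability(trend_per_decade, n_years=100, n_sims=500,
                           run_length=16):
    """Estimate the chance that some stretch of `run_length` consecutive
    months each sets a new record for its calendar month."""
    months = n_years * 12
    t = np.arange(months) / 120.0               # elapsed time in decades
    hits = 0
    for _ in range(n_sims):
        series = trend_per_decade * t + ar1_noise(months)
        best = np.full(12, -np.inf)             # running record per calendar month
        run = 0
        for i in range(months):
            if series[i] > best[i % 12]:        # a record for this calendar month
                best[i % 12] = series[i]
                run += 1
            else:
                run = 0
            if i >= 240 and run >= run_length:  # ignore a 20-year spin-up
                hits += 1
                break
    return hits / n_sims

for rate in (0.17, 0.30):                       # the two rates from the text
    print(f"{rate:.2f} C/decade: p = {record_run_probability(rate):.3f}")
```

The faster the background trend, the more likely a long run of records; Karl ran that logic in reverse, inferring from the observed run that the faster rate was the better bet.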

From the press release:

Karl and colleagues conclude that there is only a small chance that the string of record high temperatures in 1997-98 was simply an unusual event, rather than a change point, the start of a new and faster ongoing trend.

He also gave a number:  “…the probability of observing the record temperatures is more likely with high average rates of warming, around 3°C [5.4°F]/century,” which works out to 0.3°C per decade.

Our Figure 2 shows what was probabilistically forecast beginning in May 1997, and what actually happened. Between then and now, according to this paper, global temperatures should have warmed around 0.4°C (0.7°F). The observed warming rate for the last 13.5 years—a span that includes the dramatically warm temperatures beginning in 1997—was a paltry 0.06°C (0.11°F) per decade.

Figure 2. Prior to mid-1997, the observed warming trend (dashed line) was 0.17°/decade.  Karl said there was a greater than 95% probability that 1997-8 would mark a “change point”, where warming would accelerate to around 0.30°/decade.  Since then, the rate has been 0.06°/decade, or 20% of what was forecast.
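A quick back-of-envelope check of the numbers in that caption, simply restating the text’s arithmetic in code:

```python
# Rates in °C per decade over the ~13.5 years since mid-1997.
years = 13.5
forecast = 0.30 * years / 10   # ~0.41 C of warming at the "change point" rate
observed = 0.06 * years / 10   # ~0.08 C at the rate actually measured
print(f"forecast: {forecast:.2f} C, observed: {observed:.2f} C")
print(f"observed/forecast: {0.06 / 0.30:.0%}")   # 20% of what was forecast
```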

Karl did provide some statistical wiggle room. While concluding that there was less than a 5% chance that the warming rate hadn’t increased, he wrote that “unusual events can occur” and that there still was a chance (given as less than 5%) that 1997-98 was just a statistical hiccup, which it ultimately proved to be.

The press release couldn’t resist the “it’s worse than we thought” mindset that pervades climate science:

Since completing the research, the data for 1999 has been compiled. The researchers found that 1999 was the fifth warmest year on record, although as a La Niña year it would normally be cooler [than what? ed.].

“La Niña” is the cool phase of El Niño, which drops temperatures about as much as El Niño raises them. What the press release and the GRL paper completely neglected to mention is that the great warm year of 1998 was a result of a “natural” El Niño superimposed upon the overall slight warming trend.

In other words, there was every reason to believe at that time that the anomalous temperatures were indeed a statistical blip resulting from a very high-amplitude version of a natural oscillation in the earth’s climate that occurred every few years.

Now, back to the last 13 years. The puny recent changes may also just be our atmosphere’s make-up call for the sudden warming of the late 1990s, or another hiccup.

It is characteristic for climate models whose carbon dioxide increase resembles the one being observed to produce constant rates of warming. There’s a good reason for this. Temperature responds logarithmically—i.e., less and less—to changes in this gas as its concentration increases. But the concentration tends to increase exponentially—i.e., more and more. The combination of an increasingly damped response to an ever-increasing input tends to resemble a straight line, or a constant rate of warming.
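The cancellation is easy to verify with a few lines of arithmetic. In this sketch the sensitivity and the growth rate are illustrative values of my own choosing; the point is that a logarithmic response to a purely exponential concentration is exactly a straight line in time.

```python
import numpy as np

S = 2.5      # assumed sensitivity, degrees C of warming per doubling of CO2
C0 = 280.0   # preindustrial concentration, ppm
k = 0.005    # assumed exponential CO2 growth rate, per year (~0.5%/yr)

t = np.arange(101)          # years 0..100
C = C0 * np.exp(k * t)      # "more and more": exponential concentration
T = S * np.log2(C / C0)     # "less and less": logarithmic response

# For a pure exponential the two effects cancel exactly:
# T(t) = S * k * t / ln(2), a perfectly constant warming rate.
print(f"rate: {S * k / np.log(2) * 10:.2f} C/decade; T(100 yr) = {T[-1]:.2f} C")
```

Real-world concentrations grow only roughly exponentially, which is why modeled warming is only roughly linear.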

Indeed, Karl noted in his paper (and I have noted in virtually every public lecture I give) that “projections of temperature change in the next [i.e. the 21st] century, using [the United Nations’] business as usual scenarios…have relatively constant rates of global temperature increase”. It’s just that those constant rates tend to be higher than the one that is being observed. The average rate of warming predicted for this century by the UN is about 2.5°C per century, while the observed rate has been, as predicted, constant—but with a lower value of 1.7°C per century. As Figure 3 shows, this rate has been remarkably constant for over three decades.

 

Figure 3. Annual global average surface temperature anomaly (°C), 1976-2010 (data source: Hadley Center).  It’s hard to imagine a more constant trend, despite the 1998 peak and the subsequent torpid warming.

The bottom line is that short-term trends are not your friends when talking about long-term climate change.

References

Hansen, J.E., et al., 1988. Global climate changes as forecast by Goddard Institute for Space Studies three-dimensional model. Journal of Geophysical Research, 93, 9341-9364.

Karl, T. R., R. W. Knight, and B. Baker, 2000. The record breaking global temperatures of 1997 and 1998: Evidence for an increase in the rate of global warming? Geophysical Research Letters, 27, 719-722.

Michaels, P. J., and P. C. Knappenberger, 2009. Scientific Shortcomings in the EPA’s Endangerment Finding from Greenhouse Gases, Cato Journal, 29, 497-521, http://www.cato.org/pubs/journal/cj29n3/cj29n3-8.pdf.


The Current Wisdom: Better Model, Less Warming

The Current Wisdom is a series of monthly posts in which Senior Fellow Patrick J. Michaels reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

The Current Wisdom only comments on science appearing in the refereed, peer-reviewed literature, or that has been peer-screened prior to presentation at a scientific congress.


Better Model, Less Warming

Bet you haven’t seen this one on TV:  A newer, more sophisticated climate model has lost more than 25% of its predicted warming!  You can bet that if it had predicted that much more warming it would have made the local paper.

The change resulted from a more realistic simulation of the way clouds work, resulting in a major reduction in the model’s “climate sensitivity,” which is the amount of warming predicted for a doubling of  the concentration of atmospheric carbon dioxide over what it was prior to the industrial revolution.

Prior to the modern era, atmospheric carbon dioxide concentrations, as measured in air trapped in ice in the high latitudes (which can be dated year-by-year), were pretty constant at around 280 parts per million (ppm). No wonder CO2 is called a “trace gas”—there really is not much of it around.

The current concentration is pushing about 390 ppm, an increase of about 40% in 250 years. This is a pretty good indicator of the amount of “forcing” or warming pressure that we are exerting on the atmosphere. Yes, there are other global warming gases going up, like the chlorofluorocarbons (refrigerants now banned by treaty), but the modern climate religion is that these are pretty much being cancelled by reflective “aerosol” compounds that go into the air along with the combustion of fossil fuels, mainly coal.

Most projections have carbon dioxide doubling to a nominal 600 ppm somewhere in the second half of this century, absent major technological changes (and history tells us that assuming none is very shaky). But the “sensitivity” is not reached as soon as we hit the doubling, thanks to the fact that it takes a lot of time to warm the ocean (just as it takes a lot of time to warm up a big pot of water with a small burner).

So the “sensitivity” is much closer to the temperature rise that a model projects about 100 years from now – assuming (again, shakily) that we ultimately switch to power sources that don’t release dreaded CO2 into the atmosphere somewhere around the time its concentration doubles.
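To illustrate the ocean lag, here is a minimal one-box sketch (an assumption-laden toy of mine, not any modeling group’s actual scheme): the realized temperature relaxes toward the equilibrium response with an e-folding time tau, so when the concentration reaches its doubled value the warming is still well short of the full sensitivity.

```python
import numpy as np

S, tau = 2.5, 30.0                     # assumed: C per doubling; ocean lag, years
years = 200
C = np.linspace(280.0, 600.0, years)   # assumed linear CO2 ramp, ppm

T = 0.0
for y in range(years):
    T_eq = S * np.log2(C[y] / 280.0)   # equilibrium response to forcing so far
    T += (T_eq - T) / tau              # ocean inertia: relax toward equilibrium

print(f"equilibrium warming: {T_eq:.2f} C, realized after {years} yr: {T:.2f} C")
```

With these made-up numbers the realized warming at year 200 runs a few tenths of a degree behind the equilibrium value, which is the gap the text is describing.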

The bottom line is that lower sensitivity means less future warming as a result of anthropogenic greenhouse gas emissions. So our advice: keep working on the models; eventually, they may actually arrive at something close to the puny rate of warming that is being observed.

At any rate, improvements to the Japanese-developed Model for Interdisciplinary Research on Climate (MIROC) are the topic of a new paper by Masahiro Watanabe and colleagues in the current issue of the Journal of Climate. This modeling group has been working on a new version of their model (MIROC5) to be used in the upcoming 5th Assessment Report of the United Nations’ Intergovernmental Panel on Climate Change, due in late 2013. Two incarnations of the previous version (MIROC3.2) were included in the IPCC’s 4th Assessment Report (2007) and contribute to the IPCC “consensus” of global warming projections.

The high-resolution version (MIROC3.2(hires)) was quite a doozy – responsible for far and away the greatest projected global temperature rise (see Figure 1). And the medium-resolution model (MIROC3.2(medres)) is among the top five warmest models. Together, the two MIROC models undoubtedly act to increase the overall model ensemble mean warming projection and expand the top end of the “likely” range of temperature rise.

FIGURE 1

Global temperature projections under the “midrange” scenario for greenhouse-gas emissions produced by the IPCC’s collection of climate models.  The MIROC high resolution model (MIROC3.2(hires)) is clearly the hottest one, and the medium range one isn’t very far behind.

The reason that the MIROC3.2 versions produce so much warming is that their sensitivity is very high: 4.3°C (7.7°F) for the high-resolution version and 4.0°C (7.2°F) for the medium-resolution version. These sensitivities are very near the high end of the distribution of climate sensitivities from the IPCC’s collection of models (see Figure 2).

FIGURE 2

Equilibrium climate sensitivities of the models used in the IPCC AR4 (with the exception of the MIROC5). The MIROC3.2 sensitivities are highlighted in red and lie near the upper end of the collection of model sensitivities. The new, improved MIROC5, which was not included in the IPCC AR4, is highlighted in magenta and lies near the low end of the model climate sensitivities (data from IPCC Fourth Assessment Report, Table 8.2, and Watanabe et al., 2010).

Note that the highest sensitivity is not necessarily in the hottest model, as observed warming is dependent upon how the model deals with the slowness of the oceans to warm.

The situation is vastly different in the new MIROC5 model. Watanabe et al. report that the climate sensitivity is now 2.6°C (4.7°F) – more than 25% less than in the previous version of the model.[1] If the MIROC5 had been included in the IPCC’s AR4 collection of models, its climate sensitivity of 2.6°C would have been found near the low end of the distribution (see Figure 2), rather than pushing the high extreme as MIROC3.2 did.

And to what do we owe this large decline in the modeled climate sensitivity? According to Watanabe et al., a vastly improved handling of cloud processes involving “a prognostic treatment for the cloud water and ice mixing ratio, as well as the cloud fraction, considering both warm and cold rain processes.” In fact, the improved cloud scheme—which produces clouds that compare more favorably with satellite observations—projects that under a warming climate, low-altitude clouds become a negative feedback rather than the positive feedback the old version of the model projected.[2] Instead of enhancing the CO2-induced warming, low clouds are now projected to retard it.

Here is how Watanabe et al. describe their results:

A new version of the global climate model MIROC was developed for better simulation of the mean climate, variability, and climate change due to anthropogenic radiative forcing….

MIROC5 reveals an equilibrium climate sensitivity of 2.6K, which is 1K lower than that in MIROC3.2(medres)…. This is probably because in the two versions, the response of low clouds to an increasing concentration of CO2 is opposite; that is, low clouds decrease (increase) at low latitudes in MIROC3.2(medres) (MIROC5).[3]

Is the new MIROC model perfect? Certainly not.  But is it better than the old one? It seems quite likely.  And the net result of the model improvements is that the climate sensitivity and therefore the warming projections (and resultant impacts) have been significantly lowered. And much of this lowering comes as the handling of cloud processes—still among the most uncertain of climate processes—is improved upon. No doubt such improvements will continue into the future as both our scientific understanding and our computational abilities increase.

Will this lead to an even greater reduction in climate sensitivity and projected temperature rise?  There are many folks out there (including this author) that believe this is a very distinct possibility, given that observed warming in recent decades is clearly beneath the average predicted by climate models. Stay tuned!

References:

Intergovernmental Panel on Climate Change, 2007.  Fourth Assessment Report, Working Group 1 report, available at http://www.ipcc.ch.

Watanabe, M., et al., 2010. Improved climate simulation by MIROC5: Mean states, variability, and climate sensitivity. Journal of Climate, 23, 6312-6335.


[1] Watanabe et al. report that the sensitivity of MIROC3.2(medres) is 3.6°C (6.5°F), which is less than what was reported in the 2007 IPCC report. So 25% is likely a conservative estimate of the reduction in warming.

[2] Whether enhanced cloudiness enhances or cancels carbon-dioxide warming is one of the core issues in the climate debate, and is clearly not “settled” science.

[3] Degrees Kelvin (K) are the same as degrees Celsius (C) when looking at relative, rather than absolute, temperatures.

The Current Wisdom

The Current Wisdom is a series of monthly posts in which Senior Fellow Patrick J. Michaels reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

The Current Wisdom only comments on science appearing in the refereed, peer-reviewed literature, or that has been peer-screened prior to presentation at a scientific congress.

History to Repeat:  Greenland’s Ice to Survive, United Nations to Continue Holiday Party

This year’s installment of the United Nations’ annual climate summit (technically known as the 16th meeting of the Conference of the Parties to the Framework Convention on Climate Change) has come and gone in Cancun. Nothing substantial came of it policy-wise; just the usual attempts by the developing world to shake down our already shaky economy in the name of climate change. News-wise, probably the biggest story was that, during the conference, Cancun broke an all-time daily low temperature record. Last year’s confab in Copenhagen was pelted by snowstorms and subsumed in miserable cold. President Obama attended, failed to forge any meaningful agreement, and fled back to beat a rare Washington blizzard. He lost.

But as surely as every holiday season now includes one of these enormous jamborees, dire climate stories appeared daily. Polar bear cubs are endangered! Glaciers are melting!!

Or so beat the largely overhyped drums, based upon this or that press release from Greenpeace or the World Wildlife Fund.

And, of course, no one bothered to mention a blockbuster paper appearing in Nature the day before the end of the Cancun confab, which reassures us that Greenland’s ice cap and glaciers are a lot more stable than alarmists would have us believe. That would include Al Gore, fond of his lurid maps showing the melting of all of Greenland’s ice submerging Florida.

Ain’t gonna happen.

The disaster scenario goes like this: Summer temperatures in Greenland are warming, leading to increased melting and the formation of ephemeral lakes on the ice surface. This water eventually finds a crevasse and then a way down thousands of feet to the bottom of a glacier, where it lubricates the underlying surface, accelerating the seaward march of the ice. Increase the temperature even more, and massive amounts of ice are deposited into the ocean by the year 2100, catastrophically raising sea levels.

According to Christian Schoof of the University of British Columbia (UBC), “The conventional view has been that meltwater permeates the ice from the surface and pools under the base of the ice sheet….This water then serves as a lubricant between the glacier and the earth underneath it….”

And, according to Schoof, that’s just not the way things work. A UBC press release about his Nature article noted that he found that “a steady meltwater supply from gradual warming may in fact slow down the glacier flow, while sudden water input could cause glaciers to speed up and spread.”

Indeed, Schoof finds that sudden water inputs, such as would occur with heavy rain, are responsible for glacial accelerations, but these last only one or a few days.

The bottom line?  A warming climate has very little to do with accelerating ice flow, but weather events do.

How important is this?  According to University of Leeds Professor Andrew Shepherd, who studies glaciers via satellite, “This study provides an elegant solution to one of the two key ice sheet instability problems” noted by the United Nations in their last (2007) climate compendium.  “It turns out that, contrary to popular belief, Greenland ice sheet flow might not be accelerated by increased melting after all,” he added.

I’m not so sure that those who hold the “popular belief” can explain why Greenland’s ice didn’t melt away thousands of years ago. For millennia after the end of the last ice age (approximately 11,000 years ago), strong evidence indicates that the Eurasian arctic averaged nearly 13°F warmer in July than it is now.

That’s because there are trees buried and preserved in the acidic Siberian tundra, and they can be carbon dated.  Where there is no forest today—because it’s too cold in summer—there were trees, all the way to the Arctic Ocean and even on some of the remote Arctic islands that are bare today. And, back then, thanks to the remnants of continental ice, the Arctic Ocean was smaller and the North American and Eurasian landmasses extended further north.

That work was by Glen MacDonald, from UCLA’s Geography Department. In his landmark 2000 paper in Quaternary Research, he noted that the only way that the Arctic could become so warm is for there to be a massive incursion of warm water from the Atlantic Ocean.  The only “gate” through which that can flow is the Greenland Strait, between Greenland and Scandinavia.

So, Greenland had to have been warmer for several millennia, too.

Now let’s do a little math to see if the “popular belief” about Greenland ever had any basis in reality.

In 2009, the University of Copenhagen’s B. M. Vinther and 13 coauthors published the definitive history of Greenland climate back to the ice age, studying ice cores taken over the entire landmass. An exceedingly conservative interpretation of their results is that Greenland was 1.5°C (2.7°F) warmer for the period from 5,000 to 9,000 years ago, which is also the warm period in Eurasia that MacDonald detected. The integrated warming is given by multiplying the time (4,000 years) by the warming (1.5°C), and works out (in Celsius) to 6,000 “degree-years.”

Now let’s assume that our dreaded emissions of carbon dioxide spike the temperature there some 4°C.  Since we cannot burn fossil fuel forever, let’s put this in over 200 years.  That’s a pretty liberal estimate given that the temperature there still hasn’t exceeded values seen before in the 20th century.  Anyway, we get 800 (4 x 200) degree-years.

If the ice didn’t come tumbling off Greenland after 6,000 degree-years, how is it going to do so after only 800?  The integrated warming of Greenland in the post-ice-age warming (referred to as the “climatic optimum” in textbooks published prior to global warming hysteria) is over seven times what humans can accomplish in 200 years.  Why do we even worry about this?
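The degree-years arithmetic, restated in code for anyone who wants to check it:

```python
# The text's comparison of integrated warming, in Celsius degree-years.
holocene_warmth = 1.5 * 4000    # 1.5 C sustained for ~4,000 years
assumed_future = 4.0 * 200      # an assumed 4 C sustained for 200 years
print(holocene_warmth, assumed_future, holocene_warmth / assumed_future)
# -> 6000.0 800.0 7.5  (i.e., "over seven times")
```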

So we can all sleep a bit better.  Florida will survive.  And, we can also rest assured that the UN will continue its outrageous holiday parties, accomplishing nothing, but living large.  Next year’s is in Durban, South Africa, yet another remote warm spot hours of Jet-A away.

References:

MacDonald, G. M., et al., 2000.  Holocene treeline history and climatic change across Northern Eurasia.  Quaternary Research 53, 302-311.

Schoof, C., 2010. Ice-sheet acceleration driven by melt supply variability. Nature 468, 803-805.

Vinther, B.M., et al., 2009.  Holocene thinning of the Greenland ice sheet. Nature 461, 385-388.

The Shocking Truth: The Scientific American Poll on Climate Change

November’s Scientific American features a profile of Georgia Tech atmospheric scientist Judith Curry, who has committed the mortal sin of reaching out to other scientists who hypothesize that global warming isn’t the disaster it’s been cracked up to be. I have personal experience with this, as she invited me to give a research seminar in Tech’s prestigious School of Earth and Atmospheric Sciences in 2008. My lecture summarizing the reasons for doubting the apocalyptic synthesis of climate change was well received by an overflow crowd.

Written by Michael Lemonick, who hails from the shrill blog Climate Central, the article isn’t devoid of the usual swipes, calling her a “heretic,” which is hardly true at all. She’s simply another hardworking scientist who lets the data take her wherever it must, even if that leads her to question some of our more alarmist colleagues.

But, as a make-up call for calling attention to Curry, Scientific American has run a poll of its readers on climate change. Remember that SciAm has been shilling for the climate apocalypse for years, publishing a particularly vicious series of attacks on Denmark’s Bjorn Lomborg and his Skeptical Environmentalist. The magazine also featured NASA’s James Hansen and his outlandish claims on sea-level rise. Hansen has stated, under oath in a deposition, that a twenty-foot rise is quite possible within the next 89 years; oddly, he has failed to note that in 1988 he predicted that the West Side Highway in Manhattan would go permanently under water in twenty years.

SciAm probably expected a lot of people would agree with the key statement in their poll that the United Nations’ Intergovernmental Panel on Climate Change (IPCC) is “an effective group of government representatives and other experts.”

Hardly. As of this morning, only 16% of the 6655 respondents agreed.  84%—that is not a typo—described the IPCC as “a corrupt organization, prone to groupthink, with a political agenda.” 

The poll also asks “What should we do about climate change?” 69% say “nothing, we are powerless to stop it.” When asked about policy options, an astonishingly low 7% support cap-and-trade, which passed the U.S. House of Representatives in June, 2009, and cost approximately two dozen congressmen their seats.

The real killer is the question “What is causing climate change?” For this one, multiple answers are allowed: 26% said greenhouse gases from human activity, 32% solar variation, and 78% “natural processes.” (In reality, all three are causes of climate change.)

And finally, “How much would you be willing to pay to forestall the risk of catastrophic climate change?”  80% of the respondents said “nothing.”

Remember that this comes from what is hardly a random sample.  Scientific American is a reliably statist publication and therefore appeals to a readership that is skewed to the left of the political center.  This poll demonstrates that virtually everyone now acknowledges that the UN has corrupted climate science, that climate change is impossible to stop, and that futile attempts like cap-and-trade do nothing but waste money and burn political capital, things that Cato’s scholars have been saying for years.

The Current Wisdom

NOTE:  This is the first in a series of monthly posts in which Senior Fellow Patrick J. Michaels reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

The Current Wisdom only comments on science appearing in the refereed, peer-reviewed literature, or that has been peer-screened prior to presentation at a scientific congress.

The Iceman Goeth:  Good News from Greenland and Antarctica

How many of us have heard that global sea level will be about a meter—more than three feet—higher in 2100 than it was in the year 2000?  There are even scarier stories, circulated by NASA’s James E. Hansen, that the rise may approach 6 meters, altering shorelines and inundating major cities and millions of coastal inhabitants worldwide.

Figure 1. Model from a travelling climate change exhibit (currently installed at the Field Museum of Natural History in Chicago) of Lower Manhattan, showing what 5 meters (16 feet) of sea level rise would look like.

In fact, a major exhibition now at the prestigious Chicago Field Museum includes a 3-D model of Lower Manhattan under 16 feet of water—this despite the general warning from James Titus, who has been EPA’s sea-level authority for decades:

Researchers and the media need to stop suggesting that Manhattan or even Miami will be lost to a rising sea. That’s not realistic; it promotes denial and panic, not a reasoned consideration of the future.

Titus was commenting upon his 2009 publication on sea-level rise in the journal Environmental Research Letters.

The number one rule of grabbing attention for global warming is to never let the facts stand in the way of a good horror story, so advice like Titus’s is usually ignored.

The catastrophic sea level rise proposition is built upon the idea that large parts of the ice fields that lie atop Greenland and Antarctica will rapidly melt and slip into the sea as temperatures there rise. Proponents of this idea claim that the United Nations’ Intergovernmental Panel on Climate Change (IPCC), in its most recent (2007) Assessment Report, was far too conservative in its projections of future sea level rise—the mean value of which is a rise by the year 2100 of about 15 inches.

In fact, contrary to virtually all news coverage, the IPCC actually anticipates that Antarctica will gain ice mass (and lower sea level) as the climate warms, since the temperature there is too low to produce much melting even if it warms up several degrees, while the warmer air holds more moisture and therefore precipitates more snow. The IPCC projects Greenland to contribute a couple of inches of sea level rise as ice melts around its periphery.

Alarmist critics claim that the IPCC’s projections are based only on direct melt estimates rather than “dynamic” responses of the glaciers and ice fields to rising temperatures.

These include Al Gore’s favorite explanation—that melt water from the surface percolates down to the bottom of the glacier and lubricates its base, increasing flow and ultimately ice discharge. Alarmists like Gore and Hansen claim that Greenland and Antarctica’s glaciers will then “surge” into the sea, dumping an ever-increasing volume of ice and raising water levels worldwide.

The IPCC did not include this mechanism because it is very hypothetical and not well understood.  Rather, new science argues that the IPCC’s minuscule projections of sea level rise from these two great ice masses are being confirmed.

About a year ago, several different research teams reported that while glaciers may surge from time to time and increase ice discharge rates, these surges are not long-lived, and basal lubrication is not a major factor in them. One research group, led by Faezeh Nick, reported that “our modeling does not support enhanced basal lubrication as the governing process for the observed changes.” Nick and colleagues go on to find that short-term rapid increases in discharge rates are not stable, that “extreme mass loss cannot be dynamically maintained in the long term,” and ultimately conclude that “[o]ur results imply that the recent rates of mass loss in Greenland’s outlet glaciers are transient and should not be extrapolated into the future.”

But this is actually old news. The new news is that the commonly-reported (and commonly hyped) satellite estimates of mass loss from both Greenland and Antarctica were a result of improper calibration, overestimating ice loss by some 50%.

As with any new technology, it takes a while to get all the kinks worked out. In the case of the Gravity Recovery and Climate Experiment (GRACE) satellite-borne instrumentation, one of the major problems is interpreting just what exactly the satellites are measuring. When trying to ascertain mass changes (for instance, from ice loss) from changes in the earth’s gravity field, you first have to know how the actual land under the ice is vertically moving (in many places it is still slowly adjusting from the removal of the glacial ice load from the last ice age).

The latest research by a team led by Xiaoping Wu from Caltech’s Jet Propulsion Laboratory concludes that the adjustment models that were being used by previous researchers working with the GRACE data didn’t do that great of a job. Wu and colleagues enhanced the existing models by incorporating land movements from a network of GPS sensors, and employing more sophisticated statistics. What they found has been turning heads.

Using the GRACE measurements and the improved model, the new estimates of the rates of ice loss from Greenland and Antarctica  are only about half as much as the old ones.

Instead of Greenland losing ~230 gigatons of ice each year since 2002, the new estimate is 104 Gt/yr. And for Antarctica, the old estimate of ~150 Gt/yr has been modified to be about 87 Gt/yr.

How does this translate into sea level rise?

It takes about 37.4 gigatons of ice loss to raise the global sea level 0.1 millimeter—four thousandths of an inch. In other words, ice loss from Greenland is currently contributing just over one-fourth of a millimeter of sea level rise per year, or about one one-hundredth of an inch. Antarctica’s contribution is just under one-fourth of a millimeter per year. So together, these two regions—which contain 99% of all the land ice on earth—are losing ice at a rate that raises sea level by about half a millimeter per year, equivalent to a bit less than two hundredths of an inch. If this continues for the next 90 years, the total sea level rise contributed by Greenland and Antarctica by the year 2100 will amount to less than 2 inches.
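Here is the conversion spelled out, using only the numbers given above (the 37.4 Gt-per-0.1 mm figure and the revised Wu et al. rates):

```python
# Converting the revised GRACE ice-loss rates to sea-level rise.
MM_PER_GT = 0.1 / 37.4                        # ~37.4 Gt of ice per 0.1 mm

greenland, antarctica = 104.0, 87.0           # Gt/yr, the Wu et al. estimates
annual_mm = (greenland + antarctica) * MM_PER_GT
print(f"{annual_mm:.2f} mm/yr")               # ~0.51 mm per year
print(f"{annual_mm * 90 / 25.4:.1f} inches by 2100")   # ~1.8 inches
```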

Couple this with maybe 6-8 inches from the expansion of ocean water as it warms, and 2-3 inches from the melting of other land-based ice, and you get a sum total of about one foot of additional rise by century’s end.

This is about one-third of the 1-meter estimates and one-twentieth of the 6-meter estimates.

Things had better get cooking in a hurry if the real world is going to approach these popular estimates. And there are no signs that such a move is underway.

So far, the 21st century has been pretty much of a downer for global warming alarmists. Not only has the earth been warming at a rate considerably less than the average rate projected by climate models, but now the sea level rise is suffering a similar fate.

Little wonder that political schemes purporting to save us from these projected (non)calamities are also similarly failing to take hold.

References:

Nick, F. M., et al., 2009. Large-scale changes in Greenland outlet glacier dynamics triggered at the terminus. Nature Geoscience, DOI:10.1038, published on-line January 11, 2009.

Titus, J.G., et al., 2009. State and Local Governments Plan for Development of Most Land Vulnerable to Rising Sea Level along the U.S. Atlantic Coast, Environmental Research Letters 4 044008. (doi: 10.1088/1748-9326/4/4/044008).

Wu, X., et al., 2010. Simultaneous estimation of global present-day water transport and glacial isostatic adjustment. Nature Geoscience, published on-line August 15, 2010, doi: 10.1038/NGEO938.