Tag: climate change

Congress: The Least Dangerous Branch

That’s the topic of my Washington Examiner column this week. In it, I discuss last week’s budget battle and the failure of “policy riders” designed to rein in the Obama EPA’s attempts to regulate greenhouse gases without a congressional vote specifically authorizing it. The Obama team believes it has the authority to implement comprehensive climate change regulation, Congress be damned. Worse still, under current constitutional law–which has little to do with the actual Constitution–they’re probably right. Thanks to overbroad congressional delegation, “the Imperial Presidency Comes in Green, Too.” At home and abroad, the legislative branch sits on the sidelines as the executive state makes the law and wages war, despite the fact that “all legislative powers” the Constitution grants are vested in Congress, among them the power “to declare War.”

Yet, as I point out in the column, Congress retains every power the Constitution gave it–powers broad enough that talk of “co-equal branches” is a misnomer. Excerpt:

The constitutional scholar Charles Black once commented, “My classes think I am trying to be funny when I say that, by simple majorities,” Congress could shrink the White House staff to one secretary, and that, with a two-thirds vote, “Congress could put the White House up at auction.” (I sometimes find myself wishing they would.)

But Professor Black wasn’t trying to be funny: it’s in Congress’s power to do that. And if Congress can sell the White House, surely it can defund an illegal war and rein in a runaway bureaucracy.

If they don’t, it’s because they like the current system. And why wouldn’t they? It lets them take credit for passing high-minded, vaguely worded statutes, and take it again by railing against the bureaucracy when it imposes costs in the course of deciding what those statutes mean.

Last year, in the journal White House Studies [.pdf], I explored some of the reasons we’ve drifted so far from the original design:

Federalist 51 envisions a constitutional balance of power reinforced by the connection between “the interests of the man and the constitutional rights of the place.” Yet, as NYU’s Daryl Levinson notes, “beyond the vague suggestion of a psychological identification between official and institution, Madison failed to offer any mechanism by which this connection would take hold…. for most members, the psychological identification with party appears greatly to outweigh loyalty to the institution.” Levinson notes that when one party holds both branches, presidential vetoes greatly decrease, and delegation skyrockets. Under unified government, “the shared policy goals of, or common sources of political reward for, officials in the legislative and executive branches create cross-cutting, cooperative political dynamics rather than conflictual ones.”

Individual presidents have every reason to protect and expand their power; but individual senators and representatives lack similar incentive to defend Congress’s constitutional prerogatives. “Congress” is an abstraction. Congressmen are not, and their most basic interest is getting reelected. Ceding power can be a means toward that end: it allows members to have their cake and eat it too. They can let the president launch a war, reserving the right to criticize him if things go badly. And they can take credit for passing high-minded, vaguely worded statutes, and take it again by railing against the executive-branch bureaucracy when it imposes costs in the course of deciding what those statutes mean.

In David Schoenbrod’s metaphor, modern American governance is a “shell game,” with We the People as the rubes.  That game will go on unless and until the voters start holding Congress accountable for dodging responsibility.

The Current Wisdom: Overplaying the Human Contribution to Recent Weather Extremes

The Current Wisdom is a series of monthly posts in which Senior Fellow Patrick J. Michaels reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

The Current Wisdom only comments on science appearing in the refereed, peer-reviewed literature, or that has been peer-screened prior to presentation at a scientific congress.

**********

The recent publication of two articles in Nature magazine linking rainfall extremes (and flooding) to global warming, added to the heat in Russia and the floods in Pakistan in the summer of 2010 and the back-to-back cold and snowy winters in the eastern U.S. and western Europe, has gotten a lot of public attention.  This includes a recent hearing in the House of Representatives, despite its Republican majority.  Tying weather extremes to global warming, or using them as “proof” that warming doesn’t exist (see: snowstorms), is a popular rhetorical flourish by politicos of all stripes.

The hearing struck many as quite odd, inasmuch as it is much clearer than apocalyptic global warming that the House is going to pass meaningless legislation commanding the EPA to cease and desist from regulating greenhouse gas emissions.  “Meaningless” means that it surely will not become law.  Even in the long-shot event that it passes the Senate, the President will surely veto it, and there are nowhere near enough votes to override such an action.

Perhaps “wolf!” has been cried yet again.  A string of soon-to-be-published papers in the scientific literature finds that, despite all the hue and cry about global warming and recent extreme weather events, natural climate variability is to blame.

Where to start?  How about last summer’s Russian heat wave?

The Russian heat wave (and to some degree the floods in Pakistan) have been linked to the same large-scale, stationary weather system, called an atmospheric “blocking” pattern. When the atmosphere is “blocked,” it stays in the same configuration for a period of several weeks (or more) and keeps delivering the same weather to the same area for what can seem like an eternity to people in the way.  Capitalizing on the misery in Russia and Pakistan, atmospheric blocking was added to the list of things that were supposed to be “consistent with” anthropogenically stimulated global warming, a list which already, of course, included heat waves and floods. And thus the Great Russian Heat Wave of 2010 became part of global warming lore.

But then a funny thing happened – scientists with a working knowledge of atmospheric dynamics started to review the situation and found scant evidence of a global warming connection.

The first chink in the armor came back in the fall of 2010, when scientists from the Physical Sciences Division (PSD) of the Earth System Research Laboratory (ESRL) of the National Oceanic and Atmospheric Administration (NOAA) presented the results of their preliminary investigation on the web, and concluded that “[d]espite this strong evidence for a warming planet, greenhouse gas forcing fails to explain the 2010 heat wave over western Russia. The natural process of atmospheric blocking, and the climate impacts induced by such blocking, are the principal cause for this heat wave.”

The PSD folks have now followed this up with a new peer-reviewed article in the journal Geophysical Research Letters that rejects the global warming explanation. The paper is titled “Was There a Basis for Anticipating the 2010 Russian Heat Wave?” Turns out that there wasn’t.

To prove this, the research team, led by PSD’s Randall Dole, first reviewed the observed temperature history of the region affected by the heat wave (western Russia, Belarus, the Ukraine, and the Baltic nations). To start, they looked at the recent antecedent conditions: “Despite record warm globally-averaged surface temperatures over the first six months of 2010, Moscow experienced an unusually cold winter and a relatively mild but variable spring, providing no hint of the record heat yet to come.” Nothing there.

Then they looked at the long-term temperature record: “The July surface temperatures for the region impacted by the 2010 Russian heat wave shows no significant warming trend over the prior 130-year period from 1880 to 2009…. A linear trend calculation yields a total temperature change over the 130 years of -0.1°C (with a range of 0 to -0.4°C over the four data sets [they examined]).” There’s not a hint of a build-up to a big heat wave.
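
To make the quoted number concrete, here is a minimal sketch of that kind of linear-trend calculation: fit an ordinary least-squares line to a 130-year July temperature series and multiply the slope by the length of the record. The data below are synthetic placeholders, not the observations Dole et al. analyzed.

```python
# Minimal sketch of a linear-trend calculation like the one quoted above:
# fit an OLS line to a July temperature series and report the total change
# over the record (slope x record length). The data here are synthetic
# placeholders, NOT the observations used by Dole et al.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1880, 2010)                          # 130 Julys, 1880-2009
july_temps = 21.0 + rng.normal(0.0, 1.5, years.size)   # flat series plus noise (deg C)

slope, intercept = np.polyfit(years, july_temps, 1)    # deg C per year
total_change = slope * (years[-1] - years[0])          # total change over the record

print(f"Trend: {slope*10:+.3f} C/decade; total change 1880-2009: {total_change:+.2f} C")
```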

And as to the behavior of temperature extremes: “There is also no clear indication of a trend toward increasing warm extremes. The prior 10 warmest Julys are distributed across the entire period and exhibit only modest clustering earlier in this decade, in the 1980s and in the 1930s…. This behavior differs substantially from globally averaged annual temperatures, for which eleven of the last twelve years ending in 2006 rank among the twelve warmest years in the instrumental record since 1850….”

With regard to any indication that “global” warming was pushing temperatures higher in Russia and thus helped to fuel the extreme heat last summer, Dole et al. say this: “With no significant long-term trend in western Russia July surface temperatures detected over the period 1880-2009, mean regional temperature changes are thus very unlikely to have contributed substantially to the magnitude of the 2010 Russian heat wave.”

Next the PSD folks looked to see whether the larger-scale antecedent conditions, fed into climate models, would produce the atmospheric circulation patterns (i.e., blocking) that gave rise to the heat wave.  The tested “predictors” included patterns of sea surface temperature and arctic ice coverage, which most people feel have been subject to some human influence.  No relationship: “These findings suggest that the blocking and heat wave were not primarily a forced response to specific boundary conditions during 2010.”

In fact, the climate models exhibited no predilection for projecting increases in the frequency of atmospheric blocking patterns over the region as greenhouse gas concentrations increased. Just the opposite: “Results using very high-resolution climate models suggest that the number of Euro-Atlantic blocking events will decrease by the latter half of the 21st century.”

At this point, Dole and colleagues had about exhausted all lines of inquiry and summed things up:

 Our analysis points to a primarily natural cause for the Russian heat wave. This event appears to be mainly due to internal atmospheric dynamical processes that produced and maintained an intense and long-lived blocking event. Results from prior studies suggest that it is likely that the intensity of the heat wave was further increased by regional land surface feedbacks. The absence of long-term trends in regional mean temperatures and variability together with the model results indicate that it is very unlikely that warming attributable to increasing greenhouse gas concentrations contributed substantially to the magnitude of this heat wave.

Can’t be much clearer than that.

But that was last summer. What about the past two winters? Both were very cold in the eastern U.S., with record snow events and/or totals scattered about the country.

Cold, snow, and global warming? On Christmas Day 2010, the New York Times ran an op-ed by Judah Cohen, a long-range forecaster for the private forecasting firm Atmospheric and Environmental Research, outlining his theory as to how late-summer Arctic ice declines lead to more fall snow cover across Siberia, which in turn induces atmospheric circulation patterns to favor snowstorms along the East Coast of the U.S. Just last week, the Union of Concerned Scientists held a news conference where they handed out a press release headlined “Climate Change Makes Major Snowstorms Likely.” In that release, Mark Serreze, director of the National Snow and Ice Data Center, laid out his theory as to how the loss of Arctic sea ice is helping to provide more moisture to fuel winter snowstorms across the U.S. as well as altering atmospheric circulation patterns into a preferred state for big snowstorms. Weather Underground’s Jeff Masters chimed in with “Heavy snowstorms are not inconsistent with a warming planet.”

As is the wont for this Wisdom, let’s go back to the scientific literature.

Another soon-to-be-released paper, to appear in Geophysical Research Letters, describes the results of using the seasonal weather prediction model from the European Centre for Medium-Range Weather Forecasts (ECMWF) to help untangle the causes of the unusual atmospheric circulation patterns that gave rise to the harsh winter of 2009-2010 on both sides of the Atlantic. A team of ECMWF scientists led by Thomas Jung went back and did experiments changing the initial conditions that were fed into the ECMWF model and then assessed how well the model simulated the known weather patterns of the winter of 2009-2010. The different sets of initial conditions were selected so as to test all the pet theories behind the origins of the harsh winter.  Jung et al. describe their investigations this way: “Here, the origin and predictability of the unusual winter of 2009/10 are explored through numerical experimentation with the ECMWF Monthly forecasting system. More specifically, the role of anomalies in sea surface temperature (SST) and sea ice, the tropical atmospheric circulation, the stratospheric polar vortex, solar insolation and near surface temperature (proxy for snow cover) are examined.”

Here is what they found after running their series of experiments.

Arctic sea ice and sea surface temperature anomalies.  These are often associated with global warming caused by people. Finding:  “These results suggest that neither SST nor sea ice anomalies explain the negative phase of the NAO during the 2009/10 winter.”

(NAO is the commonly used abbreviation for the North Atlantic Oscillation – an atmospheric circulation pattern that can act to influence winter weather in the eastern U.S. and western Europe. A negative phase of the NAO is associated with cold and stormy weather, and during the winter of 2009-10 the NAO value was the lowest ever observed.)

A global warming-induced weakening stratospheric (upper-atmosphere) jetstream. “Like for the other experiments, these stratospheric relaxation experiments fail to reproduce the magnitude of the observed NAO anomaly.”

Siberian snow cover.  “The resulting [upper air patterns] show little resemblance with the observations…. The implied weak role of snow cover anomalies is consistent with other research….”

Solar variability.  “The experiments carried out in this study suggest that the impact of anomalously low incoming [ultraviolet] radiation on the tropospheric circulation in the North Atlantic region are very small… suggesting that the unusually low solar activity contributed little, if any, to the observed NAO anomaly during the 2009/10 winter.”

OK then, what did cause the unusual weather patterns during the 2009-10 winter?

The results of this study, therefore, increase the likelihood that both the development and persistence of negative NAO phase resulted from internal atmospheric dynamical processes.

Translation: Random variability.

To drive this finding home, here’s another soon-to-be-released paper (D’Arrigo et al., 2011) that uses tree-ring-based reconstructions of atmospheric circulation patterns and finds that a similar set of conditions (including a negative NAO value second only to the 2009-10 winter) was responsible for the historically harsh winter of 1783-84 in the eastern U.S. and western Europe, which was widely noted by historians. It followed the stupendous eruption of the Icelandic volcano Laki the previous summer. The frigid and snowy winter conditions have been blamed on the volcano. In fact, Benjamin Franklin commented as much.

But in their new study, Roseanne D’Arrigo and colleagues conclude that the harshness of that winter primarily was the result of anomalous atmospheric circulation patterns that closely resembled those observed during the winter of 2009-10, and that the previous summer’s volcanic eruption played a far less prominent role:

Our results suggest that Franklin and others may have been mistaken in attributing winter conditions in 1783-4 mainly to Laki or another eruption, rather than unforced variability.

Similarly, conditions during the 2009-10 winter likely resulted from natural [atmospheric] variability, not tied to greenhouse gas forcing… Evidence thus suggests that these winters were linked to the rare but natural occurrence of negative NAO and El Niño events.

The point is that natural variability can and does produce extreme events on every time scale: days (e.g., individual storms), weeks (e.g., the Russian heat wave), months (e.g., the winter of 2009-10), decades (e.g., the lack of global warming since 1998), centuries (e.g., the Little Ice Age), millennia (e.g., the cycle of major Ice Ages), and eons (e.g., snowball Earth).

Folks would do well to keep this in mind next time global warming is being posited for the weather disaster du jour. Almost assuredly, it is all hype and little might.

Too bad these results weren’t given a “hearing” in the House!

References:

D’Arrigo, R., et al., 2011. The anomalous winter of 1783-1784: Was the Laki eruption or an analog of the 2009–2010 winter to blame? Geophysical Research Letters, in press.

Dole, R., et al., 2011. Was there a basis for anticipating the 2010 Russian heat wave? Geophysical Research Letters, in press.

Jung, T., et al., 2011. Origin and predictability of the extreme negative NAO winter of 2009/10. Geophysical Research Letters, in press.

Min, S-K., et al., 2011. Human contribution to more-intense precipitation extremes. Nature, 470, 378-381.

Pall, P., et al., 2011. Anthropogenic greenhouse gas contribution to flood risk in England and Wales in autumn 2000. Nature, 470, 382-386.

Supreme Court Takes Up Butterfly Effect

As Congress debates cap-and-trade, new fuel standards, and subsidies for “green” companies, some still feel that political solutions to global warming are not moving fast enough. In the present case, American Electric Power Co. v. Connecticut, eight states and New York City sued several public utilities (including the federal Tennessee Valley Authority), alleging that their carbon dioxide emissions contribute to global warming.

This is the third major lawsuit to push global warming into the courts (another being Comer v. Murphy Oil USA, in which Cato also filed a brief). All of these suits try to use the common law doctrine of nuisance—which, for example, lets you sue your neighbor if his contaminated water flows onto your land and kills your lawn—to attack carbon emitters. None of them had gotten very far until the Second Circuit vacated a lower-court ruling and allowed the claims here to proceed.

But the judiciary was not meant to be the sole method for resolving grievances with the government—even if everything looks like a nail to lawyers who only have a hammer. After all, there are two other co-equal branches, the legislative and executive, which are constitutionally committed to unique roles in our system of separation of powers. The doctrine of “standing” exists in part to ensure that the judiciary is not used to solve issues that properly belong to those other branches. Toward this end, the Constitution allows courts to hear only actual “cases or controversies” that can feasibly be resolved by a court.

Cato thus filed a brief supporting the defendant utilities’ successful request for Supreme Court review, and has now filed another brief supporting their position before the Court. Cato’s latest brief first argues that no judicial solution is possible here because the chain of causation between the defendants’ carbon emissions and the alleged harm caused by global warming is so attenuated that it resembles the famed “butterfly effect.” Just as butterflies should not be sued for causing tsunamis, a handful of utility companies in the Northeastern United States should not be sued for the complex (and disputed) harms of global warming.

Second, we contend that, even if the plaintiffs can demonstrate causation, it is unconstitutional for courts to make nuanced policy decisions that should be left to the legislature—and this is true regardless of the science of global warming. Just as it’s improper for a legislature to pass a statute punishing a particular person (bill of attainder), it’s beyond courts’ constitutional authority—under the “political question doctrine”—to determine wide-ranging policies in which numerous considerations must be weighed in anything but an adversarial litigation process.

If a court were to adjudicate the claims here and issue an order dictating emissions standards, two things would happen: 1) the elected branches would be encouraged to abdicate to the courts their responsibilities for addressing complex and controversial policy issues, and 2) an already difficult situation would become nearly intractable as regulatory agencies and legislative actors butt heads with court orders issued across the country in quickly multiplying global warming cases. These inevitable outcomes are precisely why the standing and political question doctrines exist.

Dissatisfaction with the decisions and pace of government does not give someone the right to sue over anything. Or, as Chief Justice Marshall once said, “If the judicial power extended to every question under the laws of the United States … [t]he division of power [among the branches of government] could exist no longer, and the other departments would be swallowed up by the judiciary.”

The Supreme Court will hear arguments in American Electric Power Co. v. Connecticut on April 19.

Special thanks to Trevor Burrus, who contributed to this post.

The Current Wisdom: The Short-Term Climate Trend Is Not Your Friend

The Current Wisdom is a series of monthly posts in which Senior Fellow Patrick J. Michaels reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

The Current Wisdom only comments on science appearing in the refereed, peer-reviewed literature, or that has been peer-screened prior to presentation at a scientific congress.

***********

It seems like everyone, from exalted climate scientists to late-night amateur tweeters, can get a bit over-excited about short-term fluctuations, reading into them deep cosmic and political meaning, when they are likely the statistical hiccups of our mathematically surly atmosphere.

There have been some major errors in forecasts of recent trends. Perhaps the most famous were made by NASA’s James Hansen in 1988, who overestimated warming between then and now by a whopping 40% or so.

But it is easy to get snookered by short-term fluctuations.  As shown in Figure 1, it is quite obvious that there has been virtually no net change in temperature since 1997, allowing for the fact that measurement errors in global average surface temperature are easily a tenth of a degree or more. (The magnitude of those errors will be considered in a future Current Wisdom.)

Figure 1. Annual global average surface temperature anomaly (°C), 1997-2010 (data source: Hadley Center).

Some who are concerned about environmental regulation without good science have seized upon this 13-year stretch as “proof” that there is no such thing as global warming driven by carbon dioxide.  More on that at the end of this Wisdom.

Similarly, periods of seemingly rapid warming can prompt scientists to see changes where there aren’t any.

Consider a landmark paper published in 2000 in Geophysical Research Letters by Tom Karl, a prominent researcher who is the head of our National Climatic Data Center (NCDC) and who just finished a stint as President of the American Meteorological Society.  He couldn’t resist the climatic blip that occurred prior to the current stagnation of warming, namely the very warm episode of the late 1990s.

Cooler heads at the time noted that it was an artifact of the great El Niño of 1997-98, a periodic warming of the tropical Pacific that has been coming and going for millions of years.

Nonetheless, the paper was published and accompanied by a flashy press release titled “Global warming may be accelerating.”  

What Karl did was to examine the 16 consecutive months of record-high temperatures (beginning in May 1997) and to calculate the chance that this could happen given the fairly pokey warming rate that was occurring—approximately 0.17°C (0.31°F) per decade.  He concluded there was less than a five percent probability, unless the warming rate had suddenly increased.

From the press release:

Karl and colleagues conclude that there is only a small chance that the string of record high temperatures in 1997-98 was simply an unusual event, rather than a change point, the start of a new and faster ongoing trend.

He also gave a number:  “…the probability of observing the record temperatures is more likely with high average rates of warming, around 3°C [5.4°F]/century,” which works out to 0.3°C per decade.

Our Figure 2 shows what was probabilistically forecast beginning in May 1997 and what actually happened.  Between then and now, according to this paper, global temperatures should have warmed around 0.4°C (0.7°F).  The observed warming rate for the last 13.5 years—which includes the dramatically warm temperatures beginning in 1997—was a paltry 0.06°C (0.11°F) per decade.

Figure 2. Prior to mid-1997, the observed warming trend (dashed line) was 0.17°/decade.  Karl said there was a greater than 95% probability that 1997-8 would mark a “change point”, where warming would accelerate to around 0.30°/decade.  Since then, the rate has been 0.06°/decade, or 20% of what was forecast.
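
As a quick sanity check on the numbers above and in the caption, here is the back-of-the-envelope arithmetic converting the two per-decade rates into total warming over the 13.5-year span. This is simple arithmetic on the quoted rates, not a reproduction of Karl’s statistical analysis.

```python
# Back-of-the-envelope check of the rates quoted above (not Karl's analysis).
span_decades = 13.5 / 10.0           # mid-1997 through 2010, in decades

forecast_rate = 0.30                 # deg C/decade, Karl's "change point" rate
observed_rate = 0.06                 # deg C/decade, rate actually observed since then

print(f"Forecast warming: {forecast_rate * span_decades:.2f} C")   # ~0.41 C ("around 0.4")
print(f"Observed warming: {observed_rate * span_decades:.2f} C")   # ~0.08 C
print(f"Observed/forecast: {observed_rate / forecast_rate:.0%}")   # 20%
```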

Karl did provide some statistical wiggle room.  While noting the less than 5% chance that the warming rate hadn’t increased, he wrote that “unusual events can occur” and that there still was a chance (given as less than 5%) that 1997-98 was just a statistical hiccup, which it ultimately proved to be.

The press release couldn’t resist the “it’s worse than we thought” mindset that pervades climate science:

Since completing the research, the data for 1999 has been compiled.  The researchers found that 1999 was the fifth warmest year on record, although as a La Niña year it would normally be cooler [than what? ed.].

“La Niña” is the cool phase of El Niño, and it drops temperatures about as much as El Niño raises them. What the press release and the GRL paper completely neglected to mention is that the great warm year of 1998 was a result of the “natural” El Niño superimposed upon the overall slight warming trend.

In other words, there was every reason to believe at that time that the anomalous temperatures were indeed a statistical blip resulting from a very high-amplitude version of a natural oscillation in the earth’s climate that occurred every few years.

Now, back to the last 13 years. The puny recent changes may also just be our atmosphere’s make-up call for the sudden warming of the late 1990s, or another hiccup.

It is characteristic of climate models whose carbon dioxide increases resemble the observed one to produce constant rates of warming.  There’s a good reason for this.  Temperature responds logarithmically—i.e., less and less—to changes in this gas as its concentration increases.  But the concentration tends to increase exponentially—i.e., more and more.  The combination of an increasingly damped response to an ever-increasing rate of input tends to resemble a straight line, or a constant rate of warming.
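
A toy calculation makes the point: if the concentration grows exponentially, C(t) = C0·e^(kt), and temperature responds in proportion to ln(C/C0), the response works out to S·k·t/ln 2, a straight line in time. The sketch below uses arbitrary illustrative numbers (an assumed sensitivity S and growth rate k), not output from any actual climate model.

```python
# Toy illustration: a logarithmic response to an exponentially growing
# concentration is linear in time. Numbers are arbitrary, for illustration only.
import numpy as np

S = 2.5            # assumed sensitivity per CO2 doubling (deg C) -- illustrative
k = 0.005          # assumed exponential growth rate of CO2 (about 0.5%/yr)
C0 = 280.0         # pre-industrial concentration (ppm)

t = np.arange(0, 101)                    # years
C = C0 * np.exp(k * t)                   # exponential concentration growth
dT = S * np.log(C / C0) / np.log(2.0)    # logarithmic temperature response

# The response reduces to S*k*t/ln(2): a constant warming rate.
rate_per_decade = S * k * 10 / np.log(2.0)
print(f"Implied constant warming rate: {rate_per_decade:.2f} C/decade")
print(f"Max deviation from a straight line: {np.max(np.abs(dT - rate_per_decade/10*t)):.2e} C")
```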

Indeed, Karl noted in his paper (and I have noted in virtually every public lecture I give) that “projections of temperature change in the next [i.e., the 21st] century, using [the United Nations’] business as usual scenarios…have relatively constant rates of global temperature increase.”  It’s just that their constant rates tend to be higher than the one that is being observed.  The average rate of warming predicted for this century by the UN is about 2.5°C per century, while the observed rate has been, as predicted, constant—but at a lower value of 1.7°C per century.  As Figure 3 shows, this rate has been remarkably constant for over three decades.

 

Figure 3. Annual global average surface temperature anomaly (°C), 1976-2010 (data source: Hadley Center).  It’s hard to imagine a more constant trend, despite the 1998 peak and the subsequent torpid warming.

The bottom line is that short-term trends are not your friends when talking about long-term climate change.

References:

Hansen, J.E., et al., 1988. Global climate changes as forecast by Goddard Institute for Space Studies three-dimensional model. Journal of Geophysical Research, 93, 9341-9364.

Karl, T. R., R. W. Knight, and B. Baker, 2000. The record breaking global temperatures of 1997 and 1998: Evidence for an increase in the rate of global warming? Geophysical Research Letters, 27, 719-722.

Michaels, P. J., and P. C. Knappenberger, 2009. Scientific Shortcomings in the EPA’s Endangerment Finding from Greenhouse Gases, Cato Journal, 29, 497-521, http://www.cato.org/pubs/journal/cj29n3/cj29n3-8.pdf.

The Current Wisdom: Better Model, Less Warming

The Current Wisdom is a series of monthly posts in which Senior Fellow Patrick J. Michaels reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

The Current Wisdom only comments on science appearing in the refereed, peer-reviewed literature, or that has been peer-screened prior to presentation at a scientific congress.


**********

Bet you haven’t seen this one on TV:  A newer, more sophisticated climate model has lost more than 25% of its predicted warming!  You can bet that if it had predicted that much more warming it would have made the local paper.

The change resulted from a more realistic simulation of the way clouds work, resulting in a major reduction in the model’s “climate sensitivity,” which is the amount of warming predicted for a doubling of  the concentration of atmospheric carbon dioxide over what it was prior to the industrial revolution.

Prior to the modern era, atmospheric carbon dioxide concentrations, as measured in air trapped in ice in the high latitudes (which can be dated year-by-year), were pretty constant, at around 280 parts per million (ppm).  No wonder CO2 is called a “trace gas”—there really is not much of it around.

The current concentration is pushing about 390 ppm, an increase of about 40% in 250 years.  This is a pretty good indicator of the amount of “forcing,” or warming pressure, that we are exerting on the atmosphere.  Yes, there are other global warming gases going up, like the chlorofluorocarbons (refrigerants now banned by treaty), but the modern climate religion is that these are pretty much being cancelled out by reflective “aerosol” compounds that go into the air along with the combustion of fossil fuels, mainly coal.

Most projections have carbon dioxide doubling to a nominal 600 ppm somewhere in the second half of this century, absent major technological changes (which history tells us is a very shaky assumption).  But the “sensitivity” is not reached as soon as we hit the doubling, thanks to the fact that it takes a lot of time to warm the ocean (just as it takes a lot of time to warm up a big pot of water with a small burner).

So the “sensitivity” is much closer to the temperature rise that a model projects about 100 years from now – assuming (again, shakily) that we ultimately switch to power sources that don’t release dreaded CO2 into the atmosphere somewhere around the time its concentration doubles.
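
That “pot of water” point can be shown with the simplest possible energy-balance toy: a single heat reservoir with a large effective heat capacity, nudged by a growing forcing and relaxing toward equilibrium. The sketch below is a generic illustration with assumed parameter values (sensitivity, ocean depth, forcing ramp); it is not one of the models discussed here.

```python
# One-box energy-balance sketch of ocean thermal lag: temperature approaches
# the equilibrium ("sensitivity") value only many decades after the forcing
# stops growing. All parameter values are assumed, for illustration only;
# this is a generic textbook toy, not any model discussed in this post.
import numpy as np

S = 2.6                      # assumed equilibrium sensitivity per doubling (deg C)
F2x = 3.7                    # canonical forcing for a CO2 doubling (W/m^2)
lam = F2x / S                # feedback parameter (W/m^2 per deg C)
C_heat = 4.2e6 * 500         # assumed effective heat capacity: ~500 m of ocean (J/m^2/K)

seconds_per_year = 3.15e7
years = np.arange(0, 301)
# Forcing ramps up to a full doubling over 100 years, then is held constant.
forcing = F2x * np.minimum(years, 100) / 100.0

T = np.zeros(years.size)
for i in range(1, years.size):
    # dT/dt = (F - lam*T) / C_heat: relax toward the equilibrium value F/lam.
    T[i] = T[i-1] + seconds_per_year * (forcing[i-1] - lam * T[i-1]) / C_heat

print(f"Warming when the doubling is reached (year 100): {T[100]:.2f} C")
print(f"Warming 100 years after that (year 200):         {T[200]:.2f} C")
print(f"Equilibrium ('sensitivity') value:               {S:.2f} C")
```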

The bottom line is that lower sensitivity means less future warming as a result of anthropogenic greenhouse gas emissions. So our advice… keep on working on the models; eventually, they may actually arrive at something close to the puny rate of warming that is being observed.

At any rate, improvements to the Japanese-developed Model for Interdisciplinary Research on Climate (MIROC) are the topic of a new paper by Masahiro Watanabe and colleagues in the current issue of the Journal of Climate. This modeling group has been working on a new version of their model (MIROC5) to be used in the upcoming 5th Assessment Report of the United Nations’ Intergovernmental Panel on Climate Change, due in late 2013. Two incarnations of the previous version (MIROC3.2) were included in the IPCC’s 4th Assessment Report (2007) and contribute to the IPCC “consensus” of global warming projections.

The high resolution version (MIROC3.2(hires)) was quite a doozy – responsible for far and away the greatest projected global temperature rise (see Figure 1). And the medium resolution model (MIROC3.2(medres)) is among the Top 5 warmest models. Together, the two MIROC models undoubtedly act to increase the overall model ensemble mean warming projection and expand the top end of the “likely” range of temperature rise.

FIGURE 1

Global temperature projections under the “midrange” scenario for greenhouse-gas emissions produced by the IPCC’s collection of climate models.  The MIROC high resolution model (MIROC3.2(hires)) is clearly the hottest one, and the medium range one isn’t very far behind.

The reason that the MIROC3.2 versions produce so much warming is that their sensitivity is very high: 4.3°C (7.7°F) for the high-resolution version and 4.0°C (7.2°F) for the medium-resolution version.  These sensitivities are very near the high end of the distribution of climate sensitivities from the IPCC’s collection of models (see Figure 2).

FIGURE 2

Equilibrium climate sensitivities of the models used in the IPCC AR4 (with the exception of MIROC5). The MIROC3.2 sensitivities are highlighted in red and lie near the upper end of the collection of model sensitivities.  The new, improved MIROC5, which was not included in the IPCC AR4, is highlighted in magenta and lies near the low end of the model climate sensitivities (data from IPCC Fourth Assessment Report, Table 8.2, and Watanabe et al., 2010).

Note that the highest sensitivity does not necessarily belong to the hottest model, as the warming a model produces over the century also depends upon how it deals with the slowness of the oceans to warm.

The situation is vastly different in the new MIROC5 model.  Watanabe et al. report that the climate sensitivity is now 2.6°C (4.7°F) – more than 25% less than in the previous version of the model.[1] If MIROC5 had been included in the IPCC’s AR4 collection of models, its climate sensitivity of 2.6°C would have been found near the low end of the distribution (see Figure 2), rather than pushing the high extreme as MIROC3.2 did.
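
The “more than 25%” figure (and footnote [1]’s point that it is conservative) is easy to verify from the sensitivities quoted in this post:

```python
# Quick check of the quoted reduction in climate sensitivity,
# using only the numbers given in this post and its footnote [1].
miroc5 = 2.6          # MIROC5 sensitivity (deg C)
old_versions = {
    "MIROC3.2(medres), as reported by Watanabe et al.": 3.6,
    "MIROC3.2(medres), as listed in the IPCC AR4":      4.0,
    "MIROC3.2(hires),  as listed in the IPCC AR4":      4.3,
}
for label, old in old_versions.items():
    print(f"{label}: {(old - miroc5) / old:.0%} lower in MIROC5")
# Roughly 28%, 35%, and 40% -- so "more than 25%" is indeed conservative.
```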

And to what do we owe this large decline in the modeled climate sensitivity?  According to Watanabe et al., a vastly improved handling of cloud processes involving “a prognostic treatment for the cloud water and ice mixing ratio, as well as the cloud fraction, considering both warm and cold rain processes.”  In fact, the improved cloud scheme—which produces clouds that compare more favorably with satellite observations—projects that under a warming climate low-altitude clouds become a negative feedback rather than acting as a positive feedback, as the old version of the model projected.[2] Instead of enhancing the CO2-induced warming, low clouds are now projected to retard it.

Here is how Watanabe et al. describe their results:

A new version of the global climate model MIROC was developed for better simulation of the mean climate, variability, and climate change due to anthropogenic radiative forcing….

MIROC5 reveals an equilibrium climate sensitivity of 2.6K, which is 1K lower than that in MIROC3.2(medres)…. This is probably because in the two versions, the response of low clouds to an increasing concentration of CO2 is opposite; that is, low clouds decrease (increase) at low latitudes in MIROC3.2(medres) (MIROC5).[3]

Is the new MIROC model perfect? Certainly not.  But is it better than the old one? It seems quite likely.  And the net result of the model improvements is that the climate sensitivity and therefore the warming projections (and resultant impacts) have been significantly lowered. And much of this lowering comes as the handling of cloud processes—still among the most uncertain of climate processes—is improved upon. No doubt such improvements will continue into the future as both our scientific understanding and our computational abilities increase.

Will this lead to an even greater reduction in climate sensitivity and projected temperature rise?  There are many folks out there (including this author) who believe this is a very distinct possibility, given that observed warming in recent decades is clearly beneath the average predicted by climate models. Stay tuned!

References:

Intergovernmental Panel on Climate Change, 2007.  Fourth Assessment Report, Working Group 1 report, available at http://www.ipcc.ch.

Watanabe, M., et al., 2010. Improved climate simulation by MIROC5: Mean states, variability, and climate sensitivity. Journal of Climate, 23, 6312-6335.


[1] Watanabe et al. report that the sensitivity of MIROC3.2(medres) is 3.6°C (6.5°F), which is less than what was reported in the 2007 IPCC report.  So 25% is likely a conservative estimate of the reduction in warming.

[2] Whether enhanced cloudiness enhances or cancels carbon-dioxide warming is one of the core issues in the climate debate, and is clearly not “settled” science.

[3] Kelvins (K) are the same size as degrees Celsius (°C), so temperature differences are identical on the two scales.