Tag: climate change

With or Without a “Pause,” Climate Models Still Project Too Much Warming

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

A new paper just hit the scientific literature that argues that the apparent pause in the rise in global average surface temperatures during the past 16 years was really just a slowdown. 

As you may imagine, this paper, by Kevin Cowtan and Robert Way, is being hotly discussed in the global warming blogs, with reactions ranging from a warm embrace by the global-warming-is-going-to-be-bad-for-us crowd to revulsion from the human-activities-have-no-effect-on-the-climate claque.

The lukewarmers (a school we take some credit for establishing) seem to be taking the results in stride.  After all, the “pause,” curious as it is/was, is not central to the primary argument: yes, human activities are pressuring the planet to warm, but the rate of warming is going to be much slower than is being projected by the collection of global climate models upon which mainstream projections of future climate change (and the resulting climate alarm, i.e., calls for emission regulations, etc.) are based.

Under the adjustments to the observed global temperature history put together by Cowtan and Way, the models fare a bit better than they do with the unadjusted temperature record. That is, the observed temperature trend over the past 34 years (the period of record analyzed by Cowtan and Way) is a tiny bit closer to the average trend from the collection of climate models used in the new report from the U.N.’s Intergovernmental Panel on Climate Change (IPCC) than is the old temperature record.

Specifically, while the trend in observed global temperatures from 1979-2012 as calculated by Cowtan and Way is 0.17°C/decade, it is 0.16°C/decade in the temperature record compiled by the U.K. Hadley Center (the record that Cowtan and Way adjusted).  Because of the sampling errors associated with trend estimation, these values are not significantly different from one another.  Whether the 0.17°C/decade is significantly different from the climate model average simulated trend during that period of 0.23°C/decade is discussed extensively below.
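For readers who want to see how such trends and their sampling uncertainty are typically computed, here is a minimal Python sketch using ordinary least squares on annual anomalies (the data series is synthetic and the code is ours, not the authors’):

```python
# A minimal sketch (not the authors' code) of how a decadal trend and its
# sampling uncertainty are estimated from annual anomalies via ordinary
# least squares. The anomaly series below is synthetic.
import numpy as np
from scipy import stats

years = np.arange(1979, 2013)                    # 1979-2012, as in the paper
rng = np.random.default_rng(0)
anoms = 0.017 * (years - years[0]) + rng.normal(0, 0.1, years.size)

res = stats.linregress(years, anoms)
trend = res.slope * 10                           # deg C per decade
ci95 = 1.96 * res.stderr * 10                    # approx. 95% half-width

print(f"trend = {trend:.2f} +/- {ci95:.2f} C/decade")
# Two estimates (say, 0.16 vs. 0.17 C/decade) are statistically
# indistinguishable when intervals this wide overlap. Note that serial
# correlation in the anomalies makes the true uncertainty larger still.
```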

But, suffice it to say that an insignificant difference of 0.01°C/decade in the global trend measured over more than 30 years is pretty small beer and doesn’t give model apologists very much to get happy over.

Instead, the attention is being deflected to “The Pause”—the leveling off of global surface temperatures during the past 16 years (give or take). Here, the new results from Cowtan and Way show that during the period 1997-2012, instead of a statistically insignificant rise at a rate of 0.05°C/decade, as contained in the “old” temperature record, the rise becomes a statistically significant 0.12°C/decade. “The Pause” is transformed into “The Slowdown,” and alarmists rejoice because global warming hasn’t stopped after all. (If the logic sounds backwards, it does to us as well: if you were worried about catastrophic global warming, wouldn’t you rejoice at findings indicating that future climate change will be only modest, rather than at results to the contrary?)

The science behind the new Cowtan and Way research is still being digested by the community of climate scientists and other interested parties alike. The main idea is that the existing compilations of the global average temperature are very data-sparse in the high latitudes. And since the Arctic (more so than the Antarctic) is warming faster than the global average, the lack of data there may mean that the global average temperature trend is being underestimated.

Cowtan and Way developed a methodology that relies on other, limited sources of temperature information from the Arctic (such as floating buoys and satellite observations) to estimate how the surface temperature is behaving in regions lacking more traditional temperature observations (the authors released an informative video explaining their research, which may help you better understand what they did). They found that the warming in the data-sparse regions was progressing faster than the global average (especially during the past couple of years), and that when they included the data derived for these regions in the computation of the global average temperature, the global trend was higher than previously reported—just how much higher depended on the period over which the trend was calculated. As we showed, the trend more than doubled over the period 1997-2012, but barely increased at all over the longer period 1979-2012.
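To convey the flavor of spatial infilling, here is a deliberately simplified Python sketch using inverse-distance weighting. This is only an illustration of the general concept, not their method: Cowtan and Way actually used kriging and a hybrid approach that blends in satellite data, and every coordinate and temperature below is invented.

```python
# Illustrative sketch only: Cowtan and Way used kriging (plus a hybrid
# satellite method); this shows the simpler concept of estimating a value
# at an unobserved location from nearby stations. All numbers are invented.
import numpy as np

def idw_infill(known_latlon, known_vals, target_latlon, power=2.0):
    """Inverse-distance-weighted estimate at an unobserved grid cell."""
    def to_xyz(latlon):
        # chord distance on the unit sphere as a crude distance measure
        lat, lon = np.radians(latlon[..., 0]), np.radians(latlon[..., 1])
        return np.stack([np.cos(lat) * np.cos(lon),
                         np.cos(lat) * np.sin(lon),
                         np.sin(lat)], axis=-1)
    d = np.linalg.norm(to_xyz(known_latlon) - to_xyz(target_latlon), axis=-1)
    w = 1.0 / np.maximum(d, 1e-6) ** power       # nearer stations weigh more
    return np.sum(w * known_vals) / np.sum(w)

stations = np.array([[71.3, -156.8], [78.2, 15.6], [64.8, -147.7]])  # invented
anoms = np.array([2.1, 1.7, 1.3])                # invented anomalies (deg C)
print(idw_infill(stations, anoms, np.array([85.0, 0.0])))  # a high-Arctic cell
```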

Figure 1 shows the impact on the global average temperature trend for all trend lengths between 10 and 35 years (incorporating our educated guess as to what the 2013 temperature anomaly will be), and compares that to the distribution of climate model simulations over the same periods. Statistically speaking, instead of there being a clear inconsistency (i.e., the observed trend value falls outside of the range which encompasses 95% of all modeled trends) between the observations and the climate model simulations for lengths ranging generally from 11 to 28 years, and a marginal inconsistency (i.e., the observed trend value falls outside of the range which encompasses 90% of all modeled trends) for most of the other lengths, the observations now track closely along the marginal inconsistency line, although trends of length 17, 19, 20, and 21 years remain clearly inconsistent with the collection of modeled trends. Still, throughout the entirety of the 35-yr period (ending in 2013), the observed trend lies far below the model average simulated trend (additional information on the impact of the new Cowtan and Way adjustments on the modeled/observed temperature comparison can be found here).

 

Figure 1. Temperature trends ranging in length from 10 to 35 years (ending in a preliminary 2013) calculated using the data from the U.K. Hadley Center (blue dots), the adjustments to the U.K. Hadley Center data made by Cowtan and Way (red dots) extrapolated through 2013, and the average of climate model simulations (black dots). The range that encompasses 90% (light grey lines) and 95% (dotted black lines) of climate model trends is also included.
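The consistency test summarized in Figure 1 is straightforward to sketch. Below is a minimal Python illustration of our construction (the model trend values are invented for the demo; the real comparison uses the actual trends from the IPCC’s model ensemble):

```python
# Sketch of the envelope test behind Figure 1 (our construction, not the
# paper's code): compare an observed trend against the distribution of
# trends from an ensemble of model runs. Model trends here are synthetic.
import numpy as np

model_trends = np.random.default_rng(1).normal(0.23, 0.06, 100)  # invented
obs_trend = 0.17                                  # C/decade, 1979-2012

lo90, hi90 = np.percentile(model_trends, [5, 95])       # 90% envelope
lo95, hi95 = np.percentile(model_trends, [2.5, 97.5])   # 95% envelope

if not (lo95 <= obs_trend <= hi95):
    print("clearly inconsistent (outside 95% of modeled trends)")
elif not (lo90 <= obs_trend <= hi90):
    print("marginally inconsistent (outside 90% of modeled trends)")
else:
    print("consistent with the model distribution")
```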

The Cowtan and Way analysis is an attempt at using additional types of temperature information, or extracting “information” from records that have already told their stories, to fill in the missing data in the Arctic.  There are concerns about the appropriateness of both the data sources and the methodologies applied to them.  

A major one concerns the applicability of satellite data at such high latitudes. The nature of the satellite’s orbit forces it to look “sideways” in order to sample polar regions. In fact, the orbit is such that the highest-latitude areas cannot be seen at all. This is compounded by the fact that cold regions can develop substantial near-ground temperature “inversions,” in which temperature actually rises with height, so there is no straightforward relationship between the surface temperature and the temperature of the lower atmosphere, where the satellites take their measurements. If the nature of this complex relationship is not constant in time, an error is introduced into the Cowtan and Way analysis.

Another unresolved problem arises when extrapolating land-based weather station data far into the Arctic Ocean. While land temperatures can bounce around a lot, much of the ocean is at least partially ice-covered for many months of the year. Under “well-mixed” conditions, the ice constrains the near-surface temperature to values near the freezing point of salt water, whether or not the associated land station is much warmer or colder.

You can run this experiment yourself by filling a glass with a mix of ice and water and stirring it well. The water surface temperature must hover around 32°F until all the ice melts. Given that the near-surface air temperature stays close to the water temperature, the limitations of extrapolated land data become obvious.

Considering all of the above, we advise caution with regard to Cowtan and Way’s findings.  While adding high-Arctic data should increase the observed trend, the nature of the data means that the amount of additional rise is subject to further revision.  As they themselves note, there’s quite a bit more work to be done in this area.

In the meantime, their results have tentatively breathed a small hint of life back into the climate models, basically buying them a bit more time—time for either the observed temperatures to start rising rapidly, as current models expect, or for the modelers to try to fix/improve the cloud processes, oceanic processes, and other processes of variability (both natural and anthropogenic) that lie behind what would otherwise be clearly overheated projections.

We’ve also taken a look at how “sensitive” the results are to the length of the ongoing pause/slowdown.  Our educated guess is that the “bit” of time that the Cowtan and Way findings bought the models is only a few years long, and it is a fact, not a guess, that each additional year at the current rate of lukewarming increases the disconnection between the models and reality.

 

Reference:

Cowtan, K., and R. G. Way, 2013. Coverage bias in the HadCRUT4 temperature series and its impact on recent temperature trends. Quarterly Journal of the Royal Meteorological Society, doi: 10.1002/qj.2297.

 

Climate Models’ Tendency to Simulate Too Much Warming and the IPCC’s Attempt to Cover That Up

The Current Wisdom is a series of monthly articles in which Patrick J. Michaels and Paul C. “Chip” Knappenberger, from Cato’s Center for the Study of Science, review interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

 

The biggest criticism to emerge so far regarding the new Fifth Assessment Report from the U.N.’s Intergovernmental Panel on Climate Change (IPCC) is that it generally fails to acknowledge how poorly climate model simulations of the earth’s temperature evolution compare with actual observations. If the models cannot accurately simulate known climate variability and change, using them for policy purposes is a fool’s errand.

There are two lines of evidence that converge to show that the climate models are largely failing to accurately simulate observed climate behavior.

The first is a collection of about ten research papers (comprising 16 separate analyses), published in the scientific literature beginning in 2011, that collectively indicate that the earth’s equilibrium climate sensitivity—that is, how much the earth’s average surface temperature rises as a result of a doubling of the atmospheric carbon dioxide concentration—is about 2°C, give or take about 0.5°C (Figure 1). You can find the details here.

Figure 1. Climate sensitivity estimates from new research published since 2010 (colored), compared with the range of estimates from the climate models incorporated into the IPCC Fifth Assessment Report (AR5; black). The arrows indicate the 5 to 95 percent confidence bounds for each estimate, along with the best estimate (the median of each probability density function, or the mean of multiple estimates; colored vertical line). Ring et al. (2012) present four estimates of the climate sensitivity, and the red box encompasses those estimates. The light grey vertical bar is the mean of the 16 estimates from the new findings. The mean climate sensitivity (3.2°C) of the climate models used in the IPCC AR5 is 60 percent greater than the mean of the recent estimates (2.0°C).
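Several of these lower estimates rest on a simple global energy-budget relation: equilibrium sensitivity is the forcing from a CO2 doubling times the observed warming, divided by the forcing change net of ocean heat uptake. Here is a minimal sketch with representative round numbers (ours, not drawn from any particular paper):

```python
# Sketch of the energy-budget approach behind several recent low-sensitivity
# estimates. All inputs are representative round numbers, not from any paper.
F2X = 3.7    # W/m^2, forcing from a CO2 doubling (standard value)
dT = 0.75    # deg C, observed warming over the analysis period (illustrative)
dF = 1.9     # W/m^2, total forcing change over the period (illustrative)
dQ = 0.5     # W/m^2, ocean heat uptake over the period (illustrative)

ecs = F2X * dT / (dF - dQ)   # equilibrium climate sensitivity
print(f"ECS ~ {ecs:.1f} C per CO2 doubling")   # ~2.0 C with these inputs
```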

 

Band-aids Can’t Fix the New IPCC Report

The U.N.’s Intergovernmental Panel on Climate Change (IPCC) today released the Summary for Policymakers (SPM) of the physical science volume of its Fifth Assessment Report. The SPM is the most widely-read section of the IPCC reports and purports to summarize and highlight the contents of the thousand-odd pages of the full report. The SPM is agreed to word by word by the international attendees of the IPCC’s final editorial meeting which concluded as the SPM was released.

The Humpty Dumpty-esque report once claiming to represent the “consensus of scientists” has fallen from its exalted wall and cracked to pieces under the burdensome weight of its own cumbersome and self-serving processes, which is why all the governments’ scientists and all the governments’ men cannot put the IPCC report together again.

The pace of climate science far surpasses the glacial movements of large, cumbersome international efforts at consensus building, such as the IPCC, which is why the new report has experienced such a disastrous crack-up.

For example, just this past May, a blockbuster finding was published that the climate sensitivity—how much the earth’s average surface temperature increases as a response to increasing greenhouse gas concentrations—is some 40% less than the average value characteristic of the collection of climate models that the IPCC used to produce the projections of future climate change—projections which are at the heart of the IPCC reports. But by the time this paper was published (and several others with similar conclusions), it was far too late to go back and try to fix the climate models and then rerun the projections.

The fact is that the IPCC’s climate models need fixing. Prima facie evidence is that they cannot even track the evolution of the broadest measure of climate, the earth’s average temperature, for the last 10-20 years.  Despite being widely obvious to everyone, this failure didn’t find its way into the scientific literature (though not for lack of trying) until earlier this month.

As a result, the latest science on two key issues (how much the earth will warm as a result of human greenhouse gas emissions, and how well climate models perform in projecting that warming) is largely not incorporated in the new IPCC report.

That renders the new IPCC report, and its “four years’ work by hundreds of experts,” not only obsolete upon its release, but completely useless as a basis for forming opinions (or policy) related to human energy choices and their influence on the climate.

The IPCC report should be torn up and tossed out, and with it, the entire IPCC process which produced such a misleading (and potentially dangerous) document.

We review the problems with the new IPCC report and the political consequences of relying on it in a couple of recent op-eds, one in the National Review (“The IPCC Political Suicide Pill”) and the other at Fox News (“UN’s new climate change report an embarrassment, self-serving and beyond misleading”), as well as a myriad of blog posts.

US Carbon Dioxide Emissions Fall as Global Emissions Rise

A new report from the International Energy Agency is sparking headlines across the media. “Global carbon dioxide emissions soared to record high in 2012” proclaimed USA Today; The Weather Channel led “Carbon dioxide emissions rose to record high in 2012”; and the Seattle Post-Intelligencer added “The world pumped a record amount of carbon dioxide in the atmosphere in 2012.”

The figure below (taken from the IEA summary) provides the rest of the story.

It shows a breakdown of the change in carbon dioxide emissions from 2011 to 2012 from various regions of the globe.

 

Notice that the U.S. is far and away the leader in reducing carbon dioxide (CO2) emissions, while China primarily is responsible for pushing global CO2 emissions higher. In fact, CO2 emissions growth in China more than offsets all the CO2 savings that we have achieved in the U.S.

This pattern will hold for the foreseeable future. Domestic actions to reduce carbon dioxide emissions will not produce a decline in the overall atmospheric carbon dioxide concentration.  The best we can hope to achieve is to slow the rate of growth of the atmospheric concentration—an effect that can be pressed only until our emissions are reduced to zero. The resulting climate impact is small and transient.
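A back-of-envelope calculation (ours, using standard round numbers rather than anything from the IEA report) shows how annual emissions translate into growth of the atmospheric concentration:

```python
# Back-of-envelope: mapping annual CO2 emissions onto growth in the
# atmospheric concentration. Conversion factor and airborne fraction are
# standard round numbers; the emissions figure is illustrative, near the
# reported 2012 level.
emissions_gtco2 = 31.6          # global CO2 emissions, GtCO2/yr
gtc = emissions_gtco2 * 12/44   # convert GtCO2 to GtC (molar mass ratio)
ppm_per_gtc = 1 / 2.13          # ~2.13 GtC raises the atmosphere by 1 ppm
airborne_fraction = 0.45        # rough share of emissions staying in the air

print(f"~{gtc * ppm_per_gtc * airborne_fraction:.1f} ppm/yr growth")  # ~1.8
```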

And before anyone gets too uppity about the effectiveness of “green” measures in the U.S., the primary reason for the U.S. emissions decline is new technology from the fossil fuel industry, which is leading to cheap coal being displaced by even cheaper natural gas for the generation of electricity. As luck would have it, the chemistry works out such that burning natural gas produces the same amount of energy for only about half the CO2 emissions of burning coal.
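The “about half” figure is easy to check against standard emission factors. A rough sketch (the factors below are approximate, EIA-style values):

```python
# Rough check on the "about half" claim using approximate emission factors
# per unit of heat content (EIA-style round numbers).
coal_kgco2_per_mmbtu = 95.0   # bituminous coal, approx.
gas_kgco2_per_mmbtu = 53.1    # natural gas, approx.

print(f"gas emits {gas_kgco2_per_mmbtu / coal_kgco2_per_mmbtu:.0%} "
      "of coal's CO2 per unit of heat")   # ~56%, i.e., roughly half
```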

A new report from the U.S. Energy Information Administration estimates that as a result of these new technologies (e.g., hydraulic fracturing and horizontal drilling), globally, the technically recoverable resources of natural gas are nearly 50% greater than prior to their development.

Currently, the U.S. is the leader in the deployment of these technologies, and the effects are obvious (as seen in the figure above).  If and when more countries start to employ such technologies to recover natural gas, perhaps the growth in global carbon dioxide emissions will begin to slow (as compared to current projections).

Considering that possibility, along with the new, lower estimates of how sensitive the global average temperature is to carbon dioxide emissions, the case for alarming climate change (and a carbon tax to try to mitigate it) is fading fast.

Low Climate Sensitivity Making its Way into the Mainstream Press

When it comes to the press, the New York Times pretty much defines “mainstream.”

And Justin Gillis is the Times’ mainstream reporter on the global warming beat.

So it is somewhat telling that his article on Tuesday, “A Change in Temperature,” was largely dedicated (although begrudgingly) to facing up to the possibility that mainstream estimates of climate sensitivity (i.e., those produced by the U.N.’s Intergovernmental Panel on Climate Change) are too large.

Readers of this blog are probably well aware of the reasons why.

Despite our delusions of grandeur, this blog isn’t the mainstream press, although we do seek to influence it. Maybe we are being successful.

Throughout Gillis’ article are sprinkled references to “climate contrarians,” and even the recognition of the effort by such contrarians to push the new science on low climate sensitivity to the forefront of the discussion to change the existing politics of climate change.

Gillis writes:

Still, the recent body of evidence — and the political use that climate contrarians are making of it to claim that everything is fine — sheds some light on where we are in our scientific and public understanding of the risks of climate change.

We at the Cato’s Center for the Study of Science are at the leading edge of efforts to present a more accurate representation of the science of climate change through our testimony to Congress, public comments and review of government documents and proposals, media appearances, op-eds, and serial posts on this blog, among other projects. We emphasize that current regulations and proposed legislation are based on outdated, and likely wrong, projections of future climate impacts from human carbon dioxide emissions from the use of fossil fuels to produce energy.

Gillis recognizes the positives of a low climate sensitivity value:

“…tantalizing possibility that climate change might be slow and limited enough that human society could adapt to it without major trauma.”

“It will certainly be good news if these recent papers stand up to critical scrutiny, something that will take at least a year or two to figure out.”

“So if the recent science stands up to critical examination, it could indeed turn into a ray of hope…”

But, the “mainstream” is slow to change. And so despite the good news about climate sensitivity, Gillis closes his article by pointing out that, in his opinion, the political response to climate change has been “weak” (contrary to our view), and that therefore:

Even if climate sensitivity turns out to be on the low end of the range, total emissions may wind up being so excessive as to drive the earth toward dangerous temperature increases.

Clearly we still have work to do, but there are signs of progress!

CO2: 400ppm and Growing

The atmospheric concentration of carbon dioxide (CO2) has recently reached a “milestone” of 400 parts per million (ppm). In some circles, this announcement has been met with consternation and gnashing of teeth. The proper reaction is celebration.

The growth in the atmospheric CO2 concentration over the past several centuries is primarily the result of mankind’s thirst for energy—largely in the form of fossil fuels.  According to the World Bank, fossil fuel energy supplies about 80% of the world’s energy production—a value which has been pretty much constant for the past 40 years. During that time, the global population increased by 75%, and global energy use doubled. Global per capita energy use increased, while global energy use per $1000 GDP declined.  We are using more energy, but we are using it more efficiently. In the developed world, life expectancy has doubled since the dawn of the fossil fuel era.
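The arithmetic behind the per-capita claim is simple enough to verify in a couple of lines (our illustration of the figures quoted above):

```python
# Quick arithmetic check on the preceding paragraph's claims.
pop_growth = 1.75        # population rose 75% over the past 40 years
energy_growth = 2.0      # global energy use doubled over the same span
per_capita_change = energy_growth / pop_growth
print(f"per-capita energy use up ~{per_capita_change - 1:.0%}")  # ~14%
```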

Of course, burning fossil fuels to produce energy results in the emission of carbon dioxide into the atmosphere, tipping the natural balance of annual CO2 flux and leading to a gradual build-up.

There are two primary externalities that result from our emissions of carbon dioxide into the atmosphere—1) an enhancement of the greenhouse effect, which results in an alteration of the energy flow in the earth’s climate and a general tendency to warm the global average surface temperature, and 2) an enhancement of the rate of photosynthesis in plants and a general tendency to result in more efficient growth and an overall healthier condition of vegetation (including crops).  There’s incontrovertible evidence that the planet is both warmer and greener than it was 100 years ago.

As we continually document (see here for our latest post), more and more science is suggesting that the rate (and thus magnitude at any point in time) of CO2-induced climate change is not as great as commonly portrayed. The lower the rate of change, the lower the resulting impact. If the rate is low enough, carbon dioxide emissions confer a net benefit. We’d like to remind readers that “it’s not the heat, it’s the sensitivity,” when it comes to carbon dioxide, and the sensitivity appears to have been overestimated.

As new science erodes the foundation of climate worry, new technologies are expanding recoverable fossil fuel resources. Horizontal drilling and hydraulic fracturing have opened up vast expanses of fossil fuel resources—mainly natural gas—that were untouchable just a few years ago. The discovery that the world is awash in hundreds of years of recoverable fuels is a game-changer, given the strong correlation between energy use per capita and life expectancy.

400 ppm of carbon dioxide in the atmosphere should remind us of our continuing success at expanding the global supply of energy to meet a growing demand. That success ultimately leads to an improvement in the global standard of living and a reduction in vulnerability to the vagaries of weather and climate.

400 ppm is cause for celebration. “A world lit only by fire” is not.

Getting Our Due

In the Diary feature of this week’s The Spectator, rational optimist Matt Ridley has a collection of rather random observations from his daily life that have him thinking about (or maybe wishing for, since Old Man Winter has been slow to loosen his grip on the U.K. and Western Europe, much as he has across the Eastern U.S.) anthropogenic global warming.

What has his attention is that global warming just doesn’t seem to be going according to plan. And for those who have bought into that plan, their plan-driven actions are starting to make them look foolish.

But it’s not as if we haven’t “told you so”—a fact that Ridley draws attention to in the closing segment of his article.

David Rose of the Mail on Sunday was vilified for saying that there’s been no global warming for about 16 years, but even the head of the Intergovernmental Panel on Climate Change [IPCC] now admits he’s right. Rose is also excoriated for drawing attention to papers which find that climate sensitivity to carbon dioxide is much lower than thought — as was I when I made the same point in the Wall Street Journal. Yet even the Economist has now conceded this. Tip your hat to Patrick Michaels, then of the University of Virginia, who together with three colleagues published a carefully argued estimate of climate sensitivity in 2002. For having the temerity to say they thought ‘21st-century warming will be modest’, Michaels was ostracised. A campaign began behind the scenes to fire the editor of the journal that published the paper, Chris de Freitas. Yet Michaels’s central estimate of climate sensitivity agrees well with recent studies. Scientists can behave remarkably like priests at times.

What we determined in our 2002 study was that the amount of global warming projected by the end of this century was most likely being overestimated.  When we adjusted the climate model projections to take into account, and better match, the actual observations, our best estimate of the warming expected from 1990 to 2100 was about 1.8°C (3.2°F), which was at the lower end of the IPCC projected range and which, as Ridley correctly noted, we termed “modest.”
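A back-of-envelope check (ours, not the method of the 2002 paper) shows why a “modest” figure like that falls out of the observed warming rate:

```python
# Back-of-envelope (not the paper's method): extrapolate the roughly
# observed warming rate linearly from 1990 to 2100.
obs_trend = 0.17                    # deg C/decade, near the observed rate
decades = (2100 - 1990) / 10        # 11 decades
print(f"{obs_trend * decades:.1f} C of warming, 1990-2100")  # ~1.9 C
```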

Further, we anticipated the slowdown in the warming rate. Quoting from our 2002 paper titled “Revised 21st century temperature projections” (Michaels et al., 2002):

The ‘worst case’ warming now appears to be merely linear, subject to the modifications described in this paper. Furthermore, both Table 1 and Fig. 3 indicate that any exponential rise in atmospheric CO2 concentrations is weak at best. Consequently, the current linear warming may in fact be the adjustment to the exponential growth in CO2 that took place prior to 1975. Levitus et al. (2000) documented a warming of 0.06°C in the top 3 km of a large-area ocean sample over the course of 40 yr. A lag correlation between that deep-water record and the sea-surface temperature record from Quayle et al. (1999) is very suggestive that oceanic thermal lag maximizes around 35 yr (Michaels et al. 2001). Thus, the truly exponential phase of concentration growth in the atmosphere, which ended about 25 yr ago, should induce a linear warming for the next decade or two before it could actually begin to damp.
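The lag-correlation technique mentioned in that passage can be illustrated with synthetic series (this is our sketch, not the Michaels et al. 2001 analysis or its data):

```python
# Illustrative lag-correlation analysis on synthetic series: a "deep ocean"
# record built as a noisy, 35-year-delayed copy of a "surface" record.
import numpy as np

rng = np.random.default_rng(2)
n, true_lag = 150, 35
sst = np.cumsum(rng.normal(0, 0.1, n))            # synthetic surface series
deep = np.concatenate([np.zeros(true_lag), sst[:-true_lag]])
deep += rng.normal(0, 0.05, n)                    # lagged copy plus noise

lags = range(60)
corrs = [np.corrcoef(sst[:n - lag], deep[lag:])[0, 1] for lag in lags]
print("correlation maximizes near lag", int(np.argmax(corrs)), "years")
```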

Now, more than 10 years later, more and more evidence is piling up that we were right, including several recent papers that apply techniques not all that dissimilar in theory from our own (e.g., Gillett et al., 2012; Stott et al., 2013).

So even though we still are largely ostracized, at least we rest assured that we were pretty much on target—and some people are starting to take notice.

References:

Gillett, N. P., V. K. Arora, G. M. Flato, J. F. Scinocca, and K. von Salzen, 2012. Improved constraints on 21st-century warming derived using 160 years of temperature observations. Geophysical Research Letters, 39, L01704.

Michaels, P. J., P. C. Knappenberger, O. W. Frauenfeld, and R. E. Davis, 2002. Revised 21st century temperature projections. Climate Research, 23, 1-9.

Stott, P., P. Good, G. Jones, N. Gillett, and E. Hawkins, 2013. The upper end of climate model temperature projections is inconsistent with past warming. Environmental Research Letters, 8, 014024, doi:10.1088/1748-9326/8/1/014024.