Tag: global temperature

Current Wisdom: Record Global Temperature—Conflicting Reports, Contrasting Implications

The Current Wisdom is a series of monthly articles in which Patrick J. Michaels and Paul C. “Chip” Knappenberger, from Cato’s Center for the Study of Science, review interesting items on global warming in the scientific literature or of a more technical nature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

Despite what you may think if you reside in the eastern United States, the world as a whole in 2014 has been fairly warm. For the past few months, several temperature-tracking agencies have been hinting that this year may turn out to be the “warmest ever recorded”—for whatever that is worth (keep reading for our evaluation). The hints have been turned up a notch with the latest United Nations climate confab taking place in Lima, Peru through December 12.  The mainstream media is happy to popularize these claims (as are government-money-seeking science lobbying groups).

But a closer look shows two things: first, whether or not 2014 will prove to be the record warmest year depends on whom you ask; and second, no matter where the final number for the year ranks in the observations, it will rank among the greatest “busts” of climate model predictions (which collectively expected it to be a lot warmer). The implication of the first is nothing more than a jostling for press coverage. The implication of the second is that future climate change appears to be less of a menace than assumed by the president and his pen and phone.

Let’s examine the various temperature records.

First, a little background. Several different groups compile the global average temperature in near-real time. Each uses slightly different data-handling techniques (such as how to account for missing data), and so each gets a slightly different (but nevertheless very similar) value. Several groups compute the surface temperature, while others calculate the global average temperature in the lower atmosphere (a measure a bit freer from confounding factors like urbanization). All, thus far, have 2014 data compiled only through October, so the final ranking for 2014 is, at this point, only speculation (although pretty well-founded speculation).

The three major groups calculating the average surface temperature of the earth (land and ocean combined) are all currently indicating that 2014 will likely nudge out 2010 (by a couple hundredths of a degree Celsius) to become the warmest year in each dataset (which begin in the mid-to-late 1800s). This is almost certainly true in the datasets maintained by the U.S. National Oceanic and Atmospheric Administration (NOAA) and the UK Met Office Hadley Centre. In the record compiled by NASA’s Goddard Institute for Space Studies (GISS), the 2014 year-to-date value is in a virtual dead heat with the annual value for 2010, so the final ranking will depend heavily on how the data come in for November and December. (The other major data compilation, the one developed by the Berkeley Earth group, is not updated in real time.)

The Current Wisdom: The Short-Term Climate Trend Is Not Your Friend

The Current Wisdom is a series of monthly posts in which Senior Fellow Patrick J. Michaels reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

The Current Wisdom only comments on science appearing in the refereed, peer-reviewed literature, or that has been peer-screened prior to presentation at a scientific congress.

***********

It seems like everyone, from exalted climate scientists to late-night amateur tweeters, can get a bit over-excited about short-term fluctuations, reading into them deep cosmic and political meaning, when they are likely the statistical hiccups of our mathematically surly atmosphere.

There have been some major errors in forecasts of recent trends. Perhaps the most famous were made by NASA’s James Hansen in 1988, who overestimated the warming between then and now by a whopping 40% or so.

But it is easy to get snookered by short-term fluctuations. As Figure 1 shows, there has been virtually no net change in temperature since 1997, allowing for the fact that measurement errors in global average surface temperature are easily a tenth of a degree or more. (The magnitude of those errors will be considered in a future Current Wisdom.)

Figure 1. Annual global average surface temperature anomaly (°C), 1997-2010 (data source: Hadley Center).

Some who are concerned about environmental regulation without good science have seized upon this 13-year stretch as “proof” that there is no such thing as global warming driven by carbon dioxide.  More on that at the end of this Wisdom.

Similarly, periods of seemingly rapid warming can prompt scientists to see changes where there aren’t any.

Consider a landmark paper published in 2000 in Geophysical Research Letters by Tom Karl, a prominent researcher who is the head of our National Climatic Data Center (NCDC) and who just finished a stint as President of the American Meteorological Society. He couldn’t resist the climatic blip that occurred prior to the current stagnation of warming, namely the very warm episode of the late 1990s.

Cooler heads at the time noted that it was an artifact of the great El Nino of 1997-98, a periodic warming of the tropical Pacific that has been coming and going for millions of years. 

Nonetheless, the paper was published and accompanied by a flashy press release titled “Global warming may be accelerating.”  

What Karl did was to examine the 16 consecutive months of record-high temperatures (beginning in May 1997) and to calculate the chance that this could happen, given the fairly pokey warming rate of approximately 0.17°C (0.31°F) per decade that was occurring. He concluded there was less than a five percent probability, unless the warming rate had suddenly increased.
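To make the flavor of that kind of calculation concrete, here is a minimal Monte Carlo sketch of the general approach, not Karl's actual method: simulate a series with a steady 0.17°C-per-decade trend plus random monthly noise and count how often 16 straight record-setting months turn up. The noise level, the series length, and the definition of a "record" (a new all-time high in the series) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions (not from the Karl paper): monthly noise of 0.1 deg C
# around a steady 0.17 deg C/decade trend, over a 50-year monthly series.
TREND_PER_MONTH = 0.17 / 120   # 0.17 deg C per decade, in deg C per month
NOISE_SD = 0.1                 # assumed standard deviation of monthly anomalies
N_MONTHS = 600                 # 50 years of monthly data
N_RUNS = 10_000                # number of simulated series

hits = 0
for _ in range(N_RUNS):
    t = np.arange(N_MONTHS)
    series = TREND_PER_MONTH * t + rng.normal(0.0, NOISE_SD, N_MONTHS)
    # A month counts as a "record" here if it is at least as warm as every month before it.
    records = series == np.maximum.accumulate(series)
    longest = run = 0
    for is_record in records:
        run = run + 1 if is_record else 0
        longest = max(longest, run)
    if longest >= 16:
        hits += 1

print(f"Share of simulated series with 16+ straight record months: {hits / N_RUNS:.4f}")
```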

From the press release:

Karl and colleagues conclude that there is only a small chance that the string of record high temperatures in 1997-98 was simply an unusual event, rather than a change point, the start of a new and faster ongoing trend.

He also gave a number:  “…the probability of observing the record temperatures is more likely with high average rates of warming, around 3°C [5.4°F]/century,” which works out to 0.3°C per decade.

Our Figure 2 shows what was probabilistically forecast beginning in May 1997, and what actually happened. Between then and now, according to this paper, global temperatures should have warmed around 0.4°C (0.7°F). The observed warming rate for the last 13.5 years, which includes the dramatic warming beginning in 1997, was a paltry 0.06°C (0.11°F) per decade.

Figure 2. Prior to mid-1997, the observed warming trend (dashed line) was 0.17°C/decade. Karl said there was a greater than 95% probability that 1997-98 would mark a “change point,” after which warming would accelerate to around 0.30°C/decade. Since then, the rate has been 0.06°C/decade, or 20% of what was forecast.
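A quick back-of-the-envelope check of those numbers, using only the rates stated in the text and an approximate 13.5-year span:

```python
# Rates are taken from the text above; the span length is approximate.
forecast_rate = 0.30   # deg C per decade (the "change point" rate Karl cited)
observed_rate = 0.06   # deg C per decade (observed since mid-1997)
span_decades = 1.35    # roughly May 1997 through late 2010

print(f"Forecast warming over the span: {forecast_rate * span_decades:.2f} deg C")
print(f"Observed warming over the span: {observed_rate * span_decades:.2f} deg C")
print(f"Observed rate as a share of the forecast rate: {observed_rate / forecast_rate:.0%}")
```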

Karl did provide some statistical wiggle room. While noting the less than 5% chance that the warming rate hadn’t increased, he wrote that “unusual events can occur” and that there still was a chance (given as less than 5%) that 1997-98 was just a statistical hiccup, which it ultimately proved to be.

The press release couldn’t resist the “it’s worse than we thought” mindset that pervades climate science:

“Since completing the research, the data for 1999 has been compiled. The researchers found that 1999 was the fifth warmest year on record, although as a La Nina year it would normally be cooler” [than what? –ed.].

“La Nina” is the cool phase of El Nino, which drops temperatures about as much as El Nino raises them. What the press release and the GRL paper completely neglected to mention is that the great warm year of 1998 was a result of the “natural” El Nino superimposed upon the overall slight warming trend.

In other words, there was every reason to believe at that time that the anomalous temperatures were indeed a statistical blip resulting from a very high-amplitude version of a natural oscillation in the earth’s climate that occurred every few years.

Now, back to the last 13 years. The puny recent changes may also just be our atmosphere’s make-up call for the sudden warming of the late 1990s, or another hiccup.

It is characteristic for climate models whose carbon dioxide increase resembles the one being observed to produce constant rates of warming. There’s a good reason for this. Temperature responds logarithmically (i.e., less and less) to changes in this gas as its concentration increases, but the concentration tends to increase exponentially (i.e., more and more). The combination of an increasingly damped response to an ever-increasing input tends to resemble a straight line, that is, a constant rate of warming.
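Here is a minimal numerical sketch of that point, assuming roughly 0.5 percent per year CO2 growth and 2°C of warming per doubling; the values are chosen only to show the shape of the argument, not to match any figure in this article:

```python
import numpy as np

# Assumed, illustrative values: ~0.5%/yr exponential CO2 growth and 2 deg C
# of warming per doubling of CO2. Neither number is taken from the article.
years = np.arange(0, 101)
co2 = 360.0 * np.exp(0.005 * years)      # exponentially rising concentration (ppm)
sensitivity = 2.0                        # deg C per doubling of CO2

# Logarithmic response: delta_T = sensitivity * log2(CO2 / CO2_initial)
delta_t = sensitivity * np.log2(co2 / co2[0])

# Because log2(exp(k*t)) = k*t / ln(2), the response here is exactly linear in
# time; with real, only-roughly-exponential emissions it is merely close to linear.
decadal_rates = np.diff(delta_t)[::10] * 10
print("Warming rate (deg C/decade), sampled each decade:", np.round(decadal_rates, 3))
```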

Indeed, Karl noted in his paper (and I have noted in virtually every public lecture I give) that “projections of temperature change in the next [i.e. the 21st] century, using [the United Nations’] business as usual scenarios…have relatively constant rates of global temperature increase”. It’s just that their constant rates tend to be higher than the one that is being observed. The average rate of warming predicted for this century by the UN is about 2.5°C per century, while the observed rate has been, as predicted, constant, but at a lower value of about 1.7°C per century. As Figure 3 shows, this rate has been remarkably constant for over three decades.


Figure 3. Annual global average surface temperature anomaly (°C), 1976-2010 (data source: Hadley Center).  It’s hard to imagine a more constant trend, despite the 1998 peak and the subsequent torpid warming.

The bottom line is that short-term trends are not your friends when talking about long-term climate change.

References

Hansen, J.E., et al., 1988. Global climate changes as forecast by Goddard Institute for Space Studies three-dimensional model. Journal of Geophysical Research, 93, 9341-9364.

Karl, T. R., R. W. Knight, and B. Baker, 2000. The record breaking global temperatures of 1997 and 1998: Evidence for an increase in the rate of global warming? Geophysical Research Letters, 27, 719-722.

Michaels, P. J., and P. C. Knappenberger, 2009. Scientific Shortcomings in the EPA’s Endangerment Finding from Greenhouse Gases, Cato Journal, 29, 497-521, http://www.cato.org/pubs/journal/cj29n3/cj29n3-8.pdf.


UN Climate Official Steps In It, Then Aside

There are numerous possible reasons for UN climate chief Yvo de Boer’s decision to resign—from his inability to cobble together a new climate treaty last December in Copenhagen (where he wept on the podium), to recent revelations of his agency’s mishandling of climate change data.

What the climate science community and the public should focus on now are the ramifications of de Boer’s resignation. For one thing, it signals that hope is dead for a UN-brokered global treaty that would have any meaningful effect on global temperatures. It also means that the UN intends to keep its Intergovernmental Panel on Climate Change pretty much intact under the leadership of the scientifically compromised Rajendra Pachauri, who should have resigned along with de Boer.

This development guarantees that the Obama administration will have an unmitigated mess on its hands when signatories to the Framework Convention sit down in Mexico City this November in yet another meeting intended to produce a climate treaty.  The Mexico City meeting convenes six days after U.S. midterm elections, in which American voters are fully expected to rebuke Obama for policies including economy-crippling proposals to combat climate change.

In short, Mexico City is about as likely to produce substantive policy decisions as the TV show ‘The View.’  Backers of radical climate change measures are now paying the price for over two decades of telling the public—in this case literally—that the sky is falling.

New Study: Hadley Center and CRU Apparently Cherry-picked Russia’s Climate Data

Yesterday, the Moscow-based Institute of Economic Analysis (IEA), of which I am President, issued a study (in Russian), “How Warming Is Being Made: The Case of Russia.” The report, prepared by IEA director Natalya Pivovarova, suggests that the Hadley Center for Climate Change, based at the headquarters of the British Meteorological Office in Exeter (Devon, England), and the Climatic Research Unit (CRU) of the University of East Anglia in Norwich (England) apparently cherry-picked Russian climate data.

The IEA report shows that Russian meteorological-station data from the last 130 years did not substantiate the rate of warming on Russian territory suggested by the Hadley Centre/Climatic Research Unit Temperature (HadCRUT) database, which has now been partially released.

IEA analysts point out that Russian meteorological stations cover most of the country’s territory, while HadCRUT used data from only 25% of those stations in its calculations. Over 40% of Russian territory was not included in the global temperature calculations even though there was no lack of meteorological stations and observations. The data from stations located in areas not included in the HadCRUT survey often show slight cooling or no substantial warming in the second half of the 20th century and the early 21st century.

The HadCRUT database favors stations with shorter and less complete records that highlight the warming process over stations with longer, uninterrupted records that do not show significant warming. On the whole, HadCRUT specialists use incomplete station records far more often than complete ones. IEA analysts also found that the climatologists used data from stations located in large population centers, which are influenced by the “urban heat effect,” more frequently than unbiased data from stations located in less populated places.

The IEA authors calculated that the scale of actual warming for the Russian territory in 1877-1998 was probably exaggerated by 0.64°C. Since Russia accounts for 12.5% of the world’s land mass, such an exaggeration for Russia alone should have an impact on the IPCC claim that the global temperature in the last century has risen by 0.76°C.
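One rough way to gauge the scale implied by those figures is straight area weighting; this is a sketch, not the IEA's calculation, and the land fraction of the Earth's surface (about 29 percent) is an outside assumption:

```python
# Straight area-weighting sketch of the adjustment implied by the figures above.
# The ~29% land fraction of the Earth's surface is an assumption, not from the report.
russia_bias = 0.64          # deg C, estimated exaggeration for Russian territory
russia_land_share = 0.125   # Russia's share of the world's land area
land_share_of_globe = 0.29  # assumed land fraction of the Earth's surface

land_only_shift = russia_bias * russia_land_share
global_shift = land_only_shift * land_share_of_globe

print(f"Shift in a land-only global average:  ~{land_only_shift:.2f} deg C")
print(f"Shift in a land-plus-ocean average:   ~{global_shift:.3f} deg C")
```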

If similar procedures have been used for processing climate data from other national data sources, the impact on the rate of change in global temperature would be considerable.

The IEA report concludes that it is necessary to recalculate all global temperature data in order to assess the real rate of temperature change during the last century. Global temperature data will have to be modified because the calculations used by the Copenhagen Conference on Climate Change analysts are based on HadCRUT research.