
2013: Will U.S. Temperature Be Below Average?

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”


Last year, the annual average temperature in the contiguous United States was the highest on record (since 1895), according to data compiled by the National Climatic Data Center (NCDC). This year, the temperature took a nosedive from the lofty heights of 2012.

As we pointed out in our coverage of the 2012 milestone, the influence of human-caused climate change on the U.S. temperature history (including last year’s record warmth), while undoubtedly present, is difficult to ascertain.

The role that anthropogenic “global warming” from the emissions of greenhouse gases from the combustion of fossil fuels plays is debatable—both in timing and magnitude. Almost certainly its influence is present and detectable in the U.S. annual average temperature record, but beyond that simple statement, not a whole lot more can be added with scientific certainty.

Now, nearly a year later, we have more evidence in support of that point.

Through November of this year, the U.S. average temperature is only 0.53°F above the 20th century mean temperature (the default baseline used by NCDC). Last year the annual temperature was 3.24°F above it.

Figure 1. Average January-November temperature in the contiguous United States from 1895-2013 as compiled by the National Climatic Data Center (source: NCDC, Climate at a Glance).

With the cold start to December across the country, the annual temperature for 2013 has an increasingly good shot at coming in below the 20th century average.  For this to happen, the U.S. temperature for December would have to average about 27.6°F. For the first 12 days of the month, the average has been 28.4°F, and the forecast is for continued cold, so getting to the needed temperature is not out of the question.
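For those who want to check the arithmetic, here is a back-of-envelope sketch. It assumes the annual value is a simple average of the twelve monthly values (NCDC's actual calculation is area-weighted, so treat this as an approximation):

```python
# Back-of-envelope check (a sketch: assumes the annual figure is a
# simple mean of twelve monthly anomalies).
jan_nov_anomaly = 0.53   # °F above the 20th-century mean, Jan-Nov 2013

# For the annual anomaly to go negative, December must offset eleven
# months of surplus:  11 * 0.53 + dec_anomaly < 0
required_dec_anomaly = -11 * jan_nov_anomaly
print(f"December must average about {abs(required_dec_anomaly):.1f}°F "
      "below its 20th-century mean")   # ~5.8°F

# The ~27.6°F threshold quoted above then implies a December
# climatology of roughly 27.6 + 5.8 = 33.4°F (our inference, not an
# NCDC figure).
```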

If 2013 does come in below the 20th century average, it would be the first year since 1996 to have done so, and would end a 16-year-long run of above-average annual temperatures for the U.S.  You can follow the chase here.

But even if the rest of the month is not quite cold enough to push the entire year into negative territory, the 2013 annual temperature will still be markedly colder than last year’s record high, and the drop will be the largest year-over-year decrease in the annual temperature on record, underscoring the “outlier” nature of the 2012 temperatures.

Will 2013 mark the end of the decade-and-a-half period of abnormal warmth experienced across the U.S. that was touched off by the 1998 El Niño event, and a return to conditions of the 1980s and early-to-mid 1990s? Or will 2013 turn out to be just a cold blip in the 21st century U.S. climate?

In either case, 2013 shows that the natural variability of annual temperatures in the U.S. is high (as is decadal and multi-decadal variability, see Figure 1)—an important caveat to keep in mind when you face the inundation of every-weather-event-is-caused-by-human-global-warming hysteria.

Stay tuned!

The Center for the Study of Science would like to thank Ryan Maue of WeatherBELL Analytics for his summary of December temperatures and the expected temperatures for the rest of the year.

High-profile Paper Linking GMO Corn to Cancer in Rats Retracted

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

 

About a year ago, a major paper appeared in a high-profile scientific journal, Food and Chemical Toxicology, claiming a link between genetically modified corn and cancer in rats. The findings were published by a research team led by Gilles-Éric Séralini of the University of Caen in France. It was widely trumpeted by people opposed to genetically modified organisms (GMOs).

Simply put, making a GMO dramatically accelerates the normally slow process of traditional plant breeding, which takes many generations to stabilize a desired new trait in the plant genome; this makes the philosophical objections to GMOs seem somewhat naïve.

While Séralini’s finding was heralded by anti-GMO activists as an “I told you so,” the paper was promptly, harshly, and widely criticized by geneticists and the general scientific community, many of whom lobbied the journal directly to address the shortcomings in the paper.

The most stinging criticism is going to sound painfully like what we see so often in environmental science, where researchers purposefully design an experiment likely to produce a desired result. Two months ago we documented a similar process that pretty much guaranteed that bisphenol A, the chemical currently the darling of green enragés, would “cause” cancer.

In Séralini’s case, the research team used a strain of rats with a known strong proclivity to develop cancer if allowed to age long enough, which is exactly what the team did, obeying the maxim that “if you let something get old enough, it will get cancer.”

With or Without a “Pause,” Climate Models Still Project Too Much Warming

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

A new paper just hit the scientific literature that argues that the apparent pause in the rise in global average surface temperatures during the past 16 years was really just a slowdown. 

As you may imagine, this paper, by Kevin Cowtan and Robert Way, is being hotly discussed in the global warming blogs, with reaction ranging from a warm embrace by the global-warming-is-going-to-be-bad-for-us crowd to revulsion from the human-activities-have-no-effect-on-the-climate claque.

The lukewarmers (a school we take some credit for establishing) seem to be taking the results in stride.  After all, the “pause,” as curious as it is/was, is not central to the primary argument: yes, human activities are pressuring the planet to warm, but the rate of warming is going to be much slower than is being projected by the collection of global climate models upon which mainstream projections of future climate change, and the resulting climate alarm (i.e., calls for emission regulations, etc.), are based.

Under the adjustments to the observed global temperature history put together by Cowtan and Way, the models fare a bit better than they do with the unadjusted temperature record. That is, the observed temperature trend over the past 34 years (the period of record analyzed by Cowtan and Way) is a tiny bit closer to the average trend from the collection of climate models used in the new report from the U.N.’s Intergovernmental Panel on Climate Change (IPCC) than is the old temperature record.

Specifically, while the trend in observed global temperatures from 1979-2012 as calculated by Cowtan and Way is 0.17°C/decade, it is 0.16°C/decade in the temperature record compiled by the U.K. Hadley Center (the record that Cowtan and Way adjusted).  Because of the sampling errors associated with trend estimation, these values are not significantly different from one another.  Whether the 0.17°C/decade is significantly different from the climate model average simulated trend during that period of 0.23°C/decade is discussed extensively below.
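For the statistically inclined, a minimal sketch of the standard trend calculation is below. The series is synthetic (a stand-in for the real 1979-2012 anomalies), and the naive standard error ignores autocorrelation; accounting for autocorrelation widens the uncertainty further, which is part of why 0.16 and 0.17°C/decade are statistically indistinguishable.

```python
import numpy as np

# Illustrative least-squares trend with a naive 2-sigma sampling error.
# The anomalies are synthetic, standing in for the real 1979-2012 series.
rng = np.random.default_rng(0)
years = np.arange(1979, 2013)
anoms = 0.017 * (years - years[0]) + rng.normal(0, 0.10, years.size)

# Fit T = a + b*year; the slope b is in °C/yr, so x10 gives °C/decade.
b, a = np.polyfit(years, anoms, 1)

# Naive slope standard error (no autocorrelation correction).
resid = anoms - (a + b * years)
se_b = np.sqrt(resid.var(ddof=2) / ((years - years.mean()) ** 2).sum())
print(f"trend = {10 * b:.2f} +/- {10 * 2 * se_b:.2f} °C/decade (2-sigma)")
```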

But, suffice it to say that an insignificant difference of 0.01°C/decade in the global trend measured over more than 30 years is pretty small beer and doesn’t give model apologists very much to get happy over.

Instead, the attention is being deflected to “The Pause”—the leveling off of global surface temperatures during the past 16 years (give or take). Here, the new results from Cowtan and Way show that during the period 1997-2012, instead of a statistically insignificant rise at a rate of 0.05°C/decade as is contained in the “old” temperature record, the rise becomes a statistically significant 0.12°C/decade. “The Pause” is transformed into “The Slowdown” and alarmists rejoice because global warming hasn’t stopped after all. (If the logic sounds backwards, it does to us as well: if you were worried about catastrophic global warming, wouldn’t you rejoice at findings indicating that future climate change will be only modest, rather than at results to the contrary?)

The science behind the new Cowtan and Way research is still being digested by the community of climate scientists and other interested parties alike. The main idea is that the existing compilations of the global average temperature are very data-sparse in the high latitudes. And since the Arctic (more so than the Antarctic) is warming faster than the global average, the lack of data there may mean that the global average temperature trend is being underestimated.

Cowtan and Way developed a methodology that relies on other limited sources of temperature information from the Arctic (such as floating buoys and satellite observations) to estimate how the surface temperature is behaving in regions lacking more traditional temperature observations (the authors released an informative video explaining their research, which may help you better understand what they did). They found that the warming in the data-sparse regions was progressing faster than the global average (especially during the past couple of years) and that when they included the data they derived for these regions in the computation of the global average temperature, the global trend was higher than previously reported. Just how much higher depended on the period over which the trend was calculated: as we showed above, the trend more than doubled over the period 1997-2012, but barely increased at all over the longer period 1979-2012.
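The coverage-bias idea itself can be captured in a few lines. The numbers below are purely illustrative (this is not Cowtan and Way's kriging/hybrid method, just the arithmetic of why missing fast-warming cells biases the global mean low):

```python
# Toy illustration of coverage bias, with made-up numbers.
arctic_frac, arctic_trend = 0.05, 0.50   # hypothetical: 5% of globe, warming fast
rest_frac, rest_trend = 0.95, 0.15       # hypothetical: well-observed remainder

true_global = arctic_frac * arctic_trend + rest_frac * rest_trend
observed_only = rest_trend               # average over observed cells only

print(f"true global trend:   {true_global:.3f} °C/decade")    # 0.168
print(f"observed cells only: {observed_only:.3f} °C/decade")  # 0.150
# The gap is the coverage bias that infilling methods try to remove.
```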

Figure 1 shows the impact on the global average temperature trend for all trend lengths between 10 and 35 years (incorporating our educated guess as to what the 2013 temperature anomaly will be), and compares that to the distribution of climate model simulations of the same periods; a sketch of the calculation follows the figure caption below. Statistically speaking, instead of there being a clear inconsistency (i.e., the observed trend value falls outside of the range which encompasses 95% of all modeled trends) between the observations and the climate model simulations for lengths ranging generally from 11 to 28 years, and a marginal inconsistency (i.e., the observed trend value falls outside of the range which encompasses 90% of all modeled trends) for most of the other lengths, the observations now track closely along the marginal inconsistency line, although trends of length 17, 19, 20, and 21 years remain clearly inconsistent with the collection of modeled trends. Still, throughout the entirety of the 35-yr period (ending in 2013), the observed trend lies far below the model average simulated trend (additional information on the impact of the new Cowtan and Way adjustments on modeled/observed temperature comparisons can be found here).

 

Figure 1. Temperature trends ranging in length from 10 to 35 years (ending in a preliminary 2013) calculated using the data from the U.K. Hadley Center (blue dots), the adjustments to the U.K. Hadley Center data made by Cowtan and Way (red dots) extrapolated through 2013, and the average of climate model simulations (black dots). The range that encompasses 90% (light grey lines) and 95% (dotted black lines) of climate model trends is also included.
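The calculation behind Figure 1 looks roughly like the following sketch. Synthetic placeholder series stand in for both the observations and the model runs; the real version uses the Hadley Center/Cowtan and Way anomalies and the IPCC's collection of model simulations.

```python
import numpy as np

# Sketch of the Figure 1 calculation (synthetic placeholder series).
rng = np.random.default_rng(1)
years = np.arange(1950, 2014)                 # all trends end in 2013
obs = 0.012 * (years - 1950) + rng.normal(0, 0.10, years.size)
models = [0.023 * (years - 1950) + rng.normal(0, 0.10, years.size)
          for _ in range(30)]                 # 30 fake model runs

for length in range(10, 36):
    yrs = years[-length:]
    obs_trend = 10 * np.polyfit(yrs, obs[-length:], 1)[0]   # °C/decade
    model_trends = np.array(
        [10 * np.polyfit(yrs, m[-length:], 1)[0] for m in models])
    lo, hi = np.percentile(model_trends, [2.5, 97.5])       # 95% range
    note = "" if lo <= obs_trend <= hi else "  <- outside 95% of models"
    print(f"{length:2d}-yr trend: obs {obs_trend:+.2f}, "
          f"models [{lo:+.2f}, {hi:+.2f}]{note}")
```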

The Cowtan and Way analysis is an attempt at using additional types of temperature information, or extracting “information” from records that have already told their stories, to fill in the missing data in the Arctic.  There are concerns about the appropriateness of both the data sources and the methodologies applied to them.  

A major one is in the applicability of satellite data at such high latitudes.   The nature of the satellite’s orbit forces it to look “sideways” in order to sample polar regions.  In fact, the orbit is such that the highest latitude areas cannot be seen at all.  This is compounded by the fact that cold regions can develop substantial “inversions” of near-ground temperature, in which temperature actually rises with height such that there is not a straightforward relationship between the surface temperature and the temperature of the lower atmosphere where the satellites measure the temperature. If the nature of this complex relationship is not constant in time, an error is introduced into the Cowtan and Way analysis.

Another unresolved problem comes up when extrapolating land-based weather station data far into the Arctic Ocean.  While land temperatures can bounce around a lot, much of the ocean is partially ice-covered for many months.  Under “well-mixed” conditions, the ice constrains the near-surface temperature to values near the freezing point of salt water, whether or not the associated land station is much warmer or colder.

You can run this experiment yourself by filling a glass with a mix of ice and water and making sure it is well mixed.  The water surface temperature must hover near 32°F until all the ice melts.  Given that the near-surface air temperature stays close to the water temperature, the limitations of extrapolated land data become obvious.

Considering all of the above, we advise caution with regard to Cowtan and Way’s findings.  While adding high-Arctic data should increase the observed trend, the nature of the data means that the amount of additional rise is subject to further revision.  As they themselves note, there’s quite a bit more work to be done in this area.

In the meantime, their results have tentatively breathed a small hint of life back into the climate models, basically buying them a bit more time—time for either the observed temperatures to start rising rapidly as current models expect, or time for the modelers to try to fix/improve the cloud processes, oceanic processes, and other processes of variability (both natural and anthropogenic) that lie behind what would otherwise be clearly overheated projections.

We’ve also taken a look at how “sensitive” the results are to the length of the ongoing pause/slowdown.  Our educated guess is that the “bit” of time that the Cowtan and Way findings bought the models is only a few years long, and it is a fact, not a guess, that each additional year at the current rate of lukewarming increases the disconnect between the models and reality.

 

Reference:

Cowtan, K., and R. G. Way, 2013. Coverage bias in the HadCRUT4 temperature series and its impact on recent temperature trends. Quarterly Journal of the Royal Meteorological Society, doi: 10.1002/qj.2297.

 

Thanks to Natural Gas and Climate Change, U.S. Carbon Dioxide Emissions Continue Downward Trend

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

Carbon dioxide emissions in the United States from the production and consumption of energy have been on the decline since about 2005, after generally being on the rise ever since our country was first founded.

The decline in emissions from 2011 to 2012 was 3.8 percent, which, according to the Energy Information Administration (EIA), was the largest decline in a non-recession year since 1990 and the first time that carbon dioxide (CO2) emissions fell while per capita economic output increased by more than 2 percent.  In other words, we are producing more while emitting less carbon dioxide.
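A quick back-of-envelope on what those two figures imply for emissions intensity (a sketch; the EIA's "more than 2 percent" figure is per capita output, so this treats population as roughly flat over the single year):

```python
# Rough implied change in CO2 emitted per unit of output.
emissions_change = -0.038   # CO2 emissions, 2011 -> 2012 (EIA)
output_change = 0.02        # per capita economic output (lower bound)

intensity_change = (1 + emissions_change) / (1 + output_change) - 1
print(f"CO2 emitted per unit of output fell roughly {-intensity_change:.1%}")
# ~5.7%: producing more while emitting less.
```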

 

Just in Time for Halloween Come Some Scary Global Warming Predictions

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

Global warming beat writer Justin Gillis of the New York Times had an article yesterday describing a new paper in the current issue of Nature magazine, the point of which seems to be scaring people with alarming global warming statistics.

Gillis’ article “By 2047, Coldest Years May Be Warmer Than Hottest in Past,” describes the results of a class-project-cum-Nature-article headed by Camilo Mora from the University of Hawaii at Manoa (please, no puns). The class assignment was to identify the year for each spot on the globe in which all future years were, according to climate model projections, warmer as a result of greenhouse gas emissions than the warmest year simulated by the models during the historical period 1860 to 2005. Mora and students termed this pivotal year the “climate departure.”
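The "climate departure" year has a simple operational definition, sketched below: for each location, find the first year after which every projected annual temperature exceeds the 1860-2005 simulated maximum. The warming series here is synthetic, for illustration only.

```python
import numpy as np

# Sketch of the "climate departure" definition: the first year after
# which every projected annual temperature exceeds the 1860-2005
# simulated maximum (synthetic series for illustration).
def climate_departure(years, temps, hist_end=2005):
    hist_max = temps[years <= hist_end].max()
    for i in np.flatnonzero(years > hist_end):
        if temps[i:].min() > hist_max:   # every remaining year is warmer
            return int(years[i])
    return None                          # never departs in this projection

rng = np.random.default_rng(2)
years = np.arange(1860, 2101)
temps = 0.00012 * (years - 1860) ** 2 + rng.normal(0, 0.15, years.size)
print(climate_departure(years, temps))   # e.g., some mid-century year
```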

This work is significant, according to Gillis, because:

Thousands of scientific papers have been published about the model results, but the students identified one area of analysis that was missing. The results are usually reported as average temperature changes across the planet. But that gives little sense of how the temperature changes in specific places might compare with historical norms. “We wanted to give people a really relatable way to understand climate,” said Abby G. Frazier, a doctoral candidate in geography.

Perhaps Dr. Mora should have injected a little climate-science history in this class.

Looking at the time it takes for a human climate signal to rise above the background noise is not a particularly novel concept. It’s commonplace. We would guess that a signal-to-noise ratio was probably present in the first papers describing the performance and output of the very first climate models.

After all, without such information it is impossible to put absolute changes in perspective.  Some measure of the statistical significance of climate change has been present in every climate assessment report from the U.N. Intergovernmental Panel on Climate Change dating back to 1990.

In our presentation to the Science Policy Conference of the American Geophysical Union this summer, we even included a table listing the number of years into the future it would be before projected changes in precipitation across the U.S. rose above the level of natural variability. We guess we just didn’t give that year a catchy enough name like “climate departure,” because our results didn’t capture the attention of the press (nor were they very frightening).
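One simple back-of-envelope version of that calculation (ours; individual papers use fancier metrics): if the signal is a linear warming at rate \(b\) and the interannual noise has standard deviation \(\sigma\), the signal climbs out of the two-sigma noise band after roughly

\[ t_{\mathrm{emerge}} \approx \frac{2\sigma}{b} \]

so, for example, a 0.2°C/decade trend working against 0.15°C of year-to-year noise takes on the order of 15 years to emerge.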

But Gillis does manage to carve some new, scary Jack-o-Lanterns from the Mora study.

Here is his lead paragraph:

If greenhouse emissions continue their steady escalation, temperatures across most of the earth will rise to levels with no recorded precedent by the middle of this century, researchers said Wednesday.

Uh, correct us if we are wrong, but we already thought that global temperatures were reported to be at unprecedented levels in recorded history. According to the IPCC’s Fifth Assessment Report:

Each of the last three decades has been successively warmer at the Earth’s surface than any preceding decade since 1850.

So, is this recycled news, or is the new paper saying that we have to wait until 2047 for that to happen? Well, whatever, it sounds B-A-D.

Or how about this one:

“Go back in your life to think about the hottest, most traumatic event you have experienced,” Dr. Mora said in an interview. “What we’re saying is that very soon, that event is going to become the norm.”

Hot Tub Time Machine came immediately to mind, but Gillis provided another scenario:

With the technique the Mora group used, it is possible to specify climate departure dates for individual cities. Under high emissions, climate departure for New York City will come in 2047, the paper found, plus or minus the five-year margin of error.

How scared should you be about passing the date of “climate departure”?

Not at all.

What the New IPCC Global Warming Projections Should Have Looked Like

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

The United Nations’ Intergovernmental Panel on Climate Change (IPCC) released its Fifth Assessment Report (AR5) last week to fanfare and stinging criticism.

Most of the criticism was aimed at the IPCC’s defense of climate models—models that the latest observations of the earth’s climate evolution show, or at least strongly suggest, to be inaccurate.

There are two prominent and undeniable examples of the models’ insufficiencies: 1) climate models overwhelmingly expected much more warming to have taken place over the past several decades than actually occurred; and 2) the sensitivity of the earth’s average temperature to increases in atmospheric greenhouse gas concentrations (such as carbon dioxide) averages some 60 percent greater in the IPCC’s climate models than it does in reality (according to a large and growing collection of evidence published in the scientific literature).

Had the IPCC addressed these model shortcomings head on, the flavor of their entire report would have been different. Instead of including projections for extreme climate changes as a result of continued human emissions of greenhouse gases resulting from our production of energy, the high-end projections would have featured relatively modest changes and the low-end projections would have been completely unremarkable.

Since changes in the earth’s temperature scale approximately linearly with a property known as the earth’s equilibrium climate sensitivity (how much the earth’s average surface temperature rises as a result of a doubling of the atmosphere’s carbon dioxide concentration), it is pretty straightforward to adjust the IPCC’s projections of future temperature change to bring them closer to what the latest science says the climate sensitivity is. That science suggests the equilibrium climate sensitivity probably lies between 1.5°C and 2.5°C (with an average value of 2.0°C), while the climate models used by the IPCC have climate sensitivities which range from 2.1°C to 4.7°C with an average value of 3.2°C.

To make the IPCC projections of the evolution of the earth’s average temperature better reflect the latest scientific estimates of the climate sensitivity, it is necessary to adjust them downward by about 30% at the low end, about 50% at the high end, and about 40% in the middle.
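Those percentages follow directly from the linear-scaling assumption: each adjustment factor is simply the ratio of the observed sensitivity to the modeled one. A quick sketch using the numbers above:

```python
# Adjustment factors implied by linear scaling with climate sensitivity:
# scale = (observation-based sensitivity) / (model sensitivity).
pairs = {
    "low end":  (1.5, 2.1),   # observed vs. model, °C per CO2 doubling
    "middle":   (2.0, 3.2),
    "high end": (2.5, 4.7),
}
for name, (obs, model) in pairs.items():
    print(f"{name}: scale projections by {obs / model:.2f} "
          f"(about a {1 - obs / model:.0%} downward adjustment)")
# low end ~29%, middle ~38%, high end ~47%: roughly the 30/40/50%
# figures quoted above.
```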

The figure below the jump shows what happens when we apply such a correction (note: we maintain some internal weather noise). The top panel shows the projections as portrayed by the IPCC in their just-released Fifth Assessment Report, and the lower panel shows what they pretty much would have looked like had the climate models better reflected the latest science. In other words, the lower panel is what the IPCC temperature projections should have looked like.

 

New IPCC Report Will Be Internally Inconsistent and Misleading

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

The United Nations’ Intergovernmental Panel on Climate Change (IPCC) seems more intent on maintaining the crumbling “consensus” on anthropogenic global warming than on following climate science to its logical conclusion—a conclusion that increasingly suggests that human greenhouse gas emissions are less important in driving climate change than commonly held.

This fact is obvious from the embarrassing lack of internal consistency in the leaked versions of the IPCC’s Fifth Assessment Report. The Summary for Policymakers, a brief document supposedly encapsulating what is in the entire 3,000-page report, is supposed to be approved by closing time on Friday at a meeting currently taking place in Stockholm.

In no place will this internal inconsistency be more obvious than in how the IPCC deals with the discrepancy between the observed effectiveness of greenhouse gases in warming the earth and that effectiveness as calculated by the climate models the IPCC uses to project future climate change.

This warming effectiveness is known as the “climate sensitivity,” and it is the key parameter governing how much the earth’s surface temperature rises as a result of the increasing atmospheric concentration of carbon dioxide and other greenhouse gases. Almost all climate impacts are related to the climate sensitivity—the lower the climate sensitivity, the fewer the impacts.

One problem. Climate scientists don’t know what the value of the climate sensitivity really is.

Not because the calculation is complicated—just take how much the global average temperature has changed over some longish time period (a couple of decades or longer) and divide by how much energy (radiative forcing) was used to force that change.
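In symbols, the back-of-envelope version of that calculation looks like this (a sketch; published estimates also account for the heat taken up by the oceans and for uncertainty in the forcing):

\[
\lambda = \frac{\Delta T}{\Delta F}, \qquad
\mathrm{ECS} \approx \lambda \cdot F_{2\times\mathrm{CO_2}},
\qquad F_{2\times\mathrm{CO_2}} \approx 3.7\ \mathrm{W\,m^{-2}},
\]

where \(\Delta T\) is the observed change in global average temperature and \(\Delta F\) the corresponding change in radiative forcing over the same period.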