Topic: Energy and Environment

2013: Will U.S. Temperature Be Below Average?

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”


Last year, the annual average temperature in the contiguous United States was the highest on record (since 1895), according to data compiled by the National Climatic Data Center (NCDC). This year, the temperature took a nosedive from the lofty heights of 2012.

As we pointed out in our coverage of the 2012 milestone, the influence of human-caused climate change on the U.S. temperature history (including last year’s record warmth), while undoubtedly present, is difficult to ascertain.

The role that anthropogenic “global warming” from the emissions of greenhouse gases from the combustion of fossil fuels plays is debatable—both in timing and magnitude. Almost certainly its influence is present and detectable in the U.S. annual average temperature record, but beyond that simple statement, not a whole lot more can be added with scientific certainty.

Now, nearly a year later, we have a case in point.

Through November of this year, the U.S. average temperature is only 0.53°F above the 20th century mean temperature (the default baseline used by NCDC). Last year the annual temperature was 3.24°F above it.

Figure 1. Average January-November temperature in the contiguous United States from 1895-2013 as compiled by the National Climatic Data Center (source: NCDC, Climate at a Glance).

With the cold start to December across the country, the annual temperature for 2013 has an increasingly good shot at coming in below the 20th century average.  For this to happen, the U.S. temperature for December would have to average about 27.6°F. For the first 12 days of the month, the average has been 28.4°F, and the forecast is for continued cold, so getting to the needed temperature is not out of the question.
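For readers who want to check the arithmetic, here is a minimal sketch (in Python) of how that roughly 27.6°F figure can be reproduced. The +0.53°F January-November anomaly comes from the text; treating the annual mean as a day-weighted average of monthly anomalies, and the assumed December climatological normal of about 33.3°F, are our simplifications for illustration only.

    # Minimal sketch: how cold must December 2013 be for the annual CONUS mean
    # to come in at or below the 20th-century average?
    # The +0.53 F Jan-Nov anomaly is from the text; the day-weighting and the
    # assumed December normal are simplifying assumptions for illustration.

    JAN_NOV_ANOMALY_F = 0.53   # observed Jan-Nov 2013 anomaly (from the text)
    DAYS_JAN_NOV = 334         # days in January through November
    DAYS_DEC = 31

    # For the annual anomaly to reach zero, the day-weighted anomalies must cancel:
    #   (334/365) * 0.53 + (31/365) * dec_anomaly = 0
    required_dec_anomaly = -JAN_NOV_ANOMALY_F * DAYS_JAN_NOV / DAYS_DEC
    print(f"Required December anomaly: {required_dec_anomaly:+.1f} F")  # about -5.7 F

    # With an assumed 20th-century December normal of about 33.3 F (hypothetical),
    # the required December average works out to roughly the 27.6 F cited above.
    ASSUMED_DEC_NORMAL_F = 33.3
    print(f"Required December average: {ASSUMED_DEC_NORMAL_F + required_dec_anomaly:.1f} F")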

If 2013 does come in below the 20th century average, it would be the first year since 1996 to have done so, and would end a 16-year long run of above average annual temperature for the U.S.  You can follow the chase here.

But even if the rest of the month is not quite cold enough to push the entire year into negative territory, the 2013 annual temperature will still be markedly colder than last year’s record high, and the drop will be the largest year-over-year decrease in the annual temperature on record, underscoring the “outlier” nature of the 2012 temperatures.

Will 2013 mark the end of the decade-and-a-half period of abnormal warmth experienced across the U.S. that was touched off by the 1998 El Niño event, and a return to conditions of the 1980s and early-to-mid 1990s? Or will 2013 turn out to be just a cold blip in the 21st century U.S. climate?

In either case, 2013 shows that the natural variability of annual temperatures in the U.S. is high (as is decadal and multi-decadal variability, see Figure 1)—an important caveat to keep in mind when you face the inundation of every-weather-event-is-caused-by-human-global-warming hysteria.

Stay tuned!

The Center for the Study of Science would like to thank Ryan Maue of WeatherBELL Analytics for his summary of December temperatures and the expected temperatures for the rest of the year.

High-profile Paper Linking GMO Corn to Cancer in Rats Retracted

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

 

About a year ago, a major paper appeared in a high-profile scientific journal, Food and Chemical Toxicology, claiming a link between genetically modified corn and cancer in rats. The findings were published by a research team led by Gilles-Éric Séralini of the University of Caen in France. It was widely trumpeted by people opposed to genetically modified organisms (GMOs).

Simply put, creating a GMO dramatically accelerates the normally slow process of traditional plant breeding, which takes many generations to stabilize a desired new trait in the plant genome; that shortcut makes the philosophical objections to the technology seem somewhat naïve.

While Séralini’s finding was heralded by anti-GMO activists as an “I told you so,” the paper was promptly, harshly, and widely criticized by geneticists and the general scientific community, many of whom lobbied the journal directly to address the shortcomings in the paper.

The most stinging criticism is going to sound painfully like what we see so often in environmental science, where researchers purposefully design an experiment likely to produce a desired result. Two months ago we documented a similar process that pretty much guaranteed that the chemical currently the darling of green outrage, bisphenol-A, would “cause” cancer.

In Séralini’s case, the research team used a strain of rats with a known strong proclivity to develop cancer if allowed to age long enough, which is exactly what the researchers did, obeying the maxim that “if you let something get old enough, it will get cancer.”

Government Planning in Indiana with Federal Funds

According to popular myth, Democrats favor government planning of the economy and Republicans favor free markets. Today’s example of why this is baloney comes from the Republican governor of Indiana, Mike Pence. Before I get to the story, readers should know up front that I was a state budget official (2006-2008) in the prior administration of Gov. Mitch Daniels (R). 

Yesterday, the Indiana Department of Energy Development announced that it will be “crafting a new energy plan for the state of Indiana.” Well, praise the Lord – the state’s energy planners are going to work with “stakeholders” to make sure Hoosiers won’t be forced to turn to whale oil lamps. No, seriously, Indiana is in trouble. According to the announcement, that’s because the state’s current plan apparently just hasn’t panned out: 

Indiana’s current energy plan, the Homegrown Energy Plan, was written in 2006. Since that time, Indiana’s cost of electricity for industrial customers has increased, causing Indiana to slip from 5th lowest in the country to 27th lowest. 

Oops. 

Okay, a new vision is clearly needed. Enter former radio host Gov. Mike Pence: 

“Here in Indiana, we make things, and we grow things,” said Governor Mike Pence. “These activities require enormous amounts of energy. In order to maintain our historic advantage for low cost of energy, we need a new, updated energy plan.” 

Whoa – that’s deep. Think about what Pence is saying: Hoosiers make things…Hoosiers grow things. Only a cold-hearted cynic doesn’t feel a tingle after contemplating such profound insights. 

As the saying goes, great leaders surround themselves with great people. Heading up the state’s development of a new energy plan is my former colleague, Tristan Vance. According to a press release announcing Vance’s reappointment, he has extensive experience working in state government. There’s no mention of Vance having real world experience in the energy sector that he’s now in charge of planning, but he did monitor the agency as a state budget official prior to heading it. 

Eh, close enough. 

Snark aside, there’s a deeper policy concern here that affects taxpayers in all states. Much of the Indiana Department of Energy Development’s funding comes from the federal government (about 70 percent if my reading of state budget numbers is correct). That means, dear federal taxpayers, you’ll be subsidizing the bulk of whatever “plan” the Pence administration comes up with.   

Now as I noted in an Indianapolis Star op-ed back in June, Indiana’s dependence on federal funds isn’t unique. Indeed, the other 49 states are similarly dependent on handouts from Uncle Sam. But state taxpayers should understand that federal funds are not a “free” lunch: 

The appeal of federal funds to governors is obvious: They get to spend additional money without having to raise taxes on their voters to pay for it. A problem with this arrangement is that it creates a fiscal illusion — state taxpayers perceive the cost of government to be cheaper than it really is. In effect, the federal money and a large part of the annual budget appears to be “free.”

But Hoosiers should be mindful that every dollar Washington sends to Indianapolis is a dollar taken from taxpayers in Indiana and the other states. (The return is actually less than a dollar since the federal bureaucracy takes its cut). The situation is no different when the federal dollars go instead to, say, Sacramento. In addition, economists have found that federal subsidies to the states lead to higher state taxes and spending in the long-run because the federal “seed money” creates a demand for more government.

One could argue that so long as Hoosiers have to send money to Washington, Indiana might as well get a share of the loot. That’s an understandable sentiment, but the blatantly self-serving manner in which the Pence administration goes about distributing the bounty should give Hoosiers pause.

Indeed, the self-serving manner in which the nation’s governors go about playing with federal funds should give all taxpayers pause. 

Major Sports Organizations Discuss Climate Change with Bicameral Task Force

Seriously?!?

Tomorrow [today] Rep. Henry A. Waxman and Sen. Sheldon Whitehouse, co-chairs of the Bicameral Task Force on Climate Change, will host representatives from five of America’s major sports leagues, as well as the U.S. Olympic Committee (USOC), to discuss the effects of climate change on sporting activities and the work these organizations are doing to reduce their greenhouse gas (GHG) emissions.  The group will meet for a closed-door discussion, followed by a press availability.

Now, admittedly, even as a climatologist, I do spend a fair amount of time discussing sports.

But I do so around the water cooler or at the local bar, not with Congressional task forces.

Your tax dollars are probably better served that way.

With or Without a “Pause” Climate Models Still Project Too Much Warming

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

A new paper just hit the scientific literature that argues that the apparent pause in the rise in global average surface temperatures during the past 16 years was really just a slowdown. 

As you may imagine, this paper, by Kevin Cowtan and Robert Way, is being hotly discussed in the global warming blogs, with reaction ranging from a warm embrace by the global-warming-is-going-to-be-bad-for-us crowd to revulsion from the human-activities-have-no-effect-on-the-climate claque.

The lukewarmers (a school we take some credit for establishing) seem to be taking the results in stride.  After all, the “pause,” curious as it is/was, is not central to the primary argument: yes, human activities are pressuring the planet to warm, but the rate of warming is going to be much slower than that projected by the collection of global climate models upon which mainstream projections of future climate change (and the resulting climate alarm, i.e., calls for emission regulations, etc.) are based.

Under the adjustments to the observed global temperature history put together by Cowtan and Way, the models fare a bit better than they do with the unadjusted temperature record. That is, the observed temperature trend over the past 34 years (the period of record analyzed by Cowtan and Way) is a tiny bit closer to the average trend from the collection of climate models used in the new report from the U.N.’s Intergovernmental Panel on Climate Change (IPCC) than is the old temperature record.

Specifically, while the trend in observed global temperatures from 1979-2012 as calculated by Cowtan and Way is 0.17°C/decade, it is 0.16°C/decade in the temperature record compiled by the U.K. Hadley Center (the record that Cowtan and Way adjusted).  Because of the sampling errors associated with trend estimation, these values are not significantly different from one another.  Whether the 0.17°C/decade is significantly different from the climate model average simulated trend during that period of 0.23°C/decade is discussed extensively below.
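To make the “not significantly different” language concrete, here is a minimal sketch of how such a trend and its sampling uncertainty are estimated from annual anomalies with ordinary least squares. The anomaly series below is synthetic, generated only for illustration; it is not the HadCRUT4 or Cowtan and Way data.

    # Minimal sketch: estimating a temperature trend (deg C per decade) and its
    # sampling uncertainty with ordinary least squares. The anomaly series is
    # synthetic, for illustration only.
    import numpy as np

    rng = np.random.default_rng(0)
    years = np.arange(1979, 2013)          # 1979-2012, as in the comparison above
    true_trend = 0.016                     # deg C per year (0.16 C/decade), assumed
    anomalies = true_trend * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

    # OLS slope and its standard error
    n = years.size
    x = years - years.mean()
    slope = np.sum(x * (anomalies - anomalies.mean())) / np.sum(x**2)
    residuals = anomalies - (anomalies.mean() + slope * x)
    se_slope = np.sqrt(np.sum(residuals**2) / (n - 2) / np.sum(x**2))

    print(f"Trend: {slope*10:.2f} +/- {2*se_slope*10:.2f} C/decade (approx. 95% range)")

Two observed trends (say, 0.16 and 0.17°C/decade) whose uncertainty ranges overlap almost entirely are statistically indistinguishable. In practice the residuals are also autocorrelated, which widens the uncertainty further; the sketch ignores that complication for brevity.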

But, suffice it to say that an insignificant difference of 0.01°C/decade in the global trend measured over more than 30 years is pretty small beer and doesn’t give model apologists very much to get happy over.

Instead, the attention is being deflected to “The Pause”—the leveling off of global surface temperatures during the past 16 years (give or take). Here, the new results from Cowtan and Way show that during the period 1997-2012, instead of a statistically insignificant rise at a rate of 0.05°C/decade, as contained in the “old” temperature record, the rise becomes a statistically significant 0.12°C/decade. “The Pause” is transformed into “The Slowdown,” and alarmists rejoice because global warming hasn’t stopped after all. (If that logic sounds backwards, it does to us as well: if you were worried about catastrophic global warming, wouldn’t you rejoice at findings indicating that future climate change will be only modest, rather than at results to the contrary?)

The science behind the new Cowtan and Way research is still being digested by the community of climate scientists and other interested parties alike. The main idea is that the existing compilations of the global average temperature are very data-sparse in the high latitudes. And since the Arctic (more so than the Antarctic) is warming faster than the global average, the lack of data there may mean that the global average temperature trend is being underestimated.

Cowtan and Way developed a methodology that relies on other, more limited sources of temperature information from the Arctic (such as floating buoys and satellite observations) to estimate how the surface temperature is behaving in regions lacking more traditional temperature observations (the authors released an informative video explaining their research which may help you better understand what they did). They found that the warming in the data-sparse regions was progressing faster than the global average (especially during the past couple of years), and that when they included the data they derived for these regions in the computation of the global average temperature, the global trend came out higher than previously reported—just how much higher depends on the period over which the trend is calculated. As we noted above, the trend more than doubled over the period 1997-2012, but barely increased at all over the longer period 1979-2012.
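The coverage-bias idea itself can be illustrated with a deliberately crude toy calculation. This is not Cowtan and Way’s kriging/hybrid method; it is just a sketch showing that if a fast-warming region is left out of an area-weighted average, the computed global trend comes out low. All of the numbers below are made up.

    # Toy illustration of coverage bias (NOT the Cowtan and Way method): if the
    # Arctic warms faster than the rest of the globe and is missing from the
    # average, the computed global trend is biased low.
    import numpy as np

    lats = np.linspace(-87.5, 87.5, 36)        # 5-degree latitude bands
    weights = np.cos(np.deg2rad(lats))         # area weights by latitude
    years = np.arange(1997, 2013)              # the 1997-2012 "pause" period

    # Made-up warming rates: 0.1 C/decade everywhere, 0.5 C/decade poleward of 70N
    rates = np.where(lats > 70, 0.05, 0.01)    # deg C per year
    anoms = rates[:, None] * (years - years[0])[None, :]

    def global_mean(anoms, mask):
        w = weights * mask
        return (w[:, None] * anoms).sum(axis=0) / w.sum()

    full = global_mean(anoms, np.ones_like(lats))             # full coverage
    partial = global_mean(anoms, (lats <= 70).astype(float))  # Arctic missing

    trend = lambda series: np.polyfit(years, series, 1)[0] * 10   # C/decade
    print(f"Trend with Arctic included: {trend(full):.3f} C/decade")
    print(f"Trend with Arctic missing:  {trend(partial):.3f} C/decade")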

Figure 1 shows the impact on the global average temperature trend for all trend lengths between 10 and 35 years (incorporating our educated guess as to what the 2013 temperature anomaly will be), and compares that to the distribution of climate model simulations of the same periods. Statistically speaking, instead of there being a clear inconsistency (i.e., the observed trend value falls outside of the range that encompasses 95% of all modeled trends) between the observations and the climate model simulations for lengths ranging generally from 11 to 28 years, and a marginal inconsistency (i.e., the observed trend value falls outside of the range that encompasses 90% of all modeled trends) for most of the other lengths, the observations now track closely along the marginal inconsistency line, although trends of length 17, 19, 20, and 21 years remain clearly inconsistent with the collection of modeled trends. Still, throughout the entirety of the 35-yr period (ending in 2013), the observed trend lies far below the model average simulated trend (additional information on the impact of the new Cowtan and Way adjustments on the modeled/observed temperature comparison can be found here).

 

Figure 1. Temperature trends ranging in length from 10 to 35 years (ending in a preliminary 2013) calculated using the data from the U.K. Hadley Center (blue dots), the adjustments to the U.K. Hadley Center data made by Cowtan and Way (red dots) extrapolated through 2013, and the average of climate model simulations (black dots). The range that encompasses 90% (light grey lines) and 95% (dotted black lines) of climate model trends is also included.
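For the technically inclined, the windowing behind Figure 1 looks something like the sketch below: compute the trailing trend for every length from 10 to 35 years, all ending in the same (preliminary) 2013 value. The anomaly series here is a synthetic placeholder; in the actual figure the same windowing is applied to the observations and to each climate model run, and the percentiles of the modeled trends at each length define the plotted envelopes.

    # Minimal sketch of the Figure 1 windowing: trailing trends of length 10-35
    # years, all ending in 2013. The anomaly series is synthetic, for illustration.
    import numpy as np

    rng = np.random.default_rng(1)
    end_year = 2013
    years = np.arange(1950, end_year + 1)
    anoms = 0.015 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)  # placeholder

    for length in range(10, 36):
        window = slice(years.size - length, years.size)
        slope_per_decade = np.polyfit(years[window], anoms[window], 1)[0] * 10
        print(f"{length:2d}-yr trend ending {end_year}: {slope_per_decade:+.2f} C/decade")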

The Cowtan and Way analysis is an attempt at using additional types of temperature information, or extracting “information” from records that have already told their stories, to fill in the missing data in the Arctic.  There are concerns about the appropriateness of both the data sources and the methodologies applied to them.  

A major one is the applicability of satellite data at such high latitudes.  The nature of the satellite’s orbit forces it to look “sideways” in order to sample polar regions.  In fact, the orbit is such that the highest-latitude areas cannot be seen at all.  This is compounded by the fact that cold regions can develop substantial “inversions” of near-ground temperature, in which temperature actually rises with height, so that there is no straightforward relationship between the surface temperature and the temperature of the lower atmosphere, which is what the satellites actually measure. If the nature of this complex relationship is not constant in time, an error is introduced into the Cowtan and Way analysis.

Another unresolved problem comes up when extrapolating land-based weather station data far into the Arctic Ocean.  While land temperatures can bounce around a lot, much of the Arctic Ocean is at least partially ice-covered for many months.  Under “well-mixed” conditions, this constrains the near-surface temperature to values near the freezing point of salt water, whether or not the associated land station is much warmer or colder.

You can run this experiment yourself by filling a glass with a mix of ice and water and then making sure it is well mixed.  The water surface temperature must hover around 32°F until all the ice melts.  Given that the near-surface air temperature stays close to the water temperature, the limitations of extrapolating from land data become obvious.

Considering all of the above, we advise caution with regard to Cowtan and Way’s findings.  While adding high-Arctic data should increase the observed trend, the nature of the data means that the amount of additional rise is subject to further revision.  As they themselves note, there’s quite a bit more work to be done in this area.

In the meantime, their results have tentatively breathed a small hint of life back into the climate models, basically buying them a bit more time—time either for the observed temperatures to start rising rapidly, as current models expect, or for the modelers to try to fix/improve the cloud processes, oceanic processes, and other processes of variability (both natural and anthropogenic) that lie behind what otherwise appear to be clearly overheated projections.

We’ve also taken a look at how “sensitive” the results are to the length of the ongoing pause/slowdown.  Our educated guess is that the “bit” of time that the Cowtan and Way findings bought the models is only a few years long, and it is a fact, not a guess, that each additional year at the current rate of lukewarming increases the disconnection between the models and reality.

 

Reference:

Cowtan, K., and R. G. Way, 2013. Coverage bias in the HadCRUT4 temperature series and its impact on recent temperature trends. Quarterly Journal of the Royal Meteorological Society, doi: 10.1002/qj.2297.

 

Was Typhoon Haiyan the Most Intense Storm in Modern History?

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

Global warming buffs have been fond of claiming that the roaring winds of Typhoon Haiyan were the highest ever measured in a landfalling tropical cyclone, and that therefore (?) this is a result of climate change. In reality, it’s unclear whether or not it holds the modern record for the strongest surface wind at landfall. 

This won’t be known until there is a thorough examination of its debris field.

The storm of record is 1969 Hurricane Camille, which I rode out in an oceanfront laboratory about 25 miles east of the eye. There’s a variety of evidence arguing that Camille is going to be able to retain her crown.

The lowest pressure in Haiyan was 895 millibars, or 26.42 inches of mercury. To give an idea of how low that is, the needle on your grandmother’s dial barometer would have to turn two complete counterclockwise circles to get there. While there have been four storms in the Atlantic in the modern era that have been as strong or a bit stronger, the western Pacific sees one of these roughly every two years.
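For anyone checking the unit conversion, one inch of mercury corresponds to roughly 33.86 millibars; a quick sketch:

    # Unit-conversion check: millibars (hPa) to inches of mercury.
    MB_PER_INHG = 33.8639   # millibars per inch of mercury

    def mb_to_inhg(pressure_mb: float) -> float:
        return pressure_mb / MB_PER_INHG

    print(f"895 mb = {mb_to_inhg(895):.1f} inHg")   # Haiyan's minimum pressure, ~26.4 inHg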

Camille’s lowest pressure was a bit higher, at 905 mb (26.72 inches). At first blush it would therefore seem Haiyan would win the blowhard award hands down, but Haiyan had a very large eye around which its winds swirled, while Camille’s was one of the smallest ever measured.  At times in its brief life, Camille’s eye was so small that hurricane hunter aircraft could not safely complete a 360-degree turn without brushing through the devastating innermost cloud band, something you just don’t want to be near in a turning aircraft. In fact, the last aircraft to get into Camille, which measured 190 mph sustained winds, lost an engine in the severe turbulence and was fortunately able to limp home.

Haiyan’s estimated 195 mph winds were derived from satellite data, rather than being directly sensed by an aircraft.  But winds over the open ocean are always greater than those at landfall because of friction, and the five mph difference between the two storms is physically meaningless.

Victory for Cato: Feds Now Seeking Input on the Social Cost of Carbon

It’s about time!

For months, we have been hammering away at the point that the Feds’ current determination of the social cost of carbon is grossly out of touch with the relevant scientific literature and economic guidance.

Perhaps because it cannot argue against what we have been saying, the Administration has finally capitulated and is opening up its determination of the social cost of carbon (SCC) for public comment.

Their SCC calculation—in keeping with the playbook of the president’s Climate Action Plan—is a backdoor way of implementing a carbon tax. And it is slowly, pervasively, and worst of all, silently, creeping into all of our lives.  We’ve been trying to stop all of this by, at the very least, pulling back the cloak of secrecy and trying to make this once-esoteric subject a topic of dinnertime conversation.

Meanwhile, the government’s regulatory push using the SCC continues.

The Institute for Energy Research has recently identified nearly 30 federal regulations that have incorporated the SCC into their cost-benefit analyses (and several more have recently been announced).

The SCC is used to make regulations seem less costly.  We say “seem,” because the “benefit” from reducing carbon dioxide (CO2) emissions, as valued by the SCC, is likely never to be realized by the American consumer—yet the other costs (such as increased manufacturing costs) most assuredly will be.

The SCC is a theoretical estimate of the damage caused by each additional ton of CO2 emitted. But the theory is so loosey-goosey that, with a little creativity, you can arrive at pretty much any value for the SCC—a point noted by M.I.T.’s Robert Pindyck in an article in the Summer 2013 edition of Cato’s Regulation.

As the Obama Administration wants to regulate away as many carbon dioxide emissions as possible, it is in its own self-interest to try to arrive at the highest SCC value possible.  This way, the more that CO2 emissions are reduced, the more money is “saved.”

Or so the idea goes.

But their path towards a high SCC is one away from both the best science and the most common-sense economics.

We imagine that readers of this blog are probably well aware of the details behind this reality, as we have laid them out on many occasions, so we won’t go into them again here.

Instead, we want to point out several opportunities to draw further attention to the shortcomings in the Administration’s SCC determination.

The period for accepting public comments on several proposed rulemakings is open, and provides a good opportunity to remind the issuing agency what they did wrong. For example, here is a recently-announced regulation proposal from the Department of Energy (DoE) which seeks to impose higher energy efficiency rules for residential furnace fans. It employs the SCC to make this rule seem a lot sweeter than it actually is.

We have already submitted comments on several of these proposed regulations, including DoE regulations to increase the efficiency standards for Microwave Ovens, Walk-In Freezers, and Commercial Refrigeration Equipment.

So it is significant that the White House’s Office of Management and Budget (OMB) just announced that the social cost of carbon determination currently in force will be open to public comment starting sometime in the presumably near future (keep an eye on the Federal Register for the official announcement).

While it is too early to tell, this willingness to hear public comments on the SCC probably originated from the comments received on the Petition to Reconsider the proposed Microwave Oven ruling—the first rulemaking to incorporate the Administration’s latest-worst iteration of the SCC (which was about a 50% increase over its original figure). There hasn’t been an official announcement as to the result of the Petition, but the scientific argument against it is a Cato product.

More than likely, though, this will all be for show.  The feds could selectively use some comments and somehow find a way to raise the SCC even further.  Like we said, that’s easy to do—crank down the discount rate, or crank up the damage function (make up new damages not included in the current models)—even while paying lip service to the lowered equilibrium climate sensitivity and the CO2 fertilization effect.
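To see just how easy, consider a toy present-value calculation. This is not any of the three integrated assessment models the Administration actually uses; the damage stream below is a made-up placeholder, included only to show what the discount rate alone does to the resulting number.

    # Toy illustration of how sensitive an SCC-style number is to the discount
    # rate. NOT an integrated assessment model; the damages are hypothetical.

    def present_value(annual_damages, discount_rate):
        """Discount a stream of future damages (dollars per ton, one value per year)."""
        return sum(d / (1.0 + discount_rate) ** t
                   for t, d in enumerate(annual_damages, start=1))

    damages = [1.0] * 100   # hypothetical: $1 of damage per year for 100 years, per ton

    for rate in (0.025, 0.03, 0.05, 0.07):
        print(f"discount rate {rate:.1%}: SCC-style value = ${present_value(damages, rate):.2f}")

Moving from a 7 percent discount rate to a 2.5 percent rate increases the figure by roughly a factor of two and a half before a single climate assumption is touched; inflating the damage function does the rest.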

We’d be more than happy to be wrong about this. But until then, our efforts to set things straight will continue.