Topic: Energy and Environment

With or Without a “Pause,” Climate Models Still Project Too Much Warming

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

A new paper just hit the scientific literature that argues that the apparent pause in the rise in global average surface temperatures during the past 16 years was really just a slowdown. 

As you may imagine, this paper, by Kevin Cowtan and Robert Way, is being hotly discussed on the global warming blogs, with reactions ranging from a warm embrace by the global-warming-is-going-to-be-bad-for-us crowd to revulsion from the human-activities-have-no-effect-on-the-climate claque.

The lukewarmers (a school we take some credit for establishing) seem to be taking the results in stride.  After all, the “pause,” as curious as it is (or was), is not central to the primary argument: yes, human activities are pressuring the planet to warm, but the rate of warming is going to be much slower than is being projected by the collection of global climate models upon which mainstream projections of future climate change, and the resulting climate alarm (i.e., calls for emission regulations, etc.), are based.

Under the adjustments to the observed global temperature history put together by Cowtan and Way, the models fare a bit better than they do with the unadjusted temperature record. That is, the observed temperature trend over the past 34 years (the period of record analyzed by Cowtan and Way) is a tiny bit closer to the average trend from the collection of climate models used in the new report from the U.N.’s Intergovernmental Panel on Climate Change (IPCC) than is the old temperature record.

Specifically, while the trend in observed global temperatures from 1979-2012 as calculated by Cowtan and Way is 0.17°C/decade, it is 0.16°C/decade in the temperature record compiled by the U.K. Hadley Center (the record that Cowtan and Way adjusted).  Because of the sampling errors associated with trend estimation, these values are not significantly different from one another.  Whether the 0.17°C/decade is significantly different from the climate model average simulated trend during that period of 0.23°C/decade is discussed extensively below.
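For readers who want to see what the “sampling error” of a trend looks like in practice, here is a minimal Python sketch. It is our illustration rather than anything from the paper, it runs on synthetic data, and it ignores the autocorrelation adjustments that real analyses require, but it shows how a trend in °C/decade and its approximate 95% range are computed.

```python
# Minimal sketch (not the authors' code): estimate a decadal trend and its
# sampling uncertainty from a series of annual temperature anomalies.
import numpy as np

def decadal_trend(anomalies_c, start_year):
    """Return the least-squares trend (deg C/decade) and an approximate
    95% sampling uncertainty for a 1-D array of annual mean anomalies."""
    years = np.arange(start_year, start_year + len(anomalies_c))
    slope, intercept = np.polyfit(years, anomalies_c, 1)   # deg C per year
    residuals = anomalies_c - (slope * years + intercept)
    dof = len(years) - 2
    # Standard error of the slope from ordinary least squares
    se = np.sqrt(np.sum(residuals**2) / dof / np.sum((years - years.mean())**2))
    return 10.0 * slope, 10.0 * 1.96 * se                   # convert to per decade

# Example with synthetic data standing in for 1979-2012 observations:
rng = np.random.default_rng(0)
fake_anoms = 0.016 * np.arange(34) + rng.normal(0, 0.1, 34)
trend, ci = decadal_trend(fake_anoms, 1979)
print(f"trend = {trend:.2f} +/- {ci:.2f} deg C/decade")
```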

But, suffice it to say that an insignificant difference of 0.01°C/decade in the global trend measured over more than 30 years is pretty small beer and doesn’t give model apologists very much to get happy over.

Instead, the attention is being deflected to “The Pause,” the leveling off of global surface temperatures during the past 16 years (give or take). Here, the new results from Cowtan and Way show that during the period 1997-2012, instead of a statistically insignificant rise at a rate of 0.05°C/decade, as contained in the “old” temperature record, the rise becomes a statistically significant 0.12°C/decade. “The Pause” is transformed into “The Slowdown,” and alarmists rejoice because global warming hasn’t stopped after all. (If that logic sounds backwards, it does to us as well: if you were worried about catastrophic global warming, wouldn’t you rejoice at findings indicating that future climate change will be only modest, rather than at results to the contrary?)

The science behind the new Cowtan and Way research is still being digested by the community of climate scientists and other interested parties alike. The main idea is that the existing compilations of the global average temperature are very data-sparse in the high latitudes. And since the Arctic (more so than the Antarctic) is warming faster than the global average, the lack of data there may mean that the global average temperature trend is being underestimated. Cowtan and Way developed a methodology that relies on other, limited sources of temperature information from the Arctic (such as floating buoys and satellite observations) to estimate how the surface temperature is behaving in regions lacking more traditional temperature observations (the authors released an informative video explaining their research, which may help you better understand what they did).

They found that the warming in the data-sparse regions was progressing faster than the global average (especially during the past couple of years), and that when they included the data they derived for these regions in the computation of the global average temperature, the global trend was higher than previously reported. Just how much higher depends on the period over which the trend is calculated: as we showed above, the trend more than doubled over the period 1997-2012, but barely increased at all over the longer period 1979-2012.
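To see why sparse Arctic coverage matters in the first place, consider the toy calculation below. It is our own illustration with made-up anomaly values, not the kriging and hybrid satellite methods Cowtan and Way actually used; it simply shows how leaving a fast-warming band out of an area-weighted global mean pulls that mean down.

```python
# Toy illustration (not Cowtan and Way's method): how omitting a
# fast-warming Arctic band biases an area-weighted global mean anomaly low.
import numpy as np

lats = np.arange(-87.5, 90, 5.0)        # latitude band centers, degrees
weights = np.cos(np.deg2rad(lats))      # relative area of each band

# Hypothetical anomalies: 0.4 deg C everywhere, 1.5 deg C poleward of 70N
anoms = np.where(lats > 70, 1.5, 0.4)

full = np.average(anoms, weights=weights)

covered = lats <= 70                    # pretend the high Arctic is unsampled
partial = np.average(anoms[covered], weights=weights[covered])

print(f"with Arctic: {full:.3f}  without Arctic: {partial:.3f}")
# The incomplete average comes out cooler; that is the coverage bias the
# paper tries to correct by infilling the missing region from other sources.
```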

Figure 1 shows the impact on the global average temperature trend for all trend lengths between 10 and 35 years (incorporating our educated guess as to what the 2013 temperature anomaly will be) and compares those trends to the distribution of climate model simulations over the same periods. Statistically speaking, instead of there being a clear inconsistency between the observations and the climate model simulations (i.e., the observed trend falls outside the range that encompasses 95% of all modeled trends) for lengths ranging generally from 11 to 28 years, and a marginal inconsistency (i.e., the observed trend falls outside the range that encompasses 90% of all modeled trends) for most of the other lengths, the observations now track closely along the marginal-inconsistency line, although trends of length 17, 19, 20, and 21 years remain clearly inconsistent with the collection of modeled trends. Still, throughout the entirety of the 35-year period (ending in 2013), the observed trend lies far below the model-average simulated trend (additional information on the impact of the new Cowtan and Way adjustments on the modeled/observed temperature comparison can be found here).

 

Figure 1. Temperature trends ranging in length from 10 to 35 years (ending in a preliminary 2013) calculated using the data from the U.K. Hadley Center (blue dots), the adjustments to the U.K. Hadley Center data made by Cowtan and Way (red dots) extrapolated through 2013, and the average of climate model simulations (black dots). The range that encompasses 90% (light grey lines) and 95% (dotted black lines) of climate model trends is also included.
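For the statistically minded, the sketch below outlines the kind of consistency check plotted in Figure 1: an observed trend is compared against the 90% and 95% envelopes of an ensemble of model-simulated trends of the same length. The ensemble here is a made-up sample standing in for the actual CMIP5 simulations, so the printed verdict is purely illustrative.

```python
# Rough sketch of the Figure 1 style comparison (our illustration, with
# hypothetical numbers rather than actual climate model output).
import numpy as np

def consistency(observed_trend, model_trends):
    """Classify an observed trend against an ensemble of modeled trends."""
    lo90, hi90 = np.percentile(model_trends, [5, 95])      # 90% envelope
    lo95, hi95 = np.percentile(model_trends, [2.5, 97.5])  # 95% envelope
    if observed_trend < lo95 or observed_trend > hi95:
        verdict = "clearly inconsistent (outside 95% of modeled trends)"
    elif observed_trend < lo90 or observed_trend > hi90:
        verdict = "marginally inconsistent (outside 90% of modeled trends)"
    else:
        verdict = "within the model spread"
    return (lo90, hi90), (lo95, hi95), verdict

# Stand-in ensemble: 100 hypothetical model trends (deg C/decade). The real
# verdicts in Figure 1 depend on the actual model spread, not this sample.
fake_models = np.random.default_rng(1).normal(0.23, 0.05, 100)
env90, env95, verdict = consistency(0.17, fake_models)
print(env90, env95, verdict)
```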

The Cowtan and Way analysis is an attempt at using additional types of temperature information, or extracting “information” from records that have already told their stories, to fill in the missing data in the Arctic.  There are concerns about the appropriateness of both the data sources and the methodologies applied to them.  

A major one is the applicability of satellite data at such high latitudes.  The nature of the satellite’s orbit forces it to look “sideways” in order to sample polar regions.  In fact, the orbit is such that the highest-latitude areas cannot be seen at all.  This is compounded by the fact that cold regions can develop substantial near-ground temperature “inversions,” in which temperature actually rises with height, so that there is no straightforward relationship between the surface temperature and the temperature of the lower atmosphere, which is what the satellites actually measure. If the nature of this complex relationship is not constant in time, an error is introduced into the Cowtan and Way analysis.

Another unresolved problem comes up when extrapolating land-based weather station data far into the Arctic Ocean.  While land temperatures can bounce around a lot, much of the ocean is partially ice-covered for many months.  Under “well-mixed” conditions, the ice constrains the near-surface temperature to values near the freezing point of salt water, whether or not the associated land station is much warmer or colder.

You can run this experiment yourself by filling a glass with a mix of ice and water and making sure it is well mixed.  The water surface temperature must hover around 32°F until all the ice melts.  Given that the near-surface air temperature stays close to the water temperature, the limitations of the land-based data become obvious.

Considering all of the above, we advise caution with regard to Cowtan and Way’s findings.  While adding high-Arctic data should increase the observed trend, the nature of the data means that the amount of additional rise is subject to further revision.  As they themselves note, there’s quite a bit more work to be done in this area.

In the meantime, their results have tentatively breathed a small hint of life back into the climate models, basically buying them a bit more time: time for either the observed temperatures to start rising rapidly, as current models expect, or for the modelers to fix or improve the cloud processes, oceanic processes, and other processes of variability (both natural and anthropogenic) that lie behind what would otherwise be clearly overheated projections.

We’ve also taken a look at how “sensitive” the results are to the length of the ongoing pause/slowdown.  Our educated guess is that the “bit” of time that the Cowtan and Way findings bought the models is only a few years long, and it is a fact, not a guess, that each additional year at the current rate of lukewarming widens the disconnect between the models and reality.

 

Reference:

Cowtan, K., and R. G. Way, 2013. Coverage bias in the HadCRUT4 temperature series and its impact on recent temperature trends. Quarterly Journal of the Royal Meteorological Society, doi: 10.1002/qj.2297.

 

Was Typhoon Haiyan the Most Intense Storm in Modern History?

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

Global warming buffs have been fond of claiming that the roaring winds of Typhoon Haiyan were the highest ever measured in a landfalling tropical cyclone, and that therefore (?) this is a result of climate change. In reality, it’s unclear whether or not it holds the modern record for the strongest surface wind at landfall. 

This won’t be known until there is a thorough examination of its debris field.

The storm of record is 1969 Hurricane Camille, which I rode out in an oceanfront laboratory about 25 miles east of the eye. There’s a variety of evidence arguing that Camille is going to be able to retain her crown.

The lowest pressure in Haiyan was 895 millibars, or 26.42 inches of mercury. To give an idea, the needle on your grandmother’s dial barometer would have to turn two complete counterclockwise circles to get there. While there have been four storms in the Atlantic in the modern era that were as strong or a bit stronger, the western Pacific sees one of these approximately every two years.

Camille’s lowest pressure was a bit higher, at 905 mb (26.72 inches). At first blush it would therefore seem that Haiyan would win the blowhard award hands down, but Haiyan had a very large eye around which its winds swirled, while Camille’s was one of the smallest ever measured.  At times in its brief life, Camille’s eye was so small that the hurricane hunter aircraft could not safely complete a 360-degree turn without brushing through the devastating innermost cloud band, something you just don’t want to be near in a turning aircraft. In fact, the last aircraft to get into Camille, which measured 190 mph sustained winds, lost an engine in the severe turbulence and fortunately was able to limp home.

Haiyan’s estimated 195 mph winds were derived from satellite data rather than being directly sensed by an aircraft.  But winds over the open ocean are always greater than those at landfall because of friction, and the 5 mph difference between the two storms is physically meaningless.

Victory for Cato: Feds Now Seeking Input on the Social Cost of Carbon

It’s about time!

For months, we have been hammering away at the point that the Feds’ current determination of the social cost of carbon is grossly out of touch with the relevant scientific literature and economic guidance.

Perhaps in response to the fact that they can’t argue against what we have been saying, the Administration has finally capitulated and is opening up their determination of the social cost of carbon (SCC) for public comment.

Their SCC calculation, in keeping with the playbook of the president’s Climate Action Plan, is a backdoor way of implementing a carbon tax. And it is slowly, pervasively, and worst of all, silently, creeping into all of our lives.  We’ve been trying to stop all of this by, at the very least, pulling back the cloak of secrecy and trying to make this once-esoteric subject a topic of dinnertime conversation.

Meanwhile,  the government’s regulatory push using the SCC continues.

The Institute for Energy Research has recently identified nearly 30 federal regulations that have incorporated the SCC into their cost-benefit analyses (and several more have been announced recently).

The SCC is used to make regulations seem less costly.  We say “seem,” because the “benefit” from reducing carbon dioxide (CO2)  emissions, as valued by the SCC, is likely never to be realized by the American consumer—yet the other costs (such as increased manufacturing costs) most assuredly will be.

The SCC is a theoretical estimate of the damage caused by each additional ton of CO2 emitted. But the theory is so loosey-goosey that, with a little creativity, you can arrive at pretty much any value for the SCC, a point noted by M.I.T.’s Robert Pindyck in an article for the Summer 2013 edition of Cato’s Regulation.

As the Obama Administration wants to regulate away as many carbon dioxide emissions as possible, it is in its own self-interest to try to arrive at the highest SCC value possible.  This way, the more that CO2 emissions are reduced, the more money is “saved.”

Or so the idea goes.

But their path towards a high SCC is one away from both the best science and the most common-sense economics.

We imagine that readers of this blog are probably well aware of the details behind this reality, as we have laid them out on many occasions, so we won’t go into them again here.

Instead, we want to point out several opportunities to draw further attention to the shortcomings in the Administration’s SCC determination.

The period for accepting public comments on several proposed rulemakings is open, and it provides a good opportunity to remind the issuing agency what it did wrong. For example, here is a recently announced regulation proposal from the Department of Energy (DoE) which seeks to impose higher energy-efficiency rules for residential furnace fans. It employs the SCC to make the rule seem a lot sweeter than it actually is.

We have already submitted comments on several of these proposed regulations, including DoE regulations to increase the efficiency standards for Microwave Ovens, Walk-In Freezers, and Commercial Refrigeration Equipment.

So it is important that the White House’s Office of Management and Budget (OMB) just announced that the social cost of carbon determination currently in force will be open to public comment starting sometime in the presumably near future (keep an eye on the Federal Register for the official announcement).

While it is too early to tell, this willingness to hear public comments on the SCC probably originated from the comments received on the Petition to Reconsider the proposed Microwave Oven ruling, the first rulemaking to incorporate the Administration’s latest, and worst, iteration of the SCC (which was about a 50% increase over the original figure). There hasn’t been an official announcement as to the result of the Petition, but the scientific argument against it is a Cato product.

More than likely, though, this will all be for show.  The feds could selectively use some comments and somehow find a way to raise the SCC even further.  Like we said, that’s easy to do: crank down the discount rate, or crank up the damage function (make up new damages not included in the current models), all while paying lip service to the lowered equilibrium climate sensitivity and the CO2 fertilization effect.
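To see how much leverage the discount rate alone provides, here is a back-of-the-envelope sketch with made-up numbers; it is not any agency’s SCC model, only the basic present-value arithmetic that sits underneath all of them.

```python
# Illustrative arithmetic only: the present value of a fixed stream of future
# climate damages is extremely sensitive to the discount rate, which is why
# "cranking down" that rate inflates the SCC.
def present_value(annual_damage, years, rate):
    """Discounted sum of a constant annual damage over a horizon of years."""
    return sum(annual_damage / (1.0 + rate) ** t for t in range(1, years + 1))

for rate in (0.025, 0.03, 0.05, 0.07):
    pv = present_value(annual_damage=1.0, years=300, rate=rate)
    print(f"discount rate {rate:.1%}: present value of $1/year for 300 years = ${pv:.1f}")
```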

We’d be more than happy to be wrong about this. But until then, our efforts to set things straight will continue.

Energy Subsidies vs. Energy Progress

If we did a poll of free market economists about federal programs that are the most wasteful and ridiculous, energy subsidies would be near the top of the list. It’s not just that energy subsidies make no sense in economic theory, but also that there are so many news stories highlighting the folly that it’s hard to see why policymakers persist in wasting our money.

From the Washington Post on Friday:

The Department of Energy failed to disclose concerns about a green-technology company that won $135 million in federal funding but ended up filing for bankruptcy in September, according to a watchdog report released this week. DOE Inspector General Gregory Friedman noted that the firm, San Francisco-based Ecotality, is still due to receive $26 million from the agency for testing electric vehicles.

The Energy Department awarded the firm $100 million in 2009 Recovery Act funding for that initiative, in addition to a combined $35 million from a separate program to help pay for testing vehicles.

Ecotality is among a number of failed firms that received stimulus funding through an Obama administration initiative to support green-technology companies during the recession. Solyndra, a Silicon Valley-based solar-panel maker, stands as perhaps the most high-profile example. The business collapsed after receiving more than a half-billion dollars in Recovery Act money. Other examples include Beacon Power, a Massachusetts-based company that received at least $39 million from the federal government, along with Michigan-based battery manufacturers LG Chem and A123, which landed grants worth $150 million and $249 million, respectively.

On Sunday, the Washington Post profiled the economic chaos, central planning, and wasteful lobbying generated by federal mandates for cellulosic ethanol:

Congress assumed that it could be phased in gradually, but not this gradually. This year refiners were supposed to mix about one billion gallons of it into motor fuel. So far, there has been hardly a drop. More than a dozen companies have tried and failed to find a profitable formula combining sophisticated enzymes and the mundane but costly and labor-intensive job of collecting biomass.

To reach the ethanol goals set by Congress, the government came up with a byzantine implementation plan. Each gallon of renewable fuel has its own 38-character number, called a “renewable identification number,” to track its use and monitor trading. There are different types of these RINs for different biofuels, including corn-based ethanol, cellulosic ethanol and biodiesel.

In February of each year, refiners who fail to provide enough renewable fuel to the blenders who mix ethanol and gasoline must buy extra RIN certificates. When companies have extra credits for renewable fuels, the RINs can be banked and sold in later years. If there are not enough renewable fuels overall, the price of RINs rises — and provides an incentive to produce more.

And in a related story on ethanol, the Post found:

Five years ago, about a dozen companies were racing to start up distilleries that would produce enough cellulosic ethanol to meet the congressionally mandated target of 16 billion gallons a year by 2022 … The Agriculture Department provided a $250 million loan guarantee for the Coskata plant. Today, most of the dozen contenders have gone out of business or shelved their plans.

Federal subsidies and mandates for ethanol and other energy activities are sadly causing the diversion of billions of dollars of capital to uneconomic uses. That’s the bad news.

But there is good news on the energy front, which comes from far outside of Washington. The Wall Street Journal last weekend profiled “the little guys,” the market entrepreneurs, who were behind the shale energy revolution:

The experts keep getting it wrong. And the oddballs keep getting it right. Over the past five years of business history, two events have shocked and transformed the nation. In 2007 and 2008, the housing market crumbled and the financial system collapsed, causing trillions of dollars of losses. Around the same time, a few little-known wildcatters began pumping meaningful amounts of oil and gas from U.S. shale formations. A country that once was running out of energy now is on track to become the world’s leading producer.

The resurgence in U.S. energy came from a group of brash wildcatters who discovered techniques to hydraulically fracture—or frack—and horizontally drill shale and other rock. Many of these men operated on the fringes of the oil industry, some without college degrees or much background in drilling, geology or engineering.

Thank goodness for the oddballs. And thank goodness for the market system that channels the brashness into creating growth for all of us, not just the favored few getting handouts from Washington.

Transit Spending Slows Urban Growth

Contrary to the claims of many transit advocates, regions that spend more money on transit seem to grow more slowly than regions that spend less. The fastest-growing urban areas of the country tend to offer transit service mainly to people who lack access to automobiles. Urban areas that seek to provide high-cost transit services, such as trains, in order to attract people out of their cars tend to grow far more slowly.

Transit advocates often argue that a particular city or region must spend more on urban transit in order to support the growth of that region. To test that claim, I downloaded the latest historic data files from the National Transit Database, specifically the capital funding and service data and operating expenses by mode time series. These files list which urbanized area each transit agency primarily serves, so it was easy to compare these data with Census Bureau population data from 1990, 2000, and 2010.

The transit data include capital and operating expenses for all years from 1991 through 2011. I decided to compare the average of 1991 through 2000 per capita expenses with population growth in the 1990s, and the average of 2001 through 2010 per capita expenses with population growth in the 2000s. In case there is a delayed response, I also compared the average of 1991 through 2000 per capita expenses with population growth in the 2000s. Although it shouldn’t matter too much, I used GNP deflators to convert all costs to 2012 dollars.
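For anyone who wants to replicate the comparison, the sketch below shows the general shape of the calculation. The file and column names are hypothetical; the actual National Transit Database time series and Census tables are laid out differently and require more cleaning than shown here.

```python
# Sketch of the comparison described above, with hypothetical file and
# column names standing in for the real NTD and Census data.
import pandas as pd

transit = pd.read_csv("ntd_spending_by_urbanized_area.csv")      # hypothetical
census = pd.read_csv("census_population_by_urbanized_area.csv")  # hypothetical
df = transit.merge(census, on="urbanized_area")

# Average 1991-2000 capital plus operating spending per capita (2012 dollars)
spend_cols = [f"spend_{y}" for y in range(1991, 2001)]
df["per_capita_spend_90s"] = df[spend_cols].mean(axis=1) / df["pop_1990"]

# Population growth over the 1990s
df["growth_90s"] = df["pop_2000"] / df["pop_1990"] - 1.0

# A negative correlation would support the claim that higher transit
# spending goes along with slower urban-area growth.
print(df[["per_capita_spend_90s", "growth_90s"]].corr())
```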

MagLev: The Idea Whose Time Never Came

Superconducting magnetic levitation is the “next generation of transportation,” says a new rail advocacy group that calls itself The Northeast Maglev (TNEM). The group’s proposed New York-Washington maglev line has received attention from the Washington Post and Baltimore Sun. TNEM’s claims might have seemed valid 80 years ago, when maglev trains were first conceived, but today maglev is just one more superexpensive technology that can’t compete with what we already have.

Superconducting maglev train being tested in Japan. Wikimedia commons photo by Yosemite.

Maglev has all the defects of conventional high-speed rail with the added bonuses of higher costs and greater energy requirements. Unlike automobiles on roads, rails don’t go where you want to go when you want to go there. Compared with planes, even the fastest trains are slow, and modest improvements in airport security would do far more to speed travelers, at a far lower cost, than building expensive new rail infrastructure.

Current Wisdom: Observations Now Inconsistent with Climate Model Predictions for 25 (going on 35) Years

The Current Wisdom is a series of monthly articles in which Patrick J. Michaels and Paul C. “Chip” Knappenberger, from Cato’s Center for the Study of Science, review interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.  

 

Question: How long will the fantasy that climate models are reliable indicators of the earth’s climate evolution persist in the face of overwhelming evidence to the contrary?

Answer: Probably for as long as there is a crusade against fossil fuels.  

Without the exaggerated alarm conjured from overly pessimistic climate model projections of climate change from carbon dioxide emissions, fossil fuels—coal, oil, gas—would regain their image as the celebrated agents of  prosperity that they are, rather than being labeled as pernicious agents of our destruction.  

Just how credible are these climate models?  

In two words, “they’re not.”  

Everyone has read that over the past 10-15 years, most climate models’ forecasts of the rate of global warming have been wrong. Most predicted that a hefty warming of the earth’s average surface temperature would have taken place by now, while in the real world there was no significant change.

But very few people know that the same situation has persisted for 25, going on 35, years, or that over the past 50-60 years (since the middle of the 20th century), the same models expected about 33 percent more warming to have taken place than was observed.