Topic: Energy and Environment

Climate Models Veer Off Course

Global Science Report is a weekly feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

A new paper shows that climate models are getting worse at replicating a collection of known climate changes, as incentivized efforts to improve them send them veering off course in unison.

Anyone who is familiar with John Allison’s book The Financial Crisis and the Free Market Cure knows that incentives can drive otherwise “independent” decisions in a common direction, with sometimes disastrous results. Allison documents how a collection of government incentives (intentionally and unintentionally) drove decisions in the wider financial markets towards overinvesting in residential real estate. The resulting massive misallocation of funds and ultimate bubble burst sent us into the Great Recession, from which we have yet to recover.

Obviously, that was not the intended outcome of the federal policies, but as Allison writes, “Intentions that are called ‘good’ often do not produce favorable outcomes.” Allison argues that a free market, one that is free from centralized incentives, and one in which truly independent decisions are being made, is less susceptible to a universal failure and that when failures do occur, they are less severe and more quickly recovered from. Had the financial markets been operating without federal regulations and incentives, not only would the Great Recession not have occurred (or would have been far milder), but our country would be in a much healthier financial state with an overall higher standard of living for everyone.

Not only can (and do) targeted incentives lead financial markets astray, they also operate the same way in the field of science.

In either case, the ultimate effect is to steer the outcome away from its most efficient pathway and instead send it veering towards dangerous territory that is marked by a decline in our overall well-being.

This is nowhere more evident than in the field of climate science, as a new paper by the University of Wisconsin-Milwaukee’s Kyle Swanson clearly illuminates.

In his work “Emerging selection bias in large climate change simulations,” Swanson finds that the new generation of climate models has become worse at matching recent climate change than the generation of models which they supplant.

Current Wisdom: Hansen’s Extreme Sea Level Rise Projections Drowning in Hubris

The Current Wisdom is a series of monthly articles in which Patrick J. Michaels, director of the Center for the Study of Science, reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

Retired NASA scientist and peripatetic global warming crusader James Hansen has a—let’s put it delicately—unique view of sea-level rise resulting from mankind’s use of fossil fuels. Specifically, he believes global average sea level will rise some 15 to 20 feet by 2095. The central estimate from the most recent report from the U.N.’s Intergovernmental Panel on Climate Change (IPCC) is about 15 inches.

Hansen’s an outlier, and proud of it, thinking himself more courageous than other scientists who, he says, are “reticent” to tell the public how bad things really are.

Wethinks that Hansen doth protest too much. His scientific arguments for a large and rapid sea level rise this century simply don’t hold water.

He laid out a summary of his logic on sea level rise in a book chapter (co-authored with Makiko Sato) published last year titled “Paleoclimate Implications for Human-Made Climate Change.”

Below, we have reproduced the relevant text on sea-level rise from that chapter along with our comments highlighting recent findings from the scientific literature which refute each and every one of Hansen’s claims.

Infrastructure Is Not the Problem

The sudden collapse of a 58-year-old bridge across the Skagit River in Washington state has led to renewed calls to spend more money on American infrastructure. But if that spending comes out of tax dollars rather than user fees and is dedicated to replacing bridges, it will be seriously misplaced.

The usual media hysteria followed the collapse. “Thousands of bridges around the U.S. may be one freak accident or mistake away from collapse,” screamed CBS News. “If just one of [New York’s Tappan Zee Bridge’s] structural elements gives way, the whole bridge could fall and send” hundreds of cars “tumbling into the Hudson River,” warned Business Week.

About 18,000 highway bridges (less than 3 percent of the total) built in the 1950s and early 1960s have what is now considered to be a design flaw that makes them “fracture critical.” This means that at least one major element does not have redundant support, so if that element gives way, the entire bridge could collapse. The Skagit River Bridge failed when an oversized truck that should not have been on the bridge hit a cross beam that lacked redundant support. “This does not mean the bridge is inherently unsafe, only that there is a lack of redundancy in its design,” says the American Association of State Highway and Transportation Officials (AASHTO).

To listen to the hype, you would think that bridges are failing on almost a daily basis. But put this into perspective: In 2012, more than 34,000 people died in traffic accidents. Virtually none of them died due to a fracture-critical bridge failure. We can do lots of things to make highways safer and reduce that 34,000. A crash program to replace thousands of bridges isn’t one of them and is likely to divert funds away from programs that are far more important.

Many of the stories about America’s infrastructure focus on the number of “structurally deficient” bridges, which (says AASHTO) doesn’t mean the bridges are unsafe but only that they require “significant maintenance and repair to remain in service.” What the stories rarely mention is that in the last two decades the number of structurally deficient bridges has declined by 44 percent, from more than 118,000 in 1992 to fewer than 67,000 in 2012, even as the total number of highway bridges increased from 572,000 to 607,000. The number of fracture-critical bridges has declined from 22,000 to about 18,000 in the last four years alone. In other words, the problem is going away without the help of a giant new federal program.

Highway user fees, including federal and state gas taxes and tolls, fund nearly all construction and maintenance of state highways and bridges. The Skagit River Bridge notwithstanding, these roads and bridges tend to be in better shape than those that are locally owned, which need about $30 billion a year from property, sales, or other local taxes. User fees work better than taxes because the fees give highway managers signals about where to spend the money.

Speaker of the House John Boehner wants to dedicate oil and gas royalties to highway infrastructure. But that’s the wrong source of money, and it will almost certainly be spent in the wrong places, as much if not most spending will go to glitzy projects that glorify the elected officials who appropriate the money rather than to where it is really needed. For example, one sector hungry for more “infrastructure spending” is the rail transit industry, which since 1982 has automatically received a large share of all new transportation dollars. Yet rail transit does virtually nothing to relieve congestion or make our highways safer. Moreover, transit suffers from its own infrastructure crisis, mainly because it is funded mostly out of tax dollars that get spent on glamorous new rail lines rather than user fees that would be spent on maintenance.

Recent highway safety data reveal a striking 20 percent decline in fatalities between 2007 and 2010. This decline was associated with a mere 2.2 percent decline in driving, suggesting that, in the absence of the recession, a 2.2 percent increase in highway capacity and other congestion relief could have produced a similar decline in fatalities. Of the 41,259 fatalities in 2007, 13 were due to a bridge failure; there have been virtually none since then.

In short, the key to sound infrastructure is funding that infrastructure out of user fees rather than tax dollars. Since that’s true, one way to improve highway safety would be to develop a new system of user fees that local governments can tap into so that local as well as state highway engineers receive sufficient funds and the appropriate signals about where to spend money.

Climate History: Cato Boffins Discovered “Anti-information”

While doing some historical studies in preparation for an article in Cato’s Regulation magazine, we found that we once discovered the information equivalent of antimatter, namely, “anti-information.”

This breakthrough came when we were reviewing the first “National Assessment” of climate change impacts in the United States in the 21st century, published by the U.S. Global Change Research Program (USGCRP) in 2000. The Assessments are mandated by the Global Change Research Act of 1990. According to that law, they are, among other things, for “the Environmental Protection Agency for use in the formulation of a coordinated national policy on global climate change…”

One cannot project future climate without some type of model for what it will be. In this case, the USGCRP examined a suite of nine climate models and selected two for the Assessment. One was the Canadian Climate Model, which forecast the most extreme warming for the 21st century of all the models, and the other was from the Hadley Center at the U.K. Met Office, which predicted the greatest changes in precipitation.

We thought this odd and were told by the USGCRP that they wanted to examine the plausible limits of climate change. Fair enough, we said, but we also noted that there was no test of whether the models could simulate even the most rudimentary climate behavior in the past (20th) century.

So, we tested them on ten-year running means of annual temperature over the lower 48 states.

One standard method used to determine the utility of a model is to compare the “residuals”, or the differences between what is predicted and what is observed, to the original data.  Specifically, if the variability of the residuals is less than that of the raw data, then the model has explained a portion of the behavior of the raw data and the model can continue to be tested and entertained.
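The residual-variance check described above is easy to illustrate. Here is a minimal sketch using synthetic data (not the actual Assessment temperature series), showing how a model’s predictions fail the test when the variance of its residuals exceeds that of the raw observations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for observed ten-year running means of annual temperature
observed = rng.normal(loc=12.0, scale=0.3, size=90)

# A hypothetical model "prediction" whose errors are larger than the
# natural variability of the data itself
predicted = observed + rng.normal(loc=0.0, scale=0.5, size=90)

# Residuals: the differences between prediction and observation
residuals = predicted - observed

# If the model explains anything, residual variance < raw-data variance.
# A ratio above 1 means the model's errors exceed the data's own variability.
ratio = residuals.var() / observed.var()
print(f"residual/raw variance ratio: {ratio:.2f}")
```

A ratio below 1 would mean the model has explained some of the data’s behavior and can continue to be entertained; a ratio above 1, as with the models discussed here, means the model is adding noise rather than information.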

A model can’t do worse than explaining nothing, right?

Not these models!  The differences between their predictions and the observed temperatures were significantly greater (by a factor of two) than what one would get just applying random numbers.

Ponder this: Suppose there is a multiple-choice test asking for the correct temperature forecast for each of 100 temperature observations, with four choices per question. Using random numbers, you would average one-in-four correct, or 25%. But the models in the National Assessment somehow could only get 12.5%!

“No information”—a random number simulation—yields 25% correct in this example, which means that anything less is anti-information. It seems impossible, but it happened.
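The 25% “no information” baseline in that analogy can be verified with a quick simulation (the question count and answers here are arbitrary, not data from the Assessment):

```python
import random

random.seed(42)

# 100 questions, 4 choices each; random guessing is the "no information" baseline
n_questions, n_choices = 100, 4
answers = [random.randrange(n_choices) for _ in range(n_questions)]

# Average the score of many random guessers
trials = 10_000
total_correct = 0
for _ in range(trials):
    guesses = [random.randrange(n_choices) for _ in range(n_questions)]
    total_correct += sum(g == a for g, a in zip(guesses, answers))

score = total_correct / trials / n_questions
print(f"average random-guessing score: {score:.1%}")  # ~25%
```

Any score persistently below that one-in-four baseline, like the 12.5% in the example, means the forecasts are systematically anti-correlated with reality: anti-information.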

We informed the USGCRP of this problem when we discovered it, and they wrote back that we were right. Then they went on to publish their Assessment anyway, undisturbed that they were basing it on models that had just done the impossible.

Welcome to the Whimsy-conomy, Energy Trade Edition

The AP reports some bad news for anyone seeking a little security and predictability in the US and global energy markets:

Energy Secretary Ernest Moniz said Tuesday he will delay final decisions on about 20 applications to export liquefied natural gas until he reviews studies by the Energy Department and others on what impact the exports would have on domestic natural gas supplies and prices.

Moniz, who was sworn in Tuesday as the nation’s new energy chief, said he promised during his confirmation hearing that he would “review what’s out there” before acting on proposals to export natural gas. Among the things Moniz said he wants to review is whether the data in the studies are outdated.

A study commissioned by the Energy Department concluded last year that exporting natural gas would benefit the U.S. economy even if it led to higher domestic prices for the fuel.

The AP adds that Secretary Moniz justified this delay as his “commitment” to Senate Energy Committee Chairman Ron Wyden (D-Ore.), who opposes natural gas exports and has criticized the DOE study. Moniz’s statement comes just days after his department (quietly, on a Friday) approved one pending export application—moving the grand total of approvals to two out of 20 total applications, most of which have been sitting on DOE’s desk for several years now.

And who says the U.S. government isn’t swift and efficient?

The Realities of Government Infrastructure

Politicians and liberal economists get misty-eyed when thinking about grand infrastructure projects. But recent stories in the Washington Post about D.C.-area projects illustrate the realities of government capital investments.

Arlington County recently spent $1 million for a single bus stop, and the structure doesn’t even shelter passengers from wind or rain. The stop is one of 24 along a planned streetcar route, which is a mode of transportation that makes no sense in this area. (I understand that the streetcar dream of local politicians is currently on hold in the face of strong citizen opposition). Why is Arlington wasting so much money on these bus stops? Probably because 80 percent of the costs are being paid by state and federal taxpayers, not local taxpayers.

The Washington Airports Authority has been in the news for mismanagement, overspending, and corruption. The Washington Post has a story today about corruption in contracting by the agency and the complete failure of senior executives to do anything about it. Why the corruption and mismanagement? Because the Authority is a government agency, and worse, it has a monopoly over D.C.-area airports. Airports should be privatized in order to introduce competition, improve service, end corruption, and reduce costs.

The Washington Post has also done an excellent job covering the Silver Spring Transit Center fiasco. The estimated cost of this grandiose bus/train station has more than quadrupled over time to $120 million. It’s a classic government cost overrun story involving mismanagement, design screw-ups, and contractor failures. A key cause of the problems seems to have been that so many different government agencies were involved that no one had the responsibility or incentive to make needed hard decisions to ensure quality and control costs. Today, fingers of blame are pointing in every direction, and the costs will rise even further as major engineering defects are fixed.  

The grandness of political visions for infrastructure run far ahead of the government’s ability to actually implement projects in an efficient manner. There are types of infrastructure that governments must fund. But more infrastructure should be opened up for private funding, ownership, and control. Private businesses make mistakes, but when they are spending their own money they have strong incentives to control costs, eliminate corruption, and complete quality projects on time—incentives that simply don’t exist in the government sector.

Mobility Is Freedom, Not an Invasion of Privacy

Mobility is freedom, or at least an important part of it. Yet earlier this month challenges to expansions of that freedom came from, surprisingly, the Mises Institute of Canada, Reason magazine, and the American Enterprise Institute. The issues are new automobile technologies, specifically self-driving cars and improved road pricing, and the challenges came from people who clearly don’t understand the technologies involved.

Self-driving cars, says Roger Toutant writing for the Mises Institute of Canada, will lead to “a national, state-operated, computer network that will be used to achieve an Orwellian level of vehicular control and information sharing. …The implications are ominous. In the future, private spheres will be invaded and all movements will be tracked.”

“Boot up a Google car,” agrees Greg Beato of Reason magazine, “and it’s not so easy to cut the connection with the online mothership.” If you get into a Google driverless car, “you immediately start sending great quantities of revealing information to a company that’s already hoarding every emoticon you’ve ever IMed.”

It is appropriate to question new technologies, but the answer is that’s not the way these cars work. None of the self-driving cars being developed by Volkswagen, Google, or other companies rely at all on central computers. Instead, all the computing power is built into each car.