Topic: Energy and Environment

Infrastructure Is Not the Problem

The sudden collapse of a 58-year-old bridge across the Skagit River in Washington state has led to renewed calls to spend more money on American infrastructure. But if that spending comes out of tax dollars rather than user fees and is dedicated to replacing bridges, it will be seriously misplaced.

The usual media hysteria followed the collapse. “Thousands of bridges around the U.S. may be one freak accident or mistake away from collapse,” screamed CBS News. “If just one of [New York’s Tappan Zee Bridge’s] structural elements gives way, the whole bridge could fall and send” hundreds of cars “tumbling into the Hudson River,” warned Business Week.

About 18,000 highway bridges (less than 3 percent of the total) built in the 1950s and early 1960s have what is now considered to be a design flaw that makes them “fracture critical.” This means that at least one major element does not have redundant support, so if that element gives way, the entire bridge could collapse. The Skagit River Bridge failed when an oversized truck that should not have been on the bridge hit a cross beam that lacked redundant support. “This does not mean the bridge is inherently unsafe, only that there is a lack of redundancy in its design,” says the American Association of State Highway and Transportation Officials (AASHTO).

To listen to the hype, you would think that bridges are failing on almost a daily basis. But put this into perspective: In 2012, more than 34,000 people died in traffic accidents. Virtually none of them died due to a fracture-critical bridge failure. We can do lots of things to make highways safer and reduce that 34,000. A crash program to replace thousands of bridges isn’t one of them and is likely to divert funds away from programs that are far more important.

Many of the stories about America’s infrastructure focus on the number of “structurally deficient” bridges, which (says AASHTO) doesn’t mean the bridges are unsafe but only that they require “significant maintenance and repair to remain in service.” What the stories rarely mention is that in the last two decades the number of structurally deficient bridges has declined by 44 percent, from more than 118,000 in 1992 to fewer than 67,000 in 2012, even as the total number of highway bridges increased from 572,000 to 607,000. The number of fracture-critical bridges has declined from about 22,000 to 18,000 in the last four years alone. In other words, the problem is going away without the help of a giant new federal program.

Highway user fees, including federal and state gas taxes and tolls, fund nearly all construction and maintenance of state highways and bridges. The Skagit River Bridge notwithstanding, these roads and bridges tend to be in better shape than those that are locally owned, which need about $30 billion a year from property, sales, or other local taxes. User fees work better than taxes because the fees give highway managers signals about where to spend the money.

Speaker of the House John Boehner wants to dedicate oil and gas royalties to highway infrastructure. But that is the wrong source of money, and it will almost certainly be spent in the wrong places: much if not most of the spending will go to glitzy projects that glorify the elected officials who appropriate the money rather than to where it is really needed. For example, one sector hungry for more “infrastructure spending” is the rail transit industry, which since 1982 has automatically received a large share of all new transportation dollars. Yet rail transit does virtually nothing to relieve congestion or make our highways safer. Moreover, transit suffers from its own infrastructure crisis, mainly because it is funded mostly out of tax dollars that get spent on glamorous new rail lines rather than user fees that would be spent on maintenance.

Recent highway safety data reveal a striking 20 percent decline in fatalities between 2007 and 2010. This decline was associated with a mere 2.2 percent decline in driving, suggesting that–in the absence of the recession–a 2.2 percent increase in highway capacity and other congestion relief could have produced a similar decline in fatalities. Of the 41,259 fatalities in 2007, 13 were due to a bridge failure; there have been virtually none since then.
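To put those numbers in perspective, here is a quick back-of-the-envelope calculation. It is only a sketch using the rounded figures quoted above, not any additional data.

```python
# Back-of-the-envelope check of the fatality figures cited above.
fatalities_2007 = 41_259           # total U.S. traffic deaths in 2007 (from the text)
bridge_failure_deaths_2007 = 13    # of those, deaths attributed to a bridge failure

share = bridge_failure_deaths_2007 / fatalities_2007
print(f"bridge-failure share of 2007 traffic deaths: {share:.3%}")   # roughly 0.03%

# Fatalities fell about 20% between 2007 and 2010 while driving fell only 2.2%.
fatality_decline = 0.20
driving_decline = 0.022
print(f"fatality decline per point of driving decline: {fatality_decline / driving_decline:.1f}x")
```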

In short, the key to sound infrastructure is funding that infrastructure out of user fees rather than tax dollars. Since that’s true, one way to improve highway safety would be to develop a new system of user fees that local governments can tap into so that local as well as state highway engineers receive sufficient funds and the appropriate signals about where to spend money.

Climate History: Cato Boffins Discovered “Anti-information”

While doing some historical studies in preparation for an article in Cato’s Regulation magazine, we found that we had once discovered the information equivalent of antimatter, namely “anti-information.”

This breakthrough came when we were reviewing the first “National Assessment” of climate change impacts in the United States in the 21st century, published by the U.S. Global Change Research Program (USGCRP) in 2000.  The Assessments are mandated by the Global Change Research Act of 1990.  According to that law, they are intended, among other things, for “the Environmental Protection Agency for use in the formulation of a coordinated national policy on global climate change…”

One cannot project future climate without some type of model of what it will be.  In this case, the USGCRP examined a suite of nine climate models and selected two for the Assessment. One was the Canadian Climate Model, which forecast the most extreme 21st-century warming of all the models, and the other was from the Hadley Center at the U.K. Met Office, which predicted the greatest changes in precipitation.

We thought this odd and were told by the USGCRP that they wanted to examine the plausible limits of climate change. Fair enough, we said, but we also noted that there was no test of whether the models could simulate even the most rudimentary climate behavior in the past (20th) century.

So, we tested them on ten-year running means of annual temperature over the lower 48 states.

One standard method used to determine the utility of a model is to compare the “residuals”, or the differences between what is predicted and what is observed, to the original data.  Specifically, if the variability of the residuals is less than that of the raw data, then the model has explained a portion of the behavior of the raw data and the model can continue to be tested and entertained.
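As an illustration of that test, here is a minimal sketch in Python. The temperature series are made-up illustrative values, not the actual Assessment data; the point is only the comparison of residual variance against raw-data variance.

```python
import numpy as np

# Hypothetical ten-year running means of annual temperature (illustrative values only).
observed = np.array([11.2, 11.4, 11.3, 11.6, 11.8, 11.7, 12.0, 12.1])
modeled  = np.array([10.5, 11.9, 10.8, 12.4, 11.1, 12.6, 11.4, 12.9])

# Residuals: the differences between what is predicted and what is observed.
residuals = observed - modeled

# If the model explains any of the data's behavior, the residuals should vary
# less than the raw data do.
var_data = np.var(observed)
var_residuals = np.var(residuals)

print(f"variance of raw data:  {var_data:.3f}")
print(f"variance of residuals: {var_residuals:.3f}")
print("model adds information" if var_residuals < var_data else "anti-information")
```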

A model can’t do worse than explaining nothing, right?

Not these models!  The differences between their predictions and the observed temperatures were significantly greater (by a factor of two) than what one would get just applying random numbers.

Ponder this:  Suppose there were a multiple-choice test asking for the correct temperature forecast for each of 100 temperature observations, with four choices per question. Using random numbers, you would average one-in-four correct, or 25%. But the models in the National Assessment somehow could only get 12.5%!

“No information”—a random number simulation—yields 25% correct in this example, which means that anything less is anti-information. It seems impossible, but it happened.
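A tiny simulation makes that arithmetic concrete. The 12.5% “model” score below is simply the figure quoted above, not the output of any real climate model.

```python
import random

random.seed(0)
choices = 4
questions = 100_000   # many questions so the random-guess baseline converges

# Random guessing: each question has a one-in-four chance of being answered correctly.
correct = sum(1 for _ in range(questions) if random.randrange(choices) == 0)
baseline = correct / questions
print(f"random-guess score: {baseline:.1%}")          # about 25%

model_score = 0.125   # the 12.5% figure quoted above
print("anti-information" if model_score < baseline else "some information")
```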

We informed the USGCRP of this problem when we discovered it, and they wrote back that we were right. They then went on to publish their Assessment, undisturbed that they were basing it on models that had just done the impossible.

Welcome to the Whimsy-conomy, Energy Trade Edition

The AP reports some bad news for anyone seeking a little security and predictability in the US and global energy markets:

Energy Secretary Ernest Moniz said Tuesday he will delay final decisions on about 20 applications to export liquefied natural gas until he reviews studies by the Energy Department and others on what impact the exports would have on domestic natural gas supplies and prices.

Moniz, who was sworn in Tuesday as the nation’s new energy chief, said he promised during his confirmation hearing that he would “review what’s out there” before acting on proposals to export natural gas. Among the things Moniz said he wants to review is whether the data in the studies are outdated.

A study commissioned by the Energy Department concluded last year that exporting natural gas would benefit the U.S. economy even if it led to higher domestic prices for the fuel.

The AP adds that Secretary Moniz justified this delay as his “commitment” to Senate Energy Committee Chairman Ron Wyden (D-Ore.), who opposes natural gas exports and has criticized the DOE study.  Moniz’s statement comes just days after his department (quietly, on a Friday) approved one pending export application—moving the grand total of approvals to two out of 20 total applications, most of which have been sitting on DOE’s desk for several years now.

And who says the U.S. government isn’t swift and efficient?

The Realities of Government Infrastructure

Politicians and liberal economists get misty-eyed when thinking about grand infrastructure projects. But recent stories in the Washington Post about D.C.-area projects illustrate the realities of government capital investments.

Arlington County recently spent $1 million for a single bus stop, and the structure doesn’t even shelter passengers from wind or rain. The stop is one of 24 along a planned streetcar route, a mode of transportation that makes no sense in this area. (I understand that the streetcar dream of local politicians is currently on hold in the face of strong citizen opposition.) Why is Arlington wasting so much money on these bus stops? Probably because 80 percent of the costs are being paid by state and federal taxpayers, not local taxpayers.

The Washington Airports Authority has been in the news for mismanagement, overspending, and corruption. The Washington Post has a story today about corruption in contracting by the agency and the complete failure of senior executives to do anything about it. Why the corruption and mismanagement? Because the Authority is a government agency, and worse, it has a monopoly over D.C.-area airports. Airports should be privatized in order to introduce competition, improve service, end corruption, and reduce costs.

The Washington Post has also done an excellent job covering the Silver Spring Transit Center fiasco. The estimated cost of this grandiose bus/train station has more than quadrupled over time to $120 million. It’s a classic government cost overrun story involving mismanagement, design screw-ups, and contractor failures. A key cause of the problems seems to have been that so many different government agencies were involved that no one had the responsibility or incentive to make needed hard decisions to ensure quality and control costs. Today, fingers of blame are pointing in every direction, and the costs will rise even further as major engineering defects are fixed.  

The grandness of political visions for infrastructure runs far ahead of the government’s ability to actually implement projects in an efficient manner. There are types of infrastructure that governments must fund. But more infrastructure should be opened up for private funding, ownership, and control. Private businesses make mistakes, but when they are spending their own money they have strong incentives to control costs, eliminate corruption, and complete quality projects on time—incentives that simply don’t exist in the government sector.

Mobility Is Freedom, Not an Invasion of Privacy

Mobility is freedom, or at least an important part of it. Yet earlier this month challenges to expansions of that freedom came from, surprisingly, the Mises Institute of Canada, Reason magazine, and the American Enterprise Institute. The issues are new automobile technologies, specifically self-driving cars and improved road pricing, and the challenges came from people who clearly don’t understand the technologies involved.

Self-driving cars, says Roger Toutant writing for the Mises Institute of Canada, will lead to “a national, state-operated, computer network that will be used to achieve an Orwellian level of vehicular control and information sharing. …The implications are ominous. In the future, private spheres will be invaded and all movements will be tracked.”

“Boot up a Google car,” agrees Greg Beato of Reason magazine, “and it’s not so easy to cut the connection with the online mothership.” If you get into a Google driverless car, “you immediately start sending great quantities of revealing information to a company that’s already hoarding every emoticon you’ve ever IMed.”

It is appropriate to question new technologies, but the answer here is that this is not the way these cars work. None of the self-driving cars being developed by Volkswagen, Google, or other companies relies at all on central computers. Instead, all the computing power is built into each car.

Low Climate Sensitivity Making its Way into the Mainstream Press

When it comes to the press, the New York Times pretty much defines “mainstream.”

And Justin Gillis is the Times’ mainstream reporter on the global warming beat.

So it is somewhat telling that his article on Tuesday, “A Change in Temperature,” was largely (although begrudgingly) dedicated to facing up to the possibility that mainstream estimates (i.e., those produced by the U.N.’s Intergovernmental Panel on Climate Change) of climate sensitivity are too large.

Readers of this blog are probably well aware of the reasons why.

Despite our illusions of grandeur, this blog isn’t the mainstream press, although we do seek to influence it. Maybe we are being successful.

Sprinkled throughout Gillis’ article are references to “climate contrarians,” and even a recognition of the effort by such contrarians to push the new science on low climate sensitivity to the forefront of the discussion in order to change the existing politics of climate change.

Gillis writes:

Still, the recent body of evidence — and the political use that climate contrarians are making of it to claim that everything is fine — sheds some light on where we are in our scientific and public understanding of the risks of climate change.

We at Cato’s Center for the Study of Science are at the leading edge of efforts to present a more accurate representation of the science of climate change through our testimony to Congress, public comments on and reviews of government documents and proposals, media appearances, op-eds, and serial posts on this blog, among other projects. We emphasize that current regulations and proposed legislation are based on outdated, and likely wrong, projections of future climate impacts from human carbon dioxide emissions from the use of fossil fuels to produce energy.

Gillis recognizes the positives of a low climate sensitivity value:

“…tantalizing possibility that climate change might be slow and limited enough that human society could adapt to it without major trauma.”

“It will certainly be good news if these recent papers stand up to critical scrutiny, something that will take at least a year or two to figure out.”

“So if the recent science stands up to critical examination, it could indeed turn into a ray of hope…”

But, the “mainstream” is slow to change. And so despite the good news about climate sensitivity, Gillis closes his article by pointing out that, in his opinion, the political response to climate change has been “weak” (contrary to our view), and that therefore:

Even if climate sensitivity turns out to be on the low end of the range, total emissions may wind up being so excessive as to drive the earth toward dangerous temperature increases.

Clearly we still have work to do, but there are signs of progress!

CO2: 400ppm and Growing

The atmospheric concentration of carbon dioxide (CO2) has recently reached a “milestone” of 400 parts per million (ppm). In some circles, this announcement has been met with consternation and gnashing of teeth. The proper reaction is celebration.

The growth in the atmospheric CO2 concentration over the past several centuries is primarily the result of mankind’s thirst for energy—largely in the form of fossil fuels.  According to the World Bank, fossil fuel energy supplies about 80% of the world’s energy production—a value which has been pretty much constant for the past 40 years. During that time, the global population increased by 75%, and global energy use doubled. Global per capita energy use increased, while global energy use per $1000 GDP declined.  We are using more energy, but we are using it more efficiently. In the developed world, life expectancy has doubled since the dawn of the fossil fuel era.
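As a rough sanity check, the implied change in per-capita energy use follows directly from the rounded figures above; this is only illustrative arithmetic, not World Bank data.

```python
# Rough arithmetic from the rounded figures cited above.
energy_growth = 2.00        # global energy use roughly doubled over the period
population_growth = 1.75    # global population grew by about 75%

per_capita_change = energy_growth / population_growth - 1
print(f"implied change in per-capita energy use: {per_capita_change:+.0%}")   # about +14%
```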

Of course, burning fossil fuels to produce energy results in the emission of carbon dioxide into the atmosphere, tipping the natural balance of annual CO2 flux and leading to a gradual build-up.

There are two primary externalities that result from our emissions of carbon dioxide into the atmosphere—1) an enhancement of the greenhouse effect, which results in an alteration of the energy flow in the earth’s climate and a general tendency to warm the global average surface temperature, and 2) an enhancement of the rate of photosynthesis in plants and a general tendency to result in more efficient growth and an overall healthier condition of vegetation (including crops).  There’s incontrovertible evidence that the planet is both warmer and greener than it was 100 years ago.

As we continually document (see here for our latest post), more and more science is suggesting that the rate (and thus magnitude at any point in time) of CO2-induced climate change is not as great as commonly portrayed. The lower the rate of change, the lower the resulting impact. If the rate is low enough, carbon dioxide emissions confer a net benefit. We’d like to remind readers that “it’s not the heat, it’s the sensitivity,” when it comes to carbon dioxide, and the sensitivity appears to have been overestimated.

As new science erodes the foundation of climate worry, new technologies are expanding recoverable fossil fuel resources. Horizontal drilling and hydraulic fracturing have opened up vast expanses of fossil fuel resources—mainly natural gas—that were untouchable just a few years ago. The discovery that the world is awash in hundreds of years of recoverable fuels is a game-changer, given the strong correlation between energy use per capita and life expectancy.

400 ppm of carbon dioxide in the atmosphere should remind us of our continuing success at expanding the global supply of energy to meet a growing demand. That success ultimately leads to an improvement in the global standard of living and a reduction in vulnerability to the vagaries of weather and climate.

400 ppm is cause for celebration. “A world lit only by fire” is not.