Topic: Energy and Environment

CO2: 400ppm and Growing

The atmospheric concentration of carbon dioxide (CO2) has recently reached a “milestone” of 400 parts per million (ppm). In some circles, this announcement has been met with consternation and gnashing of teeth. The proper reaction is celebration.

The growth in the atmospheric CO2 concentration over the past several centuries is primarily the result of mankind’s thirst for energy—largely in the form of fossil fuels.  According to the World Bank, fossil fuel energy supplies about 80% of the world’s energy production—a value which has been pretty much constant for the past 40 years. During that time, the global population increased by 75%, and global energy use doubled. Global per capita energy use increased, while global energy use per $1000 GDP declined.  We are using more energy, but we are using it more efficiently. In the developed world, life expectancy has doubled since the dawn of the fossil fuel era.
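Those growth figures imply a simple back-of-the-envelope result. This quick sketch uses the post's round numbers (75% population growth, a doubling of total energy use), not actual World Bank series:

```python
# Back-of-the-envelope check using the round figures cited above
# (75% population growth, doubled energy use over ~40 years).
pop_growth = 1.75     # population multiplier
energy_growth = 2.00  # total energy use multiplier

per_capita = energy_growth / pop_growth
print(f"Per capita energy use multiplier: {per_capita:.2f} "
      f"(~{(per_capita - 1) * 100:.0f}% increase)")
```

Total use doubling while population grows 75% implies roughly a 14% rise in per capita use, consistent with the text's claim that per capita consumption increased even as energy per dollar of GDP fell.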

Of course, burning fossil fuels to produce energy results in the emission of carbon dioxide into the atmosphere, tipping the natural balance of annual CO2 flux and leading to  a gradual build-up.

There are two primary externalities that result from our emissions of carbon dioxide into the atmosphere—1) an enhancement of the greenhouse effect, which results in an alteration of the energy flow in the earth’s climate and a general tendency to warm the global average surface temperature, and 2) an enhancement of the rate of photosynthesis in plants and a general tendency to result in more efficient growth and an overall healthier condition of vegetation (including crops).  There’s incontrovertible evidence that the planet is both warmer and greener than it was 100 years ago.

As we continually document (see here for our latest post), more and more science is suggesting that the rate (and thus magnitude at any point in time) of CO2-induced climate change is not as great as commonly portrayed. The lower the rate of change, the lower the resulting impact. If the rate is low enough, carbon dioxide emissions confer a net benefit. We’d like to remind readers that “it’s not the heat, it’s the sensitivity,” when it comes to carbon dioxide, and the sensitivity appears to have been overestimated.

As new science erodes the foundation of climate worry, new technologies are expanding recoverable fossil fuel resources. Horizontal drilling and hydraulic fracturing have opened up vast expanses of fossil fuel resources—mainly natural gas—that were untouchable just a few years ago. The discovery that the world is awash in hundreds of years of recoverable fuels is a game-changer, given  the strong correlation between energy use per capita and life expectancy.

400ppm of carbon dioxide in the atmosphere should remind us of our continuing success at expanding the global supply of energy to meet a growing demand. That success ultimately leads to an improvement in the global standard of living and a reduction in vulnerability to the vagaries of weather and climate.

400ppm is cause for celebration. “A world lit only by fire” is not.

Stopping the EPA from Regulating Puddles

Some of the biggest Environmental Protection Agency abuses of property rights (see last term’s Sackett case and this term’s Koontz case) stem from expansive interpretations of the Clean Water Act. The EPA imposes huge costs on people who want to do anything on their property, claiming the agency has the authority to regulate “wetlands.” The agency is only supposed to have authority to regulate discharges to “navigable” waters, but the jurisprudence here is so confused that it has become an area ripe for federal overreach. This week a group of Republican senators (Rand Paul, Mike Lee, Marco Rubio, David Vitter, and Mitch McConnell) introduced a bill that is an excellent step toward addressing the federal government’s endemic property rights violations. The Defense of Environment and Property Act of 2013 does a number of very good things:

  1. Narrows the definition of “navigable waters” to waters that are “navigable-in-fact” or “permanent, standing, or continuously flowing bodies of water … that are connected to waters that are navigable-in-fact,” with explicit exclusions for such things as rainfall drainage channels and wetlands without a continuous connection to “waters of the United States”;

  2. Directs that the EPA and Army Corps of Engineers shall not impinge on the primary power of states over land and water use;

  3. Gives landowners judicial review in federal court within 30 days of any claim of federal authority over their land or water resources;

  4. Makes clear that ground water is state, not federal water;

  5. Eliminates the so-called “significant nexus test” that the EPA often uses to assert jurisdiction over otherwise non-federal lands;

  6. Strikes down various regulations and agency guidances;

  7. Requires a landowner’s consent before federal agents can enter his property to collect information regarding navigable waters;

  8. Defines as a regulatory taking any loss of value of land due to navigable-water-related regulation and compensates the landowner at twice the value of that loss.

Unlike most legislation before Congress, this bill would help a lot of people very quickly in a very direct way. I hope Congress acts on it.

Farmers Starting to Resent Strings Attached to Subsidies

Earlier this week, farm groups and some conservation groups announced that they had come to a deal linking eligibility for crop insurance premium subsidies to compliance with conservation measures. In return, in one of the great sell-outs of modern times, the conservation groups agreed not to push for payment limits or means testing on farm subsidies.

But it seems that the new link between conservation and government support for crop insurance has angered the House Agriculture Committee Chairman, Frank Lucas. From the DTN Ag Policy Blog yesterday:

Lucas, a Republican from Oklahoma, told DTN off the House floor Wednesday that he has a philosophical problem with various lobby groups “tying strings to how farmers farm” and dictating terms to producers when the farm bill is supposed to be about raising food and fiber.

“My perspective has always been, very sincerely, if a farm bill is about raising food — and I know 80% of it now is about making sure people have enough to eat, helping them buy their food — but if it is about raising food, farmers should have the tools to raise the food and fiber,” Lucas said. “And if you engage in whole series of things, such as you can’t get crop insurance unless you plant in a certain way, on a certain day, in a certain direction, or you can’t access a variety of other programs, then we aren’t having a farm bill that helps farmers raise food and fiber, but we have a social tool here that’s used to direct how farmers use their lives and conduct their business.” [emphasis added]

You’ll excuse me if I am having trouble summoning much sympathy for your special interest friends, Mr. Lucas. It’s just that inconvenient conditions should be expected when you suck at the government teat. The Farm Bill was designed as a social tool, and you and your colleagues over the years have added more “social tools” like food stamps, environmental programs, and energy subsidies in order to secure sufficient votes for your pork. Complaining now that all these other people are ruining your party is, to say the least, a bit rich.

If farmers don’t want to be directed on “how [they] use their lives and conduct their business,” then I suggest they start sending their cheques back. Ending farm programs will truly Free the Farm.

The Science vs. the Pseudoscience of Extreme Weather

Over at Capital Weather Gang, the always-perceptive Jason Samenow details a recent Twitterspat between Dot Earth’s (aka The New York Times’) Andrew Revkin and Penn State’s Michael Mann over attributing extreme weather events to anthropogenic climate change—tornadoes, in particular.

Revkin tweeted to ask whether the folks who were alluding to anthropogenic greenhouse gas emissions being behind the major (and deadly) tornado outbreak during the spring of 2011 were willing to attribute the record lack of tornado occurrences during the past 12 months to the same cause.

Revkin could have very well asked this same question about all kinds of bad weather—blizzards, hurricanes, droughts, floods, record heat, record cold, summer in Washington, winter in Chicago, etc.

Used to be, when the weather was bad, folks would logically cite Mark Twain’s “if you don’t like the weather in New England now, just wait a few minutes.” Now, someone will show up on TV who is quick to point out that this sort of thing “is consistent with” expectations of global warming. These same folks tend to nap when the weather is hunky-dory, and to go into hibernation when the extreme weather category of their previous pronouncement has a hiatus.

Since the bang-up year of 2011, the number of tornadoes has dropped off the table, with the last 12 months showing the fewest since systematic observations began in the 1950s.

And like tornadoes, major hurricane strikes have also become scarce; in fact, they are so far in remission that someone might soon announce they have been cured. It has now been more than 7 years since a Category 3 hurricane made landfall in the U.S., the longest such stretch in more than 100 years—and all this while overall hurricane activity in the Atlantic basin has been elevated. Maybe there is something to research that finds that while anthropogenic climate change may increase the frequency of major hurricanes in the Atlantic, it changes circulation patterns such that those storms are more likely to remain offshore (see pages 30–32 of our comments on the draft National Assessment Report).

But we digress…

Apparently the folks who rally around the anthropogenic climate change/extreme weather linkage don’t like being awoken when all is calm.

Obama’s New Transportation Chief Wants Streetcars for Everyone

America’s transportation system will continue to grind to a halt under President Obama’s pick for transportation secretary, Anthony Foxx. Currently mayor of Charlotte, N.C., Foxx strongly supports streetcars and other obsolete forms of transit.

It is a measure of the glacial pace of America’s political system that Obama had nearly 16 months’ notice that current Secretary Ray LaHood planned to step down at the end of Obama’s first term, yet the president required another three months before finding a replacement. If the administration has anything to say about it, American travelers will move at the same glacial pace: the streetcars that Obama, LaHood, and Foxx want to fund are slower than most people can walk.

Transit advocates often point to Charlotte as an example of a successful light-rail line (more accurately described as a “low-capacity-rail line”). With success like this, I’d hate to see failure: the line cost more than twice the original projection; generates just $3 million in annual fares against more than $20 million in annual operations and maintenance costs; and collects an average of just 77 cents per ride compared with nearly a dollar for other light-rail lines. Now Charlotte wants to extend the line even though a traffic analysis report predicts that the extension will dramatically increase traffic congestion in the corridor (see pp. 54-56).
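For context, the fare and cost figures above imply a farebox recovery ratio (fares as a share of operating costs) that is easy to compute directly. This sketch uses the post's dollar figures:

```python
annual_fares = 3_000_000     # dollars in annual fares, per the figures above
annual_om_cost = 20_000_000  # annual operations and maintenance, per the figures above

recovery_ratio = annual_fares / annual_om_cost
print(f"Farebox recovery: {recovery_ratio:.0%} of operating costs")
```

In other words, riders cover about 15 cents of every operating dollar; subsidies cover the rest.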

Foxx believes rail transit “drives economic development,” says George Washington University Professor Christopher Leinberger approvingly. “The goal of any transportation system, especially rail transit, is not to move people,” Leinberger argues. “The goal is economic development at the stations.”

Anthony Foxx certainly believes that. “If we didn’t do streetcar,” he asked the Charlotte city council during a debate, “does anybody have an idea how we’re going to revitalize” downtown Charlotte?

Rail advocates claim that Charlotte’s low-capacity-rail line helped revitalize neighborhoods along the line. However, a study by transportation expert David Hartgen concluded that most of the billions of dollars of development that was planned along the line was never built. Of the developments that were built, most would have taken place without the line, Hartgen found, though not necessarily in exactly the same locations.

I’ve said this before and I’ll say it again: transportation spending generates true economic growth only if it results in lower-cost, faster, and/or more convenient movement of people and goods. Streetcars and low-capacity rail are more expensive, slower, and for all but a tiny number of people less convenient than the alternatives, whether buses or cars. Even if you reduce transit rider costs by subsidizing them to the hilt, someone has to pay the subsidies and that slows economic growth.

Foxx is blissfully unaware of this and we can expect him to continue LaHood’s policy of giving away as much money as possible for transit projects that are as expensive as possible and move few people while creating more congestion for everyone else.

What to Do about OPEC?

Cato hosted a policy forum last week (which you can watch in its entirety if you missed it the first time around) to discuss a new paper released by Securing America’s Future Energy (SAFE).  The paper – written by long-time friends Andy Morriss and Roger Meiners – argues that there is a consensus among academics who have studied OPEC.  The consensus?  The cartel is responsible for less crude oil on the market than would otherwise be the case (which means higher prices than would otherwise be the case) and for the bulk of the price volatility we find in crude oil and, thus, gasoline markets.  “The international market for oil is not a free market,” they conclude.  “The global oil market deviates in important ways from the competitive model and that these market anomalies have significant economic impacts and so are relevant for policy makers.”

While Morriss and Meiners would thus seem to invite politicians to act, they offered no agenda of their own.  That’s where SAFE comes in.  FedEx’s Fred Smith, who co-chairs SAFE’s Energy Security Leadership Council, argued at the forum that the federal government needs to respond to OPEC’s machinations by (1) achieving energy independence for North America (a goal I’ve been quite skeptical about in the past), (2) establishing tough energy efficiency standards for a whole host of goods, but most particularly, for U.S. automobiles via CAFE standards (an agenda that most economists would reject in favor of accurate price signals), and (3) subsidizing R&D in order to find alternatives to oil in transportation markets.  SAFE discusses this agenda more robustly in their “National Energy Strategy for Energy Security, 2013”.

SMU’s James Smith – one of the most prominent energy economists who works in this field – was on-hand to offer what I think was a compelling rebuttal to the central arguments forwarded by the Morriss and Meiners study.

Still Another Low Climate Sensitivity Estimate

Global Science Report is a weekly feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

As promised, we report here on yet another published estimate of the earth’s equilibrium climate sensitivity that is towards the low end of the United Nations’ Intergovernmental Panel on Climate Change (IPCC) range of possibilities.

Recall that the equilibrium climate sensitivity is the amount that the earth’s surface temperature will rise from a doubling of the pre-industrial atmospheric concentration of carbon dioxide. As such, it is probably the most important factor in determining whether or not we need to “do something” to mitigate future climate change. Lower sensitivity means low urgency, and, if low enough, carbon dioxide emissions confer a net benefit.
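Because CO2 forcing grows roughly logarithmically with concentration (a standard approximation in simple climate models), the equilibrium warming implied by any concentration and any assumed sensitivity can be sketched in a few lines. The 280 ppm pre-industrial baseline is conventional; the sensitivity values below are illustrative, not estimates:

```python
import math

def equilibrium_warming(c_ppm, sensitivity, c0_ppm=280.0):
    """Equilibrium warming (deg C) at CO2 concentration c_ppm, given a
    climate sensitivity (deg C per doubling), assuming the standard
    logarithmic dependence of CO2 forcing on concentration."""
    doublings = math.log(c_ppm / c0_ppm) / math.log(2.0)
    return sensitivity * doublings

# Implied equilibrium warming at the 400 ppm milestone for a few
# illustrative sensitivity values:
for s in (1.5, 2.0, 3.0):
    print(f"sensitivity {s} C/doubling -> {equilibrium_warming(400.0, s):.2f} C")
```

Since 400 ppm is only about half a doubling of 280 ppm, the implied equilibrium warming scales linearly with whatever sensitivity one assumes, which is why the sensitivity estimate dominates the policy question.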

And despite common claims that the “science is settled” when it comes to global warming, we are still learning more and more about the earth’s complex climate system—and the more we learn, the less responsive the earth’s average temperature seems to be to human carbon dioxide emissions.

The latest study to document a low climate sensitivity is authored by independent scientist Nic Lewis and is scheduled for publication in the Journal of Climate. Lewis’ study is a mathematically involved reanalysis of an earlier, similarly involved analysis that matches observed global temperature change to the temperature change produced by a simple climate model with a configurable set of parameters, whose actual values are largely unknown but can be assigned in the model simulations. By varying the values of these parameters and seeing how well the resulting temperature output matches the observations, you can get some idea of the real-world value of each parameter. The main parameter of interest is the equilibrium climate sensitivity. Lewis’ study also includes additional model years and additional years of observations, including several years from the current global warming “hiatus” (i.e., the lack of a statistically significant rise in global temperature that extends for about 16 years, starting in early 1997).
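The fitting procedure described above can be caricatured as a grid search: run a simple model over candidate sensitivity values and keep the value whose output best matches the observed record. This toy sketch is ours, not Lewis's method; the "observed" warming and the fraction-of-equilibrium factor are invented for illustration:

```python
import math

OBSERVED_WARMING = 0.85   # deg C; synthetic "observation" for this sketch only
REALIZED_FRACTION = 0.6   # assumed fraction of equilibrium warming realized so far

def toy_model_warming(sensitivity):
    """Toy stand-in for a simple climate model: transient warming to date
    as a fixed fraction of the equilibrium response to forcing so far."""
    doublings_so_far = math.log(400.0 / 280.0) / math.log(2.0)
    return sensitivity * doublings_so_far * REALIZED_FRACTION

# Grid of candidate sensitivities, 1.0 to 5.0 deg C per doubling.
candidates = [s / 10.0 for s in range(10, 51)]
best = min(candidates, key=lambda s: abs(toy_model_warming(s) - OBSERVED_WARMING))
print(f"Best-fit sensitivity: {best:.1f} C per doubling")
```

Real analyses like Lewis's do this with formal statistics, fitting likelihoods over several uncertain parameters at once, rather than a one-dimensional grid.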

We actually did something in a similar vein—in English—and published it back in 2002. We found the same thing that Lewis did: substantially reduced warming. We were handsomely rewarded for our efforts by the climategate mafia, who tried to get 1) the paper withdrawn, 2) the editor fired—not just from the journal, but from Auckland University—and 3) my (Michaels’) 1979 PhD “reopened” by the University of Wisconsin.