Topic: Energy and Environment

More Evidence for a Low Climate Sensitivity

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

We have two new entries to the long (and growing) list of papers appearing in the recent scientific literature that argue that the earth’s climate sensitivity—the ultimate rise in the earth’s average surface temperature from a doubling of the atmospheric carbon dioxide content—is close to 2°C, or near the low end of the range of possible values presented by the U.N.’s Intergovernmental Panel on Climate Change (IPCC). With a low-end warming comes low-end impacts and an overall lack of urgency for federal rules and regulations (such as those outlined in the President’s Climate Action Plan) that limit carbon dioxide emissions and restrict our energy choices.

The first is the result of a research effort conducted by Craig Loehle and published in the journal Ecological Modelling. The paper is a fairly straightforward determination of the climate sensitivity. Loehle first uses a model of natural modulations to remove the influence of natural variability (such as solar activity and ocean circulation cycles) from the observed temperature history since 1850. The linear trend in the post-1950 residuals from Loehle’s natural variability model is then assumed to be, in net, largely the result of human carbon dioxide emissions. By dividing the total temperature change (as indicated by the best-fit linear trend) by the observed rise in atmospheric carbon dioxide content, and then applying that relationship to a doubling of the carbon dioxide content, Loehle arrives at an estimate of the earth’s transient climate sensitivity—transient in the sense that, at the time of CO2 doubling, the earth has yet to reach a state of equilibrium and some warming is still to come.
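To make that arithmetic concrete, here is a minimal sketch of a trend-based transient sensitivity calculation in Python. The temperature change, CO2 concentrations, and choice of scaling are illustrative assumptions only, not Loehle’s actual inputs, method details, or results:

```python
# Illustrative sketch of a trend-based transient sensitivity estimate.
# All numbers below are placeholders, NOT Loehle's actual inputs or results.

import math

# Assumed post-1950 warming attributed to CO2 after removing natural variability
delta_T = 0.6                     # deg C, hypothetical residual-trend warming
c_start, c_end = 310.0, 390.0     # ppm CO2, hypothetical start/end concentrations

# Simple proportional scaling, as described in the text:
# warming per ppm, applied to a doubling from the starting concentration
tcr_linear = delta_T / (c_end - c_start) * c_start

# Alternative using the conventional logarithmic dependence of forcing on CO2
tcr_log = delta_T * math.log(2.0) / math.log(c_end / c_start)

print(f"Transient sensitivity (linear scaling): {tcr_linear:.2f} C per doubling")
print(f"Transient sensitivity (log scaling):    {tcr_log:.2f} C per doubling")
```

Either scaling converts an observed warming-per-CO2-increase relationship into a warming per doubling, which is all a transient estimate of this kind requires.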

Loehle then estimated the equilibrium climate sensitivity from his transient calculation using the average transient:equilibrium ratio projected by the collection of climate models used in the IPCC’s most recent Assessment Report. In doing so, he arrived at an equilibrium climate sensitivity estimate of 1.99°C, with a 95% confidence range of 1.75°C to 2.23°C.
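Continuing the sketch above, converting a transient estimate into an equilibrium one is a single scaling step. The transient value and the transient:equilibrium ratio below are placeholders, not the values Loehle derived from the IPCC model ensemble:

```python
# Continuing the sketch: convert a transient estimate to an equilibrium one
# using a model-derived transient:equilibrium ratio. Both numbers are
# placeholders, not values taken from Loehle (2014) or the IPCC models.

transient_estimate = 1.8          # deg C per doubling, hypothetical
transient_to_equilibrium = 0.55   # assumed model-average transient:equilibrium ratio

equilibrium_estimate = transient_estimate / transient_to_equilibrium
print(f"Implied equilibrium sensitivity: {equilibrium_estimate:.2f} C per doubling")
```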

Compare Loehle’s estimate to the IPCC’s latest assessment of the earth’s equilibrium climate sensitivity, which assigns a 66 percent or greater likelihood that it lies somewhere in the range from 1.5°C to 4.5°C. Loehle’s determination is more precise and decidedly toward the low end of that range.

The second entry on our list of low climate sensitivity estimates comes from Roy Spencer and William Braswell and was published in the Asia-Pacific Journal of Atmospheric Sciences. Spencer and Braswell used a very simple climate model to simulate the global temperature variations averaged over the top 2000 meters of the global ocean during the period 1955-2011. They first ran the simulation using only volcanic and anthropogenic influences on the climate. They ran the simulation again adding a simple take on the natural variability contributed by the El Niño/La Niña process. And they ran the simulation a final time adding a more complex configuration involving a feedback from El Niño/La Niña onto natural cloud characteristics. They then compared their model results with the set of real-world observations.
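For readers unfamiliar with what a “very simple climate model” of ocean temperature looks like, the sketch below implements a generic one-box energy-balance model in Python. The heat capacity, feedback parameter, and forcing ramp are assumed for illustration; this is not Spencer and Braswell’s actual 1D model, its forcings, or its results:

```python
# A generic one-box energy-balance model of an ocean layer, offered only as a
# sketch of the kind of "very simple climate model" described above. The heat
# capacity, feedback parameter, and forcing ramp are all assumptions.

import numpy as np

SECONDS_PER_YEAR = 3.15e7
HEAT_CAPACITY = 8.36e9      # J m^-2 K^-1, assumed: roughly 2000 m of seawater
LAMBDA = 2.8                # W m^-2 K^-1, assumed climate feedback parameter
F_2X = 3.7                  # W m^-2, canonical forcing for doubled CO2

years = np.arange(1955, 2012)
# Hypothetical forcing ramp (anthropogenic plus volcanic), W m^-2; placeholder only
forcing = np.linspace(0.3, 1.6, years.size)

temperature = np.zeros(years.size)   # ocean-layer temperature anomaly, K
for i in range(1, years.size):
    net_flux = forcing[i] - LAMBDA * temperature[i - 1]
    temperature[i] = temperature[i - 1] + net_flux * SECONDS_PER_YEAR / HEAT_CAPACITY

print(f"Toy-model ocean-layer warming over the simulation: {temperature[-1]:.2f} K")
print(f"Toy-model equilibrium sensitivity: {F_2X / LAMBDA:.2f} C per CO2 doubling")
```

In a model of this type, the equilibrium sensitivity is simply the forcing for doubled CO2 divided by the feedback parameter, which is why fitting the feedback parameter against ocean heat observations yields a sensitivity estimate.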

What they found was that the complex configuration involving El Niño/La Niña feedbacks onto cloud properties produced the best match to the observations. That configuration also produced the lowest estimate of the earth’s climate sensitivity to carbon dioxide emissions—a value of 1.3°C.

Spencer and Braswell freely admit that using their simple model is just the first step in a complicated diagnosis, but they also point out that results from simple models provide insight that should help guide the development of more complex models, and ultimately could help unravel some of the mystery as to why full climate models produce high estimates of the earth’s equilibrium climate sensitivity while estimates based on real-world observations are much lower.

Our Figure below helps to illustrate the discrepancy between climate model estimates and real-world estimates of the earth’s equilibrium climate sensitivity. It shows Loehle’s determination as well as that of Spencer and Braswell, along with 16 other estimates reported in the scientific literature beginning in 2011. Also included in our Figure are both the IPCC’s latest assessment of the literature and the characteristics of the equilibrium climate sensitivity from the collection of climate models on which the IPCC bases its impacts assessment.

Figure 1. Climate sensitivity estimates from new research beginning in 2011 (colored), compared with the assessed range given in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) and the collection of climate models used in the IPCC AR5. The “likely” (greater than a 66% likelihood of occurrence) range in the IPCC Assessment is indicated by the gray bar. The arrows indicate the 5 to 95 percent confidence bounds for each estimate along with the best estimate (median of each probability density function, or the mean of multiple estimates; colored vertical line). Ring et al. (2012) present four estimates of the climate sensitivity, and the red box encompasses those estimates. The right-hand side of the IPCC AR5 range is actually the 90% upper bound (the IPCC does not state the value for the upper 95 percent confidence bound of its estimate). Spencer and Braswell (2013) produce a single ECS value best-matched to ocean heat content observations and internal radiative forcing.

Quite obviously, the IPCC is rapidly losing its credibility.

As a result, the Obama Administration would do better to come to grips with this fact and stop deferring to IPCC findings when trying to justify increasingly burdensome federal regulation of carbon dioxide emissions, regulation that manipulates markets and restricts energy choices.

References:

Loehle, C., 2014. A minimal model for estimating climate sensitivity. Ecological Modelling, 276, 80-84.

Spencer, R.W., and W.D. Braswell, 2013. The role of ENSO in global ocean temperature changes during 1955-2011 simulated with a 1D climate model. Asia-Pacific Journal of Atmospheric Sciences, doi:10.1007/s13143-014-0011-z.

Folly of Federal Flood Insurance

Subsidized flood insurance is one of the many federal programs that run counter to both sound economic policy and sound environmental policy. Congress created the National Flood Insurance Program (NFIP) in 1968 to help homeowners in flood-prone areas purchase insurance. The FEMA-run program covers floods from river surges and storms on the seacoasts.

In recent years, the NFIP has gone deeply into debt, and it may need to be bailed out by taxpayers at some point. The program has encouraged people to build homes in areas that are too hazardous to safely occupy. It has encouraged towns to expand development in flood-prone areas. And the program undermines constitutional federalism by prompting the federal government to reach its regulatory tentacles into local zoning issues.

The NFIP subsidizes wealthy people with multiple payouts after their homes on the seacoasts are repeatedly destroyed. The program is very bad policy—a seemingly good idea to policymakers in the 1960s that has ended up creating growing distortions.

When I started reading about the NFIP recently, I was surprised to learn that Congress made sensible reforms to it in 2012 under the Biggert-Waters Act. The best reform would be a complete repeal of the NFIP, but in the meantime the 2012 law was a good start at reducing the program’s costs and distortions.

Alas, the prospect of Congress staying on a pro-market, pro-environment reform path was apparently too good to be true. No sooner had the ink dried on the 2012 law than members of Congress began trying to reverse the reforms.

This week, Congress will be voting on a bill that backtracks on the 2012 reforms. I have not studied the details of the new bill, but Diane Katz at the Heritage Foundation has penned a nice overview.

A Tough Day in Court for the EPA’s Greenhouse Gas Regulations

The Obama Administration appeared prepared to abandon a major portion of its initial greenhouse gas regulatory scheme in oral argument before the Supreme Court today. Solicitor General Donald Verrilli, defending a series of EPA rules, sought to preserve regulations reaching large industrial sources by offering up a more aggressive gambit by the agency that could potentially reach millions of smaller businesses, apartment buildings, and schools.

The problem, as EPA itself has conceded, is that EPA’s regulatory approach renders the Clean Air Act’s Prevention of Significant Deterioration (PSD) program “unrecognizable” to the Congress that enacted it. That’s because GHGs are emitted in far greater quantities than traditional pollutants, and PSD requirements are based on the quantities of emissions, with facilities emitting more than either 100 or 250 tons per year of any applicable pollutant being subject to an expensive pollution-control regime. For GHGs, those tonnage triggers would transform the PSD program from one aimed at only the nation’s largest sources of emissions into one reaching millions of smaller sources. For that reason, after deciding to use PSD to regulate GHGs, EPA issued a “tailoring rule” to avoid that absurd result by discarding the numerical thresholds specified in the law and adopting new ones thousands of times larger.

That decision was under heavy scrutiny at oral argument. Businesses challenging the rule, represented by Peter Keisler, argued that the PSD program is structured to address local air quality concerns and therefore does not extend to emissions of carbon dioxide. PSD’s triggers, monitoring requirements, requirement for local air-quality analysis, and administration by 90 separate state and local permitting authorities all demonstrate that Congress did not intend the statute to address anything like GHGs, Keisler argued. So while the statute does apply to “any air pollutant,” that term cannot be interpreted to reach pollutants that would cause these other statutory requirements to fail.

Another $6.5 Billion in DOE Loan Guarantees

After Solyndra collapsed, the Department of Energy (DOE) should have learned its lesson. Guaranteeing loans for energy and industrial companies is a bad idea. The failures of Beacon Power and Fisker Automotive should have driven home the message. Now, we have further proof that the DOE isn’t paying attention.

Yesterday, DOE Secretary Ernest Moniz traveled to Georgia to announce $6.5 billion in loan guarantees for two new nuclear reactors already under construction. 

The loan, like so many others, has the markings of an incredibly risky use of taxpayer dollars. According to the Washington Post, the project is already 21 months behind schedule. Additionally, Southern Company, the largest shareholder in the project, had its ratings outlook downgraded from “stable” to “negative” by Standard & Poor’s last year, in part because of “cost overruns” at the Georgia facility.

Even more frustrating, the company already had private loans in place to finance construction. Now we, the taxpayers, will save the company $250 million a year in interest costs by bearing the risk of default.

The company also benefits from $2 billion in other federal tax credits, according to its CEO.

Some deal.

Water in the West: It’s Complicated

In the media, one hears two different stories about the drought in California and Western water problems in general. Liberals say that droughts are being made worse by climate change. Conservatives say that water shortages are being engineered by the EPA in a misguided effort to sacrifice farmers for some tiny fish. Today’s Washington Times editorial is of the latter genre.

The real story is more complicated. It’s not just Mother Nature, and it’s not just farmer vs. fish.

The fundamental problem is that the federal government has been heavily subsidizing Western water for decades, particularly for crop irrigation. Artificially low water prices have encouraged overconsumption and the cultivation of very dry areas where farming is inefficient and environmentally unsound. Subsidized irrigation farming has created major environmental problems in the San Joaquin Valley, for example.

To make matters worse, federal farm subsidies have boosted demand for irrigation water, which has further encouraged farmers to bring marginal lands into production.

So don’t blame the Delta smelt. Instead, blame antimarket policies going back eight decades in the case of farm subsidies and a century in the case of subsidized water from the federal Bureau of Reclamation.

The long-term solution to the West’s growing water problems is free-market economics. Policymakers should end the farm subsidies, reform water property rights, transfer federal dams and aqueducts to state ownership, and move toward market pricing of water.

For more, see my essay with Peter Hill and check out the great work from the free-market environmentalists at PERC.

Fuel Efficiency Standards for New Trucks—Can’t We Decide These for Ourselves?

Rather than wait for the market to demand more fuel-efficient trucks, President Obama, bypassing Congress, has directed the Environmental Protection Agency to draw up a new round of regulations raising the fuel efficiency standards for heavy-duty trucks. He promises that this will save billions of dollars in fuel costs, lower prices, and reduce greenhouse gas emissions—or, as he describes it, a “win-win-win” situation.

Thank you, Mr. President, for taking such good care of us.

Apparently, we are too stupid to have realized the manifold benefits of this chain of events ourselves.

Or is it that we realize these actions will have no impact on climate change and will probably result in higher prices for new trucks and everything that they transport?

You can use our “Handy-Dandy Temperature Change Calculator” to see that, using the EPA’s own computer model, if Americans cut all of our carbon dioxide emissions to zero today, the amount of warming that would be prevented by 2100 (assuming a warming forecast that is itself probably too high) is around two-tenths of a degree—an amount that would be virtually impossible to measure against natural climate variability. Increasing the fuel efficiency of heavy trucks would have considerably less of an effect than cutting all carbon dioxide emissions and would simply not be discernible in climate data.
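As a rough back-of-the-envelope check on that two-tenths-of-a-degree figure, one can scale assumed U.S. emissions by a mid-range transient climate response to cumulative emissions (TCRE). The emissions rate and TCRE value below are round-number assumptions, not output from the EPA model or from our calculator:

```python
# Rough consistency check on the "two-tenths of a degree" figure using the
# transient climate response to cumulative emissions (TCRE). Round-number
# assumptions only; not the EPA model's or the calculator's actual inputs.

US_EMISSIONS_GT_CO2_PER_YEAR = 5.0   # assumed, roughly mid-2010s U.S. total
YEARS_TO_2100 = 86                   # roughly 2014 through 2100
TCRE_DEG_C_PER_1000_GT_CO2 = 0.45    # assumed mid-range value

avoided_emissions = US_EMISSIONS_GT_CO2_PER_YEAR * YEARS_TO_2100   # Gt CO2
avoided_warming = avoided_emissions / 1000.0 * TCRE_DEG_C_PER_1000_GT_CO2

print(f"Avoided warming by 2100 if U.S. emissions went to zero: ~{avoided_warming:.2f} C")
```

With those assumptions, the result lands near 0.2°C, consistent with the figure quoted above.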

And the President’s claim that increasing fuel efficiency will lower the price of everything neglects the fact that we simply do not know what technology would accomplish this end.

Perhaps he should have said—“if you like your truck, you can keep your truck,” that is, until you have to replace it with something that will cost much more than what you would otherwise have purchased and will not do what it is supposed to do.

Again, thank you, Mr. President.

Venezuela’s Plunging Petroleum Production

A hallmark of socialism and interventionism is failure. Venezuela is compelling proof of this, having spent the past half century going down the tubes. Indeed, in the 1950s, it was one of Latin America’s most well-off countries. No more. Now it is a basket case - a failed state that’s descending into chaos.

How could this be? After all, Venezuela’s combined reserves of oil and gas are second only to Iran’s. Well, it might have reserves, but thanks to the wrongheaded policies of President Hugo Chavez, Venezuela is the only major energy producer that has seen its production fall over the past quarter of a century. The following chart tells that dismal tale: