Topic: Energy and Environment

Say What!?

While the social cost of carbon (SCC) is still being mulled over by the Office of Management and Budget, other federal agencies continue to push ahead with using the SCC to help justify their many regulations.

The way this works is that for every ton of carbon dioxide (CO2) that a new regulation is supposed to keep out of the atmosphere, the proposing agency may claim a credit of about $32 to offset the costs that the regulation will generate. This way, new regulations seem less costly—an attractive quality when trying to gain acceptance.

The idea is that the damage resulting from future climate changes will be decreased by $32 for every ton of carbon dioxide that is not emitted.
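To make the bookkeeping concrete, here is a minimal sketch of that arithmetic. The compliance cost and avoided tonnage are hypothetical numbers invented purely for illustration; only the roughly $32-per-ton figure comes from the discussion above.

```python
# Minimal sketch of how an SCC credit flatters a regulation's cost/benefit
# arithmetic. The compliance cost and avoided tonnage are hypothetical;
# only the ~$32/ton SCC value comes from the discussion above.

SCC_PER_TON = 32.0  # assumed social cost of carbon, dollars per ton of CO2

def net_regulatory_cost(compliance_cost: float, tons_co2_avoided: float) -> float:
    """Compliance cost minus the monetized 'climate benefit' of avoided CO2."""
    return compliance_cost - SCC_PER_TON * tons_co2_avoided

# Hypothetical rule: $500 million in compliance costs, 10 million tons avoided.
print(f"${net_regulatory_cost(500e6, 10e6):,.0f}")  # $180,000,000 rather than $500,000,000
```

The regulation costs exactly what it costs either way; the SCC credit simply makes the ledger look better.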

There is so much wrong with the way the government arrives at this number that we have argued that the SCC should be tossed out and barred from use in all federal rulemaking. It is far better not to include any value for the SCC in cost/benefit analyses than to include one that is knowingly improper, inaccurate, and misleading.

Further, that federal regulations limiting carbon dioxide emissions will have any detectable impact on future climate change is highly debatable. To see for yourself, try out our global warming calculator, which lets you select the magnitude of future carbon dioxide emissions reductions as well as which countries participate in your plan. The best the U.S. can do—even if it were to halt all CO2 emissions now and forever—is to knock about 0.1°C off the total climate model-projected global temperature rise by the year 2100. In other words, U.S. actions are not very effective in limiting future climate change.
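For a rough feel for why the numbers are so small, here is a back-of-the-envelope sketch. This is not the calculator itself; it uses the simpler shortcut that warming scales roughly with cumulative CO2 emissions, and both parameter values are round assumptions for illustration only.

```python
# Back-of-the-envelope sketch of the warming avoided by U.S. emissions cuts.
# This is NOT the calculator discussed above; it uses the simple shortcut
# that warming scales roughly with cumulative CO2 emissions (the "TCRE").
# Both parameter values are round, assumed numbers for illustration.

US_EMISSIONS_GT_CO2_PER_YEAR = 5.0  # assumed annual U.S. CO2 emissions (Gt)
TCRE_DEG_C_PER_1000_GT_CO2 = 0.45   # assumed warming per 1000 Gt CO2 emitted

def avoided_warming_deg_c(cut_fraction: float, years: float) -> float:
    """Warming avoided by cutting U.S. emissions by cut_fraction for `years` years."""
    cumulative_cut_gt = US_EMISSIONS_GT_CO2_PER_YEAR * cut_fraction * years
    return cumulative_cut_gt * TCRE_DEG_C_PER_1000_GT_CO2 / 1000.0

# Even a total, permanent halt through 2100 yields only tenths of a degree:
print(f"{avoided_warming_deg_c(1.0, 86):.2f} deg C")  # ~0.19 with these crude assumptions
```

Even this crude shortcut lands in the same ballpark as the full calculation: tenths of a degree at most.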

Apparently, the feds, too, agree that their plethora of proposed regulations will have little impact on carbon dioxide emissions and future climate change. But that doesn’t stop them from issuing them.

The passage below is from the Department of Energy's proposed rulemaking to alter the Energy Conservation Standards for Commercial and Industrial Electric Motors (this is only one of many proposed regulations making this claim):

The purpose of the SCC estimates presented here is to allow agencies to incorporate the monetized social benefits of reducing CO2 emissions into cost-benefit analyses of regulatory actions that have small, or “marginal,” impacts on cumulative global emissions.

In other words, DoE’s regulations won’t have any real impact on global CO2 emissions (and hence on climate change), but the agency will nevertheless take a monetary credit for the reduced damages that supposedly will result from these ineffective regulations.

(I wonder if I can try that on my taxes.)

It seems a bit, uh, cheeky to take credit for something that you admit won’t happen.

But that’s the logic of the federal government for you!

VMT Fees Yes — V2V No

The National Highway Traffic Safety Administration (NHTSA) says it wants to require auto makers to include vehicle-to-vehicle (V2V) communications systems in all new cars. Calling V2V “the next generation of auto safety improvements,” the agency says such devices would “improve safety by allowing vehicles to ‘talk’ to each other and ultimately avoid many crashes altogether by exchanging basic safety data, such as speed and position, ten times per second.”

The government wants every vehicle on the road to transmit its location to every other nearby vehicle—as well as to any other receivers that happen to be in range.

Supposedly, “the system as contemplated contains several layers of security and privacy protection.” However, privacy advocates should be far more suspicious of V2V than of electronic vehicle-mile fee systems. The big difference between them is that V2V by definition incorporates both a receiver and a transmitter, while it is possible to design vehicle-mile fee systems that do not include wireless transmitters. No transmitter means no invasion of privacy is possible; on the other hand, despite whatever privacy protection is included in V2V, a transmitter necessarily allows someone to receive the signal.
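To see why the transmitter is the crux, consider what a V2V unit must do by design. The sketch below is purely illustrative; the message fields and function names are hypothetical stand-ins, not the actual V2V message format.

```python
# Illustrative sketch of why V2V is privacy-sensitive by construction: the
# safety data must be broadcast in the clear, ten times per second, to
# anyone in radio range. All names here are hypothetical stand-ins, not
# the actual V2V message format.

import time
from dataclasses import dataclass

@dataclass
class SafetyBroadcast:
    latitude: float     # position, readable by ANY receiver in range
    longitude: float
    speed_mps: float    # speed, meters per second
    heading_deg: float  # direction of travel

def transmit(msg: SafetyBroadcast) -> None:
    # Stand-in for the radio: a real transmitter cannot limit who listens.
    print(msg)

def broadcast_loop(get_vehicle_state) -> None:
    """Send the vehicle's state ten times per second, as V2V contemplates."""
    while True:
        transmit(get_vehicle_state())
        time.sleep(0.1)
```

A vehicle-mile fee meter, by contrast, can be built around the same state data with no radio at all: the miles are tallied on the device and read out only when the bill is settled.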

Perhaps the biggest argument against V2V is that it will soon be obsolete as a safety device, so mandating that it be included in cars adds an unnecessary expense for auto buyers. According to the NHTSA, V2V will “provide warnings to drivers so that they can prevent imminent collisions” but “not automatically operate any vehicle systems, such as braking or steering.” Yet many cars on the market today, such as the Ford Fusion, do this and more solely with built-in radar or other sensors rather than V2V transmitters. Moreover, the occupants of such cars are safer even if no other car on the road has those sensors, which isn’t true of a V2V system.

The Ford Fusion is a mid-priced car that has numerous built-in radar sensors that can detect and warn drivers of potential collisions, even braking if necessary to avoid accidents–all without V2V transmissions.

Further, as contemplated by the NHTSA, V2V will not be mandated in cars before 2018 at the earliest. Yet the kind of self-driving cars that Nissan and other companies expect to have on the market by 2020 will use radar, infrared, lasers, or other means to detect all other vehicles on the road without transmitting any signals themselves. They would get no benefit from a wireless V2V system.

If systems that are already being included in more and more new cars work as well as, if not better than, V2V, then why have V2V at all? It is worth noting that self-driving cars are coming from the private sector, while the National Highway Traffic Safety Administration has expressed a go-slow attitude. Meanwhile, the push to mandate V2V comes from government agencies, both here and in Europe. I suspect governments are more interested in technologies that centralize transportation and communications, while private manufacturers are supporting technologies that promote decentralization.

In any case, it will be interesting to see if privacy groups protest this plan as loudly as they protest proposals for vehicle-mile fees. Those who don’t may be using privacy concerns to cover their reluctance to pay the full cost of the roads they use. But where VMT fees are an important step toward using markets, rather than politics, to manage transportation systems, V2V is both a potential invasion of privacy and a waste of money.

Keystone XL Pipeline Given High Marks in State Department’s Final Environmental Impact Statement

Recall this passage from President Obama’s Georgetown speech last summer announcing his Climate Action Plan:

Now, I know there’s been, for example, a lot of controversy surrounding the proposal to build a pipeline, the Keystone pipeline, that would carry oil from Canadian tar sands down to refineries in the Gulf. And the State Department is going through the final stages of evaluating the proposal. That’s how it’s always been done. But I do want to be clear:  Allowing the Keystone pipeline to be built requires a finding that doing so would be in our nation’s interest. And our national interest will be served only if this project does not significantly exacerbate the problem of carbon pollution. The net effects of the pipeline’s impact on our climate will be absolutely critical to determining whether this project is allowed to go forward. It’s relevant.

This basically should have green-lighted the pipeline, because, as I pointed out in congressional testimony last year, regardless of how you figure the carbon dioxide emissions from the pipeline’s oil, the resulting climate impact will be so small as to assuredly put the president’s mind at ease.

The just-released Final Environmental Impact Statement from the State Department reaches about the same conclusion as the department’s Draft Environmental Impact Statement, which is in complete agreement with my findings regarding carbon dioxide emissions from the pipeline’s oil and climate change. The net global warming impact of the pipeline’s oil amounts to somewhat less than 1/100th of a degree Celsius over the next 100 years.
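For a rough sense of where a number that small comes from, here is a sketch of the arithmetic. All of the parameter values are round assumptions for illustration; they are not the State Department's inputs or those from my testimony.

```python
# Rough sketch of why the pipeline's climate impact is tiny. All values are
# round, assumed numbers for illustration -- not the State Department's
# inputs -- and every barrel is (generously) counted as new emissions.

BARRELS_PER_DAY = 830_000          # assumed pipeline throughput
TONS_CO2_PER_BARREL = 0.5          # assumed full life-cycle emissions per barrel
TCRE_DEG_C_PER_1000_GT_CO2 = 0.45  # assumed warming per 1000 Gt of CO2

annual_gt = BARRELS_PER_DAY * 365 * TONS_CO2_PER_BARREL / 1e9  # ~0.15 Gt/yr
warming = annual_gt * 100 * TCRE_DEG_C_PER_1000_GT_CO2 / 1000  # over 100 years

print(f"{annual_gt:.2f} Gt CO2/yr -> {warming:.4f} deg C over a century")
# ~0.0068 deg C: somewhat less than 1/100th of a degree, as noted above.
```

Note that even counting every barrel as emissions that would not otherwise occur—the assumption most favorable to the pipeline’s critics—the result still rounds to nothing.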

So if the president wants to kill the Keystone XL pipeline (clearly he does, because he has had ample opportunity to approve it), he’ll have to find a reason to do so other than a climate one. Unfortunately for him, trying to kill it for other reasons would be equally ill-founded.

Senate Prepares to Roll Back Flood Insurance Reforms

A funny thing happened in 2012: Congress actually passed a bill that intentionally cut subsidies.  In this case, the subsidies given to homeowners under the National Flood Insurance Program (NFIP).  The Biggert-Waters Act of 2012, if fully implemented, would eliminate almost half of the roughly one billion dollars in estimated annual subsidies under the NFIP.  Now, before your opinion of Congress suddenly improves, it’s important to remember that the subsidy reductions happened only because the NFIP had expired and some responsible members objected to extending the program without reform.  Now that the program is up and running again, beachfront homeowners and their friends in the real estate industry want their subsidies back.

The Senate is currently moving toward that goal.  Not even wanting to bother with the normal process of hearings and a committee vote, Senate Majority Leader Harry Reid has brought S.1926 directly to the floor for a vote, likely to occur this week.  S.1926 would indefinitely delay the premium increases passed in Biggert-Waters, effectively hitting the taxpayer for hundreds of millions of dollars annually.  But hey, there’s a close Senate race going on in Louisiana, so regular order can wait.

Now, I have every sympathy for households facing rate increases under the NFIP.  They’ve been getting a subsidy for years and have grown used to it.  Given the sometimes high cost of NFIP coverage, it might not even feel like a subsidy.  But part of that is because almost a third of the premium income is pocketed by the insurance companies (at no risk to them, I might add).  The solution is to let those households either get out of the NFIP altogether or purchase private insurance, which would likely be cheaper given the inefficiencies of the NFIP.  If one feels that maintaining flood coverage is vital for these households, yet they cannot bear the higher rates, another option would be a significantly higher deductible.  Rolling back the premium reforms in Biggert-Waters is simply short-sighted and irresponsible, but then that’s nothing new for Washington.
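To put rough numbers on that structure, here is a hypothetical illustration. None of the dollar figures below are actual NFIP premiums; only the “about a third to the insurers” share comes from the discussion above.

```python
# Hypothetical illustration of the NFIP structure described above. The
# dollar figures are invented; only the "about a third to the insurers"
# share comes from the discussion in the text.

actuarial_premium = 3000.0  # hypothetical risk-based premium for one home
charged_premium = 1800.0    # hypothetical subsidized NFIP premium

implicit_subsidy = actuarial_premium - charged_premium
insurer_cut = charged_premium / 3  # servicing insurers' share, at no flood
                                   # risk to themselves

print(f"Subsidy: ${implicit_subsidy:,.0f}/yr; insurer cut: ${insurer_cut:,.0f}/yr")
```

The household sees a “high” premium while the taxpayer still eats the gap between what is charged and what the risk actually costs.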

Free America’s Energy Future: Drop Washington’s Misguided Export Ban

For years people have been told to expect a dismal energy future.  But because of rapid market innovation, Americans can now look forward to an abundant energy future.  The U.S. could even become a leading exporter—if Washington gets out of the way.

An energy revolution currently is underway, with increasing supplies and falling prices.  Even more could be done if Washington expanded access to federal lands and waters and freed producers to make best use of what they extract.

Arbitrary restrictions bedevil energy exports.  For instance, natural gas licenses are granted automatically for nations with free trade agreements—in this case Canada and Mexico—but otherwise the review process is lengthy and approval is rare.  Last year Energy Secretary Ernest Moniz announced that he was delaying decisions on a score of applications for political reasons even though the department had already concluded that such exports would benefit the U.S. economy. 

The ban on oil exports is even tougher, with only small amounts being shipped to Canada.  Few licenses have been issued under the law’s “national interest” exception, and none since 2000.

As I point out in my latest Forbes online column:

Forbidding petroleum exports does not make additional oil available to Americans.  Rather, the ban prevents energy companies from saving money.  For instance, it would be cheaper to sell Alaskan crude to Asia and purchase more oil from Latin America.

Closing the Books on 2013: Another Year, Another Nail in the Coffin of Disastrous Global Warming

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

A few weeks have now passed since the end of last year, giving the various data-compiling (and “data-adjusting”) agencies enough time to get their numbers in order and release the sad figures from 2013.

U.S. Annual Average Temperature

We pointed out, back in this post in mid-December, that there was an outside chance—if December were cold enough—that the average annual temperature for the U.S. in 2013 would fall below the 20th century average for the first time since 1996.  Well, despite how cold it seemed in December, it was not quite cold enough to push the January-December 2013 temperature anomaly into negative territory. Figure 1 below shows the U.S. temperature history as compiled by the National Climatic Data Center from 1895 through 2013.

Figure 1. U.S. annual average temperature as compiled by the National Climatic Data Center, 1895-2013 (data: NCDC Climate at a Glance).

Please be advised that this history has been repeatedly “revised,” either to make temperatures colder in the earlier years or warmer at the end.  Not one “adjustment” has had the opposite effect, a clear contravention of logic and probability.  While the U.S. has gotten slightly warmer in recent decades compared to the early 20th century, so have the adjustments to the data themselves.  It’s a fact that if you just take the thousands of fairly evenly spaced “official” weather stations around the country and average them up since 1895, you won’t get much of a warming trend at all.  Consequently, a major and ongoing federal effort has been to try to cram these numbers into the box imposed by the theory that gives the government the most power—i.e., strong global warming.

What immediately stands out in 2013 is how exceptional the average temperature in 2012 (the warmest year in the record) really was. In fact, the drop from the lofty heights of 2012 to the 2013 value was the largest year-over-year temperature decline in the complete 119-year record—an indication that 2012 was an outlier rather than “the new normal.”
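The year-over-year claim is easy to check against NCDC’s Climate at a Glance annual series. A minimal sketch, assuming you have exported that series to a CSV with “year” and “anomaly” columns (the file name and column names are assumptions about your export, not a documented format):

```python
# Minimal sketch for checking the year-over-year claim against the NCDC
# Climate at a Glance annual U.S. series. Assumes the series was exported
# to a CSV with "year" and "anomaly" columns (names are assumptions).

import pandas as pd

df = pd.read_csv("us_annual_temps.csv").sort_values("year").reset_index(drop=True)
df["yoy"] = df["anomaly"].diff()  # change from the previous year

drop = df.loc[df["yoy"].idxmin()]  # most negative one-year change
print(f"Largest one-year decline: {drop['yoy']:.2f} degrees, into {int(drop['year'])}")
```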

Hot Air About Cold Air

Last summer, we predicted that come this winter, any type of severe weather event was going to be linked to pernicious industrial activity (via global warming) through a new mechanism that had become a media darling—the loss of late summer/early fall Arctic sea ice leading to more persistent patterns in the jet stream. These are known as “blocking” patterns, which generally means that the same type of weather (usually somewhat extremish) hangs around longer than usual.

This global-warming-leading-to-more-extreme-winter-weather mechanism has been presented in several recent papers, perhaps the most noteworthy of which was a 2012 publication by Jennifer Francis and Stephen Vavrus, which was the subject of one of our blog posts last summer. We noted then how their idea ran counter to much of the extant literature on the topic, as well as to a host of other newly published papers investigating historical jet stream patterns.

After running through a list of observations compiled from the scientific literature countering the Francis and Vavrus explanation of things, we nevertheless wondered:

It’ll be interesting to see during this upcoming winter season how often the press—which seems intent on seeking to relate all bad weather events to anthropogenic global warming—turns to the Francis and Vavrus explanation of winter weather events, and whether or not the growing body of new and conflicting science is ever brought up.

We didn’t have to wait long. After a couple of early winter southward Arctic air excursions, the familiar and benign-sounding “jet stream” had become the “polar vortex”[1] which “sucked in” the United States. Of course, the U.S. being sucked into a polar vortex was part and parcel of what was to be expected from global warming.

Since we had predicted this action/reaction, we weren’t terribly surprised.

What did surprise us (although perhaps it shouldn’t have) is that the White House joined in the polar vortex horror show and released a video in which John Holdren, the President’s Science Advisor—arguably the highest ranking “scientist” in the U.S.—linked the frigid air to global warming:

In the video, Holdren boldly stated:

 …a growing body of evidence suggests that kind of extreme cold being experienced by much of the United States as we speak is a pattern that we can expect to see with increasing frequency as global warming continues…

It seems that Holdren keeps up with neither our writings at Cato nor the scientific literature on the topic.

While perhaps it could be argued that Holdren’s statement is not an outright lie, it is, at its very best, a half-truth, and even that is a stretch. In fact, there is a larger and faster-growing body of evidence that directly disputes Holdren’s contention.

In addition to the evidence that we reported on here and here, a couple of brand new papers just hit the scientific journals this month that emphatically reject the hypothesis that global warming is leading to more blocking patterns in the jet stream (and accompanying severe weather outbreaks across the U.S.).

The first paper is a modeling paper by a team of U.K. scientists led by Giacomo Masato from the University of Reading. Masato and his colleagues looked at how the magnitude and frequency of atmospheric blocking events in the Atlantic-Europe region are projected to change in the future according to four climate models which, the authors claim, match the observed characteristics of blocking events in this region pretty well. What they found was completely contradictory to Holdren’s claim. While the researchers did note a small model-projected future increase in the frequency of blocking patterns over the Atlantic (the ones which impact the weather in the U.S.), they found that both the strength of the blocking events and the associated surface temperature anomalies over the continental U.S. were considerably moderated. In other words, global warming is expected to make “polar vortex”-associated cold outbreaks less cold.

The second paper is by a research team led by Colorado State University’s Elizabeth Barnes. In their paper “Exploring recent trends in Northern Hemisphere blocking,” Barnes and colleagues used various meteorological definitions of “blocking” along with various datasets of atmospheric conditions to assess whether or not there have been any trends in the frequency of blocking events that could be tied to changes in global warming and/or the declines in Arctic sea ice.
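For readers wondering what a meteorological definition of “blocking” looks like in practice: most are variations on detecting a persistent reversal of the usual poleward decrease in 500 hPa geopotential height. Below is a minimal sketch of one index in that family (loosely in the spirit of the classic Tibaldi-Molteni criterion); the latitude choices and thresholds are illustrative assumptions, not necessarily the definitions Barnes and colleagues used.

```python
# Illustrative one-dimensional blocking index, loosely in the spirit of the
# classic Tibaldi-Molteni criterion: a longitude is flagged as "blocked"
# when the usual poleward decrease in 500 hPa geopotential height reverses
# in mid-latitudes. Latitudes and the -10 m/deg threshold are illustrative.

import numpy as np

def blocked_longitudes(z500, lats, lat_s=40.0, lat_0=60.0, lat_n=80.0):
    """Boolean array over longitude: True where the flow looks blocked.

    z500: 2-D array of 500 hPa geopotential height in meters, [lat, lon].
    lats: 1-D array of the latitudes corresponding to z500's rows.
    """
    def z_at(lat):
        return z500[np.argmin(np.abs(lats - lat)), :]

    ghgs = (z_at(lat_0) - z_at(lat_s)) / (lat_0 - lat_s)  # southern gradient
    ghgn = (z_at(lat_n) - z_at(lat_0)) / (lat_n - lat_0)  # northern gradient

    # Blocked: reversed (positive) southern gradient plus a strongly
    # negative northern gradient.
    return (ghgs > 0.0) & (ghgn < -10.0)
```

Count how often each longitude is flagged over a season and you have a blocking-frequency record whose trends can then be tested, which is essentially what the study did with several such definitions.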

Barnes and colleagues found no such associations.

From their conclusions:

[T]he link between recent Arctic warming and increased Northern Hemisphere blocking is currently not supported by observations. While Arctic sea ice experienced unprecedented losses in recent years, blocking frequencies in these years do not appear exceptional, falling well within their historically observed range. The large variability of blocking occurrence, on both inter-annual and decadal time scales, underscores the difficulty in separating any potentially forced response from natural variability.

In other words, natural variability dominates the observed record, making it impossible to detect any human-caused global warming signal even if one were to exist (and there is no proof that one does).

So, the most recent science shows (1) no observed relationship between global warming and severe winter weather outbreaks, and (2) future “polar vortex”-associated cold outbreaks that are projected to moderate—yet the White House prepares a special video proclaiming the opposite, with the intent to spread climate alarm.

Full scientific disclosure in matters pertaining to global warming is not a characteristic that we have come to expect from this Administration.

References:

Barnes, E., et al., 2014. Exploring recent trends in Northern Hemisphere blocking. Geophysical Research Letters, doi:10.1002/2013GL058745.

Francis, J. A., and S. J. Vavrus, 2012. Evidence linking Arctic amplification to extreme weather in mid-latitudes. Geophysical Research Letters, 39, doi:10.1029/2012GL051000.

Masato, G., T. Woollings, and B.J. Hoskins, 2014. Structure and impact of atmospheric blocking over the Euro-Atlantic region in present day and future simulations. Geophysical Research Letters, doi:10.1002/2013GL058570.


[1] For what it’s worth, there have been two polar vortices (one around each pole) on planet Earth for as long as it has had an atmosphere and maintained its rotation.