Topic: Energy and Environment

Hot Air About Cold Air

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

Last summer, we predicted that come this winter, any type of severe weather event was going to be linked to pernicious industrial activity (via global warming) through a new mechanism that had become a media darling—the loss of late summer/early fall Arctic sea ice leading to more persistent patterns in the jet stream. These are known as “blocking” patterns, which generally means that the same type of weather (usually somewhat extremish) hangs around longer than usual.

This global-warming-leading-to-more-extreme-winter-weather mechanism has been presented in several recent papers, perhaps the most noteworthy of which was a 2012 publication by Jennifer Francis and Stephen Vavrus, which was the subject of one of our blog posts last summer. We noted then how their idea ran counter to much of the extant literature on the topic, as well as to a host of other newly published papers investigating historical jet stream patterns.

After running through a list of observations compiled from the scientific literature countering the Francis and Vavrus explanation of things, we nevertheless wondered:

It’ll be interesting to see during this upcoming winter season how often the press—which seems intent on seeking to relate all bad weather events to anthropogenic global warming—turns to the Francis and Vavrus explanation of winter weather events, and whether or not the growing body of new and conflicting science is ever brought up.

We didn’t have to wait long. After a couple of early winter southward Arctic air excursions, the familiar and benign-sounding “jet stream” had become the “polar vortex”[1] which “sucked in” the United States. Of course, the U.S. being sucked into a polar vortex was part and parcel of what was to be expected from global warming.

Since we had predicted this action/reaction, we weren’t terribly surprised.

What did surprise us (although perhaps it shouldn't have) is that the White House joined in the polar vortex horror show and released a video in which John Holdren, the President's Science Advisor and arguably the highest-ranking "scientist" in the U.S., linked the frigid air to global warming:

In the video, Holdren boldly stated:

 …a growing body of evidence suggests that kind of extreme cold being experienced by much of the United States as we speak is a pattern that we can expect to see with increasing frequency as global warming continues…

It seems that Holdren keeps up with neither our writings at Cato nor the scientific literature on the topic.

While perhaps it could be argued that Holdren's statement is not an outright lie, it is, at its very best, a half-truth, and even that is a stretch. In fact, there is a larger and faster-growing body of evidence that directly disputes Holdren's contention.

In addition to the evidence that we reported on here and here, a couple of brand new papers just hit the scientific journals this month that emphatically reject the hypothesis that global warming is leading to more blocking patterns in the jet stream (and accompanying severe weather outbreaks across the U.S.).

The first paper is a modeling study by a team of U.K. scientists led by Giacomo Masato of the University of Reading. Masato and his colleagues looked at how the magnitude and frequency of atmospheric blocking events in the Atlantic-Europe region are projected to change in the future according to four climate models which, the authors claim, match the observed characteristics of blocking events in this region pretty well. What they found was completely contradictory to Holdren's claim. While the researchers did note a small model-projected future increase in the frequency of blocking patterns over the Atlantic (the ones which impact the weather in the U.S.), they found that both the strength of the blocking events and the associated surface temperature anomalies over the continental U.S. were considerably moderated. In other words, global warming was expected to make "polar vortex"-associated cold outbreaks less cold.

The second paper is by a research team led by Colorado State University’s Elizabeth Barnes. In their paper “Exploring recent trends in Northern Hemisphere blocking,” Barnes and colleagues used various meteorological definitions of “blocking” along with various datasets of atmospheric conditions to assess whether or not there have been any trends in the frequency of blocking events that could be tied to changes in global warming and/or the declines in Arctic sea ice.

They found no such associations.

From their conclusions:

[T]he link between recent Arctic warming and increased Northern Hemisphere blocking is currently not supported by observations. While Arctic sea ice experienced unprecedented losses in recent years, blocking frequencies in these years do not appear exceptional, falling well within their historically observed range. The large variability of blocking occurrence, on both inter-annual and decadal time scales, underscores the difficulty in separating any potentially forced response from natural variability.

In other words, natural variability dominates the observed record, making it impossible to detect any human-caused global warming signal even if one were to exist (and there is no proof that one does).

So, the most recent science shows 1) no observed relationship between global warming and severe winter weather outbreaks, and 2) that future "polar vortex"-associated cold outbreaks are projected to moderate. Yet the White House prepares a special video proclaiming the opposite, with the intent to spread climate alarm.

Full scientific disclosure in matters pertaining to global warming is not something we have come to expect from this Administration.

References:

Barnes, E., et al., 2014. Exploring recent trends in Northern Hemisphere blocking. Geophysical Research Letters, doi:10.1002/2013GL058745.

Francis, J. A., and S. J. Vavrus, 2012. Evidence linking Arctic amplification to extreme weather in mid-latitudes. Geophysical Research Letters, 39, doi:10.1029/2012GL051000.

Masato, G., T. Woollings, and B. J. Hoskins, 2014. Structure and impact of atmospheric blocking over the Euro-Atlantic region in present day and future simulations. Geophysical Research Letters, doi:10.1002/2013GL058570.


[1] For what it's worth, there have been two polar vortices (north and south) on planet Earth for as long as it has had an atmosphere and maintained its rotation.

CO2 Regulation News from the Federal Register

The Federal Register has been brimming with announcements of government activities aimed at reducing and regulating the carbon dioxide emissions emanating from the United States.

You may wonder why the government finds the need to pursue such action, since 1) U.S. carbon dioxide emissions have already topped out and have generally been on the decline for the past 7-8 years or so (owing more to technological advances in natural gas extraction and a slow economy than to already-enacted government regulations and subsidies); 2) greenhouse gas emissions from the rest of the world (primarily driven by China) have been skyrocketing over the same period, which lessens any impact that our emissions reductions have; and 3) even in their totality, U.S. carbon dioxide emissions have a negligible influence on local/regional/global climate change (even an immediate and permanent cessation of all our carbon dioxide emissions would likely result in a mitigation of global temperature rise of less than one-quarter of a degree C by the end of the century).
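
To see roughly where a figure like that comes from, here is a back-of-the-envelope sketch. It is our own illustration, not the calculation behind any published estimate; it assumes a transient climate response to cumulative emissions (TCRE) of about 1.5°C per 1,000 GtC (the IPCC's likely range is roughly 0.8-2.5°C) and U.S. emissions held flat at about 1.4 GtC per year:

```python
# Back-of-the-envelope sketch of the "less than a quarter degree" point.
# All inputs are assumptions labeled below, not values from any cited study.
TCRE = 1.5           # assumed warming per cumulative emissions, deg C per 1000 GtC
US_EMISSIONS = 1.4   # assumed U.S. emissions held flat, GtC per year (~5 GtCO2/yr)
YEARS = 2100 - 2014  # years of a hypothetical complete U.S. emissions cessation

avoided_carbon = US_EMISSIONS * YEARS            # GtC never emitted
avoided_warming = TCRE * avoided_carbon / 1000.0 # deg C of warming avoided by 2100

print(f"Cumulative U.S. carbon avoided: {avoided_carbon:.0f} GtC")
print(f"Approximate warming avoided by 2100: {avoided_warming:.2f} deg C")
# Roughly 0.18 deg C under these assumptions, i.e., less than one-quarter of a degree.
```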

We wonder the same thing. Nevertheless, the government has lots of ideas for how to save us from ourselves (likely with the opposite outcome).

Here is a summary of new announcements appearing in the Federal Register over the past month or so on actions aimed at curtailing our carbon dioxide emissions (primarily the result of our desire for cheap and reliable energy—gasp!).

Posted November 26, 2013: The Office of Management and Budget (OMB) announced a call for review of the Technical Support Document currently justifying the Administration’s value of the social cost of carbon (SCC) used in federal cost/benefit analyses.  We have discussed this announcement previously, and while it provides a glimmer of hope for injecting some new science and common sense into the government’s social cost of carbon, we are highly skeptical of a positive outcome. We mention the announcement again here, because the public comment period ends on January 27, 2014.  Comments can be submitted here.

Posted December 6, 2013: The Department of Energy announced another in its seemingly endless string of intrusions into our personal choices through its energy efficiency requirement updates for all sorts of consumer products. These revised efficiency regulations rely on the SCC to offset the costs and inflate the apparent benefits of the new requirements. We have already submitted comments on several of these proposed regulations (from walk-in refrigerators to furnace fans), but they just keep on coming. The latest pertains to commercial and electric motors. Final comments are due February 4, 2014, and can be submitted here.

Posted December 31, 2013: The Department of Energy (DoE) announced that it has declined a Petition for Reconsideration of its rule updating the energy conservation standards for microwave ovens. The Petition for Reconsideration was brought by the Landmark Legal Foundation, which pointed out that the DoE used a social cost of carbon estimate in the cost/benefit analysis for the rule that had not been subject to public comment and which was some 50% higher than the value used in the cost/benefit analysis that was available for public comment. In other words, the DoE pulled a pretty big bait and switch. We at Cato's Center for the Study of Science submitted comments on the Landmark Petition pointing out just how far afield from the actual science the Administration's SCC estimate had become. The denial was disappointing, but the fight over the proper value for the SCC has now moved to the OMB (as described above).

Downsize the Department of Energy

The Department of Energy spends $29 billion per year on various schemes with a disastrous track record, often with bipartisan support. From regulations that destabilize markets, decrease domestic output, and harm consumers, to subsidies that pick and choose winners and losers, this department is a perfect example of a white elephant: an expensive project with little to no useful purpose.

Solyndra is the best example of such waste. The solar panel company received a $535 million loan before filing for bankruptcy in 2011. The federal government will likely recover just $27 million from that loan.

The department can be abolished by transferring security and cleanup-related tasks to the EPA or the Department of Defense and by returning research functions to the private sector. In all, abolishing the Department of Energy would save taxpayers about $7 billion a year. To that end, we've created a short video making these and other points, which you can watch below.

Is Free Trade in Energy Finally on the Horizon?

Over the last few months, the media and the policy world have discovered that America’s archaic crude oil export restrictions are really bad policy. Two new and important developments give this welcome and growing movement even more momentum:

  • In a much-publicized speech yesterday, Sen. Lisa Murkowski (R-AK), ranking member of the Senate Energy Committee, advocated modernizing U.S. export restrictions on energy products, particularly natural gas and crude oil. Accompanying her speech was a new white paper on the same topic, which (i) highlights the serious economic problems caused by the current crude oil export licensing system (which is effectively a ban on exports to all countries except Canada); (ii) confirms the widely held view that oil exports won't cause higher gas prices; and (iii) recommends that the president, the Commerce Department, or, if they continue to do nothing, Congress relax the export ban. Just as importantly, Murkowski's views were recently echoed by Sen. Mary Landrieu (D-LA), who stands to take over the Senate Energy Committee this year. Thus, there could be bipartisan support for easing the U.S. crude oil export ban on the Senate committee arguably most integral to any such reforms.
  • Also, today, the American Petroleum Institute’s president and CEO Jack Gerard reiterated his organization’s support for lifting the crude oil export ban:

Gerard's formal announcement echoes a few previous statements from folks at API (which is the largest U.S. energy trade association and a big player on Capitol Hill) and is a good sign that they're going to push harder on this issue in the future. (API's related blog post, which calls the crude export ban "obsolete," certainly indicates as much.)

These two developments should be welcome news for anyone concerned with free markets, economic growth, and well-functioning energy markets. As I argued in a February 2013 Cato paper (and subsequent podcast), the crude oil export restrictions, like the similar and better-known restrictions on U.S. natural gas exports, raise a host of economic, legal, and policy concerns. These restrictions should be replaced with a simple, transparent, and automatic licensing system for all exports of U.S. energy goods (not just fossil fuels).

‘Worse Than We Thought’ Rears Ugly Head Again

Our last post was a brief run-through of some items of interest from the recent scientific literature that buck the popular alarmist meme that human-caused climate change is always "worse than we thought." But as we said in that post, finding coverage of such results in the dinosaur media is a fool's errand. Instead, that media thrives on "worse than we thought" stories, despite their becoming a detriment to science itself.

Not to disappoint, headlines accompanying the first major climate change story of the new year claim that "Climate change models underestimate likely temperature rise, report shows," and it's clearly Worse Than We Thought. In its Sunday, January 5, edition, the editorial board of the Washington Post points to the new results as a call for action on climate change.

The trumpeted results appear in a paper published in the January 2, 2014, issue of Nature by a team led by University of New South Wales professor Steven Sherwood, which claims that the earth's equilibrium climate sensitivity (how much the global average surface temperature will rise as a result of a doubling of the atmospheric carbon dioxide content) is being underestimated by most climate models. Sherwood's team finds "a most likely climate sensitivity of about 4°C, with a lower limit of about 3°C."

Sherwood’s most likely value of 4°C is about twice the value arrived at by a rather largish collection of other research published during the past 2-3 years and lies very close to the top of the likely range (1.5°C to 4.5°C) given in the new report from the U.N.’s Intergovernmental Panel on Climate Change (IPCC).

While there are a host of reasons why our understanding of the true value of the climate sensitivity is little better constrained now than it was some 20+ years ago (it was given as 1.5°C to 4.5°C in the IPCC's first report, issued almost a quarter-century ago), it is widely recognized that our understanding of the role of clouds in a changing climate is central to the issue.

In describing why climate models have such different climate sensitivity values, the IPCC writes, in the 2013 edition of its science compendium,

There is very high confidence that uncertainties in cloud processes explain much of the spread in modelled climate sensitivity.
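
That spread is easy to see in the standard forcing-feedback relation for equilibrium sensitivity, ECS = F2x / lambda, where F2x (about 3.7 W/m2) is the radiative forcing from a doubling of CO2 and lambda is the net climate feedback parameter. The sketch below uses feedback values we picked purely for illustration (they are not Sherwood's numbers) to show how modest, largely cloud-driven differences in the feedback translate into the model spread in sensitivity:

```python
# Illustrative sketch of the forcing/feedback relation ECS = F2x / lam.
# The feedback values below are chosen by us to span a plausible model range;
# they are not taken from Sherwood et al. or the IPCC.
F2X = 3.7  # approximate radiative forcing from a doubling of CO2, W/m^2

for lam in (1.8, 1.2, 0.9):  # net climate feedback parameter, W/m^2 per deg C
    ecs = F2X / lam
    print(f"feedback = {lam:.1f} W/m^2/C  ->  sensitivity = {ecs:.1f} deg C")
# A change of less than 1 W/m^2/C in the feedback parameter (much of it
# attributable to how clouds are modeled) moves the sensitivity from
# about 2 deg C to about 4 deg C.
```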

Sherwood and colleagues set out to see whether they could help nail down the specific cloud processes involved in the model spread, and whether recent observations could help identify which models were handling processes related to cloud behavior better than others.

Is Warmer Better? Florida Soon to Surpass New York as Nation’s Third Most Populous State

Hmmm. A pounding blizzard hits the Northeast, followed by an Arctic cold blast. All the while, Florida is set to oust New York from third place and join California and Texas among the nation's three most populous states.

Here is the story according to the Associated Press:

So while some folks yammer on about the perils of a warming climate (and try to force regulations upon us aimed at “doing something” about it), a great many others are actively seeking out warmer places to live. Perhaps not entirely for the climate, but that factor is almost assuredly not out of mind.

Maybe the public doesn’t think that its “health” is as “endangered” by a warmer climate as the U.S. Environmental Protection Agency contends.

California Thinks Your Time Is Worthless

California's S.B. 375 mandates that cities increase the population densities of targeted neighborhoods because everyone knows that people drive less at higher densities and that transit-oriented developments relieve congestion. One problem, however, is that transportation models reveal that increased densities actually increase congestion, as measured by "level of service," which expresses traffic as a percentage of a roadway's capacity and which in turn can be used to estimate the hours of delay people suffer.
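
To make "level of service" concrete, here is a minimal sketch using the standard Bureau of Public Roads volume-delay curve; the corridor numbers are hypothetical and chosen only to show how delay grows rapidly as traffic approaches capacity:

```python
# Minimal sketch: how a volume-to-capacity (V/C) ratio, the basis of
# "level of service" grades, translates into delay, using the standard
# Bureau of Public Roads (BPR) curve. The corridor numbers are hypothetical.
FREE_FLOW_MIN = 10.0  # hypothetical free-flow travel time on a corridor, minutes
CAPACITY = 2000       # hypothetical corridor capacity, vehicles per hour

def travel_time(volume_per_hour: float) -> float:
    """BPR curve: travel time rises steeply as volume approaches capacity."""
    vc = volume_per_hour / CAPACITY
    return FREE_FLOW_MIN * (1 + 0.15 * vc ** 4)

for volume in (1200, 1800, 2200):  # e.g., before and after densification
    minutes = travel_time(volume)
    delay = minutes - FREE_FLOW_MIN
    veh_hours = volume * delay / 60  # vehicle-hours of delay per hour of traffic
    print(f"V/C = {volume / CAPACITY:.2f}: {minutes:.1f} min/trip, "
          f"{delay:.1f} min delay, {veh_hours:.0f} vehicle-hours of delay/hour")
```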

The California legislature has come up with a solution: S.B. 743, which exempts cities from having to calculate and disclose levels of service in their environmental impact reports for densification projects. Instead, the law requires planners to come up with alternative measures of the impacts of densification.

On Monday, December 30, the Governor's Office of Planning and Research released a "preliminary evaluation of alternative methods of transportation analysis." The document notes that one problem with trying to measure levels of service is that it is "difficult and expensive to calculate." Well, boo hoo. Life is complicated, and if you want to centrally plan society, you can either deal with difficult and expensive measurement problems, or you will botch things up even worse than if you had dealt with those problems.

The paper also argues that measuring congestion leads people to want projects that might actually relieve congestion, such as increasing roadway capacities. This would be bad, says the paper, because increased capacities might simply “induce” more travel. The fact that such increased travel might actually produce some economic benefits for the state is ignored. Instead, suppressing travel (and therefore suppressing economic productivity) should be the goal.

The document suggests five alternative measures of the impacts of densification on transportation:

  1. Vehicle miles traveled;
  2. Auto trips generated;
  3. Multi-modal level of service;
  4. Auto fuel use; and
  5. Motor vehicle hours traveled.

There are many problems with these alternatives. First, they really aren’t any simpler to reliably calculate than levels of service. Second, they ignore the impact on people’s time and lives: if densification reduces per capita vehicle miles traveled by 1 percent, planners will regard it as a victory even if the other 99 percent of travel is slowed by millions of hours per year. Third, despite the “multi-modal” measure, these measures ignore the environmental impacts of transit. For example, they propose to estimate automotive fuel consumption, but ignore transit energy consumption.
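
A quick numerical illustration of that second point, using round numbers we made up purely for illustration, shows how a small reduction in driving can coexist with an enormous loss of time:

```python
# Hypothetical round numbers, for illustration only.
VMT_BEFORE = 100e6   # regional vehicle miles traveled per day (hypothetical)
SPEED_BEFORE = 30.0  # average speed before densification, mph (hypothetical)

VMT_AFTER = VMT_BEFORE * 0.99  # the planners' "victory": 1 percent less driving
SPEED_AFTER = 28.0             # remaining traffic slowed slightly (hypothetical)

hours_before = VMT_BEFORE / SPEED_BEFORE
hours_after = VMT_AFTER / SPEED_AFTER
extra_hours_per_day = hours_after - hours_before

print(f"Miles of driving avoided per day: {VMT_BEFORE - VMT_AFTER:,.0f}")
print(f"Extra vehicle-hours spent in traffic per day: {extra_hours_per_day:,.0f}")
print(f"Extra vehicle-hours per year: {extra_hours_per_day * 365:,.0f}")
# About 1 million miles "saved" each day, at a cost of roughly 74 million
# additional vehicle-hours per year under these assumed numbers.
```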

Worst of all, the final "measure" proposed by state planners is to simply presume, without making any estimates, that there is no significant transportation impact from densification. After all, if you add one vehicle to a congested highway and traffic bogs down, can you blame that one vehicle, or is everyone else equally to blame? If the latter, then it seems ridiculous, at least to the planners, to blame densification for increased congestion when the existing residents contribute to the congestion as well. By the same token, if an airplane is full and one more person wants to take that flight, then by this logic the airline should punish everyone already on board by simply delaying the plane until someone voluntarily gets off.

The real problem is that planners and planning enthusiasts in the legislature don’t like the results of their own plans, so they simply want to ignore them. What good is an environmental impact report process if the legislature mandates that any impacts it doesn’t like should simply not be evaluated in that process?

All of this is a predictable outcome of attempts to improve people's lives through planning. Planners can't deal with complexity, so they oversimplify. Planners can't deal with letting people make their own decisions, so they try to constrict those decisions. Planners can't imagine that anyone wants to live any way but the way planners think they should live, so they ignore the 80 to 90 percent who drive and want to live in single-family homes as they impose their lifestyle ideologies on as many people as possible. The result is the planning disaster known as California.