Topic: Energy and Environment

What to Do about OPEC?

Cato hosted a policy forum last week (which you can watch in its entirety if you missed it the first time around) to discuss a new paper released by Securing America’s Future Energy (SAFE).  The paper – written by long-time friends Andy Morriss and Roger Meiners – argues that there is a consensus among academics who have studied OPEC.  The consensus?  The cartel is responsible for less crude oil on the market than would otherwise be the case (which means higher prices than would otherwise be the case) and for the bulk of the price volatility we find in crude oil and, thus, gasoline markets.  “The international market for oil is not a free market,” they conclude.  “The global oil market deviates in important ways from the competitive model, and these market anomalies have significant economic impacts and so are relevant for policy makers.”

While Morriss and Meiners would thus seem to invite politicians to act, they offered no agenda of their own.  That’s where SAFE comes in.  FedEx’s Fred Smith, who co-chairs SAFE’s Energy Security Leadership Council, argued at the forum that the federal government needs to respond to OPEC’s machinations by (1) achieving energy independence for North America (a goal I’ve been quite skeptical about in the past), (2) establishing tough energy efficiency standards for a whole host of goods, but most particularly for U.S. automobiles via CAFE standards (an agenda that most economists would reject in favor of accurate price signals), and (3) subsidizing R&D in order to find alternatives to oil in transportation markets.  SAFE discusses this agenda more robustly in its “National Energy Strategy for Energy Security, 2013.”

SMU’s James Smith – one of the most prominent energy economists working in this field – was on hand to offer what I think was a compelling rebuttal to the central arguments forwarded by the Morriss and Meiners study.

Still Another Low Climate Sensitivity Estimate

Global Science Report is a weekly feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

As promised, we report here on yet another published estimate of the earth’s equilibrium climate sensitivity that is towards the low end of the United Nations’ Intergovernmental Panel on Climate Change (IPCC) range of possibilities.

Recall that the equilibrium climate sensitivity is the amount by which the earth’s surface temperature will rise in response to a doubling of the pre-industrial atmospheric concentration of carbon dioxide. As such, it is probably the most important factor in determining whether or not we need to “do something” to mitigate future climate change. Lower sensitivity means lower urgency, and, if the sensitivity is low enough, carbon dioxide emissions confer a net benefit.
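
To put a number on it: under the standard approximation that equilibrium warming scales with the logarithm of the CO2 concentration, the sensitivity is simply the warming per doubling. Here is a minimal illustration in Python (the concentrations and sensitivity values are round numbers chosen for demonstration, not estimates):

```python
import math

def equilibrium_warming(sensitivity_per_doubling, c_new, c_preindustrial=280.0):
    """Equilibrium warming (deg C) for a CO2 rise from c_preindustrial to c_new,
    assuming warming scales with the log of concentration (a standard approximation)."""
    return sensitivity_per_doubling * math.log(c_new / c_preindustrial) / math.log(2.0)

# At a full doubling (280 -> 560 ppm), the warming equals the sensitivity itself.
print(equilibrium_warming(2.0, 560.0))  # 2.0 deg C, a low-end sensitivity
print(equilibrium_warming(3.0, 560.0))  # 3.0 deg C, a mid-range sensitivity
```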

And despite common claims that the “science is settled” when it comes to global warming, we are still learning more and more about the earth’s complex climate system—and the more we learn, the less responsive the earth’s average temperature seems to be to human carbon dioxide emissions.

The latest study to document a low climate sensitivity is authored by independent scientist Nic Lewis and is scheduled for publication in the Journal of Climate. Lewis’ study is a rather mathematically complicated reanalysis of an earlier, equally complicated analysis that matched the observed global temperature change to the temperature change produced by a simple climate model with a configurable set of parameters, parameters whose actual values are largely unknown but can be assigned in the model simulations. By varying the values of these parameters and seeing how well the resulting temperature output matches the observations, you can get some idea of what the real-world values of these parameters are. The main parameter of interest is the equilibrium climate sensitivity. Lewis’ study also includes additional model years and additional years of observations, including several years from the current global warming “hiatus” (i.e., the lack of a statistically significant rise in global temperature that now extends back about 16 years, to early 1997).
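
To give a flavor of how such a parameter-matching procedure works, here is a toy sketch (emphatically not Lewis’s actual method; the forcing series, “observations,” and zero-dimensional model are all fabricated for illustration):

```python
import numpy as np

# Toy CO2-like forcing history (W/m^2) and fabricated "observed" warming (deg C).
years = np.arange(1900, 2013)
forcing = 3.7 * np.log2(np.linspace(296.0, 394.0, years.size) / 280.0)
observed = 0.4 * forcing + np.random.default_rng(0).normal(0.0, 0.05, years.size)

def simple_model(sensitivity, forcing, f2x=3.7):
    """Equilibrium response of a zero-dimensional model: dT = S * F / F_2x.
    (Real analyses also treat ocean heat uptake and lag; omitted for brevity.)"""
    return sensitivity * forcing / f2x

# Grid search: keep the sensitivity whose simulated record best fits the observations.
candidates = np.arange(0.5, 6.01, 0.05)
errors = [np.sum((simple_model(s, forcing) - observed) ** 2) for s in candidates]
best = candidates[int(np.argmin(errors))]
print(f"best-fit equilibrium sensitivity: {best:.2f} deg C per doubling")
```

Run as written, the search recovers the sensitivity baked into the fabricated observations (about 1.5°C per doubling); with real data, the answer is only as good as the forcing history and the model structure.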

We actually did something in a similar vein—in English—and published it back in 2002. We found the same thing that Lewis did: substantially reduced warming. We were handsomely rewarded for our efforts by the climategate mafia, who tried to get 1) the paper withdrawn, 2) the editor fired—not just from the journal, but from the University of Auckland, and 3) my (Michaels’) 1979 PhD “reopened” by the University of Wisconsin.

Pennsylvania’s Solyndra

Another government-subsidized solar energy company is headed to bankruptcy. The latest casualty is Flabeg Solar U.S. Corp, a subsidiary of a German company. Flabeg’s Pittsburgh plant has been shuttered and its employees laid off. 

In 2009, the Obama administration awarded Flabeg $10 million in federal green energy tax credits. Flabeg also reportedly received a $1 million federal grant. According to the Pittsburgh Tribune-Review, the state of Pennsylvania and Allegheny County kicked in another “$9 million in job creation grants, loans and other financial aid.” 

Flabeg apparently never had a chance to use the tax credits because it was never profitable, but federal taxpayers will likely be out $1 million for the grant. State and local taxpayers are unlikely to be as fortunate. And while taxpayers lose when government places a bad bet, the broader economy also loses when politicians redirect capital toward less productive uses (in this case, completely unproductive). 

Flabeg’s demise is a reminder that it isn’t just the federal government that’s shoveling corporate welfare. Not only do state and local governments subsidize commercial interests, but the handouts are often coordinated with the feds. With Uncle Sam putting money in the pot, state and local governments can find irresistible the temptation to participate in a press release announcing the creation of X number of jobs.

Just ask former Indiana Gov. Mitch Daniels (see here, here, and here). 

On a final note, the head of a Pennsylvania environmental group offered this reaction to the Flabeg news: 

The reason government steps into these cases is because they are too risky to get private capital…But as with private investments, some companies fail.

Yes, private investments do fail. But as I note in a paper on corporate welfare, “Businesses and venture capital firms make many mistakes as well, but their losses are private and not foisted involuntarily on taxpayers.” 

Current Wisdom: U.S. Precipitation Changes and Climate Model Expectations—If It Doesn’t Fit, You Must Acquit

The Current Wisdom is a series of monthly articles in which Patrick J. Michaels, director of the Center for the Study of Science, reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

In this Current Wisdom we report further on our ongoing effort to prepare comments on the latest, greatest (or, more aptly, most recent, most indecent) edition of the government’s assessment of climate change impacts in the United States (if you are interested in submitting your own comments, you should hurry, because the public comment period closes this Friday, April 12).

A disturbing yet ubiquitous aspect of the current draft National Climate Assessment (and, for that matter, of both earlier editions of the NCA) is its use of future projections of climate change without first demonstrating that the underlying models can reproduce the recent past, a period during which greenhouse-gas concentrations have been increasing.

Discussions of future impacts from changes in precipitation resulting from human emissions of greenhouse gases are everywhere in the report, and the projected changes are usually bad—increased droughts, floods, and longer dry spells, for example.  The NCA folks weren’t quite so enthusiastic about generating forecasts of salutary changes.  Perhaps Dr. Pangloss is their spiritual adviser.

NCA’s precipitation forecasts turn out to be uglier than Candide’s fair Cunegonde became.  The key question: do the models accurately simulate the past changes that have been observed? If the answer is “no,” then the whole impact exercise is meaningless, because the models provide no reliable information about what the future may bring.

The answer isn’t just “no.” It’s NO, NON, ONAY, NEIN.

Getting Our Due

In the Diary feature of this week’s Spectator, rational optimist Matt Ridley offers a collection of rather random observations from his daily life that have him thinking about (or maybe wishing for, since Old Man Winter has been slow to loosen his grip on the U.K. and Western Europe, much as he has across the eastern U.S.) anthropogenic global warming.

What has his attention is that global warming just doesn’t seem to be going according to plan. And for those who have bought into that plan, their plan-driven actions are starting to make them look foolish.

But it’s not as if we haven’t “told you so”—a fact that Ridley draws attention to in the closing segment of his article.

David Rose of the Mail on Sunday was vilified for saying that there’s been no global warming for about 16 years, but even the head of the Intergovernmental Panel on Climate Change [IPCC] now admits he’s right. Rose is also excoriated for drawing attention to papers which find that climate sensitivity to carbon dioxide is much lower than thought — as was I when I made the same point in the Wall Street Journal. Yet even the Economist has now conceded this. Tip your hat to Patrick Michaels, then of the University of Virginia, who together with three colleagues published a carefully argued estimate of climate sensitivity in 2002. For having the temerity to say they thought ‘21st-century warming will be modest’, Michaels was ostracised. A campaign began behind the scenes to fire the editor of the journal that published the paper, Chris de Freitas. Yet Michaels’s central estimate of climate sensitivity agrees well with recent studies. Scientists can behave remarkably like priests at times.

What we determined in our 2002 study was that the amount of global warming projected by the end of this century was most likely being overestimated.  When we adjusted the climate model projections to better match the actual observations, our best estimate of the amount of warming we expected from 1990 to 2100 was about 1.8°C (3.2°F), which was at the lower end of the IPCC projected range and which, as Ridley correctly noted, we termed “modest.”
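
As a quick sanity check on that central estimate, 1.8°C spread over the 110 years from 1990 to 2100 works out to roughly 0.16°C per decade, the sort of modest, merely linear rate described in the excerpt below:

```python
warming_total_c = 1.8            # our 2002 central estimate, 1990-2100 (deg C)
decades = (2100 - 1990) / 10
print(f"{warming_total_c / decades:.2f} deg C per decade")  # ~0.16
```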

Further, we anticipated the slowdown in the warming rate. Quoting from our 2002 paper titled “Revised 21st century temperature projections” (Michaels et al., 2002):

The ‘worst case’ warming now appears to be merely linear, subject to the modifications described in this paper. Furthermore, both Table 1 and Fig. 3 indicate that any exponential rise in atmospheric CO2 concentrations is weak at best. Consequently, the current linear warming may in fact be the adjustment to the exponential growth in CO2 that took place prior to 1975. Levitus et al. (2000) documented a warming of 0.06°C in the top 3 km of a large-area ocean sample over the course of 40 yr. A lag correlation between that deep-water record and the sea-surface temperature record from Quayle et al. (1999) is very suggestive that oceanic thermal lag maximizes around 35 yr (Michaels et al. 2001). Thus, the truly exponential phase of concentration growth in the atmosphere, which ended about 25 yr ago, should induce a linear warming for the next decade or two before it could actually begin to damp.

Now, more than 10 years later, more and more evidence is piling up that we were right, including several recent papers that apply a technique not all that dissimilar in theory from our own (e.g., Gillett et al., 2012; Stott et al., 2013).

So even though we are still largely ostracized, at least we can rest assured that we were pretty much on target—and some people are starting to take notice.

References:

Gillett, N. P., V. K. Arora, G. M. Flato, J. F. Scinocca, and K. von Salzen, 2012. Improved constraints on 21st-century warming derived using 160 years of temperature observations. Geophysical Research Letters, 39, L01704.

Michaels, P. J., P. C. Knappenberger, O. W. Frauenfeld, and R. E. Davis, 2002. Revised 21st century temperature projections. Climate Research, 23, 1-9.

Stott, P., P. Good, G. Jones, N. Gillett, and E. Hawkins, 2013. The upper end of climate model temperature projections is inconsistent with past warming. Environmental Research Letters, 8, 014024, doi:10.1088/1748-9326/8/1/014024.

Burning Books, Burning Witches, Burning Corn

Global Science Report is a weekly feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

History is littered with ideology gone awry.

The most recent example? Burning corn as a substitute for fossil fuels in an effort to mitigate anthropogenic climate change (which supposedly has a negative impact on the production of crops such as corn).

This is about as logical as publicity-stunt burnings of Harry Potter books because of objections to their contents, which only result in more people buying and reading the books to find out what got the book-burners so inflamed in the first place.

With Harry Potter it was the fantasy world of witchcraft and wizardry. With corn ethanol it is the fantasy world of agriculturally damaging climate change.

A few years ago, Stanford’s David Lobell and colleagues published a paper in the prominent scientific journal Science reporting that human-caused global warming over the past 30 years had resulted in a slowdown in global crop production. Modeling the climate response of the world’s four largest commodity crops—corn, rice, wheat, and soybeans—Lobell’s team calculated that, as a result of rising temperatures and precipitation changes, global crop production was about 3 percent less than it otherwise would have been.

But consider this: The United States produces about 36 percent of the world’s corn. And about 40 percent of U.S. corn is used to produce ethanol for use as a gasoline substitute in an attempt to lower net carbon dioxide emissions from driving and reduce climate change. Globally, corn makes up 30 percent of total worldwide production of the four crops studied by Lobell’s group.

Multiply all these percentages out, and you find that the United States is burning a bit more than 4 percent of the global production of those four crops in an attempt to mitigate a climate-driven loss of about 3 percent of that production.
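
Spelling out that arithmetic with the shares given above:

```python
us_share_of_world_corn = 0.36   # U.S. share of global corn production
us_corn_to_ethanol     = 0.40   # share of U.S. corn turned into ethanol
corn_share_of_4_crops  = 0.30   # corn's share of corn+rice+wheat+soy output

burned = us_share_of_world_corn * us_corn_to_ethanol * corn_share_of_4_crops
print(f"{burned:.1%} of global four-crop production burned")  # -> 4.3%
```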

Rare “It’s Not as Bad as We Thought” Finding Published

Global Science Report is a weekly feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

From the authors of a new paper just published in the journal Nature Geoscience comes this surprising finding:

Tropical forests are less likely to lose biomass – plants and plant material – in response to greenhouse gas emissions over the twenty-first century than may previously have been thought.

A rare “not as bad as we thought” admission about the impacts of manmade global warming!

Not only that, but based on recent findings that the true climate sensitivity is much lower than what climate models exhibit—findings not incorporated in the new study—the results are probably still even more “not as bad as we thought” than the authors thought!

Chris Huntingford from the U.K.’s Centre for Ecology & Hydrology and colleagues coupled climate model projections to a land surface/vegetation model to see how the tropical forests in the Americas, Africa, and Asia respond to changes in atmospheric conditions. Their vegetation model includes interactions between terrestrial plants and influences such as temperature, precipitation, and the carbon dioxide concentration of the atmosphere (a plant fertilizer).

Unlike earlier studies, which used a very limited selection of climate models and less sophisticated vegetation models, the Huntingford team found that in virtually all future simulations the biomass of tropical forests increases over the course of the 21st century. This is a significantly different result from that of many previous studies, which suggested that anthropogenic climate change would lead to, as Huntingford et al. put it, “catastrophic losses of forest cover and biomass.”

Perhaps most interestingly, the major driver of the biomass increase is the projected growth in the atmospheric carbon dioxide concentration (thanks to our use of fossil fuels). The model-projected changes in precipitation had little impact on the biomass predictions, and the projected increase in temperature acted to decrease the biomass (although not by as much as the additional carbon dioxide acted to increase it).
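
To see how those drivers can net out to a biomass gain, consider a back-of-the-envelope decomposition (the linear form and the coefficients are invented for illustration and are not taken from Huntingford et al.):

```python
def biomass_change(d_co2_ppm, d_temp_c, d_precip_pct,
                   co2_coef=0.05, temp_coef=-2.0, precip_coef=0.1):
    """Toy linear decomposition of percent biomass change into its drivers.
    All coefficients are invented for illustration, not fitted values."""
    return co2_coef * d_co2_ppm + temp_coef * d_temp_c + precip_coef * d_precip_pct

# CO2 fertilization (+400 ppm) outweighs warming (+3 C) in this toy setting,
# and precipitation contributes little, mirroring the paper's qualitative result.
print(biomass_change(d_co2_ppm=400, d_temp_c=3, d_precip_pct=5))  # +14.5 (%)
```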

Which is why the results probably get even better if there is less warming associated with carbon dioxide emissions than current-generation climate models predict (new research suggests that climate models collectively produce about 50% more warming than they should).

The authors are quick to mention that uncertainty abounds, as our level of understanding of forest response to changing environmental conditions is not all that high. But even given these uncertainties, the authors are confident that their results of increasing biomass are robust. Here is how Huntingford described the situation in a press release:

The big surprise in our analysis is that uncertainties in ecological models of the rainforest are significantly larger than uncertainties from differences in climate projections. Despite this we conclude that based on current knowledge of expected climate change and ecological response, there is evidence of forest resilience for the Americas (Amazonia and Central America), Africa and Asia.

Resilience. A refreshingly honest assessment of an ecosystem response to climate change. And one that is probably a much more apt descriptor of natural systems than “delicate,” “sensitive,” or “fragile.”

Now if only the folks in charge of assembling national and international climate impact assessments would realize (or probably more accurately, admit to) this.

We are hard at work trying to focus their attention as we are vigorously reviewing the latest draft “National Assessment” of climate change.  We will leak out particularly juicy snippets in these pages when the time seems right.

Reference:

Huntingford, C., et al., 2013. Simulated resilience of tropical rainforests to CO2-induced climate change. Nature Geoscience, doi:10.1038/NGEO1741.