Topic: Energy and Environment

Climate Sensitivity Going Down

Global Science Report is a weekly feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

“Climate sensitivity” is the amount that the average global surface temperature will rise, given a doubling of the concentration of carbon dioxide (CO2) in the atmosphere from its pre-industrial value. This metric is the key to understanding how much global warming will occur as we continue to burn fossil fuels for energy and emit the resultant CO2 into the atmosphere.
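Because CO2’s warming influence grows roughly logarithmically with its concentration, the sensitivity can be used to sketch the equilibrium warming expected for any concentration ratio. The snippet below is only a back-of-the-envelope illustration of that scaling, not a climate model; the function name and the example concentration values are ours, not anything from the literature cited here.

```python
import math

def equilibrium_warming(c_ratio, sensitivity):
    """Equilibrium warming (deg C) for a given ratio of CO2 concentration
    to its pre-industrial value, assuming warming scales with log2 of the
    concentration ratio and `sensitivity` is the warming per doubling."""
    return sensitivity * math.log2(c_ratio)

# By definition, a doubling (ratio = 2) yields the sensitivity itself.
print(equilibrium_warming(2.0, 3.0))  # IPCC AR4 best estimate of 3.0

# A concentration ~40% above pre-industrial (roughly 390 ppm vs. 280 ppm)
# implies, at a 3.0 deg C sensitivity, about 1.4 deg C at equilibrium:
print(equilibrium_warming(390 / 280, 3.0))
```

Note how directly the answer scales with the assumed sensitivity: cut the sensitivity from 3.0°C to 2.0°C and every projected warming figure drops by a third, which is why the debate over this one number matters so much.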

The problem is that we don’t know what the value of the climate sensitivity really is.

In its Fourth Assessment Report, released in 2007, the United Nations’ Intergovernmental Panel on Climate Change (IPCC) had this to say about the climate sensitivity:

It is likely to be in the range 2°C to 4.5°C with a best estimate of about 3.0°C, and is very unlikely to be less than 1.5°C. Values substantially higher than 4.5°C cannot be excluded…

In IPCC parlance, the term “likely” means a probability of greater than 66% and “very likely” means a greater than 90% chance of occurrence. The IPCC’s range for the climate sensitivity includes values at the low end which, if proven true, would engender very little concern over our use of fossil fuels as a primary energy source, and values at the high end which would generate calls for frantic efforts (which would likely fail) to lower carbon dioxide emissions.

While a lot of effort has been expended to better constrain estimates of the sensitivity over the past several decades, little progress has been made in narrowing the range. The IPCC’s First Assessment Report, released back in 1990, gave a range of 1.5°C to 4.5°C. It’s not that climate science hasn’t progressed since then, but rather that the advanced understanding has not led to substantially better constraints.

But what has occurred over the past several decades is that greenhouse gas emissions have continued to rise (in fact, half of all anthropogenic carbon dioxide emissions have been released since the mid-1980s), and global temperature observations have continued to be collected. We now have much more data with which to try to determine the sensitivity.

While global carbon dioxide emissions continue to rise year-over-year (primarily driven by the rapid growth in developing countries such as China), global temperatures have not kept up—in fact, there has been little to no overall global temperature increase (depending upon the record used) over the past decade and a half.

That doesn’t bode well for the IPCC’s high-end temperature sensitivity estimates. The scientific literature is now starting to reflect that reality.

Never mind that Pat Michaels and I published a paper in 2002 showing that the sensitivity lies near the low end of the IPCC’s range. This idea (and those in similar papers subsequently published by others) had largely been ignored by the “mainstream” scientists self-selected to produce the IPCC Assessments. But new results supporting lower and tighter estimates of the climate sensitivity are now appearing with regularity, a testament to just how strong the evidence has become, for such results had to get past the guardians of the IPCC’s so-called “consensus of scientists,” which the Climategate emails showed to be less than gentlemanly.

Figure 1 shows the estimates of the climate sensitivity from five research papers that have appeared in the past two years, including the recent contributions from Ring et al. (2012) and van Hateren (2012)—both of which put the central estimate of the climate sensitivity at 2°C or lower, values at or beneath the low end of the IPCC’s current “likely” range.

Figure 1. Climate sensitivity estimates from new research published in the past two years (colored), compared with the range given in the IPCC Fourth Assessment Report (black). The arrows indicate the 5 to 95% confidence bounds for each estimate along with the mean (vertical line) where available. Ring et al. (2012) present four estimates of the climate sensitivity and the red box encompasses those estimates.  The right-hand side of the IPCC range is dotted to indicate that the IPCC does not actually state the value for the upper 95% confidence bound of their estimate. The thick gray line represents the IPCC’s “likely” range.

The IPCC is scheduled to release its Fifth Assessment Report in 2013. We’ll see whether these new, lower, and more tightly constrained estimates of climate sensitivity that are increasingly populating the literature result in a modification of the IPCC estimates, or whether the IPCC authors manage to wave them all away (or simply ignore them, as was the case with our 2002 paper).

Regardless of how the IPCC ultimately assesses climate science in 2013, the fact of the matter is that there is growing evidence that anthropogenic climate change from the burning of fossil fuels is not going to turn out to be as severe as climate alarmists have made it out to be.

References:

Annan, J.D., and J.C. Hargreaves, 2011. On the generation and interpretation of probabilistic estimates of climate sensitivity. Climatic Change, 104, 423-436.

Lindzen, R.S., and Y-S. Choi, 2011. On the observational determination of climate sensitivity and its implications. Asia-Pacific Journal of Atmospheric Sciences, 47, 377-390.

Michaels, P.J., P.C. Knappenberger, O.W. Frauenfeld, and R.E. Davis, 2002. Revised 21st century temperature predictions. Climate Research, 23, 1-9.

Ring, M.J., et al., 2012. Causes of the global warming observed since the 19th century. Atmospheric and Climate Sciences, 2, 401-415, doi:10.4236/acs.2012.24035.

Schmittner, A., et al., 2011. Climate sensitivity estimated from temperature reconstructions of the Last Glacial Maximum. Science, 334, 1385-1388, doi:10.1126/science.1203513.

Solomon, S., et al., (eds.), 2007. Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change. Cambridge University Press, Cambridge, 996pp.

van Hateren, J.H., 2012. A fractal climate response function can simulate global average temperature trends of the modern era and the past millennium. Climate Dynamics, doi:10.1007/s00382-012-1375-3.

Ah, the Sweet Smell of Lukewarm Success

Three years ago the climate world was set ablaze by the release of thousands of “Climategate” emails from the server at the University of East Anglia. The ruling climate establishment, which I now call the “hotheads,” was shown threatening editors of journals who dared publish my papers, and engaging in a wide variety of other shady and nefarious practices.

I didn’t realize until the Climategate circus that my view on climate change had generated a moniker. I was branded—accurately—a “lukewarmer,” meaning that my synthesis of climate behavior is that global warming is real, and caused in part by people. It is also exaggerated, both in magnitude and effect. My new Center studies why this occurs, and finds similar dynamics operating across many fields of federally sponsored science.

Apparently this view is getting, as is said here in Swamp-By-the-Potomac, “traction”.

My evidence comes from no less a media icon than Bill Maher, writing about me and my sidekick Chip Knappenberger here at Cato, specifically concluding, “they are winning”.

Thank you Bill, and no I won’t go on your show.

Fact check:  Maher doesn’t realize that my company is closed and World Climate Report has migrated and evolved into Global Science Report, featured weekly on this blog.

Straw Men


Lawrence Livermore National Laboratory’s Benjamin Santer and his mentor, Tom Wigley of the National Center for Atmospheric Research, seem, well, a little obsessed with Cato’s Pat Michaels. First, Santer threatened to “beat the cr&p out of him,” and then Wigley tried to foment a cabal to “re-assess” his doctoral dissertation, on grounds that were completely, unalterably, and demonstrably 100 percent false.

So it’s no surprise that they have just published—two years after the fact—what they consider to be a rejoinder to Michaels’ 2010 testimony to the Subcommittee on Energy and Environment of the Committee on Science and Technology of the United States House of Representatives.  There’s about as much real substance here as there was in Wigley’s very ill-informed (and seemingly actionable, if Michaels didn’t have a day job here at Cato) campaign against his doctorate.

It’s just been published in the journal Climate Dynamics. The paper is technically peer-reviewed, but, judging from the Climategate emails and the authors’ serial paper trail of threatening journal editors (say, by writing to the university administrations where they worked), as well as the less-than-high-quality contents of the actual paper, one wonders just how critical the reviewers actually were.

In their paper, Wigley and Santer wrote:

Michaels’ 2010 Congressional testimony…is in conflict with the results presented here. This testimony makes the claims that “…greenhouse-related warming is clearly below the mean of relevant forecasts by IPCC”, and that “… the Finding of Endangerment from greenhouse gases by the Environmental Protection Agency is based on a very dubious and critical assumption”. The “assumption” referred to here is the IPCC statement that is the primary focus of the present paper, i.e., the statement that most of the warming since 1950 is very likely due to the human-caused increase in greenhouse gas concentrations.

Regarding Michaels’ statement that “…greenhouse-related warming is clearly below the mean of relevant forecasts by IPCC” Wigley and Santer argued that:

Roughly half of these [IPCC climate model] simulations did not consider the cooling effect of indirect aerosol forcing, so the results, on average, would be biased towards trends that are warmer than observed even if the models were perfect (cf. Santer et al. 2011).

So the climate models are biased to producing more warming than is observed? Isn’t that what Michaels said? These guys just won’t take “yes” for an answer.

And in fact, the reference in the above quote to “Santer et al. 2011” is a paper published by Santer and Wigley (and 15 others) that finds:

The multi-model average [lower atmospheric temperature] trend is always larger than the average observed [lower atmospheric temperature] trend…[a]s the trend fitting period increases…average observed trends are increasingly more unusual with respect to the multi-model distribution of forced trends.

That says what you think it says! Model temperature trends are always higher than the observed temperature trends, and over longer periods (i.e., more robust analyses) the model/observed discrepancy grows. Here is the relevant figure from that paper.

Figure 1. A comparison between modeled and observed trends in the average temperature of the lower atmosphere, for periods ranging from 10 to 32 years (during the period 1979 through 2010). The yellow is the 5-95 percentile range of individual model projections, the green is the model average, the red and blue are the average of the observations, as compiled by Remote Sensing Systems and University of Alabama in Huntsville respectively (adapted from Santer et al., 2011).

Their own analysis supports Michaels’ contention, which they somehow say is wrong.  Beats me.

In fact, their picture looks an awful lot like the one that Michaels used in his testimony (Figure 2).

Figure 2. Range of climate model probabilities of surface temperature trends (gray shading) overlaid with the observed surface temperature trend from the Climate Research Unit (blue line) (data through September 2010).

It’s worth noting that Michaels was the first to present this type of chart, several years ago. In fact, Wigley reviewed a paper it appeared in, helped get the editor to kill it, and then, with Santer, published something mighty similar. How strange, for someone they are arguing is wrong.

It goes on.

They then take exception with Michaels’ statement to Congress that the IPCC’s central finding that “[m]ost of the observed increase in global average temperatures since the mid-twentieth century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations” is “dubious.”

Wigley and Santer spill a lot of ink over the concept that, in the absence of everything else, the potential warming from anthropogenic greenhouse gases is likely greater than the observed warming. Michaels didn’t say that it wasn’t. In fact, most people believe this to be true.

Michaels was concerned about the observed warming, not some hypothetical, unrealizable (and therefore unverifiable) change. After all, it is the actual warming that the environment largely responds to. So when assessing the accuracy of the IPCC statement on observed warming, it is appropriate to divide it up among its various contributing elements, as he did.

While there’s a lot of gory detail in this discussion (see here for more), one thing that I think we all should be able to agree on is that it is physically impossible for something (like the emissions of anthropogenic greenhouse gases) to be responsible for causing more than 100% of what has been observed, and that a statement like this one from Wigley and Santer’s paper,

Here, the probability that the model-estimated GHG component of warming is greater than the entire observed trend (i.e., not just greater than “most” of the observed warming) is about 93%.

is something other than science, because one surely cannot find something that nature will not reveal.

The bottom line here is that, in their paper, Wigley and Santer seem to place more importance on attacking Pat Michaels than on the actual logic behind his argument.


References:

Santer, B.D., C. Mears, C. Doutriaux, P. Caldwell, P.J. Gleckler, T.M.L. Wigley, S. Solomon, N.P. Gillett, D. Ivanova, T.R. Karl, J.R. Lanzante, G.A. Meehl, P.A. Stott, K.E. Taylor, P.W. Thorne, M.F. Wehner, and F.J. Wentz, 2011. Separating signal and noise in atmospheric temperature changes: the importance of timescale. Journal of Geophysical Research, 116, D22105, doi:10.1029/2011JD016263.

Wigley, T.M.L., and B.D. Santer, 2012. A probabilistic quantification of the anthropogenic component of twentieth century global warming. Climate Dynamics, doi:10.1007/s00382-012-1585-8.

Carbon Tax Follies


There seems to be a noticeable murmur around town about a carbon tax—a tax on the amount of carbon dioxide that is released upon generating a unit of energy. Since fossil fuels—coal, oil, natural gas—are both the source of over 75% of our energy production and emitters of carbon dioxide when producing that energy, a carbon tax ensures that the price of everything goes up.

There is one and only one justification for a carbon tax—an attempt to influence the future course of the earth’s climate (or, as some people prefer, to mitigate anthropogenic climate change) by trying to force down the emissions of the most abundant human-generated greenhouse gas.

But of all the things that a carbon tax will do (raise prices, increase bureaucracy, elect Tea Partiers, etc), mitigating anthropogenic climate change in any meaningful manner is not one of them.

The annual carbon dioxide emissions from the U.S., currently about 5,500 million metric tons per year, contribute only roughly 0.003°C per year of warming pressure on global temperatures (see here for a handy way of making that calculation). So the best that a carbon tax could ever hope to achieve, climatically, would be to prevent this amount of warming each year by completely eliminating all carbon dioxide emissions from the U.S.

If we went to zero emissions tomorrow, eliminating that warming pressure would prevent about 0.26°C of global temperature rise by the year 2100. According to the latest projections from the Intergovernmental Panel on Climate Change (IPCC), the projected temperature rise by the end of the century ranges from about 1.1 to 6.4°C, with a business-as-usual rise of around 3°C (put me down for 1.6° until then, unless nature is being a blatant liar). The “mitigated” rise is proportional to the expected temperature rise: a carbon tax enacted today that is immediately and completely successful at eliminating all U.S. CO2 emissions would lower the rise in temperature expected by the end of the century by around 10%. This amount is small, of little consequence, and in fact will be difficult to detect.
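The arithmetic behind those figures can be reproduced in a few lines. This is only a sketch of the back-of-the-envelope reasoning in the text: the 0.003°C-per-year warming pressure and the ~3°C business-as-usual rise are taken from above as given inputs, and the 2013 start year is our assumption for "tomorrow."

```python
# Back-of-the-envelope: warming avoided by zeroing out U.S. CO2 emissions,
# using the ~0.003 deg C/year of warming pressure cited in the text.
WARMING_PRESSURE_PER_YEAR = 0.003  # deg C/year attributable to U.S. emissions
START_YEAR = 2013                  # assumed "tomorrow"
END_YEAR = 2100

years = END_YEAR - START_YEAR      # 87 years
avoided = WARMING_PRESSURE_PER_YEAR * years
print(f"Warming avoided by 2100: ~{avoided:.2f} deg C")  # ~0.26 deg C

# Share of a ~3 deg C business-as-usual rise that this represents:
bau_rise = 3.0
print(f"Fraction of projected rise: ~{avoided / bau_rise:.0%}")  # ~9%
```

The second figure rounds to roughly the "around 10%" quoted above; note that it scales linearly with whatever fraction of emissions is actually eliminated, so a more realistic partial cut shrinks it proportionally.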

It is also not going to happen. We only have the capacity to produce about 30% of our electricity from non-carbon-emitting fuel sources (primarily nuclear and hydroelectric). So it will take time, and probably a lot of time (many decades), before our energy needs could possibly be met without emitting CO2 into the atmosphere. And of course, as time ticks by before we eliminate or at least appreciably reduce our emissions, the amount of global warming saved by such action declines (and becomes less and less consequential), as does the justification for the carbon tax.

I am just in the early stage of this analysis, so the numbers above are a bit rough (but conservative). In the future I hope to produce a menu of emissions reductions/climate savings options—but one without prices.  That way the policymakers will see what they are going to be getting for whatever price they decide to assign. So too will the general public. And what they will all see is that whatever level of carbon tax they decide upon,  they will get a lot of climate nothing  for a lot of financial something.

The best thing would be for policymakers to just leave well enough alone, for on their own, carbon dioxide emissions in the U.S. have been declining for more than a decade (and in fact are approaching levels last seen in the early 1990s; see http://www.eia.gov/environment/emissions/carbon/). And even if such a reduction doesn’t result in any scientifically detectable climate impacts, at least it hasn’t cost us anything.

Obama on Energy

Today Politico Arena asks:

What will the president’s reelection mean for gasoline and electricity prices over the next four years?

My response:

Unless Obama takes some extraordinary measure like imposing price controls, which is possible but not likely, his reelection will probably have little effect on energy prices over the next four years. Oil prices are determined largely by international markets, over which an American president has little if any control. If anything, the domestic shale oil boom that leads the news in the Wall Street Journal this morning is likely to result in lower energy prices.

But there’s a caveat, and that’s the global warming agenda of the environmental zealots. Al Gore, Governor Cuomo, and Mayor Bloomberg are only the latest to promote as conventional wisdom the idea that global warming causes more and more severe hurricanes, despite the lack of credible evidence supporting the claim. Thus, as less expensive fossil fuels promise to help our sluggish economy out of recession, environmentalists will be urging the president to wean the nation away from those fuels and toward far more expensive renewable energy.

We shouldn’t be surprised, therefore, if cap and trade and other such measures are again before us—perhaps through lawless executive order. Reaching vast areas of life, like Obamacare, the president’s energy agenda could, as he promised four years ago, “fundamentally transform the United States of America.”

Arguing over Sandy


On Monday of this week, three prominent and influential scientists published an opinion piece in Politico arguing that anthropogenic global warming was responsible for making the destruction from “super storm” Sandy significantly worse than it otherwise would have been. They added that if we don’t “cut industrial carbon pollution,” we are going to get more of the same, and then some.

On the same day, I argued here at Cato@Liberty that it could be reasoned that anthropogenic global warming lessened the impact of Sandy.

The scientific truth about the situation is that it is impossible to know who is right—the uncertainties are just too large. But I think it is fair to say that no matter what direction the influence of anthropogenic global warming was on Sandy, in net it was quite small.

One difference between my piece and the Politico article co-signed by Dr. Robert Corell (“senior policy fellow for the American Meteorological Society and former Chair of the United States Global Change Research Program”), Dr. Jeff Masters (“the founder and Director of Meteorology for Weather Underground and a former NOAA Hurricane Hunter”) and Dr. Kevin Trenberth (“Distinguished Senior Scientist in the Climate Analysis Section at the National Center for Atmospheric Research”) was that I tried to stick to a scientifically defensible argument.  In their piece, they juiced up their case with some—how should I say this?—rather dubious facts.

The worst of these was claiming that “[o]n the stretch of the Atlantic Coast that spans from Norfolk to Boston, sea levels have been rising four times faster than the global average.” They implied that anthropogenic global warming was the reason why.

This is simply untrue.

While it is true that the long-term (~20th century) rate of sea level rise along that stretch has been about twice the global average over the same period, it is scientifically well-established that the regionally enhanced rate of sea level rise is due to ongoing geologic processes resulting from the end of the last ice age. When these non-anthropogenic processes are properly subtracted out of the tide gauge record of sea level observations, the rate of sea level rise that is left over is virtually the same as the global rate of rise, not four times faster.

The same conclusion is reached if you limit your comparison to the period of satellite observations of sea level, which began about 20 years ago.  The figure below shows a map of the satellite-measured trends in sea level from 1993 through mid-2012.  The global average rate of rise is 0.12 inches per year, which is represented by a sort of greenish yellow color.  Turning your attention to the Northeast coast of the United States (you might have to squint a bit), you see that the color there is also sort of greenish yellow—in other words, right about the global average.  Places where the sea level is rising four times faster than the global average are colored a light pink; while there are a few such places, none of them are anywhere near the stretch of coast between Norfolk and Boston.

Figure 1. Spatial distribution of the rate of sea level rise across the globe as measured by satellite altimeters (Source: University of Colorado Sea Level Group, modified to reflect English units).

It is somewhat telling when prominent climate scientists have to resort to incorporating incorrect (and readily debunked) science to try to bolster their case for climate alarm—an alarm that was raised to try to scare us into accepting regulations on greenhouse gas emissions.

A Nation in Decline?

Der Spiegel, the German magazine, argues that the recent election campaign is evidence that the United States is a nation in decline. Certainly the political system is having its problems, but Der Spiegel’s prescription of going further into debt to build high-speed trains and other European follies is a dubious way to fix those problems.

The real decline is in the Republican Party, which couldn’t manage to capture the White House or the Senate despite high unemployment and other economic problems. Given the economy, this election was the Republicans’ to lose, and lose it they did. They began shooting themselves in their collective feet early in the last decade when they made immigration a big issue, thus earning the enmity of Latinos, the nation’s fastest-growing and second-most important ethnic group.

Unfortunately, our two-party system too often limits voters to a choice between a social & fiscal liberal vs. a social & fiscal conservative (or, worse, a social & fiscal liberal vs. a social conservative & fiscal liberal). A large percentage of potential voters don’t feel comfortable in either party, and the libertarian side of me thinks, or hopes anyway, that many of those “independents” are socially liberal but fiscally conservative.

By focusing on fiscal issues, the tea parties seemed to provide an alternate route, one that set social issues aside. But, as Marian Tupy notes, too many Republican candidates made social issues a major part of their campaigns, thus alienating both Democrats and independents. Romney, who was neither a true fiscal nor a true social conservative, didn’t help by offering an inconsistent message, criticizing the president as often for cutting budgets, such as Medicare and defense, as for spending money.

So the next two years look to be the same as the last two: Democrats in the White House and controlling the Senate while Republicans hold the House. Does that mean more gridlock, with Republicans opposing any tax increases and Democrats opposing any budget cuts?

In the face of a fiscal cliff–meaning automatic budget cuts and tax increases if Congress doesn’t find another resolution–Obama hopes for a Grand Bargain in which Republicans accept some modest tax increases in exchange for some modest budget cuts. However, I suspect Republicans are immediately going to regroup for 2014 and 2016, and won’t want to commit themselves to such a bargain. Moreover, most of the push against a Grand Bargain is coming from liberals, not conservatives. So I suspect we will be seeing two more years of gridlock.

Take my issue, transportation, which Congress has to deal with again in 2014, the year the 2012 transportation bill expires. What would a Grand Bargain look like for reauthorization of the gas tax and the spending of those tax revenues? A five-cent increase in gas taxes in exchange for cuts in some of the worst examples of pork? It doesn’t seem likely; increased gas taxes would just feed the pork barrel, and any cuts in pork would probably be restored in annual appropriations. Fiscal conservatives have nothing to gain by supporting such a Grand Bargain.

Does that mean the U.S. is in decline, as Der Spiegel says? Not necessarily. Elections today are no more contentious than they were between 1876 and 1900, when several presidential elections were decided by less than a percent of the vote and at least two of the electoral college winners lost the popular vote. Politics then were dirtier than, or at least as dirty as, at any time in American history.

The real threat to the future of the country is not political polarization but the huge fiscal hole Congress has dug, which means the real question is whether our economy can recover enough to ever fill up that hole. The standard free-market answer is that the uncertainty created by Obama’s overregulation and inconsistent attitudes towards business will prevent such a recovery. We can only hope that this is wrong.

If they are to participate in this recovery, Republicans must drop the emphasis on social issues (which aren’t really decided at the national level anyway) and their hostility towards immigrants. I hate to think that America’s future depends on Republicans coming to their collective senses, but the alternative of Democrats suddenly becoming fiscally conservative seems even less likely.