
AAAS’s Guide to Climate Alarmism

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

Back in the Bush II Administration, the American Association for the Advancement of Science (AAAS) nakedly tried to nudge the political process surrounding the passage of the environmentally-horrific ethanol fuel mandate.  It hung a large banner from the side of its Washington headquarters, picturing a corn stalk morphing into a gas pump, all surrounded by a beautiful, pristine, blue ocean.  They got their way, and we got the bill, along with a net increase in greenhouse gas emissions.

So it’s not surprising that AAAS is on the Washington Insider side of global warming, releasing a report today that is the perfect 1-2-3 step-by-step how-to guide to climate change alarm.

This is how it is laid out in the counterfactually-titled AAAS report “What We Know”:

Step 1: State that virtually all scientists agree that humans are changing the climate,

Step 2: Highlight that climate change has the potential to bring low-probability but high-impact outcomes, and

Step 3: Proclaim that by acting now, we can do something to avert potential catastrophe.

To make this most effective, appeal to authority, or in this case, make the case that you are the authority. From the AAAS:

“We’re the largest general scientific society in the world, and therefore we believe we have an obligation to inform the public and policymakers about what science is showing about any issue in modern life, and climate is a particularly pressing one,” said Dr. Alan Leshner, CEO of AAAS. “As the voice of the scientific community, we need to share what we know and bring policymakers to the table to discuss how to deal with the issue.”

But despite promising to inform us as to “what science is showing,” the AAAS report largely sidesteps the best and latest science that points to a much lowered risk of extreme climate change, choosing instead to inflate and then highlight what meager evidence exists for potential catastrophic outcomes—evidence that in many cases has been scientifically challenged (for example here and here).

Climate Insensitivity: What the IPCC Knew and Didn’t Tell Us, Part II

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

The bottom line of the new report from the Global Warming Policy Foundation (GWPF) is that the U.N.’s Intergovernmental Panel on Climate Change (IPCC) knew, but did not highlight, that the best available scientific evidence suggests the earth’s climate is much less sensitive to atmospheric carbon dioxide than the climate models it relied upon to forecast future global warming portray.

We covered the GWPF report and its implications in this post. But one implication is worth mentioning again, from the report’s conclusions:

The [climate models] overestimate future warming by 1.7–2 times relative to an estimate based on the best observational evidence.

While the report’s authors, Nicholas Lewis and Marcel Crok, are talking about the future, the same thing should apply to the past. In fact, a strong test of Lewis and Crok’s prediction is whether the same climate models have already predicted more warming than observations indicate.

There is perhaps no better general assessment of past model behavior than the analysis we developed for a post back in the fall.

The figure below is our primary finding. It shows how the observed rate of global warming compares with the rate of global warming projected to have occurred by the collection of climate models used by the IPCC. We performed this comparison over all time scales ranging from 10 to 63 years. Our analysis ended in 2013 and included trends beginning in each year from 1950 through 2004.

As can be clearly seen in our figure, climate models have consistently overestimated the amount of warming that has taken place. In fact, they are so bad that over the course of the past 25 years (and even at some lengths as long as 35 years) the observed trend falls outside the range that includes 95 percent of all model runs. In statistical parlance, this means that the observed trend cannot reliably be considered part of the collection of modeled trends. In other words, the real world is not accurately captured by the climate models—the models predict that the world should warm up much faster than it actually does.
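For readers who want to see the mechanics, here is a minimal sketch of the kind of trend comparison described above. It is not our actual analysis code: the obs and models arrays are random placeholders standing in for the observed record and the IPCC model runs, and the percentile envelope is the 95 percent model range discussed in the text.

```python
import numpy as np

def trailing_trend(series, years, start_year):
    """Least-squares linear trend (degrees per decade) from start_year to the end."""
    mask = years >= start_year
    return 10.0 * np.polyfit(years[mask], series[mask], 1)[0]

years = np.arange(1950, 2014)                       # 1950 through 2013
obs = np.random.normal(size=years.size)             # placeholder observations
models = np.random.normal(size=(100, years.size))   # placeholder model runs

for start in range(1950, 2005):   # start years 1950-2004, trend lengths of ~10 to 63 years
    obs_trend = trailing_trend(obs, years, start)
    model_trends = np.array([trailing_trend(m, years, start) for m in models])
    lo, hi = np.percentile(model_trends, [2.5, 97.5])   # 95% model envelope
    if not lo <= obs_trend <= hi:
        print(f"{start}-2013: observed trend {obs_trend:+.2f} lies outside "
              f"the model 95% range [{lo:+.2f}, {hi:+.2f}]")
```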

Climate Insensitivity: What the IPCC Knew But Didn’t Tell Us

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

In a remarkable example of scientific malfeasance, it has become apparent that the IPCC knew a lot more than it revealed in its 2013 climate compendium about how low the earth’s climate sensitivity is likely to be.

The importance of this revelation cannot be overstated. If the UN had played it straight, the “urgency” of global warming would have evaporated, but, recognizing that this might cause problems, they preferred to mislead the world’s policymakers.

Strong words? Judge for yourself.

The report, “Oversensitive—how the IPCC hid the good news on global warming,” was released today by the Global Warming Policy Foundation (GWPF)—a U.K. think-tank which is “concerned about the costs and other implications of many of the policies currently being advocated” regarding climate change (disclosure: our Dick Lindzen is a member of the GWPF Academic Advisory Council).

The new GWPF report concluded:

We believe that, due largely to the constraints the climate model-orientated IPCC process imposed, the Fifth Assessment Report failed to provide an adequate assessment of climate sensitivity – either ECS [equilibrium climate sensitivity] or TCR [transient climate response] – arguably the most important parameters in the climate discussion. In particular, it did not draw out the divergence that has emerged between ECS and TCR estimates based on the best observational evidence and those embodied in GCMs. Policymakers have thus been inadequately informed about the state of the science.

The study was authored by Nicholas Lewis and Marcel Crok. Crok is a freelance science writer from The Netherlands and Lewis, an independent climate scientist, was an author on two recent important papers regarding the determination of the earth’s equilibrium climate sensitivity (ECS)—that is, how much the earth’s average surface temperature will rise as a result of a doubling of the atmospheric concentration of carbon dioxide.

The earth’s climate sensitivity is the most important climate factor in determining how much global warming will result from our greenhouse gas emissions (primarily from the burning of fossil fuels to produce reliable, cheap energy). But the problem is that we don’t know the value of the climate sensitivity—this makes projections of future climate change, how should we say this, a bit speculative.

More Evidence for a Low Climate Sensitivity

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

We have two new entries to the long (and growing) list of papers in the recent scientific literature arguing that the earth’s climate sensitivity—the ultimate rise in the earth’s average surface temperature from a doubling of the atmospheric carbon dioxide content—is close to 2°C, near the low end of the range of possible values presented by the U.N.’s Intergovernmental Panel on Climate Change (IPCC). With low-end warming come low-end impacts and an overall lack of urgency for federal rules and regulations (such as those outlined in the President’s Climate Action Plan) to limit carbon dioxide emissions and restrict our energy choices.

The first is the result of a research effort conducted by Craig Loehle and published in the journal Ecological Modelling. The paper is a pretty straightforward determination of the climate sensitivity.  Loehle first uses a model of natural modulations to remove the influence of natural variability (such as solar activity and ocean circulation cycles) from the observed temperature history since 1850. The linear trend in the post-1950 residuals from Loehle’s natural variability model was then assumed to be largely the result, in net, of human carbon dioxide emissions.  By dividing the total temperature change (as indicated by the best-fit linear trend) by the observed rise in atmospheric carbon dioxide content, and then applying that relationship to a doubling of the carbon dioxide content, Loehle arrives at an estimate of the earth’s transient climate sensitivity—transient, in the sense that at the time of CO2 doubling, the earth has yet to reach a state of equilibrium and some warming is still to come. 

Loehle estimated the equilibrium climate sensitivity from his transient calculation based on the average transient:equilibrium ratio projected by the collection of climate models used in the IPCC’s most recent Assessment Report. In doing so, he arrived at an equilibrium climate sensitivity estimate of 1.99°C, with a 95% confidence range of 1.75°C to 2.23°C.
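The arithmetic here is simple enough to sketch in a few lines. The inputs below are illustrative stand-ins (Loehle’s actual residual trend and CO2 values come from his paper); the logarithmic forcing constant of 5.35 W/m2 and the model TCR-to-ECS ratio are standard approximations we supply for the example.

```python
import numpy as np

d_temp = 0.5                         # assumed post-1950 warming attributed to CO2 (deg C)
co2_start, co2_end = 310.0, 395.0    # assumed CO2 concentrations, ppm

f_2x = 5.35 * np.log(2.0)                        # forcing from a CO2 doubling, ~3.71 W/m2
d_forcing = 5.35 * np.log(co2_end / co2_start)   # forcing from the observed CO2 rise

tcr = d_temp * f_2x / d_forcing      # transient sensitivity, scaled to a doubling
ecs = tcr / 0.56                     # assumed model-average TCR:ECS ratio

print(f"TCR estimate: {tcr:.2f} deg C; implied ECS: {ecs:.2f} deg C")
```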

Compare Loehle’s estimate to the IPCC’s latest assessment of the earth’s equilibrium climate sensitivity, which assigns a 66 percent or greater likelihood that it lies somewhere in the range of 1.5°C to 4.5°C. Loehle’s determination is more precise and decidedly towards the low end of the range.

The second entry on our list of low climate sensitivity estimates comes from Roy Spencer and William Braswell and was published in the Asia-Pacific Journal of Atmospheric Sciences. Spencer and Braswell used a very simple climate model to simulate the global temperature variations averaged over the top 2000 meters of the global ocean during the period 1955-2011. They first ran the simulation using only volcanic and anthropogenic influences on the climate. They ran it again adding a simple take on the natural variability contributed by the El Niño/La Niña process. And they ran it a final time adding a more complex configuration involving a feedback from El Niño/La Niña onto natural cloud characteristics. They then compared their model results with the set of real-world observations.

What they found was that the complex configuration involving El Niño/La Niña feedbacks onto cloud properties produced the best match to the observations. And this configuration also produced the lowest estimate of the earth’s climate sensitivity to carbon dioxide emissions—a value of 1.3°C.

Spencer and Braswell freely admit that their simple model is just the first step in a complicated diagnosis, but they also point out that results from simple models provide insight that should help guide the development of more complex models, and ultimately could help unravel some of the mystery as to why full climate models produce high estimates of the earth’s equilibrium climate sensitivity while estimates based in real-world observations are much lower.
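To give a flavor of how a simple model can diagnose sensitivity, here is a minimal zero-dimensional energy-balance sketch. It is far simpler than Spencer and Braswell’s model (which resolves the ocean in layers and includes the ENSO and cloud-feedback terms we omit), and every parameter value is our own illustrative assumption; with a net feedback parameter of 2.8 W/m2 per degree, it happens to imply a sensitivity near their 1.3°C.

```python
import numpy as np

C = 8.4e8       # heat capacity of a ~200 m ocean mixed layer, J/m2/K (assumed)
lam = 2.8       # net feedback parameter, W/m2/K (assumed; larger = less sensitive)
F_2X = 3.71     # radiative forcing from doubled CO2, W/m2

years = np.arange(1955, 2012)
co2 = np.linspace(315.0, 392.0, years.size)          # rough CO2 history, ppm (assumed)
forcing = F_2X * np.log(co2 / 280.0) / np.log(2.0)   # CO2 forcing vs. preindustrial

dt = 365.25 * 24 * 3600          # one year, in seconds
T = np.zeros(years.size)         # temperature anomaly, deg C
for i in range(1, years.size):
    # C * dT/dt = F(t) - lam * T, stepped forward with a simple Euler scheme
    T[i] = T[i - 1] + dt * (forcing[i - 1] - lam * T[i - 1]) / C

print(f"implied equilibrium climate sensitivity: {F_2X / lam:.2f} deg C per doubling")
```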

Our figure below helps illustrate the discrepancy between climate-model estimates and real-world estimates of the earth’s equilibrium climate sensitivity. It shows Loehle’s determination, as well as that of Spencer and Braswell, along with 16 other estimates reported in the scientific literature beginning in 2011. Also included are the IPCC’s latest assessment of the literature and the equilibrium climate sensitivity characteristics of the collection of climate models that the IPCC uses as the basis of its impacts assessment.

Figure 1. Climate sensitivity estimates from new research beginning in 2011 (colored), compared with the assessed range given in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) and the collection of climate models used in the IPCC AR5. The “likely” (greater than a 66% likelihood of occurrence) range in the IPCC Assessment is indicated by the gray bar. The arrows indicate the 5 to 95 percent confidence bounds for each estimate along with the best estimate (median of each probability density function, or the mean of multiple estimates; colored vertical line). Ring et al. (2012) present four estimates of the climate sensitivity, and the red box encompasses those estimates. The right-hand side of the IPCC AR5 range is actually the 90% upper bound (the IPCC does not actually state the value for the upper 95 percent confidence bound of their estimate). Spencer and Braswell (2013) produce a single ECS value best-matched to ocean heat content observations and internal radiative forcing.

Quite obviously, the IPCC is rapidly losing its credibility.

As a result, the Obama Administration would do well to come to grips with this fact and stop deferring to the IPCC findings when trying to justify increasingly burdensome federal regulation of carbon dioxide emissions, with the combined effects of manipulating markets and restricting energy choices.

References:

Loehle, C., 2014. A minimal model for estimating climate sensitivity. Ecological Modelling, 276, 80-84.

Spencer, R.W., and W. D. Braswell, 2013. The role of ENSO in global ocean temperature changes during 1955-2011 simulated with a 1D climate model. Asia-Pacific Journal of Atmospheric Sciences, doi:10.1007/s13143-014-0011-z.

Closing the Books on 2013: Another Year, Another Nail in the Coffin of Disastrous Global Warming

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

A few weeks have now passed since the end of last year, giving enough time for various data-compiling (and “data-adjusting”) agencies to get their numbers in order and to release the sad figures from 2013.

U.S. Annual Average Temperature

We pointed out, back in this post in mid-December, that there was an outside chance—if December were cold enough—that the average annual temperature for the U.S. in 2013 would fall below the 20th century average for the first time since 1996. Well, despite how cold it seemed in December, it turned out not to be quite cold enough to push the January–December 2013 temperature anomaly into negative territory. Figure 1 below shows the U.S. temperature history as compiled by the National Climatic Data Center from 1895 through 2013.

Figure 1. U.S. annual average temperature as compiled by the National Climatic Data Center, 1895-2013 (data: NCDC Climate at a Glance).

Please be advised that this history has been repeatedly “revised” to either make temperatures colder in the earlier years or warmer at the end. Not one “adjustment” has had the opposite effect, a clear contravention of logic and probability. While the U.S. has gotten slightly warmer in recent decades compared with the early 20th century, so have the data themselves. It’s a fact that if you simply take the thousands of fairly evenly spaced “official” weather stations around the country and average them up since 1895, you won’t get much of a warming trend at all. Consequently, a major and ongoing federal effort has been to cram these numbers into the box imposed by the theory that gives the government the most power—i.e., strong global warming.

What immediately stands out in 2013 is how exceptional the average temperature in 2012 (the warmest year in the record) really was. In fact, the recovery in 2013 from the lofty heights of 2012 was the largest year-over-year temperature decline in the complete 119-year record—an indication that 2012 was an outlier rather than “the new normal.”
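Checking that claim is a one-liner once the annual values are in hand. A quick sketch, assuming the NCDC Climate at a Glance annual values have been exported to a two-column CSV (the file name below is hypothetical):

```python
import numpy as np

# columns: year, annual average temperature (file exported by hand; name is hypothetical)
years, temps = np.loadtxt("ncdc_us_annual.csv", delimiter=",", skiprows=1, unpack=True)

change = np.diff(temps)     # year-over-year change
i = np.argmin(change)       # most negative entry = biggest one-year drop
print(f"largest one-year decline: {change[i]:.2f} degrees, "
      f"from {int(years[i])} to {int(years[i + 1])}")
```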

Hot Air About Cold Air

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

Last summer, we predicted that come this winter, any type of severe weather event was going to be linked to pernicious industrial activity (via global warming) through a new mechanism that had become a media darling—the loss of late summer/early fall Arctic sea ice leading to more persistent patterns in the jet stream. These are known as “blocking” patterns, which generally means that the same type of weather (usually somewhat extremish) hangs around longer than usual.

This global-warming-leading-to-more-extreme-winter-weather mechanism has been presented in several recent papers, perhaps the most noteworthy of which was a 2012 publication by Jennifer Francis and Stephen Vavrus, which was the subject of one of our blog posts last summer. We noted then how their idea ran counter to much of the extant literature on the topic, as well as to a host of other newly published papers investigating historical jet stream patterns.

After running through a list of observations compiled from the scientific literature countering the Francis and Vavrus explanation of things, we nevertheless wondered:

It’ll be interesting to see during this upcoming winter season how often the press—which seems intent on seeking to relate all bad weather events to anthropogenic global warming—turns to the Francis and Vavrus explanation of winter weather events, and whether or not the growing body of new and conflicting science is ever brought up.

We didn’t have to wait long. After a couple of early winter southward Arctic air excursions, the familiar and benign-sounding “jet stream” had become the “polar vortex”[1] which “sucked in” the United States. Of course, the U.S. being sucked into a polar vortex was part and parcel of what was to be expected from global warming.

Since we had predicted this action/reaction, we weren’t terribly surprised.

What did surprise us (although perhaps it shouldn’t have) is that the White House joined in the polar vortex horror show, releasing a video in which John Holdren, the President’s Science Advisor—arguably the highest-ranking “scientist” in the U.S.—linked the frigid air to global warming.

In the video, Holdren boldly stated:

 …a growing body of evidence suggests that kind of extreme cold being experienced by much of the United States as we speak is a pattern that we can expect to see with increasing frequency as global warming continues…

It seems that Holdren keeps up with neither our writings at Cato nor the scientific literature on the topic.

While it could perhaps be argued that Holdren’s statement is not an outright lie, it is, at its very best, a half-truth, and even that is a stretch. In fact, there is a larger and faster-growing body of evidence that directly disputes Holdren’s contention.

In addition to the evidence that we reported on here and here, a couple of brand new papers just hit the scientific journals this month that emphatically reject the hypothesis that global warming is leading to more blocking patterns in the jet stream (and accompanying severe weather outbreaks across the U.S.).

The first paper is a modeling study by a team of U.K. scientists led by Giacomo Masato of the University of Reading. Masato and his colleagues examined how the magnitude and frequency of atmospheric blocking events in the Atlantic–Europe region are projected to change in the future according to four climate models that, the authors report, match the observed characteristics of blocking events in this region reasonably well. What they found completely contradicts Holdren’s claim. While the researchers did note a small model-projected future increase in the frequency of blocking patterns over the Atlantic (the ones which impact the weather in the U.S.), they found that both the strength of the blocking events and the associated surface temperature anomalies over the continental U.S. were considerably moderated. In other words, global warming was expected to make “polar vortex”-associated cold outbreaks less cold.

The second paper is by a research team led by Colorado State University’s Elizabeth Barnes. In their paper “Exploring recent trends in Northern Hemisphere blocking,” Barnes and colleagues used various meteorological definitions of “blocking” along with various datasets of atmospheric conditions to assess whether or not there have been any trends in the frequency of blocking events that could be tied to changes in global warming and/or the declines in Arctic sea ice.
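For readers unfamiliar with how “blocking” is quantified, here is a sketch of one widely used definition of the sort Barnes and colleagues compare: the Tibaldi and Molteni (1990) reversal index, which flags longitudes where the usual poleward decrease of 500 hPa geopotential height flips sign. The z500 and lats arrays are hypothetical inputs (a gridded reanalysis field); this is our illustration of the general approach, not the authors’ code.

```python
import numpy as np

def blocked_longitudes(z500, lats, deltas=(-5.0, 0.0, 5.0)):
    """Return a boolean (time, lon) mask that is True where the mid-latitude
    height gradient reverses -- the classic signature of a blocking high.
    z500: 500 hPa geopotential height (m), dimensions (time, lat, lon)."""
    blocked = np.zeros((z500.shape[0], z500.shape[2]), dtype=bool)
    for delta in deltas:   # allow the pattern to shift in latitude
        phi_n, phi_0, phi_s = 80.0 + delta, 60.0 + delta, 40.0 + delta
        jn, j0, js = (int(np.argmin(np.abs(lats - p))) for p in (phi_n, phi_0, phi_s))
        ghgs = (z500[:, j0, :] - z500[:, js, :]) / (phi_0 - phi_s)   # southern gradient
        ghgn = (z500[:, jn, :] - z500[:, j0, :]) / (phi_n - phi_0)   # northern gradient
        blocked |= (ghgs > 0.0) & (ghgn < -10.0)   # units: m per degree latitude
    return blocked
```

A blocking “event” is then typically required to persist for several days and span a minimum range of longitudes, which is where the various definitions the authors test begin to diverge.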

They found no such associations.

From their conclusions:

[T]he link between recent Arctic warming and increased Northern Hemisphere blocking is currently not supported by observations. While Arctic sea ice experienced unprecedented losses in recent years, blocking frequencies in these years do not appear exceptional, falling well within their historically observed range. The large variability of blocking occurrence, on both inter-annual and decadal time scales, underscores the difficulty in separating any potentially forced response from natural variability.

In other words, natural variability dominates the observed record, making it impossible to detect any human-caused global warming signal even if one were to exist (and there is no proof that one does).

So the most recent science shows 1) no observed relationship between global warming and severe winter weather outbreaks, and 2) that future “polar vortex”-associated cold outbreaks are projected to moderate—yet the White House prepares a special video proclaiming the opposite, with the intent of spreading climate alarm.

Full scientific disclosure in matters pertaining to global warming is not a characteristic that we have come to expect with this Administration.

References:

Barnes, E., et al., 2014. Exploring recent trends in Northern Hemisphere blocking. Geophysical Research Letters, doi:10.1002/2013GL058745.

Francis, J. A., and S. J. Vavrus, 2012. Evidence linking Arctic amplification to extreme weather in mid-latitudes. Geophysical Research Letters, 39, doi:10.1029/2012GL051000.

Masato, G., T. Woollings, and B.J. Hoskins, 2014. Structure and impact of atmospheric blocking over the Euro-Atlantic region in present day and future simulations. Geophysical Research Letters, doi:10.1002/2013GL058570.


[1] For what it’s worth, there have been two polar vortices (north and south) on planet earth for as long as it has had an atmosphere and maintained its rotation.

CO2 Regulation News from the Federal Register

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

The Federal Register has been brimming with announcements of government activities aimed at reducing and regulating the carbon dioxide emissions emanating from the United States.

You may wonder why the government finds the need to pursue such action, since 1) U.S. carbon dioxide emissions have already topped out and have generally been on the decline for the past 7–8 years (from technological advances in natural gas extraction and a slow economy more so than from already-enacted government regulations and subsidies); 2) greenhouse gas emissions from the rest of the world (primarily driven by China) have been skyrocketing over the same period, which lessens any impact of our emissions reductions; and 3) even in their totality, U.S. carbon dioxide emissions have a negligible influence on local, regional, and global climate change (even an immediate and permanent cessation of all our carbon dioxide emissions would likely reduce the global temperature rise by less than one-quarter of a degree C by the end of the century; see the sketch below).

We wonder the same thing. Nevertheless, the government has lots of ideas for how to save ourselves from ourselves (likely with the opposite outcome).
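To put a rough number on point 3 above, here is a back-of-the-envelope version of the cessation calculation. The emissions figure and the transient climate response to cumulative emissions (TCRE) are round values we supply for illustration; the IPCC’s assessed “likely” TCRE range is roughly 0.8–2.5°C per 1000 GtC.

```python
US_EMISSIONS_GTCO2 = 5.4     # approximate 2013 U.S. CO2 emissions, GtCO2/yr (assumed)
GTC_PER_GTCO2 = 1.0 / 3.67   # convert mass of CO2 to mass of carbon
YEARS = 2100 - 2014          # an immediate, permanent cessation through 2100
TCRE = 1.8                   # deg C of warming per 1000 GtC emitted (assumed mid-range)

avoided_carbon = US_EMISSIONS_GTCO2 * GTC_PER_GTCO2 * YEARS   # GtC never emitted
avoided_warming = TCRE * avoided_carbon / 1000.0
print(f"warming avoided by 2100: ~{avoided_warming:.2f} deg C")   # about 0.23 deg C
```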

Here is a summary of new announcements appearing in the Federal Register over the past month or so on actions aimed at curtailing our carbon dioxide emissions (primarily the result of our desire for cheap and reliable energy—gasp!).

Posted November 26, 2013: The Office of Management and Budget (OMB) announced a call for review of the Technical Support Document currently justifying the Administration’s value of the social cost of carbon (SCC) used in federal cost/benefit analyses. We have discussed this announcement previously, and while it provides a glimmer of hope for injecting some new science and common sense into the government’s social cost of carbon, we are highly skeptical of a positive outcome. We mention the announcement again here because the public comment period ends on January 27, 2014. Comments can be submitted here.

Posted December 6, 2013: The Department of Energy announced another in its seemingly endless string of intrusions into our personal choices through its energy efficiency requirement updates for all sorts of products. These revised efficiency regulations rely on the SCC to offset the costs and inflate the apparent benefits of the new requirements. We have already submitted comments on several of these proposed regulations (from walk-in refrigerators to furnace fans), but they just keep on coming. The latest pertains to commercial and industrial electric motors. Final comments are due February 4, 2014, and can be submitted here.

Posted December 31, 2013: The Department of Energy (DoE) announced that it has declined a Petition for Reconsideration of its rule updating the energy conservation standards for microwave ovens. The Petition for Reconsideration was brought by the Landmark Legal Foundation, which pointed out that the DoE used a social cost of carbon estimate in the cost/benefit analysis for the rule that had not been subject to public comment and that was some 50% higher than the value used in the cost/benefit analysis that was available for public comment. In other words, the DoE pulled a pretty big bait and switch. We at Cato’s Center for the Study of Science submitted comments on the Landmark Petition pointing out just how far afield from the actual science the Administration’s SCC estimate has become. The denial was disappointing, but the fight over the proper value of the SCC has now moved to the OMB (as described above).