Tag: global warming

Oops: Got the Sign Wrong Trying to Explain Away the Global Warming “Pause”

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

A couple of years ago, when it was starting to become obvious that the average global surface temperature was not rising at anywhere near the rate that climate models projected, and in fact seemed to be leveling off rather than speeding up, explanations for the slowdown sprouted like mushrooms in compost.

We humbly suggested a combination of natural variability and a lower “sensitivity” of surface temperature to rising carbon dioxide.

Now, several years later, the “pause” continues. Natural variability is now widely accepted as making a significant contribution, and our argument for a lowered climate sensitivity—which would indicate that existing climate models are not reliable tools for projecting future climate trends—is buoyed by accumulating evidence and is gaining support in the broader climate research community. Yet it is largely rejected by federal regulators and their scientific supporters. These folks prefer rather more exotic explanations that seek to deflect the blame away from the climate models and thus preserve their overheated projections of future global warming.

The problem with exotic explanations is that they tend to unravel like exotic dancers.

Such is the case for the explanation—popular with the press when it was first proposed—that an increase in aerosol emissions, particularly from China, was acting to help offset the warming influence of anthropogenic carbon dioxide emissions.

The suggestion was made back in 2011 by a team of researchers led by Boston University’s Robert Kaufmann and published in the Proceedings of the National Academy of Sciences. Shortly after it appeared, we were critical of it in these pages, pointing out how the explanation was inconsistent with several lines of data.

Now, a new paper in the peer-reviewed scientific literature takes a deeper look at aerosol emissions during the past 15 years and finds that changes in aerosol emissions over the period 1996–2010 contributed a net warming pressure to the earth’s climate.

Kühn et al. (2014) write:

Increases in Asian aerosol emissions have been suggested as one possible reason for the hiatus in global temperature increase during the past 15 years. We study the effect of sulphur and black carbon (BC) emission changes between 1996-2010 on the global energy balance. We find that the increased Asian emissions have had very little regional or global effects, while the emission reductions in Europe and the U.S. have caused a positive radiative forcing. In our simulations, the global-mean aerosol direct radiative effect changes 0.06 W/m2 during 1996–2010, while the effective radiative forcing (ERF) is 0.42 W/m2.

So, in other words, rather than acting to slow global warming during the past decade and a half as proposed by Kaufmann et al. (2011), changes in anthropogenic aerosol emissions (including declining emissions in North America and Europe) have acted to enhance global warming (described as a positive radiative forcing in the above quote).

This means that the “pause,” or whatever you want to call it, in the rise of global surface temperatures is even more significant than it is generally taken to be, because whatever the reason behind it, it is offsetting not only the rise from greenhouse gas emissions but also the added rise from changes in aerosol emissions.

Until we understand what this sizeable mechanism is and how it works, our ability to reliably look into the future and foresee what climate lies ahead is a mirage. Yet, somehow, the Obama Administration is progressing full speed ahead with regulations about the kinds of cars and trucks we can drive, the appliances we use, and the types of energy available, etc., all in the name of mitigating future climate change.

As we repeatedly point out, not only will the Obama Administration’s actions have no meaningful impact on the amount of future climate change, but it is far from clear that the rate of future change will even be enough to mitigate—or even to worry about.

References

Kaufmann, R. K., et al., 2011. Reconciling anthropogenic climate change with observed temperature 1998–2008. Proceedings of the National Academy of Sciences. doi: 10.1073/pnas.1102467108

Kühn, T., et al., 2014. Climate impacts of changing aerosol emissions since 1996. Geophysical Research Letters, doi: 10.1002/2014GL060349

0.02°C Temperature Rise Averted: The Vital Number Missing from the EPA’s “By the Numbers” Fact Sheet

Last week, the Obama Administration’s U.S. Environmental Protection Agency (EPA) unveiled a new set of proposed regulations aimed at reducing carbon dioxide emissions from existing U.S. power plants. The motivation for the EPA’s plan comes from the President’s desire to address and mitigate anthropogenic climate change.

We hate to be the party poopers, but the new regulations will do no such thing.

The EPA’s regulations seek to limit carbon dioxide emissions from electricity production in the year 2030 to a level 30 percent below what they were in 2005. It is worth noting that power plant CO2 emissions already dropped by about 15 percent from 2005 to 2012, largely because of market forces that favor less-CO2-emitting natural gas over coal as the fuel of choice for producing electricity. Apparently the President wants to lock in those gains and manipulate the market to see that the same decline takes place in twice the time. Nothing like government intervention to facilitate market inefficiency. But we digress.
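A little back-of-the-envelope arithmetic shows how the two figures cited above relate. This is a sketch only: emissions are normalized to the 2005 baseline, and the 15 percent and 30 percent figures are the ones quoted in the text.

```python
# Sketch: how much of the EPA's 30%-below-2005 target was already
# met by 2012, with emissions normalized to the 2005 baseline (= 1.0).
baseline_2005 = 1.0
level_2012 = baseline_2005 * (1 - 0.15)   # ~15% below 2005 by 2012
target_2030 = baseline_2005 * (1 - 0.30)  # 30% below 2005 by 2030

# Share of the total required cut already achieved by 2012
share_done = (baseline_2005 - level_2012) / (baseline_2005 - target_2030)

# Further cut required from the 2012 level to reach the 2030 target
remaining_cut = (level_2012 - target_2030) / level_2012

print(f"{share_done:.0%} of the required cut already made")     # 50%
print(f"{remaining_cut:.1%} further cut needed from 2012 level") # 17.6%
```

In other words, half the cut had already happened by 2012 without any regulation, which is the point being made about locking in market-driven gains.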

The EPA highlighted what the plan would achieve in their “By the Numbers” Fact Sheet that accompanied their big announcement.

For some reason, they left off their Fact Sheet any mention of how much climate change would be averted by the plan. That seems a strange omission since, after all, without the threat of climate change, no one would be thinking about the forced abridgement of our primary source of power production in the first place, and the Administration’s new emissions restriction scheme wouldn’t even be a gleam in this or any other president’s eye.

But no worries.  What the EPA left out, we’ll fill in.

Climate Science: No Dissent Allowed

Award-winning climate modeler experiences “a situation that reminds me about the time of McCarthy”

An interesting juxtaposition of items appeared in our Inbox today.

First was an announcement that Dr. Lennart Bengtsson, former director of the Max Planck Institute for Meteorology, had resigned from the Academic Advisory Council of the U.K.’s Global Warming Policy Foundation. What was surprising about this announcement was that it was just announced a week or so ago that Dr. Bengtsson—a prominent and leading climate modeler and research scientist—was joining the GWPF Council. At that time, there was some wondering aloud as to why Dr. Bengtsson would join an organization that was somewhat “skeptical” when it comes to the projections and impacts of climate change and the effectiveness and direction of climate change policy.

During one recent interview Dr. Bengtsson explained:

I think the climate community shall be more critical and spend more time to understand what they are doing instead of presenting endless and often superficial results and to do this with a critical mind. I do not believe that the IPCC machinery is what is best for science in the long term. We are still in a situation where our knowledge is insufficient and climate models are not good enough. What we need is more basic research freely organized and driven by leading scientists without time pressure to deliver and only deliver when they believe the result is good and solid enough. It is not for scientists to determine what society should do. In order for society to make sensible decisions in complex issues it is essential to have input from different areas and from different individuals. The whole concept behind IPCC is basically wrong.

A good summary of the buzz that surrounded Dr. Bengtsson and his association with GWPF is contained over at Judith Curry’s website, Climate Etc.

So why did Dr. Bengtsson suddenly resign? 

California Shouldn’t Be Able to Impose Regulations on Businesses Outside of California

One of the several failures of the Articles of Confederation was the incapacity of the central government to deal with trade disputes among the states. The Constitution resolved this problem by empowering the federal government to regulate interstate commerce. It has since become a basic principle of American federalism that a state may not regulate actions in other states or impede the interstate flow of goods based on out-of-state conduct (rather than on the features of the goods themselves).

That principle was axiomatic until the U.S. Court of Appeals for the Ninth Circuit upheld one particular extraterritorial California regulation. California recently established a Low Carbon Fuel Standard (“LCFS”) that attempts to rate the “carbon intensity” of liquid fuels so that carbon emissions can be reduced in the Golden State. California considers not only the carbon emissions from burning the fuel itself, however, but also the entire “lifetime” of the fuel, including its manufacture and transportation.

This has led to complaints from Midwestern ethanol producers, whose product—which is in all other ways identical to California-produced ethanol—is severely disadvantaged in California’s liquid fuel markets, simply because it comes from farther away. Groups representing farmers and fuel manufacturers sued, arguing that the LCFS constitutes a clear violation of the Commerce Clause (the Article I federal power to regulate interstate commerce) by discriminating against interstate commerce and allowing California to regulate conduct occurring wholly outside its borders. The Ninth Circuit recently upheld the LCFS, finding the regulation permissible because its purpose was primarily environmental rather than economic protectionism (although judges dissenting from the court’s denial of rehearing pointed out that this is the wrong standard to apply).

The farmers and fuel manufacturer groups have now submitted a petition to have their case heard by the Supreme Court. Cato has joined the Pacific Legal Foundation, National Federation of Independent Business, Reason Foundation, California Manufacturers & Technology Association, and the Energy & Environmental Legal Institute on an amicus brief supporting the petition.

We argue that the lower court’s ruling provides a template for other states to follow should they want to evade Supreme Court precedents barring obstruction of interstate commerce and extraterritorial regulation. As the Founders fully recognized, ensuring the free flow of commerce among the states is vital to the wellbeing of the nation, and California’s actions—and the Ninth Circuit’s endorsement of them—threaten to clog up that flow. Not only does the appellate ruling allow California to throw national fuel markets into disarray, it invites other states to destabilize interstate markets and incite domestic trade disputes—precisely the type of uncooperative behavior the Constitution was designed to prevent.

The Supreme Court will likely decide whether to take Rocky Mountain Farmers Union v. Corey before it recesses for the summer. For more on the case, see this blogpost by PLF’s Tony Francois.

This blogpost was co-authored by Cato legal associate Julio Colomba.

Say What!?

While the social cost of carbon (SCC) is still being mulled over by the Office of Management and Budget, other federal agencies continue to push ahead with using the SCC to help justify their many regulations.

The way this works is that for every ton of carbon dioxide (CO2) that any new regulation is supposed to keep from being emitted into the atmosphere, the proposing agency gets about $32 credit to use to offset the costs that the new regulation will generate. This way, new regulations seem less costly—an attractive quality when trying to gain acceptance.

The idea is that the damage resulting from future climate changes will be decreased by $32 for every ton of carbon dioxide that is not emitted.
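The bookkeeping works roughly like this. A minimal sketch follows: only the roughly $32-per-ton SCC value comes from the text above; the tonnage and compliance-cost figures are made-up illustrative numbers, and the function name is our own.

```python
# Sketch of how an agency's cost/benefit ledger uses the SCC.
# SCC_PER_TON is the ~$32/ton figure cited above; everything else
# in the example call is hypothetical.
SCC_PER_TON = 32.0  # dollars of claimed avoided climate damage per ton CO2

def net_regulatory_cost(compliance_cost, tons_co2_avoided, scc=SCC_PER_TON):
    """Compliance cost minus the monetized 'climate benefit' credit."""
    climate_benefit = tons_co2_avoided * scc
    return compliance_cost - climate_benefit

# A hypothetical rule costing $10 million that avoids 200,000 tons of CO2
# books a $6.4 million climate benefit, so it reports only $3.6 million net.
print(net_regulatory_cost(10_000_000, 200_000))  # 3600000.0
```

The larger the assumed SCC, the cheaper any emissions-reducing rule appears on paper, which is why the choice of that one number matters so much.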

There is so much wrong with the way the government arrives at this number that we have argued that the SCC should be tossed out and barred from use in all federal rulemaking. It is far better not to include any value for the SCC in cost/benefit analyses than to include one that is knowingly improper, inaccurate, and misleading.

Further, it is highly debatable that federal regulations limiting carbon dioxide emissions will have any detectable impact on future climate change. To see for yourself, try out our global warming calculator, which lets you select the magnitude of future carbon dioxide emissions reductions as well as which countries participate in your plan. The best the U.S. can do—even if it were to halt all CO2 emissions now and forever—is to knock about 0.1°C off the total climate model-projected global temperature rise by the year 2100. In other words, U.S. actions are not very effective in limiting future climate change.

Apparently, the feds, too, agree that their plethora of proposed regulations will have little impact on carbon dioxide emissions and future climate change. But that doesn’t stop them from issuing them.

The passage below is from the proposed rulemaking from the Department of Energy to alter the Energy Conservation Standards for Commercial and Industrial Electric Motors  (this is only one of many proposed regulations making this claim):

The purpose of the SCC estimates presented here is to allow agencies to incorporate the monetized social benefits of reducing CO2 emissions into cost-benefit analyses of regulatory actions that have small, or “marginal,” impacts on cumulative global emissions.

In other words, DoE’s regulations won’t have any real impact on global CO2 emissions (and, in that manner, climate change), but nevertheless they’ll take a monetary credit for reduced damages that supposedly will result from the non-effective regulations.

(I wonder if I can try that on my taxes.)

It seems a bit, uh, cheeky to take credit for something that you admit won’t happen.

But that’s the logic of the federal government for you!

Was Typhoon Haiyan the Most Intense Storm in Modern History?

Global warming buffs have been fond of claiming that the roaring winds of Typhoon Haiyan were the highest ever measured in a landfalling tropical cyclone, and that therefore (?) this is a result of climate change. In reality, it’s unclear whether or not it holds the modern record for the strongest surface wind at landfall. 

This won’t be known until there is a thorough examination of its debris field.

The storm of record is 1969 Hurricane Camille, which I rode out in an oceanfront laboratory about 25 miles east of the eye. There’s a variety of evidence arguing that Camille is going to be able to retain her crown.

The lowest pressure in Haiyan was 895 millibars, or 26.42 inches of mercury. To give an idea, the needle on your grandmother’s dial barometer would have to turn two complete counterclockwise circles to get there. While there have been four storms in the Atlantic in the modern era that were as strong or a bit stronger, the western Pacific sees one of these every two years or so.
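The unit conversion behind those figures is straightforward. A sketch, using the standard conversion factor of 1 inch of mercury ≈ 33.8639 millibars:

```python
# Sketch: converting central pressure from millibars (hPa) to
# inches of mercury, using the standard factor 1 inHg = 33.8639 mb.
MB_PER_INHG = 33.8639

def mb_to_inhg(pressure_mb):
    """Convert a pressure in millibars to inches of mercury."""
    return pressure_mb / MB_PER_INHG

haiyan_mb = 895.0   # Haiyan's lowest central pressure
camille_mb = 905.0  # Camille's lowest central pressure

print(f"Haiyan:  {mb_to_inhg(haiyan_mb):.2f} inHg")   # ~26.4
print(f"Camille: {mb_to_inhg(camille_mb):.2f} inHg")  # ~26.7
```

For scale, standard sea-level pressure is 1013.25 mb (29.92 inHg), so both storms sat more than 100 mb below normal.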

Camille’s lowest pressure was a bit higher, at 905 mb (26.72 inches). At first blush it would therefore seem Haiyan would win the blowhard award hands down, but Haiyan had a very large eye around which its winds swirled, while Camille’s was one of the smallest ever measured. At times in its brief life, Camille’s eye was so small that the hurricane hunter aircraft could not safely complete a 360-degree turn without brushing through the devastating innermost cloud band, something you just don’t want to be near in a turning aircraft. In fact, the last aircraft to get into Camille, which measured 190 mph sustained winds, lost an engine in the severe turbulence and fortunately was able to limp home.

Haiyan’s estimated 195 mph winds were derived from satellite data rather than being directly sensed by an aircraft. But winds over the open ocean are always greater than those at landfall because of friction, and the five mph difference between the two storms is physically meaningless.

Current Wisdom: Observations Now Inconsistent with Climate Model Predictions for 25 (going on 35) Years

The Current Wisdom is a series of monthly articles in which Patrick J. Michaels and Paul C. “Chip” Knappenberger, from Cato’s Center for the Study of Science, review interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.  

Question: How long will the fantasy that climate models are reliable indicators of the earth’s climate evolution persist in the face of overwhelming evidence to the contrary?

Answer: Probably for as long as there is a crusade against fossil fuels.  

Without the exaggerated alarm conjured from overly pessimistic climate model projections of climate change from carbon dioxide emissions, fossil fuels—coal, oil, gas—would regain their image as the celebrated agents of prosperity that they are, rather than being labeled as pernicious agents of our destruction.

Just how credible are these climate models?  

In two words, “they’re not.”  

Everyone has read that over the past 10-15 years, most climate models’ forecasts of the rate of global warming have been wrong. Most predicted a hefty warming of the earth’s average surface temperature to have taken place, while there was no significant change in the real world.  

But very few people know that the same situation has persisted for 25, going on 35 years, or that over the past 50-60 years (since the middle of the 20th century), the same models expected about 33 percent more warming to have taken place than was observed.