Tag: climate change

The Hurricane Last Time

As of this writing, Tuesday, September 11, Hurricane Florence is threatening millions of folks from South Carolina to Delaware. It's currently forecast to be near the threshold of the dreaded Category 5 by tomorrow afternoon. Current thinking is that its environment will become a bit less conducive as it nears the North Carolina coast on Thursday afternoon, but that it will still hit as a major hurricane (Category 3+). It's also forecast to slow down or stall shortly thereafter, which means it will dump disastrous amounts of water on southeastern North Carolina. Isolated totals of more than two feet are likely.

At the same time that it makes landfall, the celebrity-studded "Global Climate Action Summit" will be underway in San Francisco, and no doubt Florence will be its poster girl.

There’s likely to be the usual hype about tropical cyclones (the generic term for hurricanes) getting worse because of global warming, even though their integrated energy and frequency, as published by Cato Adjunct Scholar Ryan Maue, show no warming-related trend whatsoever.

Maue's Accumulated Cyclone Energy index shows no increase in global power or strength.

Here is the prevailing consensus opinion of the National Oceanic and Atmospheric Administration’s Geophysical Fluid Dynamics Laboratory (NOAA GFDL): “In the Atlantic, it is premature to conclude that human activities–and particularly greenhouse gas emissions that cause global warming–have already had a detectable impact on hurricane activity.”

We'll also hear that associated rainfall is increasing along with oceanic heat content. Everything else being equal (dangerous words in science), that's true. And if Florence does stall out, hey, we've got a climate change explanation for that, too! The jet stream is "weirding" because of atmospheric blocking induced by Arctic sea-ice depletion. This is a triple bank shot on the climate science billiards table. If that seems a stretch, it is, but climate models can be and are "parameterized" to give what the French climatologist Frédéric Hourdin recently called "an anticipated acceptable range" of results.

The fact is that hurricanes are temperamental beasts. On September 11, 1984, Hurricane Diana, also a Category 4, took aim at pretty much the same spot where Florence is forecast to make landfall—Wilmington, North Carolina. And then—34 years ago—it stalled and turned a tight loop for a day, upwelling the cold water that lies beneath the surface, and it rapidly withered into a Category 1 before finally moving inland. (Some recent model runs for Florence have it looping over the exact same place.) The point is that what is forecast to happen on Thursday night—a major Category 3+ landfall—darned near happened over three decades earlier… and exactly 30 years before that, in 1954, Hurricane Hazel made a destructive Category 4 landfall just south of the NC/SC border. The shape of the Carolina coastlines and barrier islands makes the two states very susceptible to destructive hits. Fortunately, this proclivity toward taking direct hits from hurricanes has also taught the locals to adapt—many homes are on stilts, and there is a resilience built into their infrastructure that is lacking further north.

There's long been a running research thread on how hurricanes may change in a warmer world. One thing that seems plausible is that the maximum potential power may shift a bit further north. What would that look like? Dozens of computers have cranked away at thousands of years of simulations, and we have a mixture of results: but the consensus is that there will be slightly fewer but more intense hurricanes by the end of the 21st century.

We actually have an example of how far north a Category 4 can strike: on August 27, 1667, one came ashore in the tidewater region of southeast Virginia. It prompted the publication of a pamphlet in London called "Strange News from Virginia, being a true relation of the great tempest in Virginia." The late, great weather historian David Ludlum published an excerpt:

Having this opportunity, I cannot but acquaint you with the Relation of a very strange Tempest which hath been in these parts (with us called a Hurricane) which began on Aug. 27 and continued with such Violence that it overturned many houses, burying in the Ruines much Goods and many people, beating to the ground such as were in any ways employed in the fields, blowing many Cattle that were near the Sea or Rivers, into them, (!!-eds), whereby unknown numbers have perished, to the great affliction of all people, few escaped who have not suffered in their persons or estates, much Corn was blown away, and great quantities of Tobacco have been lost, to the great damage of many, and the utter undoing of others. Neither did it end here, but the Trees were torn up by their roots, and in many places the whole Woods blown down, so that they cannot go from plantation to plantation. The Sea (by the violence of the winds) swelled twelve Foot above its usual height, drowning the whole country before it, with many of the inhabitants, their Cattle and Goods, the rest being forced to save themselves in the Mountains nearest adjoining, where they were forced to remain many days in great want.

Ludlum also quotes from a letter from Thomas Ludwell to Virginia Governor Lord Berkeley about the great tempest:

This poore Country…is now reduced to a very miserable condition by a continual course of misfortune…on the 27th of August followed the most dreadful Harry Cane that ever the colony groaned under. It lasted 24 hours, began at North East and went around to Northerly till it came to South East when it ceased. It was accompanied by a most violent raine, but no thunder. The night of it was the most dismal time I ever knew or heard of, for the wind and rain raised so confused a noise, mixed with the continual cracks of falling houses…the waves were impetuously beaten against the shores and by that violence forced and as it were crowded the creeks, rivers and bays to that prodigious height that it hazarded the drownding of many people who lived not in sight of the rivers, yet were then forced to climb to the top of their houses to keep themselves above water…But then the morning came and the sun risen it would have comforted us after such a night, hat it not lighted to us the ruins of our plantations, of which I think not one escaped. The nearest computation is at least 10,000 house blown down.

It is too bad that there were no anemometers at the time, but the damage and storm surge are certainly consistent with a Category 4 storm. And this was in 1667, at the nadir of the Little Ice Age.

Greenland Update: New Evidence for Post Ice-Age Warmth

Last month, we summarized evidence for the long-term stability of Greenland's ice cap, even in the face of dramatically warmed summer temperatures. We drew particular attention to the heat in northwest Greenland at the beginning of the previous (as opposed to the current) interglacial. A detailed ice core shows around 6,000 years of summer temperatures averaging 6-8°C (11-14°F) warmer than the 20th-century average, beginning around 118,000 years ago. Despite six millennia of temperatures likely warmer than anything we could produce for even 500 years, Greenland lost only about 30% of its ice. That translates to only about five inches of sea level rise per century from meltwater.

We also cited evidence that after the beginning of the current interglacial (nominally 10,800 years ago) it was also several degrees warmer than the 20th century, but not as warm as it was at the beginning of the previous interglacial.

Not so fast. Work just published online in the Proceedings of the National Academy of Sciences by Jamie McFarlin (Northwestern University) and several coauthors now shows July temperatures averaged 4-7°C (7-13°F) warmer than the 1952-2014 average over northwestern Greenland from 8 to 10 thousand years ago. She also had some less precise data for maximum temperatures in the last interglacial, and they are in agreement (maybe even a tad warmer) with what was found in the ice core data mentioned in the first paragraph.

Award McFarlin some serious hard-duty points. Her paleoclimate indicator was the assemblage of midges buried in the annual sediments under Wax Lips Lake (we don't make this stuff up), a small freshwater body in northwest Greenland between the ice cap and Thule Air Base, on the shore of the channel between Greenland and Ellesmere Island. Midges are horrifically irritating, tiny biting flies that infest most high-latitude summer locations. They're also known as no-see-ums, and they are just as nasty now as they were thousands of years ago.

Getting the core samples from Wax Lips Lake means being out there during the height of midge season.

She acknowledges the seeming paradox of the ice core data: how could it have been so warm even as Greenland retained so much of its ice? Her (reasonable) hypothesis is that it must have snowed more over the ice cap—a process recently demonstrated to have been occurring for the last 200 years in Antarctica as the surrounding ocean warmed a tad.

The major moisture source for snow in northwesternmost Greenland is the Arctic Ocean and the broad passage between Greenland and Ellesmere. The only way it could snow enough to compensate for the two massive warmings that have now been detected is for the water to have been warmer, increasing the amount of moisture in the air. As we noted in our last Greenland piece, the Arctic Ocean was periodically ice-free for millennia after the ice age.

McFarlin’s results are further consistent, at least in spirit, with other research showing northern Eurasia to have been much warmer than previously thought at the beginning of the current interglacial.

Global warming apocalypse scenarios are driven largely by the rapid loss of massive amounts of Greenland ice, but the evidence keeps coming in that, in toto, it’s remarkably immune to extreme changes in temperature, and that an ice-free Arctic Ocean has been common in both the current and the last interglacial period. 

Climate Change: What Would Kavanaugh Do?

In a 2012 dissent from an opinion of the U.S. Court of Appeals for the District of Columbia Circuit, Supreme Court nominee Brett Kavanaugh acknowledged that "dealing with global warming is urgent and important," but held that any sweeping regulatory program would require an act of Congress:

But as in so many cases, the question here is: Who Decides? The short answer is that Congress (with the President) sets the policy through statutes, agencies implement that policy within statutory limits, and courts in justiciable cases ensure that agencies stay within the statutory limits set by Congress.

Here he sounds much like the late Justice Antonin Scalia, writing for the majority in the 2014 case Utility Air Regulatory Group v. EPA:

When an agency claims to discover in a long-extant statute an unheralded power to regulate “a significant portion of the American economy” we [the Court] typically greet its announcement with a measure of skepticism.  We expect Congress to speak clearly if it wishes to assign to an agency decisions of vast “economic and political significance.”

Scalia held this opinion so strongly that, in his last public judicial act, he wrote the order (passed 5-4) staying the Obama Administration's sweeping "Clean Power Plan." Such stays are granted when it appears the Court is likely to rule in a similar fashion in a related case.

This all goes back to the landmark 2007 ruling, 5-4, in Massachusetts v. EPA, that the EPA was indeed empowered by the 1990 Clean Air Act Amendments to regulate emissions of carbon dioxide if the agency found that they endangered human health and welfare (which it subsequently did, in 2009). Justice Kennedy, Kavanaugh's predecessor, voted with the majority.

Will Kavanaugh have a chance to reverse that vote? That depends on what the new Acting Administrator of the EPA plans to do about carbon dioxide emissions. If the agency simply stops any regulation of carbon dioxide, there will surely be some type of petition to compel it to continue regulating because of the 2009 endangerment finding. Alternatively, those already opposed to such regulation might petition on the grounds that the science has changed markedly since 2009, with increasing evidence that the computer models that were the sole basis for the endangerment finding have demonstrably overestimated warming in the current era. It's also possible that Congress could compel EPA to reconsider its finding, and that a watered-down version might find itself at the center of a court-adjudicated policy fight.

Whatever happens, though, it is clear that Brett Kavanaugh prefers Congressional statutes to agency fiat. Assuming that he is confirmed, he will surely exert his presence and preferences on the Court, including his view that while global warming is "urgent and important," it is the job of Congress to define the regulatory statutes.

Some More Insensitivity about Global Warming

Hot off the press, in yesterday's Journal of Climate, Nic Lewis and Judith Curry have re-calculated the equilibrium climate sensitivity (ECS) based upon the observed surface warming, the historical uptake of heat by the ocean, and the estimated forcing from human emissions of greenhouse gases and aerosols. ECS is the net warming one expects for doubled atmospheric carbon dioxide. Their ECS ranges from 1.50 to 1.56 degrees Celsius.
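For readers who want the mechanics, the energy-budget approach Lewis and Curry employ boils down to a simple relation (the numbers below are illustrative round values for the method, not figures from their paper):

ECS ≈ F2x × ΔT / (ΔF − ΔQ)

where ΔT is the observed change in global surface temperature between a base period and a final period, ΔF is the corresponding change in radiative forcing, ΔQ is the change in the rate of heat uptake by the climate system (mostly the ocean), and F2x ≈ 3.7 watts per square meter is the forcing from a doubling of carbon dioxide. Plugging in round values of, say, ΔT = 0.8°C, ΔF = 2.3 W/m², and ΔQ = 0.3 W/m² gives ECS ≈ 3.7 × 0.8 / 2.0 ≈ 1.5°C.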

Nic has kindly made the manuscript available here, so you don’t have to shell out $35 to the American Meteorological Society for a one-day view.

The paper is a follow-on to their 2015 publication that had a median ECS of 1.65°C. It was criticized for not using the latest-greatest "infilled" temperature history (in which less-than-global coverage is extended to global coverage using the existing data) to derive the sensitivity. According to Lewis, writing yesterday on Curry's blog, the new paper "addresses a range of concerns that have been raised about climate sensitivity estimates" like those in their 2015 paper.

The average ECS from the UN's Intergovernmental Panel on Climate Change (IPCC) is 3.4°C, roughly twice the Lewis and Curry values. It somehow doesn't seem surprising that the observed rate of warming is now running at about half of the rate in the UN's models, does it?

Lewis and Curry's paper appeared seven days after Andrew Dessler and colleagues showed that the mid-atmospheric temperature in the tropics is the best indicator of the earth's energy balance. This means that any differences between observed and forecast mid-atmospheric temperatures there can be used to adjust the ECS.

Late last year, University of Alabama's John Christy and Richard McNider showed that the observed rate of warming in the tropical mid-atmosphere is around 0.13°C per decade since 1979, while the model-average forecast is 0.30°C per decade. That adjusts the IPCC's average ECS down to around 1.5°C (actually 1.46°C).
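The implied adjustment is a simple proportional scaling (a back-of-the-envelope illustration, not the authors' exact calculation; small differences in the inputs account for the 1.46 figure above versus the 1.47 here):

ECS_adjusted ≈ ECS_IPCC × (observed trend / modeled trend) = 3.4°C × (0.13 / 0.30) ≈ 1.47°C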

That’s three estimates of ECS all in the same range, and all approximately half of the UN’s average. 

It seems the long-range temperature forecast most consistent with these findings would be about half of what the IPCC is forecasting. That would put total human warming to 2100 right around the top goal of the Paris Accord, or 2.0°C.

Stay tuned on this one, because that might be in the net benefit zone.

Time to Cool It: The U.N.’s Moribund High-End Global Warming Emissions Scenario

The amount of future warming is predicated on the amount of emitted greenhouse gases and the sensitivity of earth’s surface temperature to changes in their concentrations. Here we take a look at the emissions component.

The U.N. currently entertains four emissions scenarios, all expressed as the change in downwelling radiation toward the surface (in watts per square meter, nominally at year 2100) that results from an increase in the atmospheric concentration of certain greenhouse gases. They are called "representative concentration pathways," or RCPs.

As can be seen in Figure 1, there are four, given as 2.6, 4.5, 6(.0) and 8.5. The ranges of associated warming for over 1000 total scenarios are given on the right axis.
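As a rough guide to what those watts-per-square-meter figures mean, the commonly used simplified expression for carbon dioxide forcing (Myhre et al., 1998) is

ΔF ≈ 5.35 × ln(C/C0) W/m²

where C is the carbon dioxide concentration and C0 its preindustrial value. A doubling of carbon dioxide thus yields about 5.35 × ln(2) ≈ 3.7 W/m²; the RCP labels (2.6, 4.5, 6.0, and 8.5 W/m²) refer to the total forcing from all greenhouse gases and other agents reached around 2100, not to carbon dioxide alone.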

Figure 1. Approximately 1000 scenario runs for four RCPs. From Fuss et al., 2014.

Our Review of the Draft Fourth “National Assessment” of Climate Change Impacts on the U.S.

Public comments on the draft fourth "National Assessment" of present and future climate change impacts on the U.S. are due at 11:59 PM tonight and are embargoed from public release until after then. As soon as ours are made public, we'll link to them. Until then, just think about the previous three Assessments.

Reviewing the first one in 2000, Chip Knappenberger and I discovered that the science team just happened to choose the two most extreme models (for temperature and precipitation) out of the 14 they considered. And then we discovered that they were worse than bad: when applied to a really simple record of temperature, they performed worse than a table of random numbers. Really, it was the same situation as if you took a multiple-choice test with four possible answers and somehow managed to get less than 25% right. That's the highly sought-after "negative knowledge," something you might think impossible!

The second one (2009) was so bad that we covered it with a 211-page palimpsest, a document that looked exactly like the federal original in both design and content, except that it contained all the missing science as well as corrections of as many half-truths and incomplete statements as we could find. Like we said, that took 211 pages of beautifully typeset and illustrated prose.

The National Oceanic and Atmospheric Administration was instrumental in producing the third (2014) Assessment, and in their press release at its debut, gushed that “it is a key deliverable in President Obama’s Climate Action Plan.” That has been recently undelivered.

So what did we say in our review of the upcoming fourth one? Well, you’ll have to wait until tomorrow. 

UPDATE: comments by Ryan Maue and me are now available on the Cato website.

More Data Fiddling—Is Another Warming “Pause” About to Start?

Yesterday Jim Hansen, now with Columbia University, and several of his colleagues released their summary of 2017 global temperatures. Their temperature history, published by the NASA Goddard Institute for Space Studies, has constantly been evolving in ways that make the early years colder and the later years warmer. I recently posted on how this can happen, and on the differences between these modified datasets and those determined objectively (i.e., without human meddling).

For a couple of years I have been pointing out (along with Judith Curry and others) that the latest fad—which puts a lot of warming in recent data—is to extend high-latitude land weather station data far out over the Arctic Ocean. Hansen's crew takes stations north of 64°N latitude and extends them an astounding 1200 kilometers into the ocean.

This, plainly speaking, is a violation of one of the most fundamental principles of thermodynamics: when matter is changing its state (from, say, solid to liquid), a stirred fluid will remain at the freezing point until it is all liquid, whereupon warming will commence.

This also applies in the Arctic, where the fluid is often stirred by strong winds. So if, say, Resolute, one of the northernmost land stations, is at 50°F, and the Arctic Ocean is a mix of water and ice (it always is), that 50 degrees will be extended out 1200 kilometers to where the air-sea boundary temperature has to be around 30°F, the freezing point of seawater up there.

Hansen et al. did pay some attention to this, noting that this extension, which they normally apply to their data, was responsible for making 2017 the second-warmest year in their record. If they "only" extended 250 kilometers (still dicey), it would drop their "global" temperature by a tenth of a degree, which would send the year down a rank. The result of all of this is that the big "spike" at the end of their record is in no small part due to the 1200-kilometer extension that turns thermodynamics on its head.
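To get a feel for how much ocean a single land station can influence under these two choices, simple flat-plane arithmetic (an illustration only) is enough: the area within a 1200-kilometer radius is π × 1200² ≈ 4.5 million square kilometers, versus π × 250² ≈ 0.2 million square kilometers for a 250-kilometer radius, a factor of (1200/250)² ≈ 23 more ice-and-ocean area over which a single land-based reading can be spread.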

There's another interesting pronouncement in the NASA announcement: many people have noted that the sun's output has been a bit low in recent years, and that it continues to trend slightly downward. The changes in its radiance are probably good for a tenth of a degree (C) of surface temperature or so. Hansen et al. use this to provide covering fire should warming stall out yet again:

Therefore, because of the combination of the strong 2016 El Niño and the phase of the solar cycle, it is plausible, if not likely, that the next 10 years of global temperature change will leave an impression of a ‘global warming hiatus’.

The significance of this will all fall out in the next year or so. If temperatures head back down all the way to their pre-El Niño levels, that will ultimately bring back the post-1996 "pause." We're going to guess they will remain a couple of tenths of a degree above that, based on what happened after the big one in 1998, when they settled a small amount above the pre-El Niño levels of the earlier 1990s.

If the recent warming rate (adjusting for El Niño) continues, we’ll hear that it is doing so “despite” the sun. Given that one year (2018) can have little influence on a recent trendline, that copy may already have been written!

All of this raises one final point: Hansen notes in his release that the warming rate since 1970 has been fairly constant, about 0.17°C per decade, but does not note that the average of the UN's climate models says it should be about twice that by now. More lukewarming.
