
The Hurricane Last Time

As of this writing, Tuesday, September 11, Hurricane Florence is threatening millions of folks from South Carolina to Delaware. It's currently forecast to be near the threshold of the dreaded Category 5 by tomorrow afternoon. Current thinking is that its environment will become a bit less conducive as it nears the North Carolina coast on Thursday afternoon, but it is still expected to make landfall as a major hurricane (Category 3+). It's also forecast to slow down or stall shortly thereafter, which means it will dump disastrous amounts of water on southeastern North Carolina.

Global Science Report: Another Indication of Lukewarming

In March 1990, NASA's Roy Spencer and the University of Alabama in Huntsville's (UAH) John Christy dropped quite a bomb when they published the first record of lower-atmospheric temperatures sensed by satellites' microwave sounding units (MSUs). Although they had only ten years of data, it was crystal clear that there was no significant warming trend.

It was subsequently discovered by Frank Wentz of Remote Sensing Systems (RSS), a Santa Rosa (CA) consultancy, that the orbits of the sensing satellites progressively decay (i.e., become lower), and that this introduces a spurious, slight cooling trend. Using a record ending in 1995, Wentz showed a slight warming trend of 0.07°C/decade, about half of what was being observed by surface thermometers.

In 1994, Christy and another UAH scientist, Richard McNider, attempted to remove "natural" climate change from the satellite data by backing out El Niño/La Niña fluctuations and the cooling associated with the big volcanic eruptions of 1982 (El Chichón) and 1991 (Pinatubo). They arrived at a warming trend of 0.09°C/decade after that removal.
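The general idea behind that kind of adjustment can be illustrated with a simple multiple regression: regress the temperature anomalies on time plus an ENSO index and a volcanic-aerosol index, and read the warming rate off the time coefficient. The sketch below (in Python) is only a minimal illustration of that approach, not Christy and McNider's actual procedure; the synthetic data and variable names are placeholders.

```python
import numpy as np

def residual_trend(temp, enso, volcanic, years):
    """Warming trend (degC/decade) after regressing out ENSO and volcanic signals."""
    # Design matrix: intercept, linear time term, ENSO index, volcanic index
    X = np.column_stack([np.ones_like(years), years, enso, volcanic])
    coeffs, *_ = np.linalg.lstsq(X, temp, rcond=None)
    # coeffs[1] is the rate in degC/year once the covariates absorb their share
    return coeffs[1] * 10.0

# Hypothetical example with synthetic monthly data (illustration only)
rng = np.random.default_rng(0)
years = 1979 + np.arange(240) / 12.0             # 20 years of monthly time steps
enso = rng.normal(size=240)                      # stand-in ENSO index
volcanic = np.zeros(240)
volcanic[36:60] = 1.0                            # stand-in volcanic aerosol pulse
temp = (0.009 * (years - 1979) + 0.10 * enso
        - 0.30 * volcanic + rng.normal(0.0, 0.10, 240))
print(f"residual trend: {residual_trend(temp, enso, volcanic, years):.3f} degC/decade")
```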

Over the years, Spencer and Christy have repeatedly made small revisions to their record, and its latest iteration shows an overall warming trend of 0.13°C/decade, which includes natural variability. It is noteworthy, though, that this figure is biased upward by very warm readings near the end of the record, thanks to the 2015–16 El Niño.

Global Science Report: JRA-55—Better Than the BEST Global Surface Temperature History, and Cooler Than the Rest

This is the first in a series of posts on global temperature records. The problems with surface thermometric records are manifold. Are there more reliable methods for measuring the temperature of the surface and the lower atmosphere?

Let’s face it, global surface temperature histories measured by thermometers are a mess. Recording stations come on- and offline seemingly at random. The time of day when the high and low temperatures for the previous 24 hours are recorded varies, often changing at the same station. This has a demonstrable biasing effect on high or low readings. Local conditions can further bias temperatures. What is the effect of a free-standing tree 100 feet away from a station growing into maturity? And the “urban heat island,” often very crudely accounted for, can artificially warm readings from population centers with as few as 2,500 residents. Neighboring reporting stations can diverge significantly from each other for no known reason.

The list goes on. Historically, temperatures have been recorded by mercury-in-glass thermometers housed in a ventilated white box. But, especially in poorer countries, there's little financial incentive to keep these boxes properly white, so they may darken over time. That's guaranteed to make the thermometers read warmer than the actual air temperature. And the transition from glass to electronic thermometers (which register different high temperatures) has hardly been uniform.

Some of these problems are accounted for via a process called homogenization, which produces dramatic alterations of the original climate records (the oft-noted adjustments to New York's Central Park record are a prime example). Others, like the problem of station darkening, are not accounted for, even though there's pretty good evidence that it is artificially warming reported temperatures in poor tropical nations.
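For readers curious about the mechanics, homogenization schemes differ in detail, but a common core idea is to compare a station's record against an average of its neighbors and flag abrupt shifts in the difference series as artificial breaks. The sketch below is a deliberately simplified illustration of that idea, not the algorithm any particular agency uses; the window length and threshold are arbitrary assumptions.

```python
import numpy as np

def flag_breakpoints(station, neighbors, window=24, threshold=0.5):
    """Crude inhomogeneity check: compare one station against its neighbors.

    station   : 1-D array of monthly temperature anomalies for the target station
    neighbors : 2-D array (n_neighbors x n_months) for nearby stations
    Returns month indices where the station-minus-neighbors difference series
    shifts by more than `threshold` degC between adjacent `window`-month means.
    """
    diff = station - neighbors.mean(axis=0)      # removes the shared regional signal
    breaks = []
    for i in range(window, len(diff) - window):
        before = diff[i - window:i].mean()
        after = diff[i:i + window].mean()
        if abs(after - before) > threshold:      # abrupt shift unique to this station
            breaks.append(i)
    return breaks
```

In a real homogenization scheme, flagged breaks would then be adjusted by shifting the earlier segment, and the detection itself would rest on formal statistical tests rather than a fixed threshold.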

What You Won’t Find in the New National Climate Assessment

Under the U.S. Global Change Research Act of 1990, the federal government is charged with producing periodic National Climate Assessments (NCA), and the most recent iteration has arrived today. It is typical of these sorts of documents: much about how the future of mankind is doomed to suffer through increasingly erratic weather and other tribulations. It's also missing a few tidbits of information that convincingly argue that everything in it regarding the upcoming 21st-century climate needs to be taken with a mountain of salt.

The projections in the NCA are all based upon climate models. If there is something big that is systematically wrong with them, then the projections aren’t worth making or believing. 

Here’s the first bit of missing information:

Changes in the Climate Policy Winds

Yesterday, Nature Geoscience published an article by Richard Millar of the University of Exeter and nine coauthors arguing that climate models have been predicting too much warming. Adjusting for this, along with slight increases in emissions reductions by 2030 (followed by much more rapid ones thereafter), leaves total human-induced warming of around 1.5°C by 2100, which conveniently is the aspirational warming target in the Paris Accord. Much of this is a lot like material in our 2016 book Lukewarming.

This represents a remarkable turnaround. At the time of Paris, one of the authors, Michael Grubb, said its goals were "simply incompatible with democracy." Indeed, the impossibility of the Paris targets seemed self-evident. What he hadn't recognized at the time was the reality of "the pause" in warming that began in the late 1990s and ended in 2015. Taking this into consideration changes things.

If Paris is an admitted failure, then nations are simply not going to take their (voluntary, unenforceable) Paris "Contributions" seriously, but Millar's new result changes things. He told Jeff Tollefson, a reporter for Nature, "For a lot of people, it would probably be easier if the Paris goal was actually impossible. We're showing that it's still possible. But the real question is whether we can create the policy action that would actually be required to realize these scenarios."

Suddenly it’s feasible, if only we will reduce our emissions even more.

Coincidentally, we just had a peer-reviewed paper accepted for publication by the Air & Waste Management Association, and it goes Millar et al. one better. It's called "Reconciliation of Competing Climate Science and Policy Paradigms," and you can find an advance copy here.

We note the increasing discrepancy between the climate models and reality, but instead of running a series of new models, we rely upon the mathematical form of the observed warming. Since the second warming of the 20th century began in the late 1970s, the rate of increase has been remarkably linear, despite the "pause." Most climate models actually simulate this linearity; they just overestimate the slope of the increase. One model, however, INM-CM4, from Russia's E.M. Volodin, does have the right rate of increase.

Figure 1. Despite the “pause”, the warming beginning in the late 1970s is remarkably linear, which is a general prediction of climate models. The models simply overestimated the rate of increase.
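To make the linearity point concrete, one can fit a straight line to annual temperature anomalies and compare the observed slope with a model's slope over the same years. The sketch below uses synthetic stand-in series (not actual observations or INM-CM4 output), chosen only to illustrate the comparison.

```python
import numpy as np

def decadal_trend(years, anomalies):
    """Least-squares linear trend in degC per decade."""
    slope, _intercept = np.polyfit(years, anomalies, deg=1)
    return slope * 10.0

# Synthetic stand-ins for observed and modeled annual anomalies (illustration only)
rng = np.random.default_rng(1)
years = np.arange(1979, 2018)
observed = 0.017 * (years - 1979) + rng.normal(0.0, 0.08, years.size)   # ~0.17 degC/decade
modeled = 0.026 * (years - 1979) + rng.normal(0.0, 0.08, years.size)    # ~0.26 degC/decade

print(f"observed trend: {decadal_trend(years, observed):.2f} degC/decade")
print(f"modeled trend:  {decadal_trend(years, modeled):.2f} degC/decade")
```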

Global Science Report: Health Effects of Global Warming

The impact of global warming on temperature-related human mortality has long been a concern. It has been hypothesized that rising temperatures will lead to an increase in the number of deaths because of more frequent and intense heat waves. Others counter that rising temperatures will also reduce the number of deaths at the cold end of the temperature spectrum (fewer and less severe cold spells), resulting in possibly no net change, or even fewer total temperature-related deaths, in the future.

The largest study—by far—on temperature-related mortality was published by Gasparrini et al. in The Lancet in 2015. They examined over 74 million (!) deaths worldwide from 1985 to 2012 and found that the ratio of cold-related to heat-related deaths was a whopping 17 to 1. Moreover, the temperature percentile for minimum mortality was around the 60th in the tropics and the 80th–90th in the temperate zones. Based upon these real-world data, it is obvious that global warming is going to directly prevent a large number of deaths.

One of us (Michaels) co-authored a peer-reviewed article showing that as heat waves become more frequent, heat-related deaths decrease because of adaptation. Given that our cities are heating up on their own—without needing a push from greenhouse gases—our hypothesis implies that heat-related mortality should be dropping, which it is.

You Ought to Have a Look: Time for a New “Hiatus” in Warming, or Time for an Accelerated Warming Trend?

As you can tell from our blog volume, there has been a blizzard of new and significant climate findings published in the refereed literature, and here are some things You Ought to Have a Look at concerning the recent "hiatus" in warming and what might happen to our (now) post-El Niño climate.

With President Trump still deciding on U.S. participation in the Paris Climate Agreement, new research suggests the Earth's global mean surface temperature (GMST) will blow past the so-called 1.5°C Paris target in the next decade. But before making that ominous prediction, Henley and King (2017) provide us with a good history lesson on a taboo topic in climate science circles: the global warming "hiatus" or "pause" from 1998 to 2014. One could be forgiven for thinking the hiatus was "settled science," since it featured prominently in the 2013 IPCC AR5 assessment report. But a concerted effort has been made in recent years to discount the hiatus as an insignificant statistical artifact, perhaps based upon bad observational data, or even as a conspiracy theory meant to distract the public and climate policymakers. Merely acknowledging the existence of the "hiatus" is enough to get one labeled a climate change denier.

Social scientists, psychologists, and theologians of all stripes feared that widespread public acknowledgement of the hiatus would erode support for climate policy at such a pivotal juncture.

In a 2014 Nature commentary ("Media discourse on the climate slowdown"), Boykoff saw the rise of the terms "hiatus" and "pause" in the media in 2013 as a "wasted opportunity" to highlight the conclusions of the IPCC AR5 report, which itself, ironically, struggled to explain the hiatus/pause ("IPCC: Despite hiatus, climate change here to stay," Nature, September 27, 2013). Amazingly, in a Nature interview a week prior to AR5's release, assessment co-chair Thomas Stocker said this:

Comparing short-term observations with long-term model projections is inappropriate. We know that there is a lot of natural fluctuation in the climate system. A 15-year hiatus is not so unusual even though the jury is out as to what exactly may have caused the pause. 
