“Homogenized” US Warming Trend May Be Grossly Exaggerated

‘Homogenization’ adds systematic errors to the data rather than accounting for them.

December 29, 2015 • Commentary
This article appeared on Townhall.com on December 29, 2015.

The way we measure global temperature is once again facing scrutiny for overestimating the planet's warming trends. Our government homogenizes weather data so that all nearby weather stations sing the same tune. It's done to weed out bad stations or failing weather equipment. We discovered such a thing earlier this year, when we found that perhaps the nation's most politically iconic weather station, Washington DC's Reagan National Airport, was reading temperatures far too hot to be plausible.

Now it turns out that the homogenization itself is suspect and also producing way too much warming. Anthony Watts, a prominent climate blogger without any external financial support, revealed this in a blockbuster presentation at the fall meeting of the American Geophysical Union in San Francisco, a few days before Christmas. Along with three colleagues, he may have invalidated much of the warming in recent years in the U.S. temperature history from our National Oceanic and Atmospheric Administration.

For years, Watts and a team of volunteers set about photographing, or obtaining photos or satellite imagery of, just about every weather station in NOAA's "Historical Climatology Network" (HCN), which our government claims was pretty much free of nagging problems such as temperature sensors sited close to parking lots or, even worse, heat sources like air-conditioning exhaust.

It turns out they weren't. After assiduously poring over all the pictures, Watts and his crew classified the stations into two general groups: well-sited, "compliant" stations and poorly-sited, "non-compliant" ones. From the compliant group, Watts' team further selected only those stations that had no changes whatsoever in location or observation timing during the analysis period, 1979–2008, leaving 92 of the best-quality stations distributed across the U.S.

Watts then plotted the average temperature from the government network's homogenized data against that from his ultra-clean stations.
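The comparison described here amounts to fitting a linear trend to each network's annual averages over 1979–2008. A minimal sketch of that calculation, using placeholder numbers rather than Watts' actual data:

```python
# Hypothetical sketch of the trend comparison described above: fit a
# least-squares linear trend to annual-mean temperatures from the
# homogenized network and from the well-sited ("compliant") subset.
# The series below are illustrative placeholders, not the real data.
import numpy as np

years = np.arange(1979, 2009)  # the 1979-2008 analysis period

rng = np.random.default_rng(0)
homogenized = 0.032 * (years - 1979) + rng.normal(0, 0.1, years.size)
compliant = 0.020 * (years - 1979) + rng.normal(0, 0.1, years.size)

def decadal_trend(t, series):
    """Least-squares slope of series vs. t, in degrees per decade."""
    slope, _ = np.polyfit(t, series, 1)
    return 10.0 * slope

print(f"homogenized: {decadal_trend(years, homogenized):+.3f} C/decade")
print(f"compliant:   {decadal_trend(years, compliant):+.3f} C/decade")
```

With placeholder slopes like these, the homogenized trend comes out roughly 50 percent steeper than the compliant one, which is the kind of gap the article says Watts found.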

Around 1979, the second warming trend of the 20th century began in both US and global records. The first one, from 1910 to 1945, is about the same magnitude as the second one, but couldn’t have been from dreaded carbon dioxide because we had not emitted very much for most of that period.

The second warming is important because there's likely to be some human component to it. If you want to know what that really means, here is a shameless plug for our new book, Lukewarming: The New Climate Science that Changes Everything. It's very important to determine how much of it is human; to put it more indelicately, if a substantial fraction of it is measurement error caused by compromised weather stations, there's likely to be quite a bit less warming in our future than some computer models have forecast.

Watts’ findings are spectacular. Averaged across the U.S., the government’s homogenized stations are warming at a rate more than 50 percent greater than Watts’ clean ones. A whopping difference.

There's more. Average U.S. temperatures warmed from 1979 through 1997 and then leveled off, and there was also a cooling period earlier in this century. The compliant data show less warming and less cooling than the homogenized data. In other words, "homogenization" adds systematic errors to the data rather than accounting for them. Basically, the government's procedures adjust the observations of well-sited stations to be closer to those of poorly-sited stations, rather than the reverse, which would obviously be preferable and more scientifically appropriate. The result is an artificially enhanced warming signal.
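The effect being alleged can be illustrated with a toy calculation. This is not NOAA's actual algorithm, just a sketch of the claimed mechanism: if an adjustment step blends a well-sited station's readings toward the mean of warmer-trending neighbors, the adjusted series inherits part of the neighbors' spurious trend. The blending weight here is a made-up parameter for illustration.

```python
# Toy illustration (not NOAA's actual procedure) of the effect the
# article describes: nudging a clean station toward biased neighbors
# pulls its trend partway toward the neighbors' inflated trend.
import numpy as np

years = np.arange(1979, 2009)
clean = 0.15 * (years - 1979) / 10       # 0.15 C/decade true trend
neighbors = 0.30 * (years - 1979) / 10   # 0.30 C/decade, siting-biased

weight = 0.5  # hypothetical blending weight toward the neighbor mean
adjusted = (1 - weight) * clean + weight * neighbors

slope = 10 * np.polyfit(years, adjusted, 1)[0]
print(f"adjusted trend: {slope:.3f} C/decade")  # 0.225, halfway between
```

Under this toy blending, the "corrected" station warms at 0.225 C/decade even though its true trend is 0.15, which is the direction of error the article attributes to the government's procedure.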

The U.S. surface record turns out to be in many ways representative of the behavior of the entire Northern Hemisphere, and NOAA applies the same homogenization to its global land records. That means there is a very real probability that global warming has not only been overestimated by computer models but also over-measured by homogenized data. This is yet another piece of strong evidence that the Earth is not warming as much as the UN says it should have.

For much of the last year, Washington has been abuzz with rumors that NOAA manipulated the global temperature records to "disappear" the "hiatus" in global warming since the mid-1990s, a phenomenon that is obvious in global satellite data. Congressman Lamar Smith (R-TX), chair of the House Committee on Science, Space and Technology, seems to smell smoke over this. It appears that Anthony Watts has found the fire.
