Surface temperatures are indeed increasing slightly: They’ve been going up, in fits and starts, for more than 150 years, or since a miserably cold and pestilential period known as the Little Ice Age. Before carbon dioxide from economic activity could plausibly have warmed us up, temperatures rose three‐quarters of a degree Fahrenheit between 1910 and World War II. They then cooled down a bit, only to warm again from the mid‐1970s to the late ’90s, by about the same amount as earlier in the century.
Whether temperatures have warmed much since then depends on what you look at. Until last June, most scientists acknowledged that warming had peaked in the late 1990s and had since plateaued in a “hiatus.” There are about 60 different explanations for this hiatus in the refereed literature.
That changed last summer, when the National Oceanic and Atmospheric Administration (NOAA) decided to overhaul its data, throwing out satellite‐sensed sea‐surface temperatures since the late 1970s and instead relying on, among other sources, readings taken from the cooling‐water‐intake tubes of oceangoing vessels. The scientific literature is replete with articles about the large measurement errors that accrue in these data because a ship’s infrastructure conducts heat and absorbs a tremendous amount of the sun’s energy, and because vessels’ intake tubes sit at different ocean depths. See, for instance, John J. Kennedy’s “A review of uncertainty in in situ measurements and data sets of sea surface temperature,” published Jan. 24, 2014, in the journal Reviews of Geophysics.
NOAA’s alteration of its measurement standard, along with other changes, produced a result that could have been predicted: a marginally significant warming trend in the data over the past several years, erasing the temperature plateau that climate alarmists had found so difficult to explain. Yet the increase remains far below what had been expected.