Topic: Energy and Environment

AGU 2014: Quantifying the Mismatch between Climate Projections and Observations

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

Pat Michaels is in San Francisco this week attending the annual meeting of the American Geophysical Union (AGU) and presenting a poster detailing the widening mismatch between observations of the earth’s temperature and climate model projections of its behavior. Since most global warming concern (including that behind regulatory action) stems from climate model projections of how the earth’s temperature will evolve as we emit greenhouse gases into the atmosphere (as a result of burning fossil fuels to produce energy), it is important to keep tabs on how the model projections are faring when compared with reality. That they are not faring very well should be more widely known—Pat will spread the word while there.

We don’t want those of you who are unable to attend the conference to think you are missing out on anything, so we have reformatted our poster presentation to fit this blog format (it is available in its original format here).

———–

Quantifying the Lack of Consistency between Climate Model Projections and Observations of the Evolution of the Earth’s Average Surface Temperature since the Mid-20th Century

Patrick J. Michaels, Center for the Study of Science, Cato Institute, Washington DC

Paul C. Knappenberger, Center for the Study of Science, Cato Institute, Washington DC

INTRODUCTION

Recent climate change literature has been dominated by studies showing that the equilibrium climate sensitivity is better constrained than in the latest estimates from the Intergovernmental Panel on Climate Change (IPCC) and the U.S. National Climate Assessment (NCA), and that the best estimate of the climate sensitivity is considerably lower than the climate model ensemble average. In the recent literature, the central estimate of the equilibrium climate sensitivity is ~2°C, while the climate model average is ~3.2°C; that is, the best estimate is some 40% lower than the model average.

To the extent that the recent literature produces a more accurate estimate of the equilibrium climate sensitivity than does the climate model average, it means that the projections of future climate change given by both the IPCC and NCA are, by extension, some 40% too large (too rapid), and that the associated impacts described in those reports are gross overestimates.
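The arithmetic behind the ~40% figure, and behind scaling down model projections, is simple proportionality. Here is a minimal back-of-the-envelope sketch; the proportional-scaling assumption (projected warming scales with equilibrium climate sensitivity) is the premise of the paragraph above made explicit, and the 4.0°C example projection is purely illustrative:

```python
# Back-of-the-envelope check of the ~40% figure cited above.
# Proportional scaling of warming with equilibrium climate sensitivity
# (ECS) is assumed, as the paragraph above implies.

ecs_literature = 2.0   # deg C; central estimate from recent literature
ecs_models = 3.2       # deg C; approximate model ensemble average

reduction = (ecs_models - ecs_literature) / ecs_models
print(f"Literature ECS is {reduction:.0%} below the model average")
# -> 38%, i.e., "some 40%"

# Under proportional scaling, a hypothetical 4.0 deg C model projection
# would shrink by the same factor:
projection_model = 4.0  # deg C; illustrative value only
projection_adjusted = projection_model * (ecs_literature / ecs_models)
print(f"{projection_model} deg C projection -> {projection_adjusted:.1f} deg C")
# -> 2.5 deg C
```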

You Ought to Have a Look: Weak Link Between Global Warming and Extreme Weather

You Ought to Have a Look is a feature from the Center for the Study of Science posted by Patrick J. Michaels and Paul C. (“Chip”) Knappenberger. While this section will feature all of the areas of interest that we are emphasizing, the prominence of the climate issue is driving a tremendous amount of web traffic. Here we post a few of the best in recent days, along with our color commentary.

In this issue of You Ought to Have a Look, we feature the work of Martin Hoerling and his research team at the Physical Science Division (PSD) of NOAA’s Earth System Research Laboratory—a place where scientists live and breathe atmospheric dynamics and a rare government facility that puts science before hype when it comes to anthropogenic climate change.

It is pretty obvious by now that whenever severe weather strikes—rain, snow, heat, cold, flood, drought, etc.—someone will proclaim the events are “consistent with” expectations of global warming from human emissions of greenhouse gases.

Harder to find (at least on TV) are folks who pooh-pooh such notions and instead point out that nature is a noisy place and a definitive study linking such-and-such weather event to human climate modifications does not exist.

In truth, the science of severe weather is a messy, muddy place, not at all the simple, clean “science is settled” description preferred by climate alarmists and regulation seekers.

Hoerling is one scientist who does garner some press coverage when describing the general lack of a human fingerprint on all manner of extreme weather events. While most others hand-wave at the science, Hoerling and his team actually put the historical observations and the behavioral expectations from climate models directly to the test.

Take, for example, the ongoing California drought. There are all manner of folks calling the drought conditions there “historic” and “epic” and the “worst in 1,200 years” and, of course, pointing the finger directly at humans. Even President Obama has gotten in on the act.

Not so fast, says Hoerling’s team, led in this case by Richard Seager. They decided to look at just what the expectations for California drought should be under an increasing greenhouse effect—expectations defined, in this case, by the very climate models making the future climate projections and upon which the case for catastrophic climate change (and equally catastrophic regulations) is founded. Their findings caught the attention of Seth Borenstein, science writer for the Associated Press, who highlighted them in an article earlier this week.

Naomi Klein vs. the Climate

Liberal activist Naomi Klein has a new book out provocatively subtitled “Capitalism vs. the Climate.” (For those of you who don’t want to buy her book, an essay she wrote a couple years ago with the same title is here.)

What amuses me about her attempt to pit capitalism and the climate against each other is that I came across an excerpt from the book in the Toronto Globe and Mail in which she unwittingly advocates policies that, even accepting the debate on her terms, can’t be good for the climate.

Not surprisingly, she supports subsidies for renewable energy. It’s hard to have renewable energy without those. But what struck me was that she also argued for tying those subsidies to the use of local content. For example, there is a government program in Ontario that subsidizes solar energy, but only if the energy suppliers use a certain percentage of labor and materials that are made in Ontario.

There are two obvious and related problems with such a requirement. First, by requiring local inputs, you make your product more expensive (especially when local means high-cost Canada). If your goal is cheaper renewable energy, raising the price of inputs doesn’t make a whole lot of sense.

Second, the idea that each sub-federal government should promote local production of a particular product is absurd. Imagine if that happened worldwide: there would be thousands of producers of these products! I can’t think of a more inefficient and energy-wasting approach to manufacturing.

Just to be clear, I know many strong supporters of taking action against climate change who do not believe in this kind of protectionist approach. They recognize that local content requirements are economically harmful and shouldn’t be part of these policies. For reasons that are difficult to understand, Klein seems to have missed this pretty obvious point. (I did tweet it at her, but I’m not expecting much from that!)

The Terrible, Horrible, No Good, Very Bad Falling Gas Prices

A left-coast writer named Mark Morford thinks that gas prices falling to $2 a gallon would be the worst thing to happen to America. After all, he says, the wrong people would profit: oil companies (why would oil companies profit from lower gas prices?), auto makers, and internet retailers like Amazon that offer free shipping.

If falling gas prices are the worst thing for America, then the best, Morford goes on to say, would be to raise gas taxes by $6 a gallon and dedicate all of the revenue to such boondoggles as “alternative energy and transport, environmental protections, our busted educational system, our multi-trillion debt.” After all, government has proven itself so capable of finding the most cost-effective solutions to any problem in the past, and there’s no better way to reduce the debt than to tax the economy to death.

Morford is right in line with progressives like Naomi Klein, who thinks climate change is a grand opportunity to make war on capitalism. Despite doubts cast by other leftists, Klein insists that “responding to climate change could be the catalyst for a positive social and economic transformation,” by which she means government control of transportation, housing, and just about everything else.

These advocates of central planning remind me of University of Washington international studies professor Daniel Chirot’s assessment of the fall of the Soviet empire. From the time of Lenin, noted Chirot, Soviet planners considered western industrial systems of the late nineteenth century their model for an ideal economy. By the 1980s, after decades of hard work, they had developed “the most advanced industries of the late 19th and early 20th centuries–polluting, wasteful, energy intensive, massive, inflexible–in short, giant rust belts.”

Morford and Klein want to do the same to the United States, using climate change as their excuse, and the golden age they wish to return to is around 1920, when streetcars and intercity passenger trains were at their peak (not counting the WWII era). Sure, there were cars, but only a few compared with today.

Current Wisdom: Record Global Temperature—Conflicting Reports, Contrasting Implications

The Current Wisdom is a series of monthly articles in which Patrick J. Michaels and Paul C. “Chip” Knappenberger, from Cato’s Center for the Study of Science, review interesting items on global warming in the scientific literature or of a more technical nature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

Despite what you may think if you reside in the eastern United States, the world as a whole in 2014 has been fairly warm. For the past few months, several temperature-tracking agencies have been hinting that this year may turn out to be the “warmest ever recorded”—for whatever that is worth (keep reading for our evaluation). The hints have been turned up a notch with the latest United Nations climate confab taking place in Lima, Peru through December 12.  The mainstream media is happy to popularize these claims (as are government-money-seeking science lobbying groups).

But a closer look shows two things: first, whether or not 2014 proves to be the record warmest year depends on whom you ask; and second, no matter where the final number for the year ranks in the observations, it will rank among the greatest “busts” of climate model predictions (which collectively expected it to be a lot warmer). The implication of the first is nothing more than jostling for press coverage. The implication of the second is that future climate change appears to be less of a menace than is assumed by the president and his pen and phone.

Let’s examine the various temperature records.

First, a little background. Several different groups compile the global average temperature in near-real time. Each uses slightly different data-handling techniques (such as how to account for missing data), and so each gets a slightly different (but nevertheless very similar) value. Several groups compute the surface temperature, while others calculate the global average temperature in the lower atmosphere (a bit freer from confounding factors like urbanization). All, thus far, have data for 2014 compiled only through October, so the final ranking for 2014, at this point in time, is only speculation (although pretty well-founded speculation).

The three major groups calculating the average surface temperature of the earth (land and ocean combined) are all currently indicating that 2014 will likely nudge out 2010 (by a couple hundredths of a degree Celsius) to become the warmest year in each dataset (which begin in the mid-to-late 1800s). This is almost certainly true in the datasets maintained by the U.S. National Oceanic and Atmospheric Administration (NOAA) and the UK Met Office Hadley Centre. In the record compiled by NASA’s Goddard Institute for Space Studies (GISS), the 2014 year-to-date value is in a virtual dead heat with the annual value for 2010, so the final ranking will depend heavily on how the data come in for November and December. (The other major data compilation, the one developed by the Berkeley Earth group, is not updated in real time.)
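For readers curious about the mechanics, determining a “warmest year” ranking is conceptually simple: average each year’s monthly anomalies and sort. A minimal sketch follows; the anomaly values are placeholders for illustration, not actual NOAA, Hadley Centre, or GISS data:

```python
# Sketch of a "warmest year" ranking from monthly temperature anomalies
# (deg C relative to some baseline). Values below are placeholders.

monthly_anomalies = {
    2010: [0.66, 0.76, 0.87, 0.81, 0.69, 0.61,
           0.58, 0.58, 0.55, 0.66, 0.74, 0.42],
    2014: [0.62, 0.45, 0.71, 0.73, 0.74, 0.62,
           0.51, 0.74, 0.72, 0.74],  # January-October only
}

def annual_mean(anoms):
    return sum(anoms) / len(anoms)

for year, anoms in sorted(monthly_anomalies.items(),
                          key=lambda kv: annual_mean(kv[1]),
                          reverse=True):
    note = " (year to date)" if len(anoms) < 12 else ""
    print(f"{year}: {annual_mean(anoms):+.3f} deg C{note}")

# With only ten months in hand, 2014's final rank hinges on November and
# December; a couple hundredths of a degree can flip the order.
```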

COP-Out: Political Storyboarding in Peru

The 20th annual “Conference of the Parties” to the UN’s 1992 climate treaty (“COP-20”) is in its second week in Lima, Peru, and the news is pretty much the same as from every other one.

You don’t need a calendar to know when these are coming up, as the media are flooded with global warming horror stories every November. This year’s version is that West Antarctic glaciers are shedding a “Mount Everest” of ice every year. That really does raise sea level—about 2/100 of an inch per year. As we noted here, that reality probably wouldn’t have made a headline anywhere.
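For the record, converting an annual ice-mass loss into sea-level rise takes one line of arithmetic. A minimal sketch, assuming an ice-loss rate of roughly 160 gigatonnes per year (our illustrative assumption; the post reports only the resulting ~2/100 inch figure):

```python
# Converting an ice-mass loss rate into global sea-level rise.
# The 160 Gt/yr loss rate is an assumed, illustrative figure.

ICE_LOSS_GT_PER_YR = 160.0    # gigatonnes of ice melted per year (assumed)
OCEAN_AREA_M2 = 3.61e14       # global ocean surface area, m^2
KG_PER_GT = 1e12              # kilograms per gigatonne
WATER_DENSITY = 1000.0        # kg/m^3 of meltwater

volume_m3 = ICE_LOSS_GT_PER_YR * KG_PER_GT / WATER_DENSITY
rise_m = volume_m3 / OCEAN_AREA_M2
print(f"{rise_m * 1000:.2f} mm/yr = {rise_m * 39.37:.3f} in/yr")
# -> 0.44 mm/yr, about 0.017 inches: roughly the 2/100 inch quoted above
```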

The meetings are also preceded by some great climate policy “breakthrough.” This year’s was the president’s announcement that China, for the first time, was committed to capping its emissions by 2030. They did no such thing; they said they “intend” to level their emissions off “around” 2030. People “intend” to do a lot of things that don’t happen.

During the first week of these two-week meetings, developing nations coalesce around the notion that the developed world (read: United States) must pay them $100 billion per year in perpetuity in order for them even to think about capping their emissions. It has happened in at least the last five COPs.

In the second week, the UN announces, dolefully, that the conference is deadlocked, usually because the developing world has chosen not to commit economic suicide. Just yesterday, India announced that it simply wasn’t going to reduce its emissions at the expense of development.

Then an American savior descends. In Bali, in 2007, it was Al Gore. In 2009, Barack Obama arrived and barged into one of the developing nation caucuses, only to be asked politely to leave. This week it will be Secretary of State John Kerry, who earned his pre-meeting bones by announcing that climate change is the greatest threat in the world.

I guess nuclear war isn’t so bad after all.

As the deadlock continues, the UN will announce that the meeting is going to go overtime, beyond its scheduled Friday end. Sometime on the weekend—and usually just in time to get to the Sunday morning news shows—Secretary Kerry will announce a breakthrough, the meeting will adjourn, and everyone will go home to begin the cycle anew until next December’s COP-21 in Paris, where a historic agreement will be inked.

Actually, there was something a little different in Lima this year: Given all the travel and its relative distance from Eurasia, COP-20 set the all-time record for carbon dioxide emissions associated with these annual gabfests.

The Purple Line Will Waste Money, Time, and Energy

Maryland’s Governor-elect Larry Hogan has promised to cancel the Purple Line, another low-capacity rail boondoggle in suburban Washington, DC, that would cost taxpayers at least $2.4 billion to build and much more to operate and maintain. The initial projections for the line were that it would carry so few passengers that the Federal Transit Administration wouldn’t even fund it under the rules then in place. Obama has since changed those rules, but, taking no chances, Maryland’s current governor, Martin O’Malley, hired Parsons Brinckerhoff with the explicit goal of boosting ridership estimates enough to make the project fundable.

I first looked at the Purple Line in April 2013, when the draft environmental impact statement (DEIS), written by a team led by Parsons Brinckerhoff, projected that the line would carry more than 36,000 trips each weekday in 2030. This is far more than the 23,000 trips per weekday carried by the average light-rail line in the country in 2012. Despite this optimistic projection, the DEIS revealed that the rail project would both increase congestion and use more energy than would be saved by the cars it took off the road (though to find the congestion result you had to read the accompanying traffic analysis technical report, pp. 4-1 and 4-2).

A few months after I made these points in a blog post and various public presentations, Maryland published Parsons Brinckerhoff’s final EIS (FEIS), which made an even more optimistic ridership projection: 46,000 riders per day in 2030, 28 percent more than in the draft. Measured by trips per station or per mile of rail line, only the light-rail systems in Boston and Los Angeles carry more riders than the FEIS projects for the Purple Line.
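The percentage claims are easy to verify. A quick check using the figures quoted above:

```python
# Verifying the ridership comparisons quoted above.

draft_riders = 36_000   # weekday trips, DEIS projection for 2030
final_riders = 46_000   # weekday trips, FEIS projection for 2030
national_avg = 23_000   # weekday trips, average US light-rail line (2012)

increase = (final_riders - draft_riders) / draft_riders
print(f"FEIS projection is {increase:.0%} above the draft")      # -> 28%

print(f"...and {final_riders / national_avg:.1f}x the national "
      f"light-rail average")                                     # -> 2.0x
```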

Considering the huge demographic differences among Boston, Los Angeles, and Montgomery County, Maryland, it isn’t credible to think that the Purple Line’s performance will approach that of the Boston and L.A. rail lines. First, urban Suffolk County (Boston) has 12,600 people per square mile and urban Los Angeles County has 6,900 people per square mile, both far more than urban Montgomery County’s 3,500 people per square mile.

However, it is not population density but job density that really makes transit successful. Boston’s downtown, the destination of most of its light-rail (Green Line) trips, has 243,000 jobs. Los Angeles’s downtown, which is at the end of all but one of its light-rail lines, has 137,000 jobs. L.A.’s Green Line doesn’t go downtown, but it serves LA Airport, which, counting the area around it, accounts for 135,000 jobs.

Montgomery County, where the Purple Line will go, really has no major job centers. The closest is the University of Maryland, which has about 46,000 jobs and students, a small fraction of the L.A. and Boston job centers. Though the university is on the proposed Purple Line, the campus covers 1,250 acres, which means many students and employees will not work or have classes within easy walking distance of the rail stations. Thus, the ridership projections for the Purple Line are not credible.
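As a rough check on that argument, the sketch below puts the density and job-center figures side by side; the numbers are those quoted in the post, and the ratio calculations are ours:

```python
# Density and job-center figures quoted above, with Montgomery County
# and the University of Maryland as the baselines for comparison.

density = {  # people per square mile, urbanized area
    "Suffolk County (Boston)": 12_600,
    "Los Angeles County": 6_900,
    "Montgomery County": 3_500,
}
jobs = {  # jobs at the main rail-served destination
    "Downtown Boston": 243_000,
    "Downtown Los Angeles": 137_000,
    "LA Airport area": 135_000,
    "University of Maryland (jobs + students)": 46_000,
}

base_d = density["Montgomery County"]
for county, d in density.items():
    print(f"{county}: {d:,}/sq mi ({d / base_d:.1f}x Montgomery)")

base_j = jobs["University of Maryland (jobs + students)"]
for place, j in jobs.items():
    print(f"{place}: {j:,} ({j / base_j:.1f}x UMd)")
```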
