Topic: Energy and Environment

Richard Lindzen on “Global Climate Alarmism and Historical Precedents”

Richard Lindzen, Professor Emeritus at MIT and now a Distinguished Senior Fellow in the Center for the Study of Science here at Cato, has just published a paper called “Science in the Public Square: Global Climate Alarmism and Historical Precedents.” The paper appears in the Journal of American Physicians and Surgeons.

Lindzen begins with what he calls “The Iron Triangle,” an analog to a very popular aphorism, coined by Ronald Reagan, that describes the generic and mutually beneficial relationship between Congress, the media, and special interest groups. Lindzen’s version runs between scientists, who make “meaningless or ambiguous statements” on climate change; the global warming lobby, which translates those statements into alarmist declarations; and politicians, who respond by shoveling more money to the scientists. Dr. Lindzen cheekily calls this version the Iron Rice Bowl, the same phrase coined by Mao Zedong to describe lifetime employment in exchange for support of the communist state.

Lindzen, whose article is available here, notes that this type of symbiosis supported two other particularly bad ideas. One was early 20th-century eugenics, which was enshrined in law in the United States, was politically very useful in 1920s Germany, and was institutionalized into the Holocaust in the decades that followed. For an exhausting and exhaustive insight into this process in Germany, you still can’t beat Robert J. Lifton’s 1986 book, The Nazi Doctors.

A similar dynamic surrounded the institutionalization of the obviously incorrect paradigm of “the inheritance of acquired characteristics,” championed by Soviet agronomist Trofim Lysenko with the enthusiastic support of Josef Stalin, who thought it would help bring about “The New Soviet Man” by changing human nature genetically through the physical experiences of the organism. The logic is as simple as this: if you, say, pumped iron incessantly with just your left arm, your children would be born with muscular left arms. Hogwash, but effective for a public that both feared its government and was scientifically illiterate.

“The Situation in Biological Science,” published (and translated) by the Lenin Academy of Agricultural Sciences of the U.S.S.R. in 1949, sits very close to Lifton’s book on my shelf at Cato. It is an exhaustive compendium of Lysenko’s “new genetics.” It claims authority, and if you spoke against it as a scientist, a trip to Siberia (or worse) wasn’t far away. With global warming alarmism, we are much more humane. Speak against it, and you will lose your government funding and maybe your job, but not your life.

Lindzen finishes with a bit of optimism, noting that eugenics and Lysenkoism each lasted about thirty years, which would mean that the Iron Triangle of climate alarmism is getting a little long in the tooth (it started in 1988).

Methinks Professor Lindzen is a bit too optimistic. After all, most regulation of ionizing radiation and carcinogens is based upon the obviously wrong notion that a single photon or a single molecule can induce cancer. That notion was enshrined in the 1950s and lives on today.

The Tenuous Link between Stronger Winter Storms and Global Warming Becomes Even Weaker

Global Science Report is a weekly feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

Come the cold season, whenever there is some type of strong storm system near the U.S. Eastern Seaboard—be it a Nor’easter, a blizzard, or ex-hurricane Sandy—you don’t have to look very hard to find someone who will tell you that this weather is “consistent with” expectations of climate change resulting from human greenhouse gas emissions. The worse the storm, the more “consistent” it becomes.

The full body of climate science shows just how complex the physical processes governing such storm systems are. Teasing out any anthropogenic influence, or even the direction of such an influence, is darn near impossible. Claims to the contrary are usually based on a highly selective assessment of the science or the data.

A case in point:

The latest en vogue explanation linking human greenhouse gas emissions to strong winter-season East Coast storms involves changes in the characteristics of the jet stream—a river of fast-moving air in the atmosphere that influences both the strength and the forward speed of extratropical storm systems. A prominent (in the media, anyway) research study last year by Rutgers’s Jennifer Francis and the University of Wisconsin’s Stephen Vavrus suggests that the declining temperature difference between the Arctic and the lower latitudes (adding greenhouse gases to the atmosphere warms colder, drier regions more than warmer, wetter ones—with the notable exception of Antarctica) has led to changes in the jet stream that result in slower-moving, and potentially stronger, East Coast winter storm systems.

An Example of the Abuse of the Social Cost of Carbon

In my recent op-ed for The Hill examining the Obama administration’s estimation of the social cost of carbon (SCC)—a measure of how much future damage is purportedly going to be caused by each ton of carbon dioxide that is emitted through human activities—I identified two major problems with its measure.

First, the administration’s SCC was based on an estimate of global rather than domestic damages from anthropogenic climate change—an odd scope for a measure designed to be incorporated in the cost/benefit analysis of U.S. rules and regulations governing domestic activities (such as the energy efficiency of microwave ovens sold in the United States). In fact, Office of Management and Budget (OMB) guidelines state that

Your analysis should focus on benefits and costs that accrue to citizens and residents of the United States. Where you choose to evaluate a regulation that is likely to have effects beyond the borders of the United States, these effects should be reported separately.

Instead of “reporting separately,” the administration’s SCC embodies “effects beyond the borders of the United States.”

Second, the administration recently revised (upwards) its initial calculation of the SCC. In doing so, it included updates to its underlying economic/climate-change/damage models, but it did not update the equilibrium climate sensitivity used by those models. The equilibrium climate sensitivity is the key factor in how much climate change will result from a given amount of anthropogenic carbon dioxide emissions, and mounting scientific evidence indicates that it is better constrained and lower than the value used in the initial analysis. There is no defensible reason why this new science was not included in the administration’s revised SCC calculation.
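
To see why the sensitivity matters so much, here is a toy sketch, ours and not the administration’s models, of how an SCC-style number is assembled: the SCC is the discounted present value of the stream of future damages attributed to one additional ton of CO2 emitted today. All inputs below are invented for illustration.

```python
# Toy SCC: present value of the extra damages, in each future year,
# caused by emitting one more ton of CO2 today. Numbers are made up.

def toy_scc(annual_marginal_damage, discount_rate, years=300):
    """Discounted sum of a constant stream of marginal damages ($/ton/year)."""
    return sum(annual_marginal_damage / (1 + discount_rate) ** t
               for t in range(1, years + 1))

# Lower climate sensitivity -> less warming per ton -> smaller marginal
# damages, which is why a revised SCC "should" have fallen, not risen.
print(round(toy_scc(1.20, 0.03)))  # ~$40/ton with these made-up inputs
print(round(toy_scc(0.60, 0.03)))  # halving the damages halves the toy SCC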

So that’s two strikes against it.

IPCC Chooses Option No. 3

Global Science Report is a weekly feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

The U.N.’s Intergovernmental Panel on Climate Change (IPCC) is nearing the final stages of its Fifth Assessment Report (AR5)—the latest, greatest version of its assessment of the science of climate change. Information is leaking out, with some regularity, as to what the final report will contain (why it is secretive in the first place is beyond us).

A few weeks ago, The Economist reported on some of the information leaked from the new IPCC report. The key piece of information concerned the IPCC’s assessment of the equilibrium climate sensitivity—how much the earth’s average surface temperature increases as a result of a doubling of the atmospheric carbon dioxide concentration. As we have been reporting, the research now dominating the scientific literature indicates that the equilibrium climate sensitivity is around 2.0°C. This value is about 40% lower than the average climate sensitivity of the climate models used by the IPCC to make its projections of future climate change, including, among others, those for temperature and sea level rise. The Economist suggested that the IPCC was going to lower its assessed value for the equilibrium climate sensitivity based on the mountain of evidence from the literature, but gave no indication whether the IPCC was also going to, accordingly, lower all the projections made throughout its report.
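
For reference, the arithmetic behind that “40% lower” figure, assuming a model-average sensitivity near 3.3°C (a round number we assume here for the IPCC model ensemble):

$$ 1 - \frac{2.0\,^{\circ}\mathrm{C}}{3.3\,^{\circ}\mathrm{C}} \approx 0.39 \approx 40\% $$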

In a Cato@Liberty article last month, we pointed out that the IPCC had three options as to how to proceed. Quoting ourselves:

The IPCC has three options:

1. Round-file the entire AR5 as it now stands and start again.

2. Release the current AR5 with a statement that indicates that all the climate change and impacts described within are likely overestimated by around 50%, or

3. Do nothing and mislead policymakers and the rest of the world.

We’re betting on door number 3.

In an article earlier this week reporting on leaked information it had acquired from the IPCC AR5 report, the New York Times basically proved us right.

Current Wisdom: Greenland’s Disastrous SLR Is SOL

The Current Wisdom is a series of monthly articles in which Patrick J. Michaels, director of the Center for the Study of Science, reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

Could President Obama have picked a worse time to announce his Climate Action Plan?

Global warming has been stuck in neutral for more than a decade and a half, scientists are increasingly suggesting that future climate change projections are overblown, and now, arguably the greatest threat from global warming—a large and rapid sea level rise (SLR)—has been shown overly lurid (SOL; what did you think I meant?).

You hardly need an “action plan” when there is so little “action” worth responding to.

Since I frequently discuss the lack of warming and the declining estimates of future climate change, I’ll focus here on new scientific findings concerning the potential for future sea level rise, interspersing a little travelogue.

Projections of a large sea level rise this century depend on rapid ice loss from Greenland and/or Antarctica. Yes, as ocean waters warm, they expand, but this expansion-induced rise is pretty well constrained and limited to about 6 inches, plus or minus a couple of inches, by century’s end. And the contribution from melting glaciers/ice in other parts of the world (not counting Greenland and Antarctica) is even smaller, maybe 2-4 inches. So that adds up to about 8-12 inches of sea level rise by the year 2100—not much different from what has already occurred over the past century. This is hardly catastrophic.
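
A quick back-of-the-envelope tally of those numbers (a minimal sketch using the ranges quoted above, not a formal uncertainty analysis):

```python
# Back-of-the-envelope sea level rise by 2100, in inches, using the
# ranges quoted above. Purely illustrative; not a formal analysis.
thermal_low, thermal_high = 4, 8   # ~6 inches, plus or minus a couple
glacier_low, glacier_high = 2, 4   # glaciers outside Greenland/Antarctica

central = 6 + (glacier_low + glacier_high) / 2   # ~9 inches
low = thermal_low + glacier_low                  # 6 inches
high = thermal_high + glacier_high               # 12 inches

print(f"central: ~{central:.0f} in, range: {low}-{high} in")
```

Treating the quoted ranges literally gives an envelope of 6-12 inches around a central estimate near 9, bracketing the rough 8-12 figure above.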

Current Wisdom: Even More Low Climate Sensitivity Estimates

The Current Wisdom is a series of monthly articles in which Patrick J. Michaels, director of the Center for the Study of Science, reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

Our periodic compilations of low equilibrium climate sensitivity (ECS) estimates have become a big hit.

In our ongoing effort to keep up with the science, today we update our previous summary with two additional recently published lower-than-IPCC climate sensitivity estimates—one made by Troy Masters and another by Alexander Otto and colleagues (including several co-authors not typically associated with global warming in moderation, or “lukewarming”). There is also a third paper currently in the peer-review process.

The new additions yield a total of at least 16 experiments published in the peer-reviewed scientific literature since 2011 that have found the most likely value of the ECS to be well below the (previously?) “mainstream” estimate from the U.N.’s Intergovernmental Panel on Climate Change (IPCC). Since the negative impacts from global warming/climate change scale with the magnitude of the temperature rise, lower projections of future warming should lead to lower projections of future damages. We say “should” because one way around this, as the federal government has figured out, is to ignore all the new science indicating less expected future warming when calculating future damages, and to inexplicably double the damages estimated to be caused by a given increment of carbon dioxide (a.k.a. the social cost of carbon).

Here is a quick summary of the two new papers:

Examining the output of climate models run under increases in human emissions of greenhouse gases and aerosols, Troy Masters noted a robust relationship between the modeled rate of heat uptake in the global oceans and the modeled climate sensitivity. With this relationship in hand, he then turned to the observations to determine the observed rate of oceanic heat uptake during the past 50 years or so. From that observed behavior, he inferred a climate sensitivity substantially less than that in the vast majority of the climate models: a most likely ECS of 1.98°C, with a 90 percent range extending from 1.2°C to 5.15°C. He notes that the high end is driven by uncertainties in the oceanic heat uptake data earlier in the record.
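
In outline, this is an “emergent constraint” style of calculation. Here is a minimal sketch of the logic with entirely made-up numbers; this is not Masters’s code or data, and even the sign of the cross-model relationship is assumed for illustration:

```python
# Sketch of the approach described above: fit the cross-model relationship
# between ocean heat uptake and ECS, then read off the ECS implied by the
# observed uptake. All numbers are invented for illustration.
import numpy as np

# hypothetical per-model diagnostics
model_uptake = np.array([0.6, 0.7, 0.8, 0.9, 1.0, 1.1])  # W/m^2
model_ecs    = np.array([4.5, 4.0, 3.6, 3.2, 2.9, 2.6])  # degC

slope, intercept = np.polyfit(model_uptake, model_ecs, 1)  # linear fit

observed_uptake = 1.15  # assumed "observed" value, W/m^2
implied_ecs = slope * observed_uptake + intercept
print(f"ECS implied by observations: {implied_ecs:.1f} degC")  # ~2.3 degC
```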

Otto and colleagues used a simple energy budget model to relate observed global temperature changes to changes in the radiation climatology and the heat uptake in the earth system as humans have heaped various substances into the atmosphere. They conclude that the best estimate for the ECS is 2.0°C, with a 90 percent range from 1.2°C to 3.9°C.
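
The energy-budget relation behind this sort of estimate is straightforward: sensitivity is the observed warming per unit of forcing not going into heat storage, scaled by the forcing from a CO2 doubling. A minimal sketch, with illustrative inputs rather than the paper’s actual numbers:

```python
# Simple energy-budget ECS estimate of the kind Otto et al. employ:
#   ECS = F2x * dT / (dF - dQ)
# F2x: forcing from doubled CO2; dT: observed temperature change;
# dF: change in total radiative forcing; dQ: change in earth system
# heat uptake. All inputs below are assumed for illustration only.
F2x = 3.44  # W/m^2 (assumed)
dT  = 0.75  # degC (assumed)
dF  = 1.95  # W/m^2 (assumed)
dQ  = 0.65  # W/m^2 (assumed)

ecs = F2x * dT / (dF - dQ)
print(f"energy-budget ECS: {ecs:.1f} degC")  # ~2.0 degC with these inputs
```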

Both studies come with a long list of caveats relating to data quality, etc., that are common to all studies trying to estimate the ECS.

Hyperloop’s Real Problem

Most reviews of Elon Musk’s hyperloop plan focus on technical questions. Will it cost as little as he estimates? Could it move as fast as he projects? Could the system work at all?

None of these is the real problem with the hyperloop. The real problem is how an infrastructure-heavy, point-to-point system can possibly compete with personal vehicles that can go just about anywhere—the United States has more than 4 million miles of public roads—or with an airline system that requires very little infrastructure and can serve far more destinations than the hyperloop.

Musk promises the hyperloop will be fast. But fast is meaningless if it doesn’t go where you want to go. Musk estimates that people make about 6 million trips a year between the San Francisco and Los Angeles urban areas, where he wants to build his first hyperloop line. But these urban areas are not points: they are huge, each covering thousands of square miles of land.

Airlines deal with these large areas through multiple airports. The Los Angeles area has five commercial airports and the San Francisco area has three. The hyperloop would have only one station in each region, making it inconvenient for the vast majority of people.

Moreover, airplanes from these airports can reach hundreds of other airports across the country and around the world. Even if Musk’s optimistic cost estimates are valid (and remember, the first cost estimate for California high-speed rail was about $10 billion, less than a tenth of the current estimate), the hyperloop would require billions of dollars of additional infrastructure to add each new city.
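
The network arithmetic makes the point starkly (the framing here is ours, not Musk’s): airports can serve every pairwise combination of cities with essentially no new guideway, while a point-to-point tube needs dedicated infrastructure for each link.

```python
# Why point-to-point infrastructure scales badly: n endpoints imply up to
# n*(n-1)/2 city pairs. Airlines cover new pairs with schedules; a tube
# system covers them with concrete and steel. Illustrative numbers only.
from math import comb

airports = 8  # e.g., the LA area's five plus the SF area's three
print(comb(airports, 2))  # 28 possible direct airport-to-airport routes

# Serving those same 28 pairs point-to-point would mean up to 28 separate
# tube alignments, each a multi-billion-dollar capital project.
```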