Tag: climate change

Supreme Court Takes Up Butterfly Effect

As Congress debates cap-and-trade, new fuel standards, and subsidies for “green” companies, some still feel that political solutions to global warming are not moving fast enough. In the present case, American Electric Power Co. v. Connecticut, eight states and New York City sued several public utilities (including the federal Tennessee Valley Authority), alleging that their carbon dioxide emissions contribute to global warming.

This is the third major lawsuit to push global warming into the courts (another being Comer v. Murphy Oil USA, in which Cato also filed a brief). All of these suits try to use the common law doctrine of nuisance—which, for example, lets you sue your neighbor if his contaminated water flows onto your land and kills your lawn—to attack carbon emitters. None of them had gotten very far until the Second Circuit vacated a lower-court ruling and allowed the claims here to proceed.

But the judiciary was not meant to be the sole method for resolving grievances with the government—even if everything looks like a nail to lawyers who only have a hammer. After all, there are two other co-equal branches, the legislative and executive, which are constitutionally committed to unique roles in our system of separation of powers. The doctrine of “standing” exists in part to ensure that the judiciary is not used to solve issues that properly belong to those other branches. Toward this end, the Constitution allows courts to hear only actual “cases or controversies” that can feasibly be resolved by a court.

Cato thus filed a brief supporting the defendant utilities’ successful request for Supreme Court review, and has now filed another brief supporting their position before the Court. Cato’s latest brief first argues that no judicial solution is possible here because the chain of causation between the defendants’ carbon emissions and the alleged harm caused by global warming is so attenuated that it resembles the famed “butterfly effect.” Just as butterflies should not be sued for causing tsunamis, a handful of utility companies in the Northeastern United States should not be sued for the complex (and disputed) harms of global warming.

Second, we contend that, even if the plaintiffs can demonstrate causation, it is unconstitutional for courts to make nuanced policy decisions that should be left to the legislature—and this is true regardless of the science of global warming. Just as it’s improper for a legislature to pass a statute punishing a particular person (bill of attainder), it’s beyond courts’ constitutional authority—under the “political question doctrine”—to determine wide-ranging policies in which numerous considerations must be weighed in anything but an adversarial litigation process.

If a court were to adjudicate the claims here and issue an order dictating emissions standards, two things would happen: 1) the elected branches would be encouraged to abdicate to the courts their responsibilities for addressing complex and controversial policy issues, and 2) an already difficult situation would become nearly intractable as regulatory agencies and legislative actors butt heads with court orders issued across the country in quickly multiplying global warming cases. These inevitable outcomes are precisely why the standing and political question doctrines exist.

Dissatisfaction with the decisions and pace of government does not give someone the right to sue over anything. Or, as Chief Justice Marshall once said, “If the judicial power extended to every question under the laws of the United States … [t]he division of power [among the branches of government] could exist no longer, and the other departments would be swallowed up by the judiciary.”

The Supreme Court will hear arguments in American Electric Power Co. v. Connecticut on April 19.

Special thanks to Trevor Burrus, who contributed to this post.

The Current Wisdom: The Short-Term Climate Trend Is Not Your Friend

The Current Wisdom is a series of monthly posts in which Senior Fellow Patrick J. Michaels reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

The Current Wisdom only comments on science appearing in the refereed, peer-reviewed literature, or that has been peer-screened prior to presentation at a scientific congress.

***********

It seems like everyone, from exalted climate scientists to late-night amateur tweeters, can get a bit over-excited about short-term fluctuations, reading into them deep cosmic and political meaning, when they are likely the statistical hiccups of our mathematically surly atmosphere.

There have been some major errors in forecasts of recent trends. Perhaps the most famous were made by NASA’s James Hansen in 1988, who overestimated the warming between then and now by a whopping 40% or so.

But it is easy to get snookered by short-term fluctuations. As Figure 1 shows, there has been virtually no net change in temperature since 1997, allowing for the fact that measurement errors in global average surface temperature are easily a tenth of a degree or more. (The magnitude of those errors will be considered in a future Current Wisdom.)

Figure 1. Annual global average surface temperature anomaly (°C), 1997-2010 (data source: Hadley Center).

Some who are concerned about environmental regulation without good science have seized upon this 13-year stretch as “proof” that there is no such thing as global warming driven by carbon dioxide.  More on that at the end of this Wisdom.

Similarly, periods of seemingly rapid warming can prompt scientists to see changes where there aren’t any.

Consider a landmark paper published in 2000 in Geophysical Research Letters by Tom Karl, a prominent researcher who is the head of our National Climatic Data Center (NCDC) and who just finished a stint as President of the American Meteorological Society. He couldn’t resist the climatic blip that occurred prior to the current stagnation of warming, namely the very warm episode of the late 1990s.

Cooler heads at the time noted that it was an artifact of the great El Nino of 1997-98, a periodic warming of the tropical Pacific that has been coming and going for millions of years. 

Nonetheless, the paper was published and accompanied by a flashy press release titled “Global warming may be accelerating.”  

What Karl did was to examine the 16 consecutive months of record-high temperatures (beginning in May 1997) and to calculate the chance that this could happen given the fairly pokey warming rate—approximately 0.17°C (0.31°F) per decade—that was occurring. He concluded there was less than a five percent probability, unless the warming rate had suddenly increased.

From the press release:

Karl and colleagues conclude that there is only a small chance that the string of record high temperatures in 1997-98 was simply an unusual event, rather than a change point, the start of a new and faster ongoing trend.

He also gave a number:  “…the probability of observing the record temperatures is more likely with high average rates of warming, around 3°C [5.4°F]/century,” which works out to 0.3°C per decade.

Our Figure 2 shows what was probabilistically forecast beginning in May 1997, and what actually happened. Between then and now, according to this paper, global temperatures should have warmed around 0.4°C (0.7°F). The observed warming rate for the last 13.5 years—which includes the dramatically warm temperatures beginning in 1997—was a paltry 0.06°C (0.11°F) per decade.

Figure 2. Prior to mid-1997, the observed warming trend (dashed line) was 0.17°/decade. Karl said there was a greater than 95% probability that 1997–98 would mark a “change point,” where warming would accelerate to around 0.30°/decade. Since then, the rate has been 0.06°/decade, or 20% of what was forecast.
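For readers who want to check this sort of number themselves, a per-decade rate like those in the figure caption is just the slope of an ordinary least-squares fit to the annual anomalies. Here is a minimal sketch using a synthetic series, not the actual Hadley Center data:

```python
# Minimal sketch: estimating a per-decade warming rate from annual
# temperature anomalies with an ordinary least-squares fit.
# The data below are synthetic, constructed to warm at exactly
# 0.06 degrees C per decade; they are not the Hadley Center series.
def trend_per_decade(years, anomalies):
    n = len(years)
    mean_y = sum(years) / n
    mean_a = sum(anomalies) / n
    cov = sum((y - mean_y) * (a - mean_a) for y, a in zip(years, anomalies))
    var = sum((y - mean_y) ** 2 for y in years)
    return 10.0 * cov / var  # OLS slope is per year; scale to per decade

years = list(range(1997, 2011))
anomalies = [0.006 * (y - 1997) for y in years]
print(round(trend_per_decade(years, anomalies), 3))  # -> 0.06
```

Run on the real 1997–2010 anomalies, the same function yields the kind of per-decade rate quoted in the caption.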

Karl did provide some statistical wiggle room. While concluding that there was less than a 5% chance that the warming rate hadn’t increased, he wrote that “unusual events can occur”—that is, there was still a small chance that 1997–98 was just a statistical hiccup, which it ultimately proved to be.

The press release couldn’t resist the “it’s worse than we thought” mindset that pervades climate science:

Since completing the research, the data for 1999 has been compiled. The researchers found that 1999 was the fifth warmest year on record, although as a La Nina year it would normally be cooler [than what?—ed.].

“La Nina” is the cool phase of El Nino, which drops temperatures about as much as El Nino raises them. What the press release and the GRL paper completely neglected to mention is that the great warm year of 1998 was a result of the “natural” El Nino superimposed upon the overall slight warming trend.

In other words, there was every reason to believe at that time that the anomalous temperatures were indeed a statistical blip resulting from a very high-amplitude version of a natural oscillation in the earth’s climate that occurred every few years.

Now, back to the last 13 years. The puny recent changes may also just be our atmosphere’s make-up call for the sudden warming of the late 1990s, or another hiccup.

It is characteristic for climate models whose carbon dioxide increase resembles the one being observed to produce constant rates of warming. There’s a good reason for this. Temperature responds logarithmically—i.e., less and less—to changes in this gas as its concentration increases. But the concentration tends to increase exponentially—i.e., more and more. The combination of an increasingly damped response to an ever-increasing rate of input tends to resemble a straight line, or a constant rate of warming.
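That argument is easy to verify numerically. A small sketch, assuming an illustrative sensitivity per doubling and an illustrative exponential growth rate for CO2 (neither value is a measurement from the text):

```python
import math

# Sketch of the argument: a logarithmic temperature response to an
# exponentially growing CO2 concentration yields a straight-line
# warming trend. S and g are illustrative assumptions.
S = 2.6     # assumed sensitivity per CO2 doubling (deg C)
g = 0.005   # assumed exponential growth rate of CO2 (0.5% per year)
C0 = 280.0  # preindustrial concentration (ppm)

def warming(t):
    C = C0 * (1.0 + g) ** t        # exponential concentration growth
    return S * math.log(C / C0, 2) # logarithmic temperature response

# Equal time increments produce equal warming increments:
# an early decade warms exactly as much as a late one.
early = warming(10) - warming(0)
late = warming(110) - warming(100)
print(round(early, 6) == round(late, 6))
```

Because the base-2 logarithm of an exponential is linear in time, every decade adds the same increment of warming—the straight line the paragraph describes.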

Indeed, Karl noted in his paper (and I have noted in virtually every public lecture I give) that “projections of temperature change in the next [i.e., the 21st] century, using [the United Nations’] business as usual scenarios…have relatively constant rates of global temperature increase”. It’s just that their constant rates tend to be higher than the one being observed. The average rate of warming predicted for this century by the UN is about 2.5°C per century, while the observed rate has been, as predicted, constant—but at a lower 1.7°C per century. As Figure 3 shows, this rate has been remarkably constant for over three decades.

 

Figure 3. Annual global average surface temperature anomaly (°C), 1976-2010 (data source: Hadley Center).  It’s hard to imagine a more constant trend, despite the 1998 peak and the subsequent torpid warming.

The bottom line is that short-term trends are not your friends when talking about long-term climate change.

References

Hansen, J.E., et al., 1988. Global climate changes as forecast by Goddard Institute for Space Studies three-dimensional model. Journal of Geophysical Research, 93, 9341-9364.

Karl, T. R., R. W. Knight, and B. Baker, 2000. The record breaking global temperatures of 1997 and 1998: Evidence for an increase in the rate of global warming? Geophysical Research Letters, 27, 719-722.

Michaels, P. J., and P. C. Knappenberger, 2009. Scientific Shortcomings in the EPA’s Endangerment Finding from Greenhouse Gases, Cato Journal, 29, 497-521, http://www.cato.org/pubs/journal/cj29n3/cj29n3-8.pdf.


The Current Wisdom: Better Model, Less Warming




Bet you haven’t seen this one on TV:  A newer, more sophisticated climate model has lost more than 25% of its predicted warming!  You can bet that if it had predicted that much more warming it would have made the local paper.

The change resulted from a more realistic simulation of the way clouds work, resulting in a major reduction in the model’s “climate sensitivity,” which is the amount of warming predicted for a doubling of the concentration of atmospheric carbon dioxide over what it was prior to the industrial revolution.

Prior to the modern era, atmospheric carbon dioxide concentrations, as measured in air trapped in ice in the high latitudes (which can be dated year-by-year), were pretty constant, around 280 parts per million (ppm). No wonder CO2 is called a “trace gas”—there really is not much of it around.

The current concentration is pushing about 390 ppm, an increase of about 40% in 250 years. This is a pretty good indicator of the amount of “forcing” or warming pressure that we are exerting on the atmosphere. Yes, there are other global warming gases going up, like the chlorofluorocarbons (refrigerants now banned by treaty), but the modern climate religion is that these are pretty much being cancelled by reflective “aerosol” compounds that go into the air along with the combustion of fossil fuels, mainly coal.

Most projections have carbon dioxide doubling to a nominal 600 ppm somewhere in the second half of this century, absent major technological changes (an assumption history tells us is very shaky). But the “sensitivity” is not reached as soon as we hit the doubling, thanks to the fact that it takes a lot of time to warm the ocean (just as it takes a lot of time to warm up a big pot of water with a small burner).

So the “sensitivity” is much closer to the temperature rise that a model projects about 100 years from now – assuming (again, shakily) that we ultimately switch to power sources that don’t release dreaded CO2 into the atmosphere somewhere around the time its concentration doubles.
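The “sensitivity” arithmetic these paragraphs describe is logarithmic: equilibrium warming is the sensitivity times the base-2 logarithm of the concentration ratio. A hedged sketch, with an illustrative sensitivity value rather than a measured one:

```python
import math

# Sketch of the "sensitivity" arithmetic: equilibrium warming scales
# with the base-2 logarithm of the concentration ratio. S is an
# illustrative assumption, not a measurement from the text.
def equilibrium_warming(C_ppm, S=3.0, C0_ppm=280.0):
    return S * math.log(C_ppm / C0_ppm, 2)

# At the doubled concentration (2 x 280 = 560 ppm) the warming equals
# S itself; at today's ~390 ppm we are already about 48% of the way.
print(round(equilibrium_warming(560.0), 2))  # -> 3.0
print(round(equilibrium_warming(390.0) / equilibrium_warming(560.0), 2))  # -> 0.48
```

Note this is the equilibrium number; as the text explains, the slow-to-warm ocean delays the actual temperature response well past the doubling date.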

The bottom line is that lower sensitivity means less future warming as a result of anthropogenic greenhouse gas emissions. So our advice: keep working on the models—eventually they may actually arrive at something close to the puny rate of warming that is being observed.

At any rate, improvements to the Japanese-developed Model for Interdisciplinary Research on Climate (MIROC) are the topic of a new paper by Masahiro Watanabe and colleagues in the current issue of the Journal of Climate. This modeling group has been working on a new version of their model (MIROC5) to be used in the upcoming 5th Assessment Report of the United Nations’ Intergovernmental Panel on Climate Change, due in late 2013. Two incarnations of the previous version (MIROC3.2) were included in the IPCC’s 4th Assessment Report (2007) and contribute to the IPCC “consensus” of global warming projections.

The high resolution version (MIROC3.2(hires)) was quite a doozy – responsible for far and away the greatest projected global temperature rise (see Figure 1). And the medium resolution model (MIROC3.2(medres)) is among the Top 5 warmest models. Together, the two MIROC models undoubtedly act to increase the overall model ensemble mean warming projection and expand the top end of the “likely” range of temperature rise.

FIGURE 1

Global temperature projections under the “midrange” scenario for greenhouse-gas emissions produced by the IPCC’s collection of climate models.  The MIROC high resolution model (MIROC3.2(hires)) is clearly the hottest one, and the medium range one isn’t very far behind.

The reason that the MIROC3.2 versions produce so much warming is that their sensitivity is very high, with the high-resolution version at 4.3°C (7.7°F) and the medium-resolution version at 4.0°C (7.2°F). These sensitivities are very near the high end of the distribution of climate sensitivities from the IPCC’s collection of models (see Figure 2).

FIGURE 2

Equilibrium climate sensitivities of the models used in the IPCC AR4 (with the exception of the MIROC5). The MIROC3.2 sensitivities are highlighted in red and lie near the upper end of the collection of model sensitivities. The new, improved MIROC5, which was not included in the IPCC AR4, is highlighted in magenta and lies near the low end of the model climate sensitivities (data from IPCC Fourth Assessment Report, Table 8.2, and Watanabe et al., 2010).

Note that the highest sensitivity is not necessarily in the hottest model, as observed warming is dependent upon how the model deals with the slowness of the oceans to warm.

The situation is vastly different in the new MIROC5 model. Watanabe et al. report that the climate sensitivity is now 2.6°C (4.7°F)—more than 25% less than in the previous version of the model.[1] If MIROC5 had been included in the IPCC’s AR4 collection of models, its climate sensitivity of 2.6°C would have been found near the low end of the distribution (see Figure 2), rather than pushing the high extreme as MIROC3.2 did.

And to what do we owe this large decline in the modeled climate sensitivity? According to Watanabe et al., a vastly improved handling of cloud processes involving “a prognostic treatment for the cloud water and ice mixing ratio, as well as the cloud fraction, considering both warm and cold rain processes.” In fact, the improved cloud scheme—which produces clouds that compare more favorably with satellite observations—projects that under a warming climate low-altitude clouds become a negative feedback rather than the positive feedback the old version of the model projected.[2] Instead of enhancing the CO2-induced warming, low clouds are now projected to retard it.

Here is how Watanabe et al. describe their results:

A new version of the global climate model MIROC was developed for better simulation of the mean climate, variability, and climate change due to anthropogenic radiative forcing….

MIROC5 reveals an equilibrium climate sensitivity of 2.6K, which is 1K lower than that in MIROC3.2(medres)…. This is probably because in the two versions, the response of low clouds to an increasing concentration of CO2 is opposite; that is, low clouds decrease (increase) at low latitudes in MIROC3.2(medres) (MIROC5).[3]

Is the new MIROC model perfect? Certainly not.  But is it better than the old one? It seems quite likely.  And the net result of the model improvements is that the climate sensitivity and therefore the warming projections (and resultant impacts) have been significantly lowered. And much of this lowering comes as the handling of cloud processes—still among the most uncertain of climate processes—is improved upon. No doubt such improvements will continue into the future as both our scientific understanding and our computational abilities increase.

Will this lead to an even greater reduction in climate sensitivity and projected temperature rise? There are many folks out there (including this author) who believe this is a very distinct possibility, given that observed warming in recent decades is clearly beneath the average predicted by climate models. Stay tuned!

References:

Intergovernmental Panel on Climate Change, 2007.  Fourth Assessment Report, Working Group 1 report, available at http://www.ipcc.ch.

Watanabe, M., et al., 2010. Improved climate simulation by MIROC5: Mean states, variability, and climate sensitivity. Journal of Climate, 23, 6312-6335.


[1] Watanabe et al. report that the sensitivity of MIROC3.2(medres) is 3.6°C (6.5°F), which is less than what was reported in the 2007 IPCC report. So 25% is likely a conservative estimate of the reduction in warming.

[2] Whether enhanced cloudiness enhances or cancels carbon-dioxide warming is one of the core issues in the climate debate, and is clearly not “settled” science.

[3] Degrees Kelvin (K) are the same as degrees Celsius (C) when looking at relative, rather than absolute, temperatures.

The Current Wisdom


History to Repeat:  Greenland’s Ice to Survive, United Nations to Continue Holiday Party

This year’s installment of the United Nations’ annual climate summit (technically known as the 16th meeting of the Conference of the Parties to the Framework Convention on Climate Change) has come and gone in Cancun. Nothing substantial came of it policy-wise—just the usual attempts by the developing world to shake down our already shaky economy in the name of climate change. News-wise, probably the biggest story was that during the conference, Cancun broke an all-time daily low temperature record. Last year’s confab in Copenhagen was pelted by snowstorms and subsumed in miserable cold. President Obama attended, failed to forge any meaningful agreement, and fled back to beat a rare Washington blizzard. He lost.

But surely as every holiday season now includes one of these enormous jamborees, dire climate stories appeared daily.  Polar bear cubs are endangered!  Glaciers are melting!!

Or so beat the largely overhyped drums, based upon this or that press release from Greenpeace or the World Wildlife Fund.

And, of course, no one bothered to mention a blockbuster paper appearing in Nature the day before the end of the Cancun confab, which reassures us that Greenland’s ice cap and glaciers are a lot more stable than alarmists would have us believe.  That would include Al Gore, fond of his lurid maps showing the melting all of Greenland’s ice submerging Florida.

Ain’t gonna happen.

The disaster scenario goes like this: Summer temperatures in Greenland are warming, leading to increased melting and the formation of ephemeral lakes on the ice surface. This water eventually finds a crevasse and then a way down thousands of feet to the bottom of a glacier, where it lubricates the underlying surface, accelerating the seaward march of the ice. Increase the temperature even more, and massive amounts of ice are deposited into the ocean by the year 2100, catastrophically raising sea levels.

According to Christian Schoof of the University of British Columbia (UBC), “The conventional view has been that meltwater permeates the ice from the surface and pools under the base of the ice sheet….This water then serves as a lubricant between the glacier and the earth underneath it….”

And, according to Schoof, that’s just not the way things work. A UBC press release about his Nature article noted that he found that “a steady meltwater supply from gradual warming may in fact slow down the glacier flow, while sudden water input could cause glaciers to speed up and spread.”

Indeed, Schoof finds that sudden water inputs, such as would occur with heavy rain, are responsible for glacial accelerations, but these last only one or a few days.

The bottom line?  A warming climate has very little to do with accelerating ice flow, but weather events do.

How important is this?  According to University of Leeds Professor Andrew Shepherd, who studies glaciers via satellite, “This study provides an elegant solution to one of the two key ice sheet instability problems” noted by the United Nations in their last (2007) climate compendium.  “It turns out that, contrary to popular belief, Greenland ice sheet flow might not be accelerated by increased melting after all,” he added.

I’m not so sure that those who hold the “popular belief” can explain why Greenland’s ice didn’t melt away thousands of years ago. Strong evidence indicates that for millennia after the end of the last ice age (approximately 11,000 years ago), the Eurasian Arctic averaged nearly 13°F warmer in July than it is now.

That’s because there are trees buried and preserved in the acidic Siberian tundra, and they can be carbon dated.  Where there is no forest today—because it’s too cold in summer—there were trees, all the way to the Arctic Ocean and even on some of the remote Arctic islands that are bare today. And, back then, thanks to the remnants of continental ice, the Arctic Ocean was smaller and the North American and Eurasian landmasses extended further north.

That work was by Glen MacDonald, from UCLA’s Geography Department. In his landmark 2000 paper in Quaternary Research, he noted that the only way that the Arctic could become so warm is for there to be a massive incursion of warm water from the Atlantic Ocean.  The only “gate” through which that can flow is the Greenland Strait, between Greenland and Scandinavia.

So, Greenland had to have been warmer for several millennia, too.

Now let’s do a little math to see if the “popular belief” about Greenland ever had any basis in reality.

In 2009, the University of Copenhagen’s B. M. Vinther and 13 coauthors published the definitive history of Greenland climate back to the ice age, studying ice cores taken over the entire landmass. An exceedingly conservative interpretation of their results is that Greenland was 1.5°C (2.7°F) warmer for the period from 5,000 to 9,000 years ago, which is also the warm period in Eurasia that MacDonald detected. The integrated warming is given by multiplying the time (4,000 years) by the warming (1.5°), and works out (in Celsius) to 6,000 “degree-years.”

Now let’s assume that our dreaded emissions of carbon dioxide spike the temperature there some 4°C.  Since we cannot burn fossil fuel forever, let’s put this in over 200 years.  That’s a pretty liberal estimate given that the temperature there still hasn’t exceeded values seen before in the 20th century.  Anyway, we get 800 (4 x 200) degree-years.

If the ice didn’t come tumbling off Greenland after 6,000 degree-years, how is it going to do so after only 800?  The integrated warming of Greenland in the post-ice-age warming (referred to as the “climatic optimum” in textbooks published prior to global warming hysteria) is over seven times what humans can accomplish in 200 years.  Why do we even worry about this?
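The degree-year comparison above is plain arithmetic, using the article’s own round numbers:

```python
# Degree-years: the duration of a warm period multiplied by how much
# warmer it was. Inputs are the article's round numbers.
optimum_degree_years = 4000 * 1.5  # ~1.5 deg C warmer, 5,000-9,000 years ago
modern_degree_years = 200 * 4.0    # assumed 4 deg C spike over 200 years

print(optimum_degree_years)                        # -> 6000.0
print(modern_degree_years)                         # -> 800.0
print(optimum_degree_years / modern_degree_years)  # -> 7.5
```

The “over seven times” in the text is this 7.5 ratio.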

So we can all sleep a bit better.  Florida will survive.  And, we can also rest assured that the UN will continue its outrageous holiday parties, accomplishing nothing, but living large.  Next year’s is in Durban, South Africa, yet another remote warm spot hours of Jet-A away.

References:

MacDonald, G. M., et al., 2000.  Holocene treeline history and climatic change across Northern Eurasia.  Quaternary Research 53, 302-311.

Schoof, C., 2010. Ice-sheet acceleration driven by melt supply variability. Nature 468, 803-805.

Vinther, B.M., et al., 2009.  Holocene thinning of the Greenland ice sheet. Nature 461, 385-388.

The Shocking Truth: The Scientific American Poll on Climate Change

November’s Scientific American features a profile of Georgia Tech atmospheric scientist Judith Curry, who has committed the mortal sin of reaching out to other scientists who hypothesize that global warming isn’t the disaster it’s been cracked up to be. I have personal experience with this, as she invited me to give a research seminar in Tech’s prestigious School of Earth and Atmospheric Sciences in 2008. My lecture summarizing the reasons for doubting the apocalyptic synthesis of climate change was well-received by an overflow crowd.

Written by Michael Lemonick, who hails from the shrill blog Climate Central, the article isn’t devoid of the usual swipes, calling her a “heretic,” which is hardly true at all. She’s simply another hardworking scientist who lets the data take her wherever it must, even if that leads her to question some of our more alarmist colleagues.

But, as a make-up call for calling attention to Curry, Scientific American has run a poll of its readers on climate change. Remember that SciAm has been shilling for the climate apocalypse for years, publishing a particularly vicious series of attacks on Denmark’s Bjorn Lomborg and his Skeptical Environmentalist. The magazine also featured NASA’s James Hansen and his outlandish claims on sea-level rise. Hansen has stated, under oath in a deposition, that a twenty-foot rise is quite possible within the next 89 years; oddly, he has failed to note that in 1988 he predicted that the West Side Highway in Manhattan would go permanently under water in twenty years.

SciAm probably expected a lot of people would agree with the key statement in their poll that the United Nations’ Intergovernmental Panel on Climate Change (IPCC) is “an effective group of government representatives and other experts.”

Hardly. As of this morning, only 16% of the 6,655 respondents agreed. 84%—that is not a typo—described the IPCC as “a corrupt organization, prone to groupthink, with a political agenda.”

The poll also asks “What should we do about climate change?” 69% say “nothing, we are powerless to stop it.” When asked about policy options, an astonishingly low 7% support cap-and-trade, which passed the U.S. House of Representatives in June, 2009, and cost approximately two dozen congressmen their seats.

The real killer is the question “What is causing climate change?” For this one, multiple answers are allowed: 26% said greenhouse gases from human activity, 32% solar variation, and 78% “natural processes.” (In reality, all three are causes of climate change.)

And finally, “How much would you be willing to pay to forestall the risk of catastrophic climate change?”  80% of the respondents said “nothing.”

Remember that this comes from what is hardly a random sample.  Scientific American is a reliably statist publication and therefore appeals to a readership that is skewed to the left of the political center.  This poll demonstrates that virtually everyone now acknowledges that the UN has corrupted climate science, that climate change is impossible to stop, and that futile attempts like cap-and-trade do nothing but waste money and burn political capital, things that Cato’s scholars have been saying for years.

The Current Wisdom

NOTE:  This is the first in a series of monthly posts in which Senior Fellow Patrick J. Michaels reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.


The Iceman Goeth:  Good News from Greenland and Antarctica

How many of us have heard that global sea level will be about a meter—more than three feet—higher in 2100 than it was in the year 2000?  There are even scarier stories, circulated by NASA’s James E. Hansen, that the rise may approach 6 meters, altering shorelines and inundating major cities and millions of coastal inhabitants worldwide.

Figure 1. Model from a travelling climate change exhibit (currently installed at the Field Museum of Natural History in Chicago) of Lower Manhattan, showing what 5 meters (16 feet) of sea level rise would look like.

In fact, a major exhibition now at the prestigious Chicago Field Museum includes a 3-D model of Lower Manhattan under 16 feet of water—this despite the general warning from James Titus, who has been EPA’s sea-level authority for decades:

Researchers and the media need to stop suggesting that Manhattan or even Miami will be lost to a rising sea. That’s not realistic; it promotes denial and panic, not a reasoned consideration of the future.

Titus was commenting upon his 2009 publication on sea-level rise in the journal Environmental Research Letters.

The number one rule of grabbing attention for global warming is to never let the facts stand in the way of a good horror story, so advice like Titus’s is usually ignored.

The catastrophic sea level rise proposition is built upon the idea that large parts of the ice fields that lie atop Greenland and Antarctica will rapidly melt and slip into the sea as temperatures there rise.  Proponents of this idea claim that the United Nations’ Intergovernmental Panel on Climate Change (IPCC), in its most recent (2007) Assessment Report, was far too conservative in its projections of future sea level rise, the mean value of which is about 15 inches by the year 2100.

In fact, contrary to virtually all news coverage, the IPCC actually anticipates that Antarctica will gain ice mass (and lower sea level) as the climate warms, since the temperature there is too low to produce much melting even if it warms up several degrees, while the warmer air holds more moisture and therefore precipitates more snow. The IPCC projects Greenland to contribute a couple of inches of sea level rise as ice melts around its periphery.

Alarmist critics claim that the IPCC’s projections are based only on direct melt estimates rather than “dynamic” responses of the glaciers and ice fields to rising temperatures.

These include Al Gore’s favorite explanation—that melt water from the surface percolates down to the bottom of the glacier and lubricates its base, increasing flow and ultimately ice discharge. Alarmists like Gore and Hansen claim that Greenland and Antarctica’s glaciers will then “surge” into the sea, dumping an ever-increasing volume of ice and raising water levels worldwide.

The IPCC did not include this mechanism because it is highly speculative and not well understood.  Indeed, new science suggests that the IPCC’s minuscule projections of sea level rise from these two great ice masses are being confirmed.

About a year ago, several different research teams reported that while glaciers may surge from time to time and increase ice discharge rates, these surges are not long-lived, and basal lubrication is not a major factor in them.  One research group, led by Faezeh Nick, reported that “our modeling does not support enhanced basal lubrication as the governing process for the observed changes.”  Nick and colleagues go on to find that short-term rapid increases in discharge rates are not stable, that “extreme mass loss cannot be dynamically maintained in the long term,” and ultimately conclude that “[o]ur results imply that the recent rates of mass loss in Greenland’s outlet glaciers are transient and should not be extrapolated into the future.”

But this is actually old news.  The new news is that the commonly reported (and commonly hyped) satellite estimates of mass loss from both Greenland and Antarctica were the result of improper calibration, overestimating ice loss by some 50%.

As with any new technology, it takes a while to get all the kinks worked out. In the case of the Gravity Recovery and Climate Experiment (GRACE) satellite-borne instrumentation, one of the major problems is interpreting just what exactly the satellites are measuring. When trying to ascertain mass changes (for instance, from ice loss) from changes in the earth’s gravity field, you first have to know how the actual land under the ice is vertically moving (in many places it is still slowly adjusting from the removal of the glacial ice load from the last ice age).

The latest research, by a team led by Xiaoping Wu of Caltech’s Jet Propulsion Laboratory, concludes that the adjustment models used by previous researchers working with the GRACE data didn’t do a very good job.  Wu and colleagues enhanced the existing models by incorporating land movements from a network of GPS sensors and employing more sophisticated statistics.  What they found has been turning heads.

Using the GRACE measurements and the improved model, the new estimates of the rates of ice loss from Greenland and Antarctica  are only about half as much as the old ones.

Instead of Greenland losing ~230 gigatons of ice each year since 2002, the new estimate is 104 Gt/yr. And for Antarctica, the old estimate of ~150 Gt/yr has been modified to be about 87 Gt/yr.

How does this translate into sea level rise?

It takes about 37.4 gigatons of ice loss to raise the global sea level 0.1 millimeter, or four thousandths of an inch.  In other words, ice loss from Greenland is currently contributing just over one-fourth of a millimeter of sea level rise per year, or about one one-hundredth of an inch.  Antarctica’s contribution is just under one-fourth of a millimeter per year.  So together, these two regions—which contain 99% of all the land ice on earth—are losing ice at a rate that raises sea level by about half a millimeter per year, a bit less than two hundredths of an inch.  If this continues for the next 90 years, the total sea level rise contributed by Greenland and Antarctica by the year 2100 will amount to less than 2 inches.

Couple this with perhaps 6-8 inches from thermal expansion of the ocean as temperatures rise, and 2-3 inches from the melting of other land-based ice, and you get a sum total of about one foot of additional rise by century’s end.

This is about one-third of the 1-meter estimates and one-twentieth of the 6-meter estimates.
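The arithmetic above is easy to check.  Here is a minimal sketch using only the figures quoted in the text (roughly 374 gigatons of ice loss per millimeter of global sea level, and the revised loss rates of 104 and 87 Gt/yr):

```python
# Back-of-the-envelope check of the sea level arithmetic above.
# Figures from the text: ~374 Gt of ice loss raises global sea level 1 mm.
GT_PER_MM = 374.0
MM_PER_INCH = 25.4

greenland_gt_per_yr = 104.0   # revised GRACE estimate (Wu et al. 2010)
antarctica_gt_per_yr = 87.0

greenland_mm = greenland_gt_per_yr / GT_PER_MM    # ~0.28 mm/yr
antarctica_mm = antarctica_gt_per_yr / GT_PER_MM  # ~0.23 mm/yr
total_mm_per_yr = greenland_mm + antarctica_mm    # ~0.51 mm/yr

# Projected contribution over the 90 years from 2010 to 2100,
# assuming the current rate simply continues.
rise_by_2100_in = total_mm_per_yr * 90 / MM_PER_INCH

print(f"Combined rate: {total_mm_per_yr:.2f} mm/yr")
print(f"Contribution by 2100: {rise_by_2100_in:.1f} inches")
```

The result is about half a millimeter per year and under two inches by 2100, matching the numbers in the text.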

Things had better get cooking in a hurry if the real world is going to approach these popular estimates. And there are no signs that such a move is underway.

So far, the 21st century has been pretty much of a downer for global warming alarmists. Not only has the earth been warming at a rate considerably less than the average rate projected by climate models, but now the sea level rise is suffering a similar fate.

Little wonder that political schemes purporting to save us from these projected (non)calamities are also similarly failing to take hold.

References:

Nick, F. M., et al., 2009. Large-scale changes in Greenland outlet glacier dynamics triggered at the terminus. Nature Geoscience, DOI:10.1038, published on-line January 11, 2009.

Titus, J.G., et al., 2009. State and Local Governments Plan for Development of Most Land Vulnerable to Rising Sea Level along the U.S. Atlantic Coast, Environmental Research Letters 4 044008. (doi: 10.1088/1748-9326/4/4/044008).

Wu, X., et al., 2010. Simultaneous estimation of global present-day water transport and glacial isostatic adjustment. Nature Geoscience, published on-line August 15, 2010, doi: 10.1038/ngeo938.

Kerry and Lieberman Unveil Their Climate Bill: Such a Deal!

I see that my colleague Sallie James has already blogged on the inherent protectionism in the Senate’s long-awaited cap-and-tax bill.  A summary was leaked last night by The Hill.

Well, we now have the real “discussion draft” of “The American Power Act” [APA], sponsored by John Kerry (D-MA) and Joe Lieberman (I-CT).  Lindsey Graham (R-SC) used to be on the earlier drafts, but excused himself to have a temper tantrum.

So, while Sallie talked about the trade aspects of the bill, I’d like to blather about the mechanics, costs, and climate effects.  If you don’t want to read the excruciating details, stop here and note that the bill mandates the impossible, will not produce any meaningful reduction of planetary warming, and will subsidize just about every form of power that is too inefficient to compete today.

APA reduces emissions to the same levels that were in the Waxman-Markey bill passed by the House last June 26.  Remember that one – snuck through on a Friday evening, just so no one would notice?  Well, people did, and it, not health care, started the angry townhall meetings last summer.  No accident, either, that Obama’s approval ratings immediately tanked.

Just like Waxman-Markey, APA would hold the average American to the carbon dioxide emissions of the average citizen back in 1867, a mere 39 years from today.  Just like Waxman-Markey, the sponsors have absolutely no idea how to accomplish this.  Instead they wave magic wands for noncompetitive technologies like “Carbon Capture and Sequestration” (“CCS,” aka “clean coal”), solar energy and windmills, and ethanol (“renewable energy”), among many others.

Just like Waxman-Markey, no one knows the (enormous) cost.  How do you put a price on something that doesn’t exist?  We simply don’t know how to reduce emissions by 83%.  Consequently, APA is yet another scheme to make carbon-based energy so expensive that you won’t use it.

This will be popular!  At $4.00 a gallon, Americans reduced their consumption of gasoline by a whopping 4%.  Go figure out how high it has to get to drop by 83%.

Oh, I know. Plug-in hybrid cars will replace gasoline-powered ones.  Did I mention that the government-produced Chevrolet Volt will, at first, only be sold to governments and in warm climates, because even the Obama Administration fears that the car will not be very popular where most of us live?  Did I mention that the electric power that charges the battery most likely comes from the combustion of a carbon-based fuel?  Getting to that 83% requires getting rid of carbon emissions from power production.  Period.  In 39 years.  Got a replacement handy?

Don’t trot out natural gas.  It burns to carbon dioxide and water, just like coal.  True, it emits about 55% of the carbon dioxide that comes from coal per unit of energy, but we’ll also use a lot more electricity over the next forty years.  In other words, switching to natural gas will keep adding emissions to the atmosphere.
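That point is a one-line calculation.  As a purely illustrative sketch, suppose electricity demand grows 80% over the next forty years (an assumed figure, not one from the text) while every coal plant is replaced by gas emitting 55% of coal’s CO2 per unit of energy:

```python
# Illustrative arithmetic: the 55% figure is from the text; the
# 80% demand-growth figure is an assumption for this example.
gas_co2_fraction = 0.55   # CO2 per unit energy, gas vs. coal
demand_growth = 1.80      # electricity demand in 2050 vs. today (assumed)

# Emissions relative to today's coal-fired baseline if the entire
# (larger) fleet ran on natural gas:
relative_emissions = gas_co2_fraction * demand_growth
print(f"Emissions vs. today: {relative_emissions:.0%}")
```

Under those assumptions, emissions end up roughly flat at best, nowhere near an 83% cut.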

Anyway, just for fun, I plugged the APA emissions reduction schedule into the Model for the Assessment of Greenhouse-gas Induced Climate Change (MAGICC – I am not making this up), which is what the United Nations uses to estimate the climatic effects of various greenhouse-gas scenarios.

I’ve included two charts with three scenarios, one for 2050 and the other for 2100.  They assume that the “sensitivity” of temperature to a doubling of atmospheric carbon dioxide is 2.5°C, a number that many scientists think is too high, given the pokey greenhouse-effect warming of the planet that has occurred as we have effectively gone halfway to a doubling already.  The charts show the prospective warming given by MAGICC.

The first scenario is “business-as-usual,” the perhaps too-optimistic name for a future without APA.  The second assumes that only the US adopts APA, and the third assumes that every nation with “obligations” under the UN’s Kyoto Protocol on global warming does the same.

As you can plainly see,  APA does nothing, even if all the Kyoto-signatories meet its impossible mandates.  The amount of warming “saved” by 2100 is 7% of the total for Business-as-Usual, or two-tenths of a degree Celsius. That amount will be barely detectable above the year-to-year normal fluctuations.  Put another way, if we believe in MAGICC, APA – if adopted by us, Europe, Canada, and the rest of the Kyotos – will reduce the prospective temperature in 2100 to what it would be in 2093.

That’s a big if.  Of course, we could go it alone. In that case, the temperature reduction would in fact be too small to measure reliably.
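The “2093” equivalence follows directly from the chart’s numbers.  A rough sketch, assuming (as the text implies) that the 0.2°C saved is 7% of business-as-usual warming and that warming accrues roughly linearly over the century:

```python
# Back out the business-as-usual warming implied by the figures above.
saved_warming_c = 0.2      # warming "saved" by 2100 under APA + Kyoto nations
saved_fraction = 0.07      # ...expressed as a share of business-as-usual

bau_warming_2100 = saved_warming_c / saved_fraction   # ~2.9 C

# If warming accrues roughly linearly over the century, saving 0.2 C
# is equivalent to delaying the 2100 temperature by about:
delay_years = 100 * saved_warming_c / bau_warming_2100

print(f"Implied BAU warming by 2100: {bau_warming_2100:.1f} C")
print(f"Temperature delayed by about {delay_years:.0f} years")
```

Seven years of delay: the 2100 temperature arrives in 2093 instead, consistent with the claim above.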

I’m hoping these numbers surface in the “debate” over APA.

So there you have it, the new American Power Act, a bill that doesn’t know how to achieve its mandates, has a completely unknown but astronomical cost, and doesn’t do a darned thing about global warming.  Such a deal!