
The Current Wisdom: Overplaying the Human Contribution to Recent Weather Extremes

The Current Wisdom is a series of monthly posts in which Senior Fellow Patrick J. Michaels reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

The Current Wisdom only comments on science appearing in the refereed, peer-reviewed literature, or that has been peer-screened prior to presentation at a scientific congress.


The recent publication of two articles in Nature magazine linking rainfall extremes (and flooding) to global warming, added to the heat in Russia and the floods in Pakistan in the summer of 2010 and the back-to-back cold and snowy winters in the eastern U.S. and western Europe, has gotten a lot of public attention. This includes a recent hearing in the House of Representatives, despite its Republican majority. Tying weather extremes to global warming, or using them as “proof” that warming doesn’t exist (see: snowstorms), is a popular rhetorical flourish by politicos of all stripes.

The hearing struck many as quite odd, inasmuch as it is much clearer than apocalyptic global warming that the House is going to pass meaningless legislation commanding the EPA to cease and desist from regulating greenhouse gas emissions. “Meaningless” means that it surely will not become law. Even on the long-shot chance that it passes the Senate, the President will surely veto it, and there are nowhere near enough votes to override such an action.

Perhaps “wolf!” has been cried yet again. A string of soon-to-be-published papers in the scientific literature finds that, despite all the hue and cry about global warming and recent extreme weather events, natural climate variability is to blame.

Where to start?  How about last summer’s Russian heat wave?

The Russian heat wave (and to some degree the floods in Pakistan) has been linked to the same large-scale, stationary weather system, called an atmospheric “blocking” pattern. When the atmosphere is “blocked,” it stays in the same configuration for a period of several weeks (or more) and keeps delivering the same weather to the same area for what can seem like an eternity to people in the way. Capitalizing on the misery in Russia and Pakistan, atmospheric blocking was added to the list of things that were supposed to be “consistent with” anthropogenically stimulated global warming, which already, of course, included heat waves and floods. And thus the Great Russian Heat Wave of 2010 became part of global warming lore.

But then a funny thing happened: scientists with a working knowledge of atmospheric dynamics started to review the situation and found scant evidence of a global warming connection.

The first chink in the armor came back in the fall of 2010, when scientists from the Physical Sciences Division (PSD) of the Earth System Research Laboratory (ESRL) of the National Oceanic and Atmospheric Administration (NOAA) presented the results of their preliminary investigation on the web, and concluded that “[d]espite this strong evidence for a warming planet, greenhouse gas forcing fails to explain the 2010 heat wave over western Russia. The natural process of atmospheric blocking, and the climate impacts induced by such blocking, are the principal cause for this heat wave.”

The PSD folks have now followed this up with a new peer-reviewed article in the journal Geophysical Research Letters that rejects the global warming explanation. The paper is titled “Was There a Basis for Anticipating the 2010 Russian Heat Wave?” Turns out that there wasn’t.

To prove this, the research team, led by PSD’s Randall Dole, first reviewed the observed temperature history of the region affected by the heat wave (western Russia, Belarus, the Ukraine, and the Baltic nations). To start, they looked at the recent antecedent conditions: “Despite record warm globally-averaged surface temperatures over the first six months of 2010, Moscow experienced an unusually cold winter and a relatively mild but variable spring, providing no hint of the record heat yet to come.” Nothing there.

Then they looked at the long-term temperature record: “The July surface temperatures for the region impacted by the 2010 Russian heat wave shows no significant warming trend over the prior 130-year period from 1880 to 2009…. A linear trend calculation yields a total temperature change over the 130 years of -0.1°C (with a range of 0 to -0.4°C over the four data sets [they examined]).” There’s not a hint of a build-up to a big heat wave.
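The linear trend the authors quote is just an ordinary least-squares fit to the July series, expressed as a total change over the record. As a rough sketch of that calculation (the `anomalies` series here is made-up stand-in data, not any of the four observational datasets Dole et al. actually used):

```python
import numpy as np

# Hypothetical July temperature anomalies (degrees C) for 1880-2009.
# Stand-in random data; the real analysis used observational datasets.
rng = np.random.default_rng(0)
years = np.arange(1880, 2010)            # 130 Julys
anomalies = rng.normal(0.0, 1.0, years.size)

# Ordinary least-squares linear trend, in degrees C per year...
slope, intercept = np.polyfit(years, anomalies, 1)

# ...expressed as a total change over the 130-year record,
# the form quoted in the paper (-0.1 degrees C for the real data)
total_change = slope * (years[-1] - years[0])
print(f"trend: {slope:+.5f} C/yr, total 1880-2009 change: {total_change:+.2f} C")
```

With the real datasets this procedure yields the roughly flat result quoted above; with trendless stand-in data the fitted slope is simply small noise.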

And as to the behavior of temperature extremes: “There is also no clear indication of a trend toward increasing warm extremes. The prior 10 warmest Julys are distributed across the entire period and exhibit only modest clustering earlier in this decade, in the 1980s and in the 1930s…. This behavior differs substantially from globally averaged annual temperatures, for which eleven of the last twelve years ending in 2006 rank among the twelve warmest years in the instrumental record since 1850….”

With regard to any indication that “global” warming was pushing temperatures higher in Russia and thus helped to fuel the extreme heat last summer, Dole et al. say this: “With no significant long-term trend in western Russia July surface temperatures detected over the period 1880-2009, mean regional temperature changes are thus very unlikely to have contributed substantially to the magnitude of the 2010 Russian heat wave.”

Next the PSD folks looked to see if the larger-scale antecedent conditions, fed into climate models, would produce the atmospheric circulation patterns (i.e., blocking) that gave rise to the heat wave. The tested “predictors” included patterns of sea surface temperature and Arctic ice coverage, which most people feel have been subject to some human influence. No relationship was found: “These findings suggest that the blocking and heat wave were not primarily a forced response to specific boundary conditions during 2010.”

In fact, the climate models exhibited no predilection for projecting increases in the frequency of atmospheric blocking patterns over the region as greenhouse gas concentrations increased. Just the opposite: “Results using very high-resolution climate models suggest that the number of Euro-Atlantic blocking events will decrease by the latter half of the 21st century.”

At this point, Dole and colleagues had about exhausted all lines of inquiry and summed things up:

 Our analysis points to a primarily natural cause for the Russian heat wave. This event appears to be mainly due to internal atmospheric dynamical processes that produced and maintained an intense and long-lived blocking event. Results from prior studies suggest that it is likely that the intensity of the heat wave was further increased by regional land surface feedbacks. The absence of long-term trends in regional mean temperatures and variability together with the model results indicate that it is very unlikely that warming attributable to increasing greenhouse gas concentrations contributed substantially to the magnitude of this heat wave.

Can’t be much clearer than that.

But that was last summer. What about the past two winters? Both were very cold in the eastern U.S., with record snow events and/or totals scattered about the country.

Cold, snow, and global warming? On Christmas Day 2010, the New York Times ran an op-ed by Judah Cohen, a long-range forecaster at the firm Atmospheric and Environmental Research, outlining his theory as to how late-summer Arctic ice declines lead to more fall snow cover across Siberia, which in turn induces atmospheric circulation patterns that favor snowstorms along the East Coast of the U.S. Just last week, the Union of Concerned Scientists held a news conference where they handed out a press release headlined “Climate Change Makes Major Snowstorms Likely.” In that release, Mark Serreze, director of the National Snow and Ice Data Center, laid out his theory as to how the loss of Arctic sea ice is helping to provide more moisture to fuel winter snowstorms across the U.S., as well as altering atmospheric circulation patterns into a preferred state for big snowstorms. Weather Underground’s Jeff Masters chimed in with “Heavy snowstorms are not inconsistent with a warming planet.”

As is the wont for this Wisdom, let’s go back to the scientific literature.

Another soon-to-be-released paper, to appear in Geophysical Research Letters, describes the results of using the seasonal weather prediction model from the European Centre for Medium-Range Weather Forecasts (ECMWF) to help untangle the causes of the unusual atmospheric circulation patterns that gave rise to the harsh winter of 2009-2010 on both sides of the Atlantic. A team of ECMWF scientists led by Thomas Jung went back and ran experiments changing the initial conditions fed into the ECMWF model, and then assessed how well the model simulated the known weather patterns of the winter of 2009-2010. The different sets of initial conditions were selected so as to test all the pet theories behind the origins of the harsh winter. Jung et al. describe their investigations this way: “Here, the origin and predictability of the unusual winter of 2009/10 are explored through numerical experimentation with the ECMWF Monthly forecasting system. More specifically, the role of anomalies in sea surface temperature (SST) and sea ice, the tropical atmospheric circulation, the stratospheric polar vortex, solar insolation and near surface temperature (proxy for snow cover) are examined.”

Here is what they found after running their series of experiments.

Arctic sea ice and sea surface temperature anomalies.  These are often associated with global warming caused by people. Finding:  “These results suggest that neither SST nor sea ice anomalies explain the negative phase of the NAO during the 2009/10 winter.”

(NAO is the commonly used abbreviation for the North Atlantic Oscillation, an atmospheric circulation pattern that can influence winter weather in the eastern U.S. and western Europe. A negative phase of the NAO is associated with cold and stormy weather, and during the winter of 2009-10 the NAO value was the lowest ever observed.)

A global warming-induced weakening of the stratospheric (upper-atmosphere) jet stream. “Like for the other experiments, these stratospheric relaxation experiments fail to reproduce the magnitude of the observed NAO anomaly.”

Siberian snow cover.  “The resulting [upper air patterns] show little resemblance with the observations…. The implied weak role of snow cover anomalies is consistent with other research….”

Solar variability.  “The experiments carried out in this study suggest that the impact of anomalously low incoming [ultraviolet] radiation on the tropospheric circulation in the North Atlantic region are very small… suggesting that the unusually low solar activity contributed little, if any, to the observed NAO anomaly during the 2009/10 winter.”

OK then, what did cause the unusual weather patterns during the 2009-10 winter?

The results of this study, therefore, increase the likelihood that both the development and persistence of negative NAO phase resulted from internal atmospheric dynamical processes.

Translation: Random variability.

To drive this finding home, here’s another soon-to-be-released paper (D’Arrigo et al., 2011) that uses tree-ring-based reconstructions of atmospheric circulation patterns and finds that a similar set of conditions (including a negative NAO value second only to that of the 2009-10 winter) was responsible for the historically harsh winter of 1783-84 in the eastern U.S. and western Europe, which was widely noted by historians. It followed the stupendous eruption of the Icelandic volcano Laki the previous summer, and the frigid and snowy winter conditions have long been blamed on the volcano. In fact, Benjamin Franklin commented as much.

But in their new study, Roseanne D’Arrigo and colleagues conclude that the harshness of that winter was primarily the result of anomalous atmospheric circulation patterns that closely resembled those observed during the winter of 2009-10, and that the previous summer’s volcanic eruption played a far less prominent role:

Our results suggest that Franklin and others may have been mistaken in attributing winter conditions in 1783-4 mainly to Laki or another eruption, rather than unforced variability.

Similarly, conditions during the 2009-10 winter likely resulted from natural [atmospheric] variability, not tied to greenhouse gas forcing… Evidence thus suggests that these winters were linked to the rare but natural occurrence of negative NAO and El Niño events.

The point is that natural variability can and does produce extreme events on every time scale: days (e.g., individual storms), weeks (e.g., the Russian heat wave), months (e.g., the winter of 2009-10), decades (e.g., the lack of global warming since 1998), centuries (e.g., the Little Ice Age), millennia (e.g., the cycle of major Ice Ages), and eons (e.g., snowball earth).

Folks would do well to keep this in mind the next time global warming is posited as the cause of the weather disaster du jour. Almost assuredly, it is all hype and little might.

Too bad these results weren’t given a “hearing” in the House!


D’Arrigo, R., et al., 2011. The anomalous winter of 1783-1784: Was the Laki eruption or an analog of the 2009–2010 winter to blame? Geophysical Research Letters, in press.

Dole, R., et al., 2011. Was there a basis for anticipating the 2010 Russian heat wave? Geophysical Research Letters, in press.

Jung, T., et al., 2011. Origin and predictability of the extreme negative NAO winter of 2009/10. Geophysical Research Letters, in press.

Min, S-K., et al., 2011. Human contribution to more-intense precipitation extremes. Nature, 470, 378-381.

Pall, P., et al., 2011. Anthropogenic greenhouse gas contribution to flood risk in England and Wales in autumn 2000. Nature, 470, 382-386.

Cherry Picking Climate Catastrophes: Response to Conor Clarke, Part II

Conor Clarke, at The Atlantic blog, raised several issues with my study, “What to Do About Climate Change,” which Cato published last year.

One of Conor Clarke’s comments was that my analysis did not extend beyond the 21st century. He found this problematic because, as Conor put it, climate change would extend beyond 2100, and even if GDP is higher in 2100 with unfettered global warming than without, it’s not obvious that this GDP would continue to be higher “in the year 2200 or 2300 or 3758”. I addressed this portion of his argument in Part I of my response. Here I will address the second part of this argument, that “the possibility of ‘catastrophic’ climate change events — those with low probability but extremely high cost — becomes real after 2100.”

The examples of potentially catastrophic events that could be caused by anthropogenic greenhouse gas-induced global warming (AGW) that have been offered to date (e.g., melting of the Greenland or West Antarctic Ice Sheets, or the shutdown of the thermohaline circulation) contain a few drops of plausibility submerged in oceans of speculation. There are no scientifically justified estimates of the probability of their occurrence by any given date. Nor are there scientifically justified estimates of the magnitude of damages such events might cause, not just in biophysical terms but also in socioeconomic terms. Therefore, to call these events “low probability” — as Mr. Clarke does — is a misnomer. They are more appropriately termed plausible but highly speculative events.

Consider, for example, the potential collapse of the Greenland Ice Sheet (GIS). According to the IPCC’s WG I Summary for Policy Makers (p. 17), “If a negative surface mass balance were sustained for millennia, that would lead to virtually complete elimination of the Greenland Ice Sheet and a resulting contribution to sea level rise of about 7 m” (emphasis added). Presumably the same applies to the West Antarctic Ice Sheet.

But what is the probability that a negative surface mass balance can, in fact, be sustained for millennia, particularly after considering the amount of fossil fuels that can be economically extracted and the likelihood that other energy sources will not displace fossil fuels in the interim? [Remember we are told that peak oil is nigh, that renewables are almost competitive with fossil fuels, and that wind, solar and biofuels will soon pay for themselves.]

Moreover, for an event to be classified as a catastrophe, it should occur relatively quickly, precluding efforts by man or nature to adapt or otherwise deal with it. But if it occurs over millennia, as the IPCC says, or even centuries, that gives humanity ample time to adjust, albeit at a socioeconomic cost. It need not be prohibitively dangerous to life, limb or property if: (1) the total amount of sea level rise (SLR) and, perhaps more importantly, the rate of SLR can be predicted with some confidence, as seems likely in the next few decades considering the resources being expended on such research; (2) the rate of SLR is slow relative to how fast populations can strengthen coastal defenses and/or relocate; and (3) there are no insurmountable barriers to migration.

This would be true even had the so-called “tipping point” already been passed and the ultimate disintegration of the ice sheet become inevitable, so long as it takes millennia for the disintegration to be realized. In other words, the issue isn’t just whether the tipping point is reached; rather, it is how long the tipping actually takes. Consider, for example, a hand grenade tossed into a crowded room. Whether this results in tragedy — and the magnitude of that tragedy — depends upon how much time it takes for the grenade to go off, the reaction time of the occupants, and their ability to respond.

Lowe et al. (2006, pp. 32-33), based on a “pessimistic, but plausible, scenario in which atmospheric carbon dioxide concentrations were stabilised at four times pre-industrial levels,” estimated that a collapse of the Greenland Ice Sheet would, over the next 1,000 years, raise sea level by 2.3 meters (with a peak rate of 0.5 cm/yr). If one were to arbitrarily double that to account for potential melting of the West Antarctic Ice Sheet, that implies an SLR of ~5 meters in 1,000 years, with a peak rate (assuming the peaks coincide) of 1 meter per century.

Such a rise would not be unprecedented. Sea level has risen 120 meters in the past 18,000 years — an average of 0.67 meters/century — and as much as 4 meters/century during the meltwater pulse 1A episode 14,600 years ago (Weaver et al. 2003; subscription required). Neither humanity nor, from the perspective of millennial time scales (per the above quote from the IPCC), the rest of nature seems the worse for it. Coral reefs, for example, evolved and their compositions changed over millennia as new reefs grew while older ones were submerged in deeper water (e.g., Cabioch et al. 2008). So while there have been ecological changes, it is unknown whether the changes were for better or worse. For a melting of the GIS (or WAIS) to qualify as a catastrophe, one has to show, rather than assume, that the ecological consequences would, in fact, be for the worse.
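The rates being compared in the last two paragraphs are easy to check. A back-of-the-envelope sketch, using only the numbers quoted above:

```python
# Greenland collapse scenario from Lowe et al. (2006)
gis_total_m = 2.3                 # sea level rise over 1,000 years
gis_peak_cm_per_yr = 0.5          # peak rate

# Convert the peak rate: 0.5 cm/yr x 100 yr/century / 100 cm/m
gis_peak_m_per_century = gis_peak_cm_per_yr * 100 / 100   # 0.5 m/century

# Arbitrary doubling for potential West Antarctic melt, per the text
total_m = 2 * gis_total_m                                 # ~4.6 ("~5 meters")
peak_m_per_century = 2 * gis_peak_m_per_century           # 1.0 m/century

# Post-glacial comparison: 120 m of rise over the past 18,000 years
avg_m_per_century = 120 / (18_000 / 100)                  # ~0.67 m/century

print(total_m, peak_m_per_century, avg_m_per_century)
```

So even the doubled, pessimistic peak rate (1 m/century) sits between the post-glacial average (0.67 m/century) and the meltwater pulse 1A extreme (4 m/century).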

Human beings can certainly cope with sea level rise of such magnitudes if they have centuries or millennia to do so. In fact, if necessary they could probably get out of the way in a matter of decades, if not years.

Can a relocation of such a magnitude be accomplished?

Consider that the global population increased from 2.5 billion in 1950 to 6.8 billion this year. Among other things, this meant creating the infrastructure for an extra 4.3 billion people in the intervening 59 years (as well as improving the infrastructure for the 2.5 billion counted in the baseline, many of whom barely had any infrastructure whatsoever in 1950). These improvements occurred at a time when everyone was significantly poorer. (Global per capita income is more than 3.5 times greater today than it was in 1950.) Therefore, while relocation will be costly, in theory tomorrow’s much wealthier world ought to be able to relocate billions of people to higher ground over the next few centuries, if need be. In fact, once a decision is made to relocate, the cost differential of relocating, say, 10 meters higher rather than a meter higher is probably marginal. It should also be noted that over millennia the world’s infrastructure will have to be renewed or replaced dozens of times – and the world will be better for it. [For example, the ancient city of Troy, once on the coast but now a few kilometers inland, was built and rebuilt at least 9 times over 3 millennia.]

Also, so long as we are concerned about potential geological catastrophes whose probability of occurrence and impacts have yet to be scientifically estimated, we should also consider equally low or higher probability events that might negate their impacts. Specifically, it is quite possible — in fact probable — that somewhere between now and 2100 or 2200, technologies will become available that will deal with climate change much more economically than currently available technologies for reducing GHG emissions. Such technologies may include ocean fertilization, carbon sequestration, geo-engineering options (e.g., deploying mirrors in space) or more efficient solar or photovoltaic technologies. Similarly, there is a finite, non-zero probability that new and improved adaptation technologies will become available that will substantially reduce the net adverse impacts of climate change.

The historical record shows that this has occurred over the past century for virtually every climate-sensitive sector that has been studied. For example, from 1900-1970, U.S. death rates due to various climate-sensitive water-related diseases — dysentery, typhoid, paratyphoid, other gastrointestinal disease, and malaria — declined by 99.6 to 100.0 percent. Similarly, poor agricultural productivity exacerbated by drought contributed to famines in India and China off and on through the 19th and 20th centuries, killing millions of people, but such famines haven’t recurred since the 1970s, despite any climate change and the fact that populations are several-fold higher today. And by the early 2000s, deaths and death rates due to extreme weather events had dropped worldwide by over 95% from their earlier 20th-century peaks (Goklany 2006).

With respect to another global warming bogeyman — the shutdown of the thermohaline circulation (AKA the meridional overturning circulation), the basis for the deep freeze depicted in the movie, The Day After Tomorrow — the IPCC WG I SPM notes (p. 16), “Based on current model simulations, it is very likely that the meridional overturning circulation (MOC) of the Atlantic Ocean will slow down during the 21st century. The multi-model average reduction by 2100 is 25% (range from zero to about 50%) for SRES emission scenario A1B. Temperatures in the Atlantic region are projected to increase despite such changes due to the much larger warming associated with projected increases in greenhouse gases. It is very unlikely that the MOC will undergo a large abrupt transition during the 21st century. Longer-term changes in the MOC cannot be assessed with confidence.”

Not much has changed since then. A shutdown of the MOC doesn’t look any more likely now than it did then. See here, here, and here (pp. 316-317).

If one wants to develop rational policies to address speculative catastrophic events that could conceivably occur over the next few centuries or millennia, as a start one should consider the universe of potential catastrophes and then develop criteria as to which should be addressed and which not. Rational analysis must necessarily be based on systematic analysis, and not on cherry picking one’s favorite catastrophes.

Just as one may speculate on global warming-induced catastrophes, one may just as plausibly speculate on catastrophes that may result absent global warming. Consider, for example, the possibility that absent global warming, the Little Ice Age might return. The consequences of another ice age, Little or not, could range from the severely negative to the positive (if it would buffer the negative consequences of warming). That such a recurrence is not unlikely is evident from the fact that the earth entered a Little Ice Age and retreated from it only a century and a half ago, and history may indeed repeat itself over centuries or millennia.

Yet another catastrophe may stem from greenhouse gas controls themselves: CO2 not only contributes to warming, it is also the key building block of life as we know it. All vegetation is created by the photosynthesis of atmospheric CO2. In fact, according to the IPCC WG I report (2007, p. 106), net primary productivity of the global biosphere has increased in recent decades, partly due to greater warming, higher CO2 concentrations and nitrogen deposition. Thus, there is a finite probability that reducing CO2 emissions would reduce the net primary productivity of the terrestrial biosphere, with potentially severe negative consequences for the amount and diversity of wildlife it could support, as well as for agricultural and forest productivity, with adverse knock-on effects on hunger and health.

There is also a finite probability that costs of GHG reductions could reduce economic growth worldwide. Even if only industrialized countries sign up for emission reductions, the negative consequences could show up in developing countries because they derive a substantial share of their income from aid, trade, tourism, and remittances from the rest of the world. See, for example, Tol (2005), which examines this possibility, although the extent to which that study fully considered these factors (i.e., aid, trade, tourism, and remittances) is unclear.

Finally, one of the problems with the argument that society should address low-probability, high-impact events (assuming a probability could be estimated rather than assumed or guessed) is that it necessarily means there is a high probability that resources expended on addressing such catastrophic events will have been squandered. This wouldn’t be a problem but for the opportunity costs involved.

According to the 2007 IPCC Science Assessment’s Summary for Policy Makers (p. 10), “Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations.” In plain language, this means that the IPCC believes there is at least a 90% likelihood that anthropogenic greenhouse gas emissions (AGHG) are responsible for 50-100% of the global warming since 1950. In other words, there is an up to 10% chance that anthropogenic GHGs are not responsible for most of that warming.

This means there is an up to 10% chance that resources expended in limiting climate change would have been squandered. Since any effort to significantly reduce climate change will cost trillions of dollars (see Nordhaus 2008, p. 82), that would be an unqualified disaster, particularly since those very resources could be devoted to reducing urgent problems humanity faces here and now (e.g., hunger, malaria, safer water and sanitation) — problems we know exist for sure, unlike the bogeymen that we can’t be certain about.
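The deduction above is simple complement arithmetic on the IPCC's calibrated likelihood language, in which "very likely" denotes a probability greater than 90%. A trivial sketch (the values are the IPCC AR4 lower bounds for each term):

```python
# IPCC AR4 calibrated likelihood terms (lower-bound probabilities)
likelihood = {
    "virtually certain": 0.99,
    "very likely": 0.90,
    "likely": 0.66,
    "more likely than not": 0.50,
}

# "very likely due to ... greenhouse gas concentrations" implies at
# least a 90% chance, leaving up to a 10% chance of the complement
p = likelihood["very likely"]
print(f"anthropogenic-majority warming: at least {p:.0%}; "
      f"complement: up to {1 - p:.0%}")
```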

Spending money on speculative, even if plausible, catastrophes instead of problems we know exist for sure is like a starving man giving up a fat, juicy bird in hand while hoping that he’ll catch several other birds sometime in the next few centuries, even though we know those birds don’t exist today and may never exist in the future.

Comments on Criticism of Cato Ad

Our friends at www.realclimate.org and www.ryanavent.com have been taking shots at the statements in our ad, so I’d like to offer a little commentary.

We make three factual assertions.

First, we say that “surface temperature changes over the past century have been episodic and modest”. We cite Brohan et al., Journal of Geophysical Research (2006 and updates) and Swanson and Tsonis, Geophysical Research Letters, 2009. The first is the latest update of the East Anglia temperature history, long the IPCC staple. It is the one most cited over the years by the IPCC because it was the first long history that contained much more than simply World Weather Records data updated with local records at the end of a month. At any rate, both it and other global histories indeed show modest warming, about 0.8°C from 1900-2000, and indeed it is episodic. Everyone (well, probably almost everyone… there are some real people who don’t believe it is right) pretty much agrees that there are two periods of warming, 1910-45 and 1977-98, with a slight cooling in between and no trend after. If that’s not “episodic”, I don’t know what is. The Swanson paper in fact specifically quantifies these episodes. The paragraph near the end of it suggesting this may mean that warming will be faster than we thought was pure speculation. It could just as easily have been argued (as I do) that the lack of recent warming more likely indicates, through oceanic feedback, that 21st-century warming will be lower than forecast, because a lack of warming simply delays any water vapor amplification. Pure and simple.

The second assertion is that, “after controlling for population growth and property values, there has been no increase in damages from severe weather events”. The citation is short: a note in the Bulletin of the American Meteorological Society, by Pielke Jr. et al., 2005. The “et al.” numbers over ten other big-name scientists/analysts, and the reference list is the important part. It contains a large number of citations on climate-related damages for various places and/or periods. We couldn’t list them all in this format, so we chose a single citation that could be consulted, through which an interested reader would find all the subsidiary supporting material.

Finally, we state that “the computer models forecasting rapid temperature change abjectly fail to explain recent climate behavior”, citing Douglass et al., International Journal of Climatology, 2007, which showed the major disparity between forecasts of the upper-tropospheric tropical “warm spot”, a hallmark of greenhouse projections, and observations in the radiosonde record. Yes, it is true that Santer et al. have published a lengthy rebuttal, but it is extremely dense and marks just another go-round over this issue. Douglass et al. have a response, but it hasn’t been published yet. The debate will go on and on. Further, it is quite apparent from comparing midrange multimodel estimates from the IPCC to observed temperatures, and those indeed projected for coming years, that a significant disconnect is developing between the models and surface temperature. They simply don’t anticipate multidecadal periods without warming. Oh yes, now that this has happened, all of a sudden models can be forced to “explain” it, but that’s not prospective. Instead, it is retrospective adjustment. Such work wouldn’t be performed if there weren’t something wrong.

That’s more than enough to negate President-elect Obama’s statement that “The science is beyond dispute and the facts are clear”!