Topic: Energy and Environment

The Current Wisdom: Better Model, Less Warming

The Current Wisdom is a series of monthly posts in which Senior Fellow Patrick J. Michaels reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

The Current Wisdom only comments on science appearing in the refereed, peer-reviewed literature, or that has been peer-screened prior to presentation at a scientific congress.


Better Model, Less Warming

Bet you haven’t seen this one on TV:  A newer, more sophisticated climate model has lost more than 25% of its predicted warming!  You can bet that if it had predicted that much more warming it would have made the local paper.

The change resulted from a more realistic simulation of the way clouds work, resulting in a major reduction in the model’s “climate sensitivity,” which is the amount of warming predicted for a doubling of  the concentration of atmospheric carbon dioxide over what it was prior to the industrial revolution.

Prior to the modern era, atmospheric carbon dioxide concentrations, as measured in air trapped in ice in the high latitudes (which can be dated year-by-year), were pretty constant at around 280 parts per million (ppm). No wonder CO2 is called a “trace gas”—there really is not much of it around.

The current concentration is pushing about 390 ppm, an increase of about 40% in 250 years.  This is a pretty good indicator of the amount of “forcing” or warming pressure that we are exerting on the atmosphere.  Yes, there are other global warming gases going up, like the chlorofluorocarbons (refrigerants now banned by treaty), but the modern climate religion is that these are pretty much being cancelled by reflective  “aerosol” compounds that go in the air along with the combustion of fossil fuels, mainly coal.
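The arithmetic behind these figures is easy to check. A minimal sketch — note that the 5.35 W/m² logarithmic forcing coefficient is the standard Myhre et al. (1998) approximation, an outside assumption rather than something from this article:

```python
import math

PREINDUSTRIAL_PPM = 280.0  # pre-industrial CO2 concentration (from the text)
CURRENT_PPM = 390.0        # concentration at the time of writing (from the text)

# Percent increase since the industrial revolution.
increase = (CURRENT_PPM - PREINDUSTRIAL_PPM) / PREINDUSTRIAL_PPM
print(f"CO2 increase: {increase:.0%}")  # roughly 40%, as the text says

# CO2 forcing grows with the logarithm of concentration (Myhre et al. 1998
# approximation: dF = 5.35 * ln(C/C0) W/m^2), so today's forcing is a larger
# share of a full doubling than the ~40% concentration rise might suggest.
forcing_now = 5.35 * math.log(CURRENT_PPM / PREINDUSTRIAL_PPM)
forcing_doubling = 5.35 * math.log(2.0)
print(f"Forcing so far: {forcing_now:.2f} W/m^2")
print(f"Fraction of doubling forcing: {forcing_now / forcing_doubling:.0%}")
```

Because of the logarithm, the 40% rise in concentration already delivers nearly half of the forcing associated with a full doubling.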

Most projections have carbon dioxide doubling to a nominal 600 ppm somewhere in the second half of this century, absent major technological changes (which history tells us is a very shaky assumption).  But the “sensitivity” is not reached as soon as we hit the doubling, because it takes a lot of time to warm the ocean (just as it takes a lot of time to warm up a big pot of water with a small burner).

So the “sensitivity” is much closer to the temperature rise that a model projects about 100 years from now – assuming (again, shakily) that we ultimately switch to power sources that don’t release dreaded CO2 into the atmosphere somewhere around the time its concentration doubles.
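That ocean-driven lag can be illustrated with a toy one-box energy-balance model. This is a sketch only — not how MIROC or any IPCC model actually works — and the 3.7 W/m² doubling forcing and the 500 m effective ocean depth are illustrative assumptions chosen to show the effect:

```python
import math

SENSITIVITY_C = 2.6    # equilibrium warming per CO2 doubling (MIROC5's value)
F_DOUBLING = 3.7       # W/m^2 forcing for doubled CO2 (common approximation)
OCEAN_DEPTH_M = 500.0  # effective ocean depth; an illustrative assumption
HEAT_CAPACITY = 1000.0 * 4186.0 * OCEAN_DEPTH_M  # J per m^2 per K of seawater

feedback = F_DOUBLING / SENSITIVITY_C  # W/m^2 per K of warming
tau_years = HEAT_CAPACITY / feedback / (365.25 * 24 * 3600)  # e-folding time

# Response to a sudden, sustained doubling: T(t) = S * (1 - exp(-t / tau))
for years in (10, 50, 100, 200):
    warming = SENSITIVITY_C * (1 - math.exp(-years / tau_years))
    print(f"after {years:3d} yr: {warming:.2f} C "
          f"({warming / SENSITIVITY_C:.0%} of equilibrium)")
```

Under these assumptions the e-folding time works out to several decades, so the realized warming is still short of the full sensitivity at the 100-year mark — which is the point the paragraph above is making.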

The bottom line is that lower sensitivity means less future warming as a result of anthropogenic greenhouse gas emissions. So our advice: keep on working on the models; eventually, they may actually arrive at something close to the puny rate of warming that is being observed.

At any rate, improvements to the Japanese-developed Model for Interdisciplinary Research on Climate (MIROC) are the topic of a new paper by Masahiro Watanabe and colleagues in the current issue of the Journal of Climate. This modeling group has been working on a new version of their model (MIROC5) to be used in the upcoming 5th Assessment Report of the United Nations’ Intergovernmental Panel on Climate Change, due in late 2013. Two incarnations of the previous version (MIROC3.2) were included in the IPCC’s 4th Assessment Report (2007) and contribute to the IPCC “consensus” of global warming projections.

The high resolution version (MIROC3.2(hires)) was quite a doozy – responsible for far and away the greatest projected global temperature rise (see Figure 1). And the medium resolution model (MIROC3.2(medres)) is among the Top 5 warmest models. Together, the two MIROC models undoubtedly act to increase the overall model ensemble mean warming projection and expand the top end of the “likely” range of temperature rise.

FIGURE 1

Global temperature projections under the “midrange” scenario for greenhouse-gas emissions produced by the IPCC’s collection of climate models.  The MIROC high resolution model (MIROC3.2(hires)) is clearly the hottest one, and the medium range one isn’t very far behind.

The reason that the MIROC3.2 versions produce so much warming is that their  sensitivity is very high, with the high-resolution  at 4.3°C (7.7°F) and the medium-resolution  at  4.0°C (7.2°F).  These sensitivities are very near the high end of the distribution of climate sensitivities from the IPCC’s collection of models (see Figure 2).

FIGURE 2

Equilibrium climate sensitivities of the models used in the IPCC AR4 (with the exception of the MIROC5). The MIROC3.2 sensitivities are highlighted in red and lie near the upper end of the collection of model sensitivities.  The new, improved MIROC5, which was not included in the IPCC AR4, is highlighted in magenta and lies near the low end of the model climate sensitivities (data from IPCC Fourth Assessment Report, Table 8.2, and Watanabe et al., 2010).

Note that the highest sensitivity is not necessarily in the hottest model, as observed warming is dependent upon how the model deals with the slowness of the oceans to warm.

The situation is vastly different in the new MIROC5 model.  Watanabe et al. report that the climate sensitivity is now 2.6°C (4.7°F) – more than 25% less than in the previous version of the model.[1] If the MIROC5 had been included in the IPCC’s AR4 collection of models, its climate sensitivity of 2.6°C would have been found near the low end of the distribution (see Figure 2), rather than pushing the high extreme as MIROC3.2 did.
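A quick check of the percentages, using both the sensitivity the IPCC AR4 reported for MIROC3.2(medres) and the lower value given by Watanabe et al. (see footnote 1):

```python
MIROC5 = 2.6          # new sensitivity, degrees C
MIROC32_IPCC = 4.0    # medres value as reported in the IPCC AR4
MIROC32_PAPER = 3.6   # medres value as reported by Watanabe et al.

for label, old in (("vs. IPCC AR4 value", MIROC32_IPCC),
                   ("vs. Watanabe et al. value", MIROC32_PAPER)):
    reduction = (old - MIROC5) / old
    print(f"{label}: {reduction:.0%} reduction")
```

Either way the drop exceeds 25%, which is why the footnote calls that figure a conservative estimate.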

And to what do we owe this large decline in the modeled climate sensitivity?  According to Watanabe et al., a vastly improved handling of cloud processes involving “a prognostic treatment for the cloud water and ice mixing ratio, as well as the cloud fraction, considering both warm and cold rain processes.”  In fact, the improved cloud scheme—which produces clouds which compare more favorably with satellite observations—projects that under a warming climate low altitude clouds become a negative feedback rather than acting as positive feedback as the old version of the model projected.[2] Instead of enhancing the CO2-induced warming, low clouds are now projected to retard it.

Here is how Watanabe et al. describe their results:

A new version of the global climate model MIROC was developed for better simulation of the mean climate, variability, and climate change due to anthropogenic radiative forcing….

MIROC5 reveals an equilibrium climate sensitivity of 2.6K, which is 1K lower than that in MIROC3.2(medres)…. This is probably because in the two versions, the response of low clouds to an increasing concentration of CO2 is opposite; that is, low clouds decrease (increase) at low latitudes in MIROC3.2(medres) (MIROC5).[3]

Is the new MIROC model perfect? Certainly not.  But is it better than the old one? It seems quite likely.  And the net result of the model improvements is that the climate sensitivity and therefore the warming projections (and resultant impacts) have been significantly lowered. And much of this lowering comes as the handling of cloud processes—still among the most uncertain of climate processes—is improved upon. No doubt such improvements will continue into the future as both our scientific understanding and our computational abilities increase.

Will this lead to an even greater reduction in climate sensitivity and projected temperature rise?  There are many folks out there (including this author) who believe this is a very distinct possibility, given that observed warming in recent decades is clearly beneath the average predicted by climate models. Stay tuned!

References:

Intergovernmental Panel on Climate Change, 2007.  Fourth Assessment Report, Working Group 1 report, available at http://www.ipcc.ch.

Watanabe, M., et al., 2010. Improved climate simulation by MIROC5: Mean states, variability, and climate sensitivity. Journal of Climate, 23, 6312-6335.


[1] Watanabe et al. report that the sensitivity of MIROC3.2(medres) is 3.6°C (6.5°F), which is less than what was reported in the 2007 IPCC report.  So 25% is likely a conservative estimate of the reduction in warming.

[2] Whether enhanced cloudiness enhances or cancels carbon-dioxide warming is one of the core issues in the climate debate, and is clearly not “settled” science.

[3] A temperature change of one kelvin (K) is the same as a change of 1°C, so relative (as opposed to absolute) temperatures can be expressed in either unit interchangeably.

The Current Wisdom


History to Repeat:  Greenland’s Ice to Survive, United Nations to Continue Holiday Party

This year’s installment of the United Nations’ annual climate summit (technically known as the 16th meeting of the Conference of the Parties to the Framework Convention on Climate Change) has come and gone in Cancun. Nothing substantial came of it policy-wise; just the usual attempts by the developing world to shake down our already shaky economy in the name of climate change.   News-wise, probably the biggest story was that during the conference, Cancun broke an all-time daily low-temperature record.  Last year’s confab in Copenhagen was pelted by snowstorms and subsumed in miserable cold.  President Obama attended, failed to forge any meaningful agreement, and fled back to beat a rare Washington blizzard. He lost.

But surely as every holiday season now includes one of these enormous jamborees, dire climate stories appeared daily.  Polar bear cubs are endangered!  Glaciers are melting!!

Or so beat the largely overhyped drums, based upon this or that press release from Greenpeace or the World Wildlife Fund.

And, of course, no one bothered to mention a blockbuster paper appearing in Nature the day before the end of the Cancun confab, which reassures us that Greenland’s ice cap and glaciers are a lot more stable than alarmists would have us believe.  That would include Al Gore, fond of his lurid maps showing the melting of all of Greenland’s ice submerging Florida.

Ain’t gonna happen.

The disaster scenario goes like this:  Summer temperatures in Greenland are warming, leading to increased melting and the formation of ephemeral lakes on the ice surface.  This water eventually finds a crevasse and then a way down thousands of feet to the bottom of a glacier, where it lubricates the underlying surface, accelerating the seaward march of the ice.  Increase the temperature even more and massive amounts of ice slide into the ocean by the year 2100, catastrophically raising sea levels.

According to Christian Schoof of the University of British Columbia (UBC), “The conventional view has been that meltwater permeates the ice from the surface and pools under the base of the ice sheet….This water then serves as a lubricant between the glacier and the earth underneath it….”

And, according to Schoof, that’s just not the way things work. A UBC press release about his Nature article noted that he found that “a steady meltwater supply from gradual warming may in fact slow down the glacier flow, while sudden water input could cause glaciers to speed up and spread.”

Indeed, Schoof finds that sudden water inputs, such as would occur with heavy rain, are responsible for glacial accelerations, but these last only one or a few days.

The bottom line?  A warming climate has very little to do with accelerating ice flow, but weather events do.

How important is this?  According to University of Leeds Professor Andrew Shepherd, who studies glaciers via satellite, “This study provides an elegant solution to one of the two key ice sheet instability problems” noted by the United Nations in their last (2007) climate compendium.  “It turns out that, contrary to popular belief, Greenland ice sheet flow might not be accelerated by increased melting after all,” he added.

I’m not so sure that those who hold the “popular belief” can explain why Greenland’s ice didn’t melt away thousands of years ago.  Strong evidence indicates that for millennia after the end of the last ice age (approximately 11,000 years ago), the Eurasian Arctic averaged nearly 13°F warmer in July than it is now.

That’s because there are trees buried and preserved in the acidic Siberian tundra, and they can be carbon dated.  Where there is no forest today—because it’s too cold in summer—there were trees, all the way to the Arctic Ocean and even on some of the remote Arctic islands that are bare today. And, back then, thanks to the remnants of continental ice, the Arctic Ocean was smaller and the North American and Eurasian landmasses extended further north.

That work was by Glen MacDonald, from UCLA’s Geography Department. In his landmark 2000 paper in Quaternary Research, he noted that the only way that the Arctic could become so warm is for there to be a massive incursion of warm water from the Atlantic Ocean.  The only “gate” through which that can flow is the Greenland Strait, between Greenland and Scandinavia.

So, Greenland had to have been warmer for several millennia, too.

Now let’s do a little math to see if the “popular belief” about Greenland ever had any basis in reality.

In 2009, the University of Copenhagen’s B. M. Vinther and 13 coauthors published the definitive history of Greenland climate back to the ice age, studying ice cores taken over the entire landmass. An exceedingly conservative interpretation of their results is that Greenland was 1.5°C (2.7°F) warmer for the period from 5,000–9,000 years ago, which is also the warm period in Eurasia that MacDonald detected.  The integrated warming is given by multiplying the time (4,000 years) by the warming (1.5°C), and works out (in Celsius) to 6,000 “degree-years.”

Now let’s assume that our dreaded emissions of carbon dioxide spike the temperature there some 4°C.  Since we cannot burn fossil fuel forever, let’s put this in over 200 years.  That’s a pretty liberal estimate given that the temperature there still hasn’t exceeded values seen before in the 20th century.  Anyway, we get 800 (4 x 200) degree-years.

If the ice didn’t come tumbling off Greenland after 6,000 degree-years, how is it going to do so after only 800?  The integrated warming of Greenland in the post-ice-age warming (referred to as the “climatic optimum” in textbooks published prior to global warming hysteria) is over seven times what humans can accomplish in 200 years.  Why do we even worry about this?
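The degree-years arithmetic in the last few paragraphs, as a back-of-envelope script — the inputs are the article’s own assumptions, not measurements:

```python
# Degree-years comparison from the text (a back-of-envelope check, not a model).
HOLOCENE_WARMING_C = 1.5     # conservative reading of Vinther et al. (2009)
HOLOCENE_DURATION_YR = 4000  # the 5,000-9,000 years-ago warm period
FUTURE_WARMING_C = 4.0       # assumed anthropogenic temperature spike
FUTURE_DURATION_YR = 200     # assumed duration of the fossil-fuel era

holocene = HOLOCENE_WARMING_C * HOLOCENE_DURATION_YR  # 6000 degree-years
future = FUTURE_WARMING_C * FUTURE_DURATION_YR        # 800 degree-years

print(f"Holocene integrated warming:  {holocene:.0f} degree-years")
print(f"Projected integrated warming: {future:.0f} degree-years")
print(f"Ratio: {holocene / future:.1f}x")             # 7.5x
```

The 7.5-to-1 ratio is the "over seven times" figure in the paragraph above.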

So we can all sleep a bit better.  Florida will survive.  And, we can also rest assured that the UN will continue its outrageous holiday parties, accomplishing nothing, but living large.  Next year’s is in Durban, South Africa, yet another remote warm spot hours of Jet-A away.

References:

MacDonald, G. M., et al., 2000.  Holocene treeline history and climatic change across Northern Eurasia.  Quaternary Research 53, 302-311.

Schoof, C., 2010. Ice-sheet acceleration driven by melt supply variability. Nature 468, 803-805.

Vinther, B.M., et al., 2009.  Holocene thinning of the Greenland ice sheet. Nature 461, 385-388.

Bad Advice from Gov. Polar Star

In 2006, Michigan Gov. Jennifer Granholm told citizens, “In five years, you’re going to be blown away by the strength and diversity of Michigan’s transformed economy.” When those words were uttered, Michigan’s unemployment rate was 6.7 percent. It’s now almost 13 percent.

Although Michigan’s economic doldrums can’t entirely be pinned on Granholm, her fiscal policies have not helped, such as her higher taxes on businesses.

The Mackinac Center’s Michael LaFaive explains why Granholm’s grandiose proclamation in 2006 hasn’t panned out:

In this case, Gov. Granholm was promoting her administration and the Legislature’s massive expansion of discriminatory tax breaks and subsidies for a handful of corporations. The purpose and main effect of this policy is to provide “cover” for the refusal of the political class to adopt genuine tax, labor and regulatory reforms, which they shy away from because it would anger and diminish the privileges and rewards of unions and other powerful special interests.

LaFaive’s colleague James Hohman recently pointed out that “Michigan’s economy produced 8 percent less in 2009 than it did in 2000 when adjusted for inflation. The nation rose 15 percent during this period.”

Granholm has written an op-ed in Politico on how federal policymakers can “win the race for jobs.” This would be like Karl Rove penning an op-ed complaining about Obama spending too much. Oh wait, bad example.

Granholm advises federal policymakers to create a “Jobs Race to the Top” modeled after the president’s education Race to the Top, which as Neal McCluskey explains, has not worked as she claims. Granholm’s plan boils down to more federal subsidies to state and local governments and privileged businesses to develop “clean energy” industries.

Typical of the dreamers who believe that the government can effectively direct economic activity, Granholm never considers the costs of government handouts and central planning. A Cato essay on federal energy interventions explains:

The problem is that nobody knows which particular energy sources will make the most sense years and decades down the road. But this level of uncertainty is not unique to the energy industry—every industry faces similar issues of innovation in a rapidly changing world. In most industries, the policy solution is to allow the decentralized market efforts of entrepreneurs and early adopting consumers figure out the best route to the future. Government efforts to push markets in certain directions often end up wasting money, but they can also delay the development of superior alternatives that don’t receive subsidies.

Granholm recently received “Sweden’s Insignia of First Commander, Order of the Polar Star for her work in fostering relations between Michigan and Sweden to promote a clean energy economy” from His Majesty King Carl XVI Gustaf. Unfortunately, her prescription for economic growth would be a royal mistake.

Supreme Court Should Tell Courts to Stay Out of Global Warming Cases

The Supreme Court is finally starting to put some interesting non-First Amendment cases on this term’s docket.

Today, the Court agreed to review American Electric Power Co., Inc. v. Connecticut, in which eight states, some non-profits, and New York City are suing a number of energy companies and utilities for harms they allegedly caused by contributing to global warming.  This is the third major lawsuit to push global warming into the courts (another being Comer v. Murphy Oil USA, in which Cato also filed a brief).  It’s America, after all, where we sue to solve our problems – even, apparently, taking to court the proverbial butterfly that caused a tsunami.

Mind you, you can sue your neighbor for leaking toxic water onto your land. Courts are well positioned to adjudicate such disputes because they involve only two parties and have limited (if any) effects on others. But it is a different case when, using the same legal theory by which Jones sues Smith for his toxic dumping (called “nuisance”), plaintiffs selectively sue a few targeted defendants for a (quite literally) global problem.  As I discussed with reference to a previous such case, global warming is the type of issue that should be decided by the political branches. The Second Circuit ruled, however, that this suit could go forward. (Justice Sotomayor was involved in the case at that stage and so will be recused going forward.)  

The Supreme Court has always recognized that not all problems can or should be solved in the courtroom. Thus, the issue in AEP v. Connecticut – which the Court will now decide – is whether the states meet the legal requirements necessary to have their suit heard in court, what lawyers call “standing.” Historically, issues of policy have been decided by the legislative and executive branches while “cases and controversies” have been decided by courts. Therefore, when litigants have asked courts to determine matters of broad-ranging policy, the Court has often termed the cases “political questions” and dismissed them. The reasoning is that, not only do unelected courts lack the political authority to determine such questions, they also lack any meaningful standards by which the case could be decided (called “justiciability”).

Indeed, even if the plaintiffs can demonstrate causation, it is unconstitutional for courts to make complex policy decisions — and this is true regardless of the science regarding global warming. Just as it’s unconstitutional for a legislature to pass a statute punishing a particular person (bill of attainder), it’s unconstitutional — under the “political question doctrine” — for courts to determine wide-ranging policies in which numerous considerations must be weighed against each other in anything but a bilateral way.  

We pointed out in our brief supporting the defendants’ request for Supreme Court review – and will again in the brief we plan to file at this next stage – that resolving this case while avoiding those comprehensive and far-reaching implications is impossible and that the Constitution prohibits the judicial usurpation of roles assigned to the other, co-equal branches of government.   After all, global warming is a global problem purportedly caused by innumerable actors, ranging from cows to Camrys. This fact not only underscores the political nature of the question, but it has constitutional significance: In order to sue someone, your injury must be “fairly traceable” to the defendant’s actions. Suits based on “butterfly effect” reasoning should not be allowed to move forward.

Perhaps surprisingly, the federal government – which is involved because one of the defendants is the Tennessee Valley Authority – agrees with Cato. The administration aptly played its role in our constitutional system by asserting that global warming policy was a matter for the executive and legislative branches to resolve, not the judiciary.

Hmmm, Cato and Obama on the same side in a global warming dispute… but I still won’t be holding my breath awaiting an invite to the White House Christmas party.

Slow Death for High-Speed Rail

Tea party victories in November likely signal the beginning of the end for President Obama’s ambitious and expensive high-speed rail plans. Republican governors-elect of both Ohio and Wisconsin have vowed to return federal high-speed rail funds that had been granted to those states. The governor-elect of Florida is also a rail skeptic, and more and more obstacles are being thrown in front of California’s rail plans.


Obama Replaces Costly High-Speed Rail Plan With High-Speed Bus Plan

The prospects for high-speed rail are so dire that the Onion recently suggested that President Obama would shift his support to high-speed buses instead. Even the Washington Post has sounded caution about spending much more money on this obsolete form of travel.

The California High Speed Rail Authority, which wants to spend a mere $43 billion on the first leg of a proposed 220-mph rail network, has gained a reputation as a paragon of mismanagement and conflicts of interest. The authority’s chair, Anaheim Mayor Curt Pringle, has accused its staff of incompetence. Reports from the state auditor, the University of California Institute for Transportation Studies, and a committee of transportation professionals have all concluded that the authority’s cost projections are too low and its ridership revenue projections too high.

Nevertheless, in a blatant political move, the Obama administration gave the authority a $900 million grant just a week before the election on the condition that most of the money be spent in the district of a Democratic member of Congress who was fighting a close reelection campaign. The representative, Jim Costa, won reelection by a mere 3,000 votes. The rail authority dutifully decided to start building the rail line in the heart of Costa’s district, from the small town of Corcoran – known mainly as the home of Charles Manson and fellow prisoners – to an even smaller spot named Borden – population zero. This plan was quickly dubbed the train to nowhere and generated opposition not just from Republicans but from Costa’s fellow Democrat, Dennis Cardoza, who represents the congressional district just north of Costa’s.

Although California voters approved $9 billion in bonds for the rail project, the approval was conditional on getting matching funds. So far, the state has received only about $2 billion from the federal government, which means it only has about $4 billion to spend on construction – less than 10 percent of the amount needed to build from Los Angeles to San Francisco. Given the improbability of finding the other 90 percent, and the fact that Republicans in Congress hope to take back some of the money that has already been granted for high-speed rail, the California rail project seems all but dead. The authority’s only hope is to spend enough money building a train to nowhere that politicians will feel compelled to fund the rest.

Meanwhile, Florida was elated when the Obama administration funded half the cost of a 168-mph line running the 80 miles from Tampa to Orlando, with the promise of more funding later. But the state’s enthusiasm was greatly diminished when the administration announced that it expected the states to come up with at least 20 percent matching funds – funds Florida does not have. Even Orlando Congressman John Mica (likely the next chair of the House Transportation and Infrastructure Committee) has backed away from supporting the line. So the state’s new governor might be able to kill the project.

The Ohio and Wisconsin projects aren’t even worthy of being called high-speed rail, as Wisconsin’s average speed was projected to be just 59 mph and Ohio’s an even more lethargic 38.5 mph. Yet the Wisconsin project was going to cost nearly $1 billion, nearly all of which the feds agreed to fund, while Ohio’s would be more than half a billion, about $400 million of which was initially funded by the feds. Secretary of Immobility Transportation Ray LaHood vowed that these lines would be built no matter what the incoming governors said, then said that if they cancelled the projects, he would just give the money to other states. While that seems likely, Congress could override such a transfer.

Meanwhile, in a spectacular display of poor timing, Amtrak announced its own Boston-to-Washington high-speed rail plan just a week before the election. Current Amtrak trains reach top speeds of 130-150 mph but average only 80 mph on this route. For a mere $117 billion, Amtrak proposed to build a brand-new line capable of reaching 220-mph top speeds, meaning average speeds of about 130-140 mph. But Amtrak planners must have forgotten to low-ball their cost estimates, for the proposed cost-per-mile of $274 million was nearly three times the projected cost of the California line and more than 10 times the projected cost of Florida high-speed rail. No doubt Amtrak will shelve its plan in anticipation of a more favorable political environment.
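A consistency check on Amtrak’s quoted figures — the route length below is implied by dividing the article’s totals, not an independently sourced number:

```python
# Back out the implied route length from Amtrak's quoted figures.
AMTRAK_TOTAL_COST = 117e9  # proposed Boston-to-Washington line, dollars
COST_PER_MILE = 274e6      # quoted cost per mile, dollars

implied_miles = AMTRAK_TOTAL_COST / COST_PER_MILE
print(f"Implied route length: {implied_miles:.0f} miles")  # ~427

# Trip times at the quoted average speeds (135 mph is the midpoint of
# the "about 130-140 mph" estimate in the text).
for label, mph in (("current", 80), ("proposed", 135)):
    print(f"{label} Boston-Washington trip: {implied_miles / mph:.1f} hours")
```

The quoted numbers imply a route of roughly 427 miles, and a trip-time improvement from a bit over five hours to a bit over three.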

New transportation technologies are successful when they are faster, more convenient, and less expensive than the technologies they replace. High-speed rail is slower than flying, less convenient than driving, and at least five times more expensive than either one. It is only feasible with heavy taxpayer subsidies and even then it will only serve a tiny portion of the nation’s population.

A few months before the election, LaHood estimated the administration’s high-speed rail construction plans would eventually cost taxpayers $500 billion, and that’s not counting operating subsidies. BNSF CEO Matt Rose thinks the cost will be closer to $1 trillion. If nothing else, the tea parties may be able to take credit for saving taxpayers at least that amount of money.

The Shocking Truth: The Scientific American Poll on Climate Change

November’s Scientific American features a profile of Georgia Tech atmospheric scientist Judith Curry,  who has committed the mortal sin of  reaching out to other scientists who hypothesize that global warming isn’t the disaster it’s been cracked up to be.  I have personal experience with this, as she invited me to give a research seminar in Tech’s prestigious School of Earth and Atmospheric Sciences in 2008.  My lecture summarizing the reasons for doubting the apocalyptic synthesis of climate change was well-received by an overflow crowd.

Written by Michael Lemonick, who hails from the shrill blog Climate Central, the article isn’t devoid of the usual swipes, calling her a “heretic,” which is hardly true at all.  She’s simply another hardworking scientist who lets the data take her wherever it must, even if that leads her to question some of our more alarmist colleagues.

But, as a make-up call for calling attention to Curry, Scientific American has run a poll of its readers on climate change.  Remember that SciAm has been shilling for the climate apocalypse for years, publishing a particularly vicious series of attacks on Danish author Bjorn Lomborg’s The Skeptical Environmentalist.  The magazine also featured NASA’s James Hansen and his outlandish claims on sea-level rise. Hansen has stated, under oath in a deposition, that a twenty-foot rise is quite possible within the next 89 years; oddly, he has failed to note that in 1988 he predicted that the West Side Highway in Manhattan would go permanently under water in twenty years.

SciAm probably expected a lot of people would agree with the key statement in their poll that the United Nations’ Intergovernmental Panel on Climate Change (IPCC) is “an effective group of government representatives and other experts.”

Hardly. As of this morning, only 16% of the 6,655 respondents agreed.  84%—that is not a typo—described the IPCC as “a corrupt organization, prone to groupthink, with a political agenda.”

The poll also asks “What should we do about climate change?” 69% say “nothing, we are powerless to stop it.” When asked about policy options, an astonishingly low 7% support cap-and-trade, which passed the U.S. House of Representatives in June, 2009, and cost approximately two dozen congressmen their seats.

The real killer is the question “What is causing climate change?” For this one, multiple answers are allowed.  26% said greenhouse gases from human activity, 32% solar variation, and 78% “natural processes.” (In reality, all three are causes of climate change.)

And finally, “How much would you be willing to pay to forestall the risk of catastrophic climate change?”  80% of the respondents said “nothing.”

Remember that this comes from what is hardly a random sample.  Scientific American is a reliably statist publication and therefore appeals to a readership that is skewed to the left of the political center.  This poll demonstrates that virtually everyone now acknowledges that the UN has corrupted climate science, that climate change is impossible to stop, and that futile attempts like cap-and-trade do nothing but waste money and burn political capital, things that Cato’s scholars have been saying for years.

VIDEO: Joe Biden’s Weak Case for Government Meddling

Vice President Joe Biden believes that human progress depends almost entirely on government vision and government incentive. Donald J. Boudreaux, Cato Institute adjunct scholar and George Mason University economics professor, details why Biden is wrong both generally and in the specific case he touts:



Produced by Caleb O. Brown. Shot and edited by Evan Banks.