
Sen. Rand Paul Proposes Serious Cuts

Freshman Sen. Rand Paul (R-KY) has raised the bar in Washington by releasing a bill that would make substantial, specific, and immediate cuts in federal spending. While policymakers on both sides of the aisle have largely paid lip service to stopping Washington’s record run of fiscal profligacy, Paul’s proposal makes good on his campaign promise to seriously tackle the federal government’s bloated budget.

Paul’s bill would target $500 billion in cuts for fiscal 2011 alone. While audacious by Washington standards, cutting federal spending by that amount would still leave us with a projected $1 trillion deficit this year. Nonetheless, the federal government’s scope would be dramatically curtailed, which would pay dividends in coming years as the economy is unshackled from numerous failed federal interventions.

A description of Paul’s proposed cuts can be viewed here, but some of the bolder ideas merit a comment or two.

First, Paul would eliminate most Department of Education spending, with the exception of higher education subsidies. He correctly notes that the federal government’s increased involvement in education has been “detrimental” and that “the mere existence of the Department of Education is an overreach of power by the federal government.”

Second, the Department of Energy, which is becoming a chief source of corporate welfare, would be zeroed out. Paul would eliminate subsidies for all energy industries – from fossil fuels to so-called “green” energies. He notes that the government’s interference in energy development should be ended and the free market allowed to “start taking the reins.”

Third, the Department of Housing and Urban Development – one of the most visible examples of government failure – would be eliminated. Among the HUD programs that Paul singles out, it is his criticism of housing vouchers that deserves the most applause, as they remain popular in some Republican and conservative quarters.

Paul deserves credit for proposing cuts at the Department of Defense, although the savings would be relatively small. However, his proposal would cut the Department of Homeland Security almost in half, and would zero out billions of dollars in foreign aid. The latter is well-timed given the situation in Egypt, a major recipient of U.S. foreign aid dollars.

Finally, Paul would chop a quarter of the Department of Health and Human Services' budget, although he doesn't take on Medicare or Medicaid. He is reportedly at work on separate legislation that would address Medicare and Social Security. Because Paul's proposal is focused on immediate cuts, his decision to tackle the big mandatory spending programs separately shouldn't be viewed as a cop-out.

Thus far, the spending cut bar in Washington has been set pretty low. Policymakers from both parties and varying ideological backgrounds have been timid in spelling out precisely what they would cut. By getting specific, Paul has raised the bar, which will hopefully put pressure on others – in particular, the congressional Republican leadership – to move beyond a vague, myopic fixation on nondefense discretionary spending.

Why the Neo-Malthusian Worldview Fails the Reality Check

Why does the Neo-Malthusians’ dystopian worldview — that human and environmental well-being will suffer with increases in population, affluence and technological change — fail the reality check? Why has human well-being improved in the Age of Industrialization despite order-of-magnitude increases in the consumption of materials, fossil fuel energy and chemicals?

I offer some reasons in the last of a series of posts (1, 2, 3, 4) at MasterResource.

I note that although population, affluence and technology can create some problems for humanity and the planet, they are also the agents for solving those problems. In particular, human capital and greater affluence have helped the development and adoption of new and improved technologies, which, the empirical data show, have reduced existing risks faster than they have created new ones — hence the continual improvement in human well-being in the era of modern economic growth.

A corollary is that projections of future impacts that span several decades but do not account for technological change as a function of time and affluence will, more likely than not, overestimate those impacts, perhaps by orders of magnitude. In fact, this is one reason why many estimates of the future impacts of climate change are suspect: most do not account for changes in adaptive capacity due either to secular technological change or to increases in economic development.

Yogi Berra is supposed to have said, “It’s tough to make predictions, especially about the future.” Most analysts recognize this. They know that just because one can explain and hindcast the past, it does not guarantee that one can forecast the future. Neo-Malthusians, by contrast, cannot hindcast the past but are confident they can forecast the future.

Finally, had the solutions that Neo-Malthusians espouse been put into effect a couple of centuries ago, most of us alive today would be dead and those who were not would be living poorer, shorter, and unhealthier lives, constantly subject to the vagaries of nature, surviving from harvest to harvest, spending more of our time in darkness because lighting would be a luxury, and spending more of our days in the drudgery of menial tasks because, under their skewed application of the precautionary principle (see here, here and here), fossil fuel consumption would be severely curtailed, if not banned.

Nor would the rest of nature necessarily be better off. First, lower reliance on fossil fuels would mean greater demand for fuelwood, and the forests would be denuded. Second, less fossil fuel also means less fertilizer and fewer pesticides and, therefore, lower agricultural productivity. To compensate for the lost productivity, more habitat would need to be converted to agricultural uses. But habitat conversion (including deforestation) — not climate change — is already the greatest threat to biodiversity!

Read the whole post here.

Atomic Dreams

Last week I was on John Stossel’s (most excellent) new show on Fox Business News to discuss energy policy – in particular, popular myths that Republicans have about energy markets.  One of the topics I touched upon was nuclear power.  My argument was the same that I have offered in print: Nuclear power is a swell technology but, given the high construction costs associated with building nuclear reactors, it’s a technology that cannot compete in free markets without a massive amount of government support.  If one believes in free markets, then one should look askance at such policies. 

As expected, the atomic cult has taken offense. 

Now, it is reasonable to argue that excessive regulatory oversight has driven up the cost of nuclear power and that a “better” regulatory regime would reduce costs.  Perhaps.  But I have yet to see any concrete accounting of exactly which regulations are “bad” along with associated price tags for the same.  If anyone out there in Internet-land has access to a good, credible accounting like that, please, send it my way.  But until I see something tangible, what we have here is assertion masquerading as fact.

Most of those who consider themselves “pro-nuke” are unaware of the fact that the current federal regulatory regime was thoroughly reformed in the late 1990s to comport with the industry’s model of what a “good” federal regulatory regime would look like.  As Oliver Kingsley Jr., the President of Exelon Nuclear, put it in Senate testimony back in 2001:

The current regulatory environment has become more stable, timely, and predictable, and is an important contributor to improved performance of nuclear plants in the United States.  This means that operators can focus more on achieving operational efficiencies and regulators can focus more on issues of safety significance.  It is important to note that safety is being maintained and, in fact enhanced, as these benefits of regulatory reform are being realized.  The Nuclear Regulatory Commission – and this Subcommittee – can claim a number of successes in their efforts to improve the nuclear regulatory environment.  These include successful implementation of the NRC Reactor Oversight Process, the timely extension of operating licenses at Calvert Cliffs and Oconee, the establishment of a one-step licensing process for advanced reactors, the streamlining of the license transfer process, and the increased efficiency in processing licensing actions.

It’s certainly possible that the industry left some desirable reforms undone, but it seems relevant to me that the Nuclear Energy Institute – the trade association for the nuclear energy industry and a fervent supporter of all these government assistance programs – does not complain that it is being unfairly hammered by costly red tape.

For the most part, however, the push-back against the arguments I offered last week has little to do with this. It has to do with bias. According to a post by Rod Adams over at “Atomic Insights Blog,” I am guilty of ignoring the subsidies doled out to nuclear’s biggest competitor – natural gas – and, because Cato gets money from Koch Industries, my convenient neglect of that matter is clearly part of a corporate-funded attack on nuclear power. Indeed, Mr. Adams claims that he has unearthed a “smoking gun” with this observation.

Normally, I would ignore attacks like this.  This particular post, however, offers the proverbial “teachable moment” that should not be allowed to go to waste.

First, let’s look at the substance of the argument. Did I “give natural gas a pass,” as Mr. Adams contends? Well, yes and no; the show was about the cost of nuclear power, not the cost of natural gas. I did note that natural gas-fired electricity was more attractive in this economic environment than nuclear power, something that happens to be true. Had John Stossel asked me whether gas’ economic advantage was due to subsidy, I would have told him that I am against natural gas subsidies as well – a position I have staked out time and time again in other venues (while there are plenty of examples, this piece I co-authored with Daniel Becker – then of the Sierra Club – for The Los Angeles Times represents my thinking on energy subsidies across the board. A blog post a while back about the Democratic assault on oil and gas subsidies found me arguing that the D’s should actually go further! Dozens of other similar arguments against fossil fuel subsidies can be found on my publications page). So let’s dispose of Mr. Adams’ implicit suggestion that I am some sort of tool for the oil and gas industry, arguing against subsidies here but not against subsidies there.

Second, let’s consider the implicit assertion that Mr. Adams makes – that natural gas-fired electricity is more attractive than nuclear power primarily because of subsidy. The most recent and thorough assessment of this matter comes from Prof. Gilbert Metcalf, an economist at Tufts University. Prof. Metcalf agrees with a 2004 report from the Energy Information Administration which contended that preferences for natural gas production in the tax code do little to increase natural gas production and thus do little to make natural gas less expensive than it might otherwise be. They are wealth transfers to be sure, but they do not do much to change natural gas supply or demand curves and thus do not affect consumer prices. Prof. Metcalf argues that if we had a truly level regulatory playing field without any tax distortions, the price of natural gas-fired electricity would actually go down, not up! Government intervention in energy markets does indeed distort gas-fired electricity prices. It makes those prices higher than they otherwise would be!

The Energy Information Administration (EIA) identified five natural gas subsidies in 2007 that were relevant to the electricity sector (table 5).  Only two are of particular consequence.  They are:

  • Expensing of Exploration and Development Costs – Gas producers are allowed to expense exploration and development expenditures rather than capitalize and depreciate those costs over time (a rough present-value sketch of why this timing difference matters follows this list). Oil and gas producers (combined) took advantage of this tax break to the tune of $860 million per year. How much goes to gas production rather than to oil production is unclear.
  • Excess of Percentage over Cost Depletion Deferral – Under cost depletion, producers are allowed to make an annual deduction equal to the non-recovered cost of acquisition and development of the resource times the proportion of the resource removed that year.  Under percentage depletion, producers deduct a percentage of gross income from resource production.  Oil and gas producers (combined) take advantage of this tax break to the tune of $790 million per year.  How much goes to gas production rather than to oil production is unclear. 
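
To see why the timing of those deductions matters, here is a minimal sketch, in Python, of the present-value difference between expensing a cost immediately and capitalizing it over several years. The outlay, tax rate, discount rate, and seven-year straight-line schedule are all hypothetical placeholders, not figures from the EIA tables or the tax code.

```python
# Minimal sketch: why immediate expensing is worth more to a producer than
# capitalizing and depreciating the same cost. All figures are hypothetical.

def present_value(cash_flows, discount_rate):
    """Discount a list of annual amounts (year 0 first) back to today."""
    return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))

outlay = 1_000_000   # hypothetical exploration and development spending
tax_rate = 0.35      # hypothetical corporate income tax rate
r = 0.08             # hypothetical discount rate
years = 7            # hypothetical straight-line recovery period

# Immediate expensing: the entire deduction is taken in year 0.
expensing_pv = present_value([outlay * tax_rate], r)

# Capitalization: the same total deduction is spread evenly over `years`.
depreciation_pv = present_value([outlay / years * tax_rate] * years, r)

print(f"PV of tax savings, expensing:    ${expensing_pv:,.0f}")
print(f"PV of tax savings, depreciating: ${depreciation_pv:,.0f}")
# The gap between the two present values is the rough size of the preference:
# it transfers wealth to producers even if it barely moves supply or prices.
```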

Even if we put aside the fact that these subsidies don’t affect final consumer prices in any significant manner, it’s useful to keep in mind that the subsidy per unit of gas-fired electricity production – as calculated by EIA – works out to 25 cents per megawatt hour (table 35). Subsidy per unit of nuclear-fired electricity production works out to $1.59 per megawatt hour. Hence, the argument that nuclear subsidies are relatively small in comparison with natural gas subsidies is simply incorrect.
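
For readers who want to see how a per-unit figure like that is constructed, here is a rough sketch. The dollar and generation amounts are placeholders chosen only to reproduce the 25-cent and $1.59 orders of magnitude, not EIA data; the EIA tables cited above are the authoritative source.

```python
# Rough sketch of the subsidy-intensity metric: total subsidy dollars divided
# by total generation. The inputs below are placeholders, not EIA figures.

def subsidy_per_mwh(total_subsidy_dollars, generation_mwh):
    """Subsidy intensity in dollars per megawatt-hour of generation."""
    return total_subsidy_dollars / generation_mwh

# Hypothetical fuel A: $200 million in subsidies over 800 million MWh of output.
print(subsidy_per_mwh(200e6, 800e6))   # 0.25 -> 25 cents per MWh
# Hypothetical fuel B: a comparable dollar amount over far less generation.
print(subsidy_per_mwh(200e6, 125e6))   # 1.6  -> $1.60 per MWh
# The point of the comparison in the text: what matters is subsidy per unit
# of output, not the headline dollar total.
```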

Some would argue that the Foreign Tax Credit – a generally applicable credit available to corporations doing business overseas that allows firms to treat royalty payments to foreign governments as taxes that can be credited against domestic corporate income tax liability – should likewise be on the subsidy list. The Environmental Law Institute calculates that this credit saves the fossil fuel industry an additional $15.3 billion. There is room for debate about the wisdom of that credit, but regardless, it doesn’t appear that the Foreign Tax Credit affects domestic U.S. prices for gas-fired electricity.

The bigger point is that without government help, few doubt that the natural gas industry would still be humming and electricity would still be produced in large quantities from gas-fired generators. But without government production subsidies, without loan guarantees, and without liability protection via the Price-Anderson Act, even the nuclear power industry concedes that it would disappear.

Now, to be fair, Prof. Metcalf reports that nuclear power is cheaper than gas-fired power under both current law and under a no-subsidy, no-tax regime. His calculations, however, were made at a time when natural gas prices were at near-historic highs that were thought to be the new norm in energy markets, and they were governed by fairly optimistic assumptions about nuclear power plant construction costs. Those assumptions have not held up well with time. For a more recent assessment, see my review of this issue in Reason, along with this study from MIT, which warns that if more government help isn’t forthcoming, “nuclear power will diminish as a practical and timely option for deployment at a scale that would constitute a material contribution to climate change risk mitigation.”

Third, Mr. Adams argues that the federal nuclear loan guarantee program is a self-evidently good deal and implies that only an anti-industry agitprop specialist (like me) could possibly refuse to see that. “That program, with its carefully designed and implemented due diligence requirements for project viability, should actually produce revenue for the government.” Funny, but when private investors perform those due diligence exercises, they come to a very different conclusion … which is why we have a federal loan guarantee program in the first place.

Who do you trust to watch over your money – investment bankers or Uncle Sam? The former don’t have the best track record in the world these days, but note that the popular indictment of that crowd is that investment banks weren’t tight-fisted enough when it came to lending. If even these guys were saying no to nuclear power – and at a time when money was flowing free and easy – what makes Mr. Adams think that a bunch of politicians are right about the glorious promise of nuclear power, particularly given the “too cheap to meter” rhetoric we’ve heard from the political world now for the better part of five decades?

Anyway, for what it’s worth, the Congressional Budget Office has taken a close look at this alleged bonanza for the taxpayer and judged the risk of default on these loan guarantees to be around 50 percent. They may be wrong, of course, but the risks are there, something Moody’s acknowledged last year in a published analysis warning that it was likely to downgrade the creditworthiness of nuclear power plant construction loans.

Fourth and finally, Mr. Adams cites Cato’s skepticism about “end-is-near” climate alarmism as yet more evidence that we are on the take from the fossil fuels industry. I don’t know if Mr. Adams has been following current events lately, but I would think that we’re looking pretty good right now on that front. Der Spiegel – no hotbed of “Big Oil” agitprop – sums up the state of the debate rather nicely in the wake of the ongoing collapse of IPCC credibility. Matt Ridley – another former devotee of climate alarmism – likewise sifts through the rubble that is now the infamous Michael Mann “hockey stick” analysis (which allegedly demonstrated an unprecedented degree of warming in the 20th century) and finds thorough and total rot at the heart of the alarmist argument. Mr. Adams is perhaps unaware that our own Pat Michaels has been making these arguments for years, and Cato has no apologies to make on that score.

Regardless, ad hominem is the sign of a man running out of arguments.  There aren’t many here to rebut, but the form of the complaints offered by Mr. Adams speaks volumes about how little the pro-nuclear camp has to offer right now in defense of nuclear power subsidies.

I have no animus towards nuclear power per se. If nuclear power could compete without government help, I would be as happy as Mr. Adams or the next MIT nuclear engineer. But I am no more “pro” nuclear power than I am “pro” any power. It is not for me to pick winners in the marketplace. That’s the invisible hand’s job. If there is bad regulation out there harming the industry, then by all means, let’s see a list of said bad regulations and amend them accordingly. But once those regulations are amended (if there are indeed any that need amending), nuclear power should still be subject to an unbiased market test. Unlike Mr. Adams, I don’t want to see that test rigged.

Cherry Picking Climate Catastrophes: Response to Conor Clarke, Part II

Conor Clarke, at The Atlantic blog, raised several issues with my study, “What to Do About Climate Change,” which Cato published last year.

One of Conor Clarke’s comments was that my analysis did not extend beyond the 21st century. He found this problematic because, as Conor put it, climate change would extend beyond 2100, and even if GDP is higher in 2100 with unfettered global warming than without, it’s not obvious that this GDP would continue to be higher “in the year 2200 or 2300 or 3758”. I addressed this portion of his argument in Part I of my response. Here I will address the second part of this argument, that “the possibility of ‘catastrophic’ climate change events — those with low probability but extremely high cost — becomes real after 2100.”

The examples of potentially catastrophic events that could be caused by anthropogenic greenhouse gas induced global warming (AGW) that have been offered to date (e.g., melting of the Greenland or West Antarctic Ice Sheets, or the shutdown of the thermohaline circulation) contain a few drops of plausibility submerged in oceans of speculation. There are no scientifically justified estimates of the probability of their occurrence by any given date. Nor are there scientifically justified estimates of the magnitude of damages such events might cause, not just in biophysical terms but also in socioeconomic terms. Therefore, to call these events “low probability” — as Mr. Clarke does — is a misnomer. They are more appropriately termed as plausible but highly speculative events.

Consider, for example, the potential collapse of the Greenland Ice Sheet (GIS). According to the IPCC’s WG I Summary for Policy Makers (p. 17), “If a negative surface mass balance were sustained for millennia, that would lead to virtually complete elimination of the Greenland Ice Sheet and a resulting contribution to sea level rise of about 7 m” (emphasis added). Presumably the same applies to the West Antarctic Ice Sheet.

But what is the probability that a negative surface mass balance can, in fact, be sustained for millennia, particularly after considering the amount of fossil fuels that can be economically extracted and the likelihood that other energy sources will not displace fossil fuels in the interim? [Remember we are told that peak oil is nigh, that renewables are almost competitive with fossil fuels, and that wind, solar and biofuels will soon pay for themselves.]

Second, for an event to be classified as a catastrophe, it should occur relatively quickly, precluding efforts by man or nature to adapt or otherwise deal with it. But if it occurs over millennia, as the IPCC says, or even centuries, that gives humanity ample time to adjust, albeit at a socioeconomic cost. Even then, it need not be prohibitively dangerous to life, limb or property if: (1) the total amount of sea level rise (SLR) and, perhaps more importantly, the rate of SLR can be predicted with some confidence, as seems likely in the next few decades considering the resources being expended on such research; (2) the rate of SLR is slow relative to how fast populations can strengthen coastal defenses and/or relocate; and (3) there are no insurmountable barriers to migration.

This would be true even if the so-called “tipping point” had already been passed and the ultimate disintegration of the ice sheet were inevitable, so long as it takes millennia for the disintegration to be realized. In other words, the issue isn’t just whether the tipping point is reached; rather, it is how long the tipping actually takes. Consider, for example, a hand grenade tossed into a crowded room. Whether this results in tragedy — and the magnitude of that tragedy — depends upon how much time it takes for the grenade to go off, the reaction time of the occupants, and their ability to respond.

Lowe et al. (2006, pp. 32-33), based on a “pessimistic, but plausible, scenario in which atmospheric carbon dioxide concentrations were stabilised at four times pre-industrial levels,” estimated that a collapse of the Greenland Ice Sheet would, over the next 1,000 years, raise sea level by 2.3 meters (with a peak rate of 0.5 cm/yr). If one were to arbitrarily double that to account for potential melting of the West Antarctic Ice Sheet, that implies a SLR of ~5 meters in 1,000 years, with a peak rate (assuming the peaks coincide) of 1 meter per century.
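
As a quick arithmetic check on the doubling above, here is a short sketch using only the Lowe et al. figures quoted in the text and the admittedly arbitrary factor of two for the West Antarctic Ice Sheet.

```python
# Back-of-the-envelope check of the doubled Lowe et al. (2006) figures quoted
# above: 2.3 m of sea level rise over 1,000 years and a 0.5 cm/yr peak rate
# for Greenland, arbitrarily doubled to allow for the West Antarctic Ice Sheet.

greenland_total_m = 2.3           # metres of SLR over 1,000 years
greenland_peak_cm_per_yr = 0.5    # peak rate of SLR

combined_total_m = 2 * greenland_total_m                 # ~4.6 m, i.e. roughly 5 m per millennium
combined_peak_cm_per_yr = 2 * greenland_peak_cm_per_yr   # 1 cm per year
combined_peak_m_per_century = combined_peak_cm_per_yr / 100 * 100  # cm -> m, then per year -> per century

print(combined_total_m)             # 4.6
print(combined_peak_m_per_century)  # 1.0 metre per century, assuming the peaks coincide
```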

Such a rise would not be unprecedented. Sea level has risen 120 meters in the past 18,000 years — an average of 0.67 meters per century — and as much as 4 meters per century during the meltwater pulse 1A episode 14,600 years ago (Weaver et al. 2003; subscription required). Neither humanity nor, from the perspective of millennial time scales (per the above quote from the IPCC), the rest of nature seems the worse for it. Coral reefs, for example, evolved and their compositions changed over millennia as new reefs grew while older ones were submerged in deeper water (e.g., Cabioch et al. 2008). So while there have been ecological changes, it is unknown whether the changes were for better or worse. For a melting of the GIS (or WAIS) to qualify as a catastrophe, one has to show, rather than assume, that the ecological consequences would, in fact, be for the worse.

Human beings can certainly cope with sea level rise of such magnitudes if they have centuries or millennia to do so. In fact, if necessary they could probably get out of the way in a matter of decades, if not years.

Can a relocation of such a magnitude be accomplished?

Consider that the global population increased from 2.5 billion in 1950 to 6.8 billion this year. Among other things, this meant creating the infrastructure for an extra 4.3 billion people in the intervening 59 years (as well as improving the infrastructure for the 2.5 billion counted in the baseline, many of whom barely had any infrastructure whatsoever in 1950). These improvements occurred at a time when everyone was significantly poorer. (Global per capita income is more than 3.5 times greater today than it was in 1950.) Therefore, while relocation will be costly, in theory, tomorrow’s much wealthier world ought to be able to relocate billions of people to higher ground over the next few centuries, if need be. In fact, once a decision is made to relocate, the cost differential of relocating, say, 10 meters higher rather than a meter higher is probably marginal. It should also be noted that over millennia the world’s infrastructure will have to be renewed or replaced dozens of times – and the world will be better for it. [For example, the ancient city of Troy, once on the coast but now a few kilometers inland, was built and rebuilt at least 9 times in 3 millennia.]

Also, so long as we are concerned about potential geological catastrophes whose probability of occurrence and impacts have yet to be scientifically estimated, we should also consider equally low or higher probability events that might negate their impacts. Specifically, it is quite possible — in fact probable — that somewhere between now and 2100 or 2200, technologies will become available that will deal with climate change much more economically than currently available technologies for reducing GHG emissions. Such technologies may include ocean fertilization, carbon sequestration, geo-engineering options (e.g., deploying mirrors in space) or more efficient solar or photovoltaic technologies. Similarly, there is a finite, non-zero probability that new and improved adaptation technologies will become available that will substantially reduce the net adverse impacts of climate change.

The historical record shows that this has occurred over the past century for virtually every climate-sensitive sector that has been studied. For example, from 1900 to 1970, U.S. death rates due to various climate-sensitive water-related diseases — dysentery, typhoid, paratyphoid, other gastrointestinal disease, and malaria — declined by 99.6 to 100.0 percent. Similarly, poor agricultural productivity exacerbated by drought contributed to famines in India and China off and on through the 19th and 20th centuries, killing millions of people, but such famines haven’t recurred since the 1970s despite whatever climate change has occurred and the fact that populations are several-fold higher today. And by the early 2000s, deaths and death rates due to extreme weather events had dropped worldwide by over 95% from their earlier 20th-century peaks (Goklany 2006).

With respect to another global warming bogeyman — the shutdown of the thermohaline circulation (AKA the meridional overturning circulation), the basis for the deep freeze depicted in the movie, The Day After Tomorrow — the IPCC WG I SPM notes (p. 16), “Based on current model simulations, it is very likely that the meridional overturning circulation (MOC) of the Atlantic Ocean will slow down during the 21st century. The multi-model average reduction by 2100 is 25% (range from zero to about 50%) for SRES emission scenario A1B. Temperatures in the Atlantic region are projected to increase despite such changes due to the much larger warming associated with projected increases in greenhouse gases. It is very unlikely that the MOC will undergo a large abrupt transition during the 21st century. Longer-term changes in the MOC cannot be assessed with confidence.”

Not much has changed since then. A shutdown of the MOC doesn’t look any more likely now than it did then. See here, here, and here (pp. 316-317).

If one wants to develop rational policies to address speculative catastrophic events that could conceivably occur over the next few centuries or millennia, as a start one should consider the universe of potential catastrophes and then develop criteria as to which should be addressed and which not. Rational analysis must necessarily be based on systematic analysis, and not on cherry picking one’s favorite catastrophes.

Just as one may speculate on global warming induced catastrophes, one may just as plausibly speculate on catastrophes that may result absent global warming. Consider, for example, the possibility that absent global warming, the Little Ice Age might return. The consequences of another ice age, Little or not, could range from the severely negative to the positive (if it would buffer the negative consequences of warming). That such a recurrence is not unlikely is evident from the fact that the earth entered a Little Ice Age and retreated from it only a century and a half ago, and that history may well repeat itself over centuries or millennia.

Yet another catastrophe that greenhouse gas controls may cause stems from the fact that CO2 not only contributes to warming but is also the key building block of life as we know it. All vegetation is created through the photosynthesis of atmospheric CO2. In fact, according to the IPCC WG I report (2007, p. 106), net primary productivity of the global biosphere has increased in recent decades, partly due to greater warming, higher CO2 concentrations and nitrogen deposition. Thus, there is a finite probability that reducing CO2 emissions would reduce the net primary productivity of the terrestrial biosphere, with potentially severe negative consequences for the amount and diversity of wildlife it could support, as well as for agricultural and forest productivity, with adverse knock-on effects on hunger and health.

There is also a finite probability that costs of GHG reductions could reduce economic growth worldwide. Even if only industrialized countries sign up for emission reductions, the negative consequences could show up in developing countries because they derive a substantial share of their income from aid, trade, tourism, and remittances from the rest of the world. See, for example, Tol (2005), which examines this possibility, although the extent to which that study fully considered these factors (i.e., aid, trade, tourism, and remittances) is unclear.

Finally, one of the problems with the argument that society should address low-probability, high-impact events (assuming a probability could be estimated rather than assumed or guessed) is that it necessarily means there is a high probability that resources expended on addressing such catastrophic events will have been squandered. This wouldn’t be a problem but for the fact that there are opportunity costs associated with such spending.

According to the 2007 IPCC Science Assessment’s Summary for Policy Makers (p. 10), “Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations.” In plain language, this means that the IPCC believes there is at least a 90% likelihood that anthropogenic greenhouse gas emissions (AGHG) are responsible for 50-100% of the global warming since 1950. In other words, there is an up to 10% chance that anthropogenic GHGs are not responsible for most of that warming.

This means there is an up to 10% chance that resources expended in limiting climate change would have been squandered. Since any effort to significantly reduce climate change will cost trillions of dollars (see Nordhaus 2008, p. 82), that would be an unqualified disaster, particularly since those very resources could be devoted to reducing urgent problems humanity faces here and now (e.g., hunger, malaria, safer water and sanitation) — problems we know exist for sure, unlike the bogeymen that we can’t be certain about.
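
To make the opportunity-cost point concrete, here is a hedged back-of-the-envelope sketch. The 10 percent figure is the upper bound implied by the IPCC’s “very likely” wording discussed above; the multi-trillion-dollar cost is a placeholder standing in for estimates such as Nordhaus’s, not a figure of my own.

```python
# Hedged illustration of the opportunity-cost argument above. The cost figure
# is a placeholder for "trillions of dollars"; the probability is the upper
# bound implied by the IPCC's "very likely" (>= 90%) attribution statement.

p_attribution_wrong = 0.10        # up to 10% chance AGHGs did not cause most post-1950 warming
mitigation_cost_trillions = 2.0   # hypothetical present-value cost of deep GHG cuts

expected_misdirected = p_attribution_wrong * mitigation_cost_trillions
print(f"Expected resources misdirected: ${expected_misdirected:.1f} trillion")
# Even a modest probability multiplied by a multi-trillion-dollar outlay is a
# large expected opportunity cost relative to spending the same resources on
# problems known with certainty (hunger, malaria, water and sanitation).
```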

Spending money on speculative, even if plausible, catastrophes instead of problems we know exist for sure is like a starving man giving up a fat, juicy bird in hand in the hope of catching several other birds sometime in the next few centuries, even though those birds don’t exist today and may never exist in the future.

Global Taxes and More Foreign Aid

The U.K.-based Guardian reports that the United Nations and other international bureaucracies dealing with so-called climate change are scheming to impose global taxes. That’s not too surprising, but it is discouraging to read that the Obama Administration appears to be acquiescing to these attacks on U.S. fiscal sovereignty. The Administration also has indicated it wants to squander an additional $400 billion on foreign aid, adding injury to injury:

…rich countries will be asked to accept a compulsory levy on international flight tickets and shipping fuel to raise billions of dollars to help the world’s poorest countries adapt to combat climate change. The suggestions come at the start of the second week in the latest round of UN climate talks in Bonn, where 192 countries are starting to negotiate a global agreement to limit and then reduce greenhouse gas emissions. The issue of funding for adaptation is critical to success but the hardest to agree. …It has been proposed by the world’s 50 least developed countries. It could be matched by a compulsory surcharge on all international shipping fuel, said Connie Hedegaard, the Danish environment and energy minister who will host the final UN climate summit in December. …In Bonn last week, a separate Mexican proposal to raise billions of dollars was gaining ground. The idea, known as the “green fund” plan, would oblige all countries to pay amounts according to a formula reflecting the size of their economy, their greenhouse gas emissions and the country’s population. That could ensure that rich countries, which have the longest history of using fossil fuels, pay the most to the fund. Recently, the proposal won praise from 17 major-economy countries meeting in Paris as a possible mechanism to help finance a UN pact. The US special envoy for climate change, Todd Stern, called it “highly constructive”. …Last week, a US negotiator, Jonathan Pershing, said that the US had budgeted $400m to help poor countries adapt to climate change as an interim measure. But that amount was dismissed as inadequate by Bernarditas Muller of the Philippines, who is the co-ordinator of the G77 and China group of countries.