Powerful Evidence for School Choice

The mayor of Stockholm gave some brief remarks at the closing dinner of the Mont Pelerin Society meeting and mentioned that the number of students in private schools had skyrocketed after the implementation of Sweden’s school choice program. Intrigued, I emailed the folks at one of the nation’s research organizations to ask for some details.

The figures are impressive. The share of students attending private high schools jumped from 1.7 percent in 1992 to 19.5 percent in 2008. Not surprisingly, the quality of education is high. Indeed, researchers have looked at the data and concluded, “Our findings support the hypothesis that school results in public schools improve due to competition.”

Why Future Net Negative Impacts of Global Warming Are Overestimated: Response to Conor Clarke, Part IV

This post responds to the last of Conor Clarke’s comments on my study, “What to Do About Global Warming,” published by Cato. This series started with the imaginatively titled Response to Conor Clarke, Part I, and continued with Cherry Picking Climate Catastrophes and Do Industrialized Countries Have a Responsibility for the Well-Being of Developing Nations?

CONOR said:

I think Goklany is a bit picky and choosey with the evidence. … I also like the Goklany paper a lot. [THANK YOU!! I’ll take whatever I get.] But in this case it’s hard to resist. [Emphasis in original.]

To take one example (of several), Goklany’s hunger estimates rely heavily on those published by Global Environmental Change (GEC), which he uses to make the argument that “the world will be better off in 2085 with respect to hunger than it was in 1990 despite any increase in population.” But the GEC produced two estimates of hunger and climate change – one that assumes the benefits of CO2 fertilization and one that does not. Goklany picks the former estimate (I have no idea why), despite the fact the GEC says the effects of climate change “will fall somewhere between” the two. … [I]f you embrace anything other than the most Pollyanish CO2 fertilization estimate – the one that Goklany uses in his Cato paper – we will be living in a world in which climate change puts tens of millions of additional people at risk of starvation by 2085.

My RESPONSE:

First, let me elaborate on my selection of the set of studies that I used in my paper. Essentially, the selected set of studies (published in Global Environmental Change) was the only one that had estimated global impacts using detailed process models in conjunction with the IPCC’s latest scenarios, and it was peer reviewed. Moreover, these studies come with a provenance that people who may be unhappy with my results cannot impugn. [This is important only because many people arguing about global warming seem to be more concerned about who did the study and whether the results bolster their predilections than about how the study was done.] Specifically, virtually all the authors were intimately connected with the IPCC. The senior author of the hunger study was also the co-chairman of the IPCC’s Working Group II, which was responsible for compiling the portion of the IPCC’s latest assessment that dealt with impacts, vulnerability and adaptation. The authors of the water resource and coastal flooding studies were the lead authors of the corresponding chapters in that IPCC report. An earlier version of the same set of impact studies was the basis for the claim by Sir David King, erstwhile science advisor to Her Majesty’s Government, that global warming was a more serious threat than terrorism (see here). The Stern Review also drew quite heavily from these studies (see below).

Let’s now turn to Conor’s comments on the hunger study and why I assumed that the benefits of carbon fertilization would be realized in the future. Indeed, the hunger study (Parry et al.) produced two separate estimates — one assuming that carbon fertilization is a reality, and the other assuming zero carbon fertilization. But the two estimates are not equally likely. First, there are literally hundreds, if not thousands, of experimental studies showing that carbon fertilization is a reality (see also here): higher CO2 not only increases the rate of photosynthesis but also increases the efficiency of water use by plants (i.e., it confers a degree of immunity to drought), among the many other benefits CO2 bestows on plants and other carbon-based life, including all creatures – big and small – in the biosphere that depend directly or indirectly on photosynthesis. The probability that direct CO2 effects on crop growth are zero or negative is virtually non-existent (IPCC, 2001b: 254–256). Second, the positive effect of carbon fertilization was based on the average of experimental studies; it is not an upper-bound estimate. The notion of “zero fertilization,” on the other hand, is an assumption unsupported by the vast majority of empirical data. So averaging results from the two estimates makes no sense and would understate the benefits likely to result from carbon fertilization.

Notably, the Stern Review invoked a study by Long et al. (subscription required) to estimate future levels of hunger based on “zero fertilization,” using precisely the same study (Parry et al.) that I – and Conor, in his comments – used. But Long et al.’s results have been disputed by other scientists (also see here), including some contributors to the IPCC’s assessment. More importantly, Long et al. only suggested that under field conditions, carbon fertilization may be a third to less than half of what is indicated by experiments using growth chambers, not that it would be zero. It also noted that fertilization may be stronger under drought conditions or if sufficient nitrogen is employed. But drought is one of the bogeymen of global warming, and increased use of nitrogen is precisely the kind of adaptation that would become more affordable in the future as countries become wealthier, as they should if the IPCC’s scenarios are to be given any credence. Indeed, that is one of the adaptations allowed in Parry et al. Also, the fact that crop yields are higher in richer countries is partly because they can more easily afford nitrogen fertilizers (see here, p. 78). In fact, China’s nitrogen use per hectare is already among the world’s highest. For all these reasons, even if one accepts the Long et al. study as gospel, it is reasonable to assume that the effect of carbon fertilization will be closer to the “higher” estimate from the Parry et al. study than to the “zero fertilization” case.

But, more importantly, the uncertainties related to the magnitude of the CO2 fertilization effect are most likely swamped by a major source of overestimation of hunger in Parry et al.’s estimates.

Although Parry et al. allows for some secular (time-dependent) increases in agricultural productivity, for increases in crop yield with economic growth due to greater application of fertilizer and irrigation in richer countries, for decreases in hunger due to economic growth, and for some adaptive responses at the farm level to deal with global warming, the study itself acknowledged that these adaptive responses are based on the “current range” of available technologies, not on technologies that would be available in the future or on any technologies developed specifically to cope with the negative impacts of global warming (Parry et al., p. 57). The potential for future technologies to cope with global warming is large, especially if one considers bioengineered crops (see here, chapter 9), which Parry et al. admittedly didn’t consider. Moreover, an examination of the sources cited in Parry et al. indicates that the “current range” of technology is actually based on 1990s or earlier technology. That is, it is not quite current.

The approach used in Parry et al. to estimate the impacts of global warming decades from now is, in essence, tantamount to estimating today’s level of hunger (and agricultural production) based on the technology of 50 years ago. In fact, the major reason why Paul Ehrlich’s Population Bomb turned out to be a dud was that it underestimated or ignored future developments in agricultural technology.

As noted in Part I of this series of responses, ignoring technological change can, over decades, lead to overestimating adverse impacts by orders of magnitude. Notably, due to a combination of technological change and increasing affluence, U.S. death rates from various water-related diseases – dysentery, typhoid, paratyphoid, other gastrointestinal diseases, and malaria – declined by 99%–100% from 1900 to 1970. For the same reasons, during the twentieth century, global death rates from extreme weather events declined by over 95%.

This basic methodological shortcoming, however, is not unique to Parry et al. It is common to ALL global warming impact studies that I have read – and I have read plenty of them.

For all these reasons, the adverse impacts of global warming on hunger (as well as on other aspects of human well-being, e.g., due to malaria and coastal flooding) that I used in my paper are, more likely than not, substantially overestimated. By the same token, ignoring technological change (and not fully accounting for increases in wealth) also ensures that the positive impacts of global warming are likely to be underestimated, further overestimating the net negative impacts of global warming.

Therefore, far from being Pollyannaish, the estimates used in my paper most likely substantially exaggerate the net negative impacts of global warming. Despite that, those estimates cannot justify emissions cuts that go beyond no-regret actions at this time or in the foreseeable future.

Why Government Rationing Ain’t a Good Deal

When government is paying the medical bill, it inevitably has to “ration” care.  Choices obviously have to be made by whoever is paying, but there’s good reason not to leave government with the dominant decision-making power, as in Great Britain.

There’s no need to demonize British care. All one has to do is point out how government fiscal objectives so often run counter to good patient treatment, and how most people have no exit to a better alternative.

Consider this rather amazing story from the Daily Telegraph:

Doctors have launched a campaign on behalf of a war hero who has been told he must go blind in one eye before he can receive NHS treatment and accused Gordon Brown of “incompetence” in managing the health service.

More than 120 doctors have sent £5 cheques to Downing Street, made out to the Prime Minister, in the hope of shaming him into helping former RAF bomber Jack Tagg. The 88-year-old was recently diagnosed with age-related macular degeneration, the leading cause of blindness in Britain, which affects an estimated 500,000 people.

Mr Tagg has the treatable, but most aggressive “wet” form of the disease, which can lead to the loss of central vision in as little as two months.

But he has been told that the NHS will only fund the injections which could save his sight, after he has lost the vision in one eye.

… “They told me there were three choices: let nature take its course and go blind, try to seek funding, or pay for immediate treatment. Time is of the essence, so we opted to pay up and fight for funding.

“This is happening to literally millions of people. It’s appalling and something has got to be done about it.”

The American medical system needs reform. But that reform should be accomplished by promoting patient-directed care, with individuals and families, rather than government, deciding how best to use scarce medical resources.

Cash for Clunkers: Dumbest Program Ever?

As the Cash for Clunkers program begins to wind down, I nominate it as the dumbest government program ever. Here is what the program will have accomplished:

  • A few billion dollars’ worth of wealth was destroyed. About 750,000 cars, many of which could have provided consumer value for many years, were thrown in the trash. If each clunker was worth, say, $3,000, that would mean the government destroyed roughly $2.25 billion of value (see the quick calculation after this list).
  • Low-income families, who tend to buy used cars, were harmed because the clunkers program will push up used car prices.
  • Taxpayers were ripped off to the tune of $3 billion. The government took my money to give to people who will buy new cars that are much nicer than mine! 
  • The federal bureaucracy has added 1,100 people to handle all the clunker administration. Again, taxpayers are the losers.
  • The environment was not helped. See here and here.
  • The auto industry received a short-term “sugar high” at the expense of lower future sales when the program is over. The program apparently boosted sales by about 750,000 cars this year, but that probably means that sales over the next few years will be about 750,000 lower. The program probably further damaged the longer-term prospects of auto dealers and automakers by diverting their attention from market fundamentals in the scramble for federal cash.   
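
For what it’s worth, here is the back-of-envelope arithmetic behind the wealth-destruction figure in the first bullet; the $3,000 average value per clunker is the rough guess used above, not an official figure:

\[
750{,}000 \ \text{clunkers} \times \$3{,}000 \ \text{per clunker} = \$2{,}250{,}000{,}000 = \$2.25 \ \text{billion}
\]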

Farm subsidies are unjust. Trade restrictions are counter-productive. Energy regulations have done great damage. Housing policies helped cause the financial crisis. But for pure dumbness, Cash for Clunkers takes the cake.

An Alternative Strategy for Afghanistan

Bernard Finel, a senior fellow at the American Security Project, has an excellent piece on forging an alternative strategy in Afghanistan.

I believe the United States should begin a relative rapid withdrawal of combat forces from Afghanistan.  It is not that I don’t think they can be locally effective.  It is just that I question the cost/benefit calculus of extending the commitment.  I think that many supporters of escalation fail to consider the potential consequences if we do fail to achieve our goal of largely defeating the Taliban and pacifying Afghanistan. [Emphasis mine]

Finel brings up a critical point. From former national security adviser Henry Kissinger to Council on Foreign Relations scholar Stephen Biddle, many prominent opinion leaders concede that the war in Afghanistan will be long, expensive, and risky, yet claim it is ultimately worth waging because a withdrawal would boost jihadism globally and make America look weak. But what happens if what we’ve invested in falls apart whether we withdraw tomorrow or 20 years from now? And wouldn’t trying to stay indefinitely — while accomplishing little — appear even worse? Trying to pacify all of Afghanistan, much less hoping to do so on a permanent basis, is a losing strategy.

Mr. Finel goes on to say further down, “we should recommit to doing everything in our power to resolve tensions between India and Pakistan. Pakistan has legitimate security concerns regarding its neighbor and that gives Pakistan mixed motives in dealing with Islamist radicals.”

This too is a crucial recommendation. People in the Beltway have neglected the extent to which leaders in Islamabad fear an India-leaning government coming to power in Kabul, and thus Pakistan’s leaders (principally its military) have little incentive to stop allowing their territory to be used as a de facto safe haven for the original Afghan Taliban. The question that must be asked, then, is whether Washington can offer incentives sufficient to persuade those leaders to relinquish support for extremists with whom they have been associated for the past 30 years. This question gets lost when people discuss the possibility of talks with the Taliban. The question for U.S. policymakers is not whether the Taliban militants we talk to are “moderate” enough, but whether they will simply lie in wait and reemerge from their cross-border sanctuary after the eventual withdrawal of U.S. and NATO forces.

Unless Washington addresses Pakistan’s existential fear of India, and its military leadership’s continued support for the Taliban as a counter to India’s influence in Afghanistan, U.S. and NATO troops could fight for decades, win every discrete battle, and never come close to eradicating the militancy.

State and Local Government Employment Up Since Recession’s Start

Yesterday, the Rockefeller Institute released a report on state and local government employment since the beginning of the recession.  It found:

Private sector employment for the nation as a whole has fallen by 6.9 million jobs between the December 2007 start of the recession and July 2009. Over the same period, state and local government employment has risen by 110 thousand jobs or 0.6 percent, with increases in both state governments and local governments.

With a prolonged recession now forcing state and local governments to actually cut or furlough some employees, it’s important to remember that they were adding government jobs at a time when it was clear to the rest of the country that the air had gone out of the economic bubble. In other words, taxpayers should have no sympathy for posturing politicians and their apologists who warn of Armageddon should taxes not be increased to keep bloated state and local governments afloat. Also, expect to hear claims that getting rid of government employees will somehow hinder an economic recovery. In fact, getting rid of government employees — and the programs they support — would be good for the long-run health of the economy.

Government employees are inherently parasitic because without the “host” — i.e., taxpayers — their jobs would not exist. One can debate the degree to which a government employee’s work benefits society, but the fact remains that any benefit comes at a cost to the economy, given that productive individuals and businesses are taxed to pay for government jobs. This should be obvious. Unfortunately, it is not uncommon these days to hear intelligent people embrace increasing government employment during a recession to “make up” for job losses in the private sector. One need only spend some time working in government, as I have, to recognize that an economic resurgence will not be fueled by increasing the government employee-to-host ratio.

Will Uribe Betray Liberal Democracy in Colombia?

After months of speculation, the Colombian Senate approved a constitutional amendment that would allow President Alvaro Uribe to run for a second reelection next year. Obstacles remain, however, and the amendment still has to be voted on in the House of Representatives, pass a review process by the Constitutional Court, and be put to a popular referendum — where it’s likely to be approved given Uribe’s high popularity among Colombian voters.

None of these required steps is certain: the final vote in the House of Representatives is not assured; the Constitutional Court might find irregularities in the congressional proceedings on the bill; and time is running out to organize a national referendum before next year’s election. Even so, these last-minute efforts to change Colombia’s constitution, and Uribe’s blatant interest in running again, are troubling.

I’ve praised Alvaro Uribe’s record before, particularly his success in tackling crime and in guiding Colombia out of the abyss it was in at the start of the decade. However, democracy must transcend the virtues of any leader. Just as it is ominous for Venezuela’s democracy that Hugo Chávez plans to perpetuate himself in office, it would be unhealthy for Colombia’s democratic institutions if Uribe were to run for a third consecutive term.

The ultimate decision will likely be Alvaro Uribe’s. This is his chance to show the world whether he’s loyal to liberal democracy or to the power he has become accustomed to.