Tag: science

Ms. Weaver Goes to Washington

Today in Washington: actress Sigourney Weaver testifies before the Subcommittee on Oceans, Atmosphere, Fisheries, and Coast Guard of the Senate Committee on Commerce, Science, and Transportation on the topic of ocean acidification. Because, you know, she played an environmental scientist in Avatar. It’s the best fit since Jane Fonda, Jessica Lange, and Sissy Spacek – all of whom had played farm women – testified on America’s agricultural crisis.

Congress doesn’t have time to vote on presidential nominations. It doesn’t bother engaging in serious oversight of presidential power and civil liberties abuses. It looks at the ceiling and whistles as the national debt approaches Greek levels. But members of Congress have time to listen to an actress discuss the topic of ocean acidification.

This seems like a topic for “Really!?! with Seth and Amy” on Saturday Night Live. Really, Senate Commerce Committee? You think Sigourney Weaver has important information that you need to know? Really? And you’re not just doing this to get yourselves on television? Really!?! And you think the most important thing members of Congress could be doing today is getting their pictures taken with Sigourney Weaver? Really!?!

Of course, this is not just a one-day thing for Sigourney Weaver. She also traveled this month to Brazil to try to stop the construction of a dam. Because who would know better than a Hollywood-Manhattan actress how to make tradeoffs between energy needs and environmental risks in Brazil?

Now let me just say that I’m not arguing that ocean acidification isn’t an important topic. And I’m not criticizing Avatar or its defense of property rights. I’m just questioning whether Sigourney Weaver, Sissy Spacek, Jeff Daniels, Nick Jonas, and the Backstreet Boys have the kind of expertise that Congress ought to draw on in deciding how to run my life. Or then again, maybe planning the economy and running other people’s lives is farce at best, and Congress should just hold hearings with Will Ferrell and John Cleese.

Who Wants to Make Sarah Palin the Leader of the Republican Party?

Could it be the Washington Post? Bannered across the top of the Post’s op-ed page today is a piece titled “Copenhagen’s political science,” titularly authored by Sarah Palin. I’m delighted to see the Post publishing an op-ed critical of the questionable science behind the Copenhagen conference and the demands for massive regulations to deal with “climate change.”

But Sarah Palin? Of all the experts and political leaders a great newspaper might call on for a critical look at the science behind global warming, Sarah Palin?

What’s even more interesting is that the Post also ran an op-ed by Palin in July. But during this entire year, the Post has not run any op-eds by such credible and accomplished Republicans as Gov. Mitch Daniels; former governors Mitt Romney or Gary Johnson; Sen. John Thune; or indeed former governor Mike Huckabee, who might be Palin’s chief rival for the social-conservative vote. You might almost think the Post wanted Palin to be seen as a leader of Republicans.

I should note that during the past year the Post has run one op-ed each from John McCain, Bobby Jindal, Newt Gingrich, and Tim Pawlenty. (And for people who don’t read well, I should note that when I call the people above “credible and accomplished,” that’s not an endorsement for any political office.) Still, it’s the rare political leader who gets two Post op-eds in six months, and rarer still are Post op-eds by ex-governors who can’t name a newspaper they read.

NAEP Math Scores, NCLB, and the Federal Government

I’m surprised anyone was surprised by the recent flat-lining of scores on the NAEP 4th grade math test. The rate of improvement in NAEP scores has been declining since No Child Left Behind was passed, and the recent results are consistent with that trend.

But what really amazes me is that so many people think the solution is just to tweak NCLB! The unstated assumption here is that federal policy is a key determinant of educational achievement. That’s rubbish.

We’ve spent $1.8 trillion on hundreds of different federal education programs since 1965, and guess what: at the end of high school, test scores have been flat in both reading and math since 1970, and have actually declined slightly in science. (Charted for your viewing pleasure here.)

If we’ve proved anything in the past 40 years, it is that federal involvement in education is a staggering waste of money.

Meanwhile, education economists have spent the last several decades finding out what actually does work in education. They’ve compared different kinds of school systems and it turns out that parent-driven, competitive education markets consistently outperform state monopoly school systems like ours. I tabulated the results in a recent peer-reviewed paper and they favor education markets over monopolies by a margin of 15 to 1.

So policymakers who actually care about improving educational outcomes should be spending their time and resources enacting laws that will bring free and competitive education markets within reach of all families. And they should be ignoring the education technocrats who – like Soviet central planners – just want to keep spending other people’s money tweaking their fruitless five-year plans.

This Is Your Brain on Torture

We’ve all heard the argument that a subject under torture—or whatever this week’s euphemism is—may begin fabricating whatever they believe the interrogator wants to hear just to get the agony to stop. Now neuroscientists are suggesting that inflicting too much pain and stress on a subject may not just induce them to lie; it may cause them to lose track of what’s true and false altogether:

Fact One: To recall information stored in the brain, you must activate a number of areas, especially the prefrontal cortex (site of intentionality) and hippocampus (the door to long-term memory storage). Fact Two: Stress such as that caused by torture releases the hormone cortisol, which can impair cognitive function, including that of the prefrontal cortex and hippocampus. Studies in which soldiers were subjected to stress in the form of food and sleep deprivation have found that it impaired their ability to recall personal memories and information, as this 2006 study reported. “Studies of extreme stress with Special Forces Soldiers have found that recall of previously-learned information was impaired after stress occurred,” notes O’Mara. “Water-boarding in particular is an extreme stressor and has the potential to elicit widespread stress-induced changes in the brain.”

Stress also releases catecholamines such as noradrenaline, which can enlarge the amygdalae (structures involved in the processing of fear), also impairing memory and the ability to distinguish a true memory from a false or implanted one. Brain imaging of torture victims, as in this study, suggests why: torture triggers abnormal patterns of activation in the frontal and temporal lobes, impairing memory. Rather than a question triggering a (relatively) simple pattern of brain activation that leads to the stored memory of information that can answer the question, the question stimulates memories almost chaotically, without regard to their truthfulness.

In brief, the subject may lose genuine memories and come to believe that their confabulations are authentic ones. The full literature review, from Trends in Cognitive Sciences, can be downloaded in PDF form here.

Actually, Big Mistakes Are to Be Expected…

Cognitive scientist Dan Willingham has a helpful column on the WaPo’s “Answer Sheet” blog. In it, he notes that DC Public Schools advises its employees to teach to students’ “diverse learning styles” (e.g., “auditory learners,” “visual learners,” etc.), despite the fact that research shows these categories are pedagogically meaningless.

But what really grabbed my attention was this comment: “a misunderstanding of a pretty basic issue of cognition is a mistake that one does not expect from a major school system. It indicates that the people running the show at DCPS are getting bad advice about the science on which to base policy.”

Just as cognitive scientists have been collecting and analyzing evidence on “learning styles” for generations, social scientists and education historians have been doing the same for school systems. What these latter groups find is that it is perfectly normal for public school districts to be unaware of, or even indifferent to, relevant research and to make major pedagogical errors as a result. Furthermore, there is no evidence that large districts are any better at avoiding these pitfalls than smaller ones. If anything, the reverse is true.

Not only are such errors to be expected of public school systems, we can actually say why that is the case with a good degree of confidence: public schooling lacks the freedoms and incentives that, in other fields, both allow and encourage institutions to acquire and effectively exploit expert knowledge.

Districts such as Washington, DC, can persist year after year with abysmal test scores, abysmal graduation rates, and astronomical costs. That is because they have a monopoly on a vast trove of government K-12 spending. In the free enterprise system, behavior like that usually results in the failure of a business and its disappearance from the marketplace. So, in the free enterprise sector, it is indeed rare to see large institutions behaving in such a dysfunctional manner, because it would be difficult if not impossible for them to grow that big in the first place. Long before they could scale up to that level, they would lose their customers to more efficient, higher-quality competitors.

So if we want to see the adoption and effective implementation of the best research become the norm in education, we have to organize schooling the same way we organize other fields: as a parent-driven competitive marketplace.

Cherry Picking Climate Catastrophes: Response to Conor Clarke, Part II

Conor Clarke at The Atlantic blog raised several issues with my study, “What to Do About Climate Change,” which Cato published last year.

One of Conor Clarke’s comments was that my analysis did not extend beyond the 21st century. He found this problematic because, as Conor put it, climate change would extend beyond 2100, and even if GDP is higher in 2100 with unfettered global warming than without, it’s not obvious that this GDP would continue to be higher “in the year 2200 or 2300 or 3758”. I addressed this portion of his argument in Part I of my response. Here I will address the second part of this argument, that “the possibility of ‘catastrophic’ climate change events — those with low probability but extremely high cost — becomes real after 2100.”

The examples of potentially catastrophic events that could be caused by anthropogenic greenhouse gas-induced global warming (AGW) that have been offered to date (e.g., melting of the Greenland or West Antarctic Ice Sheets, or the shutdown of the thermohaline circulation) contain a few drops of plausibility submerged in oceans of speculation. There are no scientifically justified estimates of the probability of their occurrence by any given date. Nor are there scientifically justified estimates of the magnitude of damages such events might cause, not just in biophysical terms but also in socioeconomic terms. Therefore, to call these events “low probability” — as Mr. Clarke does — is a misnomer. They are more appropriately termed plausible but highly speculative events.

Consider, for example, the potential collapse of the Greenland Ice Sheet (GIS). According to the IPCC’s WG I Summary for Policy Makers (p. 17), “If a negative surface mass balance were sustained for millennia, that would lead to virtually complete elimination of the Greenland Ice Sheet and a resulting contribution to sea level rise of about 7 m” (emphasis added). Presumably the same applies to the West Antarctic Ice Sheet.

But what is the probability that a negative surface mass balance can, in fact, be sustained for millennia, particularly after considering the amount of fossil fuels that can be economically extracted and the likelihood that other energy sources will not displace fossil fuels in the interim? [Remember we are told that peak oil is nigh, that renewables are almost competitive with fossil fuels, and that wind, solar and biofuels will soon pay for themselves.]

Second, for an event to be classified as a catastrophe, it should occur relatively quickly, precluding efforts by man or nature to adapt or otherwise deal with it. But if it occurs over millennia, as the IPCC says, or even centuries, that gives humanity ample time to adjust, albeit at a socioeconomic cost. And it need not be prohibitively dangerous to life, limb, or property if: (1) the total amount of sea level rise (SLR) and, perhaps more importantly, the rate of SLR can be predicted with some confidence, as seems likely in the next few decades considering the resources being expended on such research; (2) the rate of SLR is slow relative to how fast populations can strengthen coastal defenses and/or relocate; and (3) there are no insurmountable barriers to migration.

This would be true even if the so-called “tipping point” had already been passed and ultimate disintegration of the ice sheet were inevitable, so long as it takes millennia for the disintegration to be realized. In other words, the issue isn’t just whether the tipping point is reached; it is how long the tipping actually takes. Suppose, for example, that a hand grenade is tossed into a crowded room. Whether this results in tragedy — and the magnitude of that tragedy — depends upon how much time it takes for the grenade to go off, the reaction time of the occupants, and their ability to respond.

Lowe et al. (2006, pp. 32-33), based on a “pessimistic, but plausible, scenario in which atmospheric carbon dioxide concentrations were stabilised at four times pre-industrial levels,” estimated that a collapse of the Greenland Ice Sheet would raise sea level by 2.3 meters over the next 1,000 years (with a peak rate of 0.5 cm/yr). If one were to arbitrarily double that to account for potential melting of the West Antarctic Ice Sheet, that implies an SLR of ~5 meters in 1,000 years, with a peak rate (assuming the peaks coincide) of 1 meter per century.

Such a rise would not be unprecedented. Sea level has risen 120 meters in the past 18,000 years — an average of 0.67 meters/century — and as much as 4 meters/century during the meltwater pulse 1A episode 14,600 years ago (Weaver et al. 2003; subscription required). Neither humanity nor, from the perspective of millennial time scales (per the above quote from the IPCC), the rest of nature seems the worse for it. Coral reefs, for example, evolved and their compositions changed over millennia as new reefs grew while older ones were submerged in deeper water (e.g., Cabioch et al. 2008). So while there have been ecological changes, it is unknown whether the changes were for better or worse. For a melting of the GIS (or WAIS) to qualify as a catastrophe, one has to show, rather than assume, that the ecological consequences would, in fact, be for the worse.
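To put the projected and historical rates side by side, here is a back-of-the-envelope check (a minimal sketch in Python; every number is one of the figures cited above, not a new estimate):

```python
# Back-of-the-envelope comparison of projected vs. historical rates of
# sea level rise (SLR), using only the figures quoted in the text above.

# Lowe et al. (2006): Greenland Ice Sheet (GIS) collapse scenario.
gis_rise_m = 2.3          # meters of SLR over 1,000 years
gis_peak_cm_per_yr = 0.5  # peak rate, cm/year

# Arbitrarily doubled to account for the West Antarctic Ice Sheet,
# assuming the peak rates coincide.
combined_rise_m = 2 * gis_rise_m  # ~4.6, i.e. roughly 5 meters

# A rate in cm/yr is numerically equal to the same rate in m/century
# (x100 years per century, /100 cm per meter), so doubling 0.5 gives 1.0.
combined_peak_m_per_century = 2 * gis_peak_cm_per_yr

# Historical record (Weaver et al. 2003).
hist_avg_m_per_century = 120 / (18_000 / 100)  # ~0.67 m/century since the last glacial maximum
pulse_1a_m_per_century = 4.0                   # meltwater pulse 1A, 14,600 years ago

print(f"Projected combined rise:  ~{combined_rise_m:.1f} m over 1,000 years")
print(f"Projected peak rate:       {combined_peak_m_per_century:.1f} m/century")
print(f"Post-glacial average:      {hist_avg_m_per_century:.2f} m/century")
print(f"Meltwater pulse 1A:        {pulse_1a_m_per_century:.1f} m/century")
```

The doubled worst-case peak of 1 m/century thus falls between the post-glacial average (0.67 m/century) and the meltwater pulse 1A maximum (4 m/century), which is the sense in which such a rise “would not be unprecedented.”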

Human beings can certainly cope with sea level rise of such magnitudes if they have centuries or millennia to do so. In fact, if necessary they could probably get out of the way in a matter of decades, if not years.

Can a relocation of such a magnitude be accomplished?

Consider that the global population increased from 2.5 billion in 1950 to 6.8 billion this year. Among other things, this meant creating the infrastructure for an extra 4.3 billion people in the intervening 59 years (as well as improving the infrastructure for the 2.5 billion counted in the baseline, many of whom barely had any infrastructure whatsoever in 1950). These improvements occurred at a time when everyone was significantly poorer. (Global per capita income is more than 3.5 times greater today than it was in 1950.) Therefore, while relocation will be costly, in theory tomorrow’s much wealthier world ought to be able to relocate billions of people to higher ground over the next few centuries, if need be. In fact, once a decision is made to relocate, the cost differential of relocating, say, 10 meters higher rather than a meter higher is probably marginal. It should also be noted that over millennia the world’s infrastructure will have to be renewed or replaced dozens of times – and the world will be better for it. [For example, the ancient city of Troy, once on the coast but now a few kilometers inland, was built and rebuilt at least 9 times in 3 millennia.]
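As a quick sanity check on these magnitudes (a minimal sketch; the inputs are just the population and income figures cited in the paragraph above):

```python
# Global population and income figures cited above, 1950 vs. 2009.
pop_1950_bn = 2.5    # billions of people
pop_2009_bn = 6.8    # billions of people
years = 2009 - 1950  # 59 years

extra_people_bn = pop_2009_bn - pop_1950_bn      # 4.3 billion people added
rate_mn_per_yr = extra_people_bn * 1000 / years  # ~73 million people per year
income_multiple = 3.5                            # per capita income, today vs. 1950

print(f"Extra people accommodated, 1950-2009: {extra_people_bn:.1f} billion")
print(f"Average rate: ~{rate_mn_per_yr:.0f} million people per year")
print(f"Per capita income multiple vs. 1950: {income_multiple}x")
```

In other words, a world far poorer than today’s built out infrastructure for roughly 73 million additional people per year; that is the scale against which a relocation spread over centuries should be judged.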

Moreover, so long as we are concerned about potential geological catastrophes whose probabilities of occurrence and impacts have yet to be scientifically estimated, we should also consider events of equal or greater probability that might negate their impacts. Specifically, it is quite possible — in fact probable — that somewhere between now and 2100 or 2200, technologies will become available that will deal with climate change much more economically than currently available technologies for reducing GHG emissions. Such technologies may include ocean fertilization, carbon sequestration, geo-engineering options (e.g., deploying mirrors in space), or more efficient solar or photovoltaic technologies. Similarly, there is a finite, non-zero probability that new and improved adaptation technologies will become available that will substantially reduce the net adverse impacts of climate change.

The historical record shows that this has occurred over the past century for virtually every climate-sensitive sector that has been studied. For example, from 1900 to 1970, U.S. death rates due to various climate-sensitive water-related diseases — dysentery, typhoid, paratyphoid, other gastrointestinal diseases, and malaria — declined by 99.6 to 100.0 percent. Similarly, poor agricultural productivity exacerbated by drought contributed to famines in India and China off and on through the 19th and 20th centuries, killing millions of people, but such famines haven’t recurred since the 1970s, despite any climate change and despite populations several-fold higher today. And by the early 2000s, deaths and death rates due to extreme weather events had dropped worldwide by over 95% from their early 20th-century peaks (Goklany 2006).

With respect to another global warming bogeyman — the shutdown of the thermohaline circulation (AKA the meridional overturning circulation), the basis for the deep freeze depicted in the movie, The Day After Tomorrow — the IPCC WG I SPM notes (p. 16), “Based on current model simulations, it is very likely that the meridional overturning circulation (MOC) of the Atlantic Ocean will slow down during the 21st century. The multi-model average reduction by 2100 is 25% (range from zero to about 50%) for SRES emission scenario A1B. Temperatures in the Atlantic region are projected to increase despite such changes due to the much larger warming associated with projected increases in greenhouse gases. It is very unlikely that the MOC will undergo a large abrupt transition during the 21st century. Longer-term changes in the MOC cannot be assessed with confidence.”

Not much has changed since then. A shutdown of the MOC doesn’t look any more likely now than it did then. See here, here, and here (pp. 316-317).

If one wants to develop rational policies to address speculative catastrophic events that could conceivably occur over the next few centuries or millennia, one should start by considering the universe of potential catastrophes and then develop criteria as to which should be addressed and which should not. Rational policy must necessarily be based on systematic analysis, not on cherry-picking one’s favorite catastrophes.

Just as one may speculate on global warming-induced catastrophes, one may just as plausibly speculate on catastrophes that may result absent global warming. Consider, for example, the possibility that absent global warming, the Little Ice Age might return. The consequences of another ice age, Little or not, could range from the severely negative to the positive (if it would buffer the negative consequences of warming). That such a recurrence is not unlikely is evident from the fact that the earth entered, and only a century and a half ago retreated from, a Little Ice Age, and history may indeed repeat itself over centuries or millennia.

Greenhouse gas controls could cause yet another catastrophe: CO2 not only contributes to warming, it is also the key building block of life as we know it. All vegetation is created through photosynthesis of atmospheric CO2. In fact, according to the IPCC WG I report (2007, p. 106), the net primary productivity of the global biosphere has increased in recent decades, partly due to greater warming, higher CO2 concentrations, and nitrogen deposition. Thus, there is a finite probability that reducing CO2 emissions would reduce the net primary productivity of the terrestrial biosphere, with potentially severe negative consequences for the amount and diversity of wildlife it could support, as well as for agricultural and forest productivity, with adverse knock-on effects on hunger and health.

There is also a finite probability that costs of GHG reductions could reduce economic growth worldwide. Even if only industrialized countries sign up for emission reductions, the negative consequences could show up in developing countries because they derive a substantial share of their income from aid, trade, tourism, and remittances from the rest of the world. See, for example, Tol (2005), which examines this possibility, although the extent to which that study fully considered these factors (i.e., aid, trade, tourism, and remittances) is unclear.

Finally, one of the problems with the argument that society should address low-probability, high-impact events (assuming a probability could be estimated rather than assumed or guessed) is that it necessarily means there is a high probability that resources expended on addressing such catastrophic events will have been squandered. This wouldn’t be a problem but for the fact that there are opportunity costs associated with those expenditures.

According to the 2007 IPCC Science Assessment’s Summary for Policy Makers (p. 10), “Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations.” In plain language, this means that the IPCC believes there is at least a 90% likelihood that anthropogenic greenhouse gas emissions (AGHG) are responsible for 50-100% of the global warming since 1950. In other words, there is an up to 10% chance that anthropogenic GHGs are not responsible for most of that warming.

This means there is an up to 10% chance that resources expended in limiting climate change would have been squandered. Since any effort to significantly reduce climate change will cost trillions of dollars (see Nordhaus 2008, p. 82), that would be an unqualified disaster, particularly since those very resources could be devoted to reducing urgent problems humanity faces here and now (e.g., hunger, malaria, safer water and sanitation) — problems we know exist for sure unlike the bogeymen that we can’t be certain about.
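To make the opportunity-cost arithmetic explicit (a minimal sketch; the 10% is simply the complement of the IPCC’s “very likely,” and the cost figure is a hypothetical placeholder for the “trillions of dollars” order of magnitude cited from Nordhaus 2008):

```python
# Illustrative expected-waste arithmetic. The cost figure below is a
# hypothetical placeholder, not an estimate from any study.
p_mitigation_misdirected = 0.10  # complement of the IPCC's "very likely" (>= 90%)
mitigation_cost_tn = 2.0         # hypothetical mitigation cost, trillions of dollars

expected_waste_tn = p_mitigation_misdirected * mitigation_cost_tn
print(f"Expected squandered resources: up to ~${expected_waste_tn:.1f} trillion")
```

Even at the low end of the “trillions” range, an up-to-10% chance of misdirection implies hundreds of billions of dollars in expected waste, money that could otherwise go to hunger, malaria, or water and sanitation.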

Spending money on speculative, even if plausible, catastrophes instead of problems we know exist for sure is like a starving man giving up a fat, juicy bird in hand in the hope of catching several other birds sometime in the next few centuries, even though those birds don’t exist today and may never exist.

STEM Sky Not Falling?

Education policy is far too rarely driven by facts or logic – they’re just too inconvenient, mucking up both uber-hyped “crises” and warm-and-fuzzy myths.

Recently, the big scare has been that the United States is on its way to a desperate shortage of scientists and engineers, a message that has, of course, been heartily embraced by politicians determined to push more kids into science, technology, engineering, and mathematics (STEM) fields.

Well, it seems that once again the crisis du jour has been well overstated. USA Today has a great new story demonstrating that we actually have more than enough scientists and engineers. (Not that this hasn’t been pointed out before.) Most telling is the content in the article’s sidebar, which includes some real crisis-deflating stuff:

Detailed findings issued last year by the federally funded RAND National Defense Research Institute found “no evidence of a current shortage” of science and engineering workers. It said National Science Foundation predictions of shortages so far have proved “inaccurate.”

RAND… recommended a permanent commitment to monitoring the USA’s science and technology performance, but said the slow growth of U.S.-born technical workers “will change when the earnings and attractiveness of S&E (science and engineering) careers improve.”

So we actually have plenty of scientists and engineers, and the market appears to be working just as it should?  I hope someone tells our leaders! Otherwise, they’ll almost certainly push even more kids into jobs that, it turns out, will probably only exist in the land of imaginary crises.