Topic: Energy and Environment

Pielke’s Problem

I generally admire the work of Roger Pielke Jr., a political scientist in the University of Colorado-Boulder’s Center for Science and Technology Policy Research. His new book on climate change is refreshingly honest and non-ideological, if a bit overly technophilic. His broader work offers the important insight that science alone cannot direct public policy; it can only lay out the likely results of different policy choices.

Given the quality of his work, I was disappointed by Pielke’s op-ed in today’s NYT defending Congress’s legislated obsolescence of the incandescent light bulb. He argues that government standard-setting is an important contribution to human welfare and that the light bulb standard is simply part of that tradition (though he does suggest some minor policy tweaks to allow limited future availability of incandescents).

To justify his argument, Pielke points to the great benefits of government-established standard measures and quality standards:

Indeed, [in the United States of the late 19th century] the lack of standards for everything from weights and measures to electricity — even the gallon, for example, had eight definitions — threatened to overwhelm industry and consumers with a confusing array of incompatible choices.

This wasn’t the case everywhere. Germany’s standards agency, established in 1887, was busy setting rules for everything from the content of dyes to the process for making porcelain; other European countries soon followed suit. Higher-quality products, in turn, helped the growth in Germany’s trade exceed that of the United States in the 1890s.

America finally got its act together in 1894, when Congress standardized the meaning of what are today common scientific measures, including the ohm, the volt, the watt and the henry, in line with international metrics. And, in 1901, the United States became the last major economic power to establish an agency to set technological standards.

Alas, this argument doesn’t support Pielke’s light bulb standard.

The weights-and-measures and product standards that he cites are examples of government responses to market failures—instances where private action is unable to reach efficient results. Concerning weights and measures, a type of market failure known as the collective action problem can make it difficult to establish standard measures privately. Getting everyone to agree can be like herding cats, and there is ample incentive to secretly defect from any standard that emerges—e.g., a gas station would love to sell you a 120-ounce “gallon” that you assume is a standard 128 ounces. (On the other hand, there are plenty of examples of private action overcoming this problem, such as the standardization of railroad track gauges in the late 19th century.) Likewise, quality standards can be understood as a response to a kind of market failure known as the information asymmetry problem—e.g., a producer of low-quality goods may knowingly try to pass them off as high-quality goods. (Again, there are plenty of examples of private action overcoming this problem.)

As libertarians, we recognize that there are market failures, and that government can sometimes mitigate them. (That’s why we’re not anarchists.) Also as libertarians, we recognize that government intervention can result in outcomes even less efficient than the original market failure. (That’s why we’re not run-of-the-mill Democrats or Republicans.)

But where is the market failure with incandescent bulbs? After nearly 125 years of use, people know the advantages and drawbacks of incandescents: they use more electricity than other types of bulbs and have shorter lifespans, but they cost very little and work much better in certain applications (from dimmer switches to Easy-Bake Ovens) than other bulbs. Besides, CFL bulbs were widely available before Congress’s 2007 legislation, and LED lights were already in the R&D pipeline.

Perhaps Pielke would argue that there is a market failure with incandescents: the negative externality of air pollution, including greenhouse gas emissions. But incandescent lighting is only one of many, many electricity-using devices, and electricity generation is just one of many, many sources of air pollution. So why focus on just this one source of the externality instead of advocating a policy that addresses emissions broadly? And why devote his op-ed to technology standards while making no mention of air pollution?

The Current Wisdom: Overplaying the Human Contribution to Recent Weather Extremes

The Current Wisdom is a series of monthly posts in which Senior Fellow Patrick J. Michaels reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

The Current Wisdom only comments on science appearing in the refereed, peer-reviewed literature, or that has been peer-screened prior to presentation at a scientific congress.

**********

The recent publication in Nature of two articles linking rainfall extremes (and flooding) to global warming, added to the heat in Russia and the floods in Pakistan in the summer of 2010 and the back-to-back cold and snowy winters in the eastern U.S. and western Europe, has gotten a lot of public attention.  That attention includes a recent hearing in the House of Representatives, despite its Republican majority.  Tying weather extremes to global warming, or using them as “proof” that warming doesn’t exist (see: snowstorms), is a popular rhetorical flourish among politicos of all stripes.

The hearing struck many as quite odd, inasmuch as it is far more certain than apocalyptic global warming that the House will pass meaningless legislation commanding the EPA to cease and desist from regulating greenhouse gas emissions.  “Meaningless” means that it surely will not become law.  Even on the long-shot chance that it passes the Senate, the President will surely veto it, and there are nowhere near enough votes to override such an action.

Perhaps “wolf!” has been cried yet again.  A string of soon-to-be-published papers in the scientific literature finds that, despite all the hue and cry tying recent extreme weather events to global warming, natural climate variability is to blame.

Where to start?  How about last summer’s Russian heat wave?

The Russian heat wave (and to some degree the floods in Pakistan) has been linked to the same large-scale, stationary weather system, called an atmospheric “blocking” pattern. When the atmosphere is “blocked,” it stays in the same configuration for a period of several weeks (or more) and keeps delivering the same weather to the same area for what can seem like an eternity to the people in its way.  Capitalizing on the misery in Russia and Pakistan, atmospheric blocking was added to the list of things supposed to be “consistent with” anthropogenically stimulated global warming, a list that of course already included heat waves and floods. And thus the Great Russian Heat Wave of 2010 became part of global warming lore.

But then a funny thing happened: scientists with a working knowledge of atmospheric dynamics started to review the situation and found scant evidence of a global warming connection.

The first chink in the armor came back in the fall of 2010, when scientists from the Physical Sciences Division (PSD) of the Earth System Research Laboratory (ESRL) of the National Oceanic and Atmospheric Administration (NOAA) presented the results of their preliminary investigation on the web and concluded that “[d]espite this strong evidence for a warming planet, greenhouse gas forcing fails to explain the 2010 heat wave over western Russia. The natural process of atmospheric blocking, and the climate impacts induced by such blocking, are the principal cause for this heat wave.”

The PSD folks have now followed this up with a new peer-reviewed article in the journal Geophysical Research Letters that rejects the global warming explanation. The paper is titled “Was There a Basis for Anticipating the 2010 Russian Heat Wave?” Turns out that there wasn’t.

To prove this, the research team, led by PSD’s Randall Dole, first reviewed the observed temperature history of the region affected by the heat wave (western Russia, Belarus, the Ukraine, and the Baltic nations). To start, they looked at the recent antecedent conditions: “Despite record warm globally-averaged surface temperatures over the first six months of 2010, Moscow experienced an unusually cold winter and a relatively mild but variable spring, providing no hint of the record heat yet to come.” Nothing there.

Then they looked at the long-term temperature record: “The July surface temperatures for the region impacted by the 2010 Russian heat wave shows no significant warming trend over the prior 130-year period from 1880 to 2009…. A linear trend calculation yields a total temperature change over the 130 years of -0.1°C (with a range of 0 to -0.4°C over the four data sets [they examined]).” There’s not a hint of a build-up to a big heat wave.

And as to the behavior of temperature extremes: “There is also no clear indication of a trend toward increasing warm extremes. The prior 10 warmest Julys are distributed across the entire period and exhibit only modest clustering earlier in this decade, in the 1980s and in the 1930s…. This behavior differs substantially from globally averaged annual temperatures, for which eleven of the last twelve years ending in 2006 rank among the twelve warmest years in the instrumental record since 1850….”

With regard to any indication that “global” warming was pushing temperatures higher in Russia and thus helped fuel the extreme heat last summer, Dole et al. say this: “With no significant long-term trend in western Russia July surface temperatures detected over the period 1880-2009, mean regional temperature changes are thus very unlikely to have contributed substantially to the magnitude of the 2010 Russian heat wave.”
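For readers who want to see the mechanics, here is a minimal sketch, in Python and with made-up placeholder data rather than the actual western-Russia series, of the sort of linear-trend calculation Dole et al. describe:

```python
import numpy as np

# A minimal sketch of a linear-trend calculation: fit a straight line to a
# July temperature series and report the total change over the record.
# The series below is random placeholder data, NOT the actual observations.
years = np.arange(1880, 2010)                         # 130 Julys, 1880-2009
rng = np.random.default_rng(0)
july_temps = rng.normal(0.0, 1.5, years.size)         # placeholder anomalies, deg C

slope, intercept = np.polyfit(years, july_temps, 1)   # slope in deg C per year
total_change = slope * (years[-1] - years[0])         # change across the record

print(f"trend: {slope:+.4f} deg C/yr; total change 1880-2009: {total_change:+.2f} deg C")
```

Run on the real July series rather than placeholder numbers, this is the kind of calculation that yields the roughly -0.1°C total change quoted above.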

Next the PSD folks looked to see whether the larger-scale antecedent conditions, when fed into climate models, would produce the atmospheric circulation patterns (i.e., blocking) that gave rise to the heat wave.  The tested “predictors” included patterns of sea surface temperature and Arctic ice coverage, which most people feel have been subject to some human influence.  No relationship: “These findings suggest that the blocking and heat wave were not primarily a forced response to specific boundary conditions during 2010.”

In fact, the climate models exhibited no predilection for projecting increases in the frequency of atmospheric blocking patterns over the region as greenhouse gas concentrations increased. Just the opposite: “Results using very high-resolution climate models suggest that the number of Euro-Atlantic blocking events will decrease by the latter half of the 21st century.”

At this point, Dole and colleagues had about exhausted all lines of inquiry and summed things up:

 Our analysis points to a primarily natural cause for the Russian heat wave. This event appears to be mainly due to internal atmospheric dynamical processes that produced and maintained an intense and long-lived blocking event. Results from prior studies suggest that it is likely that the intensity of the heat wave was further increased by regional land surface feedbacks. The absence of long-term trends in regional mean temperatures and variability together with the model results indicate that it is very unlikely that warming attributable to increasing greenhouse gas concentrations contributed substantially to the magnitude of this heat wave.

Can’t be much clearer than that.

But that was last summer. What about the past two winters? Both were very cold in the eastern U.S., with record snow events and/or totals scattered about the country.

Cold, snow, and global warming? On Christmas Day 2010, the New York Times ran an op-ed by Judah Cohen, a long-range forecaster for the private forecasting firm Atmospheric and Environmental Research, outlining his theory that late-summer Arctic ice declines lead to more fall snow cover across Siberia, which in turn nudges atmospheric circulation patterns into favoring snowstorms along the East Coast of the U.S. Just last week, the Union of Concerned Scientists held a news conference at which it handed out a press release headlined “Climate Change Makes Major Snowstorms Likely.” In that release, Mark Serreze, director of the National Snow and Ice Data Center, laid out his theory that the loss of Arctic sea ice is helping to provide more moisture to fuel winter snowstorms across the U.S., as well as altering atmospheric circulation patterns into a state that favors big snowstorms. Weather Underground’s Jeff Masters chimed in with “Heavy snowstorms are not inconsistent with a warming planet.”

As is the wont for this Wisdom, let’s go back to the scientific literature.

Another soon-to-be-released paper, to appear in Geophysical Research Letters, describes the results of using the seasonal weather prediction model from the European Centre for Medium-Range Weather Forecasts (ECMWF) to help untangle the causes of the unusual atmospheric circulation patterns that gave rise to the harsh winter of 2009-2010 on both sides of the Atlantic. A team of ECMWF scientists led by Thomas Jung went back and ran experiments that changed the initial conditions fed into the ECMWF model, then assessed how well the model simulated the known weather patterns of the winter of 2009-2010. The different sets of initial conditions were selected to test the pet theories behind the origins of the harsh winter.  Jung et al. describe their investigations this way: “Here, the origin and predictability of the unusual winter of 2009/10 are explored through numerical experimentation with the ECMWF Monthly forecasting system. More specifically, the role of anomalies in sea surface temperature (SST) and sea ice, the tropical atmospheric circulation, the stratospheric polar vortex, solar insolation and near surface temperature (proxy for snow cover) are examined.”
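The experimental design is worth pausing on. Here is a schematic toy, not the ECMWF system and with every number invented for illustration, of the “perturb one candidate driver at a time and compare the simulated NAO with the observed value” logic the paper describes:

```python
import numpy as np

# Schematic toy of a one-driver-at-a-time sensitivity design (placeholder
# numbers only; the real study uses full ECMWF forecast runs, not a formula).
OBSERVED_NAO = -2.6   # placeholder for the record-low 2009/10 winter NAO value

def toy_forecast(drivers):
    """Stand-in for a model run: returns a fake NAO index from driver anomalies."""
    rng = np.random.default_rng(42)          # same internal "weather noise" each run
    forced = -0.1 * sum(drivers.values())    # pretend each driver nudges the NAO a bit
    return forced + rng.normal(0.0, 0.5)

baseline = {"sst": 0.0, "sea_ice": 0.0, "strat_vortex": 0.0, "solar": 0.0, "snow": 0.0}

for driver in baseline:
    experiment = dict(baseline)
    experiment[driver] = 1.0                 # switch on one observed 2009/10 anomaly
    simulated = toy_forecast(experiment)
    print(f"{driver:>12s}: simulated NAO {simulated:+.2f} vs observed {OBSERVED_NAO:+.2f}")
```

In this toy, no single driver comes anywhere near the observed anomaly, which is the flavor of the result Jung et al. report from their real experiments.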

Here is what they found after running their series of experiments.

Arctic sea ice and sea surface temperature anomalies.  These are often associated with global warming caused by people. Finding:  “These results suggest that neither SST nor sea ice anomalies explain the negative phase of the NAO during the 2009/10 winter.”

(NAO is the commonly used abbreviation for the North Atlantic Oscillation, an atmospheric circulation pattern that can influence winter weather in the eastern U.S. and western Europe. A negative phase of the NAO is associated with cold and stormy weather, and during the winter of 2009-10 the NAO value was the lowest ever observed.)

A global warming-induced weakening of the stratospheric (upper-atmosphere) jet stream. “Like for the other experiments, these stratospheric relaxation experiments fail to reproduce the magnitude of the observed NAO anomaly.”

Siberian snow cover.  “The resulting [upper air patterns] show little resemblance with the observations…. The implied weak role of snow cover anomalies is consistent with other research….”

Solar variability.  “The experiments carried out in this study suggest that the impact of anomalously low incoming [ultraviolet] radiation on the tropospheric circulation in the North Atlantic region are very small… suggesting that the unusually low solar activity contributed little, if any, to the observed NAO anomaly during the 2009/10 winter.”

OK then, what did cause the unusual weather patterns during the 2009-10 winter?

The results of this study, therefore, increase the likelihood that both the development and persistence of negative NAO phase resulted from internal atmospheric dynamical processes.

Translation: Random variability.

To drive this finding home, here’s another soon-to-be-released paper (D’Arrigo et al., 2011) that uses tree-ring-based reconstructions of atmospheric circulation patterns and finds that a similar set of conditions (including a negative NAO value second only to that of the 2009-10 winter) was responsible for the historically harsh winter of 1783-84 in the eastern U.S. and western Europe, which was widely noted by historians. It followed the stupendous eruption of the Icelandic volcano Laki the previous summer, and the frigid and snowy winter conditions have long been blamed on the volcano. In fact, Benjamin Franklin commented as much.

But in their new study, Roseanne D’Arrigo and colleagues conclude that the harshness of that winter was primarily the result of anomalous atmospheric circulation patterns that closely resembled those observed during the winter of 2009-10, and that the previous summer’s volcanic eruption played a far less prominent role:

Our results suggest that Franklin and others may have been mistaken in attributing winter conditions in 1783-4 mainly to Laki or another eruption, rather than unforced variability.

Similarly, conditions during the 2009-10 winter likely resulted from natural [atmospheric] variability, not tied to greenhouse gas forcing… Evidence thus suggests that these winters were linked to the rare but natural occurrence of negative NAO and El Niño events.

The point is that natural variability can and does produce extreme events on every time scale: days (e.g., individual storms), weeks (e.g., the Russian heat wave), months (e.g., the winter of 2009-10), decades (e.g., the lack of global warming since 1998), centuries (e.g., the Little Ice Age), millennia (e.g., the cycle of major Ice Ages), and eons (e.g., snowball earth).

Folks would do well to keep this in mind the next time global warming is posited as the cause of the weather disaster du jour. Almost assuredly, it is all hype and little might.

Too bad these results weren’t given a “hearing” in the House!

References:

D’Arrigo, R., et al., 2011. The anomalous winter of 1783-1784: Was the Laki eruption or an analog of the 2009–2010 winter to blame? Geophysical Research Letters, in press.

Dole, R., et al., 2011. Was there a basis for anticipating the 2010 Russian heat wave? Geophysical Research Letters, in press.

Jung, T., et al., 2011. Origin and predictability of the extreme negative NAO winter of 2009/10. Geophysical Research Letters, in press.

Min, S-K., et al., 2011. Human contribution to more-intense precipitation extremes. Nature, 470, 378-381.

Pall, P., et al., 2011. Anthropogenic greenhouse gas contribution to flood risk in England and Wales in autumn 2000. Nature, 470, 382-386.

Gingrich & Woolsey on Energy

The other day, The Wall Street Journal provided a public service by lambasting Newt Gingrich for his absurd speech to the ethanol lobby in Des Moines last month (money line: ”Obviously big urban newspapers want to kill it because it’s working, and you wonder, ‘What are their values?’”).  Today, Gingrich and fellow ethanol maven James Woolsey struck back in those very same pages.  In doing so, Gingrich provided yet more evidence that he’s intellectually unfit for office.

“It is in this country’s long-term best interest,” he said, ”to stop the flow of $1 billion a day overseas.”  Really?  So money sent overseas is gone forever?  News to me.  The only thing you can do with dollars earned from oil sales to the U.S. is buy things denominated in dollars or exchange them so that someone else can.  And we sell a lot of stuff to foreigners that is denominated in dollars (Treasury bills, for one), and that money comes right back to the good old U.S. of A.

But put that aside.  If Gingrich really believes this, then why not just ban all imports altogether?  Is that what the GOP is about these days - rank gooberism on trade?

And one other thing: the U.S. does not spend $1 billion a day on foreign oil.  It spends about half that - roughly $530 million a day (in 2009, anyway).
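A rough back-of-envelope check, using approximate 2009 figures that are my own assumptions rather than numbers cited in the op-ed or by the Journal: roughly 9 million barrels of imported crude per day at an average price of roughly $59 per barrel works out to

$$
9\ \text{million bbl/day} \times \$59/\text{bbl} \approx \$530\ \text{million per day},
$$

which is about half of Gingrich’s figure.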

“[I] co-produced a movie with my wife, Callista, ‘We Have the Power,’ that argued for an ‘all of the above’ energy strategy which would maximize all forms of domestic energy production.”  Apparently, being a pol means that one doesn’t have to pick and choose between investments a, b, or c.  We’ll just mandate that everyone invest in everything that can attract a lobbyist.

When you hear this stuff about an ”all of the above” energy strategy, what you’re hearing is a complaint that the Democrats aren’t subsidizing enough of the energy industry.  They are too tight-fisted with the public purse.  They are not ambitious enough in their planning.  And while Republicans bang the table for more, more, and more handouts to private corporations, liberals like Amory Lovins (the prominent left-of-center energy guru) and Carl Pope (former head of the Sierra Club) call for zeroing out everyone’s subsidies and leaving the energy market the heck alone (at least when it comes to this matter).  It’s a mad, mad world.
 
“Nevertheless,” says Gingrich, ”the Journal attempts to equate my career-long commitment to increased American energy production with the anti-energy agenda of President Obama. This is a laughable charge, especially considering I have been one of the most vocal opponents of the president’s energy policies since he took office.”  Perhaps, but on this matter, Gingrich is attacking the administration from the Left.  
 
Even more amusing was James Woolsey’s lecture to the editorial board over what it means to be a conservative.  “We could not help wondering,” he wrote along with his co-author, Gal Luft, ”why the Journal, despite its commitment to free enterprise, chose to attack Newt Gingrich for his call to open vehicles to fuel competition, which would cost auto makers under $100 per new car.”  Well, Jim, a commitment to free enterprise is a commitment to allow enterprises to be free to produce whatever they want.  Of course, if Woolsey had read Gingrich’s speech to the ethanol lobby, he would not need to wonder - it’s about their sick, twisted values.
 
Nonetheless, Woolsey claims that such a mandate ”is perfectly in line with conservative economic principles.”  That may be true given what conservatives believe about economics.  But it’s not consistent with a principled support for a free market.
 
Finally, “Challenging Mr. Gingrich’s conservative bona fides based on his support for breaking oil’s virtual monopoly over transportation fuel is not only myopic but also the best gift the Journal can give OPEC.”  But … oil dominates the transportation market because it is a heck of a lot cheaper than any other fuel.  If it weren’t so much cheaper than ethanol, there would be no need for such massive ethanol subsidies; the same goes for electric cars.  If and when that changes, oil’s “monopoly” will crumble.  Until then, taking oil out of transportation markets simply takes cheap fuel out of transportation markets.  It would be fun to watch a Gingrich/Woolsey ticket run on that.

Al Gore on Snowpocalypse 2011

Today POLITICO Arena asks:

Ex-VP Al Gore says the snowstorms that paralyzed much of the U.S. this week are more evidence of manmade global warming. “The scientific community has been addressing this particular question for some time now and they say that increased heavy snowfalls are completely consistent with what they have been predicting as a consequence of man-made global warming.” Do you agree?

My response:

A scientific hypothesis that’s essentially unfalsifiable – cold corroborates “global warming,” heat corroborates it, nothing really falsifies it – is worse than useless. It’s a scientific poseur, properly classified as a belief system, like religion. And the implication that there’s an optimal earth temperature, or range of temperatures, or that global warming is destructive, not possibly beneficial, is just further evidence that there’s more going on here than pure science.

Throw in beliefs about the human contributions to “global warming” and the policy recommendations that follow – massive shifts toward wildly expensive command-and-control energy systems, the effect on the world’s poor notwithstanding – and the politics of the matter come into view. Let’s remember that Al Gore, who never missed an opportunity to expand government, was once an ethanol evangelist, a posture he has recently admitted (now that ethanol has been shown to have negative environmental consequences) was connected mainly with presidential politics in Iowa. Frankly, I’ll stick with Punxsutawney Phil.

Egypt and Energy Policy

Today Politico Arena asks:

Given that crude oil prices surged to nearly $90 per barrel on Friday, and could spike even higher if the crisis causes a shutdown of the Suez Canal, how should policymakers in Washington respond regarding oil and the crisis in Egypt? Does the situation underscore a need for more domestic production? And does this crisis bolster or hamper Obama’s clean energy initiative that he called for in his State of the Union address last week?

My response:

The unrest in Egypt should have no bearing whatever on American energy policy. Like nearly every other commodity – food, clothing, shelter, education, health care – energy, from whatever source, is far more efficiently and equitably produced and distributed by the market than by governments, even when foreign governments play a part in that process. We saw that in the “energy crises” of the ’70s; we’ve seen it in every ”crisis” since.

Why, in the name of “energy independence,” should the U.S. government “promote” domestic production if foreign energy is cheaper? Do we imagine that manifold foreign producers will not supply us if the price is right? Where’s the evidence for that? Any government “promotion” should consist simply of getting out of the way and letting the market determine where energy is produced.

Nor should today’s Egyptian unrest affect Obama’s “clean energy initiative,” which should fall on its own terms. It’s nothing but a massive government intrusion into the market, subsidizing expensive sources of energy for little environmental gain, making us all poorer, but especially the poorest among us. Do we need any better example than the ethanol boondoggle, which even Al Gore has admitted is environmentally destructive? Energy policy will be an early test of whether the new House majority is serious about reducing the role of government in our lives.

Property Rights and the Takoma Park Tree Tussle

It’s enviro vs. enviro in Washington’s most “progressive” suburb, Takoma Park. Indeed, the Washington Post reports on “a potentially bough-breaking debate between sun-worshipers and tree-huggers.” That is, which is more environmentally desirable, solar power or tree cover?

The modest gray house in Takoma Park was nearly perfect, from Patrick Earle’s staunchly environmentalist point of view. It was small enough for wood-stove heating, faced the right way for good solar exposure and, most important, was in a liberal suburb that embraces all things ecological.

Or almost all. When Earle and his wife, Shannon, recently sought to add solar panels to the house, which they have been turning into a sustainability showplace, the couple discovered that Takoma Park values something even more than new energy technologies: big, old trees.

When they applied to cut down a partially rotten 50-foot silver maple that overshadowed their roof, the Earles ran into one of the nation’s strictest tree-protection ordinances. Under the law, the town arborist would approve removing the maple only if the couple agreed to pay $4,000 into a city tree-replacement fund or plant 23 saplings on their own.

So now the rival environmentalists are squaring off in front of the city council:

Takoma Park City Council members, who are considering revising the 1983 tree-protection law, listened Monday night as otherwise like-minded activists vied to claim the green high ground.

Tree partisans hailed the benefits of the leafy canopy that shades 59 percent of the town: Trees absorb carbon, take up stormwater, control erosion and provide natural cooling….

Solar advocates at the hearing said that they are tree lovers, too, but that scientific studies support the idea of poking select holes in the tree cover to let a little sun power through.

Being an environmentalist homeowner can become a full-time job:

But even some veteran solar users don’t like the idea of trading trees for panels. Mike Tidwell, founder of the Chesapeake Climate Action Network, installed solar panels on his Takoma Park house 10 years ago. As the trees have grown, the panels’ effectiveness has diminished, and Tidwell now buys wind power credits to supplement them.

Still, he said, “I don’t believe you should cut down trees for solar.” Rather, he thinks neighbors should work together to place shared panels on the sunniest roofs.

The city’s “official arborist” turned down Earle’s application to cut down one rotting tree to accommodate his solar panels. Now the council is debating the issue.

The Earles’ council member, Josh Wright, said he was sympathetic to their plight. He said it should remain hard to cut down a tree, but he’d like to see a break for people installing solar power. Wright also wants all homeowners to get credit for trees they may have planted in the years before they remove a tree.

It all sounds very complicated. And who knows what the right answer is? Or if there is a right answer? Or if the right answer might change next year?

And that’s where property rights come in.  They allocate both jurisdiction and liability over scarce resources, like roofs, trees, and access to sunlight.  A little “law and economics” can help us understand the Takoma Park Tree Tussle.  Nobel laureate in economics Ronald Coase, who just turned 100, brought law and economics together to study the ways that people externalize costs (make others pay for them) or internalize them (take them into account when making decisions).  When property rights are well defined and legally secure, and rights can be exchanged at low cost, resources will be directed to their most highly valued uses.  In fact, the initial allocation of property rights doesn’t affect the final allocation of resources, so long as the rights are freely and easily tradable.
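To make that Coasean logic concrete, here is a minimal sketch with purely hypothetical numbers; the dollar values, the function, and the labels are my own illustration, not anything drawn from the Takoma Park dispute or from Coase’s writings:

```python
def outcome(value_of_sunlight, value_of_tree, right_holder):
    """Costless-bargaining outcome for a hypothetical tree-vs.-solar dispute.

    value_of_sunlight: what the homeowner gains if the tree is cut
    value_of_tree:     what the tree's defenders lose if it is cut
    right_holder:      'homeowner' or 'tree' (who holds the legal entitlement)
    Returns (tree_is_cut, description_of_any_side_payment).
    """
    cut = value_of_sunlight > value_of_tree  # the efficient decision
    if cut and right_holder == "tree":
        # the homeowner buys permission from whoever holds the right
        payment = f"homeowner pays between ${value_of_tree} and ${value_of_sunlight}"
    elif not cut and right_holder == "homeowner":
        # tree defenders pay the homeowner to leave the tree standing
        payment = f"tree defenders pay between ${value_of_sunlight} and ${value_of_tree}"
    else:
        payment = "no payment needed"
    return cut, payment

# Hypothetical numbers: sunlight worth $6,000 to the owners, the tree worth
# $4,000 to its defenders. The tree comes down under either assignment of the
# right; only the direction of the side payment changes.
print(outcome(6000, 4000, right_holder="homeowner"))  # (True, 'no payment needed')
print(outcome(6000, 4000, right_holder="tree"))       # (True, 'homeowner pays between $4000 and $6000')
```

Either way, the scarce resource (the sunny airspace) ends up in its higher-valued use; the initial assignment of the right determines only who compensates whom. But that result holds only when the right is clearly defined, legally secure, and cheap to trade.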

That, unfortunately, is no longer the case in Takoma Park, where instead of a fairly straightforward transaction (facilitated by a purchase), there is a tussle over ill-defined rights and obligations with little or no legal security, in a very costly process of negotiation that will almost certainly consume more wood pulp for memos than is contained in the tree in question.  Well-defined and legally secure property rights save us the rather substantial trouble of sitting down, like the Takoma Park City Council, to judge the advisability of every proposed purchase, all the while consuming large amounts of paper and exuding large amounts of hot air.

The Traffic Congestion Problem

A new report says that traffic congestion is worse, and the American Public Transportation Association urges Congress to … spend more money on public transportation.

Cato senior fellow Randal O’Toole has been challenging the received wisdom on traffic and mass transit for years. See his book Gridlock: Why We’re Stuck in Traffic and What to Do About It, and lots of other studies. In November he debated the head of the American Public Transportation Association at a Cato Policy Forum: