Topic: Energy and Environment

Was Typhoon Haiyan the Most Intense Storm in Modern History?

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

Global warming buffs have been fond of claiming that the roaring winds of Typhoon Haiyan were the highest ever measured in a landfalling tropical cyclone, and that therefore (?) this is a result of climate change. In reality, it’s unclear whether or not it holds the modern record for the strongest surface wind at landfall. 

This won’t be known until there is a thorough examination of its debris field.

The storm of record is 1969 Hurricane Camille, which I rode out in an oceanfront laboratory about 25 miles east of the eye. There’s a variety of evidence arguing that Camille will retain her crown.

The lowest pressure in Haiyan was 895 millibars, or 26.42 inches of mercury. To give an idea, the needle on your grandmother’s dial barometer would have to turn two complete counterclockwise circles to get there. While there have been four storms in the Atlantic in the modern era that have been as strong or a bit stronger, the western Pacific sees one of these roughly once every two years.
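For readers who want to check that figure, here is a minimal sketch of the conversion, using the standard factor of 33.8639 millibars per inch of mercury (the small difference from the article’s 26.42 is rounding):

```python
# Minimal sketch: converting Haiyan's quoted central pressure from
# millibars (hPa) to inches of mercury.
MB_PER_INHG = 33.8639  # standard conversion factor

haiyan_mb = 895.0
print(f"{haiyan_mb:.0f} mb = {haiyan_mb / MB_PER_INHG:.2f} inHg")
# 895 mb = 26.43 inHg
```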

Camille’s lowest pressure was a bit higher, at 905 mb (26.72 inches). At first blush it would therefore seem Haiyan would win the blowhard award hands down, but Haiyan had a very large eye around which its winds swirled, while Camille’s was one of the smallest ever measured. At times in its brief life, Camille’s eye was so small that the hurricane hunter aircraft could not safely complete a 360-degree turn without brushing through the devastating innermost cloud band, something you just don’t want to be near in a turning aircraft. In fact, the last aircraft to get into Camille, which measured 190 mph sustained winds, lost an engine in the severe turbulence and fortunately was able to limp home.

Haiyan’s estimated 195 mph winds were derived from satellite data, rather than being directly sensed by an aircraft. But winds over the open ocean are always greater than those at landfall because of friction, and the five mph difference between the two storms is physically meaningless.

Victory for Cato: Feds Now Seeking Input on the Social Cost of Carbon

It’s about time!

For months, we have been hammering away at the point that the Feds’ current determination of the social cost of carbon is grossly out of touch with the relevant scientific literature and economic guidance.

Perhaps in response to the fact that they can’t argue against what we have been saying, the Administration has finally capitulated and is opening up their determination of the social cost of carbon (SCC) for public comment.

Their SCC calculation—in keeping with the playbook of the president’s Climate Action Plan—is a backdoor way of implementing a carbon tax. And it is slowly, pervasively, and worst of all, silently, creeping into all of our lives. We’ve been trying to stop all of this by, at the very least, pulling back the cloak of secrecy and trying to make this once-esoteric subject a topic of dinnertime conversation.

Meanwhile, the government’s regulatory push using the SCC continues.

The Institute for Energy Research has recently identified nearly 30 federal regulations that have incorporated the SCC into their cost-benefit analyses (and several more have been recently announced).

The SCC is used to make regulations seem less costly. We say “seem” because the “benefit” from reducing carbon dioxide (CO2) emissions, as valued by the SCC, is likely never to be realized by the American consumer—yet the other costs (such as increased manufacturing costs) most assuredly will be.

The SCC is a theoretical cost imposed by each additional ton of CO2 emitted. But the theory is so loosey-goosey that with a little creativity, you can arrive at pretty much any value for the SCC—a point noted by M.I.T.’s Robert Pindyck in an article for the Summer 2013 edition of Cato’s Regulation.
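To see just how loosey-goosey, consider a minimal sketch in which the SCC is treated as the discounted present value of a stream of future damages per ton. The damage figures below are purely hypothetical; the point is only that the same damages yield very different “costs” depending on the discount rate you pick:

```python
# Hypothetical illustration: the SCC as the discounted present value of
# future damages per ton of CO2. The damage stream is made up; what
# matters is the sensitivity of the result to the discount rate.
def scc(damages, rate):
    """Present value of a stream of annual damages (dollars per ton)."""
    return sum(d / (1 + rate) ** t for t, d in enumerate(damages, start=1))

damages = [1.0] * 100  # hypothetical: $1/ton/year of damages for a century

for rate in (0.025, 0.03, 0.05, 0.07):
    print(f"discount rate {rate:.1%}: SCC = ${scc(damages, rate):.2f}/ton")

# discount rate 2.5%: SCC = $36.61/ton
# discount rate 3.0%: SCC = $31.60/ton
# discount rate 5.0%: SCC = $19.85/ton
# discount rate 7.0%: SCC = $14.27/ton
```

Identical damages are “worth” more than two and a half times as much at 2.5 percent as at 7 percent, and padding the damage stream scales the answer further still.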

As the Obama Administration wants to regulate away as many carbon dioxide emissions as possible, it is in its own self-interest to try to arrive at the highest SCC value possible.  This way, the more that CO2 emissions are reduced, the more money is “saved.”

Or so the idea goes.

But their path towards a high SCC is one away from both the best science and the most common-sense economics.

We imagine that readers of this blog are probably well aware of the details behind this reality, as we have laid them out on many occasions, so we won’t go into them again here.

Instead, we want to point out several opportunities to draw further attention to the shortcomings in the Administration’s SCC determination.

The period for accepting public comments on several proposed rulemakings is open, which provides a good opportunity to remind the issuing agency of what it did wrong. For example, here is a recently announced regulation proposal from the Department of Energy (DoE) that seeks to impose higher energy-efficiency rules for residential furnace fans. It employs the SCC to make the rule seem a lot sweeter than it actually is.

We have already submitted comments on several of these proposed regulations, including DoE regulations to increase the efficiency standards for Microwave Ovens, Walk-In Freezers, and Commercial Refrigeration Equipment.

So it’s significant that the White House’s Office of Management and Budget (OMB) just announced that the social cost of carbon determination currently in force will be open to public comment starting sometime in the presumably near future (keep an eye on the Federal Register for the official announcement).

While it is too early to tell, this willingness to hear public comments on the SCC probably originated from the comments received on the Petition to Reconsider the proposed Microwave Oven ruling—the first rulemaking to incorporate the Administration’s latest, and worst, iteration of the SCC (about a 50% increase over the original figure). There hasn’t been an official announcement as to the result of the Petition, but the scientific argument against the ruling is a Cato product.

More than likely, though, this will all be for show. The feds could selectively use some comments and somehow find a way to raise the SCC even further. Like we said, that’s easy to do—crank down the discount rate, or crank up the damage function (make up new damages not included in the current models)—even while paying lip service to the lowered equilibrium climate sensitivity and the CO2 fertilization effect.

We’d be more than happy to be wrong about this. But until then, our efforts to set things straight will continue.

Energy Subsidies vs. Energy Progress

If we did a poll of free market economists about federal programs that are the most wasteful and ridiculous, energy subsidies would be near the top of the list. It’s not just that energy subsidies make no sense in economic theory, but also that there are so many news stories highlighting the folly that it’s hard to see why policymakers persist in wasting our money.

From the Washington Post on Friday:

The Department of Energy failed to disclose concerns about a green-technology company that won $135 million in federal funding but ended up filing for bankruptcy in September, according to a watchdog report released this week. DOE Inspector General Gregory Friedman noted that the firm, San Francisco-based Ecotality, is still due to receive $26 million from the agency for testing electric vehicles.

The Energy Department awarded the firm $100 million in 2009 Recovery Act funding for that initiative, in addition to a combined $35 million from a separate program to help pay for testing vehicles.

Ecotality is among a number of failed firms that received stimulus funding through an Obama administration initiative to support green-technology companies during the recession. Solyndra, a Silicon Valley-based solar-panel maker, stands as perhaps the most high-profile example. The business collapsed after receiving more than a half-billion dollars in Recovery Act money. Other examples include Beacon Power, a Massachusetts-based company that received at least $39 million from the federal government, along with Michigan-based battery manufacturers LG Chem and A123, which landed grants worth $150 million and $249 million, respectively.

On Sunday, the Washington Post profiled the economic chaos, central planning, and wasteful lobbying generated by federal mandates for cellulosic ethanol:

Congress assumed that it could be phased in gradually, but not this gradually. This year refiners were supposed to mix about one billion gallons of it into motor fuel. So far, there has been hardly a drop. More than a dozen companies have tried and failed to find a profitable formula combining sophisticated enzymes and the mundane but costly and labor-intensive job of collecting biomass.

To reach the ethanol goals set by Congress, the government came up with a byzantine implementation plan. Each gallon of renewable fuel has its own 38-character number, called a “renewable identification number,” to track its use and monitor trading. There are different types of these RINs for different biofuels, including corn-based ethanol, cellulosic ethanol and biodiesel.

In February of each year, refiners who fail to provide enough renewable fuel to the blenders who mix ethanol and gasoline must buy extra RIN certificates. When companies have extra credits for renewable fuels, the RINs can be banked and sold in later years. If there are not enough renewable fuels overall, the price of RINs rises — and provides an incentive to produce more.
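The quoted mechanism boils down to simple compliance arithmetic. Here is a minimal sketch of that accounting; it is our own simplification with hypothetical numbers, not the EPA’s actual RIN bookkeeping:

```python
# Simplified sketch of the RIN compliance math described above.
# Real RINs carry a 38-character identifier and per-category rules;
# none of that detail is modeled here.
def rins_to_buy(obligation_gal, blended_gal, banked_rins=0):
    """Extra RIN credits a refiner must purchase to cover its shortfall."""
    shortfall = obligation_gal - blended_gal - banked_rins
    return max(shortfall, 0)

# Hypothetical refiner: obligated to blend 1,000,000 gallons, actually
# blended 900,000, and holds 40,000 banked credits from earlier years.
print(rins_to_buy(1_000_000, 900_000, banked_rins=40_000))  # 60000
```

Note how banking works in the sketch: surplus credits from good years offset shortfalls later, which is what gives RINs a market price when renewable fuel runs short.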

And in a related story on ethanol, the Post found:

Five years ago, about a dozen companies were racing to start up distilleries that would produce enough cellulosic ethanol to meet the congressionally mandated target of 16 billion gallons a year by 2022 … The Agriculture Department provided a $250 million loan guarantee for the Coskata plant. Today, most of the dozen contenders have gone out of business or shelved their plans.

Federal subsidies and mandates for ethanol and other energy activities are, sadly, diverting billions of dollars of capital to uneconomic uses. That’s the bad news.

But there is good news on the energy front, which comes from far outside of Washington. The Wall Street Journal last weekend profiled “the little guys,” the market entrepreneurs, who were behind the shale energy revolution:

The experts keep getting it wrong. And the oddballs keep getting it right. Over the past five years of business history, two events have shocked and transformed the nation. In 2007 and 2008, the housing market crumbled and the financial system collapsed, causing trillions of dollars of losses. Around the same time, a few little-known wildcatters began pumping meaningful amounts of oil and gas from U.S. shale formations. A country that once was running out of energy now is on track to become the world’s leading producer.

The resurgence in U.S. energy came from a group of brash wildcatters who discovered techniques to hydraulically fracture—or frack—and horizontally drill shale and other rock. Many of these men operated on the fringes of the oil industry, some without college degrees or much background in drilling, geology or engineering.

Thank goodness for the oddballs. And thank goodness for the market system that channels the brashness into creating growth for all of us, not just the favored few getting handouts from Washington.

Transit Spending Slows Urban Growth

Contrary to the claims of many transit advocates, regions that spend more money on transit seem to grow more slowly than regions that spend less. The fastest-growing urban areas of the country tend to offer transit service mainly to people who lack access to automobiles. Urban areas that seek to provide high-cost transit services, such as trains, in order to attract people out of their cars tend to grow far more slowly.

Transit advocates often argue that a particular city or region must spend more on urban transit in order to support the growth of that region. To test that claim, I downloaded the latest historic data files from the National Transit Database, specifically the capital funding and service data and operating expenses by mode time series. These files list which urbanized area each transit agency primarily serves, so it was easy to compare these data with Census Bureau population data from 1990, 2000, and 2010.

The transit data include capital and operating expenses for all years from 1991 through 2011. I decided to compare the average of 1991 through 2000 per capita expenses with population growth in the 1990s, and the average of 2001 through 2010 per capita expenses with population growth in the 2000s. In case there is a delayed response, I also compared the average of 1991 through 2000 per capita expenses with population growth in the 2000s. Although it shouldn’t matter too much, I used GNP deflators to convert all costs to 2012 dollars.
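For readers who want to replicate the comparison, here is a rough sketch of the procedure in Python. The file and column names are assumptions for illustration, not the National Transit Database’s actual layout, and the deflator series is assumed to be rebased to 2012 dollars:

```python
import pandas as pd

# Assumed inputs (hypothetical layouts, not the NTD's real file formats):
#   ntd_expenses.csv: urbanized_area, year, capital, operating
#   census_pop.csv:   urbanized_area, pop1990, pop2000, pop2010
#   gnp_deflator.csv: year, to_2012_dollars
expenses = pd.read_csv("ntd_expenses.csv")
pop = pd.read_csv("census_pop.csv").set_index("urbanized_area")
deflator = pd.read_csv("gnp_deflator.csv").set_index("year")["to_2012_dollars"]

# Convert each year's spending to 2012 dollars.
expenses["real_total"] = (
    (expenses["capital"] + expenses["operating"]) * expenses["year"].map(deflator)
)

# Average annual real spending per urbanized area, by decade.
spend_90s = expenses.query("1991 <= year <= 2000").groupby("urbanized_area")["real_total"].mean()
spend_00s = expenses.query("2001 <= year <= 2010").groupby("urbanized_area")["real_total"].mean()

# Per capita spending (start-of-decade population) vs. decadal growth.
pop["spend_90s_pc"] = spend_90s / pop["pop1990"]
pop["spend_00s_pc"] = spend_00s / pop["pop2000"]
pop["growth_90s"] = pop["pop2000"] / pop["pop1990"] - 1
pop["growth_00s"] = pop["pop2010"] / pop["pop2000"] - 1

# A negative correlation is consistent with the claim that higher
# transit spending goes with slower population growth.
print(pop["spend_90s_pc"].corr(pop["growth_90s"]))
print(pop["spend_00s_pc"].corr(pop["growth_00s"]))
```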

MagLev: The Idea Whose Time Never Came

Superconducting magnetic levitation is the “next generation of transportation,” says a new rail advocacy group that calls itself The Northeast Maglev (TNEM). The group’s proposed New York-Washington maglev line has received attention from the Washington Post and Baltimore Sun. TNEM’s claims might have seemed valid 80 years ago, when maglev trains were first conceived, but today maglev is just one more superexpensive technology that can’t compete with what we already have.

Superconducting maglev train being tested in Japan. Wikimedia Commons photo by Yosemite.

Maglev has all the defects of conventional high-speed rail with the added bonuses of higher costs and greater energy requirements. Unlike automobiles on roads, rails don’t go where you want to go when you want to go there. Compared with planes, even the fastest trains are slow, and modest improvements in airport security would do far more to speed travelers, at a far lower cost, than building expensive new rail infrastructure.

Current Wisdom: Observations Now Inconsistent with Climate Model Predictions for 25 (going on 35) Years

The Current Wisdom is a series of monthly articles in which Patrick J. Michaels and Paul C. “Chip” Knappenberger, from Cato’s Center for the Study of Science, review interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.  

Question: How long will the fantasy that climate models are reliable indicators of the earth’s climate evolution persist in the face of overwhelming evidence to the contrary?

Answer: Probably for as long as there is a crusade against fossil fuels.  

Without the exaggerated alarm conjured from overly pessimistic climate model projections of climate change from carbon dioxide emissions, fossil fuels—coal, oil, gas—would regain their image as the celebrated agents of prosperity that they are, rather than being labeled as pernicious agents of our destruction.

Just how credible are these climate models?  

In two words, “they’re not.”  

Everyone has read that over the past 10-15 years, most climate models’ forecasts of the rate of global warming have been wrong. Most predicted that a hefty warming of the earth’s average surface temperature would have taken place by now, while in the real world there has been no significant change.

But very few people know that the same situation has persisted for 25, going on 35, years, or that over the past 50-60 years (since the middle of the 20th century), the same models expected about 33 percent more warming to have taken place than was observed.

Thanks to Natural Gas and Climate Change, U.S. Carbon Dioxide Emissions Continue Downward Trend

Carbon dioxide emissions in the United States from the production and consumption of energy have been on the decline since about 2005, after generally rising ever since the country was founded.

The decline in emissions between 2011 and 2012 was 3.8 percent, which, according to the Energy Information Administration (EIA), was the largest decline in a non-recession year since 1990 and the first time that carbon dioxide (CO2) emissions fell while per capita economic output increased by more than 2 percent. In other words, we are producing more while emitting less carbon dioxide.