Topic: Energy and Environment

Multi-Millionaire Singer Proposes Toilet Paper Restrictions

Al Gore’s histrionics are amusing, but nothing he has said compares to Sheryl Crow’s proposal to restrict how much toilet paper can be used. Perhaps there can be a new monitoring bureaucracy to search our homes. Maybe government agencies can stand guard in public restrooms. The BBC reports on the latest in cutting-edge environmentalism:

Singer Sheryl Crow has said a ban on using too much toilet paper should be introduced to help the environment. … The 45-year-old…has just toured the US on a biodiesel-powered bus to raise awareness about climate change. … The pair targeted 11 university campuses to persuade students to help combat the world’s environmental problems. … “I have spent the better part of this tour trying to come up with easy ways for us all to become a part of the solution to global warming,” Crow wrote. … “I propose a limitation be put on how many squares of toilet paper can be used in any one sitting.”

Crow’s publicists managed to get the BBC to reference her biodiesel bus, but her environmental bona fides do not stand up under closer scrutiny. Thesmokinggun.com exposes the demands she makes when going on tour:

The rock star’s performance contract includes specific day-to-day instructions on what kind of booze Sheryl needs in her dressing room (TSG has never seen such attention to detail in any other concert rider we’ve posted). … promoters are directed to purchase specific booze depending on what day of the week the concert falls, as the below rider excerpt reveals. Additionally, when the global warming warrior hits the road, her touring entourage (and equipment) travels in three tractor trailers, four buses, and six cars. Now that’s a carbon footprint!

More Negative Consequences of Government Intervention

Many writers, including Cato experts, have noted the negative economic consequences of ethanol subsidies. While the direct effects are bad, government intervention also has negative indirect effects. As the UK-based Times notes, the subsidies are driving up the price of corn, hurting not only poor Mexicans but also American meat buyers:

Typically, meat production in the United States rises by about 2 per cent a year, but the pressure from American ethanol producers manufacturing road fuel from corn has sent the price of maize soaring to $4 a bushel. The USDA is predicting that the 2006 corn crop will sell for an average of $3.10 a bushel at the farm gate, the highest for a decade. Faced with extortionate feed costs, cattle and poultry farmers are rearing fewer animals and slaughtering them early. That means a sudden reversal in the annual meat production gain, representing a fall of 1.7lb per person. “There is a new demand component,” Shayle Shagam, a livestock analyst at USDA, said. “Livestock producers have to bid against the ethanol industry to get supplies of corn.” The biofuel revolution’s unpleasant negative consequence was first felt south of Rio Grande, when the escalating price of corn affected a food staple. Mexico’s tortilla inflation crisis is spreading north to the heartland of rib-eye steak and chicken wings. The USDA predicts that food prices will rise by up to 3.5 per cent this year as farmers rein in output in response to feedstock costs.

The Search for a Limited-Government Candidate Continues

Newt Gingrich, who continues to campaign vigorously for president – though unofficially, so he can keep taking million-dollar donations – appeared in Washington yesterday at what was billed as a debate with John Kerry on global warming. Some conservatives, disillusioned by the prospect of choosing among Rudy Giuliani, John McCain, and Mitt Romney, have looked to Gingrich as a genuinely Reaganite candidate. His performance yesterday should have dispelled those thoughts.

Instead of disagreeing with Kerry, Gingrich said that global warming is a problem and that “we should address it very actively.” He raved about Kerry’s book on the environment. He refused even to disagree with Kerry over the urgency of government action. Perhaps most un-Reaganesquely, he declared that while he preferred tax incentives to government mandates, “I am not automatically saying that coercion and bureaucracy is not an answer.”

There’s a Republican mantra for the new century.

The Sound of No ‘Peak’ Story Popping

Last week, in a Capitol Hill press conference featuring congressmen Roscoe Bartlett (R-Md.) and Tom Udall (D-N.M.),  the Government Accountability Office unveiled a new report on the looming catastrophe the United States faces from “peak oil.” With gas prices up and environmental stories popping in the press, Bartlett, Udall, and the GAO had to be thinking they’d have a hit on their hands.

So, if a GAO report falls on Capitol Hill and the media ignores it, does it count as news?

I can find no coverage of the press conference or the report in either the New York Times or the Washington Post. The only mention of it on either of those papers’ websites is in a transcript of an online chat session with Post politics reporter Lois Romano, wherein a reader asks if the Bartlett-Udall press conference will generate buzz.  Romano’s response (in essence): What press conference?

In fairness, the report did get a bit of play: the AP moved a short story on it, and the WSJ ran a brief item. But neither story quotes anyone, and both pieces have the whiff of having been typed up quickly from a press release. In other words, the media decided the report didn’t merit any real attention.

Peak oil, if you’ve never heard the term, is the theory that oil, as a finite resource, will grow increasingly difficult and expensive to extract over time. At some point, the global extraction rate will peak and then decline because of the increasing cost and difficulty.

The GAO report investigates the theory and comes up with three scintillating conclusions (I’m paraphrasing):

(1)  The world will indeed reach an oil peak — in the next few years, or the next 15 years, or the next 35 years, or the next 70 years, or sometime in the 22nd century.

(2)  It’s currently unclear how the United States will adjust to declining production rates when they do occur.

(3)  We’re all doomed, doomed I tellz ya’!

OK, (3) is hyperbolic — but just a tiny bit.

The notion of peak oil gained currency back in the early 1970s, a little more than a decade after geophysicist Marion King Hubbert correctly predicted that Lower-48 U.S. oil production would peak around 1970. (Peak oil theory is often referred to as “Hubbert’s peak.”)
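
For readers who want to see what “Hubbert’s peak” actually looks like on paper, below is a minimal sketch of the standard logistic model usually associated with the theory: cumulative extraction follows an S-curve, so the annual production rate rises, peaks when roughly half of the ultimately recoverable oil has been pumped, and then declines. The parameter values in the sketch are made-up placeholders for illustration only, not estimates from Hubbert, the GAO, or anyone else.

```python
# Minimal, purely illustrative sketch of the logistic "Hubbert curve."
# The parameters (ultimately recoverable oil, peak year, steepness) are
# made-up placeholders, not estimates from Hubbert or the GAO report.

import math


def hubbert_production(year, ultimate_bbbl=2000.0, peak_year=2030, steepness=0.05):
    """Annual production (billions of barrels) under a logistic depletion model.

    Cumulative production Q(t) = ultimate / (1 + exp(-steepness * (t - peak_year))).
    Annual production is its derivative, which peaks in peak_year, when half
    of the ultimately recoverable oil has been extracted.
    """
    x = math.exp(-steepness * (year - peak_year))
    return ultimate_bbbl * steepness * x / (1.0 + x) ** 2


if __name__ == "__main__":
    # Production ramps up, tops out around the (assumed) peak year, then declines.
    for year in range(1990, 2071, 20):
        print(year, round(hubbert_production(year), 1))
```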

But Hubbert wasn’t the first person to come up with the concept. The notion dates back at least to 1875 (yes, 1875), when John Strong Newberry claimed the oil peak was imminent. From then on, there’ve been many versions of the same refrain: The End (of oil) is nigh.

In fairness to Newberry, Hubbert, Bartlett, Udall, and all the other “end is nigh” guys, there is some validity to their theory. At some point in the future, the rate of global oil production will max out and then begin to decline. And it’s quite possible that we may not have cheap and easy substitutes for oil when that occurs, so the world will face some significant changes. But it’s also quite possible that we’ll develop substitutes for oil long before the cost of extraction, by itself, produces an oil peak; in that case, the peak would result from our preferring – and thus shifting to – the substitutes. After all, that’s what has produced many previous natural resource shifts.

But let’s assume the former scenario plays out. Does that mean we are, indeed, doomed? And should we thus adopt the GAO report’s two policy recommendations that the U.S. government (1) carry out a massive global information-gathering effort to determine when the oil peak will occur, and (2) orchestrate a bold, unified national program to prepare for the peak oil transition to substitutes?

Let’s consider the policy recommendations first. Given the U.S. government’s track record on determining Iraq’s supply of weapons of mass destruction, how wise would it be to rely on the government to estimate the future supply of known and unknown sources of oil in Iraq, Iran, Saudi Arabia, Nigeria, Russia, Kuwait, Syria, Venezuela, China, Cuba, under the world’s oceans, etc.? How reliable would government projections be of the future technological developments that will increase our ability to access that oil? Moreover, given that the U.S. government’s only great success in developing and broadly implementing an alternative energy program is nuclear power, do we really want it orchestrating a national program for a major transition to new energy sources? (I won’t mention the risk that the government, in carrying out these policies, would “fix” its findings and efforts around various politicians’ agendas.) If we are solely dependent on government to save us from the ruination of peak oil, then we probably are doomed.

So, does this mean that we should do nothing? Quite the opposite: we should be, and already are, acting boldly on energy. There are countless scientists, engineers, business executives, economists, and others, both in the United States and abroad, exploring and developing all sorts of transition strategies and technologies to substitute for oil. Countless others, here and elsewhere, are exploring and developing strategies and technologies to extend the life of the oil we have yet to extract. And we consumers have the best (and only necessary) incentive to use those developments when it makes sense to do so: we have to pay for the oil and alternative energies that we use. Those dynamics are far broader, more powerful, and more effective than any government Great (Energy) Leap Forward would be.

Bartlett, Udall, and the GAO are correct to be thinking about peak oil. But realizing that oil will peak one day is only the beginning of a thoughtful policy discussion, not the clinching demonstration that immediate government action is necessary. The only necessary (and sufficient) government energy policy is to allow consumers, innovators and entrepreneurs the degrees of freedom to make their own energy choices and to experience the costs and benefits of those choices.

Government is not the sole enlightened, rational actor on the planet. (Some might say the word “sole” should be removed from the previous sentence.) Somehow, we need to get the politicians to discover that.

Supreme Court to EPA: Hurry Up and Wait?

Lots of news outlets have been describing the Supreme Court’s opinion in Massachusetts v. EPA along the following lines: “Supreme Court says global warming is bad; tells EPA to fix the problem.”

Is that right? Not really.

In fact, if you read between the lines of the majority’s decision, it’s not clear that it will alter EPA policy one jot or tittle.

“Regulation” under the Clean Air Act can take a number of forms. It can mean declaring aspirational emission standards, or it can take more draconian forms, such as looming technology mandates and imminent implementation deadlines, backed by tough civil and criminal penalties.

Even assuming that, after yesterday’s ruling, the EPA has to “regulate” in the sense of promulgating some GHG emission standards, the Court’s decision leaves the EPA with ample room to argue that it can defer deciding when and how to implement those standards in light of the potentially high and uncertain costs of implementation.

It’s true, of course, that some parts of the Clean Air Act prohibit the EPA from undertaking this sort of cost-benefit analysis. The parts of the CAA governing auto emission standards are, however, different. There, the EPA retains considerable discretion to weigh costs and benefits – particularly when it comes to the “when” and “how” of implementing emission controls. For example, as Justice Stevens notes, section 202(a)(2) of the CAA gives the EPA broad discretion to delay implementation of pollution controls to the extent that “the Administrator finds necessary to permit the development and application of the requisite [pollution control] technology, giving appropriate consideration to the cost of compliance within such period.” Put in plain English, that means that if the costs of developing effective pollution-reducing technologies are very large, and the payoff of this R&D lies in the far-distant future, the CAA doesn’t require the EPA to implement its standards right away.

The Court’s opinion also reaffirms the great deference owed to the EPA’s decision not to enforce any standards that it might promulgate. In the words of Justice Stevens yesterday, an “agency has broad discretion to choose how best to marshal its limited resources and personnel to carry out its delegated responsibilities.” Given the breadth of discretion granted the agency to defer implementation under provisions like section 202(a)(2), and the costs and uncertainties associated with implementation, that deference may give the EPA very substantial room to defer—perhaps for a very long time—implementation of a federal GHG enforcement regime, freeing the EPA to deal with more immediate and pressing environmental problems.

Nor is the analysis of the EPA’s leeway to delay implementation much different if, as some assume, the Court’s decision means that GHG emissions are also “pollutants” under CAA provisions dealing with “national ambient air quality standards.” True, in Whitman v. American Trucking Associations, the Court held that the EPA must set NAAQS without regard to the costs of implementation. But in his concurrence in that case, Justice Breyer suggested that even the CAA requirements governing national ambient air quality standards permit some modified cost-benefit analysis. He emphasized, for example, that when setting NAAQS, the EPA doesn’t have to eliminate “any health risk, however slight, at any economic cost, however great.” It is only required to eliminate “unacceptable” risks, defined as those that the public is not willing to tolerate at any cost.

New American car emissions account for only 6% of worldwide carbon dioxide emissions. Eliminating these emissions wouldn’t necessarily reverse global warming or even appreciably slow it – particularly given the dynamic nature of emissions in developing countries. Thus, it’s far from evident that the added global warming risks created by new American car emissions are “unacceptable” in the sense suggested by Justice Breyer. On the face of the record, it’s also far from clear that the risks posed by other GHG-emitting sources in the U.S., such as stationary sources, are any more publicly “unacceptable” in the sense meant by Breyer, given uncertainty about the payoff of unilateral American remediation and given the cost and current feasibility of GHG control technology.

Ultimately, then, the key flaw with the EPA’s decision may not have been the outcome of that decision, or even the overarching reasons given by the EPA for its decision. The fatal flaw may have been only the conclusory nature of the reasons given by the EPA for its decision. For example, the EPA said that it wouldn’t act now because effective GHG-reducing technologies weren’t feasible at present and wouldn’t be feasible in the near future. But the EPA didn’t make any effort to quantify, or otherwise support with evidence, that feasibility assessment. Instead, it offered its conclusions as facts that courts must accept at face value – something five justices weren’t willing to do. But if the EPA can supplement its feasibility conclusions with at least some evidence, it may be able to pull at least one or two justices – most likely Breyer or Kennedy – into the dissenters’ orbit.

(This post is cross-posted at ScotusBlog).

Federal Judge Orders Forest Service to Spend Millions on Nothing

Back in the 1980s, the Forest Service spent well over a billion dollars writing forest plans for each of the 100 or so national forests. Naturally, the Sierra Club and other environmental groups took many of these plans to court. After winning many of those challenges, they were stunned when the Supreme Court ruled in 1998 that the plans made no decisions. With no decisions, the plans did not constitute an “action,” so the Court said no one had standing to challenge them.

Unfortunately, no one bothered to tell Congress that the plans it had required in 1976 did nothing but spend money, so Congress still requires the agency to revise the plans every ten to fifteen years. But last year, the Bush Administration decided to dispense with about half the paperwork involved in such revisions by not requiring the forests to write separate environmental impact statements for each plan.

Though the plans do nothing, the Sierra Club and other environmental groups took this decision to court. Last week, a federal judge in California ruled that, even though the plans themselves were not an “action,” the rules for how the plans were written are an action. So the judge tossed the rules on the ground that the Forest Service had not written an environmental impact statement for them.

So we can expect the Forest Service to continue to spend hundreds of millions of dollars on paper plans that make no decisions and take no actions. Although I consider myself more of an environmentalist than a “timber beast,” I am inclined to agree with a representative of the timber industry who says this is “bureaucracy for bureaucracy’s sake.”

Full disclosure: In the 1980s and 1990s, I helped the Sierra Club and other environmental groups challenge forest plans – for what it is worth, the only challenges that were successful were ones that I was involved in. The main lesson I learned was that planning was a waste of time – the Forest Service changed tremendously between 1980 and 2000, but most of those changes were in spite of planning, not because of it.