Topic: Government and Politics

Federal Nuclear Cleanup: $150 Billion

Cleaning up the government’s nuclear weapons sites has become a vast sinkhole for taxpayer dollars. The Department of Energy (DOE) spends about $6 billion a year on environmental cleanup of federal nuclear sites. These sites were despoiled in the decades following World War II with little notice taken by Congress. Then, during the 1980s, a series of reports lambasted DOE for its lax safety and environmental standards, and federal policies began to change.

Since 1990, federal taxpayers have paid more than $150 billion to clean up the mess from the government’s nuclear sites, based on my calculations. Unfortunately, many more billions will likely be needed in coming years, partly because DOE management continues to be so poor.

A 2003 GAO report (GAO-03-593) found that “DOE’s past efforts to treat and dispose of high-level waste have been plagued with false starts and failures.” And a 2008 GAO report (GAO-08-1081) found that 9 out of 10 major cleanup projects “experienced cost increases and schedule delays in their life cycle baseline, ranging from $139 million for one project to more than $9 billion for another.”

The largest of the nuclear cleanup sites is Hanford in Washington State. One facility at the site has ballooned in cost from $4.3 billion in 2000 to $13.4 billion today (GAO-13-38). Overall, $19 billion has been spent cleaning up the Hanford site since 1989, and the effort continues to face huge problems (GAO-15-354).

The Washington Post reported yesterday:

A nearly completed government facility intended to treat the radioactive byproducts of nuclear weapons production is riddled with design flaws that could put the entire operation at risk of failure, according to a leaked internal report.

A technical review of the treatment plant on the grounds of the former Hanford nuclear site identified hundreds of “design vulnerabilities” and other weaknesses, some serious enough to lead to spills of radioactive material.

The draft report is the latest in a series of blows to the clean-up effort at Hanford, the once-secret government reservation in eastern Washington state where much of the nation’s plutonium stockpile originated. Engineers have struggled for years to come up with a safe method for disposing of Hanford’s millions of gallons of high-level radioactive waste, much of which is stored in leaky underground tanks.

Obviously this is a complex task, but Robert Alvarez, a former Clinton administration DOE official, told the newspaper that DOE:

“has proven to be incapable of managing a project of this magnitude and importance,” Alvarez said. “The agency has shown a long-standing intolerance for whistleblowers while conducting faith-based management of its contractors regardless of poor performance. This has bred a culture in which no safety misdeed goes unrewarded.”

Solyndra: A Case Study in Green Energy, Cronyism, and the Failure of Central Planning

Back in 2011 I wrote several times about the failure of Solyndra, the solar panel company that was well connected to the Obama administration. Then, as with so many stories, the topic passed out of the headlines and I lost touch with it. Today, the Washington Post and other papers bring news of a newly released federal investigative report:

Top leaders of a troubled solar panel company that cost taxpayers a half-billion dollars repeatedly misled federal officials and omitted information about the firm’s financial prospects as they sought to win a major government loan, according to a newly-released federal investigative report.

Solyndra’s leaders engaged in a “pattern of false and misleading assertions” that drew a rosy picture of their company enjoying robust sales while they lobbied to win the first clean energy loan the new administration awarded in 2009, a lengthy investigation uncovered. The Silicon Valley start-up’s dramatic rise and then collapse into bankruptcy two years later became a rallying cry for critics of President Obama’s signature program to create jobs by injecting billions of dollars into clean energy firms.

And why would it become such a rallying cry for critics? Well, consider the hyperlink the Post inserted at that point in the article: “[Past coverage: Solyndra: Politics infused Obama energy programs]” And what did that article report?

A Transparency Milestone

This week, I reported at the Daily Caller (and got a very nice write-up) about a minor milestone in the advance of government transparency: We recently finished adding computer-readable code to every version of every bill in the 113th Congress.

That’s an achievement. More than 10,000 bills were introduced in Congress’s last-completed two-year meeting (2013-14). We marked up every one of them with additional information.

We’ve been calling the project “Deepbills” because it allows computers to see more deeply into the content of federal legislation. We added XML-format codes to the texts of bills, revealing each reference to federal agencies and bureaus, and to existing laws no matter how Congress cited them. Our markup also automatically reveals budget authorities, i.e., spending.

Want to see every bill that would have amended a particular title or section of the U.S. code? Deepbills data allows that.

Want to see all the bills that referred to the Administration on Aging at HHS? Now that can be done.

Want to see every member of Congress who proposed a new spending program and how much they wanted to spend? Combining Deepbills data with other data allows you to easily collect that important information.
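To make the idea concrete, here is a minimal sketch of how a query like the agency one above might work against the marked-up bills. The element, attribute, and namespace names below are illustrative assumptions rather than the exact published Deepbills schema; the point is simply that once references are tagged in XML, finding them takes only a few lines of code.

```python
# Sketch: pull agency references out of a Deepbills-style marked-up bill.
# The tag, attribute, and namespace names are assumptions for illustration.
import pathlib
import xml.etree.ElementTree as ET

CATO_NS = "{http://namespaces.cato.org/catoxml}"  # assumed namespace URI

def agency_references(bill_xml_path):
    """Yield the text of every tagged federal-body reference in one bill."""
    root = ET.parse(bill_xml_path).getroot()
    for ref in root.iter(CATO_NS + "entity-ref"):
        if ref.get("entity-type") == "federal-body":
            yield "".join(ref.itertext())

if __name__ == "__main__":
    # Print each bill file that mentions a particular agency.
    for path in pathlib.Path("bills").glob("*.xml"):  # assumed local folder of bill XML
        if any("Administration on Aging" in r for r in agency_references(path)):
            print(path.name)
```

The same pattern extends to the other queries: references to U.S. Code sections or to budget authority would simply be different tagged elements to iterate over.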

Tree-ring Temperature Reconstructions May Have Masked Prior Warmth

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

Proxy temperature records serve a significant purpose in the global warming debate: they provide a reality check against the claim that current temperatures are unprecedentedly warm in the context of the past one to two thousand years. If past temperatures can be shown to have been just as warm as, or warmer than, they are presently, the hypothesis of a large CO2-induced global warming is weakened, raising the possibility that current temperatures are influenced to a much greater degree by natural climate oscillations than by rising atmospheric CO2.

Tree-ring data are one of the most commonly used sources of proxy temperatures. Yet, as with any substitute, proxy temperatures derived from tree-ring data do not perfectly match standard thermometer-based measurements, and the calculations and methods are therefore not without challenge or controversy. For example, many historic proxies rest on a dwindling number of trees the further the proxy extends back in time. Additionally, some proxies mix material from different trees and pool it prior to mass spectrometer measurement, which limits the ability to discern long-term climate signals among individual trees. Though it has the potential to significantly influence a proxy record, this latter practice has received little attention in the literature, until now.

In an intriguing new study, Esper et al. (2015) recognize this deficiency, noting that “climate reconstructions derived from detrended tree-ring δ13C data, in which δ13C level differences and age-trends have been analyzed and, if detected, removed, are largely missing from the literature.” Thus, they set out to remedy the situation by developing “a millennial-scale reconstruction based on decadally resolved, detrended, δ13C measurements, with the climate signal attributed to the comparison of annually resolved δ13C measurements with instrumental data.” They then compared their new proxy with proxies derived from a more common, but presumably inferior, method based on maximum latewood density (MXD) data. The study site was near Lake Gerber (42.63°N, 1.1°E) in the Spanish Pyrenees, at the upper treeline (2,400 m).
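For readers unfamiliar with the mechanics, here is a bare-bones illustration of what age-trend removal involves. This is not Esper et al.’s actual procedure (they work with decadally pooled δ13C series and more careful trend tests); it simply fits and subtracts a linear trend against tree age so that what remains can be compared with instrumental data.

```python
# Bare-bones illustration of age-trend removal ("detrending") from a
# proxy series; a simplification, not Esper et al.'s actual method.
import numpy as np

def detrend_by_age(age_years, d13c):
    """Fit a linear trend of d13C against tree age and return the residuals.

    Real studies test whether a trend exists first and may fit nonlinear
    curves; a straight line is the simplest stand-in for that step.
    """
    slope, intercept = np.polyfit(age_years, d13c, deg=1)
    return d13c - (slope * age_years + intercept)

# Toy usage on a synthetic series: an imposed age trend plus a small
# oscillating "climate signal" and noise.
rng = np.random.default_rng(0)
age = np.arange(1, 301, dtype=float)              # tree age in years
signal = 0.5 * np.sin(age / 30.0)                 # pretend climate signal
series = -22.0 - 0.004 * age + signal + rng.normal(0, 0.1, age.size)
residuals = detrend_by_age(age, series)           # age trend removed
```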

Hurricane Katrina: Remembering the Federal Failures

Ten years ago this week, Hurricane Katrina made landfall on the Gulf Coast and generated a huge disaster. The storm flooded New Orleans, killed more than 1,800 people, and caused $100 billion in property damage. The storm’s damage was greatly exacerbated by the failures of Congress, the Bush administration, the Federal Emergency Management Agency (FEMA), and the Army Corps of Engineers.

Weather forecasters warned government officials about Katrina’s approach, so they should have been ready for it. But they were not, and Katrina exposed major failures in America’s disaster preparedness and response systems.

Here are some of the federal failures:

  • Confusion. Key federal officials were not proactive, they gave faulty information to the public, and they were not adequately trained. The 2006 bipartisan House report on the disaster, A Failure of Initiative, said, “federal agencies … had varying degrees of unfamiliarity with their roles and responsibilities under the National Response Plan and National Incident Management System.” The report found that there was “general confusion over mission assignments, deployments, and command structure.” One reason was that FEMA’s executive suites were full of political appointees with little disaster experience.
  • Failure to Learn. The government was unprepared for Katrina even though it was widely known that such a hurricane was probable, and weather forecasters had accurately predicted the advance of Katrina before landfall. A year prior to Katrina, government agencies had performed a simulation exercise—“Hurricane Pam”—for a hurricane of similar strength hitting New Orleans, but governments “failed to learn important lessons” from the exercise.
  • Communications Breakdown. The House report found that there was “a complete breakdown in communications that paralyzed command and control and made situational awareness murky at best.” Agencies could not communicate with each other due to equipment failures and a lack of system interoperability. These problems occurred despite the fact that FEMA and predecessor agencies have been giving grants to state and local governments for emergency communication systems since the beginning of the Cold War.
  • Supply Failures. Some emergency supplies were prepositioned before the storm, but there were nowhere near enough. In places that desperately needed help, such as the New Orleans Superdome, it took days to deliver medical supplies. FEMA also wasted huge amounts of supplies. It delivered millions of pounds of ice to holding centers in cities far away from the Gulf Coast, and it sent truckers carrying ice on wild goose chases across the country. Two years after the storm, the agency ended up throwing out $100 million of unused ice. FEMA also paid $900 million for 25,000 mobile homes, but they went virtually unused because of FEMA’s own regulations barring such homes from flood plains, which is where most Katrina victims lived.
  • Indecision. Indecision plagued government leaders in the deployment of supplies, in medical personnel decisions, and in other areas. Even the grisly task of body recovery after Katrina was slow and confused. Bodies went uncollected for days “as state and federal officials remained indecisive on a body recovery plan.” FEMA waited for Louisiana to make decisions about bodies, while the governor of Louisiana blamed FEMA’s tardiness in making a deal with a contractor. Similar problems of too many bureaucratic cooks in the kitchen hampered decisionmaking in areas such as organizing evacuations and providing law enforcement resources to Louisiana.

A Very Simple Plan to Balance the Budget by 2021

Earlier this month, Americans for Prosperity held a “Road to Reform” event in Las Vegas.

I got to be the warm-up speaker and made two simple points.

First, we made a lot of fiscal progress between 2009 and 2014 because various battles over debt limits, shutdowns, and sequestration actually did result in real spending discipline.

Second, I used January’s 10-year forecast from the Congressional Budget Office to explain how easy it would be to balance the budget with a modest amount of future spending restraint.
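The mechanics behind that second point are simple compound growth: if revenues grow faster than spending is allowed to grow, the gap closes on its own. Here is a minimal sketch of the arithmetic; the dollar levels and growth rates are round placeholders chosen for illustration, not the actual figures from CBO’s January forecast.

```python
# Illustrative arithmetic only: the levels and growth rates below are
# placeholder assumptions, not the actual CBO baseline numbers.
def year_budget_balances(revenue, spending, rev_growth, spend_cap, start_year):
    """Project both totals forward until revenue catches up with spending."""
    year = start_year
    while spending > revenue:
        revenue *= 1 + rev_growth
        spending *= 1 + spend_cap
        year += 1
    return year

# Example: with revenue growing faster than capped spending, the
# budget balances within several years.
print(year_budget_balances(revenue=3.2, spending=3.7,   # trillions (assumed)
                           rev_growth=0.055, spend_cap=0.025,
                           start_year=2015))            # prints 2021
```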

Here’s my speech:

The Feds into Everything

Our hyperactive, grasping federal government has inserted its wasteful, probing fingers into just about everything these days.

I hadn’t been to an eye doctor in a while, and so when I went recently I was surprised to be presented with these two forms:

The first form claims that electronic transmission of prescriptions “helps protect the privacy of your personal information.” That strikes me as plainly false: an old-fashioned piece of paper with my eye information couldn’t get hacked on the Internet and wouldn’t be sent to the government. The form lists the supposed benefits of e-prescribing to the patient. On net, the benefits may indeed outweigh the costs, but if so, we wouldn’t need a federal mandate to bring it about.

Like many Americans, I find the second form regarding race rather offensive. It would be one thing if university researchers were surveying a sample of patients for such information in order to study eye diseases that may vary by personal characteristics. But reading between the lines on this form, it appears the government is collecting the information not for medical research, but essentially for socialist planning purposes.