A Transparency Milestone

This week, I reported at the Daily Caller (and got a very nice write-up) on a minor milestone in the advance of government transparency: we recently finished adding computer-readable code to every version of every bill in the 113th Congress.

That’s an achievement. More than 10,000 bills were introduced during Congress’s most recently completed two-year meeting (2013-14), and we marked up every one of them with additional information.

We’ve been calling the project “Deepbills” because it allows computers to see more deeply into the content of federal legislation. We added XML-format codes to the texts of bills, revealing each reference to federal agencies and bureaus, and to existing laws no matter how Congress cited them. Our markup also identifies budget authorities, i.e., spending.

Want to see every bill that would have amended a particular title or section of the U.S. Code? Deepbills data allows that.

Want to see all the bills that referred to the Administration on Aging at HHS? Now that can be done.

Want to see every member of Congress who proposed a new spending program and how much they wanted to spend? Combining Deepbills data with other data allows you to easily collect that important information.
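As a concrete illustration of the kind of query this markup enables, here is a minimal Python sketch that scans a directory of marked-up bill XML for references to a particular agency. The namespace URI and the element and attribute names (entity, entity-type) are assumptions for illustration only; consult the published Deepbills schema for the exact vocabulary.

```python
# Minimal sketch: find bills whose Deepbills-style markup references a given
# federal agency. The namespace URI and element/attribute names below are
# illustrative assumptions, not the published Deepbills schema.
import glob
import xml.etree.ElementTree as ET

CATO_NS = "http://namespaces.cato.org/catoxml"  # assumed namespace URI

def bills_referencing(agency_name, bill_dir="bills/113"):
    """Yield paths of bill XML files whose markup references the agency."""
    for path in sorted(glob.glob(f"{bill_dir}/*.xml")):
        tree = ET.parse(path)
        for entity in tree.iter(f"{{{CATO_NS}}}entity"):
            if (entity.get("entity-type") == "federal-body"
                    and agency_name.lower() in (entity.text or "").lower()):
                yield path
                break  # one match per bill is enough

for bill in bills_referencing("Administration on Aging"):
    print(bill)
```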

Tree-Ring Temperature Reconstructions May Have Masked Prior Warmth

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

Proxy temperature records serve an important purpose in the global warming debate – they provide a reality check on the claim that current temperatures are unprecedentedly warm in the context of the past one to two thousand years. If it can be shown that past temperatures were just as warm as, or warmer than, they are presently, the hypothesis of large CO2-induced global warming is weakened, raising the possibility that current temperatures are influenced to a much greater degree by natural climate oscillations than by rising atmospheric CO2.

Tree-ring data are one of the most commonly used sources of proxy temperatures. Yet, as with any substitute, proxy temperatures derived from tree rings do not perfectly match standard thermometer-based measurements, and the calculations and methods are therefore not without challenge or controversy. For example, many historic proxies rest on a dwindling number of trees the further back in time they extend. Additionally, some proxies pool material from different trees prior to mass spectrometer measurement, which limits the ability to discern long-term climate signals in individual trees. Though it has the potential to significantly influence a proxy record, this latter practice has received little attention in the literature – until now.

In an intriguing new study, Esper et al. (2015) acknowledge this deficiency, noting that “climate reconstructions derived from detrended tree-ring δ13C data, in which δ13C level differences and age-trends have been analyzed and, if detected, removed, are largely missing from the literature.” They set out to remedy the situation by developing “a millennial-scale reconstruction based on decadally resolved, detrended, δ13C measurements, with the climate signal attributed to the comparison of annually resolved δ13C measurements with instrumental data.” They then compared their new proxy with proxies derived from a more common, but presumably inferior, method based on maximum latewood density (MXD) data. The sampling site was near Lake Gerber (42.63°N, 1.1°E) in the Spanish Pyrenees, at the upper treeline (2,400 m).
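For readers curious what “detrending” means in practice, here is a generic, self-contained Python sketch of age-trend removal from a synthetic tree-ring series. It is a simplified stand-in for the procedure Esper et al. describe, not their actual method: real studies typically fit splines or negative-exponential curves rather than the straight line used here.

```python
# Generic illustration of "detrending": remove a fitted age-trend from a
# tree-ring series so that the residuals can carry the climate signal.
# Synthetic data; a simplified stand-in for Esper et al.'s procedure.
import numpy as np

rng = np.random.default_rng(0)
age = np.arange(1, 301)                       # ring age in years
climate = 0.3 * np.sin(2 * np.pi * age / 60)  # synthetic decadal signal
d13c = -23.0 + 0.004 * age + climate + rng.normal(0, 0.1, age.size)

# Fit and remove a linear age-trend (real studies often use splines or
# negative-exponential curves instead of a straight line).
coeffs = np.polyfit(age, d13c, deg=1)
trend = np.polyval(coeffs, age)
detrended = d13c - trend  # residuals retain the climate signal

print(f"fitted age-trend slope: {coeffs[0]:.4f} per year")
print(f"residual standard deviation: {detrended.std():.3f}")
```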

Hurricane Katrina: Remembering the Federal Failures

Ten years ago this week, Hurricane Katrina made landfall on the Gulf Coast and generated a huge disaster. The storm flooded New Orleans, killed more than 1,800 people, and caused $100 billion in property damage. The storm’s damage was greatly exacerbated by the failures of Congress, the Bush administration, the Federal Emergency Management Agency (FEMA), and the Army Corps of Engineers.

Weather forecasters warned government officials about Katrina’s approach, so officials should have been ready for it. They were not, and Katrina exposed major failures in America’s disaster preparedness and response systems.

Here are some of the federal failures:

  • Confusion. Key federal officials were not proactive, they gave faulty information to the public, and they were not adequately trained. The 2006 bipartisan House report on the disaster, A Failure of Initiative, said, “federal agencies … had varying degrees of unfamiliarity with their roles and responsibilities under the National Response Plan and National Incident Management System.” The report found that there was “general confusion over mission assignments, deployments, and command structure.” One reason was that FEMA’s executive suites were full of political appointees with little disaster experience.
  • Failure to Learn. The government was unprepared for Katrina even though it was widely known that such a hurricane was probable, and weather forecasters had accurately predicted the advance of Katrina before landfall. A year prior to Katrina, government agencies had performed a simulation exercise—“Hurricane Pam”—for a hurricane of similar strength hitting New Orleans, but governments “failed to learn important lessons” from the exercise.
  • Communications Breakdown. The House report found that there was “a complete breakdown in communications that paralyzed command and control and made situational awareness murky at best.” Agencies could not communicate with each other due to equipment failures and a lack of system interoperability. These problems occurred despite the fact that FEMA and predecessor agencies have been giving grants to state and local governments for emergency communication systems since the beginning of the Cold War.
  • Supply Failures. Some emergency supplies were prepositioned before the storm, but there were nowhere near enough. In places that desperately needed help, such as the New Orleans Superdome, it took days to deliver medical supplies. FEMA also wasted huge amounts of supplies. It delivered millions of pounds of ice to holding centers in cities far from the Gulf Coast, sending truckers carrying ice on wild goose chases across the country. Two years after the storm, the agency ended up throwing out $100 million of unused ice. FEMA also paid $900 million for 25,000 mobile homes, but they went virtually unused because of FEMA’s own regulations that such homes cannot be used on flood plains, which is where most Katrina victims lived.
  • Indecision. Indecision plagued government leaders in the deployment of supplies, in medical personnel decisions, and in other areas. Even the grisly task of body recovery after Katrina was slow and confused. Bodies went uncollected for days “as state and federal officials remained indecisive on a body recovery plan.” FEMA waited for Louisiana to make decisions about bodies, while the governor of Louisiana blamed FEMA’s tardiness in striking a deal with a contractor. Similar problems of too many bureaucratic cooks in the kitchen hampered decisionmaking in areas such as organizing evacuations and providing law enforcement resources to Louisiana.

ACLU v. Nevada Children

The American Civil Liberties Union announced today that it is filing a legal challenge against Nevada’s new education savings account program. The ACLU argues that using the ESA funds at religious institutions would violate the state’s historically anti-Catholic Blaine Amendment, which states “No public funds of any kind or character whatever…shall be used for sectarian purposes.”  

What “for sectarian purposes” actually means (beyond thinly veiled code for “Catholic schools”) is a matter of dispute. Would that prohibit holding Bible studies at one’s publicly subsidized apartment? Using food stamps to purchase Passover matzah? Using Medicaid at a Catholic hospital with a crucifix in every room and priests on the payroll? Would it prohibit the state from issuing college vouchers akin to the Pell Grant? Or pre-school vouchers? If not, why are K-12 subsidies different?

While the legal eagles mull those questions over, let’s consider what’s at stake. Children in Nevada, particularly in Las Vegas, are trapped in overcrowded and underperforming schools. Nevada’s ESA offers families much greater freedom to customize their children’s education, a freedom they appear to appreciate. Here is how Arizona ESA parents responded when asked about their level of satisfaction with the ESA program:

[Figure: Parental satisfaction with Arizona’s ESA program]

And here’s how those same parents rated their level of satisfaction with the public schools that their children previously attended:

[Figure: Parental satisfaction among Arizona ESA families with their previous public schools]

Note that the lowest-income families were the least satisfied with their previous public school and most satisfied with the providers they chose with their ESA funds.

Similar results are not guaranteed in Nevada, and there are important differences between the programs. When the survey was administered, eligibility for Arizona’s ESA was limited to families of students with special needs, who received significantly more funding than the average student (though still less than the state would have spent on them at a public school). By contrast, Nevada’s ESA program is open to all public school students, but payments to low-income families are capped at the average state funding per pupil ($5,700). Nevertheless, it is the low-income students who have the most to gain from the ESA – and therefore the most to lose from the ACLU’s ill-considered lawsuit.

TSA’s Classified “Risk-Reduction Analysis”

Last month, our friends at the Competitive Enterprise Institute filed suit against the TSA because the agency failed to follow basic administrative procedures when it deployed its notorious “strip-search machines” for use in primary screening at our nation’s airports. Four years after being ordered to do so by the U.S. Court of Appeals for the D.C. Circuit, TSA still hasn’t completed the process of taking comments from the public and finalizing a regulation setting this policy. Here’s hoping CEI’s effort helps make TSA obey the law.

Federal law requires agencies to hear from the public so that they can craft the best possible rules. Nobody believes in agency omniscience. Public input is essential to gathering the information needed to set good policies.

But an agency can’t get good information if it doesn’t share the evidence, facts, and inferences that underlie its proposals and rules. That’s why this week I’ve sent TSA a request for mandatory declassification review relating to a study that it says supports its strip-search machine policy. The TSA is keeping its study secret.

In its woefully inadequate (and still unfinished) policy proposal on strip-search machines, TSA summarily asserted: “[R]isk reduction analysis shows that the chance of a successful terrorist attack on aviation targets generally decreases as TSA deploys AIT. However, the results of TSA’s risk-reduction analysis are classified.”

Fed Officials Endorse Monetary Commission!

According to a report I have before me, straight from the U.S. Senate, prominent Federal Reserve officials, including the presidents of the Federal Reserve Banks of New York and Philadelphia, have publicly endorsed legislation that would establish a bipartisan Monetary Commission authorized “to make a thorough study of the country’s entire banking and monetary set-up,” and to evaluate various alternative reforms, including a “return to the gold coin standard.” The proposed commission would be the first such undertaking since the Aldrich-Vreeland Act established the original National Monetary Commission in 1908.

Surprised? It gets better. The same Senate document includes a letter from the Fed’s Chairman, addressed to the Senate Banking Committee, indicating that the Board of Governors itself welcomes the proposed commission. Such a commission, the letter says, “would be desirable and could be expected to form the basis for conservative legislation in this field.”

Can it be? Have Fed officials had a sudden change of heart? Have they really decided to welcome the proposed “Centennial Monetary Commission” with open arms? Is it time to break out the Dom Pérignon, or have I just been daydreaming?

Is the Bombastic Donald the Best of a Bad GOP Lot on Foreign Policy?

Donald Trump has wrecked the best-laid plans of nearly a score of “serious” Republican presidential candidates. Yet what may be most extraordinary about his campaign is that, on foreign policy at least, he may be the most sensible Republican in the race. It is the “mainstream” and “acceptable” Republicans who are most extreme, dangerous, and unrealistic.

First, the Republicans scream that the world has never been so dangerous. Yet when in history has a country been as secure as America now is from existential and even substantial threats?

Hyperbole is Trump’s stock in trade, but he has used it only sparingly on foreign policy. Referring to North Korea, for instance, he claimed: “this world is just blowing up around us.” But he used that as a justification for talking to North Korea, not going to war.