We Should Be Wary of Federal Body Camera Funds

Last week, the Department of Justice (DOJ) announced a $20 million police body camera pilot funding scheme to assist law enforcement agencies in developing body camera programs. In the wake of the killings of Michael Brown, Walter Scott, and Freddie Gray, there has been renewed debate on police accountability. Unsurprisingly, body cameras feature heavily in this debate. Yet, despite the benefits of police body cameras, we ought to be wary of federal top-down body camera funding programs, which have been supported across the political spectrum.

The $20 million program is part of a three-year, $75 million police body camera initiative, which the Obama administration announced shortly after the news that Darren Wilson, the officer who shot and killed Michael Brown in Ferguson, Missouri, would not be indicted. It is undoubtedly the case that if Wilson had been wearing a body camera, there would be fewer questions about the events leading up to and including his killing of Brown. And, while there are questions about the extent to which police body cameras prompt some “civilizing effect” on police, the footage certainly provides welcome additional evidence in investigations of police misconduct, thereby improving transparency and accountability.

Failing Aviation Administration (FAA)

The federal government operates the air traffic control (ATC) system as an old-fashioned bureaucracy, even though ATC is a high-tech business. It’s as if the government took over Apple Computer and tried to design breakthrough products. The government would surely screw it up, which is the situation today with ATC run by the Federal Aviation Administration (FAA).

The Washington Post reports:

A day after the Federal Aviation Administration celebrated the latest success in its $40 billion modernization of the air-traffic control system, the agency was hit Friday by the most scathing criticism to date for the pace of its efforts.

The FAA has frustrated Congress and been subject to frequent critical reports as it struggles to roll out the massive and complex system called NextGen, but the thorough condemnation in a study released Friday by the National Academies was unprecedented.

Mincing no words, the panel of 10 academic experts brought together by the academy’s National Research Council (NRC) said the FAA was not delivering the system that had been promised and should “reset expectations” about what it is delivering to the public and the airlines that use the system.

The “success” the WaPo initially refers to is a component of NextGen that was four years behind schedule and millions of dollars over budget. That is success for government work, I suppose.

The NRC’s findings come on the heels of other critical reports and years of FAA failings. The failings have become so routine—and the potential benefits of improved ATC so large—that even moderate politicians, corporate heads, and bureaucratic insiders now support major reforms:

“We will never get there on the current path,” Rep. Bill Shuster (R-Pa.), chairman of the House Transportation Committee, said two months ago at a roundtable discussion on Capitol Hill. “We’ve spent $6 billion on NextGen, but the airlines have seen few benefits.”

American Airlines chief executive Doug Parker added, “FAA’s modernization efforts have been plagued with delays.”

And David Grizzle, former head of the FAA’s air-traffic control division, said taking that division out of FAA hands “is the only means to create a stable” future for the development of NextGen.

The reform we need is ATC privatization. Following the lead of Canada and Britain, we should move the entire ATC system to a private and self-supporting nonprofit corporation. The corporation would cover its costs by generating revenues from customers—the airlines—which would make it more responsible for delivering results.

Here is an interesting finding from the NRC report:  “Airlines are not motivated to spend money on equipment and training for NextGen.” Apparently, the airlines do not trust the government to do its part, and so progress gets stalled because companies cannot be sure their investments will pay off. So an advantage of privatization would be to create a more trustworthy ATC partner for the users of the system.

ATC privatization should be an opportunity for Democrats and Republicans to forge a bipartisan legislative success. In Canada, the successful ATC privatization was enacted by a Liberal government and supported by the subsequent Conservative government. So let’s use the Canadian system as a model, and move ahead with ATC reform and modernization.

E-Verify in the States

Many state legislatures are proposing to expand E-Verify – a federal government-run electronic system that allows or forces employers to check the identity of new hires against a government database.  In a perfect world, E-Verify tells employers whether the new employee can legally be hired.  In our world, E-Verify is a notoriously error-prone and unreliable system.

E-Verify mandates vary considerably across states.  Currently, Alabama, Arizona, Mississippi, and South Carolina have across-the-board mandates for all employers.  The state governments of Georgia, Utah, and North Carolina force all businesses with at least 10, 15, and 25 employees, respectively, to use E-Verify.  Florida, Indiana, Missouri, Nebraska, Oklahoma, Pennsylvania, and Texas mandate E-Verify for public employees and state contractors, while Idaho and Virginia mandate E-Verify for public employees. The remaining states either have no state-wide mandates or, in the case of California, limit how E-Verify can be used by employers.

Despite E-Verify’s wide use in the states and its well-documented problems, some state legislatures are considering forcing it on every employer within their respective states.

In late April, the North Carolina House of Representatives passed a bill (HB 318) 80-39 to lower the threshold for mandated E-Verify to businesses with five or more employees.  HB 318 is now moving on to the North Carolina Senate, where it could pass.  Nevada’s AB 172 originally included an E-Verify mandate that the bill’s author removed during the amendment process. Nebraska’s LB 611 would have mandated E-Verify for all employers in the state, but it has stalled since a hostile hearing in February.

E-Verify imposes a large economic cost on American workers and employers, does little to halt unlawful immigration because it fails to turn off the “jobs magnet,” and is an expansionary threat to American liberties.  Those harms are great while the benefits are uncertain – at best.  At a minimum, state legislatures should thoroughly examine the costs and supposed benefits of E-Verify before expanding or enacting mandates.

Scott Platton helped to write this blog post.

Raise the Wage Act Is More Rhetoric than Reality

When U.S. Congressman Robert C. “Bobby” Scott (D-VA) and U.S. Senator Patty Murray (D-WA) introduced the Raise the Wage Act on April 30, they promised that their bill would “raise wages for nearly 38 million American workers.” The bill would also phase out the subminimum tipped wage and index the minimum wage to median wage growth.

With rhetorical flourish, Sen. Murray said, “Raising the minimum wage to $12 by 2020 is a key component to helping more families make ends meet, expanding economic security, and growing our economy from the middle out, not the top down.”

The fact sheet that accompanied the bill claims that passing the Raise the Wage Act would reduce poverty and benefit low-wage workers, especially minorities. Indeed, it is taken as given that the Act “would give 37 percent of African American workers a raise”—by the mere stroke of a legislative pen. It is also assumed that “putting more money into the pockets of low-wage workers stimulates consumer demand and strengthens the economy for all Americans.”

The reality is that whenever wages are artificially pushed above competitive market levels, jobs will be destroyed, unemployment will increase for lower-skilled workers, and those effects will be stronger in the long run than in the short run.  The least productive workers will be harmed the most as employers adopt techniques that require fewer low-skilled workers.  There will be less full-time employment for those workers, and their benefits will be cut over time.  That is the logic of the market price system.

Those Gruelling U.S. Tax Rates: A Global Perspective

The Tax Foundation released its inaugural “International Tax Competitiveness Index” (ITCI) on September 15th, 2014. The United States was ranked an abysmal 32nd out of the 34 OECD member countries for the year 2014. European welfare states such as Norway, Sweden, and Denmark, despite their large social welfare systems, still managed to impose less burdensome tax systems on local businesses than the U.S. The U.S. is even ranked below Italy, a country with such a pervasive tax evasion problem that the head of its Agency of Revenue (roughly equivalent to the Internal Revenue Service in the United States) recently joked that Italians don’t pay taxes because they are Catholic and hence are used to “gaining absolution.” In fact, according to the ranking, only France and Portugal have the dubious honor of operating less competitive tax systems than the United States.

The ITCI measures “the extent to which a country’s tax system adheres to two important principles of tax policy: competitiveness and neutrality.” The competitiveness of a tax system can be measured by the overall tax rates faced by domestic businesses operating within the country. In the words of the Tax Foundation, when tax rates are too high, it “drives investment elsewhere, leading to slower economic growth.” Tax competitiveness is measured from 40 different variables across five categories: consumption taxes, individual taxes, corporate income taxes, property taxes, and the treatment of foreign earnings. Tax neutrality, the other principle taken into account when composing the ITCI, refers to a “tax code that seeks to raise the most revenue with the fewest economic distortions.” This means a tax system that is fair and applied equally to all firms and industries, with no tax breaks for any specific business activity. A neutral tax system would also limit the rates of, among other things, capital gains and dividend taxes, which encourage consumption at the expense of savings and investment.
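
To make the mechanics concrete, here is a minimal sketch of how a composite index of this kind can be put together, assuming equal category weights and made-up scores. The real ITCI aggregates 40 variables with its own weighting, so this illustrates the idea rather than the Tax Foundation’s actual formula:

```python
# Hypothetical sketch of a composite tax-competitiveness score:
# score each of the five categories, then average them.
# The equal weighting and the example numbers are illustrative
# assumptions, not the Tax Foundation's methodology.

CATEGORIES = [
    "consumption_taxes",
    "individual_taxes",
    "corporate_income_taxes",
    "property_taxes",
    "treatment_of_foreign_earnings",
]

def composite_score(category_scores):
    """Average the five category scores (each assumed to be on a 0-100 scale)."""
    missing = [c for c in CATEGORIES if c not in category_scores]
    if missing:
        raise ValueError(f"missing categories: {missing}")
    return sum(category_scores[c] for c in CATEGORIES) / len(CATEGORIES)

# Example with made-up numbers for a single hypothetical country:
example = {
    "consumption_taxes": 55.0,
    "individual_taxes": 60.0,
    "corporate_income_taxes": 40.0,
    "property_taxes": 65.0,
    "treatment_of_foreign_earnings": 35.0,
}
print(round(composite_score(example), 1))  # 51.0
```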

Even the two countries that have less competitive tax regimes than the U.S. – France and Portugal – have lower corporate tax rates than the U.S., at 34.4% and 31.5%, respectively. The U.S. corporate rate, averaged across states, is 39.1%. This is the highest rate in the OECD, which has an average corporate tax rate of 24.8% across its 34 member countries. According to a report by KPMG, if the United Arab Emirates’ severance tax on oil companies were ignored, the U.S. average corporate tax rate would be the world’s highest.

Contra Shiller: Stock P/E Ratio Depends on Bond Yields, Not Historical Averages

The Wall Street Journal just offered two articles in one day touting Robert Shiller’s cyclically adjusted price/earnings ratio (CAPE).  One of them, “Smart Moves in a Pricey Stock Market” by Jonathan Clements, concludes that “U.S. shares arguably have been overpriced for much of the past 25 years.” Identical warnings keep appearing, year after year, despite being endlessly wrong.

The Shiller CAPE assumes the P/E ratio must revert to some heroic 1881-2014 average of 16.6 (or, in Clements’ account, a 1946-1990 average of 15).  That assumption is completely inconsistent with the so-called “Fed model” observation that the inverted P/E ratio (the E/P ratio, or earnings yield) normally tracks the 10-year bond yield surprisingly closely.  From 1970 to 2014, the average E/P ratio was 6.62 and the average 10-year bond yield was 6.77.

When I first introduced this “Fed Model” relationship to Wall Street consulting clients in “The Stock Market Like Bonds,” March 1991, I suggested bond yields were about to fall because a falling E/P commonly preceded falling bond yields. And when the E/P turned up in 1993, bond yields obligingly jumped in 1994.

Since 2010, the E/P ratio has been unusually high relative to bond yields, which means the P/E ratio has been unusually low.  The gap between the earnings yield and the bond yield rose from 2.8 percentage points in 2010 to a peak of 4.4 in 2012.  Recycling my 1991 analysis, the wide 2012 gap suggested the stock market thought bond yields would rise, as they did, from 1.8% in 2012 to 2.35% in 2013 and 2.54% in 2014.

On May 1, the trailing P/E ratio for the S&P 500 was 20.61, which translates into an E/P ratio of 4.85 (1 divided by 20.61). That is still high relative to a 10-year bond yield of 2.12%.   If the P/E fell to 15, as Shiller fans always predict, the E/P ratio would be 6.7 which would indeed get us close to the Shiller “buy” signal of 6.47 in 1990.  But the 10-year bond yield in 1990 was 8.4%.  And the P/E ratio was so depressed because Texas crude jumped from $16 in late June 1990 to nearly $40 after Iraq invaded Kuwait. Oil price spikes always end in recession, including 2008.
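
To make the arithmetic above concrete, here is a minimal sketch of the earnings-yield calculation and the earnings-yield-versus-bond-yield gap, using the figures quoted in the text (the function names are mine, for illustration only):

```python
# Earnings yield (E/P) is the inverse of the P/E ratio; the "Fed model"
# comparison sets it against the 10-year Treasury yield.

def earnings_yield(pe_ratio: float) -> float:
    """E/P in percent, given a P/E ratio."""
    return 100.0 / pe_ratio

def fed_model_gap(pe_ratio: float, bond_yield_pct: float) -> float:
    """Earnings yield minus 10-year bond yield, in percentage points."""
    return earnings_yield(pe_ratio) - bond_yield_pct

# Figures quoted in the text (May 1 trailing P/E and 10-year yield):
print(round(earnings_yield(20.61), 2))       # ~4.85
print(round(fed_model_gap(20.61, 2.12), 2))  # ~2.73 percentage points

# The Shiller-style reversion to a P/E of 15 would imply:
print(round(earnings_yield(15.0), 2))        # ~6.67
```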

Today’s wide 2.7-percentage-point gap between the high E/P ratio and the low bond yield will not be closed by shoving the P/E ratio back down to Mr. Shiller’s idyllic level of the 1990 recession.  It is far more likely that the gap will be narrowed by bond yields rising.

You Ought to Have a Look: Science Round Up—Less Warming, Little Ice Melt, Lack of Imagination

You Ought to Have a Look is a feature from the Center for the Study of Science posted by Patrick J. Michaels and Paul C. (“Chip”) Knappenberger. While this section will feature all of the areas of interest that we are emphasizing, the prominence of the climate issue is driving a tremendous amount of web traffic. Here we post a few of the best in recent days, along with our color commentary.

As Pope Francis focused this week on the moral issues of climate change (while largely ignoring the bigger moral issues that accompany fossil fuel restrictions), he pretty much took as a given that climate change is “a scientific reality” that requires “decisive mitigation.” Concurrently, unfolding scientific events during the week were revealing a different story.

First and foremost, Roy Spencer, John Christy, and William Braswell of the University of Alabama-Huntsville (UAH)—developers and curators of the original satellite-derived compilation of the temperature history of the earth’s atmosphere—released a new and improved version of their iconic data set. Bottom line: the temperature trend in the lower atmosphere from the start of the data (1979) through the present came in at 0.114°C/decade (compared with 0.140°C/decade in the previous data version). The new warming trend is less than half of what climate models run with increasing atmospheric carbon dioxide project to have occurred.

While the discrepancy between real world observations and climate model projections of temperature rise in the lower atmosphere has been recognized for a number of years, the question has remained as to whether the “problem” lies within the climate models or the observations. With this new data release, the trend in the UAH data now matches very closely with the trend through an independent compilation of the satellite-temperature observations maintained by a team of researchers at Remote Sensing Systems (RSS). The convergence of the observed data sets is an indication the climate models are the odd man out.

As with most long-term, real-world observations, the data are covered in warts. The challenge posed to Spencer et al. was how to splice together remotely sensed data collected from a variety of instruments carried aboard a variety of satellites in unstable orbits—and produce a product robust enough for use in climate studies. The details as to how they did it are explained as clearly as possible in this post over at Spencer’s website (although still quite a technical post). The post provides good insight as to why raw data sets need to be “adjusted”—a lesson that should be kept in mind when considering the surface temperature compilations as well. In most cases, using raw data “as is” is an inherently improper thing to do, and the types of adjustments that are applied may vary based upon the objective.

Here is a summary of the new data set and what was involved in producing it:

Version 6 of the UAH MSU/AMSU global satellite temperature data set is by far the most extensive revision of the procedures and computer code we have ever produced in over 25 years of global temperature monitoring. The two most significant changes from an end-user perspective are (1) a decrease in the global-average lower tropospheric (LT) temperature trend from +0.140 C/decade to +0.114 C/decade (Dec. ’78 through Mar. ’15); and (2) the geographic distribution of the LT trends, including higher spatial resolution. We describe the major changes in processing strategy, including a new method for monthly gridpoint averaging; a new multi-channel (rather than multi-angle) method for computing the lower tropospheric (LT) temperature product; and a new empirical method for diurnal drift correction… The 0.026 C/decade reduction in the global LT trend is due to lesser sensitivity of the new LT to land surface skin temperature (est. 0.010 C/decade), with the remainder of the reduction (0.016 C/decade) due to the new diurnal drift adjustment, the more robust method of LT calculation, and other changes in processing procedures.

Figure 1 shows a comparison of the data using the new procedures with that derived from the old procedures. Notice that in the new dataset, the temperature anomalies since about 2003 are less than those from the previous version. This has the overall effect of reducing the trend when computed over the entirety of the record.

Figure 1. Monthly global-average temperature anomalies for the lower troposphere from Jan. 1979 through March 2015 for both the old and new versions of LT. (Source: www.drroyspencer.com)

While this new version, admittedly, is not perfect, Spencer, Christy, and Braswell see it as an improvement over the old version. Note that this is not the official release, but rather a version the authors have released for researchers to examine and see if they can find anything that looks irregular that may raise questions as to the procedures employed. Spencer et al. expect a scientific paper on the new data version to be published sometime in 2016.

But unless something major comes up, the new satellite data are further evidence the earth is not warming as expected.  That means that, before rushing into “moral obligations” to attempt to alter the climate’s future course by restricting energy production, we perhaps ought to spend more time trying to better understand what it is we should be expecting in the first place.

One of the things we are told by the more alarmist crowd that we should expect from our fossil fuel burning is a large and rapid sea level rise, primarily as a result of the melting of the ice sheets that rest atop Greenland and Antarctica. All too frequently we see news stories telling tales of how the melting in these locations is “worse than we expected.” Some soothsayers even attack the United Nations’ Intergovernmental Panel on Climate Change (IPCC) for being too conservative (of all things) when it comes to projecting future sea level rise. While the IPCC projects a sea level rise of about 18–20 inches over the course of this century under its mid-range emissions scenario, a vocal minority clamors that the rise will be upwards of 3 feet and quite possibly (or probably) greater. All the while, the sea level rise over the past quarter-century has been about 3 inches.
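
For a rough sense of scale, here is a back-of-the-envelope comparison of those figures, assuming purely for illustration that the recent observed pace simply continued in a straight line over a century (the projections, of course, assume the pace accelerates):

```python
# Back-of-the-envelope comparison of observed vs. projected sea level rise,
# using the figures quoted above. The straight-line extrapolation is an
# illustrative assumption only.

observed_rise_in = 3.0   # inches, over roughly the past quarter-century
observed_years = 25.0

pace_per_century = observed_rise_in / observed_years * 100.0
print(f"Recent pace, extended linearly: {pace_per_century:.0f} inches/century")  # 12

ipcc_midrange_in = (18.0, 20.0)  # IPCC mid-range scenario, this century
vocal_minority_in = 36.0         # "upwards of 3 feet"
print(f"IPCC mid-range projection: {ipcc_midrange_in[0]:.0f}-{ipcc_midrange_in[1]:.0f} inches")
print(f"Vocal-minority claim: at least {vocal_minority_in:.0f} inches")
```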

But as recent observations do little to dissuade the hardcore believers, perhaps model results (which they are seemingly more comfortable with) will be more convincing.

A new study available this week in the journal Geophysical Research Letters is described by author Miren Vizcaino and colleagues as “a first step towards fully-coupled higher resolution simulations with more advanced physics”—basically, a detailed ice sheet model coupled with a global climate model.

They ran this model combination with the standard IPCC emissions scenarios to assess Greenland’s contribution to future sea level rise. Here’s what they found:

The [Greenland ice sheet] volume change at year 2100 with respect to year 2000 is equivalent to 27 mm (RCP 2.6), 34 mm (RCP 4.5) and 58 mm (RCP 8.5) of global mean SLR.

Translating millimeters (mm) into inches gives this answer: a projected 21st century sea level rise of 1.1 in. (for the low emissions scenario, RCP 2.6), 1.3 in. (for the low/mid scenario, RCP 4.5), and 2.3 in. (for the IPCC’s high-end emissions scenario, RCP 8.5). Some disaster.
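
The unit conversion behind those numbers is straightforward; a minimal sketch using the sea-level-rise equivalents quoted from the study above:

```python
# Convert the Greenland ice sheet sea-level-rise equivalents (mm) quoted
# from Vizcaino et al. into inches.

MM_PER_INCH = 25.4

slr_mm = {"RCP 2.6": 27, "RCP 4.5": 34, "RCP 8.5": 58}

for scenario, mm in slr_mm.items():
    print(f"{scenario}: {mm / MM_PER_INCH:.1f} in")
# RCP 2.6: 1.1 in
# RCP 4.5: 1.3 in
# RCP 8.5: 2.3 in
```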

As with any study, the authors attach some caveats:

The study presented here must be regarded as a necessary first step towards more advanced coupling of ice sheet and climate models at higher resolution, for instance with improved surface-atmosphere coupling (e.g., explicit representation of snow albedo evolution), less simplified ice sheet flow dynamics, and the inclusion of ocean forcing to Greenland outlet glaciers.

Even if these projections are off by a factor of 3 or 4, Greenland ice loss doesn’t seem to be much of a threat. It seems like it’s time to close the book on this imagined scare scenario.

And while imagination runs wild when it comes to linking carbon dioxide emissions to calamitous climate changes and extreme weather events (or even war and earthquakes),  imagination runs dry when it comes to explaining non-events (except when non-events string together to produce some sort of negative outcome [e.g., drought]).

Case in point: a new study looking into the record-long absence of major hurricane (category 3 or higher) strikes on the U.S. mainland—an absence that now exceeds nine years (the last major hurricane to hit the U.S. was Hurricane Wilma in late October 2005). The authors of the study, Timothy Hall of NASA’s Goddard Institute for Space Studies and Kelly Hereid of ACE Tempest Reinsurance, concluded that while a streak this long is rare, their results suggest “there is nothing unusual underlying the current hurricane drought. There’s no extraordinary lack of hurricane activity.” Basically, they concluded that it’s “a case of good luck” rather than “any shift in hurricane climate.”

That is all well and good, and almost certainly the case. Of course, the same was true a decade ago when the United States was hit by seven major hurricanes over the course of two hurricane seasons (2004 and 2005)—an occurrence that spawned several prominent papers and endless discussion pointing the finger squarely at anthropogenic climate change. And the same is true for every hurricane that hits the United States, although this doesn’t stop someone, somewhere, from speculating to the media that the storm’s occurrence was “consistent with” expectations from a changing climate.

What struck us as odd about the Hall and Hereid paper is the lack of speculation as to how the ongoing record “drought” of major hurricane landfalls in the United States could be tied in with anthropogenic climate change. You can rest assured—and history will confirm—that if we had been experiencing a record run of hurricane landfalls, researchers would be falling all over themselves to draw a connection to human-caused global warming.

But the lack of anything bad happening? No way anyone wants to suggest that is “consistent with” expectations. According to Hall and Hereid:

A hurricane-climate shift protecting the US during active years, even while ravaging nearby Caribbean nations, would require creativity to formulate. We conclude instead that the admittedly unusual 9-year US Cat3+ landfall drought is a matter of luck. [emphasis added]

Right! A good string of weather is “a matter of luck” while bad weather is “consistent with” climate change.

It’s not as if it’s very hard, or (despite the authors’ claim) requires much “creativity,” to come up with ways to construe a lack of major hurricane strikes on U.S. soil as “consistent with” anthropogenic climate change. In fact, there is plenty of material in the scientific literature that could be used to construct an argument that, under global warming, the United States should experience fewer hurricane landfalls. For a rundown, see p. 30 of our comments on the government’s National Assessment on Climate Change, or check out our piece titled “Global Savings: Billion-Dollar Weather Events Averted by Global Warming.”

It is not a lack of material, but rather a lack of desire, that keeps folks from drawing a potential link between human-caused climate change and good things occurring in the world.

References:

Hall, T., and K. Hereid. 2015. “The Frequency and Duration of US Hurricane Droughts.” Geophysical Research Letters, doi:10.1002/2015GL063652.

Vizcaino, M., et al. 2015. “Coupled Simulations of Greenland Ice Sheet and Climate Change up to AD 2300.” Geophysical Research Letters, doi:10.1002/2014GL061142.