Cato’s congressional trade votes database now includes votes from last year on major trade bills and amendments in both houses of Congress. The purpose of the database is to educate the public about the trade policy preferences of individual members. We do that by recording their votes on major trade bills and amendments and using the data to map a broader ideological profile.
Whether a particular member qualifies as a free trader, an isolationist, an internationalist, or an interventionist based on our methodology depends on their support for (or opposition to) trade barriers and subsidies.
In previous years, the farm bill and its various amendments have provided a treasure trove of vote data to pin down members’ proclivities on specific commodities and willingness to use public money to distort the economy for the benefit of select cronies. This year was no different, except that House votes on the full farm bill package have been excluded. Those votes hinged almost entirely on the issue of food stamps, and because the purpose of the database is to reveal members’ trade policy positions, including them would be inappropriate.
That doesn’t mean, of course, that you shouldn’t be dismayed by Republicans who, after successfully removing food stamps from the bill so that productive debate could be had on reforming farm programs, nevertheless voted en masse to continue our Soviet‐style agriculture policy with no significant change.
The new votes on the site include the Senate farm bill, failed votes in both houses to reform the sugar program, an amendment to avoid protectionist regulations on imported olive oil, an extension of “Buy American” policies in government procurement, and a continuation of export marketing subsidies for wealthy agribusiness.
I encourage you to check out the site, read up on our unique methodology, and find out just how protectionist your favorite (or least favorite) member of Congress really is.
The Current Wisdom is a series of monthly articles in which Patrick J. Michaels and Paul C. “Chip” Knappenberger, from Cato’s Center for the Study of Science, review interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.
With all the stern talk about global warming and widespread concern over climate change, you would think that we humans would have a propensity for cooler temperatures. Everywhere you look, the misery that rising temperatures (and the associated evils) will supposedly heap upon us seems to dominate reports about the coming climate. But do patterns of population movement really support the idea that we prefer cooler locations?
Between 1900 and 2010, the population of the United States increased from about 76 million people to about 309 million. Accompanying that population growth were major advances in technology and industry, including vast improvements in our nation’s system of transportation. As planes, trains, and automobiles replaced the horse and buggy, Americans became more mobile, and where we live was no longer connected primarily with proximity to where we were born. Instead, we became much freer to choose our place of residence based on considerations other than ease of getting there.
Where has our new-found freedom of mobility led us? Figure 1 shows the rate of population change from 1900 to 2010 for each of the contiguous 48 states. Notice the increases in states with warm climates, such as Florida, Texas, and California, and also in states with big industry (that is, jobs), such as New York, Michigan, and Ohio.
Figure 1. The state-by-state population trend (people/year) from 1900 to 2010 (data from U.S. Census Bureau).
Paul Krugman weighed in yesterday on the Trans Pacific Partnership (TPP). I agree with one of his points; I disagree with another.
First, the disagreement: Krugman claims protectionism is mostly gone, and thus the TPP is not all that important:
The first thing you need to know about trade deals in general is that they aren’t what they used to be. The glory days of trade negotiations—the days of deals like the Kennedy Round of the 1960s, which sharply reduced tariffs around the world—are long behind us.
Why? Basically, old-fashioned trade deals are a victim of their own success: there just isn’t much more protectionism to eliminate. Average U.S. tariff rates have fallen by two-thirds since 1960. The most recent report on American import restraints by the International Trade Commission puts their total cost at less than 0.01 percent of G.D.P.
Tariffs on certain goods are still quite high. A publication called World Tariff Profiles illustrates this nicely. If you look at p. 170 for U.S. statistics, you will see average tariff duties of over 10% for four general product categories. You’ll also see maximum tariffs (i.e., the highest tariff on a particular product) of over 100%!
And if you look at the duty rates for other countries, they are generally much higher.
And none of that includes special “trade remedy” tariffs (anti-dumping, countervailing duties, safeguards), subsidies, discriminatory government procurement, or domestic laws and regulations that discriminate (such as local content requirements).
So, protectionism is alive and well.
To make fun of big efforts that produce small results, the Roman poet Horace wrote, "The mountains will be in labor, and a ridiculous mouse will be brought forth."
That line sums up my view of the new tax reform plan introduced by Rep. Dave Camp (R-Mich.), chairman of the House Ways and Means Committee.
To his credit, Chairman Camp put in a lot of work. But I can't help but wonder why he went through the time and trouble. To understand why I'm so underwhelmed, let's first go back in time.
Back in 1995, tax reform was a hot issue. The House Majority Leader, Dick Armey, had proposed a flat tax. Congressman Billy Tauzin was pushing a version of a national sales tax. And there were several additional proposals jockeying for attention.
To make sense of the clutter, I wrote a paper for the Heritage Foundation that demonstrated how to grade the various proposals that had been proposed.
Whatever its words, a poster without a striking image is a missed opportunity, and incongruous, vaguely disturbing images often work best. (The snake is among the most unsettling creatures on earth to gaze at, yet it figures as the sympathetic subject in not one but two great American political images, the “Don’t Tread on Me” Gadsden flag and Ben Franklin’s “Join or Die.”) For World Press Freedom Day last year, a journalists’-advocacy group in Jordan came up with this simple design. Yes, today’s tyrants are more interested in clamping controls on keyboards, blogs, and cellphone transmissions, but for evocativeness it’s hard to beat the chained nib of an old‐style fountain pen, trembling somewhat as if in resistance.
Today, social media and meme culture endlessly rework classic posters and poster genres for purposes of commentary and satire. That stands in a great tradition: as a means of persuasion, posters are themselves a powerful part of the press. Use them in a good cause, and enjoy them too. [Earlier entries in this series: Monday, Tuesday, Wednesday, Thursday]
Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”
We have two new entries to the long (and growing) list of papers appearing in the recent scientific literature that argue that the earth’s climate sensitivity—the ultimate rise in the earth’s average surface temperature from a doubling of the atmospheric carbon dioxide content—is close to 2°C, or near the low end of the range of possible values presented by the U.N.’s Intergovernmental Panel on Climate Change (IPCC). With a low-end warming come low-end impacts and an overall lack of urgency for federal rules and regulations (such as those outlined in the President’s Climate Action Plan) to limit carbon dioxide emissions and limit our energy choices.
The first is the result of a research effort conducted by Craig Loehle and published in the journal Ecological Modelling. The paper is a pretty straightforward determination of the climate sensitivity. Loehle first uses a model of natural modulations to remove the influence of natural variability (such as solar activity and ocean circulation cycles) from the observed temperature history since 1850. The linear trend in the post-1950 residuals from Loehle’s natural variability model was then assumed to be largely the result, in net, of human carbon dioxide emissions. By dividing the total temperature change (as indicated by the best-fit linear trend) by the observed rise in atmospheric carbon dioxide content, and then applying that relationship to a doubling of the carbon dioxide content, Loehle arrives at an estimate of the earth’s transient climate sensitivity—transient, in the sense that at the time of CO2 doubling, the earth has yet to reach a state of equilibrium and some warming is still to come.
Loehle estimated the equilibrium climate sensitivity from his transient calculation based on the average transient:equilibrium ratio projected by the collection of climate models used in the IPCC’s most recent Assessment Report. In doing so, he arrived at an equilibrium climate sensitivity estimate of 1.99°C, with a 95% confidence range of 1.75°C to 2.23°C.
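Loehle’s two-step calculation can be sketched in a few lines. This is an illustrative reconstruction of the method as described above, not his actual code: the function names and the example numbers are hypothetical, and the sketch uses the simple linear per-ppm scaling the post describes (a logarithmic forcing relationship would give a somewhat different value).

```python
def transient_sensitivity(delta_t, co2_start, co2_end):
    """Warming per CO2 doubling implied by an observed warming delta_t (deg C)
    over a rise in atmospheric CO2 from co2_start to co2_end (ppm).

    Uses the linear scaling described in the post: divide the temperature
    change by the CO2 rise, then apply that rate to a doubling (which adds
    co2_start ppm on top of the baseline).
    """
    warming_per_ppm = delta_t / (co2_end - co2_start)
    return warming_per_ppm * co2_start

def equilibrium_sensitivity(transient, ratio):
    """Scale the transient sensitivity by a model-average
    equilibrium:transient ratio, as Loehle does."""
    return transient * ratio
```

For example, 0.5°C of residual warming while CO2 rose from 310 ppm to 390 ppm would imply a transient sensitivity of about 1.9°C per doubling; these inputs are purely illustrative, not Loehle’s actual values.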
Compare Loehle’s estimate to the IPCC’s latest assessment of the earth’s equilibrium climate sensitivity, which assigns a 66 percent or greater likelihood that it lies somewhere in the range from 1.5°C to 4.5°C. Loehle’s determination is more precise and decidedly toward the low end of that range.
The second entry to our list of low climate sensitivity estimates comes from Roy Spencer and William Braswell and was published in the Asia-Pacific Journal of Atmospheric Sciences. Spencer and Braswell used a very simple climate model to simulate the global temperature variations averaged over the top 2000 meters of the global ocean during the period 1955-2011. They first ran the simulation using only volcanic and anthropogenic influences on the climate. They ran the simulation again adding a simple take on the natural variability contributed by the El Niño/La Niña process. And they ran the simulation a final time adding in a more complex situation involving a feedback from El Niño/La Niña onto natural cloud characteristics. They then compared their model results with the set of real-world observations.
What they found was that the complex situation involving El Niño/La Niña feedbacks onto cloud properties produced the best match to the observations. This situation also produced the lowest estimate of the earth’s climate sensitivity to carbon dioxide emissions—a value of 1.3°C.
Spencer and Braswell freely admit that their simple model is just the first step in a complicated diagnosis. But they also point out that results from simple models provide insight that should help guide the development of more complex models, and ultimately could help unravel some of the mystery as to why full climate models produce high estimates of the earth’s equilibrium climate sensitivity while estimates based on real-world observations are much lower.
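To give a flavor of what a “very simple climate model” of this kind looks like, here is a minimal zero-dimensional energy-balance sketch. This is not Spencer and Braswell’s actual model, and the parameter values are illustrative assumptions, not theirs:

```python
def energy_balance(forcing, lam=2.85, heat_capacity=100.0, dt=1.0):
    """Integrate C * dT/dt = F(t) - lam * T with an explicit Euler step.

    forcing:        sequence of radiative forcings (W/m^2), one per time step
    lam:            climate feedback parameter (W/m^2 per deg C)
    heat_capacity:  effective ocean heat capacity (W*yr/m^2 per deg C)
    Returns the temperature anomaly (deg C) at each step.
    """
    temp = 0.0
    history = []
    for f in forcing:
        # Temperature rises while forcing exceeds the radiative response.
        temp += dt * (f - lam * temp) / heat_capacity
        history.append(temp)
    return history
```

In a model of this form, the equilibrium sensitivity is F2x / lam, where F2x ≈ 3.7 W/m² is the standard forcing from a CO2 doubling; the assumed lam = 2.85 corresponds to a sensitivity of about 1.3°C, the value the post cites. Fitting lam (and the cloud-feedback terms Spencer and Braswell add) against ocean heat observations is the substance of their paper.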
Our Figure below helps to illustrate the discrepancy between climate model estimates and real-world estimates of the earth’s equilibrium climate sensitivity. It shows Loehle’s determination as well as that of Spencer and Braswell, along with 16 other estimates reported in the scientific literature beginning in 2011. Also included in our Figure are the IPCC’s latest assessment of the literature and the characteristics of the equilibrium climate sensitivity from the collection of climate models on which the IPCC bases its impacts assessment.
Figure 1. Climate sensitivity estimates from new research beginning in 2011 (colored), compared with the assessed range given in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) and the collection of climate models used in the IPCC AR5. The “likely” (greater than a 66% likelihood of occurrence) range in the IPCC Assessment is indicated by the gray bar. The arrows indicate the 5 to 95 percent confidence bounds for each estimate along with the best estimate (median of each probability density function, or the mean of multiple estimates; colored vertical line). Ring et al. (2012) present four estimates of the climate sensitivity, and the red box encompasses those estimates. The right-hand side of the IPCC AR5 range is actually the 90% upper bound (the IPCC does not state the value for the upper 95 percent confidence bound of its estimate). Spencer and Braswell (2013) produce a single ECS value best-matched to ocean heat content observations and internal radiative forcing.
Quite obviously, the IPCC is rapidly losing its credibility.
As a result, the Obama Administration would do better to come to grips with this fact and stop deferring to the IPCC’s findings when trying to justify increasingly burdensome federal regulation of carbon dioxide emissions, regulation that manipulates markets and restricts energy choices.
Loehle, C., 2014. A minimal model for estimating climate sensitivity. Ecological Modelling, 276, 80-84.
Spencer, R.W., and W. D. Braswell, 2013. The role of ENSO in global ocean temperature changes during 1955-2011 simulated with a 1D climate model. Asia-Pacific Journal of Atmospheric Sciences, doi:10.1007/s13143-014-0011-z.
From the Washington Post:
Annapolis Police Chief Michael A. Pristoop thought he came prepared when he testified before a Maryland State Senate panel on Tuesday about the perils of legalizing marijuana.
In researching his testimony against two bills before the Judicial Proceedings Committee, Pristoop said, he had found a news article to illustrate the risks of legalization: 37 people in Colorado, he said, had died of marijuana overdoses on the very day that the state legalized pot.…
Trouble is, the facts were about as close to the truth as oregano is to pot. After a quick Google search on his laptop, [State Senator Jamin] Raskin — the sponsor of the legalization bill that was the subject of the Senate hearing—advised the chief that the Colorado overdose story, despite its deadpan delivery, had been made up for laughs by The Daily Currant, an online comedy magazine.
Ouch! For more on the momentum of marijuana law reform, check out today’s New York Times.