Though a monument to the ravages of Soviet central planning, the barren Magnitogorsk steel works complex still inspires America’s industrial policy proponents. “Failure to plan is a plan for failure,” said comrade Rep. Dan Lipinski (D-IL), as he described the “pro-manufacturing” legislation he helped slip into the mammoth Cromnibus bill, which became law this month.
The Revitalize American Manufacturing and Innovation Act directs the Secretary of Commerce to establish a “Network for Manufacturing Innovation” to:
- improve the competitiveness of U.S. manufacturing and increase production of goods manufactured predominately within the United States;
- stimulate U.S. leadership in advanced manufacturing research, innovation, and technology;
- accelerate the development of an advanced manufacturing workforce; and
- create and preserve jobs.
Of course, the verbs “revitalize,” “improve,” “stimulate,” “accelerate,” “create,” and “preserve” are euphemisms for protect, subsidize, regulate, and intervene.
From Lipinski’s perspective:
This is a big victory for a sector of our economy that over the years has provided so many high quality jobs in my district, in our region, and across the nation, but has taken many hits over the past couple of decades, especially during the recent recession. While manufacturing is by-and-large a private, market endeavor, few can disagree that government policy impacts manufacturing in countless ways.
Yes, government policy has affected manufacturing in countless--usually adverse--ways. Excessive regulations enabled by irresponsible agency cost-benefit analyses, a burdensome tax system, endemic exposure to frivolous lawsuits, exorbitant health care costs, inefficient union work rules, tariffs on industrial inputs, absurd restrictions on immigration, subsidization of chosen firms, and other forms of corporate favoritism are all drags on manufacturing (and other economic sectors, too). U.S. manufacturing would benefit from less Washington, not more.
The Revitalize Act is a solution in search of a problem--and a bad solution at that. American manufacturing does not need revitalizing. Despite Washington’s many meddling interventions, and despite the persistence of the myth of U.S. industrial decline, U.S. manufacturing is thriving--and always has been. Year after year, with the exception of cyclical recessions, new records are set on most relevant measures of industry health. The most recent official data reveal all-time highs for manufacturing sector output, value-added, revenues, exports, profits, and foreign direct investment--all in real terms and all achieved, largely, in the absence of top-down planning.
Moreover, for a country whose consumers spend twice as much on services as on goods, and where 90 percent of the workforce is employed outside the manufacturing sector, official obsession over the future of manufacturing is more than a bit overplayed. Many with this obsession dwell on the past, evoking the good old days of 1979, when the sector employed almost 20 million workers, or 1953, when manufacturing accounted for a record 28 percent of U.S. GDP. But today’s manufacturing worker produces an average of $170,000 of value-added per year, as compared to $28,000 in 1979--a more than quadrupling of output in real terms. And, although manufacturing’s share of the economy has declined to about 12 percent today, the absolute value of U.S. manufacturing output, in real terms, has increased more than six-fold since 1953. The fact that the sector supports far fewer jobs today and accounts for a smaller share of the U.S. economy says absolutely nothing about the state of manufacturing.
What matters is whether there is continued growth in value-added, revenues, foreign direct investment, research and development expenditures, capital expenditures, and productivity. By each of those measures, U.S. manufacturing is robust. When it comes to the question of the condition of U.S. manufacturing, which is likely to be a more credible barometer: legislators who benefit from the perception of fixing “manufactured” problems, or investors revealing their preferences through their own actions? As of 2013, nearly $1 trillion of foreign direct investment was parked in U.S. manufacturing, making the United States by far the world’s number-one destination for manufacturing investment. That’s a rather strong endorsement of the state of U.S. manufacturing.
How on earth would U.S. manufacturers--the world’s most advantaged with their unparalleled access to idea incubators, research universities, R&D laboratories, and broad and deep capital markets to commercialize the ideas that make it through a rigorous vetting process--benefit from the Commerce Department’s participation in mapping out the future? Sure, some firms in some manufacturing industries--those that succeed in convincing the government that they are worthy of public support (i.e., those that commit more resources to political, rather than economic, activities)--may benefit. But others, which rely upon and adapt to the verdicts of consumers in a market environment, will be disadvantaged.
Industrial policy is anathema to the market. It short-circuits a selective, evolutionary process that has undergirded the world’s most successful innovation machine and reduces chances of worthy ideas, firms, and industries leading the next commercial wave. Did the last generation’s policymakers anticipate the arrival of Steve Jobs, Bill Gates, or Mark Zuckerberg and the revolutionary products and services they delivered? Did Washington bureaucrats foresee the advent of specific life-extending medicines and devices, like swallowable, pill-sized cameras? Had those proposing industrial policy in response to a rising Japan in the 1980s and early 1990s prevailed, much of the technology and medical advances taken for granted today would have never come to fruition. American manufacturing and the broader U.S. economy have been successful by shunning, more than embracing, industrial policy.
With its pre-eminence in innovation and entrepreneurship still intact, the United States is situated at the top of the global value chain. Staying there will require Americans to remain skeptical of top-down industrial policy. It could propel the United States above Kazakhstan as the world’s greatest producer of potassium, but at unthinkable costs.
The media are full of headlines about war, sexual assault, inequality, obesity, cancer risk, environmental destruction, economic crisis, and other disasters. It's enough to make people think that the world of their children and grandchildren will be worse than today's world.
But the real story, which rarely makes headlines, is that, to paraphrase Indur Goklany's book title, we are living longer, healthier, more comfortable lives on a cleaner and more peaceful planet. (Allister Heath summed up his argument in a cover story for the Spectator of London, without all the charts and tables.) Fortunately, beyond the headlines, more people do seem to be recognizing this.
The Cato Institute, for instance, has created an ever-expanding website on human progress, known simply as HumanProgress.org.
Here's Steven Pinker, writing in Slate, expanding on the information in his book The Better Angels of Our Nature: Why Violence Has Declined:
The world is not falling apart. The kinds of violence to which most people are vulnerable—homicide, rape, battering, child abuse—have been in steady decline in most of the world. Autocracy is giving way to democracy. Wars between states—by far the most destructive of all conflicts—are all but obsolete.
He has charts of the data in each of those areas. And here's Pinker at the Cato Institute discussing why people are so pessimistic when the real trends are so good:
Fraser Nelson, editor of the Spectator, writes that
2014 has been the best year ever – just as 2013 was, and just as 2015 will be. It is something that is, now, true every year but the point cannot be made enough. We’re living through a period of amazing progress – in medicine, prosperity, health and even conquering violence.
Nelson offers this brilliant graphic from the Lancet, a British medical journal:
And just today we learn in a new report from the American Cancer Society that cancer rates have fallen 22 percent in two decades. At Spiked Online, editor Brendan O'Neill points out "10 Kickass Things Humanity Did in 2014."
Andres Martinez at Zocalo Public Square:
The “good old days” are a figment of our imagination. Life--here, there, everywhere--has never been better than it is today. Our lives have certainly never been longer: Life expectancy in the U.S. is now 78.8 years, up from 47.3 years in 1900. We are also healthier by almost any imaginable measure, whether we mean that literally, by looking at health indices, or more expansively, by looking at a range of living-standard and social measures (teen pregnancy rates, smoking, air-conditioning penetration, water and air quality, take your pick).
I’ll concede, very grudgingly, that all this whining can be a good thing. As Yuval Noah Harari, the author of Sapiens: A Brief History of Humankind, has written, we’re hard-wired to be disgruntled. It’s the only way we achieve progress. Evolution requires us to demand more and better, all the time.
So on Monday let's go back to demanding more and better. But for tonight, Happy New Year!
The lame-duck Congress suffered through its usual year-end brinkmanship before avoiding a government shutdown. Horrors! What would people do if politicians weren’t able to legislate, regulate, and dictate in the "public interest"?
The traditional civics book notion of government is that the state does for us what we cannot do for ourselves. If the state focused on its most fundamental tasks, we might notice if it closed.
Unfortunately, the state has turned into something very different. It’s now a welfare agency for the wealthy, a vast soup kitchen for special interests, an engine for social engineering at home and abroad, and a national nanny determined to run citizens’ lives. Closing down Washington’s great income redistribution racket actually would help most Americans.
Yet, as I point out in the American Spectator: “perhaps the most irritating, even infuriating, government activity is paternalism. There’s a basic difference between a gang of highwaymen and Congress. The first group takes your cash and then leaves you alone. The second group empties your wallet or purse, and then insists on sticking around for your benefit to manage your life. Your new overseers expect not only regular payment but eternal gratitude.”
Consider the campaign against smoking. Adults are entitled to smoke cancer sticks if they want. The idea that not one restaurant or bar in a city of thousands or state of millions can allow someone to smoke is, well, outrageous.
Former New York City Mayor Michael Bloomberg attempted to ban large cups of soda. He felt entitled to substitute his preferences for those of the people he was supposed to “serve.”
Last month the City of Berkeley, California, became the first city to impose a special tax on drinks with sugar. No word yet on whether the tax man will next target chocolate bars, ice cream, and households lacking an elliptical trainer.
In October the City of Burien, Washington, banned body odor. Or at least too much body odor in public. Explained city manager Kamuron Gurol: “Occasionally, people will unfortunately have such a bodily odor that it’s very hard for other patrons to physically be in the same place.” Are mandatory public showers next?
Authorities in North Attleboro, Massachusetts, recently rejected selectman Patrick Reynolds’s request to eliminate the ban on playing ball in the street after the police broke up a game being played by friends. Responded the police chief: what would people think of the city if the community okayed this horrid practice?
In August the State of California loosened its earlier prohibition on people bringing dogs into restaurants. Now under a set of specific conditions—in an outdoor area, with the animal in a carrier or on a leash—dogs can join their owners at a meal. But why not simply leave the decision up to the restaurant?
The paternalist FDA long has delayed the approval of life-saving drugs, thereby killing thousands of people, far more than the number likely saved by preventing the sale of dangerous medicines. Last year, the agency outlawed mimolette cheese because the rinds might contain trace quantities of cheese mites. The latter are harmless, but never mind.
Earlier this year, the FDA decided to ban cheese aged on boards—which means most European cheese imports. After all, “The porous structure of wood enables it to absorb and retain bacteria, therefore bacteria generally colonize not only the surface but also the inside layers of wood.” Millions of Europeans die every year from cheese poisoning. Well, not really. But you never can be too careful!
Actually, very frequently government is too careful, at least when it comes to regulating people’s lives. When it comes to spending taxpayers’ money, tossing folks into jail, and invading foreign countries, on the other hand, officials go wild and crazy, tossing caution to the wind.
It’s time to shut down government activities that aren’t legitimate, which would include most of them, and especially paternalism.
I'm tempted to feel a certain degree of sympathy for Paul Krugman.
As a leading proponent of the notion that bigger government stimulates growth (a.k.a., Keynesian economics), he's in the rather difficult position of rationalizing why the economy was stagnant when Obama first took office and the burden of government spending was rising.
And he also has to somehow explain why the economy is now doing better at a time when the fiscal burden of government is declining.
But you have to give him credit for creativity. Writing in the New York Times, he attempts to square the circle.
Let's start with his explanation for results in the United States.
...in America we haven’t had an official, declared policy of fiscal austerity — but we’ve nonetheless had plenty of austerity in practice, thanks to the federal sequester and sharp cuts by state and local governments.
If you define "austerity" as spending restraint, Krugman is right. Overall government spending has barely increased in recent years.
But then Krugman wants us to believe that there's been a meaningful change in fiscal policy in the past year or so. Supposedly there's been less so-called austerity and this explains why the economy is doing better.
The good news is that we...seem to have stopped tightening the screws: Public spending isn’t surging, but at least it has stopped falling. And the economy is doing much better as a result. We are finally starting to see the kind of growth, in employment and G.D.P., that we should have been seeing all along... What held us back was unprecedented public-sector austerity...now that this de facto austerity is easing, the economy is perking up.
And deficits are shrinking as a share of economic output by every available measure, so there's still "austerity" regardless of whether we're looking at the underlying disease of government spending or the symptom of red ink.
I sliced and diced the data to see if there was some way of justifying Krugman's hypothesis, and the only numbers that are (vaguely) supportive are the ones from the IMF, which show that total government spending (federal, state, and local) has increased by an average of 2.3 percent annually over the past two years, after increasing by 1.3 percent per year over the prior three years.
On that basis, one could sort of argue that Krugman is right and "austerity is easing."
But if that's his definition of victory, then I'm more than willing to let him be the winner. If we can constrain the public sector so that it grows at 2.3 percent annually, we'll be complying with my Golden Rule and the burden of government spending will continue to slowly but surely shrink as a share of GDP.
And we'll definitely have much better fiscal policy than we had between 2002 and 2009, when overall government spending rose by an average of 7.1 percent annually.
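The arithmetic behind that comparison is worth making explicit: if spending grows more slowly than nominal GDP, its share of the economy shrinks; if it grows faster, the share balloons. The sketch below uses the 2.3 and 7.1 percent spending-growth rates from the text, but the 4.5 percent nominal GDP growth rate and the 35 percent starting share are assumptions chosen purely for illustration.

```python
# Sketch of the "Golden Rule" arithmetic: when government spending grows
# more slowly than nominal GDP, the spending burden shrinks as a share
# of the economy. The 2.3% and 7.1% spending-growth rates come from the
# text; the 4.5% GDP growth and 35% starting share are assumed.

def spending_share_path(spend_growth, gdp_growth=0.045,
                        start_share=0.35, years=10):
    """Return the spending/GDP share after each of `years` years."""
    spending, gdp = start_share, 1.0
    shares = []
    for _ in range(years):
        spending *= 1 + spend_growth
        gdp *= 1 + gdp_growth
        shares.append(spending / gdp)
    return shares

restrained = spending_share_path(0.023)  # recent restraint
expansive = spending_share_path(0.071)   # the 2002-2009 pace

print(f"After 10 years at 2.3% growth: {restrained[-1]:.1%} of GDP")
print(f"After 10 years at 7.1% growth: {expansive[-1]:.1%} of GDP")
```

Under these assumed rates, a decade of 2.3 percent growth shrinks the spending share from 35 percent to roughly 28 percent of GDP, while a decade at the 2002-2009 pace swells it to roughly 45 percent.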
So does this mean Krugman and I are on the same page?
You may sense a slight tone of sarcasm in my remarks, and that's because Krugman surely doesn't want government to "only" grow by 2.3 percent annually. He simply wants to justify his hypothesis that the economy's improving performance is somehow due to less austerity. Even if that means he's implicitly endorsing genuine spending restraint.
In other words, Krugman actually is being slippery and misleading in his analysis of American austerity.
But that's nothing compared to his analysis of so-called austerity on the other side of the Atlantic Ocean. Here's some of what he wrote about fiscal policy in the United Kingdom.
...in 2010 Britain’s newly installed Conservative government declared that a sharp reduction in budget deficits was needed to keep Britain from turning into Greece. Over the next two years growth in the British economy, which had been recovering fairly well from the financial crisis, more or less stalled. In 2013, however, growth picked up again — and the British government claimed vindication for its policies. Was this claim justified? No, not at all.
Krugman then claims that there was better economic performance because U.K. politicians decided against "further cuts."
What actually happened was that the Tories stopped tightening the screws — they didn’t reverse the austerity that had already occurred, but they effectively put a hold on further cuts. ...And sure enough, the nation started feeling better.
So is he right?
Well, the IMF numbers show that overall government spending has been growing, on average, by 2 percent annually since 2009. By today's standards, that's a decent record of spending restraint.
But what if we dissect the numbers? Did spending grow very slowly between 2010 and 2012, followed by a relaxation of restraint beginning in 2013? In other words, is Krugman's argument legitimate, even if it requires him to implicitly endorse (as in the American example) decent fiscal discipline over the past two years?
Nope. Instead, the numbers show just the opposite. Between 2010 and 2012, the burden of government spending expanded by an average of 2.3 percent per year.
But over the past two years, the "austerity" has become tighter and the budget has grown by 1.5 percent annually.
In other words, it seems that Krugman is either sloppy or mendacious.
Though I'm going to give him an escape hatch, a way of justifying his assertions. When the Tories took over in the United Kingdom, they quickly imposed a series of tax hikes (in addition to the tax hikes imposed by the outgoing Labour government). But since that time, the government has implemented some tax cuts, most notably reductions in corporate tax rates and lower tax rates on personal income.
So if Krugman wants to argue that tax increases decelerated the British economy for a few years and that tax cuts are now helping to boost growth, I'm willing to give him a probationary membership in the supply-side club.
But I don't expect him at the next meeting.
P.S. This isn't the first time Krugman has mangled numbers when analyzing U.K. fiscal policy.
P.P.S. He's also butchered data when writing about fiscal policy in nations such as France, Estonia, and Germany.
A new study from the Illinois Policy Institute analyzes the welfare benefits package available at different levels of earnings in that state. The authors find that low-income workers have limited economic incentive to increase their earnings from the minimum wage, and at some higher levels of earnings these workers actually see a reduction in net income. America’s complex welfare system can too often create these perverse situations where beneficiaries are financially worse off as they increase work effort and earned income. In these poverty traps, lost benefits and increased taxes outweigh any additional earnings, making it harder for beneficiaries to escape from poverty and reach the middle class.
Author Erik Randolph finds that a single mother with two children who increases her hourly earnings from the Illinois minimum wage of $8.25 to $12 sees her net income increase by less than $400. For many low-income workers striving to climb the ladder of prosperity, our welfare system takes away almost all of the incentive to move up from an entry-level job, since they realize almost none of these gains. Even worse, someone in this scenario who works hard and increases her earnings all the way to $18 an hour, a wage level that would place her in the middle class, would actually see her net income decrease by more than $24,800 due to benefit reductions and tax increases. Instead of making it easier for beneficiaries to become independent and achieve a level of prosperity, the welfare system traps them at low levels of earnings. This parent would have to increase her earnings all the way to $38 an hour in order to replace the lost benefits and achieve the same standard of living.
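The cliff dynamic described above is easy to model. The schedule below is a hypothetical benefit package invented for illustration--it is not the actual Illinois schedule from Randolph's study--but it reproduces the qualitative result: once benefits phase out faster than earnings rise, working more can leave a family with less.

```python
# A stylized model of a welfare "benefit cliff." All dollar figures here
# are invented for illustration -- they are NOT the schedules from the
# Illinois study -- but they reproduce the qualitative effect: past a
# threshold, lost benefits and taxes outweigh additional earnings.

HOURS_PER_YEAR = 2000  # full-time work, assumed

def benefits(earnings):
    """Hypothetical benefit package that phases out steeply, then ends."""
    if earnings < 16_000:
        return 20_000.0                              # full package
    if earnings < 30_000:
        return 20_000.0 - 0.8 * (earnings - 16_000)  # steep phase-out
    return 0.0                                       # the cliff

def net_income(hourly_wage, tax_rate=0.15):
    """After-tax earnings plus benefits for a full-time worker."""
    gross = hourly_wage * HOURS_PER_YEAR
    return gross * (1 - tax_rate) + benefits(gross)

for wage in (8.25, 12.0, 18.0):
    print(f"${wage}/hr -> net income ${net_income(wage):,.0f}")
```

In this toy schedule, raising the wage from $8.25 to $12 adds only a few hundred dollars of net income, and moving to $18 actually reduces it--the same pattern, in miniature, that Randolph documents with real program rules.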
These findings echo some of the insights from our Work versus Welfare Trade-off paper, in which we compared the benefits available to a similar family in each state to the equivalent wage that family would have to earn to obtain the same level of net income. Our study found that the high level of benefits available combined with benefit cliffs created situations that would deter work. In 34 states, the parent would have to earn well above the minimum wage to achieve the same standard of living she had when not working.
This new report from the Illinois Policy Institute illustrates some of the biggest problems with our current welfare system and corroborates many of the findings of our past work. Work versus Welfare looked at two situations, one where the parent worked and one where she had no earned income. This new study from the Illinois Policy Institute provides valuable additional insight, as it looks at this tradeoff at different levels of earned income to analyze the poverty traps in place as beneficiaries move to higher levels of earned income. Instead of encouraging work, the current welfare system often takes away much of the incentive for low-income workers to increase work effort and earnings. As Randolph puts it, “[r]ather than providing a hand up, Illinois’ welfare system can become a trap,” and this is unfortunately the case throughout the country. This study shows yet another reason why our welfare system needs fundamental reform.
Cato will host a conference in New York January 29th to further explore poverty and the welfare system. The conference agenda and registration information can be found here.
This spring, the Affordable Care Act will make its third trip to the Supreme Court. But King v. Burwell is different from its predecessors. Instead of challenging Obamacare’s constitutionality, or the way certain regulations burden particular types of plaintiffs, this lawsuit questions how the executive branch has enforced the law generally—or, more precisely, modified, delayed, and suspended it.
After supporting the challengers’ successful request that the Supreme Court take up this case, the Cato Institute has now joined with Professor Josh Blackman on an amicus brief that alerts the Court to the separation-of-powers and rule-of-law violations attending the ACA’s implementation. Through a series of memoranda, regulations, and even blog posts, President Obama has disregarded statutory text, ignored legislative history, and remade the law in his own image.
King focuses on tax credits—the subsidies that allow people to pay increased premiums—one of the key pillars of Obamacare that the administration has toppled. To assist those who lack employer-sponsored insurance, and because it couldn’t command states to establish exchanges, Congress authorized these credits for residents of states that do create the exchanges. The statute expresses this design in language that is clear as day: Individuals receive tax credits if they bought a qualifying health plan “through an Exchange established by the State.”
In other words, if a state failed to establish an exchange, its residents—who would end up buying plans through the federal HealthCare.gov—would not be eligible for the subsidies. (The ACA’s Medicaid expansion plan operated with a similar carrot-and-stick approach until the Supreme Court rewrote it.)
But a funny thing happened on the way to utopia: only 14 states set up exchanges, meaning that the text of the law denied subsidies in nearly three-quarters of states. This result was untenable to an administration intent on pain-free implementation. To obviate the uncomfortable compromises Congress reached, the executive engaged in its own lawmaking process, issuing a regulation that nullifies the relevant ACA provision.
Under the “IRS Rule,” subsidies would be available in all states. As documented in a detailed report by the House Oversight Committee, the executive branch engaged in a multi-agency process based on a convoluted series of linguistic contortions without any meaningful analysis of the ACA’s history. At least one government attorney recognized that there “was no direct statutory authority to interpret [a federal] exchange as an ‘Exchange established by the State.’” But such concerns were squelched, and the rogue rule was released.
Through the IRS Rule, the executive emulates Humpty Dumpty: “When I use a word . . . it means just what I choose it to mean—neither more nor less.” In response, Alice naturally asked “whether you can make words mean so many different things.” The Supreme Court must answer no and vacate the IRS Rule that provides subsidies in states that did not establish exchanges.
Through its oversimplification of how the ACA works as a whole—by arguing for the legality of literally any policy that advances “access,” no matter how unmoored from statutory authority—the government incorrectly assumes that the 111th Congress shared President Obama’s evolving vision of how to reform the healthcare system (and granted him discretion to advance it accordingly). To paraphrase Inigo Montoya, Congress didn’t think “expand coverage” means what the executive thinks it means.
In King, which will be argued on March 4, the Supreme Court should address the president’s disregard of Congress and his belief that legislative gridlock allows him to transcend his constitutional authority. A ruling that upholds his behavior sets a dangerous precedent for the nascent ACA superstatute, which will be implemented for years to come by administrations with different views of the law. More troubling, such a precedent could be used in the future to license virtually any executive action that modifies, amends, or suspends any duly enacted law.
Josh Blackman, who co-authored Cato's brief, contributed to this blogpost.
The Ukrainian parliament has repealed the law barring participation in NATO. The U.S. response should be no.
Right before Christmas Ukraine’s Rada repealed legislation mandating “nonparticipation of Ukraine in the military-political alliances.” Said President Petro Poroshenko: “Ukraine’s nonaligned status is out.”
Russia’s foreign minister called the move “counterproductive.” An alliance spokesman said “Our door is open and Ukraine will become a member of NATO if it so requests and fulfills the standards and adheres to the necessary principles.”
In fact, joining could be counterproductive for Kiev. Some Ukrainians may imagine that NATO would protect them from Vladimir Putin. But if the consequence was a full-blown war, as is likely, it would be a disaster for Ukraine.
Moreover, the West doesn’t have the will to act. In 2008 Georgians expected the American military to come to their rescue in their war with Russia. However, Washington would not go to war with Russia over such minimal geopolitical stakes.
The allies made a similar assessment of Ukraine. Despite abundant verbal support, practical aid has been limited.
Russian President Vladimir Putin has violated international norms, unleashed bitter conflict, upset the regional order, and disturbed his European neighbors. Nevertheless, his actions have had little impact on America and Europe. Keeping Ukraine whole simply doesn’t warrant playing international chicken with a nuclear-armed power.
Thus, Ukraine might rue being inducted into NATO. The alliance would discourage Kiev from doing more for itself and addressing Russia directly. Yet Kiev might find its allies to be as inconstant as Moscow was antagonistic.
Which means the Western states must reject any NATO application from Kiev. Past NATO expansion has added members with minimal militaries and extensive problems. Providing small troop contingents for Washington’s unnecessary Third World wars (Afghanistan and Iraq so far) isn’t nearly enough recompense to America for defending countries from a nuclear-armed power.
The most dangerous alliance illusion is that if NATO would just demonstrate “resolve” the Russian invaders would turn tail and race back to Moscow. Yet deterrence works both ways.
Moscow desires respect from other great powers, consideration in decisions affecting its interests, and especially secure borders. The West challenged all of these concerns by expanding NATO, forcibly dismantling Serbia, and pressing to incorporate into the Western bloc both Georgia and Ukraine. None of this justifies Ukraine’s forcible dismemberment, but it is important to understand why Russia acted.
In fact, Russia is better able to deter the West than vice versa in Ukraine. The geopolitical stakes are far greater for Russia than for the U.S. and Europe. Thus, the Putin government remains willing to spend and risk more than the U.S. and Europe. Moscow already has demonstrated its “resolve” by going to war.
Moreover, history is filled with examples of alliances which failed to deter. Countries believe they will win, their opponents will back down, their adversaries will be forced to negotiate, or, if nothing else, they have no alternative but to fight.
Fear of a hostile hegemonic power dominating Eurasia animated Washington’s Cold War promise to protect war-torn Western Europe. Today Kiev is not key to any Western nation’s security.
Recognizing the problems of military action, the allies seem inclined to emphasize economic pressure. However, Ukraine is closer to collapse than is Russia.
Moreover, as I wrote in Forbes, “authoritarian governments like Moscow are more likely to retaliate than capitulate. The Europeans, especially, should beware creating ‘Weimar Russia.’ A similar screenplay seven decades ago ended badly.”
Better for all to seek a negotiated settlement: Kiev decentralizing power, separatists accepting its formal authority, Ukraine acquiescing to Crimea’s separation, Russia holding an internationally monitored referendum, Kiev forgoing military ties to NATO and the U.S., the allies dropping sanctions, and Moscow accepting a united Ukraine looking both east and west economically.
The Rada’s vote to end military neutrality is a desperation move. The U.S. should warn Kiev not to look to the alliance to solve its Russia problem.