Tag: central planning

Clinton, Obama, and Hayek

President Obama has been saying that if the United States government can find and eliminate Osama bin Laden after ten years of searching, it can do anything:

Already, in several appearances since the raid, Obama has described it as a reminder that “as a nation there is nothing that we can’t do,” as he put it during an unrelated White House ceremony Monday. On Sunday night, during his first comments about the operation, he linked it to American values, saying the country is “once again reminded that America can do whatever we set our mind to.”

This is, of course, nonsense. Finding bin Laden, difficult as it proved to be, was an incomparably simple task compared to using coercion and central planning to bring about desired results in defiance of economic reality. You can’t deliver better health care to more people for less money by reducing the role of incentives and markets, even if you set your mind to it. As Russell Roberts said about a similar concept, “If we can put a man on the moon, then…”:

Putting a man on the moon is an engineering problem. It yields to a sufficient application of reason and resources. Eliminating poverty is an economic problem (and by the word “economic” I do not mean financial or related to money), a challenge that involves emergent results. In such a setting, money alone—in the amounts that a non-economic approach might suggest, one that ignores the impact of incentives and markets—is unlikely to be successful.

Obama should listen to Bill Clinton, who last fall seemed to be channeling Hayek:

Friedrich Hayek, The Fatal Conceit: “The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design.”

Bill Clinton, 9/21: “Do you know how many political and economic decisions are made in this world by people who don’t know what in the living daylights they are talking about?”

Hope and Dismay about Haiti’s Future

Nicholas Kristof provides “a useful reminder of the limitations of charity and foreign aid” in his New York Times op-ed about Haiti today. “Nearly a year after the earthquake in Haiti,” he notes, “more than one million people are still living in tents and reconstruction has barely begun.”

He emphasizes the importance of “trade, not aid” and of the role of business: “It’s hard to think of a charitable project that will be as beneficial as the Coca-Cola Company’s decision to build up the mango juice industry in Haiti, supporting 25,000 farmers.”

He also cites a seemingly successful microfinance aid project that lends money to poor women in Haiti to begin and expand business ventures by, for example, investing in livestock or growing fruit for sale. It is impossible to evaluate that organization’s record from the anecdotes Kristof provides. And while microcredit may for a time alleviate the conditions under which poor recipients live (and even pull some recipients out of poverty), there is little evidence from its overall record that it effectively reduces poverty, and it is certainly not a way to do so on a widespread or sustainable basis. David Roodman of the Center for Global Development notes, for example, that “microfinance institutions in Haiti only reach perhaps 250,000 people, about 2.5% of the population.” (For a critique of some of the claims of microcredit proponents, see Thomas Dichter’s Cato study.)

In line with Kristof’s main argument and with decades of evidence from successful countries around the world, the most effective way to reduce poverty in economically repressed Haiti is to open its markets and increase economic freedom. Unfortunately, Haiti’s reconstruction and long-term development plan, toward which the United States and international donors have pledged more than $15 billion, reads like a relic of central planning, with virtually no mention of policies that promote economic freedom. Two sentences in the document mention the importance of clarifying land titles. One page mentions the role of the private sector, but only in regard to its cooperation with the government’s “development centers,” which will operate throughout the country to stimulate predetermined industries using government funds and guarantees and to achieve a “better redistribution of [the] population.”

We’ve been down this road before. If the Haitian government wishes to avoid disappointment and free itself from dependence on international aid, it needs to rethink its approach to development.

Beware of Americans Proselytizing the Chinese Economic Model

In a Cato paper released earlier this month, I argued that the glacial pace of America’s economic recovery and its growing public debt, juxtaposed against China’s almost uninterrupted double-digit annual economic growth and its role as Congress’s sugar daddy, have bred insecurity among U.S. opinion leaders, many of whom now advocate a more strident approach to China or emulation of its top-down approach.

I cite, among others, Thomas Friedman of the New York Times, who is enamored of autocracy’s capacity to facilitate China’s singularity of purpose to dominate the industries of the future:

One-party autocracy certainly has its drawbacks. But when it is led by a reasonably enlightened group of people, as China is today, it can also have great advantages. That one party can just impose the politically difficult but critically important policies needed to move a society forward in the 21st century. It is not an accident that China is committed to overtaking us in electric cars, solar power, energy efficiency, batteries, nuclear power, and wind power. China’s leaders understand that in a world of exploding populations and rising emerging-market middle classes, demand for clean power and energy efficiency is going to soar. Beijing wants to make sure that it owns that industry and is ordering the policies to do that, including boosting gasoline prices, from the top down.

Friedman’s theme (though less goo-goo-eyed and more all-hands-on-deck) is echoed in an op-ed by China expert James McGregor, which ran in yesterday’s Washington Post. McGregor conveys what he describes as an emerging sentiment within the U.S. business community in China: that the Chinese government is hell-bent on creating national economic champions; that it is using its increasing leverage (as global financier and fastest-growing market) to impose its own interpretations of the global rules of economic engagement in support of its comprehensive industrial policy; and that, ultimately, the United States must wake up and rise to the challenge by crafting some top-down industrial policy of its own.

I don’t dispute some of McGregor’s premises.  China’s long process of market liberalization has slowed down, halted, and even reversed in some areas.  Policies are proliferating that favor local companies (particularly state-owned enterprises), hamper the operations of foreign-owned firms, and impede market access for imports.  Indeed, many of these policies are likely the product of industrial planning. 

But McGregor’s conclusion is extreme:

The time has come for a White House-led, public-private, comprehensive examination of American competitiveness against a clear-eyed view of China’s very smart and comprehensive industrial development policies and plans…What technology do we protect? What do we share? What are our commercial strategic imperatives as a nation? How do we retool the U.S. government’s inadequate and outdated trade bureaucracy to provide thoughtful strategic focus and interagency coordination? How do we overcome the fundamental disconnect between our system of scattered bureaucratic responsibilities and almost no national economic planning vs. China’s top-down, disciplined and aggressive national economic development planning machine?

Central planning may be more in vogue in Washington than usual nowadays, but even coming close to McGregor’s conclusion requires disregarding many facts, which is how he gets there, sans tongue in cheek.

First, in an effort to preempt any suggestion that China’s protectionism is nothing exceptional and can be remedied through the World Trade Organization and other channels, McGregor offers this blanket statement: “Chinese policymakers are masters of creative initiatives that slide through the loopholes of WTO and other international trade rules.” I realize that op-ed writing forces one to economize on words, but that statement, which serves as McGregor’s springboard to socialism, cannot substitute for an analysis of the facts. One of those facts is that the United States has compelled changes in China’s protectionist practices in every formal WTO dispute it has lodged that has been resolved thus far (six of the eight cases filed have been resolved). If China violates the agreed rules of trade, and its actions impair benefits or impose costs on U.S. interests that are too large to ignore, pursuing a WTO case is a legitimate and proven channel of resolution. Chinese protectionism can be addressed without the radical changes McGregor counsels.

But I think McGregor, sharing the tactics of others in the media and politics, exploits public angst over a rising China to promote his idea as the obvious and only solution to what he sells as a rapidly metastasizing problem. McGregor argues that China is aiming to create national champions through subsidies and other preferential policies, while charging foreign companies admission to its market in the form of technology transfer, joint-venture requirements, and local content rules. McGregor claims that this appropriation of foreign technology will be used to “create Chinese ‘indigenous innovations’ that will come back at us globally.” Ultimately, McGregor fears that “American technology companies could be coerced to plant the seeds of their destruction in the fertile China market.”

It is telling that McGregor doesn’t consider U.S. government expropriation of those companies’ technology assets as planting the seeds of their own destruction. Indeed, it is nothing short of expropriation when technology owned by individual private-sector companies, each making its own decisions to improve its own bottom line on behalf of its own shareholders, is suddenly subject to the questions McGregor wants answered: What technology do we protect? What do we share? What are our commercial strategic imperatives as a nation? Those questions, let alone the answers, imply that the U.S. government should have at least de facto ownership and control over these privately held technology assets.

What is wrong with allowing each of these companies to decide for itself whether it wants to license or transfer some of its technology to Chinese companies as the price of doing business in China? Some will, some won’t, but the presupposition that those who do are selling the golden goose is not based on fact. Let companies decide for themselves how to use their resources, and don’t treat industry as a monolith, as in “What are our commercial strategic imperatives as a nation?”

Had we tried to answer that question, and to implement the answer, in the face of the Japanese “threat” two decades ago, we’d be bereft of some of the most ingenious technological breakthroughs and the hundreds of industries and thousands of products that “our system of almost no national economic planning” has yielded.

When we peel away the Chicken Little rhetoric, when we dispense with neo-Rahm Emanuelism (“Never manufacture a good crisis and then let it go to waste”), when cooler heads and analytical minds prevail, the economic question boils down to this: What has been more successful at creating growth, central planning or decentralized dynamism? For both China and the United States, it has been the latter.

My bet is that China’s re-embrace of greater central planning will be brief, as it wastes resources, yields few, if any, national champions, and limits innovation. For similar reasons, U.S. opinion leaders will eschew central planning as well.

By Pulling His Punches, Bernanke Shatters ObamaCare’s Credibility

Federal Reserve Chairman Ben Bernanke gave a speech in Dallas yesterday in which he inadvertently discredited claims that ObamaCare would reduce health care costs and the federal deficit. According to The Washington Post:

Federal Reserve Chairman Ben S. Bernanke warned Wednesday that Americans may have to accept higher taxes or changes in cherished entitlements such as Medicare and Social Security if the nation is to avoid staggering budget deficits that threaten to choke off economic growth…

While the immediate audience for the speech was the Dallas Regional Chamber, his message was intended for Congress and the Obama administration…

Bernanke has urged Congress to address long-term fiscal imbalances in congressional testimony before, but usually only when he is asked about them by lawmakers. His speech Wednesday aimed to reach a broader audience, steering away from technical economic speak and using plain, sometimes wry, language – a rare thing for a Fed chairman.

The non-partisan Congressional Budget Office projects the annual federal deficit will be at least $700 billion in each of the next 10 years.  Deficit spending is a form of taxation without representation, because it increases the tax burden of generations who cannot yet vote (often because they are as yet unborn).  Bernanke wants us to end deficit spending.  Kudos to him.

But consider the timing of his speech.  Why wait until April 7, 2010, to deliver that message directly to the public?  Why not give that speech in January? Or February? Or any time before March 21?

The reason is obvious: Bernanke held back to appease his political masters.

Until three weeks ago, the nation was locked in a debate over whether Congress should enact ObamaCare, the most sweeping piece of social legislation in American history.  That law includes two new health care entitlements – the very type of government spending driving the federal budget further into the red.  Democrats rigged the bill so that the CBO was obliged to score it as deficit-reducing, but 87 percent of the public weren’t buying it.

If Bernanke really wanted to warn the American public about the dangers of rising budget deficits, then a congressional debate over creating two new entitlement programs would be the most important time to deliver that message.  Democrats would have responded that ObamaCare would reduce the deficit.  But since voters believe ObamaCare to be a budget-buster, Bernanke’s warning would have dealt ObamaCare a serious blow.

Had Bernanke delivered his populist warning before January 28, it could have jeopardized his confirmation by the Senate to a second term as Fed chairman.  Had he done so between January 28 and March 21, he would have suffered a storm of criticism from Democrats (and possible retribution when his term came up for renewal in 2013) because his sensible, responsible warning would have made moderate House Democrats more skeptical about ObamaCare’s new entitlements.

So Bernanke pulled his punches.  As Dick Morris would put it, anyone who doesn’t think that political concerns affected Bernanke’s timing is too naive for politics.

Bernanke’s behavior thus reveals why ObamaCare’s cost would exceed projections and would increase the deficit.

Knowledgeable leftists, notably Tom Daschle and Uwe Reinhardt, recognize that Congress is no good at eliminating wasteful health care spending because politics gets in the way.  (Every dollar of wasteful health care spending is a dollar of income to somebody, and that somebody has a lobbyist.)

The Left’s central planners believe they can contain health care costs by creating an independent government bureaucracy that sets prices and otherwise rations care without interference from (read: without being accountable to) Congress.  ObamaCare’s new Independent Payment Advisory Board is a precursor to what Daschle calls a “Health Fed,” so named to convey that this new bureaucracy would have the same vaunted reputation for independence as the Federal Reserve.

Yet Fed scholar Allan Meltzer reports, and Bernanke’s behavior confirms, that not even the hallowed Federal Reserve can escape politics:

In reading the minutes of the Fed and watching what they do, the Fed has always been very much afraid of Congress…The idea of having a really independent agency in Washington, that’s just not going to happen…[The Fed] is very much concerned — always — about what the Congress is doing, and doesn’t want to deviate very far from that.

Politics affects Bernanke’s behavior and the Fed’s behavior. Politics will defang the Independent Payment Advisory Board and many of ObamaCare’s other purported cost-cutting measures. ObamaCare’s cost will outpace projections. The deficit will rise.

Repeal the bill.

I’m Sick of Central Planners

Education scholar Diane Ravitch has an op-ed in today’s Washington Post arguing that the nation needs to change course on K-12 education.

Ravitch was a supporter of the No Child Left Behind Act, but now she says “we wasted eight years with the ‘measure and punish’ strategy of NCLB.”

So central planning of the nation’s schools from Washington didn’t work under George W. Bush, but now Ravitch has a whole bunch of new central planning ideas for the schools. She uses the phrases “we need” and “we must” repeatedly, implying that we should impose new national rules of her choosing on all the schools.

She says: “Everyone agrees that good education requires good teachers. To get good teachers, states should insist — and the federal government should demand — that all new teachers have a major in the subject they expect to teach…”

In the column, Ravitch laments the unexpected negative consequences of NCLB, but she seems not to realize that the new policies she advocates would probably also have negative consequences. Wouldn’t demands that teachers have certain degrees push up teaching costs at a time when schools are already complaining that their budgets are stretched tight? Wouldn’t her mandate cause schools to substitute teachers with paper qualifications but poor teaching skills for other teachers who have better teaching skills? Is having a degree in a specific subject more important than teachers having qualities such as empathy, patience, and love of learning?

I don’t know the answer to those questions, and I’m not an education expert. But I do know that the experts often disagree on the best teaching methods and that the established educational wisdom is always evolving. For that reason, one-size-fits-all decrees from Washington make absolutely no sense. So why should Ravitch impose her judgment regarding teacher qualifications on all 100,000 or so public schools in America?

Let’s let the nation’s schools in their local communities try new approaches, learn from each other, and move the ball forward as they see fit. And let’s encourage folks like Ravitch to run for their local school boards if they have ideas about schooling they want to experiment with.

RIP Michael Foot, a Socialist Who Understood What Socialism Was

“Michael Foot, a bookish intellectual and anti-nuclear campaigner who led Britain’s Labour Party to a disastrous defeat in 1983, died [March 3],” reported the Associated Press. He was 96.

Foot personified the socialist tendency in the Labour Party, which Tony Blair successfully erased when he won power at the head of a business-friendly, interventionist “New Labour.” Yet Foot remained a respected, even revered, figure.

“Michael Foot was a giant of the Labour movement, a man of passion, principle and outstanding commitment to the many causes he fought for,” Blair said Wednesday. Prime Minister Gordon Brown, Blair’s partner in creating “New Labour,” praised Foot as a “genuine British radical” and a “man of deep principle and passionate idealism.”

Michael Foot may have been the most serious intellectual ever to head a major Western political party. He wrote biographies of the Labour politicians Aneurin Bevan and Harold Wilson and of H.G. Wells, as well as a 1988 book on Lord Byron, “The Politics of Paradise,” and he edited the “Thomas Paine Reader” in 1987. So when you asked Michael Foot what socialism was, you could expect a deeply informed answer. And that’s what the Washington Post got in 1982, when it asked the Labour Party leader for an example of socialism in practice that could “serve as a model of the Britain you envision.” Foot replied,

The best example that I’ve seen of democratic socialism operating in this country was during the second world war.  Then we ran Britain highly efficiently, got everybody a job… . The conscription of labor was only a very small element of it.  It was a democratic society with a common aim.

Wow. Michael Foot, the great socialist intellectual, a giant of the Labour movement, a man of deep principle and passionate idealism, thought that the best example ever seen of “democratic socialism” was a society organized for total war.

And he wasn’t the only one. The American socialist Michael Harrington wrote, “World War I showed that, despite the claims of free-enterprise ideologues, government could organize the economy effectively.” He hailed World War II as having “justified a truly massive mobilization of otherwise wasted human and material resources” and complained that the War Production Board was “a success the United States was determined to forget as quickly as possible.” He went on, “During World War II, there was probably more of an increase in social justice than at any [other] time in American history. Wage and price controls were used to try to cut the differentials between the social classes… . There was also a powerful moral incentive to spur workers on: patriotism.”

Collectivists such as Foot and Harrington don’t relish the killing involved in war, but they love war’s domestic effects: centralization and the growth of government power. They know, as did the libertarian writer Randolph Bourne, that “war is the health of the state”—hence the endless search for a moral equivalent of war.

As Don Lavoie demonstrated in his book National Economic Planning: What Is Left?, modern concepts of economic planning—including “industrial policy” and other euphemisms—stem from the experiences of Germany, Great Britain, and the United States in planning their economies during World War I. The power of the central governments grew dramatically during that war and during World War II, and collectivists have pined for the glory days of the War Industries Board and the War Production Board ever since.

Walter Lippmann was an early critic of the collectivists’ fascination with war planning. He wrote, “A close analysis of its theory and direct observation of its practice will disclose that all collectivism… is military in method, in purpose, in spirit, and can be nothing else.” Lippmann went on to explain why war—or a moral equivalent—is so congenial to collectivism:

Under the system of centralized control without constitutional checks and balances, the war spirit identifies dissent with treason, the pursuit of private happiness with slackerism and sabotage, and, on the other side, obedience with discipline, conformity with patriotism. Thus at one stroke war extinguishes the difficulties of planning, cutting out from under the individual any moral ground as well as any lawful ground on which he might resist the execution of the official plan.

National service, national industrial policy, national energy policy—all have the same essence, collectivism, and the same model, war. War is sometimes, regrettably, necessary. But why would anyone want its moral equivalent?

Groopman on How Behavioral Economics Undermines the Case for Central Planning

In The New York Review of Books, oncologist and author Jerome Groopman delivers a stunning rebuke to those in the Obama administration (read: OMB director Peter Orszag) who think the federal government can improve health care quality by telling doctors how to practice medicine:

in the Senate health care bill…Doctors and hospitals that follow “best practices,” as defined by government-approved standards, are to receive more money and favorable public assessments. Those who deviate from federal standards would suffer financial loss and would be designated as providers of poor care…

Over the past decade, federal “choice architects”—i.e., doctors and other experts acting for the government and making use of research on comparative effectiveness—have repeatedly identified “best practices,” only to have them shown to be ineffective or even deleterious.

For example, Medicare specified that it was a “best practice” to tightly control blood sugar levels in critically ill patients in intensive care. That measure of quality was not only shown to be wrong but resulted in a higher likelihood of death when compared to measures allowing a more flexible treatment and higher blood sugar. Similarly, government officials directed that normal blood sugar levels should be maintained in ambulatory diabetics with cardiovascular disease. Studies in Canada and the United States showed that this “best practice” was misconceived. There were more deaths when doctors obeyed this rule than when patients received what the government had designated as subpar treatment (in which sugar levels were allowed to vary).

That’s just one of many examples Groopman offers of government planners gone awry. He concludes:

Ironically, the failure of experts to recognize when they overreach can be explained by insights from behavioral economics…

The care of patients is complex, and choices about treatments involve difficult tradeoffs. That the uncertainties can be erased by mandates from experts is a misconceived panacea, a “focusing illusion.”

Come to think of it, Groopman makes much the same case as I did in my article, “Pay-for-Performance: Is Medicare a Good Candidate?” (Yale Journal of Health Policy, Law, and Ethics, Vol. 7, Issue 1, Winter 2007): evidence-based medicine is essential, but variation in disease burden and patient preferences (read: values) makes it impossible for central planners to define quality accurately.

Read the whole thing.