Following U.S. Secretary of State John Kerry’s saber-rattling statements on August 26, the value of the Syrian pound (SYP) has zigged and zagged. The SYP lost 24.7% of its value against the U.S. dollar in the two days following Kerry’s announcement (moving from 225 to 270 SYP/USD). Then, over the past two days, the pound reversed course sharply, regaining 25.58% of its value and bringing the black-market exchange rate back down to 215 SYP/USD. At that rate, the implied annual inflation rate is 209.85% (see the charts below the jump).
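As a back-of-the-envelope check, the pound’s quoted 25.58% gain follows directly from the two black-market rates in the text (this is only a sketch of the percentage-change arithmetic; a rise in SYP/USD means a weaker pound, and the implied-inflation figure itself rests on a more involved purchasing-power-parity methodology not shown here):

```python
# Black-market SYP/USD rates quoted in the text.
rate_before = 270  # SYP per USD after Kerry's announcement
rate_after = 215   # SYP per USD two days later

# The pound's value in dollars is proportional to 1/rate, so its
# percentage gain is (1/rate_after) / (1/rate_before) - 1,
# which simplifies to rate_before / rate_after - 1.
gain = rate_before / rate_after - 1
print(f"{gain:.2%}")  # prints 25.58%
```

The same formula applied in reverse (225 to 270 SYP/USD) gives the size of the earlier slide in the dollar’s terms; the depreciation figure in the text is computed against the pound’s value.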
So, what caused the recent strengthening of the Syrian pound? We need look no further than the eroding support for a U.S.-led strike against Syria. Indeed, the United States has lost the support of important allies, including the United Kingdom, Canada, and Italy.
In addition, Syrian authorities have cracked down again on black-market currency trading. In the past week, the authorities have shut down a number of currency traders; made “friendly” reminders to the public of the penalties for trading on the black market (10 years’ imprisonment and a hefty fine); and warned Syrians to stay away from “counterfeit” dollars that have supposedly been circulating. The authorities’ “get tough” policy followed speculation that the SYP/USD rate would surpass the 300 mark.
I have established a page to track current black-market exchange-rate and implied inflation data for the Syrian pound, as well as for troubled currencies in Iran, Argentina, North Korea, and Venezuela. For more, see: The Troubled Currencies Project.
With friends and allies backing away from war with Syria, President Obama has been reduced to threatening unilateral military action—just enough so the administration won’t be “mocked,” said one unnamed official. But that’s also enough to violate the Constitution’s requirement for a congressional declaration of war.
The nation’s Founders feared just such a moment. John Jay pointed to the dubious motives that caused kings “to engage in wars not sanctified by justice or the voice and interests of his people.” So the Framers gave most military powers to Congress. Under Article I, Section 8, Clause 11, “Congress shall have the power … to declare war.”
Future president James Madison explained the “fundamental doctrine of the Constitution that the power to declare war is fully and exclusively vested in the legislature.” The Founders did recognize that the president might have to respond to attack; however, this was a very limited grant of authority. George Mason favored “clogging rather than facilitating war.” James Wilson observed: “It will not be in the power of a single man, or a single body of men, to involve us in such distress; for the important power of declaring war is in the legislature at large.” Thomas Jefferson approved the “effectual check to the dog of war by transferring the power of letting him loose.”
No surprise, many presidents have pushed against the Constitution’s restrictions, unilaterally employing the military for different operations. However, most such deployments have been limited and temporary and many had colorable legislative authority.
Even strong presidents acknowledged the limits on their power. George Washington explained, “The Constitution vests the power of declaring war with Congress; therefore no offensive expedition of importance can be undertaken until after they shall have deliberated upon the subject, and authorized such a measure.” Similarly, Dwight Eisenhower said, “I am not going to order any troops into anything that can be interpreted as war, until Congress directs it.”
Barack Obama once agreed with his predecessors. In December 2007 candidate Obama acknowledged, “The president does not have power under the Constitution to unilaterally authorize a military attack in a situation that does not involve stopping an actual or imminent threat to the nation.”
On August 19, the Cato Institute released a study by me and Charles Hughes, The Work vs. Welfare Trade‐Off, 2013: An Analysis of the Total Level of Welfare Benefits by State, showing that a family collecting welfare benefits from seven common programs – Temporary Assistance for Needy Families (TANF), food stamps, Medicaid, WIC, public housing assistance, utilities assistance (LIHEAP) and free commodities – could receive more than a minimum wage job would pay in 35 states. Critics responded: so raise the minimum wage.
Making work pay better, especially in the sort of entry-level jobs that people leaving welfare can expect to find, is a terrific goal. Unfortunately, government has very little ability to force such increases. Attempts to simply mandate that businesses pay more, whether through increased minimum wages, living wage laws, or mandated employee benefits like health insurance (see Obamacare), primarily result in fewer jobs.
The amount of compensation a worker receives is more or less a function of his or her productivity. As Greg Mankiw, Chairman and Professor of Economics at Harvard University explains, “Economic theory says that the wage a worker earns, measured in units of output, equals the amount of output the worker can produce.” This somewhat oversimplifies, of course. There are other factors involved. But one can’t just arbitrarily declare a worker’s value.
The academic evidence on this point is pretty clear. A comprehensive review of more than 100 studies on the minimum wage by David Neumark and William Wascher for the National Bureau of Economic Research found that 85 percent of the studies they reviewed found negative employment effects. Neumark and Wascher concluded, “the preponderance of the evidence points to disemployment effects… [and] studies that focus on the least‐skilled groups provide relatively overwhelming evidence of stronger disemployment effects for these groups.”
Indeed, evidence of employment losses goes all the way back to 1938 and the first federally imposed minimum wage. The U.S. Department of Labor concluded that the first 25‐cent minimum wage resulted in the loss of 30,000 to 50,000 jobs, or 10 to 13 percent of the 300,000 workers affected by the increase.
More recently, Michael Hicks of Ball State University looked at the impact of the July 2008 minimum wage increase on unemployment rates in the United States and concluded that a 10 percent increase in the minimum wage results in a roughly 0.19 percent increase in unemployment, meaning the loss of about 160,000 jobs.
And a study by Joseph Sabia and Richard Burkhauser for the Employment Policy Institute concluded that an increase in the federal minimum wage to $9.50 would result in the loss of 1.3 million jobs, primarily low‐skilled jobs. Moreover, Sabia and Burkhauser concluded that there would be very little gain in exchange for this pain. According to their research, state and federal minimum wage increases between 2003 and 2007 had no effect on state poverty rates.
Or simply look at how Obamacare’s employer mandate is causing businesses to shift workers to part time or to reduce hiring in response to higher labor costs.
We should understand that for most low‐skilled workers, such as the hypothetical mother in our welfare study, a minimum wage job is a starting point, not a destination. Nearly two‐thirds of minimum wage workers receive a raise within one year, with the median hike for full‐time workers about 14 percent. Indeed, we know that just 2.6 percent of full‐time workers (including minimum wage workers) live in poverty.
That is not to say we shouldn’t try to increase entry level wages. But the best way to accomplish that goal is to create a climate that leads to greater economic growth overall. Greater prosperity eventually finds its way into higher wages for workers, including those at the bottom of the ladder. That means reducing taxes and regulations to encourage people to invest and expand their businesses.
Somehow, I don’t think that is what most critics of our study have in mind.
The United States faces no serious military threats today, yet is constantly at war. Syria is the latest target.
Traditionally, Washington did not go looking for wars to fight. The government’s duty was to protect the American people from conflict.
Measured by this standard, there is no cause for intervening in the Syrian imbroglio. The regime has little capacity to harm the U.S. or to resist the overwhelming retaliation that would follow any attack. Syria’s chemical weapons have little more utility than high explosives and nothing close to the killing capacity of America’s many nuclear weapons.
The possibility of radical Islamist insurgents gaining control over territory is more worrisome, but is most likely in the event of U.S. intervention against the Assad government. The conflict is destabilizing, but friendly states should deal with the consequences.
Of course, the Syrian civil war is a tragedy, like many others throughout history. Civil wars may be the worst, often with few genuine good guys.
The rebels are united only by their opposition to Assad. The strongest factions appear least interested in a liberal, democratic future for Syria and most interested in using Syria to attack Americans.
Richard Lindzen, Professor Emeritus at MIT, and now a Distinguished Senior Fellow in the Center for the Study of Science here at Cato, has just published a paper called “Science in the Public Square: Global Climate Alarmism and Historical Precedents.” The paper is in the Journal of American Physicians and Surgeons.
Lindzen begins with what he calls “The Iron Triangle,” an analog to a very popular aphorism coined by Ronald Reagan, that describes the generic and mutually beneficial relationship between Congress, the media, and special interest groups. Lindzen’s version is between scientists who make “meaningless or ambiguous statements” on climate change, which are translated into alarmist declarations by the global warming lobby, to which politicians respond by shoveling more money to the scientists. Dr. Lindzen cheekily calls this version the Iron Rice Bowl, the same phrase coined by Mao Zedong to describe lifetime employment in exchange for support of the communist state.
Lindzen, whose article is available here, notes this type of symbiosis supported two other particularly bad ideas. One was early 20th-century eugenics, which was enshrined in law in the United States, politically very useful in 1920s Germany, and institutionalized into the Holocaust in the succeeding decade. For an exhausting and exhaustive insight on this process in Germany, you still can’t beat Robert J. Lifton’s 1986 book, The Nazi Doctors.
A similar dynamic surrounded the institutionalization of the obviously incorrect paradigm of “the inheritance of acquired characteristics,” championed by Soviet agronomist Trofim Lysenko with the enthusiastic support of Josef Stalin, who thought it would help bring about “The New Soviet Man” by changing human nature genetically through the physical experiences of the organism. The logic is as simple as this: if you, say, pumped iron incessantly with just your left arm, your children would be born with muscular left arms. Hogwash, but effective for a public that both feared its government and was scientifically illiterate.
“The Situation in Biological Science,” published (and translated) by the Lenin Academy of Agricultural Sciences of the U.S.S.R. in 1949 sits very close to Lifton’s book on my shelf at Cato. It is an exhaustive compendium on Lysenko’s “new genetics.” It claims authority, and if you spoke against it as a scientist, a trip to Siberia (or worse) wasn’t far away. With global warming alarmism, we are much more humane. Speak against it, and you will lose your government funding and maybe your job, but not your life.
Lindzen finishes with a bit of optimism, noting that eugenics and Lysenkoism each lasted about thirty years, which would mean that the Iron Triangle of climate alarmism is getting a little long in the tooth (it started in 1988).
Methinks Professor Lindzen is a bit optimistic. After all, most regulation of ionizing radiation and carcinogens is based upon the obviously wrong notion that a single photon or a single molecule can induce cancer. That was enshrined in the 1950s and lives on today.
An issue with my paper on corporate welfare in the federal budget is that cases can be made for other expenditures not on the list. A prime example would be Pentagon weapons procurement. I’ll simply say that deciding what counts and what doesn’t is complicated.
The New York Times has another example of what could be considered a form of corporate welfare: excessive federal reimbursement rates for anti‐anemia drugs used by dialysis centers. This excerpt provides the background:
The multibillion‐dollar dialysis industry has been accused by medical researchers and former employees of putting a higher priority on profits than on care before, giving patients for many years too many doses of the expensive anti‐anemia drug Epogen to collect higher reimbursements — allegations the companies have strongly disputed.
The excessive payments to the companies since 2011 came about, in fact, as the federal government tried to create a single bundled payment for each patient visit. The idea was to eliminate the incentive to prescribe too many doses of Epogen, which medical research showed was harming patients.
With the profit incentive gone, use of Epogen dropped even more than the federal government expected. So the amount of money set aside in the new bundle exceeded the cost of the drugs, two separate federal audits in the last year have shown.
The industry, as a result, has collected an extra $530 million to $880 million a year in federal payments since 2011, compared with the actual use of Epogen and other dialysis drugs. That is the windfall that Congress ordered Health and Human Services to eliminate in January.
Thanks to a massive lobbying campaign from the dialysis industry ($8 million since 2009), Congress is now considering giving back the taxpayer‐financed windfall. According to the Times, “more than 100 of the same members of Congress who voted in January to impose the cut are now trying to push the Obama administration to reverse it or water it down.” Leading the charge to reverse the cuts are “lawmakers who are among the top recipients of campaign contributions from the industry, including Representatives John Lewis, a Georgia Democrat, and John M. Shimkus, an Illinois Republican, as well as Mr. [Ben] Lujan [D-NM].”
And, surprise, the effort is bipartisan:
The full‐court press has energized Congress. A broad coalition, including conservative Republicans and liberal Democrats and many in between, has joined the industry appeal, with 205 members of the House alone signing a letter this month to the Medicare administrator asking her to reconsider the proposed cut. Half of those signers voted in favor of the cut in January.
In sum, Republicans are teaming up with Democrats to keep the taxpayer dollars flowing to a special interest.
It’s just another day in the Beltway.
Today, the Department of Justice finally announced its first official response to the dramatic changes underway at the state level with respect to legalizing marijuana.
As a matter of law, a direct legal challenge to the state initiatives approved by voters in Colorado and Washington would have failed. A basic principle of constitutional law is that the federal government cannot “commandeer” the state legislatures and tell them what laws they should pass and what laws they can repeal. The state laws that legalize marijuana are not obstructing the FBI or DEA from enforcing federal law – and that’s the key test.
As a matter of policy, if the Obama administration is not yet ready to admit that the drug war is a failed policy, it should at least respect the prerogatives of the states that are choosing to legalize marijuana in their respective jurisdictions. Today’s announcement is an important step in that direction.