At Politico, Jeff Greenfield writes about "The Hollywood Hit Movie That Urged FDR to Become a Fascist." The movie was “Gabriel Over the White House” (1933), and, Greenfield writes, "it was designed as a clear message to President Franklin Delano Roosevelt that he might need to embrace dictatorial powers to solve the crisis of the Great Depression." Greenfield assures us that FDR did not become a dictator, but he notes that "the impulse toward strongman rule" often stems from a sense of populist grievance, along with the scapegoating of "subversive enemies undermining the nation." Depending on the time and the strongman, those subversive enemies can be Jews, capitalists, Wall Street, the 1 percent, homosexuals, or, in some countries, Americans.
Gene Healy wrote about "Gabriel" 10 years ago in The Cult of the Presidency and in this column in 2012:
...many of us still believe in authoritarian powers for the president.
In a November 2011 column, the Washington Post's Dana Milbank offered "A Machiavellian model for Obama" in Jack Kennedy's "kneecapping" and "mob-style threats" against steel-company executives who'd dared to raise prices.
After the obligatory caveat ("President Obama doesn't need to sic the FBI on his opponents"), Milbank observed that "the price increase was rolled back" only after "subpoenas flew [and] FBI agents marched into steel executives' offices": "Sometimes, that's how it must be. Can Obama understand that?"
Greenfield says "Gabriel" was both a commercial and critical hit, but "faded into obscurity, in large measure because the idea of a 'benevolent dictatorship' seemed a lot less attractive after the degradation of Hitler, Mussolini and Stalin."
But that wasn't so obvious in 1933. As I wrote in a review of Three New Deals by Wolfgang Schivelbusch, there was a lot of enthusiasm in the United States for central planning and "Fascist means to gain liberal ends." Two months after Roosevelt's inauguration, the New York Times reporter Anne O’Hare McCormick wrote that the atmosphere in Washington was “strangely reminiscent of Rome in the first weeks after the march of the Blackshirts, of Moscow at the beginning of the Five-Year Plan.… America today literally asks for orders.”
In their highly influential book on behavioral economics, Nudge, Richard H. Thaler and Cass R. Sunstein devote two pages to the notion of "bad nudges." They describe a "nudge" as any aspect of the choice architecture that alters people’s behavior in a predictable way without forbidding any options or significantly changing their economic incentives. The classic example of a nudge is an employer's decision to "opt in" or "opt out" employees from a 401(k) plan while allowing the employee to reverse that choice; the empirical evidence strongly suggests that opting employees into such plans dramatically raises 401(k) participation. Many parts of the book advocate more deliberate choice architecture on the part of the government in order to "nudge" individuals in the social planner's preferred direction.
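The mechanics of a default effect are simple enough to sketch. In the toy model below, every number is hypothetical (the inertia share, the share of workers who genuinely prefer to enroll); it is an illustration of the logic, not an estimate from the 401(k) literature. A fixed fraction of employees passively accept whatever default they are handed, the rest follow their own preferences, and the only thing the choice architect changes is the default:

```python
def participation(default_enrolled, prefer_share=0.6, inertia=0.5):
    """Toy model of a 401(k) default effect (all parameters hypothetical).

    A fraction `inertia` of employees passively accept the default;
    the rest enroll only if they actually prefer to (`prefer_share`).
    """
    active_share = 1 - inertia
    default_outcome = 1.0 if default_enrolled else 0.0
    return inertia * default_outcome + active_share * prefer_share

# Same workforce, same preferences; only the default differs.
opt_out_rate = participation(default_enrolled=True)   # ~0.8 enrolled
opt_in_rate = participation(default_enrolled=False)   # ~0.3 enrolled
```

Nothing is forbidden and no incentive changes, yet enrollment under automatic enrollment is far higher than under opt-in, which is the qualitative pattern Thaler and Sunstein emphasize.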
Thaler and Sunstein provide only a short discussion and unconvincing examples of bad nudges. They correctly note: "In offering supposedly helpful nudges, choice architects may have their own agendas. Those who favor one default rule over another may do so because their own economic interests are at stake" (p. 239). With respect to nudges by the government, their view is: "One question is whether we should worry even more about public choice architects than private choice architects. Maybe so, but we worry about both. On the face of it, it is odd to say that the public architects are always more dangerous than the private ones. After all, managers in the public sector have to answer to voters, and managers in the private sector have as their mandate the job of maximizing profits and share prices, not consumer welfare."
In my recent work (with Jim Marton and Jeff Talbert), we show how bad nudges by public officials can work in practice through a compelling example from Kentucky. In 2012, Kentucky implemented Medicaid managed care statewide, auto-assigned enrollees to one of three plans, and allowed switching. This fits the "choice architecture" and "nudge" design described by Thaler and Sunstein. One of the three plans, KY Spirit, was decidedly lower quality than the other two, especially in eastern Kentucky. For example, KY Spirit was unable to contract with the dominant health care provider in eastern Kentucky after unsuccessful rate negotiations. KY Spirit’s difficulties in eastern Kentucky were widely reported in the press, so we would expect greater awareness of differences in MCO provider network quality in that region.
President Trump’s appointment of Gina Haspel as the new director of the Central Intelligence Agency has revived memories of the abuses the CIA committed during George W. Bush’s administration. The appointment is indeed deeply troubling, since Haspel ran one of the Agency’s infamous overseas “black sites” that featured “enhanced interrogation” techniques (a cynical euphemism for torture). But as I point out in a new National Interest Online article, Haspel’s conduct is the symptom of a much deeper problem. Both during the Cold War and the war on terror, too many U.S. officials have succumbed to the temptation to combat evil behavior with evil behavior. In the process, they have undermined and imperiled fundamental American values.
There is no question that communist powers and radical Islamic terrorists are morally odious adversaries. The United States rightly condemned Moscow’s subjugation of Eastern Europe and the Kremlin’s global subversion campaigns against other societies. But Washington’s conduct was hardly exemplary, and as the Cold War continued, U.S. behavior became increasingly questionable. Especially shameful were those cases in which Washington subverted and overthrew democratic governments to help install “friendly dictators.” There is now indisputable evidence that the United States was involved in such disreputable moves against elected governments in Iran, Guatemala, Chile, and other countries. Indeed, U.S. leaders seemed to prefer pliable autocrats to unpredictable pluralistic systems. When General Chun Doo‐hwan overthrew an embryonic democratic government in South Korea, John Wickham, the commander of U.S. forces in that country, excused the seizure of power, saying that South Koreans were “lemming‐like” and needed a strong leader.
Indeed, in the name of waging the Cold War, U.S. officials flirted with utterly horrific options. During the Kennedy administration, the CIA concocted a scheme to stage false flag attacks, including blowing up civilian airliners, as a phony justification to invade Cuba and oust Fidel Castro. Fortunately, the White House rejected the scheme, but that supposedly ethical officials could even consider murdering innocent Americans as a geopolitical pretext illustrated just how much U.S. policymakers were beginning to emulate their immoral communist counterparts.
The casual willingness to cut moral corners is evident in the war on terror as well. In addition to the CIA’s own use of torture, Washington embraced the practice of rendition, whereby the United States sent accused terrorists to cooperative dictatorships renowned for using torture techniques that made even the U.S. conduct look mild. Those governments included Saudi Arabia’s brutal theocratic autocracy, Hosni Mubarak’s dictatorship in Egypt, and Bashar al-Assad’s regime in Syria. Outsourcing torture in that fashion, however, did nothing to dilute America’s responsibility for the resulting egregious human rights violations.
Washington’s overseas military conduct in the war on terror is equally troubling. The awful destruction that U.S. forces have visited on Afghanistan, Iraq, Libya, and Syria has resulted in the deaths of hundreds of thousands of innocent civilians and turned millions of others into destitute refugees. In addition to the needless carnage that the U.S. military has inflicted directly, the United States is an active accomplice in Saudi Arabia’s atrocity‐filled war in Yemen.
The German philosopher Friedrich Nietzsche expressed the cautionary admonition: “Beware that, when fighting monsters, you yourself do not become a monster.” Too often, U.S. leaders have ignored that warning. In doing so, they have established disturbing, sometimes horrifying, precedents that betray the basic values of a liberal democracy.
Congress passed its first naturalization law 228 years ago on March 26, 1790. The Naturalization Act of 1790 was the most open naturalization law in the world at the time, allowing free white persons of good character to naturalize after two years of residence in the country and one year of residence in a particular state. Denying citizenship to American Indians, free blacks, indentured servants, and others who did not count as free white persons was a great injustice, but the 1790 Act was an improvement over other countries at the time that also limited naturalization based on gender, skill, or religion in addition to race. Although the Naturalization Act of 1790 did place some restrictions on who could become a citizen, it placed no restrictions on who could enter the United States. The Supreme Court largely corrected that injustice in 1898 in its United States v. Wong Kim Ark decision, when it ruled that children of immigrants, including non-whites, were also citizens if they were born in the United States.
The Western world had a long legal, social, and ethical tradition of openness to immigrants and naturalization that culminated in some of the best portions of the American immigration system. Hillsdale College history professor Bradley J. Birzer wrote a wonderful essay for The American Conservative in January that showed that our civilizational heritage is replete with relatively open borders and the unencumbered movement of people across them with few practical restrictions, with the United States as a recent inheritor of such thought. Although Greeks and Romans both had liberal migration systems, there were important distinctions between Roman and Greek practices of naturalization and citizenship. The American Founding Fathers decidedly favored a model close to that of the Romans over that of the more restrictive Greeks.
Congress has passed an omnibus appropriations bill that jacks up spending across the board. Projections from the Committee for a Responsible Federal Budget show that the federal river of red ink is fast becoming a flood.
The chart shows CRFB’s “alternative” projection, which is the likely budget path if policymakers do not make major reforms. Deficits are expected to rise relentlessly, topping $1 trillion next year and hitting $2.4 trillion by 2028. That means spending $2.4 trillion more than available revenue that year.
The next president will come into office in early 2021, and the nation will be facing the most dangerous budget situation in peacetime history. If policies are not changed between now and then, he or she will be looking at 10‐year deficits of $20 trillion or more. If you think Washington is a dysfunctional mess now with members at each other’s throats, I am guessing that today is a picnic compared to federal policymaking down the road.
E-Verify is a federal government program that allows businesses to check the identities of new hires against federal databases to judge whether they are eligible to legally work in the United States. The goal of the program is to deny illegal immigrants work in the United States. E-Verify has serious problems: it misidentifies a small portion of legal workers as illegal immigrants, imposes a serious regulatory burden on employers and employees, increases employee turnover costs, is expensive, stimulates black market document forging and identity theft, might increase crime, and fails in its primary function of turning off the wage magnet.
Despite all of those problems, the best thing about E-Verify is that many employers do not use it even in states where it is mandated, and workers have many ways to get around the system, reducing the cost of the mandate. Government data on the number of E-Verify checks run in each state are sketchy and seem to change with each new FOIA request, but the most recent response I received from the Department of Homeland Security revealed that my previous work likely overestimated rates of E-Verify compliance in South Carolina.
South Carolina mandated E-Verify for all employers in 2011 but delayed the start date until January 1, 2012, because (surprise) the system was more complicated than its proponents claimed and the state government did not want to punish every small employer in the state for noncompliance. Despite that, proponents of mandatory E-Verify point to South Carolina as a model system because the state Department of Labor, Licensing, and Regulation (DLLR) conducts random audits of employers to guarantee that they use the system for all new hires.
South Carolina E-Verify Compliance
Sources: Department of Homeland Security and Longitudinal Employer-Household Dynamics Survey.
ObamaCare turns eight years old today. Some opponents had hoped to mark the occasion by giving supporters the birthday gift they’ve always wanted: a GOP-sponsored bailout of ObamaCare-participating private insurance companies. Fortunately, a dispute over subsidies for abortion providers killed what could have been the first of many GOP ObamaCare bailouts.
ObamaCare premiums have been skyrocketing. All indications are this will continue in 2019, with insurers announcing premium increases of 32 percent or more just before this year’s mid-term elections. Some Republicans fear voters will punish them for the effects of a law every Republican opposed and most still want to repeal.
Senate health committee chairman Lamar Alexander (R-TN), Sen. Susan Collins (R-ME), and House Energy & Commerce Committee chairman Greg Walden (R-OR) hope to avert calamity by expanding on a proven failure. For months, they have been pushing legislation that would resurrect ObamaCare’s expired “reinsurance” program with $30 billion of new funding.
ObamaCare’s architects knew the law's preexisting-conditions provisions would effectively destroy the individual health insurance market. They added the reinsurance program in an attempt to put Humpty Dumpty back together again.
ObamaCare’s preexisting-conditions provisions both increase health-insurance premiums and reduce health-insurance quality. They raise premiums, first, by requiring insurers to cover patients with uninsurable preexisting conditions, and second, by unleashing adverse selection. Those factors in turn reduce quality by effectively punishing insurers who offer high-quality coverage for the sick.
From 2014 until it expired at the end of 2016, ObamaCare’s reinsurance program gave participating insurers extra taxpayer subsidies to cover the claims of high-cost patients whom its preexisting-conditions provisions require them to cover at a loss. The extra subsidies were supposed to reduce premiums, and prevent a race to the bottom fueled by ObamaCare’s penalties on quality coverage.
If ObamaCare’s reinsurance program was supposed to keep premiums from skyrocketing, it was an utter failure. Premiums increased 18-25 percent per year from 2013 through 2016, well above the trend of 3-4 percent from 2008 to 2013. By 2017, premiums had doubled—a cumulative increase of 99 percent or 105 percent, depending on the source—from pre-ObamaCare levels. ObamaCare’s preexisting-conditions provisions were the driving force behind these premium increases.
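Those cumulative figures are just compounding at work. As a quick sanity check, take an illustrative 19 percent annual increase (a rate inside the 18-25 percent band reported above, chosen here only for the arithmetic) over the four years from 2013 to 2017:

```python
def cumulative_increase(annual_rates):
    """Compound a sequence of annual growth rates into one cumulative rate."""
    level = 1.0
    for rate in annual_rates:
        level *= 1 + rate
    return level - 1

# Four years of 19% annual increases: 1.19**4 ≈ 2.005,
# i.e., a cumulative increase of roughly 100 percent (a doubling).
four_year_total = cumulative_increase([0.19] * 4)
```

Any annual rate in the reported band compounds to a cumulative increase near or above 100 percent, which is why premiums roughly doubled from pre-ObamaCare levels, far outstripping what four years of the earlier 3-4 percent trend would have produced.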