Tag: technology

Americans Have More than They Realize

According to Gallup, more Americans think of themselves as “have-nots” today than at any point since Gallup began posing the question almost thirty years ago, while fewer Americans see themselves as “haves.” (Please see Emily Ekins’s earlier post for an in-depth analysis from a different angle). But do Americans actually have less in 2015 than in 1988? Let’s dig into the data to see whether Americans might have more than they realize.

2015 marks the first year in which Americans spent more money dining out than on groceries. Let’s examine why that might be. In 2015, U.S. GDP per person (adjusted for inflation) reached an all-time high. At the same time that average personal wealth is rising, many necessities like food are going down in price. As a result, spending on the basics takes up a smaller and smaller share of an American’s personal disposable income—dropping from 39% in 1988 to 32% in 2013. This means that Americans have more money left at the end of the day, which they can then choose to save, invest, or spend on luxuries like dining out.

Not only are Americans wealthier on average, but they are also working less. The average American worker in 2015 works 30 fewer hours in a year than her counterpart in 1988, and yet is almost $18,000 richer in real terms.

HumanProgress.org advisory board member Mark Perry recently pointed out that today’s young Americans may actually be the luckiest generation in history, based on what they can buy with earnings from a summer job. And increases in real wealth do not capture technological advances, which also contribute to rising living standards. The quality and variety of available goods are improving across the board. Almost no one in the United States had a cell phone back in 1990, but today they’re ubiquitous—and more useful, with an app for just about everything.

In many ways, Americans have more today than ever before: more leisure time away from work, more disposable income left after basic expenses, more choice in what they buy, and more advanced technologies at their fingertips. Of course, there are still people who live in genuine need. The Great Recession and various growth-retarding policy decisions have done great harm, especially to the poor. Still, if the many positive trends that we are seeing continue, then hopefully more Americans will come to count themselves among the haves instead of the have-nots. To learn more about improving living standards in the United States and beyond, pay a visit to HumanProgress.org.


Innovating Within an Overregulated Alcohol Landscape: A #CatoDigital Discussion

April is Alcohol Awareness Month. What better time to take a close look at one of our nation’s most heavily regulated industries and the inventive ways entrepreneurs are innovating within this realm?

The ratification of the 21st Amendment may have officially ended this nation’s failed experiment with alcohol Prohibition, but the policy hangover has had lingering effects. From dry counties to bans on Sunday sales, the sale of alcohol is severely restricted in a confusing patchwork of local, state, and federal regulations. Homebrewing was not legal in all 50 states until 2013 (and homebrewers still cannot legally sell their product). Eighteen states maintain a state monopoly over the wholesaling or retailing of some or all categories of alcoholic beverages. But, even in this stifling regulatory environment, intrepid businesses are finding new ways to serve thirsty consumers.

One real-world example of this is Klink, formerly known as DrinkDrivers, a rapidly growing start-up with a strong foothold in the nation’s capital. The app-based alcohol delivery company relies upon the mechanisms of the sharing economy—which has faced its own share of difficulties from overzealous regulators—to navigate the treacherous legal landscape of the American alcohol industry.

The concept behind Klink is a simple one: modern consumers want the ease of on-demand goods and services, deliverable at the touch of a button, wherever they are. Yet, Klink is not an alcohol provider in the traditional sense.

Unlike many other businesses in the sharing economy, Klink is stringent in its adherence to the laws and regulations governing alcohol sales. When you place an order, the company does not itself process your payments or deliver your alcohol. Instead, Klink plays the role of middleman, partnering with licensed liquor retailers, providing an easy-to-use online platform to connect alcohol providers with customers and occasionally running localized marketing campaigns.
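To make that middleman model concrete, here is a minimal, entirely hypothetical sketch of the kind of order flow described above. None of the names or fields below come from Klink’s actual system; they only illustrate the pattern of matching a customer with a licensed retailer that handles payment and delivery itself.

# Hypothetical sketch of a "middleman" delivery platform (illustrative only;
# not Klink's actual code or data model). The platform never takes possession
# of the alcohol or the payment; it simply routes the order to a licensed
# retailer permitted to fulfill it.

from dataclasses import dataclass

@dataclass
class Retailer:
    name: str
    licensed: bool
    delivery_zips: set

@dataclass
class Order:
    customer_zip: str
    items: list

def route_order(order, retailers):
    """Return a licensed retailer that can serve the customer's zip code, or None."""
    for r in retailers:
        if r.licensed and order.customer_zip in r.delivery_zips:
            return r
    return None  # no compliant retailer serves this area

# Example: the order is matched to a licensed shop, which then processes
# payment and delivers under its own license.
shops = [Retailer("Capitol Spirits (hypothetical)", True, {"20001", "20002"})]
match = route_order(Order("20001", ["sparkling wine"]), shops)
print(match.name if match else "No licensed retailer available")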

Tomorrow at noon, I’ll be moderating a live-streamed lunchtime discussion featuring my colleague Matthew Feeney, who is Cato’s leading expert on the sharing economy; David Ozgo, Senior Vice President of Economic & Strategic Analysis at the Distilled Spirits Council of the United States (DISCUS); and Klink’s Founder and CEO, Jeffrey Nadel.

We’ll be discussing the ways in which Klink is navigating the treacherous regulatory waters of both the sharing economy and the alcohol industry, the regulatory hurdles standing in its way, and what this means for the future of tech innovation and alcohol sales. The panel will be live-streamed, and at-home viewers are encouraged to participate in the Twitter discussion—and tweet their questions—using #CatoDigital.

…In Which Katz Is Not Cited

The Supreme Court is gradually coming to terms with the effect information technology is having on the Fourth Amendment. In 2001, in Kyllo, the Court curtailed the use of high-tech devices for searching homes. In its early 2012 decision in United States v. Jones, a unanimous Court agreed that government agents can’t attach a GPS device to a vehicle and track it for four weeks without a warrant.

But the Court was divided as to rationale. The majority opinion in Jones found (consistent with Cato’s brief) that attaching the device to the car was at the heart of the Fourth Amendment violation. Four concurring members of the Court felt that the government’s tracking violated a “reasonable expectation of privacy.”

What is the right way to decide these cases? Fourth Amendment law is at a crossroads.

The next round of development in Fourth Amendment law may come in a pair of cases being argued in April. They ask whether government agents are entitled to search the cell phone of someone they’ve arrested merely because the phone has been properly seized. Riley v. California and United States v. Wurie have slightly different fact patterns, which should allow the fullest exposition of the issues.

Cato’s brief in Riley, filed this week, again seeks to guide the Court toward using time-tested principles in Fourth Amendment cases. Rather than vague pronouncements about privacy and people’s expectations around it, we invite the Court to apply the Fourth Amendment as a law.

The Boy Who Cried Wolf Was Eventually Right

“We are reaching end times for Western affluence,” warns economist Stephen King (insert obligatory horror joke here) in yesterday’s New York Times. King, who has authored a book entitled When the Money Runs Out: The End of Western Affluence, joins the ranks of economic Cassandras like Tyler Cowen and Robert Gordon, both of whom have made waves with pessimistic takes on the U.S. economy’s prospects. Like Cowen and Gordon, King couches his claims in overstatements that make it easier for skeptical readers to dismiss his arguments. Peel away the hype, though, and these growth pessimists are still fundamentally correct. The wolf really is at the door this time. In other words, the growth outlook really is darkening.

Cowen put the hype right in the title of his attention-getting book: The Great Stagnation, his term for the past 40 years or so. Of course, real GDP per capita has nearly doubled since 1973, so stagnation is obviously an inapt term. It’s true that productivity growth and growth in median incomes have slowed down, but The Moderate Slowdown is a pretty boring book title. Meanwhile, Gordon saw Cowen and raised him with the highly provocative and speculative argument that technological progress is largely exhausted and, therefore, the 250-year era of modern economic growth is winding down. You don’t have to be Raymond Kurzweil to find that contention unpersuasive.

Now King warns that Western affluence is coming to an end. Well, it’s not: even if all growth stopped tomorrow, today’s advanced economies are affluent beyond the wildest dreams of yesteryear.

Push past the hype, though, and Cowen, Gordon, and King are making a point that really needs to be more widely understood: growth is getting harder for the U.S. economy, and there are strong reasons for thinking that growth rates over the next decade or two will fall short of the long-term U.S. historical average. As I explain in a new Cato paper released today, you don’t have to be a pessimist about the future of innovation to be pessimistic about the U.S. economy’s medium-term growth outlook. The main source of weakness lies in demographics: the 20th century saw big increases in both the percentage of the population in the workforce (thanks to the changing role of women in society) and the overall skill level of the workforce (thanks to a huge increase in formal schooling). The rise in schooling has slowed down considerably since 1980, and the labor force participation rate has actually been falling since 2000 (it’s now back to where it was in 1979). What were tailwinds for growth have turned into headwinds.
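The arithmetic behind that demographic point can be made concrete: GDP per person is roughly output per hour times hours worked per person, so its growth rate is approximately productivity growth plus growth in hours per capita. The sketch below illustrates that identity with made-up numbers (they are not figures from the Cato paper): when participation rises, the second term adds to growth; when it falls, it subtracts.

# Illustrative decomposition: GDP per capita = (output per hour) x (hours per capita),
# so its growth rate is roughly the sum of the two components' growth rates.
# All numbers are invented for illustration; none come from the paper discussed above.

def per_capita_growth(productivity_growth, hours_per_capita_growth):
    """Approximate growth in GDP per person (rates as decimals, e.g. 0.02 = 2%)."""
    return productivity_growth + hours_per_capita_growth

# Tailwind scenario: rising labor force participation adds to growth.
print(per_capita_growth(0.015, 0.005))    # ~0.020, i.e. about 2.0% per year

# Headwind scenario: falling participation subtracts from growth.
print(per_capita_growth(0.015, -0.003))   # ~0.012, i.e. about 1.2% per year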


Barack Obama, Luddite?

In the video clip above, President Obama blames America’s current unemployment problem on… automation. ATMs and airport kiosks are singled out.

These words could only be uttered by someone who knows very little about economics or the history of human progress. In fact, they could only be uttered by someone who has never reflected on this question before in his life. Because if you reflect for one moment, you come up with this glaringly obvious counterpoint: we use a lot more labor-saving technology today than in previous generations, and yet we also employ far more people. Therefore, increased automation does not lead to decreased national employment.

If you do more than just think for a second – if you read an economic history book, for instance – you discover that increased automation doesn’t even necessarily lead to decreased employment in the industry being automated! The classic example is the 19th-century British textile industry. The so-called “Luddites” smashed automated looms, fearing that they would lead to rampant unemployment in their industry. But, as the new technology proliferated, textile industry employment rose. Among other reasons, increased efficiency drastically lowered the prices of textile goods, which sent demand through the roof, and meeting that new demand required more workers to operate and maintain the new machinery.

There are other examples, of course, and the president will save the American people a great deal of hardship, and himself further embarrassment, if he familiarizes himself with them. Here’s a good brief introduction from the British Secretary of State… under Margaret Thatcher.

Update:

For those having trouble viewing the video, here is a transcript of the relevant Q&A:

Q: Why, at a time of record profits, have you been unable to convince businesses to hire more people, Mr. President?

A: [….] the other thing that happened, though, and this goes to the point you were just making: there are some structural issues with our economy, where a lot of businesses have learned to be a lot more efficient with a lot fewer workers. You see it when you go to a bank and there’s an ATM, you don’t go to a bank teller. Or you go to the airport, and you’re using a kiosk instead of checking in at the gate.

Privatize the FAA

Bloomberg is reporting more bad news for the nation’s air traffic control system, which is run by the Federal Aviation Administration. The FAA is $500 million over budget and six years behind schedule on a $2.1 billion technology upgrade project.

The FAA has a long history of mismanaged technology projects, and so the latest screw-ups are nothing new. Yet the nation needs high-tech advances in air traffic control more than ever to ease our increasingly congested airspaces.

There is a better way to run air traffic control—a private sector way, as Canada has been demonstrating. In 1996, Canada converted its government air traffic control system to a private nonprofit corporation. Nav Canada has been a smashing success, providing an excellent model for possible U.S. reforms.

A December 24 story in the Financial Post describes how Nav Canada is a world leader in efficiency, safety, and technology under private management. “A once troubled government asset, the country’s civil air traffic controller was privatized 14 years ago and is now a shining example of how to create a global technology leader out of a hulking government bureaucracy.” It really is an impressive story of pro-market reform.  

Canada’s system recently won an award from the International Air Transport Association. The IATA said that “Nav Canada is a global leader in the efficient implementation and reliable delivery of air traffic control procedures and technologies.”

We should have that type of efficient air traffic control system in this country. Privatizing the FAA should be a high priority for the next Congress.

See here for a discussion on privatizing air traffic control.

The Current Wisdom

NOTE:  This is the first in a series of monthly posts in which Senior Fellow Patrick J. Michaels reviews interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

The Current Wisdom only comments on science appearing in the refereed, peer-reviewed literature, or that has been peer-screened prior to presentation at a scientific congress.

The Iceman Goeth:  Good News from Greenland and Antarctica

How many of us have heard that global sea level will be about a meter—more than three feet—higher in 2100 than it was in the year 2000?  There are even scarier stories, circulated by NASA’s James E. Hansen, that the rise may approach 6 meters, altering shorelines and inundating major cities and millions of coastal inhabitants worldwide.

Figure 1. Model of Lower Manhattan, from a traveling climate change exhibit currently installed at the Field Museum of Natural History in Chicago, showing what 5 meters (16 feet) of sea level rise would look like.

In fact, a major exhibition now at the prestigious Chicago Field Museum includes a 3-D model of Lower Manhattan under 16 feet of water—this despite the general warning from James Titus, who has been EPA’s sea-level authority for decades:

Researchers and the media need to stop suggesting that Manhattan or even Miami will be lost to a rising sea. That’s not realistic; it promotes denial and panic, not a reasoned consideration of the future.

Titus was commenting upon his 2009 publication on sea-level rise in the journal Environmental Research Letters.

The number one rule of grabbing attention for global warming is to never let the facts stand in the way of a good horror story, so advice like Titus’s is usually ignored.

The catastrophic sea level rise proposition is built upon the idea that large parts of the ice fields that lie atop Greenland and Antarctica will rapidly melt and slip into the sea as temperatures there rise. Proponents of this idea claim that the United Nations’ Intergovernmental Panel on Climate Change (IPCC), in its most recent (2007) Assessment Report, was far too conservative in its projections of future sea level rise—the mean value of which is a rise by the year 2100 of about 15 inches.

In fact, contrary to virtually all news coverage, the IPCC actually anticipates that Antarctica will gain ice mass (and lower sea level) as the climate warms, since the temperature there is too low to produce much melting even if it warms up several degrees, while the warmer air holds more moisture and therefore precipitates more snow. The IPCC projects Greenland to contribute a couple of inches of sea level rise as ice melts around its periphery.

Alarmist critics claim that the IPCC’s projections are based only on direct melt estimates rather than “dynamic” responses of the glaciers and ice fields to rising temperatures.

These include Al Gore’s favorite explanation—that melt water from the surface percolates down to the bottom of the glacier and lubricates its base, increasing flow and ultimately ice discharge. Alarmists like Gore and Hansen claim that Greenland and Antarctica’s glaciers will then “surge” into the sea, dumping an ever-increasing volume of ice and raising water levels worldwide.

The IPCC did not include this mechanism because it is highly speculative and not well understood. Rather, new science suggests that the IPCC’s minuscule projections of sea level rise from these two great ice masses are being confirmed.

About a year ago, several different research teams reported that while glaciers may surge from time to time and increase ice discharge rates, these surges are not long-lived and basal lubrication is not a major factor in them. One research group, led by Faezeh Nick, reported that “our modeling does not support enhanced basal lubrication as the governing process for the observed changes.” Nick and colleagues go on to find that short-term rapid increases in discharge rates are not stable, that “extreme mass loss cannot be dynamically maintained in the long term,” and ultimately that “[o]ur results imply that the recent rates of mass loss in Greenland’s outlet glaciers are transient and should not be extrapolated into the future.”

But this is actually old news. The new news is that the commonly reported (and commonly hyped) satellite estimates of mass loss from both Greenland and Antarctica were a result of improper calibration, overestimating ice loss by some 50%.

As with any new technology, it takes a while to get all the kinks worked out. In the case of the Gravity Recovery and Climate Experiment (GRACE) satellite-borne instrumentation, one of the major problems is interpreting just what exactly the satellites are measuring. When trying to ascertain mass changes (for instance, from ice loss) from changes in the earth’s gravity field, you first have to know how the actual land under the ice is vertically moving (in many places it is still slowly adjusting from the removal of the glacial ice load from the last ice age).

The latest research by a team led by Xiaoping Wu from Caltech’s Jet Propulsion Laboratory concludes that the adjustment models that were being used by previous researchers working with the GRACE data didn’t do that great of a job. Wu and colleagues enhanced the existing models by incorporating land movements from a network of GPS sensors, and employing more sophisticated statistics. What they found has been turning heads.

Using the GRACE measurements and the improved model, the new estimates of the rates of ice loss from Greenland and Antarctica are only about half as much as the old ones.

Instead of Greenland losing ~230 gigatons of ice each year since 2002, the new estimate is 104 Gt/yr. And for Antarctica, the old estimate of ~150 Gt/yr has been modified to be about 87 Gt/yr.
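To see why the adjustment model matters so much, here is a toy illustration of the accounting: GRACE observes a total mass trend, and the inferred ice trend is whatever remains after subtracting the modeled glacial isostatic adjustment, so revising that model directly revises the ice estimate. The split between the two terms below is invented for illustration; only the bottom-line figures (roughly 150 and 87 Gt/yr of Antarctic loss) come from the estimates quoted above.

# Toy illustration of why the glacial isostatic adjustment (GIA) model matters.
# GRACE sees a total mass trend; the ice trend is what is left after removing
# the modeled solid-earth (GIA) signal:  ice = total - GIA.
# The split below is invented; only the resulting -150 and -87 Gt/yr values
# correspond to the old and revised Antarctic estimates quoted above.

def inferred_ice_trend(total_gt_per_yr, gia_gt_per_yr):
    """Ice mass trend implied by a GRACE total trend and a GIA model (Gt/yr)."""
    return total_gt_per_yr - gia_gt_per_yr

observed_total = -50.0               # hypothetical total trend over Antarctica
old_gia, revised_gia = 100.0, 37.0   # hypothetical old vs. GPS-constrained GIA trends

print(inferred_ice_trend(observed_total, old_gia))      # -150.0 Gt/yr (old estimate)
print(inferred_ice_trend(observed_total, revised_gia))  # -87.0 Gt/yr (revised estimate)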

How does this translate into sea level rise?

It takes about 37.4 gigatons of ice loss to raise the global sea level 0.1 millimeter—about four thousandths of an inch. In other words, ice loss from Greenland is currently contributing just over one-fourth of a millimeter of sea level rise per year, or about one one-hundredth of an inch. Antarctica’s contribution is just under one-fourth of a millimeter per year. So together, these two regions—which contain 99% of all the land ice on earth—are losing ice at a rate that raises sea level by about one half of a millimeter per year, or a bit less than two hundredths of an inch. If this continues for the next 90 years, the total sea level rise contributed by Greenland and Antarctica by the year 2100 will amount to less than 2 inches.

Couple this with maybe 6-8 inches from thermal expansion as ocean temperatures rise, and another 2-3 inches from the melting of other land-based ice, and you get a total of about one foot of additional rise by century’s end.

This is about one-third of the 1-meter estimates and one-twentieth of the 6-meter estimates.
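For readers who want to check that arithmetic, here is a brief back-of-the-envelope sketch using the figures quoted above (37.4 gigatons of ice per 0.1 millimeter of sea level, the revised 104 and 87 Gt/yr loss rates, a roughly 90-year horizon, and the 6-8 inches of thermal expansion plus 2-3 inches of other land ice).

# Back-of-the-envelope check of the sea level numbers quoted above.
GT_PER_MM = 374.0            # 37.4 Gt of ice ~ 0.1 mm of global sea level
MM_PER_INCH = 25.4

greenland_gt_per_yr = 104.0  # revised GRACE estimate quoted above
antarctica_gt_per_yr = 87.0  # revised GRACE estimate quoted above
years = 90                   # roughly 2010 to 2100

rise_mm_per_yr = (greenland_gt_per_yr + antarctica_gt_per_yr) / GT_PER_MM
ice_sheet_rise_in = rise_mm_per_yr * years / MM_PER_INCH

# Add the post's rough figures for thermal expansion (6-8 in) and other land ice (2-3 in).
total_low_in = ice_sheet_rise_in + 6 + 2
total_high_in = ice_sheet_rise_in + 8 + 3

print(round(rise_mm_per_yr, 2))                   # ~0.51 mm/yr from the two ice sheets
print(round(ice_sheet_rise_in, 1))                # ~1.8 inches by 2100
print(round(total_low_in), round(total_high_in))  # roughly 10-13 inches, about a foot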

Things had better get cooking in a hurry if the real world is going to approach these popular estimates. And there are no signs that such a move is underway.

So far, the 21st century has been pretty much a downer for global warming alarmists. Not only has the earth been warming at a rate considerably less than the average rate projected by climate models, but now sea level rise is suffering a similar fate.

Little wonder that political schemes purporting to save us from these projected (non)calamities are also similarly failing to take hold.

References:

Nick, F. M., et al., 2009. Large-scale changes in Greenland outlet glacier dynamics triggered at the terminus. Nature Geoscience, DOI: 10.1038, published online January 11, 2009.

Titus, J.G., et al., 2009. State and Local Governments Plan for Development of Most Land Vulnerable to Rising Sea Level along the U.S. Atlantic Coast, Environmental Research Letters 4 044008. (doi: 10.1088/1748-9326/4/4/044008).

Wu, X., et al., 2010. Simultaneous estimation of global present-day water transport and glacial isostatic adjustment. Nature Geoscience, published online August 15, 2010, doi: 10.1038/ngeo938.
