Many of the most persistently gloomy reports about the U.S. economy have long been based on the single most misleading statistic the government produces.
According to New York Times columnist Paul Krugman, “the purchasing power of an average non‐supervisory worker’s wage has fallen about 1.5 percent since the summer of 2003.”
According to the Washington Post business news: “After adjusting for inflation, average weekly earnings for private production and non‐managerial employees … bought 0.5 percent less last month than they did in July 2004, after taking price increases into account. These workers make up 80 percent of the labor force.”
According to a Washington Times editorial, “The fact that the U.S. economy has generated a negative growth rate over five years for real average weekly earnings of 80 percent of its workforce should be a concern shared by all people, regardless of political orientation.”
And according to a new New York Times book, “Class Matters,” “For most workers, the only time in the last three decades when the rise in hourly pay beat inflation was during the speculative bubble of the ‘90s.”
All such comments allude to “average earnings” of production workers in manufacturing and non‐supervisory workers in services. But that data series does not purport to measure hourly pay at all, much less a typical worker’s wage. Once government workers and the self‐employed are counted, the figures cover only 62 percent of all jobs, not 80 percent. And that just begins to explain the confusion.
In fact, this data series is so misleading it is finally being phased out by the Bureau of Labor Statistics (BLS), to be replaced over the next four years by one that covers all private employees. For one thing, as the BLS explains: “the production and non‐supervisory worker hours and payroll data have become increasingly difficult to collect, because these categorizations are not meaningful to survey respondents. Many survey respondents report that it is not possible to tabulate their payroll records based on the production/non‐supervisory definitions.”
An accountant in a manufacturing company should not be counted as a production worker, for example, but an accountant in a bank should be counted as a non‐supervisory worker. The “non‐supervisory” category is defined to exclude supervisors, yet it somehow includes “supervisory workers.” Such arbitrary distinctions make responses “increasingly difficult to collect,” suggesting the estimates depend on an increasingly dubious sample of older firms.
The most obvious flaw in the average earnings figures — as Stephen Moore pointed out in an Aug. 29 column, “The Wages of Prosperity” — is that they totally ignore health, pension and other benefits. With benefits included, real compensation per hour was up 3.6 percent between the second quarters of 2004 and 2005 among non‐farm businesses, and up 5.6 percent in manufacturing.
Average weekly earnings are derived from the voluntary survey of payroll employment at 155,000 businesses. The BLS adds up all the dollars spent on payrolls and divides by paid hours (including vacations). This “differs from wage rates,” the BLS warns, and is “not the earnings average of ‘typical’ jobs or jobs held by ‘typical’ workers.”
“Average earnings” is an arithmetic average, a mean rather than a median, and it includes part‐time jobs. As a result, taking part‐timers and low‐income workers off the payroll has the paradoxical effect of raising average earnings among the rest, though it surely doesn’t make anyone better off. Average weekly earnings can therefore rise in hard times, because many part‐time and low‐wage workers lose their jobs, and fall in recoveries, because previously unemployed low‐wage workers, or a flood of unskilled immigrants, find jobs.
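The composition effect described above is easy to see with a toy sketch. The hourly wages below are invented purely for illustration, not drawn from any BLS series:

```python
# Toy illustration (hypothetical wages): dropping low-paid part-timers
# from the payroll raises the arithmetic mean, even though no individual
# worker's wage changed.
wages = [8, 9, 10, 20, 25, 30]  # hourly wages; first three are part-timers

mean_before = sum(wages) / len(wages)          # 17.0

# Recession: the three lowest-paid workers lose their jobs.
remaining = [w for w in wages if w >= 20]
mean_after = sum(remaining) / len(remaining)   # 25.0

print(f"average before layoffs: ${mean_before:.2f}")  # $17.00
print(f"average after layoffs:  ${mean_after:.2f}")   # $25.00
```

“Average earnings” jump from $17 to $25 even though no one got a raise; run the logic in reverse and a recovery that rehires those three workers drags the average back down.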
As deceptive as all this is when it comes to measuring short‐term cyclical changes, the average earnings statistics become truly unbelievable when used to assert, as the New York Times book does, that real incomes for 80 percent of American households have fallen for three decades.
This claim has been endlessly repeated since the 1980s. In 1994, for example, Paul Krugman and Robert Z. Lawrence began a serious article in Scientific American with the spurious claim that “real earnings of blue‐collar workers have fallen in most years since 1973.” In the boomy year of 1998, Jeffrey Madrick called the United States a “treadmill economy” largely because “the average wages of production and non‐supervisory workers, who basically comprise the lower 80 percent of earners, are still 10 or 15 percent below their 1973 highs.”
In reality, average earnings do not measure “blue‐collar” earnings or wages among “the lower 80 percent” (in fact, half of U.S. employees earn no wages — they earn salaries). Non‐supervisory workers include “physicians, lawyers, accountants, nurses, social workers, research aides, teachers, drafters, photographers, beauticians (and) musicians.”
Since 1973, as the BLS explains, “persistent long‐term increases in the proportion of part‐time workers in retail trade and many of the service industries have reduced average workweeks in these industries.” Millions of previously nonworking spouses and students sought and found part‐time work, which diluted average earnings, particularly on a weekly basis. Substituting a low‐wage job for an unpaid job makes average earnings appear lower, yet results in higher family incomes. Adding millions of low‐skilled immigrants in recent years has likewise diluted average earnings without affecting typical earnings.
Besides, talking about what happened since 1973 conceals what happened when. Adjusted for inflation with the PCE deflator, average weekly earnings appeared to fall from a cyclical peak of $496 in 1973 (an illusory blip caused by price controls) to cyclical lows of $459 in 1982 and $429 in 1992 (in 2000 dollars). By this year’s second quarter, however, this deeply flawed statistic was back to $488, up 13.8 percent since 1992 and 1.7 percent since 1980.
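The percentage changes above can be checked directly from the dollar figures the column itself cites (all in 2000 dollars):

```python
# Real average weekly earnings (2000 dollars), as cited in the text.
aw_1973 = 496   # cyclical peak (inflated by price controls)
aw_1992 = 429   # cyclical low
aw_2005 = 488   # this year's second quarter

pct_since_1992 = (aw_2005 / aw_1992 - 1) * 100
print(f"change since 1992: {pct_since_1992:+.1f}%")  # +13.8%

pct_since_1973 = (aw_2005 / aw_1973 - 1) * 100
print(f"change since 1973: {pct_since_1973:+.1f}%")  # -1.6%: still below 1973
```

The arithmetic confirms both the 13.8 percent rebound since 1992 and the point taken up next: the series still shows real earnings slightly below the distorted 1973 peak.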
The latest figure still makes real earnings appear lower than in 1973. To accept such a conclusion, however, requires: 1) ignoring all other measures of U.S. living standards, and 2) adding another statistical absurdity to all those previously mentioned. The fundamental problem is that no price index, least of all the fixed‐weight CPI-W normally used for this purpose, can actually provide a credible comparison of real incomes across three decades. Attempting to compare today’s price index with one from 1973 would require comparing computers with typewriters, digital TiVo with rooftop antennas, and contemporary cars with Chevy Vegas and Ford Pintos.
Despite the problems price indexes have in coping with new and better products, measured real consumption per capita has nonetheless doubled since 1973. Unless the rich could somehow consume unlimited numbers of houses, cars, shirts and steaks, it is difficult to imagine how each American’s real consumption could have doubled if real salaries had actually been unchanged. Could anyone believe that all those shopping malls that have sprung up since 1973, and all the new homes and restaurants, are really catering to just a fortunate few?
The BLS has half a dozen superior measures of labor earnings — the Census Bureau, Social Security Administration and Bureau of Economic Analysis have others. Real compensation per hour, for example, has risen 43.6 percent since 1973. So how could the real wages of 80 percent of the workforce have “fallen in most years” since then? They didn’t. Wage stagnation is an old statistical hoax whose time is coming to an end.
When the average earnings series is at last given a well‐earned burial, after suffering a lifetime of statistical abuse, what will all the gloomy worrywarts then find to write about?