Looking back, there were two kinds of people who lived in America in 2016: people who believed Donald Trump, and people who believed data.
Trump claimed on the campaign trail that globalization had destroyed US manufacturing—and in the process, the American economy—by letting China and other countries steal American factory jobs. From the turnout at Trump’s rallies and the “Make America Great Again” stickers slapped on bumpers across America, it was clear the message was resonating.
The data camp didn’t get it. Yes, the US had hemorrhaged manufacturing jobs, losing close to 5 million of them since 2000. Trade may have been a factor—but it clearly wasn’t the main culprit. Automation was. Robots and fancy machines had supplanted workers, turning the US into a manufacturing dynamo at the cutting edge of innovation. An article in Vox, published a month before the 2016 presidential election, spelled out the situation.
“Declining manufacturing employment over the past 30 years has given a lot of people the impression that America’s manufacturing sector is in decline. But that’s actually wrong,” the Vox article explained. “American factories are about twice as efficient today as they were three decades ago. So we’re producing more and more stuff, even as we use fewer and fewer people to do it.”
This was hardly a groundbreaking insight. For a decade or so, this phenomenon had been put forth by Ivy League economists, former US secretaries of treasury, transportation, and labor, the Congressional Research Service, vice president Joe Biden, president Barack Obama—and by Quartz too, for that matter. In a 2016 New York Times article titled “The Long-Term Jobs Killer is Not China. It’s Automation,” Harvard economist Lawrence Katz laid out the general consensus: “Over the long haul, clearly automation’s been much more important—it’s not even close.”
Manufacturers’ embrace of automation was supposedly a good thing. Sure, some factory workers lost their jobs. But increased productivity boosted living standards, and as manufacturing work vanished, new jobs in construction and other services took its place. This was more of a shift than a loss, explained Bradford DeLong, economics professor at the University of California, Berkeley.
So when Trump won the presidential election, the true-blue data believers dismissed his victory as the triumph of rhetoric over fact. His supporters had succumbed to a nativist tale with cartoon villains like “cheating China” and a shadowy cabal of Rust Belt-razing “globalists.”
But it turns out that Trump’s story of US manufacturing decline was much closer to being right than the story of technological progress being spun in Washington, New York, and Cambridge.
Thanks to a painstaking analysis by a handful of economists, it’s become clear that the data that underpin the dominant narrative—or more precisely, the way most economists interpreted the data—were way off-base. Foreign competition, not automation, was behind the stunning loss in factory jobs. And that means America’s manufacturing sector is in far worse shape than the media, politicians, and even most academics realize.
Worse than the Great Depression: America’s manufacturing jobs implosion
In the four decades between 1960 and 2000, US manufacturing employment was basically stable, averaging around 17.5 million jobs. Even during the 1980s and 1990s, as Korea and other smaller Asian nations joined the ranks of Germany and Japan to threaten the dominance of US factories, the absolute number of manufacturing workers stayed mostly flat. That’s why what happened next is so alarming.
Between 2000 and 2010, manufacturing employment plummeted by more than a third. Nearly 6 million American factory workers lost their jobs. The drop was unprecedented—worse than any decade in US manufacturing history. Even during the Great Depression, factory jobs shrank by only 31%, according to an Information Technology & Innovation Foundation report. Though the sector has recovered slightly since then, America’s manufacturing workforce is still more than 26% smaller than it was in 2000.
What’s odd is that, even as US factories laid off a historically unprecedented share of workers, the amount of stuff they made rose steadily—or at least, it appeared to. The sector’s growth in output, adjusted for inflation, had been chugging away at roughly the same pace as US GDP since the late 1940s. That makes sense given that productivity—that is, advances in technology, skill, or management that allow workers to make more stuff in less time—has also been growing at a zippy clip.
How, then, do you reconcile the epic employment slump of the 2000s with the steady rise in output? The obvious conclusion is that factories needed fewer people than they did in the past because robots are now doing more and more of the producing. That’s tough for factory workers, but US manufacturing is doing fine.
That conclusion rests on the basic assumption that the manufacturing output data reflect the actual volume of stuff produced by US factories. It’s a reasonable assumption to make. Unfortunately, it’s not an accurate one.
Houseman’s light bulb moment
Economists have long been aware that computers and electronics, a relatively small subsector of manufacturing, have powered much of manufacturing’s growth in output over the past few decades. But until 2009, no one had connected this fact to the paradox of surging manufacturing output alongside dwindling employment. That’s when Susan Houseman and her colleagues first took a crack at it—and, in the process, discovered something funny going on with the data.
An economist at the Upjohn Institute, an independent organization that researches employment, Houseman specializes in measuring globalization. She had been working with a team of Federal Reserve economists who had access to more granular data than was publicly available, which allowed them to strip the computer industry’s output out of the rest of the data. That revealed just how the rest of manufacturing was doing—and it was much worse than Houseman and her colleagues expected.
“It was staggering—it was actually staggering—how much that was contributing to growth in real [meaning, inflation-adjusted] manufacturing productivity and output,” says Houseman.
This was especially striking given that the two measures lay at the heart of the prevailing narrative that US manufacturing is growing healthily.
In 2011, Houseman and her colleagues mentioned their discovery in a paper they published in the Journal of Economic Perspectives. But the point went largely unnoticed.
Undeterred, Houseman spent the next few years digging further into why this relatively small industry was driving so much growth—and what was really going on with America’s manufacturing.
How economists calculate manufacturing output
In order to understand how the manufacturing sector is doing, economists look at how much stuff factories are making compared with previous years. The key measure of this output is “value added”: manufacturing sales, minus the cost of things like electricity and parts used in the manufacturing process. They look at this across a dozen or so manufacturing subsectors, such as paper, apparel, furniture, and chemicals.
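The value-added arithmetic can be sketched in a few lines. This is a toy illustration with invented figures, not actual government data:

```python
# A minimal sketch of the "value added" calculation described above,
# using made-up numbers. Value added is a sector's sales minus the cost
# of intermediate inputs (electricity, parts, and other materials
# consumed in production).

def value_added(sales, intermediate_inputs):
    """Value added = sales minus the cost of inputs used in production."""
    return sales - sum(intermediate_inputs)

# Hypothetical furniture subsector: $500M in sales, minus $180M in lumber,
# $40M in hardware, and $30M in electricity.
va = value_added(500_000_000, [180_000_000, 40_000_000, 30_000_000])
print(va)  # 250000000
```

Statisticians compute this figure for each subsector, then aggregate the subsectors to gauge the whole sector’s output.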
But that figure alone isn’t enough. To make the output volume comparable from one year to the next, the statisticians aggregating the data adjust for price changes, as well as improvements in product quality. For example, let’s say statisticians want to figure out how much the sales of Intel processors grew in 2017 versus 2016.
The problem is, the processor released in 2017 is superior to the one sold in 2016 in many tangible ways. But how do you account for the fact that a 2017 processor provides users with more value? In general, statisticians assume the difference in value between the two models is just the difference in their prices. If, say, the 2017 processor costs twice as much as the 2016 one does, then selling one 2017 processor counts as selling two of the 2016 versions in the statisticians’ books.
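In code, the price-based quality adjustment described above looks something like this (a toy sketch with invented prices and unit counts):

```python
# A toy illustration of price-based quality adjustment. If the 2017
# processor sells for twice the 2016 price, each 2017 unit counts as
# two 2016-equivalent units of "real" output. All numbers are invented.

def real_output_in_base_units(units_sold, price, base_price):
    # The quality ratio is inferred from relative prices.
    quality_ratio = price / base_price
    return units_sold * quality_ratio

# 2016: 1M processors at $200. 2017: 1M processors at $400.
out_2016 = real_output_in_base_units(1_000_000, 200, 200)
out_2017 = real_output_in_base_units(1_000_000, 400, 200)

# Unit sales were flat, but measured real output doubled.
print(out_2017 / out_2016)  # 2.0
```

The factory shipped the same number of boxes in both years; only the statistical books show a doubling.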
In this hypothetical, the rise in real output might reflect increased sales of processors. But it could also simply reflect the statisticians’ assumption that people value the new processor twice as much as the old one, because of its superior performance.
Government data wizards do this sort of quality adjustment for all sorts of products, including automobiles. However, the biggest adjustments show up in processors and the other goods made by the computers subsector, in which the blazing pace of technological change makes for dramatic and ultra-fast leaps in quality.
In other words, the method statisticians use to account for these advances can make it seem like US firms are producing and selling more computers than they actually are. And when the computers data are aggregated with the other subsectors, the adjustment makes it seem like the whole of American manufacturing is churning out more goods than it actually is.
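The aggregation effect is easy to see with a stylized example. In the sketch below, every subsector except computers is roughly flat, yet the aggregate still shows healthy growth; all figures are invented for illustration:

```python
# A hypothetical sketch of how one small, fast-"growing" subsector can
# dominate aggregate real output growth. All figures are invented
# (real value added by subsector, in billions, for two years).

year1 = {"computers": 100, "furniture": 300, "paper": 300, "chemicals": 300}
year2 = {"computers": 140, "furniture": 303, "paper": 300, "chemicals": 297}
# Computers: +40% (largely quality adjustment). Everything else: flat.

total1, total2 = sum(year1.values()), sum(year2.values())
print(f"aggregate growth: {total2 / total1 - 1:.1%}")  # 4.0%

ex1 = total1 - year1["computers"]
ex2 = total2 - year2["computers"]
print(f"growth ex-computers: {ex2 / ex1 - 1:.1%}")  # 0.0%
```

One subsector worth a tenth of the total makes the whole sector look like it is expanding, even though the rest of manufacturing is going nowhere.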
Misreading the manufacturing statistics
This adjustment is the crux of economists’ misinterpretation of the health of manufacturing. There’s nothing wrong with accounting for product quality. But most economists and policymakers have failed to take into account how adjusting for quality improvements in a relatively small subsector skews the manufacturing output data.
“Even though well-trained economists know that you can’t look at descriptive evidence and jump to the conclusion that productivity growth in the form of automation is causing employment declines,” says Houseman, “the evidence just looks so compelling it seems obvious that’s what’s going on.”
Many economists are aware of the computer industry’s outsize contributions to sector statistics. But few realize that the figures showing vast increases in manufacturing output have been dominated by a single small industry, according to Houseman.
“The dominant narrative is that there’s no problem, that it’s doing very well, and that’s kind of the end of the story, at least among economists,” she says. “Trump won to some degree arguing that trade had harmed US workers and that US manufacturing was not doing well. Very often, the mainstream media and economists were quick to point out that that’s not borne out by statistics. But that’s based on a misreading of the statistics.”
The hollowing-out hidden in the data
This erroneous notion, based solely on a statistical anomaly, long ago crystallized into a deeply misleading consensus that high-tech advances in America’s manufacturing sector give it a comfortable competitive edge. And that’s not at all the case.
One way of gauging how the sector has been doing is to compare how much real output in manufacturing has grown, both with and without computers, compared to the private sector as a whole—which encompasses everything from finance and agriculture to retail and manufacturing. According to Houseman’s research, between 1947 and 1979, real output in manufacturing and the private sector expanded at about the same speed. Strip out the computer subsector from both datasets, and that trend is pretty much the same.
The divergence first emerged in the late 1970s, as the semiconductor industry took off and the computers and electronics subsector began driving growth in manufacturing output.
Between 2000 and 2016, the average growth in the sector’s real output was only about 63% of that of the private sector. But when you take computers out of both data series, the trend is far more striking: Since 2000, manufacturing output expanded at an average pace equal to only 12% of the private sector’s average growth.
In fact, according to Houseman’s data, without computers, manufacturing’s real output expanded at an average rate of only about 0.2% a year in the 2000s. By 2016, real manufacturing output, sans computers, was lower than it was in 2007.
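To see how these figures fit together, here is a back-of-the-envelope version of the comparison, using an assumed private-sector growth rate (the 1.7% figure is illustrative, not from Houseman’s data; the 63% and 12% ratios are from the article):

```python
# Back-of-the-envelope arithmetic tying the article's figures together.
# The private-sector growth rate below is an assumption for illustration;
# only the 63% and 12% ratios come from Houseman's research.

private_sector_growth = 0.017  # assumed ~1.7% average annual real growth

manufacturing_growth = 0.63 * private_sector_growth       # ~63% of private sector
manufacturing_ex_computers = 0.12 * private_sector_growth  # ~12% ex-computers

print(f"manufacturing: {manufacturing_growth:.2%}")            # 1.07%
print(f"ex-computers:  {manufacturing_ex_computers:.2%}")      # 0.20%
```

Under that assumed baseline, the ex-computers figure lands right around the roughly 0.2% annual growth the article cites.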
This has grim implications for what had been assumed to be healthy productivity. As with real output, productivity growth comes mostly from the computers subsector’s quality adjustment—which means that the apparently robust growth in manufacturing productivity is mostly a mirage.
To be clear, automation did happen in manufacturing. However, throughout the 2000s, the industry was automating at about the same pace as in the rest of the private sector. And if booming robot-led productivity growth wasn’t displacing factory workers, then the sweeping scale of job losses in manufacturing necessarily stemmed from something else entirely.
The truth about automation versus trade
It’s not perfectly clear what, exactly, is the culprit behind relatively anemic growth in manufacturing output. But the signs indicate trade and globalization played a much more significant role than is commonly recognized.
Of particular importance is China’s emergence as a major exporter, which US leaders encouraged. A pair of papers by economists David Autor, David Dorn, and Gordon Hanson found that the parts of the US hit hard by Chinese import competition saw manufacturing job loss, falling wages, and the shrinking of their workforces. They also found that offsetting employment gains in other industries never materialized.
Another important paper by this team of economists, along with MIT’s Daron Acemoglu and Brendan Price, estimated that competition from Chinese imports cost the US as many as 2.4 million jobs between 1999 and 2011.
Why did China have such a big impact? In their 2016 study, economists Justin Pierce and Peter Schott argue that China’s accession to the WTO in 2001—set in motion by president Bill Clinton—sparked a sharp drop in US manufacturing employment. That’s because when China joined the WTO, it extinguished the risk that the US might retaliate against the Chinese government’s mercantilist currency and protectionist industrial policies by raising tariffs. International companies that set up shop in China therefore enjoyed the benefits of cheap labor, as well as a huge competitive edge from the Chinese government’s artificial cheapening of the yuan.
The resulting appreciation of the dollar hurt US exporters—in particular, manufacturers. A 2017 study on the dollar’s appreciation in the early 2000s by economist Douglas Campbell found that the dollar strengthened sharply, in real terms, compared to low-wage trading partners including China. The subsequent increase in foreign imports and diminished demand for American exports resulted in a loss of around 1.5 million manufacturing jobs between 1995 and 2008.
There are also observable signs that automation wasn’t to blame. Consider the shuttering of some 78,000 manufacturing plants between 2000 and 2014, a 22% drop. This is odd given that robots, like humans, have to work somewhere. Then there’s the fact that there simply aren’t that many robots in US factories, compared with other advanced economies.
The cost of complacency
Two decades of ill-founded policymaking radically restructured the US economy, and reshuffled the social order too. The America that resulted is more unequal and more polarized than it’s been in decades, if not nearly a century.
In effect, US policymakers put diplomacy before industrial development at home, offering the massive American consumer market as a carrot to encourage other countries to open up their economies to multinational investment. Then, thanks to the popular narrative that automation was responsible for job losses in manufacturing, American leaders tended to dismiss the threat of foreign competition to a thriving manufacturing industry and minimize its importance to the overall health of the US economy.
“A lot of policymakers, not everyone, but most, just missed the boat,” says Houseman. “We didn’t have the intelligent debates about what was going on with trade, etc., because a lot of people were just denying there was any problem, period.”
The problem is that manufacturing plays a significant role in the US economy. Manufacturing jobs tend to pay better, and create opportunities for learning skills that are particularly important to workers with less formal education. Factories also encourage innovation by attracting research and development (R&D) facilities, which need access to production lines to translate design into real products and to work out the kinks in prototypes. This is why when plants shutter and are moved overseas, R&D centers almost always go with them, says Houseman. Detached from the innovative feedback loop formed with R&D, US factories struggle to compete.
The received wisdom that the US was simply becoming a service-driven economy also lulled leaders into complacency about the long-term economic and social cost of lost manufacturing jobs. The establishment assumed that the apparent increase in the sector’s output and productivity would eventually solve the problem; where there was wealth, there would be new job openings to replace lost factory work. But, as a growing heap of research shows, workers hit by mass layoffs suffer unusually big wage losses throughout their careers, and many exit the workforce entirely.
While the forces of globalization battered America’s middle class, they largely benefited the country’s emerging urban professional elite—managers, consultants, lawyers, and investment bankers enriched by booming international investment and by the cheapening of imports. And as multinational corporations and their bosses gained political clout, the interests of the middle class faded.
Two decades of complacency among US leaders gave companies in Asia and other emerging export bases time to create world-class factories and robust supply chains. Tellingly, even as the real output of the computers subsector has appeared to grow astonishingly quickly, the sector has been steadily losing market share to Asian competitors, according to a 2014 paper by Houseman and economists Timothy Bartik and Timothy Sturgeon.
A legacy of ignorance
One reason why Houseman’s revelation is so important is that the myth of automation continues to have a strong grip on the minds of American policymakers and pundits. The lessons of the populist backlash during the 2016 presidential election didn’t seem to take. As the US gears up for mid-term elections this year, the Democrats have no vision for how to reverse the industrial backslide.
Ironically, that criticism applies to Trump, too. His campaign ignited a vitally important national conversation on the relationship between US trade policies and manufacturing’s decline. Since he took office, however, Trump has paid minimal attention to boosting US manufacturing. Instead, he’s favored counterproductive protectionism and ignored currency manipulation, preferring the punitive over the constructive.
US leaders’ longstanding misunderstanding of the manufacturing industry led to the biggest presidential election upset in American history. But they still don’t seem to grasp what’s been lost, or why. It’s easy to dismiss the disappearance of factory jobs as a past misstep—with a “we’re not getting those jobs back” and a sigh. Then again, you can’t know that for sure if you never try.