The U.S. is enduring one of the worst bouts of stagnant productivity growth in its post-WWII history, and no one knows why. From 2011 to 2015, average annual productivity growth was just 0.5%. The last time there was such a dismal five-year stretch was 1978 to 1982, when growth averaged just 0.3%. The chart below, based on data from the Bureau of Labor Statistics, shows just how unusual such a sustained period of poor productivity growth is.
This is lousy news for the nation as a whole. Most economists think rising productivity, which is measured as output per hour in this dataset, is critical to lifting U.S. wages and living standards.
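To make the measure concrete: productivity growth here is the year-over-year percentage change in output per hour, and a multi-year figure like the 0.5% above is a compound average of those annual changes. The sketch below is purely illustrative; the index values are hypothetical, not actual BLS data.

```python
# Hypothetical productivity index (output per hour), NOT actual BLS figures.
# Six year-end index values give five annual changes.
index = [100.0, 100.5, 101.0, 101.5, 102.0, 102.5]

# Year-over-year percentage change in output per hour.
annual_changes = [(curr / prev - 1) * 100 for prev, curr in zip(index, index[1:])]

# Compound average annual growth rate over the five-year span.
years = len(index) - 1
cagr = ((index[-1] / index[0]) ** (1 / years) - 1) * 100

print(annual_changes)  # each change is roughly 0.5%
print(cagr)            # compound average, also roughly 0.5%
```

A compound average is used rather than a simple mean of the annual changes because growth rates multiply across years rather than add.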
But why has it occurred? There are various theories. Some believe that the government isn’t able to properly measure productivity in the digital age. Others think that businesses don’t have much incentive to invest in a lot of productivity-enhancing tech in a post-Great Recession era when labor has been relatively cheap.
Still others assert that the U.S. has not been investing enough in the skills of its employees. The Brookings Institution, a Washington think tank, reports, “Although firms continue to invest in IT equipment and software (albeit at a lower rate than in the past), similar investments in workers’ acquisition of more advanced digital skills have not materialized. Over time, this shift in private-sector investment from workers to capital equipment prompted labor’s share of productivity growth to decline by 5 percent over the last decade.”
No one knows how long the stagnation will last (in the first quarter of 2016, there was actually a decline of 0.6%). Some experts believe that emerging technologies will boost U.S. productivity numbers in the not-so-distant future. A PricewaterhouseCoopers article, for example, suggests that service robots could play an integral role: “Service robots are at an inflection point, opening new contexts for productivity gains beyond what industrial robots have done.”
Artificial intelligence in general may have a major impact as well. Gartner forecasts that “the initial group of companies that will leverage smart machine technologies most rapidly and effectively will be startups and other newer companies,” and it notes that “the speed, cost savings, productivity improvements and ability to scale of smart technology for specific tasks offer dramatic advantages over the recruiting, hiring, training and growth demands of human labor.”
New machine-learning algorithms are, in fact, making mobile technology “intelligent interfaces” such as Viv more powerful. The promised rise of work-related augmented and virtual reality technologies might also help turn employee productivity around, although it’s still too soon to make forecasts in that area.
Some worry that the wave of new technologies, including self-driving vehicles, will be so productive that they will push millions of people out of work. One survey of senior executives conducted at the World Economic Forum suggests that five million net jobs could be lost to new technologies by the year 2020. And, in a recent report to the U.S. Congress, White House economists forecast an 83% chance that workers with a median hourly wage of less than $20 per hour will eventually lose their jobs to automation (see page 239 of the report).
Of course, new jobs will also be created by these new technologies, and the jobs that harness them should be highly productive ones.
Solving the Productivity Puzzle
One way of interpreting the longitudinal data is that the U.S. and other developed nations are on the cusp of another surge in productivity. In the early 20th century, it took years for employers and the workforce to learn how to use the remarkable new technologies of widespread electricity, internal combustion engines, and telephones. Once they did, however, there was a huge productivity spike.
The personal computer revolution saw a similar, if less dramatic, trend. In the late 1980s and early 1990s, experts were concerned about the so-called “productivity paradox” behind the fact that increased usage of computer technologies had not led to significant increases in productivity. In the latter part of the 1990s and early part of the 2000s, however, investments in information technologies looked as if they were paying off in terms of higher productivity growth.
Today, the new term is “productivity puzzle,” but it amounts to a similar quandary. We have a number of emerging technologies that we haven’t yet learned to properly develop and harness: mobile apps, machine learning, service robots, the Internet of Things, genetic editing, mixed reality tech, and more. Based on historical trends, we can project that these technologies will eventually result in another productivity spike.
If that happens, will it be enough to reverse the downward trend line we can see in the graph above? Maybe. One crucial factor will be the new skill sets that are developed alongside such technologies. A workforce that is capable of leveraging these smart technologies will almost certainly be more productive than one that does little more than oversee these technologies. Ultimately, these skill sets may be the true missing piece of the productivity puzzle.