Productivity continues to be a debated topic, and the subject of both academic research and more practical white papers, with no clear-cut conclusion reached, although hopes and expectations for improvement from current low levels are widespread. A recent report by the consulting firm McKinsey delved into the topic extensively. This isn’t entirely surprising, since productivity is a more difficult statistic to measure than most.
What is productivity?
Essentially, it’s the amount of output produced for a given amount of input. Typically, this is measured as the goods and services produced per worker hour in a given period. We care about productivity because it represents the complement to labor force growth; combined, the two are the building blocks of a country’s GDP growth. GDP growth naturally carries through to a variety of meaningful areas in the investment realm, including company earnings growth, which affects stock valuations and corporate credit quality.
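As a rough, illustrative decomposition (the figures below are hypothetical examples, not forecasts or actual readings):

Real GDP growth ≈ growth in hours worked + growth in output per hour (productivity)

For instance, labor input growing around 0.5% in a year alongside productivity growth of around 1.5% would imply real GDP growth in the neighborhood of 2%.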
Economist Robert Solow famously remarked in the late 1980s that the computer age could be seen ‘everywhere but in the productivity statistics’ (an observation that became known as the Solow Paradox). The statement is almost amusing at this point, considering how far technology has developed since that time.
Why is it low?
The best guesses are that productivity is low for two reasons. One is that current levels are merely a ‘normalization’ lower from a unique stretch of high productivity during the 1990s. While many decades feature technological innovations of one form or another, the specific improvements in semiconductor size and speed, supply chain management tools in industrial and retail applications, and the widespread diffusion of the internet to businesses and consumers mid-decade were truly transformative. While taken for granted today, the use of e-mail, mobile phones and other new technologies represented exponential improvements on existing systems (e.g., paper filing systems, regular mail and landline phones). No doubt these shifts boosted business output relative to a lower amount of work input (the essence of productivity), aside from the immense consumer benefits. However, in recent years, would the tenth iteration of the same cell phone, with a slightly sharper camera, deliver the same degree of productivity improvement as the first phone? Not as likely.
A second potential cause is leftover uncertainty about economic conditions and demand in the years following the Great Recession. Researchers have well documented that severe recessions and depressions, especially those centered on a financial sector crisis, tend to be followed by constrained business and consumer investment for up to a decade afterward (we finally seem to have begun crawling out of that trend more recently, almost on cue). Fewer business capital expenditures naturally put a damper on output potential, even though underlying labor markets are strong. Essentially: plenty of workers, albeit a shortage of skilled workers in some areas, but fewer efficient tools for them to use.
What’s next?
Looking back at the basic equation, higher productivity is about doing more with the same or less input. A global movement toward further ‘next wave’ digitization could be the tool that eventually takes us there. Examples include electric and/or autonomous vehicles, the financial industry’s continued shift toward online products and away from cash, continued growth of online retail as a percentage of total retail (surprisingly, only about 10% today), innovations in artificial intelligence and machine learning, as well as smart grid/energy efficiency and infrastructure improvements. There are winners and losers in each of these segments, as there usually are during periods of technological evolution, but overall, signs point to the economy becoming more productive rather than less.