Technology Spending Is On Its Way Down. Or It Isn’t. Or It Doesn’t Matter.

My MIT colleague David Autor delivered a wonderful paper at the recent Jackson Hole Economic Policy Symposium about American job and wage patterns in recent decades, and their link to the computerization of the economy. I’ll say more later about his paper, which was one of the highlights of the event for me (sighting this moose was another one). For now I just want to highlight one graph that he included, and draw a couple of additional ones.

Autor included a chart very much like the one below, which tracks US corporate spending over time on digital stuff — hardware and software — as a percentage of GDP:

The most striking pattern in this graph is the sharp increase in the late 1990s, and the steep falloff since then. We’re spending just about a full percentage point of GDP less on IT than we were fifteen years ago. This seems like a compelling prima facie case for believing that IT’s impact on the economy and the labor force is smaller now than it was before the turn of the century.
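For the curious, the series in a chart like that is just nominal IT investment divided by nominal GDP. Here is a minimal Python sketch of the calculation, using invented figures for illustration rather than the actual numbers behind the chart:

```python
# Invented, illustrative figures (billions of current dollars), not the real series.
it_investment = {1995: 250, 2000: 470, 2005: 430, 2010: 480, 2013: 520}
gdp = {1995: 7_640, 2000: 10_280, 2005: 13_090, 2010: 14_960, 2013: 16_690}

# IT investment as a percentage of GDP, year by year.
for year in sorted(it_investment):
    share = 100 * it_investment[year] / gdp[year]
    print(f"{year}: {share:.2f}% of GDP")
```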

And this is what Autor believes. As he writes:

the onset of the weak U.S. labor market of the 2000s coincided with a sharp deceleration in computer investment—a fact that appears first-order inconsistent with the onset of a new era of capital-labor substitution.

I completely agree with him (based largely on his very convincing work) that other factors have strongly shaped the US economy and labor force since 2000, particularly the emergence of China as an offshoring and manufacturing powerhouse. But I’m not so sure that the impact of digital technologies has tapered off to the extent the graph above would have us believe.

To see this, let’s break out the data on software. Information processing equipment is simply a vehicle for software, in much the same way that a bottle of 5-Hour Energy is a delivery system for caffeine. Hardware runs software, in other words, and it’s software that runs things.

It’s easy to lose sight of that fact in an era of gorgeous devices like Apple’s, but without the apps my iPhone is just a… phone. It’s software that is ‘eating the world,’ to use Marc Andreessen’s memorable phrase.

So how has software spending held up? Pretty well:

There was a slight drop-off after the dot-com bubble burst and the Y2K fiasco passed, but we’re back near the all-time software spending peak. It’s true that this spending has been pretty flat for the past fifteen years, but we should keep in mind that this is also the period when open source software, the cloud, and everything-as-a-service burst onto the scene. All of these developments have significantly lowered the bill for a given level of enterprise software capability, so I read the graph above as pretty good evidence of steadily increasing demand for software, even though spending has remained roughly constant for a while now.

The ascendancy of software can be seen in a graph of its share of total IT spending over time:


Software now accounts for over half of all IT spending. As Moore’s Law, volume manufacturing, and the cloud continue to drive down the costs of hardware, I expect software’s share of total spend to continue to rise steadily.
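To make that share calculation concrete, here is a tiny sketch with invented spending figures (not the real data behind the chart), showing how software’s slice of the IT total is computed:

```python
# Invented, illustrative spending figures (billions of dollars), not the real data.
spending = {
    2000: {"hardware": 280, "software": 230},
    2013: {"hardware": 230, "software": 290},
}

# Software's share of total IT spending: software / (hardware + software).
for year, s in sorted(spending.items()):
    share = 100 * s["software"] / (s["hardware"] + s["software"])
    print(f"{year}: software is {share:.0f}% of total IT spending")
```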

I don’t know what’s going to happen to total IT investment as a percentage of GDP going forward. It does feel to me like a sea change is taking place — that it’s getting so much cheaper to acquire digital technologies that even if demand for them rises strongly in the future, total spending on them might not (or, as an economist would put it, the price effect might be greater than the quantity effect).
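Here is that price-versus-quantity logic as a quick worked example (the numbers are invented to illustrate the mechanism): if the unit price of digital capability falls faster than the quantity purchased rises, total spending drops even though usage keeps growing.

```python
# Invented numbers illustrating the price effect outweighing the quantity effect.
price_before, quantity_before = 100.0, 1_000   # spending = 100,000
price_after, quantity_after = 50.0, 1_400      # spending = 70,000

spending_before = price_before * quantity_before
spending_after = price_after * quantity_after

quantity_growth = 100 * (quantity_after / quantity_before - 1)
spending_change = 100 * (spending_after / spending_before - 1)
print(f"Quantity up {quantity_growth:.0f}%, spending down {abs(spending_change):.0f}%")
# -> Quantity up 40%, spending down 30%
```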

So even if the first graph above doesn’t greatly change its shape in the years to come, I won’t take that as evidence that the digital revolution has run its course. Will you?