19th Century Economist Reveals Surefire Investment Strategy!

by Andrew McAfee on January 19, 2011

Intel just finished the “best year in [its] history,” and expects 2011 to be even better. This news suggests a few important questions: for how much longer are we going to keep buying more, and more powerful, microchips? Will 2012 be still better for Intel and other hardware suppliers? 2020? 2050? How much can the demand for computation keep expanding?

I first started asking myself these questions after I drew graphs (using US BEA data) of changes over time in computer cost and aggregate US corporate computer spending. They reveal a deeply weird pattern: as computers get cheaper, companies spend more and more on them. See for yourself:

[Figures: prices of computing equipment over time, and aggregate US corporate spending on computing equipment over time]

I can see how this pattern would hold for a while. There’s some level of demand among companies for computational power, and until that demand is met investment will continue. Total investment could well increase for a while, even as prices drop, because there’s so much thirst for computers, but after some finite period of time it’ll level off and maybe even start to decrease. After everybody’s got an Internet-connected PC and a smartphone, and after datacenters have centralized and moved into the cloud, total US corporate spending on computing gear will taper off, right?

Wrong, if an obscure 19th century English economist’s thinking is any guide to our current situation. And I think it is.

In 1865 (when he was only twenty-nine years old), William Stanley Jevons published The Coal Question, a book about the consequences of greater energy efficiency. In it, he advanced an idea that’s still counterintuitive and controversial: greater energy efficiency leads not to lower total energy consumption but to exactly the opposite outcome, higher aggregate consumption.

The standard (and correct) line about greater efficiency is that it lets us do the same with less consumption. Therefore, standard thinking goes, we’ll consume less. But an equally correct statement about greater efficiency is that it lets us do more with less, and Jevons’s great insight was that we’ll take advantage of this fact to do more.

As coal-burning furnaces become more efficient, for example, British manufacturers will build more of them, and increase total iron production while keeping their total coal bill the same. This greater supply will lower the price of iron, which will stimulate new uses for the metal, which will stimulate demand for more furnaces, which will mean a need for more coal. The end result, according to Jevons, will be that “the greater number of furnaces will more than make up for the diminished consumption of each.”
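
To make the mechanics concrete, here is a minimal numerical sketch in Python. The isoelastic demand curve and the specific figures are my illustrative assumptions, not Jevons’s; the point is only to show how a doubling of furnace efficiency can increase total coal consumption.

    # Illustrative sketch of the Jevons rebound effect; all numbers are invented.
    # Assumption: iron's price tracks its coal input, and demand for iron is
    # isoelastic with elasticity > 1, so the quantity demanded grows faster
    # than the price falls.

    def total_coal(coal_per_ton, elasticity=1.5, base_demand=100.0):
        """Total coal burned when each ton of iron needs `coal_per_ton` units of coal."""
        iron_price = coal_per_ton                                 # price proportional to coal input
        iron_demanded = base_demand * iron_price ** (-elasticity)
        return coal_per_ton * iron_demanded

    before = total_coal(coal_per_ton=1.0)   # old, inefficient furnaces -> 100.0
    after = total_coal(coal_per_ton=0.5)    # furnaces twice as efficient -> ~141.4

    print(before, after)                    # efficiency doubled, total coal use rose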

How long can this go on? A recent New Yorker article (subscription required) by David Owen gives a vivid example:

In a paper published in 1998, the Yale economist William D. Nordhaus estimated the cost of lighting throughout human history. An ancient Babylonian, he calculated, needed to work more than forty-one hours to acquire enough lamp oil to provide a thousand lumen-hours of light—the equivalent of a seventy-five-watt incandescent bulb burning for about an hour. Thirty-five hundred years later, a contemporary of Thomas Jefferson’s could buy the same amount of illumination, in the form of tallow candles, by working for about five hours and twenty minutes. By 1992, an average American, with access to compact fluorescents, could do the same in less than half a second. Increasing the energy efficiency of illumination is nothing new; improved lighting has been “a lunch you’re paid to eat” ever since humans upgraded from cave fires (fifty-eight hours of labor for our early Stone Age ancestors).

Yet our efficiency gains haven’t reduced the energy we expend on illumination or shrunk our energy consumption over all. On the contrary, we now generate light so extravagantly that darkness itself is spoken of as an endangered natural resource.
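
To put Owen’s numbers side by side, here is a quick back-of-the-envelope calculation. The labor figures are the ones quoted above, converted to hours; treating “less than half a second” as half a second is my simplification.

    # Labor needed to buy a thousand lumen-hours of light, from the figures quoted above.
    labor_hours = {
        "Stone Age fire": 58.0,
        "Babylonian lamp oil": 41.0,
        "tallow candles, ca. 1800": 5 + 20 / 60,   # five hours and twenty minutes
        "compact fluorescent, 1992": 0.5 / 3600,   # roughly half a second
    }

    baseline = labor_hours["Stone Age fire"]
    for era, hours in labor_hours.items():
        print(f"{era}: {hours:.6f} hours of labor, about {baseline / hours:,.0f}x cheaper than firelight")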

Not everyone agrees that the “rebound” effect hypothesized by Jevons is still a big deal in today’s energy consumption patterns, and many smart people believe that greater energy efficiency is in fact the way to reduce energy use in the future. Longtime efficiency advocate Amory Lovins of the Rocky Mountain Institute, for example, sent this reply to Owen’s article.

Rather than wading into the energy debate, though, I want to wonder out loud if computation is like energy. Both are necessary inputs to many productive activities. Both are consumed by every company in every industry. Both are amplifiers of human ability: energy amplifies or replaces our muscles; computation does the same for our brains and senses. Both come from physical things, yet are themselves ethereal; you can hold a lump of coal or a transistor in your hand, but not a joule or a megaFLOP. And the devices that generate both are getting more efficient over time.

If the analogy is a tight one, if Moore’s Law continues to hold true, and if Jevons was right about energy-intensive processes, then one conclusion seems inescapable: the trends visible in both of the graphs above are going to continue. Computers are going to keep getting cheaper, and aggregate demand for them is going to continue to rise.
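
In the terms of the graphs above, that reasoning can be sketched with two assumptions of mine (not figures from the BEA data): the cost of a unit of computation halves roughly every two years, and demand for computation is price-elastic with an elasticity above one. Under those assumptions, aggregate spending keeps climbing even as unit prices collapse.

    # Hypothetical projection: falling unit prices plus elastic demand => rising total spend.
    ELASTICITY = 1.3    # assumed price elasticity of demand for computation

    for year in range(0, 21, 4):
        price = 0.5 ** (year / 2)               # unit cost halves every two years
        quantity = price ** (-ELASTICITY)       # units of computation demanded
        spending = price * quantity             # grows whenever ELASTICITY > 1
        print(f"year {year:2d}: price x{price:.4f}  demand x{quantity:>9,.0f}  spend x{spending:.1f}")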

Innovators will keep figuring out new things to do with computers as they keep getting cheaper (and smaller and lighter), and this innovation in use cases will more than offset the steadily falling cost of a unit of computation. Some of these innovations are visible now — tablets, smartphones, RFID tags and other smart sensors, computer pills, labs-on-a-chip, and so on — and some of them aren’t. To (badly) paraphrase David Mamet, nobody can predict innovation; that’s why they call it innovation.

One final implication of this logic is that the computation industries are going to keep growing for a long time to come, just like the lighting and other energy industries have done since Stone Age and Babylonian times. So if you’ll excuse me, I’m going to go find a good hardware manufacturer index fund to invest in…

What do you think? Do Jevons’s insights apply to demand and supply for computation today? Would now be a good or bad time to invest in hardware makers? Leave a comment, please, and let us know what you think.

  • http://thestateofme.wordpress.com Chris Swan

    I’ve bought into the Jevons argument for cloud computing for some time (having been indoctrinated into it by Simon Wardley and his presentations at London Cloudcamp that ultimately led to http://youtu.be/5Oyf4vvJyy4), but there are some issues…

    Moore’s Law has only remained true so far because Intel have made it happen. Physical barriers (the relationship between feature size and atomic size) and economic barriers (the cost of new fabs) loom ever closer. I’ve heard Intel people say 2015 is pretty much the end of the line.

    Hardware becoming a commodity (as a subset of computing as a whole becoming a commodity) may also not be a good reason to invest in hardware manufacturers. In the past firms that make chips and build servers around them have enjoyed some healthy margins. What did the margins for coal mining look like before and after the invention of Watt’s more efficient steam engine?

  • http://www.digispeaker.com Jon Smirl

    Sooner or later we are going to hit a wall for single CPU chips. It may even be in 2015 as some Intel people say. But this isn’t going to stop the demand for computational power. We have barely scratched the surface of “The Internet of Things”. That’s a world where everything contains small computers with radios. Your house alone will probably consume 1,000 processors by 2020. They will be tiny and cheap but they will be everywhere. The aggregate computation power of those nodes will be huge. Don’t think the 1,000 CPU number is crazy; I’ve counted over 500 in my current home. Modern lighting consumes hundreds of CPUs per house.

    Here’s a random idea – embed something like RFID into everything in the house. Now you can tell its location and take an inventory automatically. The items will just come with the chips inside of them, it won’t be something you add. This will probably happen via microwave energy harvesting devices within 15 years, but I want it in my car keys today! (RFID doesn’t work because you have to be too close to a scanner).

    Anything that works in the home also works in business, so these tiny CPUs will be ubiquitous. Office buildings could consume millions of them.

  • http://twitter.com/chris_p_walker Christian Walker

    In 1994 I had one PC in my house. I now have about a dozen devices that provide computing capabilities in some form or another. I am not including items such as toasters, coffee makers (although one of them does read bar codes), or the dishwasher.

    Yeah, the cheaper it gets the more of it we need / want.

  • Mmadsen0

    One point: Moore’s Law doesn’t hold true any more for silicon transistor densities, as Moore himself said in 2005. Increasing the performance of a single CPU through increased density and multiplying the number of CPUs are two different things, leading to different outcomes. Until a suitable replacement for silicon comes along we’re focused on producing a different kind of performance with different characteristics.

  • http://www.infovark.com Dean Thrasher

    Your question hinges on what happens to the chipmakers’ margins as CPUs become cheap and ubiquitous. A better parallel to the CPU might be the electric motor. Most of the motors manufactured today are standard parts that conform to a handful of well-known designs. The margins are razor thin and competition is fierce. There are a few manufacturers that cluster at the custom end of the market, serving small niches. We’ll see something similar in the chipmaking industry, I think.

  • Simon Wardley

    Excellent article Andrew. However, before you invest … whilst the use of computational resources is going to rapidly increase through the underlying forces behind Jevons’ paradox (specifically componentisation and co-evolution impacts), the means of making these resources may well take a battering through printed electronics. Choose your hardware maker wisely … you might even want to consider large traditional printing companies, especially as they currently think they have no future.

  • Charlie

    Yeah yeah ….anyone find a hardware manufacturer index fund yet?

  • http://twitter.com/piewords Laurence Hart

    I think the insights apply. As time moves on, more and more will be done by computers. That photo album on the shelf? It’s moving online. Artificial, digital, assistants? They will require some computing power.

    We’ll keep spending more in order to do more. The only way that will change is when the technology reaches a point where it doesn’t need to be replaced every few years. Then we may spend less as individuals, but we as a society will spend more as we push towards the stars.

    -Pie

  • http://pulse.yahoo.com/_6HFBTJ7TQTN3BXRETUX2F2UGV4 Anonymous

    When I was a college student in the 1990s, the notion that computers were going to help business create a “paperless” office existed, too. My guess is that both the Jevons principle and Moore’s Law are applicable to the consumption of paper as well, and a similar trending graph could be created to show it. :)

  • http://twitter.com/vkurup18 Vinesh Kurup

    Intel’s figures need to be taken with a pinch of salt… Is the US still its largest market? Even if that is true, I suspect Intel is seeing growth in new markets. However, I agree with the observation that IT costs/employee may be rising - this needs to factor in new IT needs like remote working and device diversity. Intel’s future in 2020 will depend on how it adopts mobile computing power… Watch out for ARM…

  • http://OnTheSpiral.com/ GregoryJRader

    I believe Ray Kurzweil has made a similar argument – basically that an iPhone is not just an ENIAC thousands or millions of times better but that it is a completely different sort of device that couldn’t exist until price/performance hit a certain ratio.

    I remember 5-10 years ago when people were wondering how much better desktop chips really needed to get. If most people were just surfing the web and using basic applications, they didn’t really need 3 GHz or 4 cores. That prediction turned out to be more or less true, but that didn’t affect the overall trend as effort shifted from performance at all costs to miniaturization and power efficiency.

    So it is probably true that we don’t need smartphones with 100x more power, but instead all that price/performance improvement will shift in another direction and we will have devices that would be impossible with current technology…

  • http://www.joeweinman.com/papers.htm Joe Weinman

    I wrote about this at Information Week in Spring of 2010 – http://www.informationweek.com/cloud-computing/blog/archives/2010/05/cloud_computing_18.html . It’s just price elasticity of demand. Mobile phones are a great example: the industry went from $0 in revenue to nearly $1T, as the cost of endpoints and of $/bit or $/minute dropped.

  • http://twitter.com/RayDePena Ray DePena

    Hi Andrew,

    I view Jevons’s theory and Moore’s Law as complementary.

    The notions of “cost savings”, “lower IT costs”, and “greater efficiency” are all double entendres. That’s why we often hear “it depends” (on the use and meaning) as a response.

    They can mean “per unit” or in aggregate. We have greater lumen output at lower cost per unit in our household; that doesn’t necessarily mean we consume fewer lumens (though we do consume less energy).

    If the cost of production per unit of energy dropped to zero, we would likely use more lights in our home. On a per-unit (light bulb) basis that translates into greater efficiency and cost savings. On an aggregate (household) basis it would depend. If our lives were static, then our energy consumption would remain the same. I suspect that as my children reach their teenage years our light consumption will increase, so here’s to hoping that we achieve greater energy efficiency (lumen output) at a lower cost on a per-unit basis.

    Like our growing household, humanity is not static. Innovation becomes the driver, and greater efficiency derived from technological innovation drives further innovation (as per the following hypothesis: http://innovation.ulitzer.com/node/1161563).

    -Ray
