On a recent trip to Munich I got to visit the geek paradise that is the Deutsches Museum, the largest science, technology, and engineering museum in the world. Its enormous collections were almost totally lost on me, though, because I spent just about all my time in the wing devoted to calculating devices.
This was not the plan. I intended to move on after checking out a few highlights like the Enigma machines (the legendary WWII German code machines deciphered by Polish and British brilliance) and the astonishing miniature Curtas (perfected while the inventor was an inmate in Buchenwald). But I wound up spending hours there, spellbound by the variety, ingenuity, and beauty of the devices on view.
They ranged from astrolabes to slide rules to planimeters to mechanical and electrical calculators to modern computers. I had no idea how most of them worked, or exactly what they did (even after reading the explanatory notes, many of which contained polynomials).
What was clear was that the devices on display were the fruits of centuries of human brilliance and craftsmanship, all aimed at improving our ability to calculate. They did so bit by bit, one hard-won step after another. The slide rules got a little more precise over the years; the mechanical calculators became capable of handling one more digit.
And then we entered the digital era, and the situation changed utterly. Moore's Law took hold, and calculating devices started getting twice as fast every 12-18 months. One of the crown jewels in the Deutsches Museum's collection is a Cray-1 supercomputer, the world's fastest by far when it was introduced in 1976. Thirty years later, the fastest PC microprocessors were over 100 times speedier. (Similar improvements have occurred in our ability to store data, the raw material and finished goods of calculation, in machines.)
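The compounding behind that kind of doubling is worth pausing on. As a back-of-the-envelope sketch (the 12-18 month doubling period and the 30-year span are the figures above; everything else is just compounding arithmetic, and real-world single-machine speedups, like the 100x figure, lag this theoretical curve for many reasons):

```python
def speedup(years, doubling_period_years):
    """Relative performance after `years` if speed doubles
    every `doubling_period_years` (pure compounding, no real-world limits)."""
    return 2 ** (years / doubling_period_years)

# Doubling every 18 months over 30 years: 20 doublings.
print(f"{speedup(30, 1.5):,.0f}x")  # 1,048,576x
# Doubling every 12 months over 30 years: 30 doublings.
print(f"{speedup(30, 1.0):,.0f}x")  # 1,073,741,824x
```

The point is not the exact exponent but the shape of the curve: steady doubling looks unremarkable year to year and overwhelming over a few decades.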
As I looked at the Cray, then down the hallway at the older devices, I was reminded of Hemingway's summary of how one goes broke: "Gradually, then all at once." It struck me as a perfect description of how we improved our ability to calculate.
Throughout human civilization, smart professionals like scientists, engineers, financiers, accountants, managers, and analysts of every stripe have faced a computational bottleneck. One of the main things limiting their work, perhaps the main thing, was their inability to calculate any faster.
That might still be the case now for a few very specialized professions like large-scale climate modeling and cryptography, but all the rest of us are no longer limited by any lack of computing muscle. We have ample digital horsepower for our work. And in a world of cloud computing, high bandwidth, and mobile devices, we have access to this horsepower no matter where we are or what screen we’re in front of (I realize this is an overstatement, but it’s not a ridiculous one, and it might not be an overstatement at all for much longer).
When a change this big happens this quickly it takes a long time for all the implications to become clear. Many people remain tied to old practices and assumptions, and so are unable to take full advantage of the new state of affairs.
I don’t know all the ways that the business world is going to be changed by the arrival of superabundant computing. And neither does anyone else. It’s going to be a piecemeal process, full of trial and error. It’ll be carried out both by technology vendors and by the people who deploy their offerings within companies. And it will affect every industry and every job that involves at least some knowledge work. Some of these changes will be small, but many of them will be major.
You can engage in the hard work of figuring out what digitization means for your company and its competitive situation, or you can ignore the phenomenon or hope that someone else will figure it out for you. If you take either of the latter courses I predict you’ll find your rivals pulling away from you. Gradually, and then all at once.