Mobs Rule!

A June 25 post to the New York Times Bits blog starts with “To get the best predictions about when your company’s latest product will ship or how it will sell, you might try asking your employees — anonymously.” A skeptical but fair response to this is something like “Sure, I might, but why should I? Our company has professionals who make these kinds of forecasts for a living. Why on earth should I place more trust, or any trust, in the predictions that come from an anonymous crowd?”

A convincing answer to this question is “You should place more trust in the crowd if it gives you better answers than your current methods do.” And the post in the NYT presents evidence that corporate prediction markets do just that.

Prediction markets are a technology that harnesses collective intelligence / the wisdom of crowds to gaze into the future and generate a consensus forecast (in other words, the crowd’s aggregate best guess) about what will happen. I’ve written about them here, and here’s a nice overview paper from Justin Wolfers and Eric Zitzewitz.
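For readers who want a peek under the hood, here is a minimal sketch of one standard market-maker mechanism, Robin Hanson’s logarithmic market scoring rule (LMSR), which many prediction markets use to turn individual trades into a consensus probability. The function names and the liquidity parameter below are my own placeholders; I’m not claiming this is what Crowdcast or any other particular vendor runs.

```python
import math

# Minimal sketch of Hanson's logarithmic market scoring rule (LMSR), a
# common way a prediction market turns individual trades into a consensus
# probability. The liquidity parameter b controls how far prices move in
# response to any single trade.

def lmsr_cost(quantities, b=100.0):
    """Market maker's cost function over the outstanding shares of each outcome."""
    return b * math.log(sum(math.exp(q / b) for q in quantities))

def lmsr_prices(quantities, b=100.0):
    """Current prices; they sum to 1 and read as the crowd's consensus probabilities."""
    total = sum(math.exp(q / b) for q in quantities)
    return [math.exp(q / b) / total for q in quantities]

def trade_cost(quantities, outcome, shares, b=100.0):
    """What a trader pays to buy `shares` of `outcome` given the current state."""
    new_q = list(quantities)
    new_q[outcome] += shares
    return lmsr_cost(new_q, b) - lmsr_cost(quantities, b)

# Two-outcome market: "product ships on time" vs. "product ships late".
q = [0.0, 0.0]                                        # start at even odds
print(lmsr_prices(q))                                 # [0.5, 0.5]
print(round(trade_cost(q, outcome=0, shares=40), 2))  # ~21.99 paid for 40 "on time" shares
q[0] += 40
print([round(p, 3) for p in lmsr_prices(q)])          # ~[0.599, 0.401]
```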

On the Internet they’ve been shown to deliver more accurate predictions about political elections and movie revenues than other techniques like polls and statistical forecasting methods. Pioneering efforts to use them within companies show that they’re also highly accurate when deployed behind the firewall (see, for example, the case I wrote with Karim Lakhani and Peter Coles about Google’s internal prediction market and this paper written by Google’s Bo Cowgill and his colleagues).

But are they more accurate than the other techniques companies use to forecast future events of interest? The NYT post presents data from the prediction market startup Crowdcast (disclosure: I am on Crowdcast’s advisory board, and have a small amount of stock in the company), which has deployed its technology within multiple large companies.  As Claire Cain Miller writes in the Times:

At a media company with a new product to ship, 1,200 employees predicted a ship date and sales figures that resulted in 61 percent less error than executives’ previous prediction, according to Crowdcast. A pharmaceutical company asked a panel of scientists and doctors to predict regulatory decisions and new drug sales using Crowdcast, and they were more accurate than the company’s original prediction 86 percent of the time.
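To be clear about what “61 percent less error” means, here is the arithmetic I assume is behind that figure, namely comparing each forecast’s absolute error against the eventual outcome; the numbers below are invented purely for illustration.

```python
# Hypothetical numbers, purely to illustrate how a "61 percent less error"
# figure is presumably calculated: compare each forecast's absolute error
# against the outcome that was eventually observed.

actual         = 1_000_000   # units actually sold (invented)
official_guess =   700_000   # executives' forecast (invented)
crowd_guess    =   883_000   # prediction market's forecast (invented)

official_error = abs(official_guess - actual)   # 300,000
crowd_error    = abs(crowd_guess - actual)      # 117,000

reduction = 1 - crowd_error / official_error    # 0.61
print(f"{reduction:.0%} less error")            # "61% less error"
```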

Crowdcast chief scientist Leslie Fine, a deceptively normal-looking person with what I’m pretty sure is a 4-digit IQ (she has a doctorate in experimental economics from Caltech, and they don’t just hand those out), gave me some more information about the setting in which the market’s error was 61% lower than that of the ‘official’ forecast:

For a major media and gaming company, we predicted total hardware and software sales, as well as Metacritic scores for games. Metacritic is an aggregate review score, and is by far the #1 predictor of sell-through. Our client there is the Senior Director for Publishing Intelligence. He starts the markets off at the company’s ‘official’ best guesses for future sales and scores, as soon as the publishing intelligence guys come up with them. For Metacritic scores, they do this by ranking the game on a number of axes (out of 10): playability, graphics, etc. They add them up, which yields the score. For hardware and software sales, the official prediction comes from applying growth percentages to historical data.

There are also some great stats coming out of a survey we did at this company. They’re a real turn-on (ok, if you’re a total nerd):
– 80.6% say they participate because it’s fun.
– 35.5% say they do so because it “gets me thinking about stuff that helps me do my job.”
– 44.1% say “I believe my input is valuable.”

– 52.1% say they usually research the forecast before betting; another 38% say they do so at least sometimes.
– 54.7% believe that business leaders value the knowledge.
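To make the ‘official’ baseline concrete, here is a rough sketch of the two methods Fine describes: adding up per-axis ratings to get a Metacritic guess, and applying a growth percentage to historical data to get a sales guess. The axis names, ratings, and growth figure are my own placeholders, not the company’s actual model.

```python
# Rough sketch of the two "official" forecasting methods Fine describes.
# Axis names, ratings, and the growth figure are invented placeholders,
# and the assumption that ten axes rated out of 10 sum to a 0-100
# Metacritic-style score is mine.

def official_metacritic_guess(axis_ratings):
    """Add up per-axis ratings (each out of 10) to get a 0-100 score estimate."""
    return sum(axis_ratings)

def official_sales_guess(last_period_sales, growth_rate):
    """Apply an assumed growth percentage to the prior period's sales."""
    return last_period_sales * (1 + growth_rate)

ratings = {"playability": 8.5, "graphics": 9.0, "audio": 8.0, "story": 7.5,
           "replay value": 8.0, "online play": 8.5, "controls": 9.0,
           "pacing": 7.5, "polish": 8.5, "value": 8.0}
print(official_metacritic_guess(ratings.values()))        # 82.5

print(official_sales_guess(2_400_000, growth_rate=0.08))  # ~2,592,000 units
```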

Crowdcast has had two important insights; the first is common to all prediction market vendors, while the second is unique to them (as far as I know). The common one is captured well in a couple of paragraphs from the Times post:

“It’s a huge problem in corporations — by the time information gets to the top, it’s meaningless or too late,” said Mat Fogarty, Crowdcast’s founder and chief executive. Mr. Fogarty spent a decade doing corporate forecasting at Electronic Arts and Unilever, where he saw how inaccurate official corporate forecasts could be. “There is a lot of information not getting to decision-makers, and that’s expensive,” he said.

At Electronic Arts, Mr. Fogarty said, the most junior people often had the most insight into the quality of new video games, and he picked up the best information about the latest version of Madden while playing soccer games on the company lawn.

Fogarty saw the same thing that the great Austrian economist Friedrich von Hayek noted in his seminal 1945 article “The Use of Knowledge in Society”:

The… problem of a rational economic order is determined precisely by the fact that the knowledge of the circumstances of which we must make use never exists in concentrated or integrated form but solely as the dispersed bits of incomplete and frequently contradictory knowledge which all the separate individuals possess. The economic problem of society is… a problem of the utilization of knowledge which is not given to anyone in its totality.

Hayek also saw how beautifully prices established by markets solved this vexing problem:

We must look at the price system as such a mechanism for communicating information if we want to understand its real function… The most significant fact about this system is the economy of knowledge with which it operates, or how little the individual participants need to know in order to be able to take the right action. In abbreviated form, by a kind of symbol, only the most essential information is passed on and passed on only to those concerned. It is more than a metaphor to describe the price system as a kind of machinery for registering change…

The marvel is that in a case like that of a scarcity of one raw material, without an order being issued, without more than perhaps a handful of people knowing the cause, tens of thousands of people whose identity could not be ascertained by months of investigation, are made to use the material or its products more sparingly; i.e., they move in the right direction…

I have deliberately used the word “marvel” to shock the reader out of the complacency with which we often take the working of this mechanism for granted. I am convinced that if it were the result of deliberate human design, and if the people guided by the price changes understood that their decisions have significance far beyond their immediate aim, this mechanism would have been acclaimed as one of the greatest triumphs of the human mind.

(sorry for the long quote, but I really love this paper)

Fogarty started Crowdcast because he believed, as did Hayek, that markets were marvelous devices for aggregating and sharing important knowledge. Fogarty further believed that they could be used to address the problems of poor intra-company information flows he had witnessed firsthand. In the era of high bandwidth, powerful browsers, and software as a service, he saw an opportunity to bring Hayek’s marvel inside the firewall, and to apply it to a range of tasks.

The company’s second insight was that most people are not stock traders by nature, and the typical trader’s interface of current prices, bid-ask spreads, and order books is off-putting and alienating (it certainly is to me). So the Crowdcast team replaced it with a much simpler interface, one that lets participants make their trades graphically by moving sliders across a bell curve. They make a prediction by selecting a range of values; a tight range indicates a confident prediction (and a high payout if it turns out to be correct), while a wide range corresponds to a less confident prediction. This demo from Crowdcast shows how the system works. I find it simple and highly intuitive, and Fine has worked to make sure all the underlying math is solid.
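To give a flavor of the idea (though not of Crowdcast’s actual scoring math, which Fine and her colleagues have worked out far more rigorously), here is a toy version of interval-based scoring in which a correct, tight range pays more than a correct, vague one, and a missed range pays nothing.

```python
# Toy illustration of interval-based scoring: a correct, tight range pays
# more than a correct, vague one, and a missed range pays nothing. This is
# my simplification, not Crowdcast's actual scoring mechanism.

def interval_payout(low, high, actual, scale=1_000.0):
    """Payout for predicting that the outcome lands in the range [low, high]."""
    if not low <= actual <= high:
        return 0.0                     # the range missed the outcome entirely
    width = max(high - low, 1e-9)      # guard against a zero-width range
    return scale / width               # tighter (more confident) ranges pay more

# Three participants predict next quarter's unit sales (in thousands).
print(interval_payout(900, 1_100, actual=1_000))    # tight and right: 5.0
print(interval_payout(500, 1_500, actual=1_000))    # vague but right: 1.0
print(interval_payout(1_200, 1_400, actual=1_000))  # confident but wrong: 0.0
```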

The evidence is mounting that corporate prediction markets work as advertised, delivering quick, accurate, and decisive predictions in areas of great interest. Furthermore, the evidence so far suggests that they work better than current corporate forecasting techniques, at least in some circumstances. So are there any good reasons left for not using them, or at least experimenting with them? I ask seriously: why would any enlightened company not avail itself of this technology? Can you come up with any legitimate reasons not to jump in with prediction markets? Leave a comment, please, and let us know.